
Law and Society: Recent Scholarship


Edited by Melvin I. Urofsky

A Series from LFB Scholarly


Global Medium, Local Laws: Regulating Cross-Border Cyberhate


Bastiaan Vanacker

LFB Scholarly Publishing LLC
El Paso 2009

Copyright © 2009 by LFB Scholarly Publishing LLC
All rights reserved.

Library of Congress Cataloging-in-Publication Data

Vanacker, Bastiaan.
  Global medium, local laws : regulating cross-border cyberhate / Bastiaan Vanacker.
    p. cm. -- (Law and society : recent scholarship)
  Includes bibliographical references and index.
  ISBN 978-1-59332-331-8 (alk. paper)
  1. Hate speech--United States. 2. Hate speech--Europe. I. Title.
  K5210.V36 2009
  345'.0256--dc22    2008052590

ISBN 978-1-59332-331-8
Printed on acid-free 250-year-life paper.
Manufactured in the United States of America.


Table of Contents

THE FIVE COMPONENTS OF A HATE SPEECH DEFINITION ... 1
  MEDIUM OF SPEECH ... 2
  CONTENT ... 2
  TARGET ... 3
  INTENT/CONTEXT ... 4
  EFFECTS ... 4

CHAPTER 1: HATE SPEECH IN AMERICAN LAW ... 7
  INCITEMENT ... 7
  FIGHTING WORDS ... 18
  GROUP LIBEL ... 23
  COLLIN V. SMITH ... 32
  CROSS BURNING CASES ... 36
  THEORIES OF THE FIRST AMENDMENT ... 45
  ALTERNATIVE APPROACHES TO REGULATING HATE SPEECH IN THE UNITED STATES ... 60
  CONCLUSION ... 65

CHAPTER 2: REGULATING ONLINE HATE SPEECH IN THE UNITED STATES ... 67
  INTRODUCTION ... 67
  RELEVANT CASE LAW ... 70
  CONCLUSION ... 84
  DISCUSSION ... 86
  OTHER VOICES IN THE DEBATE ... 96
  CONCLUSION ... 99

CHAPTER 3: HATE SPEECH LAW IN EUROPE ... 101
  INTRODUCTION ... 101
  INTERNATIONAL LAW ... 102
  COUNTRY LEVEL ... 110
  CONCLUSION ... 123

CHAPTER 4: REGULATING ONLINE HATE IN EUROPE ... 125
  FRANCE: THE EFFECTS-BASED APPROACH AND THE YAHOO! CASE ... 125
  THE GERMAN APPROACH ... 138
  THE EUROPEAN UNION: FOLLOWING THE GERMAN EXAMPLE? ... 142
  UNITED KINGDOM: HOTLINES ... 144
  HOTLINES AT THE EUROPEAN LEVEL ... 146
  ADDITIONAL PROTOCOL TO THE CONVENTION ON CYBERCRIME ... 147
  CONCLUSION ... 151

CHAPTER 5: EVALUATING CROSS-BORDER INTERNET CONTENT REGULATION ... 153
  INTRODUCTION ... 153
  MAINTAINING THE LAYERED STRUCTURE OF THE INTERNET ... 154
  SOVEREIGNTY AND JURISDICTION ... 167
  EFFECTIVENESS ... 187
  CONCLUSION ... 188

CHAPTER 6: EVALUATING REGULATORY OPTIONS: WHERE TO GO FROM HERE? ... 191
  AGREEMENT ON COMMON INTERNET LAWS ... 191
  ASSERTING JURISDICTION OVER CONTENT PROVIDERS ... 194
  ENFORCING FOREIGN JUDGMENTS ... 195
  SEARCH ENGINE FILTERING ... 202
  GEOLOCATION ... 207
  SOURCE ISP LIABILITY ... 212
  REGULATING THE DESTINATION ISP ... 221
  SUMMARY ... 233

CONCLUDING REMARKS ... 243
BIBLIOGRAPHY ... 247
INDEX ... 261


Acknowledgments


I want to thank Heather Ring for her support throughout the years. I could not have completed this without her. I also owe gratitude to Professor Jane Kirtley, who gave me a deeper understanding of the true meaning of free speech.


INTRODUCTORY REMARKS


The Five Components of a Hate Speech Definition

This work deals with online hate speech regulation in an international context. It tries to assess how the legal paradigms regarding hate speech in the United States and Europe have been affected by the rise of the Internet. More generally, it deals with the issue of how countries can regulate content on the Internet that is illegal within their borders, but originates from a country where that content is legal. For example, European countries have very strict hate speech laws, while the United States' First Amendment protects most forms of speech. As a result, European countries and institutions have attempted to regulate hate speech originating from the United States. This work will analyze and evaluate these attempts based on a normative framework and formulate suggestions for approaching these issues.

The term "hate speech" does not have a fixed, immutable meaning. Rodney Smolla has labeled it a "generic term that has come to embrace the use of verbal attacks based on race, ethnicity, religion and sexual orientation or preference."1 Hate speech can be viewed as utterances that demean a group of individuals based on characteristics such as race, sexual preference, gender or ethnicity, but it can also include speech directed at individuals in the form of insults or threats, for example. Hate speech is not a legal term. What is considered hate speech depends on the larger ideological and intellectual framework in which one operates. People striving for racial equality may focus on hate speech that targets racial minorities; feminist scholars may consider pornography hate speech; members of the Jewish community may consider Holocaust denial hate speech. This work does not attempt to put forward a definitive definition of hate speech, but it is important to point out the different components a definition of hate speech can have in the context of speech regulation. When lawmakers or certain groups want to ban or regulate "hate speech," a definition or description of the speech that would be subject to this regulation is necessary.

1. Cited in: Samuel Walker, Hate Speech: The History of an American Controversy (1994) at 8.


These definitions can consist of five components: the medium of communication, the content of the speech, its target, the context in which the speech occurs, and the effects of the speech.


Medium of Speech

One of the questions that will be addressed in this work, especially in chapter four, is whether or not hate speech on the Internet is qualitatively different from hate speech communicated through traditional channels. The notion that the medium through which speech is communicated is an important consideration in determining its legality is not new. A St. Paul, Minnesota ordinance, for example, stated that "[w]hoever places on public or private property a symbol, object, appellation, characterization or graffiti, including, but not limited to, a burning cross or Nazi swastika, which one knows or has reasonable grounds to know arouses anger, alarm or resentment in others on the basis of race, color, creed, religion or gender commits disorderly conduct and shall be guilty of a misdemeanor."2 This specific ordinance applied to symbols, objects, appellations, characterizations or graffiti, but probably would not have applied to leaflets or to spoken words containing the same messages.3 In this ordinance, the medium through which a message was delivered (the mode of speech) was an important consideration in determining whether certain expressions could be punished.

Content

Hate speech regulations frequently attempt to restrict certain messages or parts of messages. Regulations will therefore necessarily have to describe what specific messages fall within their purview. The topics encompassed by the term "hate speech" typically deal with race and/or ethnicity. But these are not the only speech topics that could be included in hate speech. Four broad content categories can be identified:

2. R.A.V. v. Saint Paul, 505 U.S. 377 (1992) at 379.
3. This ordinance would be struck down by the Supreme Court on First Amendment grounds; this will be discussed at length in the first chapter.


1. Speech relating to characteristics that are generally considered to be crucial to an individual or group identity and are not a matter of personal choice. This would include race, nationality, gender, and sexual preference,4 but would not include variable characteristics such as hair color or length, weight, etc. Even though this distinction seems somewhat arbitrary, it is based on the fact that these latter characteristics have not been the basis for violence, war and discrimination to the same extent as the former.

2. Speech relating to characteristics that are crucial to one's identity but that are a matter of personal choice. This refers mainly to political, religious and ideological beliefs.


3. Speech targeting groups of people that have been historically disenfranchised, discriminated against, or that represent a vulnerable minority in society.

4. Speech that does not refer directly to characteristics of individuals or groups, but that perpetuates ideas and beliefs that are deemed unacceptable in a society because they perpetuate hate and violence, or because they represent ideas antithetical to the democratic ideas of a society. For example, bans on speech denying that the Holocaust or certain crimes against humanity occurred are in place in many European countries.

Target

Another possible component of a hate speech definition is that it makes clear to whom the speech is addressed. Speech that is communicated directly to an unwilling receiver is generally treated differently than speech that is communicated to the general public in a forum in which the audience has a choice whether or not to listen to this speech.

4. If one accepts the notion that sexual orientation is not a matter of personal choice.


Insults or threats, for example, in most cases would have to be directed at their intended targets in order to be included in a hate speech definition. At the other end of the spectrum is public speech: speech not directed at an individual recipient, but that is made in public and can be heard or seen by a general audience. In the context of the Internet, for example, a racist Web site needs to be distinguished from a racist email. Sometimes differences of opinion can exist as to what comprises targeted speech. When neo-Nazis hold a demonstration in a predominantly Jewish neighborhood, this could be regarded as either a thinly veiled threat to the Jewish population living there, or as a legitimate public expression of political beliefs.


Intent/Context

In some instances, the intention of the speaker can also be a consideration. Most people would not consider the comedy of Chris Rock or Dave Chappelle hate speech, though both use racial slurs and stereotypes. The fact that this is comedy, and perhaps that the performers are African American (the same racial slurs used by a white supremacist, even if in a humorous context, would be considered more "hateful"), leads most people to believe that this is satire and not hate speech. Intent and context are often important in assessing the hatefulness of speech. In the United States, for example, burning a cross is constitutionally protected speech, unless it is done with the intent to intimidate (see infra). The context in which the speech occurs and the intentions of those involved are important considerations in many instances: "Any assessment of whether, how, or how much, hate speech ought to be prohibited must, therefore, account for certain key variables: namely who and what are involved and where and under what circumstances these cases arise."5

Effects

Underlying most of the arguments supporting regulation of hate speech is the belief that hate speech should be banned because it has negative effects, either upon the people against whom it is directed or to whom the speech applies, or upon society as a whole.

5. Michel Rosenfeld, "Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1526.


In European hate speech regulation, for example, this latter notion is prevalent. In those cases, the harmful effect of the hate speech does not have to be proven to the courts; it is assumed that certain expressions, by their very nature, have pernicious effects. For example, some consider pornography6 a form of hate speech because it perpetuates a climate in which objectification of and violence towards women is likely to occur. Others have argued that hate speech produces real harm against its victims,7 or that it has a corrupting effect on society.8 These interpretations consider hate speech as more than just words: as speech that will lead to effects that lawmakers should try to prevent. As will be discussed later, one of the crucial discussions surrounding hate speech is how likely it must be that the anticipated effect will occur before the speech causing the harm can be banned.

These five components of a possible hate speech definition are not to be considered as five discrete parts. They can and often do intertwine and overlap. For example, the medium through which the speech is conveyed can also be a contextual consideration from which an intent to threaten can be deduced: an email has a specific addressee, while a website is generally available. By the same token, not all five components will be present in every definition. As will be discussed later in this work, hate speech laws can differ greatly based on which of these five components they focus on. For example, some have argued that the Internet requires a total rethinking of hate speech laws in the United States, because the Internet has the potential to profoundly alter the nature of hateful speech. This very "medium-centric" way of thinking about hate speech will be discussed in the second chapter. Also important in the context of hate speech law is the component of content. Some have argued that particular groups in society need to be protected from hateful speech, and that speech vilifying or targeting certain groups ought to be banned. In Germany, for example, this is the case with Holocaust survivors, while most of the laws in the United States center around the notion that any restriction on speech should be viewpoint neutral and should not protect certain groups.

6. See for example: Catharine A. MacKinnon, Only Words (1993).
7. Richard Delgado, "Words that Wound: A Tort Action for Racial Insults, Epithets, and Name Calling," in Words that Wound: Critical Race Theory, Assaultive Speech, and the First Amendment (1993); Mari J. Matsuda, "Public Response to Racist Speech: Considering the Victim's Story," 87 Mich. L. Rev. 2320 (1989).
8. See for example: Alexander Tsesis, Destructive Messages: How Hate Speech Paves the Way for Harmful Social Movements (2002).


How each of these five components is interpreted has important consequences. For conceptual clarity, distinguishing between these different components when discussing hate speech regulation is very important.


CHAPTER 1

Hate Speech in American Law

This chapter will provide an overview of the case law and First Amendment theories in the area of hate speech. The chapter contains an analysis of how the jurisprudence on incitement, fighting words and group libel developed, and a discussion of the relevance of these doctrines for proposals to ban certain kinds of hate speech. A brief discussion of the First Amendment theories that underpin the American approach to restrictions of speech will conclude this chapter. The discussion will show that restrictions based on the viewpoint a speech expresses are unlikely to be found constitutional, but that restrictions based on the effects of speech and the context in which it occurs are not altogether impossible. Chapter two will then examine whether and how the Internet has challenged these established principles.


Incitement

All free societies subscribe to the ideal of freedom of speech, but recognize that certain forms of speech can and should be regulated or restricted in certain circumstances. Speech that gives rise to illegal behavior or creates a breach of the peace in many cases does not enjoy full protection. The crucial issue in debates about speech regulation is whether a causal connection exists between the speech and its outcome: how likely does it have to be that the alleged undesired effect will be the foreseeable outcome of the speech in order to justify restricting it? This debate mainly focuses on the effect component of hate speech regulation. By looking at case law and its theoretical underpinnings, this section will discuss how the American legal system developed standards to answer this question.

World War I cases

Around the time of the American entry into World War I, a number of cases came before the Supreme Court dealing with the question of whether or not the government could ban speech by opponents of the war who promoted activity that could interfere with the war and draft effort. These activities were rendered illegal by the 1917 Espionage Act, which prohibited, among other things, any attempt to cause insubordination in the military forces or "to obstruct the recruiting and enlistment service of the United States."9


In 1919, the court ruled on three cases brought under the Espionage Act,10 of which Schenck v. U.S. is the most notorious because it contained the first formulation of the clear-and-present-danger test, the standard currently applied to incitement speech. Charles Schenck, general secretary of the Socialist Party, had been charged under the Espionage Act for agreeing to print and send anti-war leaflets to men who were called for and had accepted military service.11 Schenck argued that this was an activity protected by the First Amendment. In the majority opinion, Justice Oliver Wendell Holmes acknowledged that in different circumstances that might indeed have been so, but that in the context of the war effort this was no longer the case:

We admit that in many places and in ordinary times the defendants in saying all that was said in the circular would have been within their constitutional rights. But the character of every act depends upon the circumstances in which it is done…. The most stringent protection of free speech would not protect a man in falsely shouting fire in a theatre and causing a panic… The question in every case is whether the words used are used in such circumstances and are of such a nature as to create a clear and present danger that they will bring about the substantive evils that Congress has a right to prevent. It is a question of proximity and degree. When a nation is at war many things that might be said in time of peace are such a hindrance to its effort that their utterance will not be endured so long as men fight and that no Court could regard them as protected by any constitutional right.12

This quote touches upon one of the crucial questions that emerges in every discussion about banning speech based on its effect. We can agree that falsely shouting fire in a crowded theatre ought not to be protected because it may lead to a dangerous situation, but what about less clear-cut situations, where speech could or might lead to this kind of outcome, but not necessarily?

9. Schenck v. U.S., 249 U.S. 47 (1919) at 49.
10. Schenck v. U.S., 249 U.S. 47 (1919); Frohwerk v. U.S., 249 U.S. 204 (1919); and Debs v. U.S., 249 U.S. 211 (1919).
11. Schenck v. U.S., 249 U.S. 47 at 49-50.
12. Ibid. at 52.


How does one evaluate the likelihood that this effect will occur, and how does one balance it against First Amendment concerns? For example, advocating racial inequality may very well lead to instances of racial discrimination, a substantive evil which the government has a compelling interest in preventing, but does that justify barring this advocacy? A number of cases in the first half of the twentieth century, in which communism, not racial hatred, was the evil that the government aimed to prevent, set a loose standard for speech restriction in the face of this danger.

In Abrams v. U.S.,13 the Supreme Court ruled on a 1919 amendment to the Espionage Act that stated that one could be convicted for discrediting the form of government in the United States or for inciting resistance to the United States in the war. The defendants in Abrams, four men of Russian descent and self-declared rebels, socialists and anarchists,14 had distributed publications in which they called for a general strike in order to stop U.S. intervention in revolutionary Russia.15 The majority upheld the earlier convictions of the four, but Justice Holmes challenged the majority decision in a famous dissent in which he argued that the different publications16 did not meet the intent requirements to be banned under the Espionage Act.17 More importantly, Holmes argued that even if the required intent was present, nobody could argue that the "publishing of a silly leaflet by an unknown man, without more, would present any immediate danger that its opinions would hinder the success of the government arms or have any appreciable tendency to do so."18

250 U.S. 616 (1919). Ibid. at 617-618. 15 Ibid. at 619-621. 16 An excerpt of the leaflets they distributed: “Workers, our reply to the barbaric intervention has to be a general strike! An open challenge only will let the Government know that not only the Russian Worker fights for freedom, but also here in America lives the spirit of Revolution.” (Ibid. at 621-622.) 17 Holmes argued that the act only applied to publications that had the intent to hinder the American war effort. The publications at hand, Holmes argued, had the intent to prevent government interference with the Russian revolution, an aim that could be reached without any hindrance to the war effort. (Ibid. at 628.) 18 Idem. 14

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Global Medium, Local Laws

10

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Holmes’ dissent reflects the emergence of a concern for the danger posed to free speech when there is too much room for speculation about consequences this speech may have as a justification for restricting it. Holmes argued that speech should only be restricted if it has a substantive evil as its specific intent and if the speech will bring that forbidden result immediately. If this is not the case, speech, however unpopular, should be allowed into the marketplace of ideas, for it is not up to the government to establish the truth and value of certain viewpoints: Persecution for the expression of opinions seems to me perfectly logical. If you have no doubt of your premises or your power and want a certain result with all your heart you naturally express your wishes in law and sweep away all opposition. To allow opposition by speech seems to indicate that you think the speech impotent, as when a man says that he has squared the circle, or that you do not care whole-heartedly for the result, or that you doubt either your power or your premises. But when men have realized that time has upset many fighting faiths, they may come to believe even more than they believe the very foundations of their own conduct that the ultimate good desired is better reached by free trade in ideas-- that the best test of truth is the power of the thought to get itself accepted in the competition of the market, and that truth is the only ground upon which their wishes safely can be carried out. That at any rate is the theory of our Constitution. It is an experiment, as all life is an experiment. Every year if not every day we have to wager our salvation upon some prophecy based upon imperfect knowledge.19 Holmes expresses a thought that would become central to First Amendment doctrine and its approach to hate speech: there is no such thing as a fixed and immutable truth; truth evolves through time. Therefore, every idea and thought can and should be challenged in the marketplace of ideas. What is accepted truth today, may be proven wrong tomorrow. In the context of hate speech, this might be a difficult position to accept. Many may disagree that some of the 19

Ibid. at 630.

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

11

“truths” that racists profess ought to be protected because at some point in time we might come to embrace them. However, as will be discussed later in this chapter, the concept of the marketplace of ideas also justifies protection of speech that is not truthful or of which it is very unlikely that it will become “accepted” truth.

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Red Scare cases The “Red Scare” cases were a second category of cases involving restrictions on incitement: Gitlow v. New York20 and Whitney v. California21 considered the question of whether it is constitutional to prosecute individuals under laws that prohibit advocacy of violent overthrow of the government. In Gitlow, the defendant had been convicted under a state provision that criminalized the advocacy of criminal anarchy for publishing a manifesto calling for mass strikes and revolutionary mass actions with the purpose of overthrowing the parliamentary system and establishing a communist system.22 The majority upheld the defendant’s conviction, but Justice Holmes, joined by Justice Louis Brandeis, wrote a short dissent, arguing that the manifesto did not present a direct danger: “it is manifest that there was no present danger of an attempt to overthrow the government by force on the part of the admittedly small minority who shared the defendant's views.”23 Holmes also expressed his radical adherence to the ideal of the free marketplace of ideas: “If in the long run the beliefs expressed in proletarian dictatorship are destined to be accepted by the dominant forces of the community, the only meaning of free speech is that they should be given their chance and have their way.”24 This statement also brings up an issue that is relevant to debates regarding certain types of hate speech: should democracies ban speech that is antithetical to democratic ideas? Just as communism at one time challenged some of the core values of liberal democracies (such as the right to private property, individual freedom), certain types of hate speech, in many instances, challenge some of the core assumptions of democratic societies. Does speech that rejects these core tenets of 20

268 U.S. 652 (1925). 274 U.S. 357 (1927). 22 268 U.S. 652 at 655-656. 23 Ibid. at 673. 24 Idem. 21

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

12

Global Medium, Local Laws

liberal democracies deserve the protection offered by the institutions of these societies or should democracy be “combative” and not extend its freedoms to its enemies? Holmes’ answer to this question is yes: these ideas should be given a chance in the marketplace of ideas, even if this would mean that these ideas might become accepted by the dominant forces in the community. This Darwinist interpretation in which the marketplace of ideas is the battlefield from which the strongest ideas (which may not necessarily be the “best” or “most truthful”) will ultimately emerge seems somewhat cynical, but finds its roots in Holmes’ notion that our established “truths” are only “truths” until they evolve into different “truths.” It should not be up to the lawmakers or the courts to determine what these established truths should be, this argument goes. If this would mean that ideas antithetical to democracy would become dominant, so be it; regulating speech because of the message it contains is an even bigger infraction on the ideals of a democracy, according to Holmes. Holmes’ dissent should not be read as an endorsement of communist ideas, but as recognition that any attempt to save democratic ideals by regulating speech that endorses antidemocratic ideals, is itself a violation of democratic values. In Whitney, the defendant had been convicted under the Criminal Syndicalism Act of California25 for helping a group engaged in proscribed activities.26 The high court upheld the conviction, arguing that a state may punish “those who abuse this freedom [of speech] by utterances inimical to the public welfare, tending to incite to crime, disturb the public peace, or endanger the foundations of organized government and threaten its overthrow by unlawful means, is not open

25

The California Act described “criminal syndicalism” as “any doctrine or precept advocating, teaching or aiding and abetting the commission of crime, sabotage (which word is hereby defined as meaning willful and malicious physical damage or injury to physical property), or unlawful acts of force and violence or unlawful methods of terrorism as a means of accomplishing a change in industrial ownership or control, or effecting any political change," and declares guilty of a felony any person who "organizes or assists in organizing, or is or knowingly becomes a member of, any organization, society, group or assemblage of persons organized or assembled to advocate, teach or aid and abet criminal syndicalism,…” (Ibid. at 359.) 26 Idem.

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

13

to question.”27 This was too broad according to Holmes, who joined Justice Brandeis in his concurring opinion.28

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Fear of serious injury cannot alone justify suppression of free speech and assembly. Men feared witches and burnt women. It is the function of speech to free men from the bondage of irrational fears. To justify suppression of free speech there must be reasonable ground to fear that serious evil will result if free speech is practiced. There must be reasonable ground to believe that the danger apprehended is imminent.29 Here again, one sees the reaffirmation of the belief that there needs to be an imminent danger, not a mere fear, of a serious evil occurring as the result of speech before it can be suppressed. Holmes’ dissent in Abrams argued that because truths can shift, we have no choice but to let the marketplace of ideas run its course. This dissent has a more positive tone than the Abrams’ dissent because it acknowledges the positives of free speech. Here, free speech is seen as empowering because it frees men from irrational fears. Brandeis’ reasoning in this dissent did not rest solely on the argument that the governments should not be entrusted with making decisions about which speech is worthy of protection, he also provided a positive argument by pointing out the value of free speech for an individual and by stating why the founding fathers had introduced the First Amendment to the constitution: They believed that freedom to think as you will and to speak as you think are means indispensable to the discovery and spread of political truth; that without free speech and assembly discussion would be futile; that with them, discussion affords ordinarily adequate protection against the dissemination of noxious doctrine; that the greatest menace to freedom is an inert people; that public discussion is a political duty; and that this should be a fundamental principle of the American government. They recognized the risks to which all human institutions are subject. But they knew that order cannot be 27

Ibid. at 371. They did not dissent because the defendant had fought the conviction on different grounds. 29 268 U.S. 652 at 376. 28

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Global Medium, Local Laws

14

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

secured merely through fear of punishment for its infraction; that it is hazardous to discourage thought, hope and imagination; that fear breeds repression; that repression breeds hate; that hate menaces stable government; that the path of safety lies in the opportunity to discuss freely supposed grievances and proposed remedies; and that the fitting remedy for evil counsels is good ones. Believing in the power of reason as applied through public discussion, they eschewed silence coerced by law-- the argument of force in its worst form. Recognizing the occasional tyrannies of governing majorities, they amended the Constitution so that free speech and assembly should be guaranteed.30 This observation reaffirms the value of the marketplace of ideas for the pursuit of political truth. More importantly, it also sees this ability to speak and debate freely as the quintessential exercise in democracy that should not be restricted as doing so would discourage “thought, hope and imagination.” It reflects a commitment to the ideal of human freedom, to which the right to express oneself is considered essential. This idea lies at the heart of the self-governance theory (see infra). In the context of hate speech, this may be a better argument not to ban, for example, racist speech, than the argument that we may think at one point in the future that racists were right all along.

Cold War cases A third set of cases dealing with incitement speech were the so called “Cold War” cases, a group of cases emerging during the cold war years dealing with prosecutions under the Smith Act.31 Throughout the 1930s and 1940s, the clear-and-present-danger-test had become wellestablished and was applied to a range of free speech issues in a way that marked a significant expansion of First Amendment protection.32 This trend was halted by the so-called “Cold War” cases. In 1951, in 30

Ibid. at 375-376. A federal version of the “criminal syndicalism” statutes, which were used to prosecute officials of the Communist Party. 32 Marc H. Greenberg, "A Return to Lilliput: The LICRA v. Yahoo! Case and the Regulation of Online Content in the World Market," 18 Berkeley Tech. L.J. 1191 (2003) at 201. 31

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

15

Dennis v. United States,33 the clear-and-present-danger test was applied in a case against eleven members of the Communist Party. The eleven had been convicted by a federal grand jury two years earlier for advocating the overthrow of the government. The men had been distributing fliers and had been teaching Marxist-Leninist doctrine. According to United States Court of Appeals for the Second Circuit, this necessarily implied the teaching of forcefully overthrowing the government.34 The Supreme Court, using a rather broad interpretation of the clear-and-present-danger-test, upheld their convictions in a 6-to2 vote:

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Certainly an attempt to overthrow the Government by force, even though doomed from the outset because of inadequate numbers or power of the revolutionists, is a sufficient evil for Congress to prevent. The damage which such attempts create both physically and politically to a nation makes it impossible to measure the validity in terms of the probability of success, or the imminence of a successful attempt.35 This application of the test stretched it to its limits. Speech that advocates an illegal action (overthrowing the government) that is unlikely to materialize and that is in no way “imminent” still failed to receive First Amendment protection under this interpretation of the clear-and-present-danger-test. This was a far cry from Holmes’ notion that all speech should be given its day in the free marketplace of ideas. In 1957, when the anti-communist sentiment had subsided somewhat, Yates v. United States36 reached the Supreme Court. Here, the court reversed the conviction of several lower echelon members of the Communist Party who had been convicted under the Smith Act.37 33

341 U.S. 494 (1951). Ibid. at 498. 35 Ibid. at 509. 36 354 U.S. 298 (1957). 37 The Smith Act made it a criminal offense “(1) for anyone to knowingly or willfully advocate, abet, advise, or teach the duty, necessity, desirability, or propriety of overthrowing the Government of the United States or of any State by force or violence, (2) with the intent to cause the overthrow or destruction of any government in the United States, to print, publish, edit, issue, circulate, sell, distribute, or publicly display any written or printed matter advocating, 34

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

16

Global Medium, Local Laws

Justice John Marshall Harlan II, who wrote the majority opinion, argued (among other points) that the trial court judge had failed to inform the jury that the defendants’ advocacy had to be of illegal action, not merely espousing doctrine.38 In his dissent, Justice Black argued for a more stringent protection of speech, even if that speech advocates illegal action: “I believe that the First Amendment forbids Congress to punish people for talking about public affairs, whether or not such discussion incites to action, legal or illegal.”39 These two cases did little to clarify the scope and interpretation of the clear-andpresent-danger-test, which seemed to be interpreted based on the dominant feelings of the day. Whichever opinion happened to be unpopular, could be placed outside the scope of the First Amendment by this liberal interpretation of the test. Brandenburg v. Ohio40 would end this state of confusion.

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Brandenburg v. Ohio The 1969 Supreme Court decision in Brandenburg marked the next occasion on which the high court would rule on the issue of prohibiting advocacy of lawless action. In Brandenburg, the court had to decide whether or not to uphold the conviction of a Klan leader under an Ohio Criminal Syndicalism statute for "advocat[ing] . . . the duty, necessity, or propriety of crime, sabotage, violence, or unlawful methods of terrorism as a means of accomplishing industrial or political reform" and for "voluntarily assembl[ing] with any society, group or assemblage of persons formed to teach or advocate the doctrines of criminal syndicalism."41 Clarence Brandenburg had organized a Ku Klux Klan rally during which a cross had been burned and derogatory

advising, or teaching the duty, necessity, desirability, or propriety of overthrowing or destroying any government in the United States by force or violence (3) to organize or help to organize any society, group, or assembly of persons who teach, advocate, or encourage the overthrow or destruction of any government in the United States by force or violence; or to be or become a member of, or affiliate with, any such society, group, or assembly of persons, knowing the purposes thereof.” (Ibid. at footnote 1.) 38 Ibid. at 321-322. 39 Ibid. at 340. 40 395 U.S. 444 (1969). 41 Ibid. at 444-445.

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

17

statements towards Jews and blacks had been made.42 Besides the Klansmen, a television reporter and a cameraman were also present. Their footage showed Brandenburg giving a speech that went as follows:

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

This is an organizers' meeting. We have had quite a few members here today which are -- we have hundreds, hundreds of members throughout the State of Ohio. I can quote from a newspaper clipping from the Columbus, Ohio Dispatch, five weeks ago Sunday morning. The Klan has more members in the State of Ohio than does any other organization. We're not a revengent organization, but if our President, our Congress, our Supreme Court, continues to suppress the white, Caucasian race, it's possible that there might have to be some revengeance taken.43 The court reversed Brandenburg’s conviction. It began its discussion by referring to Whitney, stating that later decisions (the court only cited Dennis) had discredited the holding in Whitney and had “fashioned the principle that the constitutional guarantees of free speech and free press do not permit a State to forbid or proscribe advocacy of the use of force or of law violation except where such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action.”44 This is a somewhat strange interpretation of that case. Dennis does not seem to suggest such a broad protection of advocacy speech. In Dennis, the court had upheld a conviction of defendants whose advocated revolution was neither imminent nor very likely to occur. The majority opinion in Brandenburg seems to rely more on the Holmes and Brandeis dissents from these previous cases than on the majority opinions.45 But despite this shaky reliance on precedent, the standard established by the Brandenburg decision would take firm root in the following decades 42

Ibid. at 446. Idem. 44 Ibid. at 447. 45 William B. Fisch, "American Law in a Time of Global Interdependence: U.S. National Reports to the XVIth International Congress of Comparative Law: Section IV Hate Speech in the Constitutional Law of the United States," 50 Am. J. Comp. L. 463 (2002) at 475. 43

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

18

Global Medium, Local Laws

and provide a strong First Amendment protection for speech advocating lawless action. Brandenburg requires a very direct link between speech and its effects before it can be banned, leaving little room for conjecture about possible pernicious consequences as a reason for restricting speech. Brandenburg is a speech-friendly standard, as it requires a very strong connection between speech and illegal action. The potential for the clear-and-present-danger-test to prosecute hate speech is therefore rather limited. It sets a very high standard for speech to be considered incitement. Abstract threats, and certainly insults or general statements of hate cannot be banned under Brandenburg. Moreover, it is a standard that focuses on effects and circumstances of speech, not so much on its content or the viewpoint it expresses. It can just as well be used to ban speech that is spoken during an anti-racism march as during a Ku Klux Klan rally. One could argue that the Brandenburg ruling leaves more opportunity for banning hate speech originating from groups who have a history of violence, such as the Ku Klux Klan, certain anti-abortion groups or radical animal rights groups, since it is more likely that violence will erupt following incitement speech from those groups. This argument will be discussed in more detail in chapter two. As this overview shows, the government has been relatively successful in banning speech that was inimical to the interests of, to use a Marxist term, the “power elite.” The courts used very loose standards to punish communist speech, but when the time came to punish racist speech (in Brandenburg), a stricter standard emerged. Critics of the First Amendment doctrine have pointed out that the First Amendment has maintained the status quo in society’s power relations and defended the interest of the elites against those challenging this status quo. The development of incitement law in the twentieth century does little to dispel this interpretation.

Fighting Words One could argue that some expressions, such as racially-based insults, cause such an immediate and grave harm to the person against whom they are directed that they ought to be outside the scope of the First Amendment. The Supreme Court has recognized that certain types of speech can indeed prompt a man to fight, and that these expressions are

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Hate Speech in American Law

19

not protected speech. This seems to leave some room for restricting certain types of hate speech, about which one could argue that they are extremely hurtful and may have this effect. For example, if one could demonstrate that racial or homophobic slurs have this effect, these kinds of slurs could be banned under the fighting words doctrine. However, the way the fighting words doctrine has been interpreted by the Supreme Court does not seem to leave room for such an interpretation. In 1942, the Supreme Court ruled in Chaplinsky v. New Hampshire46 that “fighting words” constitute a form of speech that is not protected by the First Amendment. Walter Chaplinsky was a Jehovah’s Witness who had been convicted under a state law making it illegal in public places to “address any offensive, derisive or annoying word to any other person… nor call him by any offensive or derisive name, nor make any noise or exclamation in his presence and hearing with intent to deride, offend or annoy him, or to prevent him from pursuing his lawful business or occupation.”47 Chaplinsky had created a public disturbance and as he was led away by a police officer ran into the City Marshal at whom he yelled: “You are a God damned racketeer” and “a damned Fascist and the whole government of Rochester are Fascists or agents of Fascists.”48 The court agreed with the lower courts that the statute under which Chaplinsky had been convicted did not violate the First Amendment because it applied only to fighting words, speech that has a “direct tendency to cause acts of violence by the persons to whom, individually, the remark is addressed.”49 The test established by the municipal court was “what men of common intelligence would understand would be words likely to cause an average addressee to fight.”50 The Supreme Court, in an opinion written by Justice Frank Murphy, agreed with this analysis and put fighting words squarely in the category of forms of expression that do not deserve First Amendment protection.

46

315 U.S. 568 (1942). Ibid. at 569. 48 Idem. 49 Ibid. at 573. 50 Idem. 47

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

20

Global Medium, Local Laws

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

There are certain well-defined and narrowly limited classes of speech, the prevention and punishment of which have never been thought to raise any Constitutional problem. These include the lewd and obscene, the profane, the libelous, and the insulting or "fighting" words -- those which by their very utterance inflict injury or tend to incite an immediate breach of the peace. It has been well observed that such utterances are no essential part of any exposition of ideas, and are of such slight social value as a step to truth that any benefit that may be derived from them is clearly outweighed by the social interest in order and morality.51 Reading this statement, one would think that epithets directed at minorities or at members of certain ethnic or religious groups, or other groups based upon common features that are crucial to the members’ identities could be labeled “fighting words,” because it can be argued that these would be particularly harmful since they attack the very essence and identity of a person. However, subsequent jurisprudence has weakened the potential of the fighting words doctrine to be used this way, and the court has rejected the notion that words that inflict injury without inciting violence can be suppressed.52 In Terminiello v. Chicago,53 the Supreme Court ruled that the fighting words exception could not be applied to speech that “stirs public anger” or “invites dispute,” as the lower court had instructed the jury in a case where the defendant was convicted under a city breach of the peace ordinance for yelling abuse at a crowd. The Supreme Court reversed the conviction stating that a function of free speech under our system of government is to invite dispute. It may indeed best serve its high purpose when it induces a condition of unrest, creates dissatisfaction with conditions as they are, or even stirs people to anger. Speech is often provocative and challenging. It may strike at prejudices and preconceptions and have profound unsettling effects as it presses for acceptance of an idea. That is why freedom of 51

Ibid. at 571-572. Calvin R. Massey, "Hate Speech, Cultural Diversity, and the Foundational Paradigms of Free Expression," 40 UCLA Law Review 103 (1992) at 163. 53 337 US 1 (1949). 52

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

21

speech, though not absolute … is nevertheless protected against censorship or punishment, unless shown likely to produce a clear and present danger of a serious substantive evil that rises far above public inconvenience, annoyance, or unrest.”54

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

In Cohen v. California,55 the court reversed the conviction of a man for wearing a jacket with the words “fuck the draft” on it in a Los Angeles courthouse as a protest against the Vietnam War. The statute he was convicted under prohibited “maliciously and willfully disturbing the peace ... by tumultuous or offensive conduct.”56 The court ruled that the message on the jacket was speech rather than conduct, and also found that the expression did not fall in the fighting words category because it was not a direct personal insult. The court warned about setting the standard too low for barring speech that might offend some people: Surely the State has no right to cleanse public debate to the point where it is grammatically palatable to the most squeamish among us. Yet no readily ascertainable general principle exists for stopping short of that result were we to affirm the judgment below. For, while the particular four-letter word being litigated here is perhaps more distasteful than most others of its genre, it is nevertheless often true that one man's vulgarity is another's lyric. Indeed, we think it is largely because governmental officials cannot make principled distinctions in this area that the Constitution leaves matters of taste and style so largely to the individual.57 Up until this point, there had been a lack of clarity in the fighting words doctrine: how is the likelihood of violence to be determined “by the average addressee?” Was there a fixed set of fighting words of which we can assume that they will trigger a violent reaction or do the courts need to take a contextual approach and consider whether or not 54

54 Ibid. at 4.
55 403 U.S. 15 (1971).
56 Ibid. at 16.
57 Ibid. at 25.


the circumstances under which the speech is uttered will make a violent reaction likely?58 Gooding v. Wilson59 provided some clarity on this issue. Johnny Wilson had been convicted of hurling racial epithets and other insults at white police officers when he and a group of protesters who were trying to block the entry to an Army building were removed from a rally. He was convicted under a code that read: “Any person who shall, without provocation, use to or of another, and in his presence . . . opprobrious words or abusive language, tending to cause a breach of the peace … shall be guilty of a misdemeanor.”60 Wilson challenged the statute on the grounds that it was facially overbroad. The court agreed, noting that the adjectives “opprobrious” or “abusive” would cover much more speech than “fighting words,” words that “have a direct tendency to cause acts of violence by the person to whom, individually, the remark is addressed.”61 The Supreme Court criticized the lower courts’ interpretation of fighting words as a category of words that are inherently likely to provoke violence62 and its application of the statute to speech where there was no likelihood “that the person addressed would make an immediate violent response.”63 The Supreme Court rejected a categorical content-based interpretation of fighting words in which certain words would be labeled as fighting words in favor of a more effects-based approach that focused on the likelihood of a violent response (and not on the emotional harm inflicted upon the addressee of the speech). These cases made clear that the fighting words exception could only be invoked in instances where the speech has a direct tendency to provoke acts of violence by the person to whom the remarks are directed and not merely that they inflict injury, and only if there is a likelihood that that this person will retaliate in a violent way. The same expressions can be “fighting words” or not, depending upon the circumstances. This stress on the impact of the fighting words on 58

58 Michael J. Mannheimer, "The Fighting Words Doctrine," 93 Colum. L. Rev. 1527 (1993) at 1540.
59 405 U.S. 518 (1972).
60 Ibid. at 519.
61 Ibid. at 524.
62 Ibid. at 541.
63 Ibid. at 528.


public order (violent response that causes a breach of the peace) rather than their effect on the individual (inflicting injury) greatly reduces the potential of fighting words to be used in the fight against hate speech. By including words that by "their very utterance" inflict injury, the initial formulation of the fighting words doctrine did seem to leave some room to limit certain types of targeted hate speech. Following this standard, it would not have been inconceivable that a number of racist, sexist or otherwise hateful expressions directed at someone in particular could be regulated on the basis that they attack someone's identity and inflict injury. However, the words-that-by-their-very-utterance-inflict-injury requirement would be abandoned by the Supreme Court when it started to focus on the other part of the fighting words doctrine, which refers to words that tend to incite an immediate breach of the peace. The post-Chaplinsky cases indicated that the court was not eager to apply the fighting words exception lightly.

Group Libel


In Beauharnais v. Illinois,64 the defendant challenged an Illinois statute under which he had been convicted which made it unlawful to manufacture, sell, or offer for sale, advertise or publish, present or exhibit in any public place in this state any lithograph, moving picture, play, drama or sketch, which publication or exhibition portrays depravity, criminality, unchastity, or lack of virtue of a class of citizens, of any race, color, creed or religion which said publication or exhibition exposes the citizens of any race, color, creed or religion to contempt, derision, or obloquy or which is productive of breach of the peace or riots. . . .65 Beauharnais had distributed a leaflet calling upon the mayor of Chicago “to halt the further encroachment, harassment and invasion of white people, their property, neighborhoods and persons, by the Negro…”66 and upon the “white people” to unite. “If persuasion and 64

64 343 U.S. 250 (1952).
65 Ibid. at 251.
66 Idem.


the need to prevent the white race from becoming mongrelized by the negro will not unite us, then the aggressions . . . rapes, robberies, knives, guns and marijuana of the negro, surely will,” he warned in his publication.67 The trial court and the Illinois Supreme Court treated the statute as a criminal libel law,68 which meant that the truth of the statements could only be entered as a defense if these statements were made “with good motives and for justifiable ends.”69 The Illinois courts (followed by the Supreme Court) ruled that since the speech was not uttered for justifiable ends, the truth of Beauharnais’ statements could not be entered as a defense.70 Criminal libel, as opposed to civil libel, is not concerned with “the injury of the victim’s reputation, but on the tendency of libel to cause a breach of the peace.”71 The Illinois Supreme Court had ruled that the text in Beauharnais’ publication amounted to fighting words “liable to cause violence and disorder between the races”72 and therefore could not be seen as being uttered for justifiable ends, denying Beauharnais a truth-based defense. The Supreme Court upheld Beauharnais’ conviction in a 5-to-4 decision, stating that libelous statements directed at groups are denied First Amendment protection.73 In its reasoning, the Supreme Court cited the volatile climate and race riots that plagued Chicago, and argued that if the state legislature thought that these were the results of the speech banned by the statute, that it was not up to the Supreme Court to second guess the legislature.74 67

67 Idem.
68 Ibid. at 253.
69 People v. Beauharnais, 97 N.E.2d 343 (Ill. 1951) at 346.
70 Ibid. at 347.
71 William B. Fisch, "American Law in a Time of Global Interdependence: U.S. National Reports to the XVIth International Congress of Comparative Law: Section IV Hate Speech in the Constitutional Law of the United States," 50 Am. J. Comp. L. 463 (2002) at 479.
72 97 N.E.2d 343 at 346.
73 "But if an utterance directed at an individual may be the object of criminal sanctions, we cannot deny to a State power to punish the same utterance directed at a defined group, unless we can say that this is a willful and purposeless restriction unrelated to the peace and well-being of the State." (343 U.S. 250 at 258.)
74 Ibid. at 261-262.


Justice Hugo Black dissented, arguing that criminal libel could not be expanded to groups,75 and that doing so restricted speech on matters of public concern.76 He also pointed out that this ruling might very well come back to haunt the civil rights movement.77 Justice William Douglas made a similar argument in his dissent: “Today a white man stands convicted for protesting in unseemly language against our decisions invalidating restrictive covenants. Tomorrow a Negro will be haled before a court for denouncing lynch law in heated terms.”78 Douglas pointed out that speech and debate often can be passionate and emotional, and that hotheads “blow off and release destructive energy in the process. They shout and rave, exaggerating weaknesses, magnifying error, viewing with alarm.”79 However, this does not mean that we need to restrict this speech, according to Douglas: “The Framers of the Constitution knew human nature as well as we do. They too had lived in dangerous days; they too knew the suffocating influence of orthodoxy and standardized thought. They weighed the compulsions for restrained speech and thought against the abuses of liberty. They chose liberty.”80 At first sight, group libel seems to provide the most encompassing legal framework to curb instances of hate speech, because it could target public speech and not only face-to-face insults, as is the case with the fighting words doctrine. It is also more likely to target speech based on its content, for example, if it exposes people of a certain creed 75

75 Ibid. at 273.
76 Ibid. at 267-269.
77 "No rationalization on a purely legal level can conceal the fact that state laws like this one present a constant overhanging threat to freedom of speech, press and religion. Today Beauharnais is punished for publicly expressing strong views in favor of segregation. Ironically enough, Beauharnais, convicted of crime in Chicago, would probably be given a hero's reception in many other localities, if not in some parts of Chicago itself. Moreover, the same kind of state law that makes Beauharnais a criminal for advocating segregation in Illinois can be utilized to send people to jail in other states for advocating equality and nonsegregation. What Beauharnais said in his leaflet is mild compared with usual arguments on both sides of racial controversies." (Ibid. at 274.)
78 Ibid. at 286.
79 Ibid. at 287.
80 Idem.


to contempt. However, the effect of the speech and the context in which it takes place remain important considerations. Ultimately, group libel is aimed at curbing certain effects of speech; in this particular case, it was used to prevent the creation of a volatile atmosphere conducive to race riots. Nevertheless, even within these parameters, the group libel doctrine seems to allow restrictions of hate speech as a policy to fulfill certain societal goals, such as reducing racial tensions. However, developments since Beauharnais have greatly reduced the validity of criminal libel law and, by extension, group libel. The section below discusses these blows to Beauharnais in more depth.


Developments in libel law

Most observers and courts seem to agree that Beauharnais, though never overruled, is no longer good law.81 The clear-and-present-danger test established in Brandenburg made it much harder to restrict speech that is likely to cause a breach of the peace. The Brandenburg test requires that the nexus between action and advocacy be intimate.82 Although Beauharnais deals with group libel and not advocacy of illegal action, criminal libel is also founded on "the tendency of libel to cause a breach of the peace." Accordingly, Brandenburg is germane to the interpretation of Beauharnais. In Beauharnais, the court ruled that if the legislature was convinced that banning certain kinds of speech would prevent race riots, it was its prerogative to do so. In Brandenburg, however, the court took a much more speech-protective stance, one that has been followed since, setting high standards regarding the likelihood and imminence of the lawless action. This makes it unlikely that the publication at issue in Beauharnais could now be constitutionally restricted by a similar statute.83 Garrison v. Louisiana84 has also been cited as a case limiting Beauharnais.85 Garrison established that in matters of public concern,

81 See for example: Deborah Schwartz, "A First Amendment Justification for Regulating Racist Speech on Campus," 40 Case W. Res. 733 (1990) at 725; and Michael J. Polelle, "Racial and Ethnic Group Defamation: A Speech-Friendly Proposal," 23 Boston College Third World Law Journal 213 (2003) at 253-254.
82 Martha T. Zingo, Sex/Gender Outsiders, Hate Speech, and Freedom of Expression: Can They Say That about Me? (1998) at 21.
83 Kent Greenawalt, Speech, Crime, and the Uses of Language (1989) at 295.
84 379 U.S. 64 (1964).


truth should be allowed as a defense in criminal libel cases. In Beauharnais, the truth of the statements made by the defendant could not be used as a defense because the court ruled that the statements were not "published with good motives and for justifiable ends." Garrison established that truth must be a defense in criminal libel cases.86 It has also been argued that Garrison should be interpreted as an acknowledgement by the Supreme Court that criminal libel law is contrary to modern law, as many states repealed their criminal libel statutes after this decision87 and after the Supreme Court decision in New York Times v. Sullivan.88 The Sullivan case undercut Beauharnais by granting increased protection to libelous speech.89 It established that libelous statements about public figures were only outside the scope of the First Amendment if they were published with actual malice, i.e., with knowledge of their falsity or with a reckless disregard for whether or not the statements were true. Sullivan also established that the burden of proof rested with the plaintiff. If the plaintiff is not a public figure, but the libel concerns a matter of public interest, the plaintiff must prove negligence on the part of the defendant.90 Sullivan is mentioned as weakening Beauharnais because Sullivan is based upon the notion that the debate on public issues should be uninhibited, robust and wide-open, even at the expense of some people's reputation. It indicates that

85 William B. Fisch, "American Law in a Time of Global Interdependence: U.S. National Reports to the XVIth International Congress of Comparative Law: Section IV Hate Speech in the Constitutional Law of the United States," 50 Am. J. Comp. L. 463 (2002) at 479.
86 Garrison dealt with criticism of a public official, though, which arguably would require a more speech-friendly approach than the speech at issue in Beauharnais, which was derogatory to a group of unnamed individuals, not public officials.
87 Dan Bischof, "Criminal Defamation Laws are 19th Century Holdover," The News Media and the Law 25(2), Spring 2001, p. 15.
88 376 U.S. 254 (1964).
89 Deborah Schwartz, "A First Amendment Justification for Regulating Racist Speech on Campus," 40 Case W. Res. 733 (1990) at 753.
90 William B. Fisch, "American Law in a Time of Global Interdependence: U.S. National Reports to the XVIth International Congress of Comparative Law: Section IV Hate Speech in the Constitutional Law of the United States," 50 Am. J. Comp. L. 463 (2002) at 481.


libel is no longer put squarely outside the realm of protected speech, thus limiting Beauharnais, where the court assumed that libel was totally devoid of First Amendment protection. Also, the speech at issue in cases such as Beauharnais or in other hate speech statutes is hardly speech whose truth or falsity can easily be ascertained. Statements that the "Negro race" has a "lack of virtue" or is "depraved" or "criminal" are not statements whose truth or falsity can easily be established in a court of law. Also, an assertion of fact cannot always be treated equally when made about an individual or about an entire group.91 When you say that an individual uses drugs, this statement can be verified empirically, but it is much harder to prove or disprove that a certain behavior is characteristic of a whole group of people. In this case, we deal more with matters of opinion and not with easily ascertainable and verifiable facts. To verify these statements one would need to be able to capture the identity of a group, which is a highly subjective and evaluative process.92 Despite these criticisms, some scholars have argued that Beauharnais is still viable and that those who question the vitality of Beauharnais are "analyzing the case with a single-minded tunnel vision; reports of its death are greatly exaggerated."93

Campus speech and the void for vagueness doctrine

As indicated before, hate speech is hard to define. A complete definition includes many dimensions, making it complex and perhaps hard to understand. It is difficult to come up with a clear-cut definition, even if one aims to ban only certain kinds of hate speech. This semantic problem has legal consequences, because unclearly defined speech regulations risk being found void for vagueness, which constitutes grounds for a facial challenge to a law.94 A law is considered to be void for vagueness on its face if it is so vague that

91 Calvin R. Massey, "Hate Speech, Cultural Diversity, and the Foundational Paradigms of Free Expression," 40 UCLA Law Review 103 (1992) at 143.
92 Idem.
93 Kenneth Lasson, "To Stimulate, Provoke, or Incite: Hate Speech and the First Amendment," in Group Defamation and Freedom of Speech (1995) at 274.
94 Amanda J. Congdon, "Burned Out: The Supreme Court Strikes Down Virginia's Cross Burning Statute in Virginia v. Black," 35 Loy. U. Chi. L.J. 1049 (2004) at 1062.


persons “of common intelligence must necessarily guess at its meaning and differ as to its application.”95 A number of campus speech codes were struck down under this doctrine, cases that serve as useful illustrations of how difficult it might be to draft a speech code that could pass constitutional muster. Although these campus speech codes were not based on group libel law, they serve as good examples of some of the other constitutional problems that arise when trying to curb hate speech under American law.

Doe v. University of Michigan

In Doe v. University of Michigan,96 a campus policy on discrimination and discriminatory harassment at the University of Michigan was found to be void for vagueness. The policy's section on racially motivated hate speech read as follows:


Any behavior, verbal or physical, that stigmatizes or victimizes an individual on the basis of race, ethnicity, religion, sex, sexual orientation, creed, national origin, ancestry, age, marital status, handicap or Vietnam-era veteran status, and that

a. Involves an express or implied threat to an individual's academic efforts, employment, participation in University sponsored extra-curricular activities or personal safety; or

b. Has the purpose or reasonably foreseeable effect of interfering with an individual's academic efforts, employment, participation in University sponsored extra-curricular activities or personal safety; or

c. Creates an intimidating, hostile, or demeaning environment for educational pursuits, employment or participation in


95 Connally v. General Construction Co., 269 U.S. 385 (1926) at 391.
96 721 F. Supp. 852 (E.D. Mich. 1989).


University sponsored extra-curricular activities.97 [This section would later be scrapped.]98


The policy was challenged by a graduate student in the field of biopsychology,99 who feared that certain theories from his field of study might be perceived as “racist” or “sexist” under the policy since the code applied to the whole University, including classroom buildings. He requested that the policy be found unconstitutional because of overbreadth and void for vagueness. The United States District Court for the Eastern District of Michigan agreed, arguing that the words “stigmatize” and “victimize” are general terms that elude precise definition,100 and that speech that victimizes or stigmatizes someone is not stripped of First Amendment protection.101 The court also ruled that other terms in the statute were ill-defined, for example the demand that a stigmatizing or victimizing comment is sanctionable if it “has the purpose or reasonably foreseeable effect of interfering with an individual's academic efforts” or if it “involve[s] an express or implied threat to an individual's academic efforts, employment, participation in University sponsored extra-curricular activities or personal safety” which the court found did not provide guidance.102

The UWM Post, Inc. v. Board of Regents of University of Wisconsin System

A similar policy enacted at the University of Wisconsin also did not withstand a challenge in a court of law.103 The code was crafted in the wake of Doe v. Michigan and was superior in scope and clarity to the Michigan code, as it tried to avoid the pitfalls that sank the earlier


97 Ibid. at 856.
98 Idem.
99 The study of the biological foundations of personality traits and mental abilities.
100 Ibid. at 867.
101 Idem.
102 Idem.
103 The UWM Post, Inc. v. Board of Regents of University of Wisconsin System, 774 F. Supp. 1163 (1991).


speech code.104 The policy banned "discriminatory comments, epithets or other expressive behavior" that would

(1) Be racist or discriminatory;

(2) Be directed at an individual;

(3) Demean the race, sex, religion, color, creed, disability, sexual orientation, national origin, ancestry or age of the individual addressed; and


(4) Create an intimidating, hostile or demeaning environment for education, university-related work, or other university-authorized activity.105

The court found that this policy was overly broad because it did not merely exclude fighting words, but also banned speech that was not likely to provoke a violent response.106 It rejected, however, arguments that the terms "discriminatory comments, epithets or other expressive behavior" and "demean" were too vague and rendered the policy unconstitutional.107 This was a step forward from Doe v. Michigan. Though this policy seems to have been drafted to withstand a challenge on void-for-vagueness grounds, the court found that the policy was unclear as to whether a speaker's speech must actually demean a listener and create a hostile environment for education or whether it suffices that this was the speaker's intent.108 This objection might have been addressed easily by some minor changes in the speech code. The regents of the university considered drafting a revised policy that would withstand a constitutional challenge but gave up on this endeavor after the Supreme Court ruled in R.A.V. v. St. Paul (see infra).109

104 Matthew Silversten, "What's Next for Wayne Dick? The Next Phase of the Debate Over College Hate Speech Codes," 61 Ohio St. L.J. 1247 (2000) at 1247.
105 774 F. Supp. 1163 at 1166.
106 Ibid. at 1169-1173.
107 Ibid. at 1178-1181.
108 Ibid. at 1180-1181.
109 Lawrence Friedman, "Regulating Hate Speech at Public Universities After R.A.V. v. City of St. Paul," 37 How. L. J. 1 (1993) at 2.





Collin v. Smith

In 1977, Frank Collin, leader of the National Socialist Party of America (NSPA), a neo-Nazi organization of sorts (it promoted, among other things, a "final solution to the Jewish question," a national eugenics commission to ensure racial purity, prosecution of individuals for race-mixing activities, and many other racist beliefs110), wanted to hold a rally in Skokie, Illinois. Skokie, a North Chicago suburb of 70,000, counted a large number of Jewish people, many of them Holocaust survivors, among its citizens.111 Given Collin's provocative statements about the Holocaust,112 and given the fact that members of the NSPA wore uniforms that looked like those worn by members of the Nazi party during the war, including the display of a Swastika,113 it is safe to assume that this choice of venue was not random. The views of Collin were extremely racist by any standard. If one were to ask a proponent of hate speech regulation to come up with an example of speech that is blatantly racist and hateful in order to justify restrictions of hate speech, it would be hard to come up with an example that better fits the case than the speech of Frank Collin. It was only after the city of Chicago had turned down his request to hold a rally114 that Collin turned his attention to suburbs north of Chicago with large Jewish populations, because, according to Collin, "Where one finds the most Jews, there will be the most Jew haters."115 In preparation, he distributed thousands of fliers showing a Swastika reaching out to throttle a stereotypically depicted Eastern European Jew

110 See: Philippa Strum, When the Nazis Came to Skokie: Freedom for Speech We Hate (1999) at 13.
111 578 F.2d 1197 (7th Cir. 1978) at n.2.
112 For example, he had been recorded as saying that the unfortunate thing about the Holocaust was not that six million Jews had died, but that so many had survived. Philippa Strum, When the Nazis Came to Skokie: Freedom for Speech We Hate (1999) at 15.
113 578 F.2d 1197 at 1198.
114 Collin had first sought a permit to hold a rally in a public place in Chicago, but the Chicago Park District, attempting to keep the NSPA out of the streets, revived an old ordinance requiring the NSPA to post public liability and property damage insurance for an estimated total of $250,000. (Philippa Strum, When the Nazis Came to Skokie: Freedom for Speech We Hate (1999) at 15.)
115 Idem.





under the headline "WE ARE COMING."116 The mayor and village council initially wanted to approve the march, which was to be held on the sidewalks outside the village hall,117 but after pressure and protest from the Jewish community, the city filed for and obtained an injunction from a state court to keep the demonstration from taking place. When Collin's lawyer appealed the injunction in the spring of 1977, the village enacted three ordinances that were designed to prevent the NSPA demonstration in case the injunction were to be lifted. The constitutionality of these ordinances would become the issue in Collin v. Smith,118 a case that has been credited with having "pulled the rug from under Beauharnais."119

Village Ordinance No. 77-5-N-994 (994) required parades and public assemblies seeking a permit to obtain $350,000 in public liability and damage insurance and a finding by the appropriate officials that the assembly "will not portray criminality, depravity or lack of virtue in, or incite violence, hatred, abuse or hostility toward a person or group of persons by reason of reference to religious, racial, ethnic, national or regional affiliation."120 It also stipulated that the Board of Trustees of the Village could waive any provision of the ordinance.

Village Ordinance No. 77-5-N-995 (995) prohibited "the dissemination of any materials within the Village of Skokie which promotes and incites hatred against persons by reason of their race, national origin, or religion, and is intended to do so."121

Village Ordinance No. 77-5-N-996 (996) prohibited public demonstrations by members of political parties while wearing "military-style" uniforms.122

The U.S. Court of Appeals for the Seventh Circuit, following the United States District Court for the Northern District of Illinois, struck

116 Idem.
117 Sometimes Collin's intended demonstration is described as a "march through Skokie," which is not accurate; the proposed demonstration was confined to the sidewalks outside the village hall for about 30 minutes, and was limited to 30 to 50 participants. (Ibid. at 16-17.)
118 578 F.2d 1197.
119 Rachel Weintraub-Reiter, "Hate Speech over the Internet, a Traditional Constitutional Analysis or a New Cyber-Constitution?" 8 B.U. Pub. Int. L.J. 145 (1998) at 153.
120 578 F.2d 1197 at 1199.
121 Idem.
122 Ibid. at 1200.





down these ordinances. In its decision, the court addressed several of the key issues of the hate speech debate. Therefore, this decision serves as a good illustration of the American approach towards hate speech regulation. Ordinance 995 looks similar to the one at issue in Beauharnais,123 but was struck down on overbreadth grounds because, unlike the ordinance in Beauharnais, it did not rely on the tendency of certain utterances to cause violence and disorder.124 The court addressed the argument that certain speech can be banned because it inflicts psychic trauma by pointing out that the tort of intentional infliction of severe emotional distress might include personally directed racial slurs, but that this does not justify banning constitutionally protected speech in anticipation of such results.125 This is consistent with the prior restraint doctrine, the notion that "a free society prefers to punish the few who abuse rights of speech after they break the law than to throttle them and all others beforehand."126 Ordinance 994 was struck down on overbreadth and prior restraint grounds as well.127 (The court also confirmed that the insurance requirement that could be waived by city officials was unconstitutional because it would lead to a situation in which controversial speech that is likely to create uproar would be less likely to be relieved of the insurance requirement.128) The ordinance did not have an imminent-danger-of-violence requirement, putting it outside the scope of Brandenburg.129 The fighting words doctrine also did not apply because Chaplinsky only applies to "words with a direct tendency to

“to manufacture, sell, or offer for sale, advertise or publish, present or exhibit in any public place in this state any lithograph, moving picture, play, drama or sketch, which publication or exhibition portrays depravity, criminality, unchastity, or lack of virtue of a class of citizens, of any race, color, creed or religion which said publication or exhibition exposes the citizens of any race, color, creed or religion to contempt, derision, or obloquy or which is productive of breach of the peace or riots. . . ." (343 U.S. 250 at 251.) 124 578 F.2d 1197 at 1204. 125 Ibid. at 1206. 126 Ibid. at 1207. 127 Idem. 128 Ibid. at 1208. 129 Ibid. at 1204.





cause violence by the persons to whom, individually, the words were addressed,"130 which is not the case in a public manifestation, when the words are not directed at anyone in particular. The court, citing Gertz v. Robert Welch, Inc.,131 also reiterated the principle that under the First Amendment there is no such thing as a false idea, rejecting the argument that Collin's speech did not deserve protection because there is no constitutional value in false statements of fact. Falsehoods cannot be corrected by judges and juries, but by other ideas, according to this view: "If any philosophy should be regarded as completely unacceptable to civilized society, that of plaintiffs, who, while disavowing on the witness stand any advocacy of genocide, have nevertheless deliberately identified themselves with a regime whose record of brutality and barbarism is unmatched in modern history, would be a good place to start. But there can be no legitimate start down such a road."132 As stated above, the speech Collin engaged in might be the best available argument for a content-based restriction of hate speech beyond incitement and fighting words, but the best case available simply was not good enough according to the court. The city's expert psychiatric witness claimed that the mere fact that self-proclaimed neo-Nazis marched through a Jewish community, through Jewish "turf," would cause a shock effect in that community.133 The court ruled, however, that while there might indeed be stricter restrictions on speech when there is a "captive audience," this is only true when the speaker intrudes upon the privacy of the home.134 The village of Skokie had already conceded that ordinance 996 was unconstitutional, so the court of appeals did not discuss it in depth. At the district level, the judge had ruled that the justifications for this ordinance, namely that the wearing of military-style uniforms is repugnant to the tradition of civilian control of government and to local standards of morality and decency, were insufficient grounds to restrict speech.135


130 Ibid. at 1204.
131 418 U.S. 323 (1974) at 340.
132 Ibid. at 1203.
133 Ibid. at 1206.
134 Ibid. at 1206-1207.
135 447 F. Supp. 676 (N.D. Ill. 1978) at 701.




Cross Burning Cases

The most recent Supreme Court decisions relevant to the hate speech issue both dealt with cross burnings. Although these cases could be discussed in the context of the fighting words doctrine, they are treated in a separate section here because of their prominence and recency.


R.A.V. v. St. Paul136

In R.A.V. v. St. Paul, the court was asked to rule on the constitutionality of a city ordinance under which a young man had been charged with burning a cross in the yard of a black family.137 The St. Paul Bias-Motivated Crime Ordinance stated that "Whoever places on public or private property a symbol, object, appellation, characterization or graffiti, including, but not limited to, a burning cross or Nazi swastika, which one knows or has reasonable grounds to know arouses anger, alarm or resentment in others on the basis of race, color, creed, religion or gender commits disorderly conduct and shall be guilty of a misdemeanor."138 R.A.V. challenged the ordinance, arguing that it was overbroad and constituted an unconstitutional content-based restriction of speech. The trial court granted this motion to dismiss, but the Minnesota Supreme Court reversed, arguing that the ordinance banned only fighting words and that the content-based restriction of speech was permissible in this case, since the restriction was narrowly tailored to accomplish a compelling government interest.139 The United States Supreme Court reversed, and even though all nine Justices agreed that the ordinance was unconstitutional, they used two very distinct rationales to reach that conclusion. The majority opinion, delivered by Justice Antonin Scalia for five justices, agreed with the Minnesota Supreme Court that this ordinance banned only fighting words, but ruled that the ordinance nevertheless was unconstitutional because it banned certain types of fighting words (insults based on race, color,

136 505 U.S. 377 (1992).
137 Ibid. at 379.
138 Idem.
139 Idem.





creed, religion, or gender), while not banning others, making this an unconstitutional content-based restriction. Scalia argued, among other points, that it is the non-speech aspect of fighting words, the mode of expression, that is not protected by the First Amendment, not the content itself.140 Scalia compared fighting words to a noisy sound truck. Both are “modes of speech,” and government can regulate sound trucks because they make too much noise and are burdensome to the people living in the neighborhood. But it cannot regulate based on hostility or favoritism towards the underlying message expressed.141 Just as it would be unconstitutional to regulate sound trucks that are used to distribute message X, but not regulate sound trucks that are used to distribute message Y, is it unconstitutional to regulate certain fighting words without regulating others. For example, fighting words based on political or sexual preference were not covered by the ordinance and this troubled the majority.142 Scalia also wrote that the content-based restriction of the St. Paul ordinance amounted to viewpoint discrimination, since it would favor those supporting racial tolerance: “But ‘fighting words’ that do not themselves invoke race, color, creed, religion, or gender--aspersions upon a person's mother, for example-would seemingly be usable ad libitum in the placards of those arguing in favor of racial, color, etc. tolerance and equality, but could not be used by that speaker's opponents.”143 The court should not allow one side of the debate to fight freestyle while requiring the other side to “follow Marquis of Queensbury Rules.”144 145 140

140 Ibid. at 386.
141 Ibid. at 387.
142 Ibid. at 386.
143 Ibid. at 292.
144 Ibid. at 392.
145 This argument seems to be a strange one. Take for example a heated debate between two individuals about racial equality. The Saint Paul ordinance bars both of them from using racial slurs (fighting words that would insult and provoke on the basis of race), while allowing both to use, for example, "aspersions upon a person's mother," so this ordinance does not favor one viewpoint in this respect. The ordinance treats racial slurs directed at white people the same as it does slurs directed at black people or minorities. Scalia gives another example of how this ordinance could amount to viewpoint discrimination: a religious debate where a sign reading that "anti-Catholic





Scalia also wrote that even though the ordinance “regulates expression based on hostility towards its protected ideological content,”146 it could still have passed constitutional muster had it been narrowly tailored to achieve a compelling city interest.147 While the majority agreed that the state interest in this case was compelling, “to ensure the basic human rights of members of groups that have historically been subjected to discrimination,”148 this interest could have been served by an ordinance that was content neutral, one that would ban all fighting words for example.149 In other words, this decision states that government cannot restrict speech that is not constitutionally protected if it does not ban all speech that is unprotected under the same doctrine. This means that the Supreme Court rejects the notion that racial or ethnic slurs are qualitatively different or worse than fighting words based upon union membership or political affiliation or other fighting words.150 It reaffirmed the notion that there does not exist a set of expressions that are to be considered fighting words by default. However, this decision also reaffirmed the validity of the fighting words doctrine, and even seemed to broaden it somewhat. The ordinance did not demand that the “victim” of the speech retaliate in a violent way (only that it “arouses anger, alarm or resentment”) and thus cause a breach of the peace. This seems to be an extension of the fighting words doctrine compared to previous decisions discussed above, as it put the focus back on the impact fighting words have on the “victim,” rather than on the effect on public peace. Justice Edward White concurred, but disagreed vehemently with the underinclusiveness rationale of the majority. According to White, bigots’ are misbegotten” would be permitted, but one reading that all ‘papists’ are,” would not be permitted. (Ibid. at 391-392.) It may be true that this ordinance leaves in some situations (or far-fetched hypotheticals) those who support one side of an argument with a greater range of constitutionally protected speech than those who advocate the other side, but it seems to be a stretch to equate this with the government actively taking a side in an argument. 146 Ibid. at 395. 147 Idem. 148 Idem. 149 Ibid. at 396. 150 Ronald Turner, "Hate Speech and the First Amendment: The Supreme Court's R.A.V. Decision," 61 Tenn. L. Rev. 197 (1993) at 224.

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

39

speech (such as the speech banned by the St. Paul ordinance) that falls under a larger category of constitutionally unprotected speech (fighting words) should also be unprotected.151 He criticized the majority for assuming that fighting words are a mode to exchange views:

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Fighting words are not a means of exchanging views, rallying supporters, or registering a protest; they are directed against individuals to provoke violence or to inflict injury … Therefore, a ban on all fighting words or on a subset of the fighting words category would restrict only the social evil of hate speech, without creating the danger of driving viewpoints from the marketplace.152 According to White, the majority elevated hate speech to a form of public discussion. White also argued that a prohibition on fighting words is “not a time, place, or manner restriction; it is a ban on a class of speech that conveys an overriding message of personal injury and imminent violence, … a message that is at its ugliest when directed against groups.”153 White criticized the majority’s decision because it amounts to an all-or-nothing approach (either ban all fighting words or no fighting words) and argued that the case should have been decided under settled First Amendment principles.154 He argued that the ordinance was unconstitutional because of overbreadth. Speech that causes “anger, alarm or resentment” does not fit the narrow definition of fighting words laid out in Chaplinsky, White stated.155 For proponents of hate speech restrictions, the majority opinion might be preferable over White’s opinion. Scalia’s opinion seems to broaden the scope of “fighting words,” but makes it harder to single out specific categories of fighting words to regulate. But this decision leaves open the door for an ordinance banning all fighting words under which one could prosecute racial slurs. White proposed exactly the opposite: a more narrow reading of the fighting words doctrine, one that seems to be more in line with established case law, but leaving the 151

Ibid. at 401. Idem. 153 Ibid. at 408. 154 Ibid. at 411. 155 Ibid. at 413-414. 152

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

40

Global Medium, Local Laws

option open to single out certain types of fighting words (such as fighting words based on race, creed, ethnicity,…) for regulation, if this can be done in a narrowly tailored way. Scalia’s opinion does not seem to exclude the possibility that an ordinance could be drafted that would eliminate some of the speech that anti-hate speech advocates would like to see banned (such as the fighting words based on race or ethnicity), but only if all fighting words would be banned.

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Fighting Words and R.A.V. The fighting words doctrine’s validity was thus confirmed in R.A.V. v. St. Paul. Because the fighting words doctrine had come under increasing criticism, this affirmation of its validity was meaningful. Critics of the fighting words doctrine had argued that frequent derogatory statements promoted the belief that a reasonable addressee would just put up with the abuse, rather than start a fight.156 It has also been argued that the fighting words doctrine displays a gender bias because it is based on a typical “male” response to abusive speech.157 An insult hurled at a big, aggressive man might be more likely to provoke a violent reaction than the same insult directed at a petite woman who would be less likely to respond violently.158 The reliance on violence in the fighting words doctrine has been criticized because it “allows a socially undesirable response [violent reaction] to dictate the legal standard, an approach inconsistent with other areas of law.”159 It has also been pointed out that the fighting words doctrine can be used by government officials to censor unpopular groups. Most fighting words cases have involved comments addressed at police officers; it seems as if the fighting words doctrine has been used to try to punish speech critical of government or law enforcement,160 moreso than it has been used by minority groups against hate speech. R.A.V. did not seem to increase the possibility of the fighting words doctrine being used to pass ordinances banning racist or other content specific speech, but a

156

Melody L. Hurdle, "R.A.V. v. City of St. Paul: the Continuing Confusion of the Fighting Words Doctrine," 47 Vand. L. Rev 1143 (1994) at 1154-1155. 157 Ibid. at 1155. 158 Idem. 159 Ibid. at 1156. 160 Ibid. at 1156.

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

41

2003 cross burning case could provide some hope for those who would like to see hate speech directly targeted at minorities banned.

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Cross burning revisited: Virginia v. Black Ten years after R.A.V. v. St. Paul, another cross burning case reached the High Court. In Virginia v. Black,161 the conviction of cross burners who had been separately convicted under a Virginia statute that made it a felony “for any person or persons, with the intent of intimidating any person or group of persons, to burn, or cause to be burned, a cross on the property of another, a highway or other public place”162 was being challenged. During the trial of one of the cross burners, the jury had been instructed that "the burning of a cross by itself is sufficient evidence from which you may infer the required intent [to intimidate].”163 The cross burners brought their cases to the Supreme Court of Virginia, which ruled that the statute was indistinguishable from the ordinance that was struck down in R.A.V. because it singled out cross burning based on the message it contained. It also ruled that the statute was overbroad.164 The Supreme Court disagreed, ruling that cross burning with the intention to intimidate was not protected speech. However, because the provision in Virginia statute treated cross burning as prima facie evidence of intent to intimidate, the Supreme Court ruled it to be unconstitutional.165 After having established that the act of cross burning, given its violent history, can constitute a true threat166 making it unprotected speech, the court turned its attention to the question of whether or not this statute was indistinguishable from the Saint St. Paul ordinance in R.A.V. In R.A.V., the court had ruled that even though the ordinance targeted constitutionally unprotected fighting words, it was still unconstitutional because it discriminated on basis of viewpoint. Writing for the majority, Justice O’Connor argued that the Virginia statute was different because it fell under one of the exceptions stated in R.A.V., under which content-based restrictions are permitted “[w]hen the basis 161

538 U.S. 343 (2002). Ibid. at 348. 163 Ibid. at 349. 164 Ibid. at 351. 165 Ibid. at 367. 166 Ibid. at 359-360. 162

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

42

Global Medium, Local Laws

for the content discrimination consists entirely of the very reason the entire class of speech at issue is proscribable.”167 Just as it is constitutional to only ban threats against the president or to ban only the most patently offensive forms of obscenity, is it constitutional to ban cross burning with the intent to intimidate because it is a “particularly virulent form of intimidation,”168 the court argued. The court also argued that under the Virginia statute it did not matter whether someone burned a cross with the intent to intimidate someone based on his religion, race, or gender, which made it less constitutionally suspect because it does not single out speech on disfavored topics.169 This may raise some eyebrows, because throughout history burning crosses seems to have been directed mainly towards racial minorities, or at least to have been associated with a certain political or ideological message, making the attempt of the court to separate the content from the conduct aspect in cross burning arguably questionable.170 The court gave examples of burning crosses being used to intimidate union members and lawyers in various settings.171 Noting that cross burning can occur as a statement of ideology or group solidarity172 or as part of a movie or play (for example, Mississippi Burning),173 the court ruled that burning crosses without intent to intimidate is still constitutionally protected speech and that the jury instruction given in one of the defendants’ trial cases mentioned above,174 that intent to intimidate could be inferred from the act of burning a cross, rendered the statute unconstitutional on its face.

167

Ibid. at 361. Ibid. at 363. 169 Ibid. at 362. 170 Justice Souter’s dissent suggested that the Virginia statute did not constitute one of the exceptions spelled out in R.A.V. and was thus a content-based restriction of speech. (Ibid. at 381-387.) 171 Ibid. at 362-363. 172 Ibid. at 365-366. 173 Ibid. at 366. 174 This jury instruction carries the same weight as had it been written into a statute. (Ibid. at 364.) 168

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

43

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Analysis At first sight, this decision seems to provide more of an opportunity to ban hate speech than the R.A.V. decision did. After all, this decision allows for a ban on cross burning with the intent to intimidate. However, in order to keep this case within the borders of R.A.V., the majority had to make the argument that this was not a content and viewpoint-based restriction the way the St. Paul ordinance was, because it did not single out speech on disfavored topics. So this decision should not be read as an indictment of the message expressed by a burning cross, as the court suggests that there is no fixed message connected to a burning cross. In this analysis, a burning cross is seen as a medium to deliver a message, as a mode of expression, not as an expression in itself. If one would accept that a certain message is identified with or attached to burning crosses, R.A.V. would make ordinances banning intimidation through cross burning underinclusive. However, because the Virginia statute bans cross burning with the intent to intimidate “because of the victim's race, gender, or religion,” as well as his “political affiliation, union membership, or homosexuality,”175 or any other category, it is saved from underinclusiveness. For those favoring restrictions on hate speech, the Virginia ruling seems to be encouraging because it allows bans on cross burnings with the intent to intimidate. However, it does not ban cross burnings as such, which the court stressed can be political speech. This ruling also established that one of the most hateful symbols of bigots is constitutionally protected speech, unless it is used to intimidate. However, by invoking the history of this symbol to establish its threatening nature, the court engaged in a historical analysis that those who favor restrictions on hate speech towards groups who have been historically persecuted and discriminated against must welcome. Though both the R.A.V. and Virginia decision stay loyal to the principle of content and viewpoint-neutrality, engaging in a historical analysis to establish the threatening nature of cross burning lets content-based considerations back in through the back door. For those who propose restrictions on hate speech that targets groups that have been the victim of violence throughout history, this 175

Ibid. at 362.

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

44

Global Medium, Local Laws

attention to historical context seems to be good news. Perhaps a similar rationale could be applied to justify, for example, a ban on the display of swastikas with the intent to intimidate. According to Neis however, the analysis of the court in this case will be restricted to its specific facts. Because, unlike the swastika, the burning cross has been used throughout history as a means of intimidation:

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

No other symbol in the American experience is imbued with such a specific and virulent history of intimidation and violence. The swastika, for instance, is at least as infamous a symbol of hate as a burning cross. Although the swastika is singularly linked with Nazism, white supremacy, and degradations of the Third Reich, the swastika was not employed by the Nazis for the particular purpose of intimidation. The same can be said about the Confederate Flag or the Hammer and Sickle. No other symbolic expression is imbued with such a specific and historically verifiable message. The burning cross is, in the words of Justice Thomas, truly "unlike any [other] symbol in our society.”176 The Virginia ruling also clearly establishes that as long as these so-called symbols of hate contain a speech element that is constitutionally protected, a total ban is unlikely.177 Hate mongers who engage in this kind of communication will also have to keep in mind that while their ideas may be protected, using fear and intimidation to get these ideas across is not.178 Virginia also marked the switch from a fighting word analysis to a threatening speech analysis for these kinds of cases.179

176

Eric John Nies, "The Fiery Cross: Virginia v. Black: History and the First Amendment," 50 S. D. L. Rev. 182 (2005) at 216. 177 Chris L. Brannon, "Constitutional Law--Hate Speech--First Amendment Permits Ban on Cross Burning when done with the Intent to Intimidate," 73 Miss. L.J. 323 (2003) at 344. 178 Ibid. at 345. 179 Ibid. at 344.

Vanacker, Bastiaan. Global Medium, Local Laws : Regulating Cross-Border Cyberhate, LFB Scholarly Publishing

Hate Speech in American Law

45

Theories of the First Amendment As Heyman has observed, one of the reasons why the hate speech debate is so complex is that we lack a general theory about the First Amendment that gives us guidance about when speech can be restricted and when it should not. 180 While there is no unified theory of the First Amendment, there are a number of influential theories that have shaped and influenced the thinking about freedom of speech. While one cannot state that any single one of these theories is “The Theory” that can make us understand the unique American approach (especially compared with Europe) towards hate speech, these theories, taken together, can give us an idea of the intellectual and normative framework that has functioned as the main rationale for freedom of speech.

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Marketplace theory / search for truth This theory can be traced back to philosophers such as John Milton and John Stuart Mill, who refined the concept in his 1859 On Liberty,181 and has found its clearest legal articulation in Holmes’ dissents (see supra). In this treatise, Mill describes the three possible conditions that exist when a government tries to censor an unpopular opinion.182 (Mill used the word “opinion” in the meaning of a verfiable assertion, so for Mill there is such a thing as a false opinion.) First of all, the unpopular opinion can be true, in which case its suppression would be very undesirable. Mill argues that throughout the ages, beliefs that were once deemed to be false turned out to be true and vice versa. Since we are not infallible, we might accept opinions for true that are in fact false. This argument, also employed by Holmes in his Abrams dissent (see supra), has been criticized because it is too relativistic,183 it assumes that every truth can be challenged, a notion that might be absurd in some instances. There might be good reasons for not forbidding someone to say that London Bridge does not exist, but the argument that the lawmaker forbidding this statement might be 180

Steven J. Heyman, Hate Speech and the Constitution (1996) at xvi-xvii. Matthew Bunker, Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity (2001) at 3. 182 Ibid. at 3-6. 183 Wojciech Sadurski, Freedom of Speech and its Limits (1999) at 14. 181


fallible as to his assertion that London Bridge does indeed exist might not be the strongest reason to do so.184 A second possibility is that the suppressed expression is false, in which case there is still no good reason to suppress it, according to Mill. False statements will force believers of true opinions to argue and defend their opinion, which will strengthen the "true" opinion. This is an argument that could be made against banning Holocaust denial. By allowing people to challenge the notion that millions of Jews died in Nazi Germany's concentration camps, we are forced to rebut these claims and to keep thinking about the horror that took place. One could argue that by doing this we will gain a deeper understanding of the Holocaust. The third possibility is that the unpopular opinion is neither false nor true, but contains some truths and falsehoods; in this case as well, public debate to find out what is true and what is false is needed, according to Mill. As Downs remarks, Mill's theory is consequentialist; it serves a higher purpose: the search for truth.185 Freedom of speech is instrumental to this higher goal, and speech is only protected to the extent that it contributes to this quest for truth. Expressions that do not contribute to this search for truth would not be worthy of protection under this theory. Face-to-face insults, obscenity and lies that hurt someone's reputation do not contribute to this search for truth and accordingly do not receive First Amendment protection.186 As stated, Mill defends the protection of false statements based on our fallibility and on the argument that false statements force us to keep our true beliefs "fresh." Another argument in favor of not banning falsehoods is that regulating false statements might have a chilling effect on free speech. Speakers may engage in self-censorship, out of fear that their speech may be punishable if found to be false. Therefore, allowing false statements onto the marketplace of ideas is the price we have to pay to keep the marketplace of ideas functioning and vital. New York Times v.

184 Idem.
185 Donald A. Downs, Nazis in Skokie: Freedom, Community, and the First Amendment (1985) at 84.
186 James Weinstein, Hate Speech, Pornography, and the Radical Attack on Free Speech Doctrine (1999) at 13.


Sullivan187 and its notion that speech needs some "breathing space" is an example of this approach. A number of criticisms can be levied against this theory as it applies to hate speech. One can question whether free speech indeed leads to the truth. Maybe this is a false assumption, and a society in which a select few wise men would determine what is true and what is false and restrict the distribution of false ideas would provide a quicker way of attaining truth. Is there really a guarantee that free debate is conducive to truth?188 Also, while it might be true that in the long run truth will win out, this does not mean that in the short run we cannot be swayed by falsehoods. One need only take a look at history books to realize that this criticism is not merely theoretical. Arguing that we need to put up with these theories until the marketplace of ideas filters them out may be a lot to ask from people who are at the receiving end of certain types of hate speech. Another possible criticism is that the marketplace of ideas may become a tyranny of the majority. Holmes' dissent in Gitlow ("If in the long run the beliefs expressed in proletarian dictatorship are destined to be accepted by the dominant forces of the community, the only meaning of free speech is that they should be given their chance and have their way."189) espouses the belief that truth is whatever the majority decides it is. Truth finding is almost seen as the result of a societal struggle rather than rational progress towards enlightenment. It reflects the emerging relativism typical of the twentieth century, more so than Mill's nineteenth-century enlightenment optimism. This exposes the theory to the criticism that the marketplace of ideas has become exactly what John Stuart Mill warned against: majoritarianism. Whatever is true is what most people, or the most influential voices, deem true, leaving minorities and alternative viewpoints marginalized. This notion that freedom of speech mainly protects and perpetuates the interests of the majority is one that lives strongly among so-called critical race theorists.190

187 376 U.S. 254 (1964).
188 Calvin R. Massey, "Hate Speech, Cultural Diversity, and the Foundational Paradigms of Free Expression," 40 UCLA Law Review 103 (1992) at 123.
189 268 U.S. 652 at 673.
190 Critical race theory originated with leftist scholars and scholars of color who argue that dominant legal doctrine, and especially First Amendment doctrine, disfavors minorities and perpetuates the status quo of institutionalized racism. They feel that the perspective of minorities is not reflected in traditional legal doctrine and they want to right this imbalance. The movement has strong ties to the sixties' civil rights movement and produced a body of scholarship in the mid-eighties, using a mix of Marxist, critical legal studies, feminist, postmodern and other theories to show how laws that were designed "to advance racial equality often benefit powerful whites more than those who are racially oppressed." Richard Delgado, Charles R. Lawrence III, Mari J. Matsuda and Kimberlé Crenshaw, "Introduction." In: Words that Wound: Critical Race Theory, Assaultive Speech, and the First Amendment (1993) at 5.


Matsuda notes in this context that racial equality is accepted by almost every government and by every democracy in the world; it is established in numerous international treaties that the world community believes in this ideal.191 So why continue to protect speech that advocates, for example, racial inequality and that is hurtful to many, merely to force us to keep thinking about and defending our accepted truths? When looking at this question from the perspective of the "victims" of hate speech, it is understandable that one could conclude that this is too much to ask of them. The justification provided by the marketplace of ideas model for not banning hate speech, namely that truth is relative, may not be the strongest argument against such bans. A practical argument that can be levied against this criticism is that the medicine may be worse than the disease, an argument expressed by Judge Cohn in Doe v. University of Michigan,192 who, in his conclusion, quoted Thomas Cooley, who had written in his 1868 A Treatise on the Constitutional Limitations that even if speech "exceed[s] all the proper bounds of moderation, the consolation must be that the evil likely to spring from the violent discussion will probably be less, and its correction by public sentiment more speedy, than if the terrors of the law were brought to bear to prevent the discussion."193 This medicine-is-worse-than-the-disease argument creates an extra hurdle for introducing hate speech bans because it implies that even if one could establish that hate speech creates certain harms, one would have to establish that this harm is greater than the harm that would be created "if the terrors of the

191 Mari J. Matsuda, "Public Response to Racist Speech: Considering the Victim's Story." In: Words that Wound: Critical Race Theory, Assaultive Speech, and the First Amendment (1993) at 26-31, 37.
192 721 F. Supp. 852 (E.D. Mich. 1989).
193 Doe v. Michigan, 721 F. Supp. 852 (E.D. Mich. 1989) at 869.


law were brought to bear to prevent the discussion.” This may be scant consolation for those who are at the receiving end of hate speech, but First Amendment jurisprudence and theory seem to suggest that this is the price they have to pay.


Self governance theory

This theory, usually associated with Alexander Meiklejohn,194 states that individuals need to be exposed to and have access to all possible opinions in order to have a citizenry that can make informed decisions. Meiklejohn took issue with the marketplace of ideas model because it allowed all kinds of private interests to enter the debate, causing people to act not as citizens trying to discover the truth but as farmers or trade union workers attempting to make their cases.195 The marketplace of ideas, according to Meiklejohn, did not help to establish the truth but merely to share whatever truth had won.196 Meiklejohn argued that this kind of private-interest speech has no place in the public debate and deplored its intrusion into the public sphere as intellectual degradation. As a result, Meiklejohn's view of freedom of speech is relatively restricted: only information germane to the democratic decision-making process, related to public affairs, deserves absolute First Amendment protection.197 Though Meiklejohn argued that in areas such as education, philosophy, science and the arts, and in public discussions of public issues, there should be no abridgment of free speech,198 these areas only receive protection because they contribute to a better-informed citizenry.199 By including only political speech without really defining political speech, this theory risks being either too broad or too narrow.

194 Alexander Meiklejohn, Free Speech and its Relation to Self-Government (1948).
195 G. Edward White, "The First Amendment Comes of Age: The Emergence of Free Speech in Twentieth-Century America," 95 Mich. L. Rev. 299 (1996) at 346.
196 Matthew Bunker, Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity (2001) at 9.
197 Idem.
198 Wojciech Sadurski, Freedom of Speech and its Limits (1999) at 21.
199 Matthew Bunker, Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity (2001) at 10.


As Sadurski200 has argued, either one has a very narrow conception of the "political" or a broad one. In the first instance, this could lead to a rather elitist interpretation in which other human endeavors not directly related to politics would remain unprotected. If, on the other hand, one were to see the political as including almost everything, because all speech can arguably contribute to the formation of one's opinions on matters of public interest, this theory loses its usefulness too. How hate speech would fare under this theory would depend on what kind of hate speech we are dealing with, but more importantly on what one's definition of "political" is. If hate speech qualifies as political, it is protected. While it lies outside the scope of this work to put forward different interpretations of the "political" and how they may fit into Meiklejohn's theory, a lot of hate speech is political, or at least has a political component. In many cases, hate speech deals with the norms, values and rules that should apply in a society. We may find a lot of these values repugnant, but that does not make them any less political. Nor is it speech furthering merely private interests, which Meiklejohn so despised.201 While hate speech may be objectionable to many, this does not mean that it is not political. This interpretation was also followed by the Supreme Court in Virginia, when it stated that a burning cross, despite its violent history, is political speech. This theory would not necessarily preclude banning direct racial insults or threats, such as the burning of a cross with the intent to intimidate (in which case it is the mode of delivery, not the content of the speech, that is actionable), but it would protect non-threatening speech that is uttered in a public space and that deals with issues such as race, ethnicity, religion, homosexuality, and so on. Another dilemma this theory addresses, and one that is particularly relevant to the hate speech debate, is whether or not the majority should be able to suppress certain thoughts or silence certain speakers who endorse anti-democratic ideas.202 Meiklejohn's theory is geared towards setting the ideal conditions for a healthy democracy, and some have argued that if a people, after a democratic process, decide that some

200 Wojciech Sadurski, Freedom of Speech and its Limits (1999) at 22.
201 Whatever the motivation may be of those engaging in hate speech, furthering their own narrow self-interest may not be the most important one.
202 Calvin R. Massey, "Hate Speech, Cultural Diversity, and the Foundational Paradigms of Free Expression," 40 UCLA Law Review 103 (1992) at 117.


forms of speech should no longer be tolerated, it is consistent with the self-governance theory to do so.203 Lawrence, for example, takes this route by pitting free speech against other constitutional rights that might come under fire when hate speech is not curbed. What is at play in the hate speech debate, according to Lawrence, is a conflict between competing constitutional values: the Fourteenth Amendment's equal protection clause and the First Amendment.204 This assertion is, often implicitly, relied upon by anti-hate speech advocates. They do not deny the importance of the First Amendment, but argue that if two constitutional values collide, a balanced approach is required, and that the First Amendment should not win by default. Lawrence, for example, argues that racist speech (and acts) institutionalize the idea of white supremacy. He believes that by protecting this kind of speech, the First Amendment contributes to this institutionalized racism. Individuals have no free speech rights, in this view, to advance principles or policies that are fundamentally antithetical to basic constitutional values. Racism may be philosophically debatable, but that debate cannot take place within a constitutional order committed to democratic and egalitarian ideals, this argument goes.205 This view holds that certain types of hate speech, racist speech, for example, should not be given the right to compete with other views in our society, views that fall within the constitutional order. However, as Graber has pointed out, the viewpoint that certain ideas that challenge constitutional beliefs should not be allowed to compete with ideas that fall within the constitutional order is untenable. The idea that one can only advocate constitutionally legitimate ideas seems absurd to Graber, who gives the example of how one cannot and should not be barred from advocating a six-year term for the president.206 Moreover, almost every clause of the constitution, including the equal protection clause, is subject to amendment, and

203 Lee C. Bollinger, The Tolerant Society: Freedom of Speech and Extremist Speech in America (1986) at 50.
204 Charles R. Lawrence III, "If He Hollers Let Him Go: Regulating Racist Speech on Campus," 1990 Duke L. J. 431 (1990).
205 Mark A. Graber, "Old Wine in New Bottles: The Constitutional Status of Unconstitutional Speech," 48 Vand. L. R. 349 (1995) at 378.
206 Ibid. at 379.


must therefore also be subject to discussion and debate.207 But Graber still sees a way out for critical theorists such as Lawrence. If one accepts the notion that certain constitutional provisions have a higher status and cannot be changed without undermining the bounds of the American constitutional enterprise (for example the equal protection clause), while other constitutional values are more or less arbitrary arrangements (for example term limits for the president),208 one could ban hate speech, as long as one could argue that equal citizenship is such a crucial constitutional value. But the contention that there are constitutional values that must remain unchallenged is contrary to established legal doctrine and to the marketplace of ideas model as expressed by Holmes in his Gitlow dissent. As will be discussed in chapter three, European hate speech law does not share this philosophical outlook.


Autonomy theory

The theories discussed above are instrumental theories; they argue for a broad protection of speech because it furthers a greater goal, such as producing an informed citizenry or generating truth. These theories can be invalidated to the extent that one can show that a free speech regime will not bring about these desired outcomes. The autonomy theory, on the other hand, states that free speech is an important component of individual liberty, regardless of the products of this speech and therefore should not be restricted.209 Free speech is not the instrument to reach a higher goal; it is the higher goal. Under this theory, free speech is seen as promoting self-expression, self-fulfillment or individual autonomy.210 211

207 Ibid. at 380.
208 Ibid. at 381.
209 Matthew Bunker, Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity (2001) at 11.
210 James Weinstein, Hate Speech, Pornography, and the Radical Attack on Free Speech Doctrine (1999) at 14.
211 But can these not be considered to be instrumental values? Stated like this, it sounds as if freedom of speech is a means to realize the goal of autonomy and self-realization. There is indeed a consequentialist interpretation that can be given to this theory, that human beings cannot attain self-realization or develop their humanity under a regime that restricts freedom of speech. (See: Matthew Bunker, Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity (2001) at 12.) But the more common interpretation of this theory seems to be that freedom of expression is an essential part of this autonomy and self-realization and coincides with it rather than merely being a means to attain the goal of "self-realization."


The autonomy theory is libertarian in its orientation towards the rights of the speaker as an individual, and it does not concern itself with broader societal interests (or better: it assumes that society is best served by not putting constraints on the individual). The speaker is seen as an autonomous individual who should be left to his own devices (unfettered by societal or governmental pressures, norms and regulations) when making decisions about what to say and what not to say. It is based upon the notion that if someone makes a statement, as Collin did in the Skokie case, this person is revealing himself and his identity, and this act of self-realization is believed to be fundamental to his humanity; no government should interfere with that. As Greenawalt has pointed out, this theory is not only premised on the autonomy of the speaker, but also on the autonomy of the receiver of speech,212 because it holds that a government should always treat its citizens as rational and autonomous persons "by allowing them all the information and advocacy that might be helpful to a rational, autonomous person making a choice."213 This notion of autonomous rationality entails that an autonomous person should not accept the judgment of the government or anyone else as to what he should hear or believe.214 Therefore, this theory is not instrumental; it states that regulation of speech cannot be justified because it negates people's autonomy. Applying this to hate speech, the obvious argument would be that hate speech, at least when it is not threatening or does not constitute incitement, should not be banned. Even people who hold beliefs that many may find despicable, such as Collin, have the right to express themselves and attain self-realization through speech. Furthermore, the receiver of speech is autonomous as well and should not be shielded from certain speech by the government. This does not necessarily mean that the speaker is always right, or that we should endorse this

212 Kent Greenawalt, Speech, Crime, and the Uses of Language (1989) at 150.
213 Idem.
214 James Weinstein, Hate Speech, Pornography, and the Radical Attack on Free Speech Doctrine (1999) at 15.


kind of speech, but the fact that we may find some speech despicable cannot, under this theory, serve as a justification to strip someone of this autonomy. Of the theories discussed thus far, the autonomy theory seems to lead to the most flat-out rejection of bans on hate speech that does not amount to incitement, threats or fighting words. It is also the theory that might best explain the doctrinal practice of valuing free speech,215 even in cases where the truthfulness or the redeeming value of speech may be questionable at best. This theory has been criticized on various grounds. It has been argued that in domains other than speech, this autonomy and need for self-realization are strictly regulated, as Sadurski points out: "Punching another person's nose or driving 120 km per hour through a school district during peak hours may be the best way to express your real self, or to fulfill your true capacities, but we will not allow you to exercise your autonomy in this particular way."216 One can also dispute the accuracy of the worldview upon which it is based and contend that individuals are not autonomous and unconnected beings who arrive at self-realization only if they are left alone, but argue instead that people are social beings who get their cues from the wider societal network in which they exist. This more communitarian perspective sees individuals as reaching their full potential by interacting and living within a greater society whose norms they adopt. If this society rejects the norms and values a person adheres to and decides that they are pernicious for that society, societal interests should trump autonomy. This is not so much a criticism of the theory as the presentation of an alternative worldview entailing different normative claims, of which a ban on hate speech could be one. In a 1988 Harvard Law Review Note,217 the author defends group libel laws on the basis of a communitarian argument. Communitarians hold that a society needs to craft a balance between individual rights and the common good.218 They believe that human beings are above all part of a greater whole, of a community that serves as a frame of reference for individuals to determine what constitutes "the good life,"

215 Idem.
216 Wojciech Sadurski, Freedom of Speech and its Limits (1999) at 18.
217 "Note: A Communitarian Defense of Group Libel Laws," 101 Harv. L. Rev. 682 (1988) at 690-691.
218 Amitai Etzioni, The Limits of Privacy (1999).


that provides them with the values, the ends and the purposes to shape their lives and identities. This is contrary to the libertarian tradition from which autonomy theory springs and which sees individuals as totally independent entities that should be left the space and freedom to make their own decisions. For communitarians, group vilification on the basis of characteristics such as race, religion or ethnicity should be banned because religious and racial affiliations are central to people's identities, their self-understanding and self-realization. Group vilification, by targeting these group affiliations and stating that they cannot be reconciled with group members' identity as citizens, denies individuals the opportunity to become part of their community, and therefore should be banned, the argument goes.219 This communitarian critique of the First Amendment is a radical one. It promotes (certain) group libel laws, because if one sees people as identified by the groups they belong to and by their affiliations with other people, then a verbal attack on a group constitutes an attack on the individuals who belong to it. It is precisely this reliance on communitarian thinking that constitutes the greatest obstacle to making group libel laws acceptable in the United States, where individualistic liberalism is deeply ingrained.220 It lies outside the scope of this work to discuss in depth the pros and the cons of communitarian theory, but the belief that society will best be served if individual freedoms are protected is deeply ingrained in First Amendment theory and the American political fabric and seems to be at odds with the communitarian belief that people define their personal and political identity by reference to the larger community. This libertarian tradition in the United States contrasts with the more communitarian tradition in, for example, Europe. Chapters three and four will discuss how this more communitarian philosophy has shaped European hate speech law. A libertarian criticism of communitarian defenses of hate speech legislation could be that community standards and community values

"Note: A Communitarian Defense of Group Libel Laws," 101 Harv. L. Rev. 682 (1988) at 690-691. 220 Toni M. Massaro, "Free Speech and Religious, Racial, and Sexual Harassment: Equality and Freedom of Expression: The Hate Speech Dilemma," 32 Wm and Mary L. Rev. 211 (1991) at 236. Possible exception to this is the law of obscenity in which community standards are used to determine whether or not distributed pornographic material is obscene or not.


have in the past often slowed progress toward equality and desegregation, and have, in the words of John Stuart Mill, created a "tyranny of the majority."


Dissent theory

Steven Shiffrin argues that the theory that best captures the meaning of the First Amendment is his dissent theory. According to Shiffrin, the First Amendment's main task is to offer protection and to sponsor rebelliousness, individualism and non-conformism in us all.221 Those who attack our customs, habits, traditions and authorities, such as flag burners, stand at the center of First Amendment protection, according to Shiffrin.222 He also notes that this theory offers protection not only to the powerless and outnumbered, but to everyone with a dissenting voice, for example to the press as an institution that operates to check on corruption and abuses.223 In this respect, dissent theory is closely linked to Vincent Blasi's checking value theory, which considers checking those in power for abuses to be the central role of free speech.224 While Shiffrin does not share Blasi's blanket assumption that power necessarily corrupts and that therefore those in power need to be checked more closely (and should be barred from regulating speech), he acknowledges that dissent will most likely come from those who are not "powerful."225 One would expect that Shiffrin's theory would not allow for restrictions of, for example, racist speech,226 because it is safe to assume that people such as Collin in the Skokie case are dissenting from generally accepted beliefs and customs and are nonconformists. However, Shiffrin himself does not draw that conclusion. He argues that any legitimate system assumes that "all citizens are

221 Steven Shiffrin, Dissent, Injustice, and the Meanings of America (1999) at 76.
222 Ibid. at 10.
223 Ibid. at 76.
224 Vincent Blasi, "The Checking Value in First Amendment Theory," 1977 Am. B. Found. Res. J. 521 (1977).
225 Steven Shiffrin, Dissent, Injustice, and the Meanings of America (1999) at 76.
226 Shiffrin focuses mainly on racist speech.


worthy of equal concern and respect,"227 and since racists do not share this premise, the system should not offer them constitutional support.228 (Shiffrin does reject banning racist speech,229 however, on the basis that such a ban would be ineffective, not because it conflicts with his dissent theory.) This question is one that comes back in many forms throughout the discussion on hate speech: Do bigots forfeit First Amendment rights because they would deny these rights, or other fundamental rights, to other groups of people? The answer to this question is one of the main differences between free speech law in Europe and the United States. The weakness of Shiffrin's argument for banning hate speech is that it seems to make certain substantial criticisms of "the system" impossible, which goes against the core tenets of his theory. Racists, for example, because they do not share one of the assumptions of the system (all citizens are worthy of equal concern and respect), are barred from constitutional protection. But who determines what these assumptions of the system are that cannot be attacked? One could make an argument that the right to acquire private property is also a basic assumption of our system, but communists do not (or no longer) forfeit their First Amendment rights if they propose to abolish private property. By the same token, Shiffrin's theory seems to suggest that someone who wants to get rid of the First Amendment should not be protected by the First Amendment. Perhaps these examples are dissimilar to the example Shiffrin works with, but in the context of his "dissent theory," they are not substantially different. Those who think that racist speech (or other kinds of hate speech) ought not to be allowed because it rejects a core belief of our society ought to explain how other speech that challenges the core beliefs of our society would fare under their regime.230 Shiffrin seems to make a distinction between good system-challenging speech and bad system-challenging speech,

227 Steven Shiffrin, Dissent, Injustice, and the Meanings of America (1999) at 78.
228 Ibid. at 78-79.
229 Ibid. at 83-85. Shiffrin argues against bans on racist speech that directly targets its victims, such as racial slurs, as well as racist speech that is not directed at anyone in particular.
230 Wojciech Sadurski, Freedom of Speech and its Limits (1999) at 76.


but it remains unclear what this distinction is based upon and how it can be supported by the dissent theory. Presenting this issue as a choice between free speech and equal treatment of minorities is misleading. By not regulating racist speech that challenges the notion that all citizens are worthy of equal concern and respect, lawmakers do not necessarily endorse the racist view. The belief that all people are equal is reflected in many of the laws and regulations of this country and other democratic nations. The government has other ways to ensure that all citizens are treated equally, such as punishing behavior that contradicts these values. We cannot punish someone for stating that he believes that different races should not live in the same buildings,231 but we can punish a landlord who discriminates.232 The distinction between speech and conduct is fundamental to First Amendment doctrine, and by framing the hate speech issue simply as a choice between civil rights and free speech, this distinction is overlooked.


Tolerance theory

This theory is the brainchild of Lee Bollinger. Bollinger starts his book with an analysis of the Skokie case. According to Bollinger, a superficial reading of the opinions of the court shows judges who reluctantly side with Collin because they do not see how his speech can effectively be banned without banning other valuable speech as well.233 Bollinger rejects this "line drawing" argument, because he believes that this kind of slippery slope argument is "one of the most beguiling methods of obfuscation and diversion in legal argumentation."234 He believes that speech such as the speech at issue in the Skokie case could be effectively and clearly distinguished from other speech.235 But Bollinger argues that a closer reading of the opinions reveals a more positive rationale for striking down the Skokie ordinances. According

231 Unless, perhaps, that speech amounts to incitement under the Brandenburg test.
232 Pittsburgh Press Co. v. Pittsburgh Commission on Human Relations, 413 U.S. 376 (1973).
233 Lee C. Bollinger, The Tolerant Society: Freedom of Speech and Extremist Speech in America (1986) at 34.
234 Idem.
235 Ibid. at 38.


to Bollinger, extreme cases such as the Skokie case teach us by example.236 They teach self-restraint in the face of provocation; they help "to create a general intellectual character through the creation of a kind of tolerance ethic."237 Instances in which this self-restraint cannot reasonably be expected, such as fighting words, obscenity or libel, fall outside the protection of the First Amendment.238 Simply put, according to this theory, the First Amendment serves as a pedagogical tool:239 "Extremist speech is very often the product or the reflection of the intolerant mind at its worst and, as such, an illustration to us of what lies within ourselves and of what we are committed, through the institution of free speech, to overcome. Perhaps ironically … the principle of 'free speech' serves to 'protect,' and so to hold up before us, that which we aspire to avoid."240 According to this theory, extremist speech ought to be protected because it teaches the audience self-restraint. But this theory does not fully explain why this kind of restraint cannot be expected for other forms of speech such as libel, obscenity or fighting words.241 Why can we not expect the audience to exercise self-restraint when faced with obscenity while we do expect it in the case of hate speech? Also, this theory is rather limited, interpreting the First Amendment only in relation to hate speech. It does not say anything meaningful about other First Amendment issues (for example, public access to trial proceedings).242 As Bunker observed, the "fact that toleration may follow from free speech does not necessarily suggest that it should be the normative focus of First Amendment theory."243 A last point of criticism, which is one that could be levied against other theories as well, is that it simply might not be true. What if the social

236 Ibid. at 124-128.
237 Ibid. at 124.
238 Ibid. at 181-186.
239 Matthew Bunker, Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity (2001) at 16.
240 Lee C. Bollinger, The Tolerant Society: Freedom of Speech and Extremist Speech in America (1986) at 127.
241 Wojciech Sadurski, Freedom of Speech and its Limits (1999) at 32.
242 Matthew Bunker, Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity (2001) at 16.
243 Idem.


sciences and psychology were to reveal that this learning effect does not take place? Would that invalidate this theory? All theories contain some untested assumptions that, if invalidated, would jeopardize the theory. But these assumptions regarding reality are often embedded in a normative framework or a broader worldview (see for example the autonomy theory, which is grounded in the conception of man as an autonomous being), which seems to be lacking in Bollinger's theory.

Alternative Approaches to Regulating Hate Speech in the United States


As this overview shows, scholars arguing for more hate speech laws in the United States are facing a tough fight. Both case law and the theoretical underpinnings of the First Amendment make it hard for any kind of proposal that depends on a content-based restriction of hate speech to pass constitutional muster. Still, advocates for hate speech restrictions have formulated proposals to constitutionally ban hate speech. Below, two of these proposals are briefly discussed. The first one is limited and deals only with directly targeted racial insults; the second one is much broader in scope and would also include nontargeted hate speech.

Richard Delgado: A tort for racial insults

Delgado's proposal for a tort action for racial insults is rather limited in scope and tries to position itself within established First Amendment doctrine rather than opposing it. Delgado establishes the need for such a tort by arguing that racial insults create real harm244 and that existing legal remedies, such as laws against assault and battery, infliction of emotional distress,245 defamation and statutory provisions

244 Richard Delgado, "Words that Wound: A Tort Action for Racial Insults, Epithets, and Name Calling." In: Words that Wound: Critical Race Theory, Assaultive Speech, and the First Amendment (1993) at 93-96.
245 However, the tort for intentional infliction of emotional distress has evolved to punish instances of racism, sexism and bullying in the workplace. See Thomas H. Koenig and Michael Rustad, In Defense of Tort Law (2001).


are not adequate remedies for racial slurs.246 Delgado's proposal targets expressions that are directly addressed at someone, are intended to demean through reference to race, and that a reasonable person would recognize as a racial insult.247 Delgado argues that the fighting words doctrine set forward in Chaplinsky allows for the establishment of such a tort, but in the light of R.A.V., his proposal might be underinclusive (because it singles out insults that make reference to race). However, this is a tort, so the real issue is whether or not a harm is suffered, and this is where the strength and the weakness of his argument lie. If one accepts the notion that racial slurs directed at someone cause a real and substantial harm, one would have to agree with Delgado's proposal, despite First Amendment concerns. Delgado seems to assume that racial insults, by their very nature, inflict damage. However, the Supreme Court, at least in the context of fighting words, the doctrine upon which Delgado founds his proposal, has rejected such a categorical approach that labels certain words as fighting words, and has proposed a contextual approach instead (see supra). Delgado gives several reasons why racial insults could be harmful:248 they cause immediate mental distress; they are a dignitary affront which denigrates the victim's humanity; and they are intentional acts that cause long-term emotional pain because they draw upon and intensify the effects of stigmatization, labeling, and disrespectful treatment that the victim has previously undergone.249 He also argues that victims of racial slurs are helpless and cannot resort to counterspeech because it will only provoke more abuse. Underpinning these arguments is the belief that minorities are in a position of inferiority towards the "dominant" group of non-Jewish white Americans and therefore need extra protection. Even though Delgado's proposal would cover racist slurs hurled at white people as well (for example "you dumb honkey"), for this to be actionable the plaintiff would have to demonstrate that he suffered harm as a result.250

Richard Delgado, "Words that Wound: A Tort Action for Racial Insults, Epithets, and Name Calling." In: Words that Wound: Critical Race Theory, Assaultive Speech, and the First Amendment (1993) at 96-103. 247 Ibid. at 109. 248 Ibid. at 93-96. 249 Ibid. at 93-94. 250 Ibid. at 110.


However, Delgado defines the alleged harm suffered from racial insults clearly from the perspective of minorities, because he refers to a broader societal and historical context of racism, and it seems unlikely that a white person would be able to prove harm from a racial insult under these conditions. So for all practical purposes, his proposal constitutes a form of viewpoint discrimination, because it makes it more likely that racial slurs hurled at someone who is a member of a minority (or more precisely, who is a member of a group that has been traditionally stigmatized) would be found to be tortious than racial insults directed at white people. The Supreme Court has shown itself, in the cross burning cases, to be very reluctant to accept these kinds of viewpoint-based restrictions on speech as constitutional.


Mari Matsuda: Hate speech from the victim's perspective

Mari Matsuda rejects the speech-conduct dichotomy and has argued for the introduction of hate speech regulation, not only for face-to-face insults, but also for general statements, because of the harm they inflict. She proposes to ban hate messages that (1) contain a message of racial inferiority, (2) are directed against a historically oppressed group, and (3) are persecutory, hateful and degrading.251 Matsuda's approach is to take the perspective of the "victim" of hate speech and to establish that real harm is done through hate speech.252 Matsuda also, reluctantly,253 tries to make her proposal consistent with First Amendment values and to bring it in line with the holding in the Skokie case:254 "I believe racist speech is best treated as a sui generis category, presenting an idea so historically untenable, so dangerous, and so tied to perpetuation of violence and degradation of the very classes of human beings who are

Mari J. Matsuda, "Public Response to Racist Speech: Considering the Victim’s Story." In: Words that Wound: Critical Race Theory, Assaultive Speech, and the First Amendment (1993) at 36. 252 Ibid. at 25-26. 253 Critical race theorists are in the unenviable position of having to fight their battle on two fronts, on the one hand they reject the underlying principles of First Amendment doctrine, on the other hand they can only hope for their proposals to have any chance of survival if crafted within the confines of established First Amendment jurisprudence. 254 Ibid. at 35-36.


least equipped to respond that it is properly treated as outside the realm of protected discourse."255 Matsuda argues that her proposal could pass constitutional muster if the fighting words doctrine were to be broadened so that it would include messages that are persecutory, hateful and degrading.256 But even if this were to happen, her proposal would still be underinclusive under Scalia's standard from R.A.V. (because it would punish persecutory, hateful and degrading speech that contains a message of racial inferiority and that is directed against a historically oppressed group, but not other kinds of persecutory, hateful and degrading speech).257 However, even if we accept Matsuda's contention that racially motivated hate speech is a sui generis category that does not deserve protection, and even if her proposal were not underinclusive, it is still based upon a binary view of society: that there are dominant and oppressed groups in society and that hate speech occurs when members of the dominant group vilify members of, or the whole, "historically oppressed group." This framework is adequate to describe instances when white people engage in hate speech towards African Americans, but society has become much more complex than that, and the framework leaves open questions of which groups exactly are "historically oppressed" (for example, are Muslims historically oppressed in the United States? Hispanics? Jews? Who qualifies as a member of an "oppressed group"?). And what happens when members of subordinated groups258 engage in hate speech against each other? Matsuda does not ignore these issues, but comes to conclusions that can hardly serve as a foundation for a legal standard. For example, on the question of what to do when members of subordinated groups engage in hate speech towards each other or amongst themselves, Matsuda writes: History and context are important in this case because the custom in a particular subordinated community may tolerate racial insults as a form of wordplay… The appropriate

255 Ibid. at 35.
256 Ibid. at 36.
257 Matsuda's article was first published in 1989, before the ruling in R.A.V.
258 The term "subordinate groups" might not be identical with "historically oppressed groups," but Matsuda uses both terms in her article.


standard in determining whether language is persecutory, hateful, and degrading is the recipient's community standard. We should beware lest by misunderstanding linguistic and cultural norms we further entrench structures of subordination.259 This would mean that a speaker would have to be aware of the cultural norms and values of a group before speaking, which puts a heavy burden upon the speaker and could have a chilling effect on speech. While the requirement of community standards is also present in, for example, the Miller obscenity test, in Matsuda's proposal community seems to be a much more variable and complex notion, one that involves linguistic and cultural norms. Despite her claims to the contrary, a proposal such as Matsuda's, in which hateful expression towards certain groups would be banned while the same speech expressed in a similar context towards another group would be protected, is contrary to the idea of viewpoint neutrality as it was expressed in R.A.V. and Virginia. Even if one rejects this notion of viewpoint neutrality and agrees with Matsuda that hate speech directed against certain groups ought to be banned, it is hard to come up with a way in which courts or lawmakers could establish which groups ought to be "protected" from hate speech and which groups should not be. Any determination of this kind would be very political and a source of bitter debate. It would require a very radical departure from established First Amendment doctrine to regulate speech based on such highly political and subjective standards, and any such regulation would likely be found void for vagueness. Matsuda's proposal starts from the harm hate speech inflicts upon its victims. And by "victims," she does not mean people who have been targeted individually with slurs, but anyone to whom nontargeted hate speech refers. Matsuda falls short of fully explaining this harm and instead takes it as a given. By not offering a clearer explanation of what exactly this harm consists of and how it can serve as a justification for speech regulation, her proposal becomes a


subjective account of who deserves protection from hate speech and who does not.


Conclusion

The beginning of this work identified five dimensions of hate speech: the medium, the content of the speech, its effect, its target and the intent/context of the speech. This overview showed that content-based restrictions of speech raise major constitutional problems. Theories of the First Amendment state that the search for truth (marketplace of ideas), the creation of an informed citizenry (self governance) and the self-fulfillment of people (autonomy theory) are better served if the government does not regulate speech based on the ideas and viewpoints expressed in it. The majority opinion in R.A.V. was a clear statement that this notion is fundamental to First Amendment jurisprudence. There is some limited room to regulate speech based on a mix of effects and context (speech that will lead to imminent lawless action, speech that would cause a breach of the peace, speech that is likely to provoke the average person to retaliation, threatening speech), but these regulations cannot be content-based. The Skokie case made clear that the mere fact that speech is extremely offensive and hurtful cannot serve as a justification for restricting it, especially not if this restriction is viewpoint-based. For the same reasons, proposals such as Delgado's, and certainly Matsuda's, are unlikely to pass constitutional muster. The Brandenburg test and the fighting words doctrine have focused on maintaining public order rather than protecting individuals from speech that may be harmful at the individual level. The effects component of these speech regulations does not take the "victim" of hate speech into consideration; the harmful effects that this speech may have for societal order are the main concern. With threatening speech, however, the effect of the threat on the victim of the threat is the main consideration, and therefore the Virginia decision could perhaps be seen as somewhat of a shift, because it looked at cross burnings through the prism of threatening speech instead of fighting words. The fighting words doctrine had come to focus on the tendency of fighting words to incite an immediate breach of the peace, rather than their tendency to inflict injury. Looking at cross burnings as threats rather than as fighting words indicates an approach in which the victim, or recipient, of hate speech is the main focus. Though it is possible that the Virginia


ruling does not reach beyond the facts of the case, the court's willingness to consider the context of burning crosses may also leave the door open to banning threats using other symbols of hate speech that are associated with a history of violence and intimidation, such as swastikas, as long as it can be argued that "the basis for the content discrimination consists entirely of the very reason the entire class of speech at issue is proscribable."260 This overview also shows that speech that is targeted is easier to regulate than abstract advocacy speech. In order for speech to be fighting words or threats, it needs to be targeted at someone in particular. However, the Planned Parenthood case, which will be discussed in the next chapter, challenges this contention. This section has mainly focused on hate speech definitions centered on the effect, content, target and context of certain types of speech, but has ignored the fifth component identified in the opening remarks of this work, namely the medium through which speech is communicated. The next chapter will discuss in greater depth how the medium of the Internet has influenced thinking about hate speech in the United States.

260 538 U.S. 343 at 361.


CHAPTER 2

Regulating Online Hate Speech in the United States

With the rise of the Internet, the debate on hate speech regulation in the United States has received new life. A considerable body of legal scholarship has dealt with the question of whether or not the medium of the Internet necessitates a rethinking of First Amendment protection of hate speech. Most of the arguments stating that it does focus on the notion that the Internet has an "amplifying" effect on hate speech: that speech over the Internet somehow has greater damaging effects than speech through other media and therefore should be regulated more strictly. This chapter provides an overview of the relevant case law and scholarly articles on this topic.


Introduction

How big is the "problem" of online hate speech in the United States? Since there is no fixed definition of "hate speech," any classification of a Web site as a "hate site" will be arbitrary to an extent. According to the Southern Poverty Law Center's Intelligence Report, there were about 471 extreme right/racist hate sites online in 2004, marking a small decline from the year before.261 The Simon Wiesenthal Center, in a 2005 report studying "over 5,000 problematic web sites that aid and abet terrorism, promote racial violence, antisemitism, and xenophobia,"262 reported a 25% spike in hate sites worldwide from the previous year. In its 2007 report, it mentioned the existence of 7,000 "problematic" sites, blogs, newsgroups and on-demand video sites.263 Observers of hate groups have

261 Mark Potok, "The Year in Hate," Southern Poverty Law Center: Intelligence Report (2005).
262 "Simon Wiesenthal Center's Digital Hate and Terrorism 2005 Report Reveals 25% Spike in Hate Sites," Simon Wiesenthal Center (2005).
263 "Digital Hate and Terrorism 2007," Simon Wiesenthal Center (2007).


also noted a spike in their popularity following Senator Obama's presidential bid.264 These watchdog groups265 monitor these sites because they deem them to be dangerous, but others have argued that initial fears that the Internet would spread hate across the country and around the globe have turned out to be unfounded. Dave Goldman, founder and executive director of "Hatewatch," an organization that monitored online hate, dismantled his organization in early 2001, when he realized that the doom scenario about Internet hate had not become reality:

We quickly learned that while hate groups, who once thrived and created fear in the shadows, wither and hid from the public scrutiny of the Internet. What these groups didn't count on was that forcing their way into people's homes via the Web would have the effect of mobilizing ordinary people to join in the fight against them. Far from persuading a supposed "silent white majority" of angry Aryans to join their ranks, these self proclaimed white warriors, made moms and dads into determined anti-hate activists. Now, in 2001, the news is much more encouraging than any of us expected. Hate groups have done an extremely poor job of using the Internet to increase their membership. They have utterly failed to gain widespread acceptance for their belief that bigotry, hate and violence are viable responses to human diversity. This is not to say that we no longer have cause for concern. The advent of the "lone wolf" gunman whose hatred may be fed by hate group propaganda, bigoted organizations who use e-commerce to support their hateful enterprises, and the newly emerging racist cyberterrorist, all will continue to present great challenges to law enforcement and online civil rights. And with this, the struggle continues.266

264 Eli Saslow, "Hate Groups' New Target," The Washington Post, June 22, 2008, p. A6.
265 See, for example, an independent Web site that publishes an annual report on hate sites and other hateful online content or services; its 2008 edition contains over 150 pages of links to cyberhate.
266 Dave Goldman, "HateWatch Says Goodbye," hatewatch.org (2001). <http://www.davidicke.net/tellthetruth/reststory/hatewatch.html>; this link is no longer active, but Goldman's message can also be found here:


Precisely this notion that the Internet dramatically increases the impact of hate speech is one of the main arguments of those who argue for a more restrictive approach towards hate speech online. As we have discussed in the previous chapter, the American approach toward hate speech has been criticized both from within and from outside the First Amendment paradigm. Some have argued that this paradigm is morally wrong and needs to be replaced; others have argued that certain kinds of hate speech could be banned within the constitutional parameters of the First Amendment. It is especially this second group that has argued that the Internet forces us to rethink how to regulate hate speech and how certain kinds of hate speech online can be constitutionally regulated.

For the more radical group of anti-hate speech advocates, those who argue for rethinking the First Amendment entirely, online hate speech is of less interest. For them, hate speech is pernicious regardless of the medium through which it is communicated, and they do not rely on a medium-specific definition of hate speech. For others, however, the rise of the Internet has changed the landscape profoundly: they accept that hate speech is protected by the First Amendment, but argue that this protection should not extend to hate speech communicated via the Internet. These authors have made their argument using the Planned Parenthood case.

As was shown in Virginia v. Black, threatening or intimidating speech is not constitutionally protected, and the same tendency to enlarge the scope of threats to include hate speech can also be seen on the Internet. The Planned Parenthood case serves as a prime example of this trend. The speech at issue in that case was anti-abortion speech, and one could argue that it does not fall within the "content" component of a hate speech definition as defined in the introductory remarks of this work. However, the case raises numerous issues regarding content regulation online and therefore warrants discussion here. The first part of this chapter provides a brief overview of the relevant case law on speech regulation on the Internet, while the second part discusses the arguments that have been formulated for a stricter regulation of hate speech online in the wake of the Planned Parenthood case.


Relevant Case Law


Reno v. ACLU: How to treat the new medium

In the United States, different media enjoy different levels of protection. Invasive media, media over which the receiver has only limited control as to what message he will receive, and media that are a "scarce expressive commodity" traditionally have been subjected to higher levels of government regulation. As the Internet became more and more popular, the question of what level of scrutiny this new medium should be subjected to gained relevance. The Supreme Court answered this question in Reno v. ACLU.267 In addition, this case also illustrates the constitutional obstacles that spring up when trying to regulate the diffusion of content on the Internet. In Reno, the Supreme Court struck down two sections of the 1996 Communications Decency Act (CDA) that sought to prohibit "the knowing transmission of obscene or indecent messages to any recipient of under eighteen years of age"268 and the "knowing sending or displaying of patently offensive messages in a manner that is available to a person under 18."269 In defending the constitutionality of these sections, the government relied on, among others, Renton v. Playtime Theatres.270 In that case, the court ruled that a zoning ordinance keeping adult theatres out of residential neighborhoods was not unconstitutional because it was not aimed at avoiding the dissemination of offensive speech, but instead was aimed at limiting the "secondary effects" (crime, deteriorating property values) that the theatres fostered.271 The government argued that a similar form of cyberzoning was exactly what the CDA was designed to achieve. The court rejected this argument, ruling that the CDA applied to the speech directly, and not to its secondary effects (as in Renton), and would apply to "the entire universe of cyberspace."272

267 521 U.S. 844 (1997).
268 Section 47 U.S.C.A. § 223(a).
269 Section 47 U.S.C.A. § 223(d).
270 475 U.S. 41 (1986).
271 521 U.S. 844 at 867.
272 521 U.S. 844 at 868.


The CDA may have been a poorly crafted piece of legislation that raised a myriad of First Amendment concerns, but even its successor, the Child Online Protection Act (COPA), has received injunctions against its enforcement by the courts.273 Even though it is considerably narrower274 than its predecessor, the courts have not been convinced that the age verification systems it would require (such as verifying age through credit cards) would not constitute a serious chill on protected speech. The COPA, like the CDA, has not been able to resolve the question of how to shield one group of people, who have no constitutional right to a certain type of speech, from that speech without infringing upon the First Amendment rights of those who do have a right to distribute and receive it. The same conundrum, albeit in a somewhat different form, appears in the international context all the time, when lawmakers try to apply local laws to a global medium (see infra).

The Supreme Court also stated that the Internet should receive the highest level of First Amendment protection. It stated that the medium of the Internet should be free from government regulation so that the free exchange of ideas would not be disrupted, allowing the medium to continue to expand:

The dramatic expansion of this new marketplace of ideas contradicts the factual basis of this contention [that the unrestricted availability of the speech at issue drives people away from the Internet]. The record demonstrates that the growth of the Internet has been and continues to be phenomenal. As a matter of constitutional tradition, in the absence of evidence to the contrary, we presume that governmental regulation of the content of speech is more likely to interfere with the free exchange of ideas than to encourage it. The interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship.275

273 Most recently, the Federal District Court in Pennsylvania rejected the COPA and argued instead for an approach in which parents and family would filter content as a constitutionally preferable alternative to the COPA. ACLU v. Gonzales, 478 F. Supp. 2d 775 (E.D. Pa. 2007).
274 The COPA was narrower in the amount of speech it covered (harmful to minors instead of indecent), the scope of speakers it applied to (only providers of commercial content), and the range of the Internet it applied to (not newsgroups or chat rooms).
275 521 U.S. 844 at 885.


This argument is a crucial one because it establishes that the reason the Internet should remain unregulated is that it is so unbridled and free. As will be discussed later, many of the arguments that hate speech should be regulated on the Internet focus precisely on the fact that the Internet is a free medium that is accessible to everyone. But this freedom and accessibility of the medium is precisely why the Supreme Court thinks the Internet ought not to be regulated. Any argument for more government regulation of the Internet that relies on the Internet being such a highly accessible and "free" medium seems doomed from the beginning in light of this ruling.


Online threats

Naturally, the Reno decision does not mean that every communication that takes place online is protected. Courts have applied the existing legal framework to Internet communication quite rigorously. What is illegal offline will generally also be illegal online. In United States v. Kammersell,276 the United States Court of Appeals for the Tenth Circuit upheld the conviction of a man under federal statute 18 U.S.C. § 875(c)277 for sending a bomb threat to his girlfriend's computer at work via AOL Instant Messenger. He hoped that this way she would be able to get home sooner, and they could go on a date.278 The defendant challenged the conviction because both he and his girlfriend were located in Utah, while Section 875(c) only applies to threats made in interstate or foreign commerce. However, the court ruled that since the instant message traveled to Virginia, where the servers of AOL are located, before coming back to Utah to reach his girlfriend's computer, the statute did apply.

276 196 F.3d 1137 (10th Cir. 1999).
277 "Whoever transmits in interstate or foreign commerce any communication containing any threat to kidnap any person or any threat to injure the person of another, shall be fined under this title or imprisoned not more than five years, or both."
278 196 F.3d 1137 at 1138.


In a number of cases, disseminators of hate speech have been successfully prosecuted under Title 18 U.S.C. § 245, a federal statute prohibiting interference with a federally protected activity (attending a public college) because of the race, color, religion or national origin of the victim. In United States v. Machado,279 Richard Machado, a 19-year-old student, was sentenced to a year in prison for sending threatening hate messages to 59 Asian students at the University of California at Irvine in September of 1996.280 In August of 1997, the U.S. Attorney for the Central District of California charged Machado with violating Title 18 U.S.C. § 245. A first trial in November of 1997 was declared a mistrial because of a hung jury. During the second trial, the prosecution stressed that the email had not been sent to an email list but to a group of individuals selected because they had Asian names, that the defendant had a history of sending email death threats, and that the death threats had had an impact on the lives of some of the recipients.281 Machado was sentenced to one year in prison and a $1,000 fine. Two years later, in United States v. Kingman Quon, an Asian American student who had sent threatening racist emails282 to Hispanic professors, employees and students at businesses and universities across the country pleaded guilty to charges of interfering with the federal rights of the students in violation of Title 18 U.S.C. § 245 and received a two-year prison sentence.283


279 No cases are filed on Lexis-Nexis; only a procedural appeal by Machado could be found there.
280 The email read: "Subject: FUck You Asian Shit Hey Stupid Fucker As you can see in the name, I hate Asians, including you. If it weren't for asias [sic] at UCI, it would be a much more popular campus. You are responsible for ALL the crimes that occur on campus. YOU are why I want you and your stupid ass comrades to get the fuck out ofUCI. [sic] IF you don't I will hunt you down and kill your stupid asses. Do you hear me? I personally will make it my life carreer [sic] to find and kill everyone one of you personally. OK?????? That's how determined I am. Get the fuck out, MOther FUcker (Asian Hater)". Chuck Huff, William J. Frey and Signe Land Levine, et al., "Machado Case History," computingcases.org.
281 Idem.
282 The messages expressed his hatred of Latinos, accused them of being "too stupid" for the university without affirmative action programs, and concluded that he intended to "come down and kill" them.
283 "Combating Extremism in Cyberspace," Anti-Defamation League (2000).


In 1999, a court in Pennsylvania entered an injunction against Ryan Wilson, who owned and controlled a white supremacist Web site.284 The order barred him from posting certain messages on the Internet285 and was the result of charges filed against Wilson for terroristic threats, harassment and ethnic intimidation.286 Wilson had posted various messages on his Web site directed at fair housing specialist Bonnie Jouhari, who had requested that local police take action against white supremacist activities. As a result, Jouhari was threatened and harassed. The harassment worsened when a picture of Jouhari appeared on Wilson's ALPHA HQ Web site, showing her holding a flier that depicted a hooded Klansman with a noose and read "Race Traitor Beware." The Web site also contained a photograph of Jouhari with a caption that read "This 'woman' works at the Reading-Berks Human Relations Council and has received warnings in the mail that she is a race traitor and she should beware."287 Later, the Web site described the woman's daughter as a "mongrel"288 and offered its readers instructions on how to make bombs. These instructions were posted directly below a picture of Jouhari's office.289 Another page on the site contained an image of her office blowing up.290 These messages were combined with real-world harassment, threatening phone calls and other forms of intimidation, causing the woman to move numerous times out of fear for her safety,291 but the harassment continued. She then successfully sued Wilson for intentional infliction of emotional distress. An administrative law judge ruled that Wilson had violated the Fair Housing Act by unleashing his terror campaign upon Jouhari and her daughter and that he had intentionally inflicted emotional distress upon them, and awarded the plaintiffs over a million dollars in damages.292 However, it is important to point out that the online threats did not stand alone and were accompanied by real-world threats and harassment.

284 United States Department of Housing and Urban Development v. Ryan Wilson, 2000 WL 988268.
285 Ibid., "Order" section.
286 Idem, "Parties and Background" section.
287 Ibid., findings of fact §9.
288 Ibid., findings of fact §15.
289 Ibid., findings of fact §16.
290 Ibid., findings of fact §19.
291 Ibid., findings of fact §65-70.
292 Ibid., discussion and subsidiary findings §17-28.


Without these aggravating circumstances, insults, threats and annoyances do not amount to extreme and outrageous conduct, which is one of the conditions for categorizing speech as inflicting emotional distress.293 Smith argues that the unique features of the Internet (as a "force multiplier") should be taken into consideration when assessing civil claims as a response to cyberharassment.294 Some states, such as California295 and Washington,296 have since passed anti-cyberstalking legislation. Calls for stricter laws on cyberbullying have grown louder in light of teenagers committing suicide after being bullied through social networks and other Internet applications. It remains to be seen, however, whether cyberbullying laws outside the school context could withstand constitutional challenges.297

The Jake Baker case


In United States v. Jake Baker and Arthur Gonda,298 a University of Michigan student successfully quashed the indictment that charged him with violating 18 U.S.C. § 875(c), the federal statute violated in Kammersell.299

293 Catherine E. Smith, "Intentional Infliction of Emotional Distress, an Old Arrow Targets the New Head of the Hate Hydra," 80 Denv. U.L. Rev. 1 (2002) at 40-41.
294 Ibid. at 24.
295 Cal. Penal Code § 646.9 (West 2007).
296 Wash. Rev. Code § 9.61.260 (2004).
297 Matthew C. Ruedy, "Repercussions of a MySpace Teen Suicide: Should Anti-Cyberbullying Laws Be Created?" 9 N.C. J.L. & Tech. 323 (2008) at 345.
298 890 F. Supp. 1375 (E.D. Mich. 1995).
299 "Whoever transmits in interstate or foreign commerce any communication containing any threat to kidnap any person or any threat to injure the person of another, shall be fined under this title or imprisoned not more than five years or both."
300 The original complaint was based on an FBI agent's affidavit citing language from a story Baker had posted on an Internet newsgroup that described in very graphic detail the abduction, rape and murder of a woman who was given the name of one of Baker's classmates. However, in the indictment that would ultimately be issued to charge Baker and Gonda, the story on which the initial complaint was based was not mentioned.


The indictment was the result of an email exchange300 Baker had had with "Arthur Gonda" (about whom it is known only that (s)he used an email account from Toronto, Canada, but whose identity and whereabouts have remained unknown), in which they expressed interest in committing acts of sexual and physical violence towards females.301 Judge Cohn adopted a test from United States v. Kelner302 to determine whether or not the emails constituted a "true threat."


301

Some excerpts of the emails were mentioned in the opinion and are pasted below, including spelling and grammatical errors: “I highly agree with the type of woman you like to hurt. You seem to have the same tastes I have. When you come down, this'll be fun!.…Also, I've been thinking. I want to do it to a really young girl first. !3 (sic) or 14. There innocence makes them so much more fun and they'll be easier to control. What do you think? I haven't read your entire mail yet. I've saved it to read later, in private. I'll try to write another short phantasy (sic) and send it. If not tomorrow, maybe by Monday. No promises.” (Dec 1, 1994.) “I would love to do a 13 or 14 year old. I think you are right. . .not only their innocence but their young bodies would really be fun to hurt. As far as being easier to control... you may be right, however you can control any bitch with rope and a gag. . once tey are tieed up and struggling we could do anything we want to them... to any girl. The trick is to be very careful in planning. I will keep my eye out for young girls, and relish the fantasy. . . BTW how about your neighbour at home, youm may get a chance to see her...?...?” (Dec 2, 1994.) “True. But young girls still turn me on more. Likely to be nice and tight. Oh.they'd scream nicely too!” Yeah. I didn't see her last time I was home. She might have moved. But she'd be a great catch. She's real pretty. with nice long legs. and a great girly face ... I'd love to make her cry ...” (dec 2, 1994.) “I just picked up Bllod Lust and have started to read it. I'll look for "Final Truth" tomorrow (payday). One of the things I've started doing is going back and re-reading earlier messages of yours. Each time I do. they turn me on more and more. I can't wait to see you in person. I've been trying to think of secluded spots. but my knowledge of Ann Arbor is mostly limited to the campus. I don't want any blood in my room, though I have come upon an excellent method to abduct a bitch ---“ “As I said before, my room is right across from the girl's bathroom. Wiat until late at night. grab her when she goes to unlock the dorr (sic). Knock her unconscious. and put her into one of those portable lockers (forget the word for it). or even a duffle bag. Then hurry her out to the car and take her away ... What do you think?” (December 9, 1994.) (890 F. Supp. 1375 at 1377-1378.) 302 534 F.2d 1020 (2d.Cir.).


The Kelner requirement defines a true threat as a "threat which on its face and in the circumstances in which it is made is unequivocal, unconditional, immediate and specific as to the person threatened, as to convey a gravity of purpose and imminent prospect of execution."303 The court ruled that Baker and Gonda's communication did not meet this standard. A key consideration for Judge Cohn was the fact that the indictment pertained to private email exchanges between Baker and Gonda, who had only intended and anticipated their messages to be read by each other, so the threats were not "of a coercive or extortionate nature."304 This limited the possibility of these messages being seen as true threats: "It would be patently unreasonable… to think that Baker's communications caused their only foreseeable recipient, Gonda, to fear violence, or caused him any disruption due to fear of violence."305 Given the fact that the alleged victims were not targets of the communication, the court stated that they only needed to be protected "from the possibility that threatened violence [would] occur."306 Judge Cohn firmly rejected the government's claims that the email exchanges between Baker and Gonda amounted to "a firm plan of action."307

Judge Cohn also acknowledged that in the online environment, people take on personalities that may differ from their real-life personalities. When analyzing the government's claims about the emails, he suggested that the government's equating of Baker's online persona with his real personality could raise concerns.308 Unfortunately, he did not address these concerns in depth because he dismissed the government's claims on different grounds. However, the notion that people play roles online that may be quite different from their real-life persona is one that has been studied by Internet scholars309 and could have implications for a legal analysis, as it seems to assume that speech online does not necessarily need to be seen as an expression by the real-life person behind the speech, but that such speech could perhaps be considered fictitious role playing. In his conclusion, Judge Cohn reiterated that the fact that these communications took place over email and were not posted on the Web was of crucial importance.

303 890 F. Supp. 1375 at 1385.
304 Ibid. at 1386.
305 Ibid. at 1385.
306 Idem.
307 Idem.
308 Ibid. at 1388.
309 Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (1997).


Baker's words were transmitted by means of the Internet, a relatively new communications medium that is itself currently the subject of much media attention. The Internet makes it possible with unprecedented ease to achieve world-wide distribution of material, like Baker's story, posted to its public areas. When used in such a fashion, the Internet may be likened to a newspaper with unlimited distribution and no locatable printing press--and with no supervising editorial control. But Baker's e-mail messages, on which the superseding indictment is based, were not publicly published but privately sent to Gonda. While new technology such as the Internet may complicate analysis and may sometimes require new or modified laws, it does not in this instance qualitatively change the analysis under the statute or under the First Amendment.

Judge Cohn seems to suggest here that had the indictment included the story that Baker had posted to a Usenet group, in which he used a student's name and which had been the source of the initial complaint, his analysis might have been more complex and would have had to take the specific nature of the Internet into consideration. In the end, though, the Baker case does not provide any new insights that are particularly relevant for the analysis of hate speech online. Had this exchange taken place via postal mail instead of email, the judge's analysis would likely not have been different (though in that case it is unlikely that this case would have received the publicity it did, or that there would even have been a case to begin with). It is doubtful that any kind of email communication could be so "unequivocal, unconditional, immediate and specific as to the person threatened" as to qualify as a true threat,310 at least when, as was the case here, the email is not addressed to the victim of the threat. On appeal, the U.S. Court of Appeals for the Sixth Circuit,311 addressing in greater detail the question whether or not the email exchanges constituted a true threat and agreeing that this was not the case,312 upheld the district court's ruling.

310 Genelle Irene Belmas, Cyberliberties and Cyberlaw: A New Approach to Online Legal Problem-Solving (2002).
311 United States v. Alkhabaz, 104 F.3d 1492 (6th Cir. 1997).
312 "Even if a reasonable person would take the communications between Baker and Gonda as serious expressions of an intention to inflict bodily harm, no reasonable person would perceive such communications as being conveyed to effect some change or achieve some goal through intimidation. Quite the opposite, Baker and Gonda apparently sent e-mail messages to each other in an attempt to foster a friendship based on shared sexual fantasies." (104 F.3d 1492 at 1496.)


Planned Parenthood of Columbia/Willamette v. American Coalition of Life Activists

In 1995 and 1996, the American Coalition of Life Activists (ACLA), a coalition of anti-abortion groups that had split off from another pro-life movement because it wanted to espouse a more "pro force"313 approach, published several posters314 that listed the names and addresses, and in some instances photographs, of doctors who performed abortions and declared them to be guilty of crimes against humanity.315 The posters circulated widely in pro-life circles and publications.316 Very similar posters had been circulated before the shootings of abortion doctors in 1993 and 1994.317 After the posters were released, the FBI offered protection to the abortion doctors listed on them and advised them to wear bulletproof vests.318 In addition, the ACLA also compiled a series of "dossiers" on abortion doctors, abortion clinic employees, politicians and abortion rights supporters.319 These dossiers were dubbed the "Nuremberg Files," in reference to the location of the post-war Nazi trials, and were made available online in January of 1997. The ACLA argued that these files could be used as evidence once the nation turned against abortion and Nuremberg-like trials could be held to hold abortion providers accountable for their alleged crimes. On the site, the names of the doctors who had been killed by anti-abortion activists were crossed out and the names of the doctors who had been wounded in attacks were grayed out.

313 Planned Parenthood of the Columbia/Willamette, Inc. v. American Coalition of Life Activists, 290 F.3d 1058 (9th Cir. 2002) (en banc) [Planned Parenthood III].
314 Planned Parenthood of the Columbia/Willamette, Inc. v. American Coalition of Life Activists, 244 F.3d 1007, 1012 (9th Cir. 2000) [Planned Parenthood II].
315 Planned Parenthood III at 1064-1065.
316 Ibid. at 1065.
317 Ibid. at 1064.
318 Ibid. at 1065.
319 Planned Parenthood II at 1012.


In addition to taking extra security measures, the doctors sued the ACLA and affiliated organizations, arguing, among other things, that these posters and Web site violated the Freedom of Access to Clinic Entrances Act of 1994 (FACE).320 The district court agreed with the doctors and found that the Web site constituted a true threat.321 The definition used by the district court for a true threat under FACE was "when the person makes a statement that, in context, a reasonable listener would interpret as communicating a serious expression of an intent to inflict or cause serious harm on or to the listener (objective); and the speaker intended that the statement be taken as a threat that would serve to place the listener in fear for his or her personal safety, regardless of whether the speaker actually intended to carry out the threat (subjective)."322 After ACLA's motion for summary judgment had been denied,323 a jury returned a verdict in favor of the physicians and awarded them $107 million in actual and punitive damages; the court also enjoined the ACLA from publishing any more materials with the intent to threaten the plaintiffs.324

On appeal, a panel of the Ninth Circuit Court of Appeals reversed, holding that the posters and the Web site did not amount to a true threat. Relying on NAACP v. Claiborne Hardware Co.,325 the panel argued that the ACLA was in the role of a political activist, that it tried to convince people of its point of view, but that it had not directly authorized or threatened acts of violence.326 In Claiborne, Charles Evers, an NAACP activist, had stated during a 1966 rally what would happen to anyone patronizing stores that were being boycotted by the NAACP: "We're gonna break your damn neck."327 The Supreme Court found that this speech "did not transcend the bounds of protected speech set forth in Brandenburg"328 because there was no evidence that the speaker had "authorized, ratified, or directly threatened acts of violence."329

320 18 U.S.C. § 248.
321 Planned Parenthood of the Columbia/Willamette, Inc. v. American Coalition of Life Activists, 41 F. Supp. 2d 1130 (D. Or. 1999) [Planned Parenthood I].
322 Planned Parenthood I at note 1.
323 23 F. Supp. 2d 1182 (D. Or. 1998).
324 Planned Parenthood II at 1013.
325 458 U.S. 886 (1982).
326 Planned Parenthood II at 1016.
327 458 U.S. 886 at 903.
328 Ibid. at 928.
329 Ibid. at 929.


The Ninth Circuit panel argued that the speakers could not be held liable for the actions of third parties: "The jury would be entitled to hold defendants liable if it understood the statements as expressing their intention to assault the doctors but not if it understood the statements as merely encouraging or making it more likely that others would do so,"330 and argued that the lower court's threat standard was ambiguous on this point.331 The panel argued that the mere fact that the posters and Web site made it more likely that the doctors would be attacked did not place this speech outside the protection provided by the First Amendment. The court stated that "political speech may not be punished just because it makes it more likely that someone will be harmed at some unknown time in the future by an unrelated third party."332

Rehearing the case en banc, however, the court reversed. After arguing that Claiborne Hardware did not apply in this case (because Claiborne did not arise under a threat statute and the context of violence was not the same in both cases), the court stated that a true threat analysis as put forward in United States v. Watts333 was the relevant inquiry.334 In Watts, the test applied to determine whether a statement is a true threat was "whether a reasonable person would foresee that the statement would be interpreted by those to whom the maker communicates the statement as a serious expression of intent to harm or assault."335 The majority argued that context was of crucial importance when considering statutes proscribing threats.336 Analyzing the context of violence in which the posters and Web site had appeared, the court ruled that they went well beyond political speech. As for the Web site, the court argued that merely listing the names of abortion supporters was not a threat, but singling out the category of doctors performing abortions as "ABORTIONISTS," and scoring the killed and wounded abortion providers, made this a hit list.337 For the majority, this part of the site, together with the posters, amounted to true threats.

330 Planned Parenthood II at 1016.
331 Idem.
332 Ibid. at 1015.
333 394 U.S. 705 (1969).
334 Planned Parenthood III at 1074-1075.
335 Ibid. at 1074.
336 Ibid. at 1074-75.
337 Ibid. at 1088.


The en banc rehearing marked a sharp departure from the panel decision. Some have argued that the events of September 11, 2001 may account for this shift.338 Since the terrorist attacks, speech that could encourage unrelated third parties to commit acts of terror may enjoy less protection than at the time the panel reached its verdict. Smith made a similar argument in her discussion of the ALPHA HQ case: “[T]he tragic events of that day have changed American's [sic] threshold for what is possible in the name of extreme political resistance. Prior to September 11th, such violence may not have been a reality to the average American. Today, however, an exploding image of Bonnie Jouhari's office, as a call to violence to hundreds or even thousands of unknown individuals who visit the Web site, will not rest in the minds of American citizens as mere folly.”339

Crucial points


There are a number of important differences between the reasoning of the majority in the en banc rehearing on the one hand and the reasoning applied by the panel decision and the dissents in the en banc decision on the other. Some of the main points at issue and their relevance to the topic of this work are discussed below.

Applicability of Claiborne

The applicability of Claiborne to this case was of crucial importance in determining whether the ACLA's posters and Web site were to be considered threats or incitement. As will be discussed below, it is very unlikely that the posters and Web site could have been successfully prosecuted under the incitement doctrine. The similarities between Claiborne and Planned Parenthood seem obvious at first sight. In both cases, there was a context of violence in which the speaker made statements on an issue of high public importance that could be interpreted as an endorsement of violence against certain people. But the majority in the en banc rehearing went to great lengths to distinguish Claiborne from Planned Parenthood.

338 "Intelligence Report: Hit List or Free Speech," Southern Poverty Law Center (2002).
339 Catherine E. Smith, "Intentional Infliction of Emotional Distress, an Old Arrow Targets the New Head of the Hate Hydra," 80 Denv. U.L. Rev. 1 (2002) at 44.


The court ruled that Claiborne did not apply to this case, first of all because Claiborne did not arise under a threat statute,340 but also because Evers's rhetoric was "surrounded by statements supporting non-violent action, and primarily of the social ostracism sort."341 The court pointed out that no one associated with the NAACP had broken anyone's neck and that none of the boycott breakers had taken the threat that their necks would be broken seriously.342 This reading of the facts is forcefully rejected in the dissent written by Judge Kozinski, who argued that the exact opposite is true. The speech in Claiborne, Kozinski wrote, is more threatening than in Planned Parenthood: Evers's speech explicitly threatened violence, whereas the ACLA never endorsed violence. Moreover, Evers's speech was actually immediately followed by violent attacks against boycott breakers, which was not the case in Planned Parenthood, where none of the doctors on the posters or Web site had been the victim of violence following the publication of their names in the Nuremberg Files or on any of the posters.343

Context

The court in the en banc rehearing agreed that, when considered in isolation, the statements of the ACLA did not amount to true threats, but that given the context of anti-abortion violence in which these statements were made, they rose to the level of true threats. After all, abortion doctors had been killed in the past after similar posters had been distributed. Judge Berzon, in her dissent, disputes the assumption that the context in which speech occurs can render otherwise protected speech proscribable,344 especially when that context is not of the speaker's making.345 Some of the most contentious and divisive issues in a society are often debated against a backdrop of violence (anti-war protests, anti-globalization protests). Using this backdrop of violence, for which the speaker does not bear any direct responsibility, as an argument to restrict his speech could indeed place a heavy burden upon speech.

340 Planned Parenthood III at 1073.
341 Idem.
342 Ibid. at 1074.
343 Ibid. at 1096.
344 Ibid. at 1103.
345 On the other hand, context is certainly not a foreign notion when assessing threats; the context in which the speech is made is taken into consideration to determine whether a threat is a "true threat" or mere hyperbole.


Public-private distinction

The court only briefly discussed the fact that the statements by the ACLA were made in public and were not specifically directed towards the alleged victims: "Neither do we agree that threatening speech made in public is entitled to heightened constitutional protection just because it is communicated publicly rather than privately. As Madsen indicates, threats are unprotected by the First Amendment 'however communicated.'"346 The majority opinion in the en banc rehearing argued that the speech was "publicly distributed but privately targeted," so even though the posters and Web site seem to address the population at large, the court ruled that in fact they were addressed to the doctors mentioned. However, as Judge Kozinski argues in his dissent, this was also the case in Claiborne, where names of boycott breakers were recorded and published.347

Conclusion

In her conclusion, dissenting Judge Berzon provides an apt summary of the main points that arose throughout the dissenting opinions:


But the defendants have not murdered anyone, and for all the reasons I have discussed, neither their advocacy of doing so nor the posters and website they published crossed the line into unprotected speech. If we are not willing to provide stringent First Amendment protection and a fair trial to those with whom we as a society disagree as well as those with whom we agree --as the Supreme Court did when it struck down the conviction of members of the Ku Klux Klan for their racist, violence-condoning speech in Brandenburg-- the First Amendment will become a dead letter. Moreover, the next protest group --which may be a new civil rights movement or another group eventually vindicated by acceptance of their goals by society at large-- will (unless we cease fulfilling our obligation as judges to be evenhanded) be censored according to the rules applied to the last. I do not believe that the defendants' speech here, on this record and given two major erroneous evidentiary rulings, crossed the line into unprotected speech. I therefore dissent.348

346 Planned Parenthood III at 1076.
347 Ibid. at 1097.


Just as in Virginia v. Black, where the Supreme Court looked at cross burning cases as threats rather than as fighting words, the Ninth Circuit Court of Appeals went to great lengths to use a threat analysis rather than an incitement analysis. It would be an exaggeration to label this a shift in First Amendment doctrine, but it does show that courts are "increasingly comfortable with laws of general applicability that happen to infringe on speech-related activities,"349 as Ronald Collins, a scholar at the Freedom Forum, claims. The ACLA did not address the doctors directly; the people charged were not responsible for the context of violence that led the majority to consider their statements threats; the ACLA made no explicit calls for violence or law breaking; its speech was public and on a matter of public concern; and it did not directly lead to any acts of violence. But it undeniably caused a great deal of distress and justifiable fear among the plaintiffs, and it was certainly not unfathomable that the posters and the Web site made it more likely that acts of violence would be committed against the physicians, given the existing context of violence at the time. However, had this context of violence not been present, this speech would not have been considered a true threat. In that sense, the rationale in Planned Parenthood is viewpoint neutral. For example, if a racist Web site contained a list of interracial couples in a community, labeling them as race traitors and publishing their addresses, Planned Parenthood seems to suggest that this would be protected speech as long as there is no context of violence. As Hammack correctly observed, the Planned Parenthood decision, though often touted as the first big case dealing with Internet hate speech, contained no real discussion of the role of the Internet as a mode of communication and how it contributed to the doctors' fear.350

348 Ibid. at 1121.
349 Tony Mauro, "It's a Mad, Mad, Mad, Mad Court. Justices Upended Expectations in 2002-2003 Term," Texas Lawyer, July 7, 2003, p. 18.
350 Scott Hammack, "The Internet Loophole: Why Threatening Speech on-Line Requires a Modification of the Courts' Approach to True Threats and Incitement," 36 Colum. J.L. & Soc. Probs. 65 (2002) at 91.


Also, the speech at issue in this case may not be considered hate speech by many. Yet this case has inspired numerous legal scholars to address the question of whether or not the Internet as a medium necessitates a rethinking of traditional hate speech law. The section below discusses how the debate over this case has sparked calls for a different approach to hate speech online.

Discussion


Applicability of the Brandenburg standard to online communications

While Planned Parenthood should be distinguished from Brandenburg because it does not deal with incitement but rather with threats,351 Schlosberg uses the Planned Parenthood case to speculate about how the Brandenburg test could be applied to Web communications, and argues that these types of communications probably cannot fulfill the imminence requirement: "it is difficult to imagine that communications on the Web could meet the Brandenberg [sic] test of illegality. The test, originally created to allow legislators to criminalize speech that would incite a riot, does not seem applicable to cyberspace where riots are nearly impossible."352 Had the court in Planned Parenthood considered the case as an incitement case, it is indeed unlikely that the imminence requirement would have been fulfilled. Brandenburg allows legislators to ban speech that would incite a riot (and cause public disorder), which indeed is not likely to happen on the Internet.353 Brandenburg assumes that speakers and hearers are in the same physical environment, which is generally not the case in the context of the Internet. Schlosberg argues that because the threatening communications were made via a Web site (as opposed to face-to-face or via email), the more relevant law to apply was the prohibition of speech that incites imminent lawless action, under which the speech of the ACLA would have been protected.

351 For a detailed analysis of the differences between incitement and threats, see: Jennifer Elrod, "Expressive Activity, True Threats, and the First Amendment," 36 Conn. L. Rev. 541 (2004) at 565-574.
352 Jason Schlosberg, "Judgment on Nuremberg: An Analysis of Free Speech and Anti-Abortion Threats Made on the Internet," 7 B.U. J. Sci. & Tech. L. 52 (2001) at 75.
353 Idem.


As Jennifer Elrod observed: "It is likely that advocacy of illegal action most often occurs when the speaker communicates face-to-face with those individuals who are going to carry out the particular action."354


Civil liability and the Brandenburg test

Because the imminence requirement of the Brandenburg test is difficult to meet online, some have argued that imminence should no longer be required to determine whether or not an Internet communication amounts to incitement, at least when assessing civil liability. Tiffany Komasara355 discusses hate-advocating Web sites and argues that in many instances, the context of violence that led the court to label the Nuremberg Files as threats will not be present.356 She proposes a legal remedy to punish hate speech absent such a context of violence. She cites Rice v. Paladin Enterprises357 as an example of how conveyors of speech that drives other people to break the law can be held civilly liable. In Rice, the United States Court of Appeals for the Fourth Circuit held that the First Amendment did not pose a bar to holding the publishers of Hit Man, a detailed manual describing how to commit murders for a living, civilly liable for the murders committed by one of the manual's readers.358 In doing so, it reversed the Maryland District Court's granting of Paladin's motion for summary judgment on First Amendment grounds.359 The court argued that the First Amendment does not protect aiding and abetting another in the commission of a criminal offense.360 It ruled that Brandenburg did not apply to this kind of speech; Brandenburg applies to advocacy speech and protects the abstract teaching of principles, but not necessarily the "mere teaching."361

354 Jennifer Elrod, "Expressive Activity, True Threats, and the First Amendment," 36 Conn. L. Rev. 541 (2004) at 567.
355 Tiffany Komasara, "Planting the Seeds of Hatred, Why Imminence should no Longer be Required to Impose Liability on Internet Communication," 29 Cap. U.L. Rev. 835 (2002).
356 Ibid. at 847.
357 128 F.3d 233 (4th Cir. 1997) at 243.
358 Ibid. at 248.
359 Tiffany Komasara, "Planting the Seeds of Hatred, Why Imminence should no Longer be Required to Impose Liability on Internet Communication," 29 Cap. U.L. Rev. 835 (2002) at 849.
360 128 F.3d 233 at 245-246.
361 Ibid. at 243.


Komasara argues that for certain Internet hate sites, there should no longer be an imminence requirement for imposing civil liability. She states that this would still allow freedom of speech, since civil liability could only occur if and when a crime is actually committed. The characteristics of the Internet justify such an approach, according to Komasara: "Society benefits from imposing civil liability for Internet communication made without consideration of the possible impact those communications might have. Free speech can become too free when coupled with the anonymous, far-reaching power of the Internet."362 Such an approach, she argues, would take into account "the content of a message, the intent of the sender and the likelihood that the communication will result in harm (which becomes painfully clear in cases where injury has already occurred.)"363

But the analogy between the very specific step-by-step guidelines for murder at issue in Rice and the posters and Nuremberg Files Web site is not convincing. Had the Nuremberg Files provided maps of the physicians' houses, information about how to circumvent alarm systems, or pointed out spots from which a gunman could take aim at people inside the house, the analogy would have been more adequate. In Rice, the court stated:

Paladin aided and abetted the murders at issue through the quintessential speech act of providing step-by-step instructions for murder (replete with photographs, diagrams, and narration) so comprehensive and detailed that it is as if the instructor were literally present with the would-be murderer not only in the preparation and planning, but in the actual commission of, and follow-up to, the murder; there is not even a hint that the aid was provided in the form of speech that might constitute abstract advocacy.364

362 Tiffany Komasara, "Planting the Seeds of Hatred, Why Imminence should no Longer be Required to Impose Liability on Internet Communication," 29 Cap. U.L. Rev. 835 (2002) at 853.
363 Ibid. at 854.
364 128 F.3d 233 at 249.


There also is no evidence that any violence had been committed against any of the physicians who filed the complaint after their names appeared on the Web site,365 so the standard designed by Komasara would not apply to the materials at issue in Planned Parenthood. The strategy inspiring this approach is that the threat of civil liability, should someone use the posted information in attempting a murder, will make publishers think twice before making similar materials available to the public. But, as Komasara concedes: "Even civil liability may not be enough to stop Web sites like the Nuremberg Files, whose composers are so zealous in their beliefs."366

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Dropping the intent requirement for online threats Topper367 takes direct issue with Schlosberg’s argument that because the Web is not considered a direct form of communication, messages posted on a Web site should not be considered threats because there may be little reason to expect that the targets will actually receive the threat.368 She argues that because the Internet amplifies messages and makes them available to more people, there are more people who may act upon these messages, making the fear and threats created by them more grave. As opposed to other media, the Internet makes threatening information (such as the info on the Nuremberg Files site) available to millions of people cheaply and instantly, increasing their threatening nature, Topper argues.369 She disagrees with the notion that the power of the Web is inconsequential because people have to seek out information.370 Just because of the fact that the Internet makes information available to individuals who care enough to look this information up, this information is more threatening, Topper maintains. In fact, according to Topper, all Internet communication is face-to-face:

365 Planned Parenthood III at 1096.
366 Tiffany Komasara, "Planting the Seeds of Hatred, Why Imminence should no Longer be Required to Impose Liability on Internet Communication," 29 Cap. U.L. Rev. 835 (2002) at 853.
367 Prana A. Topper, "The Threatening Internet: Planned Parenthood v. ACLA and a Context-Based Approach to Internet Threats," 33 Colum. Human Rights L. Rev. 191 (2001).
368 Ibid. at 230.
369 Ibid. at 228.
370 Ibid. at 229.


Under traditional modes of communication, it would be unlikely that a person who published a pamphlet in Georgia allegedly threatening someone in Buffalo would be found to have issued an actionable "true threat." That is, it is unlikely that a reasonable person would find that an individual in Buffalo would take this statement as a serious expression of a threat. Yet, if the person in Georgia were to post this same pamphlet on the Internet, it is possible that individuals in Buffalo would view the alleged threat and there is an increased likelihood that they would act on it. It is also possible that the allegedly threatened individual will come face-to-face with this Internet posting and endure all of the fear and trepidation that accompanies viewing the web site even from the privacy of his/her own home. Thus, the unique features of the Internet facilitate the possibility of face-to-face communication as much as they do public communication.371

Topper's article brings up interesting questions about the role of the Internet in disseminating hate, and one can only regret that the court in Planned Parenthood did not take the opportunity to address some of these issues.372 Topper makes a strong claim that the Internet should make us rethink the traditional speech-friendly regime that has been applied to hate speech. The core of her argument is that these communications become more threatening because they reach more people. This is an argument that would be true when applied to incitement, but less so for threats. If one accepts the notion that posting the doctors' addresses on the Internet automatically made it more likely that violence would be committed against them, it does follow that this information must be considered more threatening. However, this increased threat is based upon the fact that a third party, unrelated to the speaker, will carry out the alleged threat.

371 Ibid. at 229-230. This argument seems to rest on a somewhat unconventional reading of "face-to-face," which Topper reads as the victim coming face-to-face with the threatening message, while the more conventional reading is that the victim comes face-to-face with the speaker of the speech.
372 It is important to point out that the court in fact came down harder on the posters than on the Web site, of which only a portion was ruled to amount to a "true threat."


The task Topper set out to fulfill, and arguably the majority in Planned Parenthood as well, is finding a way to deal with what Hammack called a "clever inciter," a smart person who is able to avoid punishment by inciting lawless action without specifically advocating this type of action.373 The Internet may have made the task of these clever inciters easier:

The unique characteristics of the Internet blur the distinction between threats and incitement by allowing speakers to threaten by incitement --that is, creating fear by increasing the likelihood of ensuing violence without actually threatening to carry out the violence themselves. A threat in certain contexts may not cause its recipient to be fearful if its occurrence seems very unlikely. However, that same threat masquerading as incitement will generate reasonable fear if it is particularly likely to provoke a third party to carry out the threatened act. The Internet facilitates these threat/incitement hybrids by making them more likely to cause the act they seek to bring about.374

In this case, the clever inciter suggested that the official purpose of the Web site was to keep a record of abortion providers so they could be held accountable once the nation turned against abortion, but the facts seem to support the notion that this was nothing less than a thinly veiled attempt at intimidation. But going after clever inciters using a threat analysis would dramatically alter the understanding of what constitutes a threat. It would no longer be required that a threat is directed at someone; messages would be considered threatening because they create fear based on the likelihood that the violence allegedly advocated in them will occur, even if the speaker is not connected to this violence. It also means that the more "public" speech is, the more people it reaches, the more threatening it is. Topper focuses mainly on the effects of the speech (the effect the statements had on the doctors) and takes this rationale to its logical end point in stating that the intent requirement should be dropped altogether when assessing threatening speech.

373 Scott Hammack, "The Internet Loophole: Why Threatening Speech on-Line Requires a Modification of the Courts' Approach to True Threats and Incitement," 36 Colum. J.L. & Soc. Probs. 65 (2002) at 72.
374 Ibid. at 67.

She proposes a context-based test to consider whether or not a statement is a threat based solely on the reasonable perception of the targeted individuals that a given statement is threatening, without taking into account whether or not the speaker intended his statement to be threatening.375

[T]he Internet has a potential to magnify the threatening nature of communications, increasing the future harm that can result from statement reasonably interpreted to be a threat, regardless of the specific intent of those who make it. It is therefore appropriate to evaluate such communications according to the reasonable perception of the targeted individual as opposed to the specific intent of the person making the threatening communications.376

In making this assessment, the fact that the Internet magnifies threats should be taken into consideration, as context is an important factor in the standard proposed by Topper.

Towards a new test to assess online threats and incitement

Hammack377 also addresses the clever inciter dilemma and argues that a new approach should be followed when assessing threatening speech online of the kind at issue in Planned Parenthood. He gives four reasons why the Internet amplifies threats in a way that necessitates a new legal approach. First of all, the larger transient and more widespread audience gives threats greater resonance.378

375 Prana A. Topper, "The Threatening Internet: Planned Parenthood v. ACLA and a Context-Based Approach to Internet Threats," 33 Colum. Human Rights L. Rev. 191 (2001) at 234.
376 Ibid. at 233-234.
377 Scott Hammack, "The Internet Loophole: Why Threatening Speech on-Line Requires a Modification of the Courts' Approach to True Threats and Incitement," 36 Colum. J.L. & Soc. Probs. 65 (2002).
378 Ibid. at 81-83.

He argues that the Internet is not the free marketplace of ideas that Brandeis had in mind in his Whitney dissent in which he argued that speech should not be restricted as long as it can be countered by more speech:

On the Internet, such a discussion is frequently impossible because of the inability to ensure that the public has access to both sides of the debate. For example, if an anti-Semitic web site publishes falsehoods slandering Jews, visitors to that site would be unlikely to visit a Jewish organization's web site refuting the anti-Semitic speech. Unlike the audience for television, newspapers, or even books, the on-line audience is widely scattered, making it very difficult to identify and track down.379

This argument is somewhat beside the point, since it has little relevance to the issue of threats, but is interesting nevertheless because it suggests that traditional First Amendment rationales do not apply to the Internet. However, it is not convincing. While it is true that it cannot be guaranteed that people visiting a hate site will also visit an anti-hate site, this also is not the case with traditional media. People subscribing to neo-Nazi newsletters are not very likely to subscribe also to publications of the Simon Wiesenthal Center. The marketplace of ideas ideal does not mandate that every single audience member hears both sides of an issue, only that all viewpoints have a chance to be heard. The Internet, more than any other medium, because it is so accessible, has the ability to provide a platform for all kinds of opinions. Secondly, Hammack argues that the speed and ease of communicating magnifies the impact of a threat. "Once the message is out in cyberspace, it is often impossible to delete and may continue to incite readers long after the speaker has moderated her position. As a result, the imminence of the speech makes it more likely that lawless action will ensue."380 The "imminence" Hammack refers to is the ease of Internet communication. In many instances, there is little or no editorial control, and everyone in the possession of a computer with an Internet connection can post his thoughts, rants and ravings. Especially in the context of libel and invasion of privacy this is important, as this lack of editorial control and self-restraint may lead to online communicators being faced with defamation and invasion of privacy suits.

379 Ibid. at 82.
380 Ibid. at 83.

So Hammack points out that it is more likely that the Internet will be used to convey threats when he states that "the imminence of the speech makes it more likely that lawless action will ensue,"381 but it is not entirely clear how this increased likelihood changes the nature of the threat so fundamentally that Internet threats need to be treated differently. Hammack also argues that the fact that people can oftentimes utter threats under a cloak of anonymity makes it more likely that they will engage in threatening behavior and that their threats will be more intimidating.382 The Internet can indeed provide some anonymity, but other media can also provide anonymity; letters, phone calls and graffiti can all be used to make anonymous threats. In this respect, the Internet does not really seem to provide new challenges. Hammack finally also stresses the low cost of access, which removes a "market check"383 on content. Whereas publishers and television cannot engage in hateful speech because of economic reasons and market pressures, individuals do not face this pressure, and this "allows publication of speech unfettered from concerns about profits or lawsuits."384 Again, this is a feature of the Internet that may make it more likely that the Internet will be used for threatening speech, but it does not really change the nature of the threats.

The Internet as amplifier

What all the above observers argue and agree on is that the Internet, because of its unique features, mandates a separate standard for assessing threats or incitement. However, in Planned Parenthood, the court never really gave special consideration to the fact that the Nuremberg Files were published online. It applied the same rationale to the off line posters as to the Web site. For the court, the off line posters were as threatening as the Nuremberg Files Web site. One of the issues that remains unclear when reading the articles of those who argue for a different standard for online communication is whether or not they think that the court erred in ruling that the off line publications constituted threats, because they were lacking all these amplifying factors that make threats or hate speech on the Internet so much more powerful and dangerous.

381 Idem.
382 Ibid. at 83-84.
383 Ibid. at 85.
384 Idem.

The courts in Planned Parenthood did not treat online threats or incitement differently than off line threats nor did they take the Internet as a medium385 into consideration when assessing the context in which the speech occurred. The notion that the Nuremberg Files Web site was more threatening than the off line posters because it reached a potential global audience, making it more likely that somebody would take a gun and kill a doctor, seems to be one that was not shared by the court,386 and one that perhaps overstates the importance and danger of the Internet. The posters were widely distributed during anti-abortion rallies and were published in prolife literature, probably reaching most of the people who would be inclined to commit acts of violence. It has been argued that the online statements were more threatening because it was more likely that they would be seen by the targets of the threats, which is one of the arguments of those favoring stricter scrutiny for on line threats. But is that really so? The chance that the doctors might come across the posters or fliers distributed near their hospitals is not necessarily smaller than the chance that they would come across the Nuremberg Files Web site among the billions of pages on the Internet. However, the threat and danger of Web sites such as the Nuremberg Files should not be underestimated. After the doctors had filed their suit, on October 25, 1998, Dr. Barnett Slepian was killed by a sniper in his house in Buffalo.387

385 The court did acknowledge though that the medium used to deliver a threat is an important consideration to assess context, but not in the context of the Internet. (Planned Parenthood III at 1078.) For example, parking Ryder trucks outside an abortion clinic is threatening because of the reference that it evokes to the Oklahoma City bombing, where the explosives were placed in a Ryder truck parked outside the building. The court used this example to justify the banning of the wanted posters because similar posters had been distributed before killings of doctors in previous years.
386 In fact, the part of the Web site that merely listed the names of hundreds of people, including judges and pro-choice politicians, was found to be protected speech. Specifically singling out "Abortionists" and listing the names of those who provide abortion services and the crossing out and highlighting the names of those doctors who were killed and wounded, however, was considered to cross the line to threatening speech. In that respect, the Web site received a milder treatment than the posters, in which the court did not find any speech worthy of protection.

Shortly thereafter, his name was crossed off on the Nuremberg Files, leaving many wondering about the causal connection between the site and the murder.388 Determining the strength of this connection is a very difficult, speculative exercise. Slepian was a high-profile abortion doctor who had clashed with abortion protesters on numerous occasions, dating back to 1988.389 The killer, James Kopp, was from Vermont, so it is conceivable that, without the Internet, he would not have been aware of Slepian's address, but this is by no means certain.

Other Voices in the Debate

The authors discussed in the previous section all try to argue from within a First Amendment perspective that the Internet mandates a rethinking of hate speech regulation. They try to argue that the characteristics of the Internet medium launch certain types of speech outside the protective scope of the First Amendment, but they do not really challenge the basic tenets of First Amendment doctrine as such. However, some scholars have argued for restrictions of hate speech online because they do not agree with this First Amendment paradigm to begin with. Tsesis,390 for example, argues for a stricter approach to hate speech online, but his arguments are not based on specific characteristics of the Internet, but on a criticism of traditional First Amendment doctrine in general. He criticizes the "imminent threat of harm" requirement as being too restrictive, dismisses the marketplace of ideas concept because it is not able to filter out hate speech and he also opposes the content neutrality requirement as expressed by Justice Scalia in R.A.V. Given this starting point, it is no surprise that he also proposes a ban on online hate speech. Burch likewise does not think that hate speech ought to be protected to begin with, as he sees it as conflicting with the Fourteenth Amendment.391

388 Melanie C. Hagan, "The Freedom of Access to Clinic Entrances Act and the Nuremberg Files Web Site: Is the Site Properly Prohibited or Protected Speech?" 51 Hastings L.J. 411 (2000) at 414.
389 Jim Yardley and David Rohde, "Abortion Doctor in Buffalo Slain; Sniper Attack Fits Violent Pattern," The New York Times, October 25, 1998, p. 1.
390 See: Alexander Tsesis, Destructive Messages: How Hate Speech Paves the Way for Harmful Social Movements (2002).
391 Edgar Burch, "Censoring Hate Speech in Cyberspace: A New Debate in a New America," 3 N.C. J.L. & Tech. 175 (2001) at 179.

Taking this as his starting position, he sees the "boundless capacity"392 of the Internet as creating more problems. He argues for regulations similar to those for television and radio, because "the increased use of the Internet coupled with the speed and anonymity that the Internet encompasses seems to make it much more invasive than a radio broadcast."393 However, this seems to run contrary to the holding of the Supreme Court in Reno, in which it found that the Internet is not as "invasive" as radio or television. The Court stated on that occasion that "communications over the Internet do not 'invade' an individual's home or appear on one's computer screen unbidden. Users seldom encounter content 'by accident.'"394 It also found that "almost all sexually explicit images are preceded by warnings as to the content," and cited testimony that "'odds are slim' that a user would come across a sexually explicit site by accident."395 While hate speech may often times not be labeled as clearly as is the case with some obscene speech, it is hardly the case that hate speech portals are the default home pages of Internet browsers. This point is also made by Weintraub-Reiter, a lone voice of opposition to more speech regulation on the Internet: "while a child can be bombarded with a sexually explicit or offensive television program by merely turning the television on, no such surprise can occur through the Internet."396 Weintraub-Reiter also notes that because of the nature of the Internet, the societal costs of hate speech may be higher, but argues that there are some additional arguments against regulating hate speech on the Internet.397 Apart from the traditional arguments that censorship is dangerous and that a free flow of ideas is crucial for democracy, she also argues that the nature of hate speech on the Internet inhibits regulation because it would necessarily be vague: "hate speech's inherent subjective and emotional nature poses a barrier to defining it. Some people may find certain speech hateful and offensive while others may find the same speech so preposterous that it is not offensive.

392 Idem.
393 Ibid. at 191.
394 521 U.S. 844 at 869 (1997) (quoting ACLU v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996) at 844).
395 Idem.
396 Rachel Weintraub-Reiter, "Hate Speech over the Internet, a Traditional Constitutional Analysis or a New Cyber-Constitution?" 8 B.U. Pub. Int. L.J. 145 (1998) at 165.
397 Ibid. at 161-165.

Therefore, while hate speech could be defined in general terms to encompass speech that could be found offensive, this generality would render the statute unconstitutionally vague."398 This vagueness argument is not specific to hate speech on the Internet, but applies to any hate speech regulation, online or offline. Marsh argues that Internet speech is qualitatively different from other speech, and that therefore speech on the Internet should be regulated more strictly than speech communicated through other media.399 She uses familiar arguments in pointing out that the Internet has a vast scope, that one can reach millions with the click of a mouse, that it is anonymous, that it presents privacy issues, that it allows for various forms of communication (emails, pop ups, banners, Web sites) and that Web sites are hard to shut down.400 She claims that these characteristics make hate speech on the Internet much more powerful than its off line variant, and that it therefore needs to be regulated more strictly. However, as we discussed above, these characteristics alone cannot justify a rethinking of First Amendment principles in the context of Internet communication. Marsh also discusses the Planned Parenthood and James Baker cases. She praises the approach taken by the court in Planned Parenthood, because it did not use the Brandenburg test, but considered the wider context in which these communications took place.401 However, Marsh argues that in the Baker case as well, a contextual approach should have informed the court in evaluating the Baker emails: "If measured against the broader context of violence against women, although not necessarily committed by this speaker, the court might have found, as the dissent suggested, that the e-mail messages were threats and incitements."402 Marsh's argument would have carried more weight had she focused on the stories with real students' names that Baker posted in a newsgroup that was publicly available, but that were not part of the indictment. Had this been the case, the Baker case would have presented an interesting analogy with the Planned Parenthood case.

398 Ibid. at 164-165.
399 Elizabeth P. Marsh, "Purveyors of Hate on the Internet: Are we Ready for Hate Spam?" 17 Ga. St. U.L. Rev. 379 (2000) at 381-386.
400 Ibid. at 388-389.
401 Ibid. at 395-396.
402 Ibid. at 403.

Marsh’s argument does point to the difficulty, and perhaps the danger (from a free speech perspective), of interpreting “context” too broadly when assessing the danger of speech. Considering the “context” of speech certainly is not alien to First Amendment doctrine, as Holmes’ famous example of falsely shouting “fire” in a crowded theater illustrates. To assess whether speech amounts to incitement, true threats or fighting words, judges take the context in which the speech took place into consideration, so March’s argument that in the Baker case, the broader context of violence against women should have been taken into consideration appears to be logical. But a distinction needs to be made between context in the sense of “the specific circumstances in which speech takes place” and context in the meaning of “the bigger societal, historical and cultural background against which an event takes place.” A contextual approach towards speech in the latter sense of the word has not traditionally been adopted by the courts, and the willingness of the court in Planned Parenthood to widen its interpretation of what constitutes the context in which speech occurs for the purposes of deciding whether or not the speech is protected or not is therefore significant. Some of those advocating bans on hate speech want the courts to consider the history of racism and discrimination when dealing with racial hate speech. Cases like the James Baker case and especially Planned Parenthood seem to have opened the door a little bit towards considering such societal and historical realities as part of the “context” in which speech occurs.

Conclusion

Most of the debates following Planned Parenthood discussed above have dealt with how the Internet mandates a rethinking of hate speech regulation in the United States. The case law discussed here dealt with threats and less with fighting words, incitement or group libel, and most of the discussions in law reviews seem to center on this same issue. The applicability of fighting words and incitement to the Internet is restricted because their ultimate goal is maintaining public order, and Internet communications are unlikely to produce the situations these doctrines are trying to prevent. However, this is not the case with threats. Internet communication can easily be qualified as threatening, and some of the authors discussed above even argue that threatening Internet communications are inherently more threatening than similar communications offline.

However, as discussed above, these arguments do not withstand stricter scrutiny. This approach has not been adopted by the courts, which have refused to take the specific characteristics of the medium into consideration so far. This discussion is particularly relevant because what constitutes a threat is not entirely clear. Most circuits focus on the listener's perception of the speech, but the First and Ninth Circuits focus on the speaker's anticipated effects of the speech; whether or not a reasonable speaker could foresee that a recipient would understand the speech as a threat.403 So there is some room to interpret what constitutes a true threat without having to shake the foundations of First Amendment doctrine. This explains the popularity of the Planned Parenthood case with those who want greater limits put on hate speech, because it seems to lead the way for a broader interpretation of what constitutes a threat and because it may point to a wider contextual analysis when assessing threatening speech. However, it is important to note that the context of violence in which the Nuremberg Files went online and the "Wanted" posters circulated was a crucial consideration for the court to label this speech as threatening. Absent such a context of violence, similar hateful expressions could not be successfully prosecuted. The past years have seen a decrease of violent incidents at abortion clinics and towards abortion providers.404 Since the "context of violence" no longer exists to the extent it once did, would this mean that the Nuremberg Files should be allowed back online today or are these shootings still recent enough to talk about a "context of violence?" Allowing such wider contextual considerations to come into play when restricting speech without clearly articulating the boundaries of these contextual considerations (as, for example, the Brandenburg test does) could result in overly broad, potentially unconstitutional, restrictions on speech.

403 Scott Hammack, "The Internet Loophole: Why Threatening Speech on-Line Requires a Modification of the Courts' Approach to True Threats and Incitement," 36 Colum. J.L. & Soc. Probs. 65 (2002) at 78.
404 "Violence & Harassment at U.S. Abortion Clinics," religioustolerance.org, November 9, 2004.

CHAPTER 3

Hate Speech Law in Europe

Introduction

In the United States, the debate about hate speech has mainly centered on when certain forms of speech become threatening or intimidating, amounting to incitement, fighting words or group libel. The previous chapters showed that the Internet has not radically changed the American approach towards hate speech. Hate speech is generally protected, unless it falls under one of those three categories. The international and European approaches toward this issue are radically different. Numerous international treaties and local laws have declared various forms of hate speech to be unprotected categories of speech. This difference in the European approach is rooted in a different conception of personhood and freedom of expression, a different political philosophy concerning the role of the state in guaranteeing citizens' rights, and an emphasis on human dignity as a fundamental human right. This chapter will provide a general overview of the ways hate speech is dealt with in international, European and local law within Europe, focusing on Germany, France and the United Kingdom. Germany presents an interesting case study because its hate speech laws have received wide acclaim405 and are often brought up by scholars in the United States as a model to emulate on this side of the Atlantic.406 Some of the underlying values and characteristics of German law will be discussed in depth because they exemplify in many ways the European approach. France's hate speech laws will be discussed because France has been proactive in applying its hate speech laws to the Internet, as will be discussed in the next chapter.

405 Ronald J. Krotoszynski, Jr., "A Comparative Perspective on the First Amendment: Free Speech, Militant Democracy, and the Primacy of Dignity as a Preferred Constitutional Value in Germany," 78 Tul. L. Rev. 1549 (2004) at 1549.
406 Claudia E. Haupt, "Regulating Hate Speech: Damned if You Do and Damned if You Don't: Lessons Learned from Comparing the German and U.S. approaches," 23 B.U. Int'l L.J. 299 (2005) at 303.

Finally, this chapter will also provide a short overview of hate speech regulation in the United Kingdom, a country that provides a slightly different perspective because of its common law tradition.

International Law

United Nations

The 1948 United Nations Universal Declaration of Human Rights407 (UDHR) protects speech,408 but does not explicitly address hate speech. However, it contains an equal protection clause409 that allows for restrictions of discrimination and incitement to discriminate.410 Other limiting clauses such as Article 29411 and Article 30412 also permit hate speech restrictions.413

407 United Nations Declaration of Human Rights (1948).
408 Article 19: "Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers."
409 Article 7: "All are equal before the law and are entitled without any discrimination to equal protection of the law. All are entitled to equal protection against any discrimination in violation of this Declaration and against any incitement to such discrimination."
410 Stephanie Farrior, "Molding the Matrix: The Historical and Theoretical Foundations of International Law Concerning Hate Speech," 14 Berkeley J. Int'l L. 3 (1996) at 14.
411 (1) Everyone has duties to the community in which alone the free and full development of his personality is possible. (2) In the exercise of his rights and freedoms, everyone shall be subject only to such limitations as are determined by law solely for the purpose of securing due recognition and respect for the rights and freedoms of others and of meeting the just requirements of morality, public order and the general welfare in a democratic society. (3) These rights and freedoms may in no case be exercised contrary to the purposes and principles of the United Nations.
412 "Nothing in this Declaration may be interpreted as implying for any State, group or person any right to engage in any activity or to perform any act aimed at the destruction of any of the rights and freedoms set forth herein."
413 Stephanie Farrior, "Molding the Matrix: The Historical and Theoretical Foundations of International Law Concerning Hate Speech," 14 Berkeley J. Int'l L. 3 (1996) at 19.

The United Nations International Covenant on Civil and Political Rights, adopted in 1966 and enacted in 1976,414 also advocates a balanced approach toward free speech. Article 19 establishes a qualified right to free speech that can be restricted in order to protect the reputation of others, or to protect national security, public morals, health or security.415 Article 20(2) reinforces the notion that free speech is not absolute, stating that "[a]ny advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence shall be prohibited by law." In Faurisson v. France,416 the United Nations Human Rights Committee417 found that the criminal conviction of a Holocaust denier was not a violation of Article 19. Faurisson had stated in an interview that the "myth of the gas chambers was a dishonest fabrication"418 and had been ordered to pay a large fine.419 The Committee held that the restrictions placed on Faurisson's speech were consistent with Article 19(3) because Faurisson was not convicted for having an opinion, but for violating the rights and reputations of others and because his statements could raise or strengthen anti-Semitic feelings.420

414 "United Nations International Covenant on Civil and Political Rights" (1966; enacted in 1976).
415 "1. Everyone shall have the right to hold opinions without interference. 2. Everyone shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice. 3. The exercise of the rights provided for in paragraph 2 of this article carries with it special duties and responsibilities. It may therefore be subject to certain restrictions, but these shall only be such as are provided by law and are necessary: (a) For respect of the rights or reputations of others; (b) For the protection of national security or of public order (ordre public), or of public health or morals."
416 Decision of 8 November 1996, Communication No. 550/1993.
417 The Human Rights Committee is the body of independent experts that monitors implementation of the International Covenant on Civil and Political Rights by its State parties.
418 Faurisson v. France at 2.6.
419 Ibid. at 2.7.
420 Ibid. at 9.6.

Another relevant United Nations treaty is the 1965 International Convention on the Elimination of all Forms of Racial Discrimination421 (ICEFRD) which, among other things, requires states to declare punishable "all dissemination of ideas based on racial superiority or hatred, incitement to racial discrimination, as well as all acts of violence or incitement to such acts against any race or group of persons of another colour or ethnic origin, and also the provision of any assistance to racist activities, including the financing thereof."422 But the strong language of this article and the negative effect that enacting these types of laws may have on freedom of speech have prevented many countries from adopting the ICEFRD into their national laws.423 However, the European Court for Human Rights (see below) has referred to the ICEFRD in its jurisprudence. As this brief overview shows, United Nations treaties advocate a more restrictive regime towards hate speech than is customary in the United States. This work will not further address the international law and treaties regarding hate speech,424 but will focus instead on the European approach, which does reflect the demands of these treaties. This work will also not address the normative and legal question of whether or not the United States as a member of the international community should (or can) adopt the principles laid out in these treaties in its jurisprudence.425 History and the overview in the previous chapter suggest that there is little reason to expect that a constitutional shift of that magnitude would occur.

The European level

421 International Convention on the Elimination of all Forms of Racial Discrimination (1965).
422 Article 4.
423 Stephanie Farrior, "Molding the Matrix: The Historical and Theoretical Foundations of International Law Concerning Hate Speech," 14 Berkeley J. Int'l L. 3 (1996) at 54.
424 See Ibid. for an overview.
425 See Scott J. Catlin, "A Proposal for Regulating Hate Speech in the United States: Balancing Rights Under the International Covenant on Civil and Political Rights," 69 Notre Dame L. Rev. 771 (1994) at 800-809.

At the European level, there have been numerous initiatives aimed at reducing racism and xenophobia in general, and speech advocating and contributing to these phenomena in particular. Both the European Union and the Council of Europe have made it clear that they expect member states to make efforts to curb hate speech.

The European Union

In 2001, the European Commission presented a proposal426 that would outlaw a wide range of racial hate speech.427 Previous initiatives at the European Union level had all been non-binding, but this proposal, if ratified, would create a common criminal law approach towards hate speech and racism in the EU zone. Because of differences of opinion about the scope of speech protection that should be awarded to racist speech, it took a long time before an agreement was reached.428

426 E.U. Council Framework Decision on Combating Racism and Xenophobia (2001).
427 Article four of the framework decision spells out the types of speech that this decision would ask the member states to incorporate in national law: "(a) public incitement to violence or hatred for a racist or xenophobic purpose or to any other racist or xenophobic behaviour which may cause substantial damage to individuals or groups concerned; (b) public insults or threats towards individuals or groups for a racist or xenophobic purpose; (c) public condoning for a racist or xenophobic purpose of crimes of genocide, crimes against humanity and war crimes as defined in Articles 6, 7 and 8 of the Statute of the International Criminal Court; (d) public denial or trivialisation of the crimes defined in Article 6 of the Charter of the International Military Tribunal appended to the London Agreement of 8 April 1945 in a manner liable to disturb the public peace; (e) public dissemination or distribution of tracts, pictures or other material containing expressions of racism and xenophobia; (f) directing, supporting of or participating in the activities of a racist or xenophobic group, with the intention of contributing to the organisation's criminal activities."
428 In June 2005, Luc Frieden, President in office of the Justice and Home Affairs Council, said about the ongoing discussion: "The outcome of this discussion can be interpreted in two different ways: the positive thing is that we all agree that racism and xenophobia go against the fundamental values in which we believe as political leaders, as democratically elected leaders of Europe. The other thing of course is to say that we have different ideas in Europe about freedom of expression. In some countries that means that freedom of expression knows almost no boundaries, certainly no boundaries imposed through criminal law sanctions. For others freedom of expression does have limits. Those limits that freely elected parliaments put into the criminal code, where the interests of others are in conflict with some fundamental human rights. This is a debate that one can have for the ages." "No Agreement on the Framework Decision on Combating Racism and Xenophobia at the Justice and Home Affairs Council," Press Release issued by the Luxembourg Presidency of the European Union, November 2005. Available at:

Finally, in 2007, the Council of EU Justice Ministers agreed on the Framework Decision on Combating Racism and Xenophobia. This agreement compels member states to impose criminal sanctions on public incitement of hatred and violence against individuals based on religion, ethnicity, race and nationality. The agreement also criminalizes denying or trivializing genocides or crimes against humanity if this amounts to racist or xenophobic agitation.429 This framework decision sets a minimum standard, but this does not mean that all EU countries will have identical laws. Countries can, for example, also decide to criminalize Holocaust denial, even if it does not amount to racist or xenophobic agitation: the French negationism law does not have such a requirement, while British law does (see infra). This framework decision will not erase these kinds of differences and allows the individual states some liberty in limiting restrictions on speech to instances when the speech "amounts to a threat, verbal abuse or insult, or where the conduct in question is apt to disturb the public peace."430 This language indicates that the framework decision was written in a way to placate countries with less restrictive hate speech provisions.

429 "EU: Common Criminal Provisions against Racism and Xenophobia," Press Release issued by the German Presidency of the European Union, April 2007. Available at:
430 Idem.
431 Tarlach McGonagle, "Wresting (Racial) Equality from Tolerance of Hate Speech," 23 Dublin University Law Journal 21 (2001) at 15. (Retrieved from a PDF file that uses different pagination: 1-27.)

The Council of Europe and the European Court of Human Rights

The 1997 Treaty of Amsterdam established that the E.U. countries are bound by the fundamental regime of the European Court of Human Rights,431 a court that is not part of the European Union, but established by the Council of Europe, and which deals with complaints of violations by member states of the European Convention on Human Rights. Jurisprudence of the ECHR is therefore important to the understanding of the relationship between free speech and hate speech in Europe. The European Convention on Human Rights (Convention), signed on November 4, 1950 by the members of the Council of Europe, guarantees certain fundamental human rights to all individuals residing within the borders of the nations which signed the Convention.432 The Convention, drawing upon the United Nations Universal Declaration of Human Rights, marked the first time that an international treaty actually created binding obligations in the area of human rights. The ECHR hears cases from citizens of member countries who claim that their rights guaranteed under the Convention have been violated by their governments. Freedom of expression is protected under Article 10 of this convention and is defined as including "the freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers."433 However, this right is not absolute:

The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.434

432 The convention applies to 46 countries, more than just E.U. countries. The only European countries to which the convention does not apply are Monaco and Vatican City.
433 Article 10(1).
434 Article 10(2).

The provision that restrictions have to be "necessary in a democratic society" has been interpreted to mean that courts should consider the purpose of a restriction on speech and whether or not a restriction of speech is proportional to this purpose.435 Case law of the ECHR suggests that the court has considered restrictions on hate speech to be compatible with Article 10.

Case Law of the ECHR

The ECHR ruled on only one hate speech case: In Jersild v. Denmark,436 the court ruled in favor of a Danish journalist who argued that his conviction in Denmark for aiding and abetting the dissemination of racial insults for interviewing (and broadcasting the interviews of) radical youth who expressed racist views was contrary to Article 10 of the Convention. Because the information was gathered in the context of a broadcast informing the audience on a matter of public concern, because he had sought out other viewpoints and because the broadcast was aimed at a well-informed audience, the court sided with the journalist.437 However, this judgment in no way limits states' ability to regulate hate speech. It only declares that the media can report on groups perpetuating hate, provided they do so in a responsible manner. The youths who were portrayed in the broadcast were also convicted, but their convictions were not appealed. However, the language of the decision indicates that this kind of speech is not protected by Article 10 of the Convention. A better indication of the ECHR's approach to hate speech can be found by analyzing the cases that never reached the court because they were dismissed by the European Commission on Human Rights.438

435 Sionaidh Douglas-Scott, "The Hatefulness of Protected Speech, a Comparison of the American and European Approaches," 7 Wm. & Mary Bill of Rts. J. 305 (1999) at 328.
436 Jersild v. Denmark, ECHR 36/1993/431/510.
437 Ibid. at §31-34.
438 Until November 1998, cases were submitted to the European Commission of Human Rights, which ruled on their admissibility. Only admitted cases that did not result in a settlement after intervention of the Commission were sent on to the ECHR. Since then, a full-time court has replaced the previously part-time commission and court. ("European Court of Human Rights - Historical Background," European Court of Human Rights at §3 and §6.)

In a 1979 case, Glimmerveen and Hagenbeek v. The Netherlands,439 the applicants challenged their conviction under a Dutch law for possessing, with the intent to distribute, leaflets that the court ruled amounted to incitement to racial discrimination.440 The commission denied the two applicants free speech protection because the leaflets contained speech prohibited by the European Convention and the United Nations Convention on the Elimination of All Forms of Racial Discrimination. The Commission also stated that the speech was not protected under Article 10 of the Convention because Article 17 states that no article in the Convention "may be interpreted as implying for any State, group or person any right to engage in any activity or perform any act aimed at the destruction of any of the rights and freedoms set forth herein or at their limitation to a greater extent than is provided for in the Convention."441 In another case,442 the Commission refused to accept the complaint of a German citizen who had been barred from posting in his garden pamphlets that called the Holocaust a lie and "a Zionist swindle." It found that the pamphlets were a defamatory attack on each member of the Jewish community.443 The Commission pointed out that it was not discriminatory to provide this kind of collective protection to only specific groups such as Jews, which have historically been discriminated against.444 This contrasts starkly with the American approach where this kind of special protection for certain groups is not customary in the context of speech regulation. In a 1988 case, Kuhnen v. Germany,445 the Commission established that neo-Nazi propaganda was not protected under Article 10 of the convention.

439 Appn. Nos. 8348/78 & 8406/78, 18 DR 187 (1979).
440 The leaflets were directed at "white Dutch people" and guaranteed them that if their political party were to gain power, they would remove "so-called" guest workers from the Netherlands.
441 Article 17.
442 X v. Federal Republic of Germany, Appn. No. 9235/81, 29 DR 194 (1982).
443 Stephanie Farrior, "Molding the Matrix: The Historical and Theoretical Foundations of International Law Concerning Hate Speech," 14 Berkeley J. Int'l L. 3 (1996) at 68.
444 X v. Federal Republic of Germany, Appn. No. 9235/81, 29 DR 194 at 198.
445 Appn. No. 12194/86, 56 DR 205 (1988).

The publications at issue had called for a reinstitution of the Nazi party and National Socialism. The commission ruled that Germany's prosecution of the man responsible for these materials under its penal code, on the basis that his publications were directed against the basic order of democracy and freedom and the notion of understanding among people, was within the parameters of Article 10(2) of the Convention.

Country Level

Germany

Human Dignity and the Basic Law

After WWII, the part of Germany under the control of the Western Allies adopted the Basic Law, a legal document that was drafted as a temporary document until a formal constitution would be written for a unified Germany. The Basic Law was enacted in 1949 and has functioned as Germany's foundational legal document ever since.446 The Basic Law's approach to hate speech is shaped by two major influences: the German constitution's approach to freedom of expression as a value that is circumscribed by other values such as human dignity and interests such as honor and personality on the one hand, and the Holocaust as the end point of the virulent hate propaganda of the Third Reich on the other.447 The Basic Law recognizes the right to freedom of expression in Article 5(1): "Every person shall have the right freely to express and disseminate his opinion in speech, writing, and pictures and freely to inform himself from generally accessible sources. Freedom of the press and freedom of reporting by means of broadcasts and films are guaranteed. There shall be no censorship."448 However, Article 5(2) states: "These rights shall find their limits in the provisions of general laws, in provisions for the protection of young persons, and in the right to personal honor."449

446 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1552-1553.
447 Ibid. at 1548.
448 Basic Law for the Federal Republic of Germany (Grundgesetz, GG) (1949).
449 Idem.

Just as in the United Nations documents, free speech is a qualified right that has no special standing among other rights. It is perhaps telling in this context that the first article of the Basic Law, the "German First Amendment," established the importance of human dignity,450 suggesting that it is a constitutional value of higher importance than freedom of speech.451 In fact, the German Constitutional Court has declared human dignity to be the highest legal value articulated in the Basic Law,452 and affirmed this approach in numerous cases.453 The justification for freedom of speech in Germany is similar to the one in the United States; it is seen as essential to a democracy, to the pursuit of truth and for autonomy.454 However, the German approach is a far cry from the Holmesean marketplace of ideas where all ideas should be able to be presented to and clash with each other.

Characteristics of German Hate Speech Law

German speech law is based upon beliefs and principles that set it apart from its American counterpart and deserve some discussion because they also, to some extent, exemplify the European approach to this issue.

Free speech is a positive right

The German legal system does not consider freedom of speech to be merely a negative right (one that can be guaranteed through a prohibition of government interference with speech), but also a positive one. It considers it the role of the government to foster and encourage free speech that is conducive to truth and democracy and to restrict speech that is antithetical to these ideals.

“Human dignity shall be inviolable. To respect and protect it shall be the duty of all state authority.” 451 Krotoszynski, Ronald J. Jr., "A Comparative Perspective on the First Amendment: Free Speech, Militant Democracy, and the Primacy of Dignity as a Preferred Constitutional Value in Germany," 78 Tul. L. Rev. 1549 (2004) at 1556. 452 Sionaidh Douglas-Scott, "The Hatefulness of Protected Speech, a Comparison of the American and European Approaches," 7 Wm. & Mary Bill of Rts. J. 305 (1999) at 321. 453 Krotoszynski, Ronald J. Jr., "A Comparative Perspective on the First Amendment: Free Speech, Militant Democracy, and the Primacy of Dignity as a Preferred Constitutional Value in Germany," 78 Tul. L. Rev. 1549 (2004) at 1566-1577. 454 Ibid. at 1549.

Because of this "caretaker" role that is assumed by the state, it is easier to hold the state responsible for repugnant speech it does not ban.455 This notion of free speech as a positive right is also present at the European level. The approach of the European Commission and Court of Human Rights reflects that the rights spelled out in the Convention are enforceable against state actors as well as against private actors.456 For example, the court has ruled that the firing of an employee by his employer457 because the employee had criticized him was a violation of Article 10. The court ruled that the fact that the Spanish court system had affirmed the legality of this firing constituted government interference in violation of Article 10, even though the firing was a matter between two private parties. The rights spelled out in the Convention are not only to be seen as negative obligations, as rights guaranteed by prohibiting government action in certain areas, as is the case in the United States, but also as positive obligations, rights that the government should actively ensure and protect. This contrasts with the Lockean tradition in the United States which sees fundamental rights as inalienable and preceding civil society. The European approach sees the state as the locus where these rights are established and protected.458 In the American political system, fundamental rights and freedoms hinge upon the government not interfering; in Europe, the government is seen as the protector of these rights and freedoms.

False speech does not deserve protection

Though freedom of speech also serves the pursuit of truth, Germany does not share the same Millean assumptions that there is value to false speech.459 For the German courts, false speech, especially speech that conflicts with the value of human dignity, is not worthy of protection.

455 Ibid. at 1549.
456 Stephanie Farrior, "Molding the Matrix: The Historical and Theoretical Foundations of International Law Concerning Hate Speech," 14 Berkeley J. Int'l L. 3 (1996) at 76.
457 Fuentes Bobo v. Spain 39293/98 (2000).
458 Ronald J. Krotoszynski, Jr., "A Comparative Perspective on the First Amendment: Free Speech, Militant Democracy, and the Primacy of Dignity as a Preferred Constitutional Value in Germany," 78 Tul. L. Rev. 1549 (2004) at 1549.
459 Idem.

As has been established by the high court in Holocaust denial cases, falsehoods can be banned without impeding upon the pursuit of truth.460 The German Federal Constitutional Court has issued guidelines regarding protection of false speech, stating that false statements of fact with no connection to opinion should be considered non-speech. Opinions and value judgments are protected speech, unless they attack the dignity of persons or groups or vilify them, in which case the speech is considered to be low value speech. Even if this speech otherwise would have been considered "high value" speech, it will be considered low value speech in these circumstances and will be outweighed by other constitutional values.461

Protection of the individual to prevent alienation

Another important characteristic of the German and the European approach to free speech is that the individual receiver of speech is seen as a part of the community, and less as a rational being who can autonomously process the information made available to him in the marketplace of ideas, able to decide for himself which speech to listen or respond to and which speech to ignore. German courts would not come to the conclusion of the American court in Cohen v. California,462 which states that individuals who do not want to be exposed to speech that is offensive to them can avoid this by "averting their eyes."463 European jurisprudence does not accept the notion that individuals should be left to their own devices when it comes to evaluating speech in the public domain.464 It considers people as part of communities who process information and become informed citizens through larger interpretative frameworks provided by the community. Hate speech targets, isolates and excludes members of certain groups and places them outside of the community by attacking them in their human dignity. This ostracizing effect of hate is what the German law aims to prevent.465

460 Idem.
461 Winfried Brugger, "The Treatment of Hate Speech in German Constitutional Law (Part I)," 3 German Law Journal No. 12 (2002) at §41.
462 403 U.S. 15 (1971).
463 Ibid. at 21.
464 Sionaidh Douglas-Scott, "The Hatefulness of Protected Speech, a Comparison of the American and European Approaches," 7 Wm. & Mary Bill of Rts. J. 305 (1999) at 343.
465 Ibid. at 323.


First Amendment jurisprudence was discussed, and we noted that this communitarian approach is contrary to the autonomy theory. In Europe, however, this communitarian approach is much more deeply ingrained in political culture and partly explains the European approach to regulating hate speech.

Militant democracy

Another key feature of the German constitutional framework is that it sees itself as a "militant democracy."466 The rights guaranteed by the Basic Law are not extended to those who try to subvert the democratic order, and speech protection should therefore not be extended to those whose speech advocates the overthrow of the democratic order. The German Penal Code even has a section entitled "Threats to the Democratic Constitutional State,"467 which prohibits, among other things, the "dissemination and use of propaganda by unconstitutional and National Socialist organizations"468 (for example, by displaying Nazi symbols). The outlawing of a neo-Nazi party and the 1956 decision to ban the German Communist Party (KPD) are other examples of German militant democracy in action.469 Given the war-torn history of twentieth-century Europe, there is little appetite for accepting Holmes' advice, articulated in his famous Abrams dissent, that if in the long run the dominant forces of the community accept dictatorship, the only meaning of free speech is that it should be given its chance.

German hate speech law

German law provides many tools to curb hate speech, including criminal and civil laws against insults and defamation at the individual as well as at the group level. Sections 185 to 200 of the criminal code, for example, contain provisions against individual as well as collective insults. "Insult" is defined as "an illegal attack on the honor of another

466 Ibid. at 322.
467 Winfried Brugger, "The Treatment of Hate Speech in German Constitutional Law (Part I)," 3 German Law Journal No. 12 (2002) at §32.
468 Idem; Sections 84-86 of the Penal Code.
469 Sionaidh Douglas-Scott, "The Hatefulness of Protected Speech, a Comparison of the American and European Approaches," 7 Wm. & Mary Bill of Rts. J. 305 (1999) at 322.


person by intentionally showing disrespect or no respect at all."470 Contrary to the law in the United States, the fact that the insult is true does not always provide a defense against criminal defamation,471 as it is the person's "social worth" or honor that is being protected.472 Group defamation is clearly established under German law, especially if it concerns libeling social minorities with alleged shared negative characteristics that "are supposed to be irreversibly typical of its individual members."473 The law is specifically designed to protect minorities from hate speech, since it requires that the group be small and that its characteristics differ from those of the general public. These kinds of statements must attack all members of the group on the basis of characteristics that are unalterable and that are attributed to the group by society at large. Section 130 of the penal code contains some additional restrictions on hate speech. Section 130 (1) states that everyone who "incites hatred against parts of the population or invites violence or arbitrary acts against them, or … attacks the human dignity of others by insulting, maliciously degrading or defaming parts of the population" with the intent to disturb the peace can face up to five years of imprisonment.474 Section 130 (2) bans distribution and production of documents "inciting hatred against parts of the population or against groups determined by nationality, race, religion, or ethnic origin, or inviting to violent or arbitrary acts against these parts or groups, or attacking the human dignity of others by insulting, maliciously ridiculing or

470 Winfried Brugger, "The Treatment of Hate Speech in German Constitutional Law (Part I)," 3 German Law Journal No. 12 (2002) at §28.
471 Ibid. at §29.
472 Ibid. at §30. Whitman argues that this notion of honor has deep roots in a German tradition of civility (defined as a set of formal practices that express respect to one another), a tradition that goes much farther back than post-Fascist Europe. He maintains that this kind of civility is absent in the much more direct and informal American culture, and he uses this to (partly) explain the different approaches towards hate speech in Germany and the United States. See: James Q. Whitman, "Enforcing Civility and Respect: Three Societies," 109 Yale L.J. 1279 (2000).
473 Winfried Brugger, "The Treatment of Hate Speech in German Constitutional Law (Part I)," 3 German Law Journal No. 12 (2002) at §31.
474 Ibid. at §32.


defaming parts of the population or such a group.”475 Lastly, this section also bans publicly approving, denying or minimizing “an act… committed under National Socialism, in a manner which is liable to disturb the public peace.”476 Whereas Sections 185 and following are designed to protect human dignity, this section tries to prevent the creation of a climate that is conducive to hate crimes.477 It is important to note that the meaning of “incitement” in the German legal system is quite different from its meaning in the American tradition. In Germany, violence does not have to be imminent but “incitement to racial hatred is viewed by the legislature as heightening the general danger of disruption of the public peace, including violations of the dignity and honor of minority groups and the occurrence of hate crimes.”478 The strong language in these codes makes clear that the German approach is rooted in a belief that hate speech can be the first step to greater societal evils and therefore needs to be nipped in the bud. It lies outside the scope of this work to discuss the German case law regarding hate speech in great detail, but these statutes have been more than dead letters and have led to numerous prosecutions and convictions.


France

Freedom of speech in France

In the French constitution as well, freedom of speech is a value that is by no means absolute and is in many instances secondary to other values. It is established in Articles 10 and 11 of the 1789 Declaration of the Rights of Man,479 which is part of France's constitutional framework. Article 10 states that "[n]o one shall be disquieted on account of his opinions, including his religious views, provided their manifestation does not disturb the public order established by law," setting a precedent for limiting speech in order to further other societal

475 Idem.
476 Idem.
477 Winfried Brugger, "Protection of Hate Speech? Some Observations Based on German and American Law," 17 Tul. Eur. & Civ. L.F. 1 (2002) at 12.
478 Winfried Brugger, "The Treatment of Hate Speech in German Constitutional Law (Part II)," 4 German Law Journal No. 1 (2003) at §54.
479 Declaration of the Rights of Man (1789).


goals.480 Article 11 is equally limiting in stating that "[t]he free communication of ideas and opinions is one of the most precious of the rights of man. Every citizen may, accordingly, speak, write, and print with freedom, but shall be responsible for such abuses of this freedom as shall be defined by law." In France, as in Germany, the government assumes an active role in guaranteeing citizens' rights. There is no theory of self-government as there is in the United States, where, as Alexander Meiklejohn observed, "it is never true that, in the long run, the security of the nation is endangered by the people. ... Freedom is always wise. That is the faith, the experimental faith, by which we Americans have undertaken to live."481 The French, on the other hand, turn to the government rather than to their fellow citizens to safeguard the egalitarian rights spelled out in Article 1 of the constitution of the Fifth Republic: "France shall be an indivisible, secular, democratic and social Republic. It shall ensure the equality of all citizens before the law, without distinction of origin, race or religion. It shall respect all beliefs. It shall be organised on a decentralised basis."482 Freedom of expression in France should not be understood as the absence of limits on speech, but rather as a guarantee that these limits are not absolute.483 As is the case in Germany, freedom of speech is considered to be a tool in the pursuit of truth, but not in the Millean sense familiar in the United States, where even falsehoods contribute to that pursuit. Speech in France does not deserve protection if it is more likely to contribute to a falsehood than to the truth and if general acceptance of this falsehood would have devastating societal consequences,484 such as increasing the likelihood of racial inequality. Vance argues that recent French hate speech laws must be seen against the backdrop of

480 Susannah C. Vance, "The Permissibility of Incitement to Religious Hatred Offenses Under European Convention Principles," 14 Transnat'l L. & Contemp. Probs. 201 (2004) at 221.
481 Alexander Meiklejohn, Testimony on the Meaning of the First Amendment, Address before the U.S. Senate Subcommittee on Constitutional Rights (1955). Cited in: Jullien Mailland, "Freedom of Speech, the Internet and the Cost of Control: The French Example," 33 N.Y.U. J. Int'l L. & Pol. 1179 (2001) at 1184.
482 Constitution of the Fifth Republic (1958).
483 Michel Troper, "Droit et Négationnisme, Extraits d'un Article de Michel Troper" (2002).
484 Idem.


increasing violence against immigrants in the seventies, eighties and nineties.485 Lawmakers have tried to defuse this combustible mix of a large immigrant population and widespread anti-immigrant sentiment among the local population by treating hate speech as the spark that could cause an eruption of violence and disrupt the social order. All in all, French hate speech law is relatively similar to the German approach, especially when compared to the United States. French law, however, is more concerned with maintaining public order than with protecting the human dignity of the victims of hate speech.486


Hate speech law in France

In 1881, a law was enacted that criminalized public speech that provoked others to commit violence, arguably establishing the notion that speech can be curbed in order to have a more orderly society.487 This law has functioned as the backdrop for many anti-hate speech provisions.488 In this respect, the French hate speech approach is slightly different from the German approach. Though this distinction is by no means absolute, French anti-hate speech laws are more concerned with public order, while in Germany the protection of human dignity is also an important consideration. In 1939, a prohibition on racial defamation and insults was enacted. This prohibition was expanded in a set of legislative reforms in 1972, when antidiscrimination provisions took effect. Two incitement provisions, known as the Pleven law, replaced the previous defamation law489 and made it a crime to publicly or privately provoke discrimination, hate or violence based on ethnicity, nationality, race or religion. Violators risk a prison sentence and/or a fine in the case of public incitement, but only a

485 Susannah C. Vance, "The Permissibility of Incitement to Religious Hatred Offenses Under European Convention Principles," 14 Transnat'l L. & Contemp. Probs. 201 (2004) at 232.
486 See: James Q. Whitman, "Enforcing Civility and Respect: Three Societies," 109 Yale L.J. 1279 (2000).
487 Susannah C. Vance, "The Permissibility of Incitement to Religious Hatred Offenses Under European Convention Principles," 14 Transnat'l L. & Contemp. Probs. 201 (2004) at 222.
488 Idem.
489 Ibid. at 223.


fine if the speech took place in private.490 Prosecution under the Pleven law has been scarce, but the cases that came to trial have often been high-profile ones.491 One of the people convicted under this law is sex-symbol-turned-animal-rights-activist Brigitte Bardot, for statements she had made about Muslim slaughtering practices.492 In 1990, a law was enacted that made it a crime to deny the Holocaust.493 As in Germany, French lawmakers adhere to the notion that a democracy has the right to protect itself against anti-democratic forces.494 However, French law is not only shaped by the experiences surrounding World War II, but should also be seen against the historical backdrop of France's centuries-long struggle with anti-Semitism and its own sordid colonial history.495 Despite these speech laws, both France and Germany have been criticized because actual acts of discrimination and hate crimes (as opposed to mere hate speech) are often not prosecuted to the extent that they would be in the United States.496

United Kingdom


In the United Kingdom, the right to free speech is guaranteed through constitutional values inherent in its rule of law tradition and through its adherence to international accords, such as the European Convention on Human Rights.497 As a member of the European Union, it is bound

490 Idem.
491 Ibid. at 224.
492 Ibid. at footnote 155.
493 Law on Freedom of the Press, ch. 4, art. 24 bis.
494 Sevane Garibian, "Law, Identity and Historical Memory in the Face of Mass Atrocity Conference: Taking Denial Seriously: Genocide Denial and Freedom of Speech in the French Law," 9 Cardozo J. Conflict Resol. 479 (2008) at 481.
495 See: Lyombe Eko, "New Medium, Old Free Speech Regimes: The Historical and Ideological Foundations of French & American Regulation of Bias-Motivated Speech and Symbolic Expression on the Internet," 28 Loy. L.A. Int'l & Comp. L. Rev. 69 (2006).
496 See: Eddie Bruce-Jones, "Race, Space, and the Nation State: Racial Recognition and the Prospects for Substantive Equality under Anti-Discrimination Law in France and Germany," 39 Colum. Human Rights L. Rev. 423 (2008).
497 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1544.


by the European Convention on Human Rights and the jurisprudence of the ECHR. The United Kingdom has adopted the European framework, which considers the right of racial groups to be protected from vilification as more important than freedom of speech.498 But the United Kingdom already had a long history of banning hate speech. As early as the seventeenth century, seditious libel was an offense that punished speech that incited hatred or dissatisfaction towards the Queen or brought feelings of ill will and hostility between the classes of the Queen's subjects.499 Although seditious libel was mainly used to silence critics of the government, it has also been used to punish group defamation. In a well-publicized eighteenth-century case, Regina v. Osborne, a defendant was convicted for publishing a libelous paper targeting immigrant Jews, claiming that they had killed a mother and a child because the child's father was a Christian, sparking threats and attacks against some Jews.500 However, seditious libel would be a rather blunt weapon to combat hate speech because convictions "could only be obtained upon proof of direct incitement to violence or breach of public order."501 The Public Order Act of 1936, amended in 1963 and in 1976 by the Race Relations Act, provided a more effective tool to prevent group libel or incitement to racial hatred.502 It allowed for punishment of speech that was likely to lead to violence, even if it did not actually do so, and it also allowed for punishing those who had the mere intent to

498 Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations (1997) at 205.
499 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1545.
500 Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations (1997) at 191; Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1545.
501 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1545.
502 Cited in: Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations (1997) at 193.


provoke violence.503 Section 5A of that act states that a person commits an offense if "(a) he publishes or distributes written matter which is threatening, abusive or insulting, or (b) he uses in any place or any public meeting words which are threatening, abusive, or insulting, in a case where having regard to all the circumstances, hatred is likely to be stirred up against any racial group in Great Britain by the matter or words in question."504 The original version of this act did not contain the provision about incitement to racial hatred, but even in its unamended form, the Public Order Act was used to prosecute cases involving racial discrimination and to combat emerging British fascism before and during WWII.505 The Race Relations Act was introduced in 1965 and focused more on incitement to hatred506 against sections of the UK populace based on color, race, ethnic and national origins than on incitement to violence, as had been the case before. It also required proof of intent, which made prosecution difficult.507 The law led to a number of convictions between 1965 and 1976, some of which were against leaders of the Black Liberation Movement in the late 1960s.508 Incitement laws, even if they are specifically geared towards racial violence, may not always be in the best interests of minority groups, as selective enforcement and the sometimes virulent anti-establishment speech that civil rights activists or racial minority groups may engage in can make them targets of prosecution.509

503 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1546.
504 Cited in: Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations (1997) at 193.
505 Idem.
506 Either through publication and distribution of written matter, or through public speech.
507 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1546.
508 Idem.
509 A similar phenomenon took place in the United States, where the fighting words doctrine was often invoked to punish speech directed at law enforcement.


The 1976 Race Relations Act, which also amended the Public Order Act (see supra), no longer required intent in order to convict someone for inciting hatred through threatening, abusive or insulting speech or publications.510 A 1986 Public Order Act amended the Race Relations Act and made hate speech punishable if it amounted to harassing an individual or a targeted group,511 attesting to a concern and respect for the human rights of minorities.512 Despite the fact that the Race Relations Act of 1976 is often cited as the most comprehensive piece of anti-discrimination and anti-racism legislation in Europe,513 the enforcement of these laws in the U.K. has been criticized.514 Prosecution under the 1976 law has been all but nonexistent because the Attorney General refused to prosecute almost all the cases.515 As this overview shows, British hate speech law is rooted in incitement law. However, under the European Convention, incitement is not subject to strict standards of the kind imposed in the United States under Brandenburg. It also does not ban incitement in general, but singles out incitement to racial hatred, which would not be constitutional in the United States. The law also covers group defamation or insulting remarks,516 as well as racially motivated harassment.517 Anti-Semitic material

510 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1546; Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations (1997) at 201.
511 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1547.
512 Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations (1997) at 205.
513 Susannah C. Vance, "The Permissibility of Incitement to Religious Hatred Offenses Under European Convention Principles," 14 Transnat'l L. & Contemp. Probs. 201 (2004) at 215.
514 Ibid. at 216.
515 Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations (1997) at 201.
516 Ibid. at 204.
517 Michael Rosenfield, "Conference: Hate Speech in Constitutional Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003) at 1547.


accusing Jews of a range of illegal or anti-social activities,518 the publication, with threatening overtones, of the names of people in mixed-race relationships,519 publications of calls for violence such as "Race war! Kick some black ass"520 (even if it is not likely that people will actually act on them), and images that incite racial hatred521 have all been found in violation of this act. However, the act does not punish mere possession of racially inflammatory material without intent to distribute it,522 nor is Holocaust denial outlawed in the United Kingdom, unless it is part of a wider anti-Semitic campaign.523 Despite its common law tradition, British hate speech law is similar to hate speech law in France and Germany. Nevertheless, some aspects of this common law tradition result in a slightly different approach. Britain's racial incitement statute allows a fair comment defense, which is derived from the common law of defamation. Although the fair comment defense against defamation is not as extensive as in the United States, it may shield far-right groups publishing diatribes against Jews or Muslims.524 In comparison to French and German law, British law interprets "incitement" less broadly. In France, racist remarks are almost by default seen as a threat to public order because they threaten the peaceful co-existence of different groups in society. The United Kingdom's laws do not seem to go that far, and still require that some violence or breach of the peace occur as a direct result of the speech.

Conclusion

As this overview of secondary sources indicates, European hate speech laws largely follow the international approach and are vastly different from the approach followed in the United States. This difference is

518 "Racially Inflammatory Material on the Internet," United Kingdom Home Office (2002) at 30.
519 Ibid. at 29.
520 Ibid. at 27.
521 Idem.
522 Ibid. at 21.
523 Ibid. at 58.
524 Susannah C. Vance, "The Permissibility of Incitement to Religious Hatred Offenses Under European Convention Principles," 14 Transnat'l L. & Contemp. Probs. 201 (2004) at 245-246.


based on a radically different notion of how to attain democratic ideals, and to some extent also on what these ideals exactly are. Whereas the United States sees democracy safeguarded when individual liberties are guaranteed, the European democratic framework is based on notions of equality and communality. This implies a greater role for the state to guarantee this equality and to maintain social peace. The Internet forced European countries to reflect on how they could enforce their local hate speech laws on a global medium. Different strategies have been followed by European countries to attain this goal, not all of which have been successful, as will be discussed in the next chapter.


CHAPTER 4

Regulating Online Hate in Europe

The American hands-off approach towards hate speech, as discussed in the first chapter, has not fundamentally changed because of the Internet. Though the rise of the Internet posed some interesting questions for courts, it has not significantly affected the American approach towards hate speech. Similarly, the European hands-on approach towards hate speech has not fundamentally changed because of the Internet. Online hate speech is criminally punishable, just as its offline counterpart is. But while these laws are relatively easy to enforce against offline content, against online content hosted by local Internet providers, or against content created by people residing within a country's jurisdiction, enforcing local laws against content originating from outside their borders poses greater challenges for European countries.

France: The Effects-based Approach and the Yahoo! Case

The Yahoo! case

The most notorious case regarding online hate speech in an international context is without any doubt the Yahoo! case. This case raises a plethora of legal, practical and technological issues regarding content regulation on the Internet. Even though it has not been followed by other similar cases and does not seem to be indicative of the European approach to regulating content on the Internet, it nevertheless provides an appropriate starting point for any discussion of the topic. The case arose out of a complaint regarding Yahoo!'s auction site and the Nazi memorabilia offered for sale on it. As of September 2000, about 1,500 Nazi-related objects (uniforms, medals, photographs) were offered for sale on the auction site.525 Plaintiffs in the Yahoo! case were two non-profit Jewish organizations, La Ligue Internationale Contre le Racisme et l'Antisemitisme (LICRA) and l'Union des Etudiants Juifs

525 Marc H. Greenberg, "A Return to Lilliput: The LICRA v. Yahoo! Case and the Regulation of Online Content in the World Market," 18 Berkeley Tech. L.J. 1191 (2003) at 1206.


de France (UEJF),526 dedicated to eradicating anti-Semitism. On or around April 5, 2000, LICRA sent a cease and desist letter to Yahoo!'s headquarters in Santa Clara, California, informing the company that the sale of Nazi and Third Reich-related goods was a violation of French law (Section 645-1 of the penal code527) and that legal action would be taken if Yahoo! did not take steps to prevent such sales within eight days.528 On April 20, LICRA was joined by UEJF and Yahoo! was served with process by a United States Marshal.529 Apart from the auction site, the complaint also included two Web sites that were made available through Yahoo!'s Geocities hosting domain and that contained extracts of "Mein Kampf" and the famous anti-Semitic document "Protocol of the Sages of Scion."530 Yahoo! argued that the French court had no jurisdiction because the goods were sold within the United States, that the Yahoo! site targeted an American audience, that prohibiting sales or posting of certain materials such as "Mein Kampf" or the "Protocol of the Sages of Scion" would violate the First Amendment, and that it was technologically impossible to block French users' access to such


526 Even though civil action is generally only a possibility for those who suffered a direct harm, French law allows anti-racism associations to start a civil procedure with respect to certain offenses. See: Christopher D. Van Blarcum, "Internet Hate Speech: The European Framework and the Emerging American Haven," 62 Wash & Lee L. Rev. 781 (2005) at 798.
527 "Shall be punished by the fine stipulated for violations of the 5th class the fact, other than for the needs of a film, a show or an exhibit enjoying historical context, to wear or exhibit in public a uniform, an insignia or an emblem which evokes the uniforms, insignia or the emblems which were worn or exhibited either by the members of the organization declared to be criminal pursuant to article 9 of the statutes of the international military tribunal annexed to the agreement of London on August 8, 1945, or by a person found guilty by a French or international court of one or more crimes against humanity stipulated by articles 211-1 to 212-3 or stipulated in law number 64-1326 of December 26, 1964."
528 Yahoo!, Inc. v. La Ligue Contre Le Racisme et L'Antisemitisme, 169 F. Supp. 2d 1168 (N.D. Cal. 2001) at 1172.
529 Idem.
530 Licra and UEJF v. Yahoo! Inc. and Yahoo France, Order of May 22, 2000 by the Superior Court of Paris [hereinafter: May Order].


sites.531 On May 22, 2000, rejecting these concerns, the Tribunal de Grande Instance de Paris issued a preliminary order. Discussing the jurisdiction issue, Judge Gomez noted that Yahoo!'s actions may have been unintentional, but that the harms of these actions were suffered in France and were therefore actionable under French law. The court ordered Yahoo! to "dissuade and render impossible all visitation on Yahoo.com to participate in the auction service of Nazi objects, as well as to render impossible any other site or service which makes apologies of Nazism or that contests Nazi crimes."532 It also ordered Yahoo! France to post a warning on yahoo.fr stating that searches done through yahoo.com might generate results containing material that is illegal in France. The court ruled that, in most cases, Yahoo! was in a position to determine the geographical location of surfers through their IP address, and that it could block access to French users that way. It gave Yahoo! two months to formulate any counterproposals regarding the technological measures it was expected to take. In addition, Judge Gomez ordered Yahoo! to pay the two organizations that had lodged the complaint 10,000 francs each and to remove the heading "negationists" from its search index. During a hearing in the summer of 2000, Yahoo! argued, among other things, that it would be technologically impossible to comply with the order. In response, the court appointed a panel of specialists to write a report on the technological feasibility of compliance with the order.533 In November of the same year, Judge Gomez, after having heard the panel of experts, reaffirmed his initial decision534 and gave Yahoo! three months to implement the measures necessary to carry out his judgment or pay a 100,000-francs-a-day fine. The report of the experts had stated that it was technically possible to block French users' access to certain sites. It argued that 70% of the IP addresses of French users

531 Marc H. Greenberg, "A Return to Lilliput: The LICRA v. Yahoo! Case and the Regulation of Online Content in the World Market," 18 Berkeley Tech. L.J. 1191 (2003) at 1207.
532 May Order.
533 UEJF and Licra v. Yahoo! Inc. and Yahoo France, Superior Court of Paris, Order in Summary Proceedings, August 11, 2000.
534 UEJF and Licra v. Yahoo! Inc. and Yahoo France, Order of November 20, 2000 by the Superior Court of Paris [hereinafter: November Order].


could be recognized and filtered out;535 blocking access to another 20% could have been achieved through a declaration "upon honour" by the Web surfer of his or her nationality. Judge Gomez' orders make clear that he believes the Internet needs no new paradigm and that the new medium does not really alter basic legal principles. In an interview for a law review article, Gomez said:

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

When people say that the Internet cannot or should not be controlled or regulated, I cannot agree. A machine has no sense of responsibility: man has a sense of responsibility. The machine cannot be left to control man. To say that the Internet has not been conceived to be regulated is a monumental mistake. Man's illegal acts cannot hide behind a machine. Whether we are dealing with the activities of pedophiles or with racist acts, we cannot allow men to hide behind the machine. We cannot forget that behind the Internet there are people and there must be some form of regulation. Responsibility cannot hide behind technical difficulties.536

As logical as this metaphor appears to be, one could question whether the Internet is really a "machine" the way that a drill or a car is a machine. The Internet is foremost a communication medium, not a device that uses or transmits energy in the performance of a task. Machines, even if manufactured abroad, can easily be subject to local laws. For example, when it became apparent that the iPod's maximum sound level exceeded the 100-decibel limit set by French law, it was temporarily pulled from the shelves in French stores.537 Even though Apple's iPod is an American product, Apple must comply with French public health regulations if it wants to sell the product on the French market. It is very much this type of rationale that Judge Gomez applied in this case.

535 The main reason that not all French users could be identified as French was that French AOL users have an IP address that makes it look as if they are located in the United States.
536 Ibid. at 1226.
537 Tom Terryn, "IPod Pulled from French Shelves due to Sound Output," The Mac Observer, October 1, 2002.



Proceedings in the American court system

Following the French court's decision, Yahoo! filed a complaint with the U.S. District Court for the Northern District of California, requesting that the French order be declared unenforceable because it was in violation of the First Amendment.538 On November 7, 2001, the court granted summary judgment in favor of Yahoo!, ruling that the French court's order was indeed unenforceable in the United States. Judge Fogel acknowledged that every country has a sovereign right to determine what forms of speech will be acceptable within its borders and that the choice of the French Republic to ban Nazi propaganda was one deserving of respect,539 but held that an American court "may not enforce a foreign order that violates the protections of the United States Constitution."540 On August 23, 2004, a three-judge panel of the Ninth Circuit Court of Appeals reversed this judgment,541 and an en banc court did the same on January 12, 2006.542 However, the en banc court did not address the enforceability of the French judgment, reversing Judge Fogel's decision on procedural grounds. Three of the eleven judges argued that exercising jurisdiction over the French defendants was not proper in this case because they had not yet tried to enforce the judgment in the United States. Three other judges ruled that the case was too premature and abstract to determine whether or not the French order violated First Amendment provisions. Judge Fisher, writing for the five dissenting judges, argued that the majority's decision, by failing to create clarity in this situation, would chill speech. The dissenters pointed out that under the French order, Yahoo! was facing accruing fines for each day of non-compliance, and that even if LICRA never sought to enforce the French judgment in an American court, the mere existence of the order could have a chilling effect on speech: "By peremptorily terminating Yahoo!'s access to federal court, the majority establishes a new and burdensome standard for vindicating First Amendment rights in the Internet context,

538 Yahoo!, Inc. v. La Ligue Contre Le Racisme et L'Antisemitisme, 145 F. Supp. 2d 1168 (N.D. Cal. 2001).
539 Ibid. at 1186.
540 Ibid. at 1192.
541 379 F.3d 1120 (9th Cir. 2004, panel).
542 433 F.3d 1199 (9th Cir. 2006, en banc).


threatening the Internet's vitality as a medium for robust, open debate."543 But before taking this case to the American court system, Yahoo! had already changed its auction policy and since January 2001 no longer allows users to sell Nazi artifacts on its auction site.544 The company stressed, however, that this policy change was not in response to the French order but was the result of a "general housecleaning of its auction policy" and stemmed from ongoing discussions Yahoo! had had with groups such as the Los Angeles-based Simon Wiesenthal Center.545 The review of the policy546 drew praise as well as criticism. Although anti-racism groups applauded the decision, some commentators argued that Yahoo! should not have given in to pressure to change its policy. Yahoo! had also, as a sign of goodwill, removed the link to the "Protocole des Sages de Scion" following the May order, because it agreed that the language of the document established a link with France.547 However, some items, such as stamps and coins, or expressive media such as Hitler's "Mein Kampf," which might fall under the French order, could still be freely auctioned. Yahoo! also did not, as the order required, prevent access to other sites that may constitute an apology for Nazism or deny Nazi crimes. It argued that it could not comply with this aspect of the order without

543 Ibid. at 1236.
544 "Any item that promotes, glorifies, or is directly associated with groups or individuals known principally for hateful or violent positions or acts, such as Nazis or the Ku Klux Klan. Official government-issue stamps and coins are not prohibited under this policy. Expressive media, such as books and films, may be subject to more permissive standards as determined by Yahoo! in its sole discretion." (Yahoo! Auctions Guidelines).
545 Carl S. Kaplan, "Experts See Online Speech Case as Bellwether," The New York Times, January 5, 2001.
546 In the spring of 2001, eBay changed its policy as well and now bans the sale of hate items such as Nazi flags and Ku Klux Klan cloaks. However, eBay stressed that this revision was in response to feedback from users and was meant to be in line with the laws and points of view of the global community. (Troy Wolverton, "eBay to Ban Sale of Hate Items," News.com, May 3, 2001.)
547 November Order.


banning Nazi material from Yahoo.com altogether.548 Yahoo!'s Geocities service still hosts many white supremacist web sites,549 some of them apparently created by people from Europe.550 The three judges who dismissed the suit on lack of ripeness grounds argued that the First Amendment implications were marginal. They stated that Yahoo! could not invoke the First Amendment for those parts of the order with which it claimed to have complied voluntarily, through policy changes unrelated to the order. They further argued that the order merely asked Yahoo! to restrict access for users in France, not the United States, and indicated that it is uncertain whether there is a First Amendment interest in providing French citizens with content that is criminally forbidden in their jurisdiction.551 It has been argued that this analysis relies on an erroneous interpretation of the French order.552 As mentioned above, the first part of the order required Yahoo! Inc. (and not merely Yahoo.fr) to "dissuade and render impossible all visitation on Yahoo.com to participate in the auction service of Nazi objects, as well as to render impossible any other site or service which makes apologies of Nazism or that contests Nazi crimes." Eko argues therefore that the French order has to be read as banning these materials from Yahoo.com altogether, and not only as making them unavailable to French users. However, in the discussion in the November order regarding geolocation (geographically locating Internet users), the court made it clear that

548 145 F. Supp. 2d 1168 at 1185.
549 See for example: the "white pride and radicalism section" of Geocities.
550 "Naziboy's Homepage." The site actually only has a backdrop of swastikas and a small welcome note stating: "I´m a 26 year old skinhead living in the Northeren part of Europe. By making this homepage I hope to get in Contact with others who share some of the same beliefs as me."
551 Ibid. at 1220-1221.
552 See: Lyombe Eko, "New Medium, Old Free Speech Regimes: The Historical and Ideological Foundations of French & American Regulation of Bias-Motivated Speech and Symbolic Expression on the Internet," 28 Loy. L.A. Int'l & Comp. L. Rev. 69 (2006) at 76-78.


[t]o respect the terms of the ruling finding it liable and prohibiting access to auction sales of nazi objects, Yahoo must: 1/ identify the geographical origin and nationality of internauts seeking access to its auction site 2/ prevent French internauts or those connected from French territory, to learn of the description of nazi objects being auctioned and a fortiori to bid.

This happened after Yahoo! had argued that it was technologically impossible to comply with the order. It seems as if the French court never had geolocation in mind when it issued its initial ruling in the May order. Neither removing Nazi items from Yahoo.com nor putting warnings on Yahoo.fr would have required geolocation. It was only after Yahoo! had raised the technological issue that the French court addressed the geolocation issue and seemed to consider it an acceptable solution. So Eko's claim that the opinion of the Ninth Circuit judges was "based on a misreading of the French court orders"553 might be an exaggeration, as the French court did seem to accept geoblocking as an acceptable measure, but it does indicate that the French order suffers from a lack of clarity and vagueness. The three judges who were part of the majority argued that Yahoo! cannot claim that complying with the judgment would mean restricting American users' Internet access, because it is unclear what exactly compliance with the French order constitutes.554 The dissent, on the other hand, saw this vagueness as a reason to side with Yahoo! on First Amendment grounds.555

Criticism of the Yahoo! decision

The Yahoo! decision elicited an avalanche of notes and articles in law journals as well as in the popular media, some very critical, others supportive of the French approach.556 Jullien Mailland's557 criticism of

553 Ibid. at 78.
554 433 F.3d 1199 (9th Cir. 2006, en banc) at 1217.
555 Ibid. at 1242-1244.
556 Marc H. Greenberg, "A Return to Lilliput: The LICRA v. Yahoo! Case and the Regulation of Online Content in the World Market," 18 Berkeley Tech. L.J. 1191 (2003) at 1218-1227.
557 Jullien Mailland, "Freedom of Speech, the Internet and the Cost of Control: The French Example," 33 N.Y.U. J. Int'l L. & Pol. 1179 (2001).


the Yahoo! order provides a good account of the major arguments that have been leveled against the ruling of Judge Gomez: that it is technologically unfeasible, imperialistic, ineffective and detrimental to the free flow of information on the Internet. Below, these arguments are discussed in greater detail.


Technological argument558

Mailland argues that the French court's assertion that up to 90% (70% through IP recognition, 20% through self-declaration of nationality) of French users can be identified by nationality is incorrect. Users can always bypass screening devices by registering with a foreign ISP, by using local ISPs that enable them to surf the Web anonymously, or by using anonymizer software. He argues that it is likely that hate groups would educate their members on how to circumvent censorship, and that server-initiated methods of identification would therefore not accomplish their goals, as "they would end up screening random users unassociated with unlawful material as opposed to users targeted by the government."559 As we shall discuss in chapter six, geolocation technology has advanced since 2001. Regardless, even if users could bypass geolocation blocking by using anonymizer software or proxy servers, this does not render the technology obsolete. It is safe to assume that the majority of Internet users do not have the knowledge, skill, time or intention to jump through these hoops. Geolocation blocking therefore would certainly reduce the availability of targeted content in France.
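To make the mechanism at issue concrete, the sketch below illustrates, in purely hypothetical terms, how the kind of geolocation blocking contemplated in the November order might work: an incoming IP address is matched against a country lookup table and, when it cannot be resolved, the surfer's self-declared nationality is used as a fallback. The prefix table, function names and blocking rule are illustrative assumptions, not a description of Yahoo!'s actual systems or of the experts' report.

import ipaddress
from typing import Optional

# Hypothetical lookup table mapping network prefixes to country codes.
# Real deployments rely on commercial IP-to-country databases, which are
# imperfect -- hence the experts' estimate that only about 70% of French
# users could be recognized this way.
PREFIX_TABLE = {
    ipaddress.ip_network("192.0.2.0/24"): "FR",      # documentation prefix, treated here as "French"
    ipaddress.ip_network("198.51.100.0/24"): "US",   # documentation prefix, treated here as "American"
}

BLOCKED_COUNTRIES = {"FR"}  # jurisdictions where the flagged listings may not be shown


def country_from_ip(ip: str) -> Optional[str]:
    """Return a country code for the address, or None if it cannot be resolved."""
    address = ipaddress.ip_address(ip)
    for network, country in PREFIX_TABLE.items():
        if address in network:
            return country
    return None


def should_block(ip: str, declared_country: Optional[str] = None) -> bool:
    """Block when the IP resolves to a blocked country; otherwise fall back
    to the surfer's self-declared nationality (the 'declaration upon honour')."""
    country = country_from_ip(ip) or declared_country
    return country in BLOCKED_COUNTRIES


if __name__ == "__main__":
    print(should_block("192.0.2.17"))                           # True: address resolves to FR
    print(should_block("203.0.113.5", declared_country="FR"))   # True: unresolved, self-declared FR
    print(should_block("198.51.100.9"))                         # False: address resolves to US

The sketch also makes Mailland's objection visible: a user who connects through a foreign proxy or anonymizer simply presents an address that resolves to the wrong country, so the lookup never fires and only the voluntary declaration remains.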

Effectiveness560

Mailland also states that the Yahoo! order would be ineffective because it would be unenforceable in the United States. Initially, this indeed seemed to be the case, but the Ninth Circuit Court of Appeals has cast some doubt on this. In fact, the three judges rejecting Yahoo!'s claim on ripeness grounds stated that a scheme blocking access to French users would only raise the question of whether the First Amendment has extra-territorial application (an issue that is not settled), suggesting that such a scheme could be constitutional as long as it does not affect

558 Ibid. at 1209-1210.
559 Ibid. at 1210.
560 Ibid. at 1210-1211.


American surfers.561 The dissent’s main argument relied on the fact that the French order was vague and also would have impinged on American Internet users’ First Amendment rights, but it is harder to gauge what its stance would be on an order that is more narrowly tailored and would only affect French users. In that case, the First Amendment rights of the American surfers would be safeguarded, but a speaker-centered First Amendment analysis could still object to this, as it would require Yahoo! to incur substantial costs to comply.562 Regardless, the judgment has been effective to the extent that it has been able to change Yahoo!’s behavior. Though Yahoo! claims this change had nothing to with the judgment, it did change its policies and it did remove certain pages from its server. The costs and negative publicity associated with a high profile court case may convince big companies like Yahoo! to comply with these kinds of orders in the future. Also, this decision created uncertainty for Yahoo! executives traveling to France, as they could be arrested for not complying with the court order.563

Copyright © 2009. LFB Scholarly Publishing LLC. All rights reserved.

Imperialistic564

Mailland quotes Judge Gomez' remark that by suppressing the disputed content from its site, Yahoo! would adopt a legal and ethical norm that is shared by all democratic countries. As many articles and chapter one of this work have made clear, this is not the case for the United States,

561 Ibid. at 1217-1218.
562 "But suppose Yahoo! really were concerned only with not having to act in the United States as an enforcer of France's restrictions on Internet access by France-based users. That would not make the constitutional implications of the effects on Yahoo!'s United States operations go away. Yahoo! cannot merely act in France to restrict access by users located in France; the French orders require Yahoo! to make changes to its servers and protocols in the United States. That Yahoo! seeks First Amendment protection from having to compromise its domestic operations to comply with a foreign injunction does not translate into its seeking the right simply to violate French law." Ibid. at 1244-1245.
563 Amy Oberdorfer Nyberg, "Is All Speech Local? Balancing Conflicting Free Speech Principles on the Internet," 92 Geo. L.J. 663 (2004) at 668.
564 Jullien Mailland, "Freedom of Speech, the Internet and the Cost of Control: The French Example," 33 N.Y.U. J. Int'l L. & Pol. 1179 (2001) at 1212-1213.


where the First Amendment prevents broad content-based restrictions on speech. One could add that the law invoked in this case, which bans the mere public display of Nazi symbols or regalia (except when done for a film, show, or exhibit), without any incitement requirement,565 may also not be as universally accepted as Judge Gomez pretends it is. Mailland argues therefore that the French court acted imperialistically. It tried to impose its norms upon another country that does not share those norms, denying Yahoo! its right to self-determination, a right recognized by the United Nations.


Secondary effects566

Mailland also claims that this ruling could inspire those responsible for putting content online to block access for French citizens altogether. Since French law prohibits a wide range of speech, for example the utterance of favorable comments about narcotics, and since French courts seem to assert jurisdiction over the whole Internet, content providers may avoid any risk and try to block access for French users altogether. Mailland further claims that from an international perspective, this case sets a very bad precedent for any regime that wants to restrict its citizens' access to certain content online. If, for example, an Islamic regime wanted France to prevent its citizens from accessing a clothing store's site hosted in France displaying women in miniskirts, the rationale followed by this court suggests that it would have every right to do so. Such an approach could lead to a level of freedom of speech on the Internet that would ultimately be determined by the lowest common denominator, by the most speech-restrictive regime.

Support for the Yahoo! decision

The Yahoo! decision also had its defenders. Frydman and Rorive,567 for example, argue that the French court was correct in asserting

565 Code pénal 645-1, available from: .
566 Jullien Mailland, "Freedom of Speech, the Internet and the Cost of Control: The French Example," 33 N.Y.U. J. Int'l L. & Pol. 1179 (2001) at 1213-1216.
567 Benoît Frydman and Isabelle Rorive, "Keynote: Fighting Nazi and Anti-Semitic Material on the Internet: the Yahoo! Case and its Global Implications" (2002).


jurisdiction568 because Yahoo! not only owned 70% of Yahoo France and advertised with French-language banners on the auction site, but American courts have also asserted jurisdiction over Web sites based outside their borders in similar cases. In People v. World Interactive Gaming Corporation,569 a New York court ordered a casino based in Antigua to stop offering Internet gambling services to New Yorkers. In Twentieth Century Fox v. IcraveTV.com,570 an American court declared it had jurisdiction over a Canadian Web site that webcast certain copyrighted American TV shows over the Internet.571 Rorive and Frydman also argue that Yahoo! could not claim that it was merely a passive conductor of information for which it did not bear any responsibility, because it had monitored and edited content on its site before.572 They also dismiss the First Amendment concerns invoked by Yahoo! because the First Amendment does not bar private corporations from censoring and editing content.573 This is a valid argument to the extent that Yahoo! claimed it was prevented from censoring content because doing so would violate its customers' First Amendment rights. If Yahoo!'s claim is suspect, it is because, as a private company, it does not have any legal obligation to be the safeguard of freedom of speech and is not responsible for its clients' First Amendment rights. However, Yahoo! could argue that it wants to safeguard the free flow of information on the Internet, and that it should not be forced to edit content on its web sites based on the viewpoints that content expresses. The First Amendment rights of Yahoo!, more than those of its customers, are under fire here. The argument that Yahoo! should not have any qualms about banning Nazi memorabilia from its auction site because it already bans some other items is not convincing. Under this

568 Ibid. at 2.
569 1999 WL 591995 (N.Y. Sup.).
570 Twentieth Century Fox Film Corp. v. ICraveTV, 2000 U.S. Dist. Lexis 11670 (2000).
571 Joel R. Reidenberg, "The Yahoo Case and the International Democratization of the Internet," Fordham Law & Economics Research Paper no. 11 (2001) at 9.
572 Benoît Frydman and Isabelle Rorive, "Keynote: Fighting Nazi and Anti-Semitic Material on the Internet: the Yahoo! Case and its Global Implications" (2002). This is an important argument that has also been used by American courts to assess the liability of ISPs, as will be discussed in chapter five.
573 Idem.


theory, if a Web site has set up any kind of internal rules, it opens itself up to outside regulation. Lastly, Frydman and Rorive also reiterate the court's rationale that it was technologically feasible for Yahoo! to comply with the order.574


The Yahoo! case bis?

On the heels of the Yahoo! decision, a similar case came before Judge Gomez. In June of 2001, a French anti-racism group called "J'accuse" filed a suit against the major ISPs in France for allowing French Internet users to access an American-based hate portal called Front14.org.575 Front14 hosted numerous Nazi and other hate-related sites and offered free hosting space and email exclusively to "racialists."576 The ISPs claimed that they should not be burdened with the task of being online censors. In the fall of 2001, Judge Gomez ruled that the portal did indeed violate French and European law, but he stopped short of forcing ISPs to block it. He did ask the ISPs to freely determine which measures they could take to prevent the site from continuing its illegal activity, but did not issue an injunction. He considered the ISPs to be conduits rather than publishers, and did not want to impose the burden of filtering upon them.577 As Fagin indicates, the court's ruling reflects a concern that state-based regulation must remain sensitive to technological constraints, and a recognition that imposing heavy legal and financial burdens on the ISPs of France would be detrimental to the nation's position in the online marketplace.578 This ruling appears to be inconsistent with the Yahoo! ruling, where this concern for imposing legal and financial burdens on non-French ISPs was lacking, but an important difference is that Yahoo! was hosting the auction site, exercised oversight over it and had the materials on its servers, a level of control the French ISPs did

574 Idem.
575 Benoît Frydman and Isabelle Rorive, "Regulating Internet Content through Intermediaries in Europe and the USA," 23 Zeitschrift Für Rechtssoziologie 41 (2002) at 49.
576 Catherine Muyl, "French Yahoo Judge Issues a Decision Concerning a Racist US-Based Portal," Intellectual Property Today (March 2002).
577 Matthew Fagin, "Regulating Speech Across Borders, Technology vs. Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 453.
578 Ibid. at 453-454.


not have in this case. In this ruling, Judge Gomez did request that the American hosting company, which did not appear in court, inform the court what steps it would take to put an end to this illegal activity.579 However, the American ISP did not do so. The "J'accuse" organization also identified a French citizen who was the French representative of an organization that had an anti-Semitic and racist Web site on front14.org and who had helped develop front14.org. This person was subsequently ordered to make access to these pages unavailable.580 Both this case and the Yahoo! case indicated a reluctance to burden local ISPs with content filtering in favor of a regime that attempts to reach the provider of the content. This is not an illogical strategy: the most effective and direct way to make content unavailable is to force or convince the provider of that content to remove it. Logical as this approach may be, these cases also illustrate the legislative and technological obstacles such an approach encounters. In chapter six we will see how recent legislative efforts in France have moved away from the approach outlined here in favor of an approach that targets French ISPs.


The German Approach
Intro
At the same time that the Yahoo! case was taking place in France, a German court declined to prosecute Yahoo! for the sale of Nazi items on its site. It argued that Yahoo! was an ISP and should not be held liable for the content of its auction Web sites.581 Unlike their French counterparts, German authorities and courts have tried from the beginning to regulate foreign Internet content by targeting local access providers.

579 Catherine Muyl, "French Yahoo Judge Issues a Decision Concerning a Racist US-Based Portal," Intellectual Property Today (March 2002).
580 Idem.
581 Jay Lyman, "German Court Rules Yahoo! Not Liable for Nazi Auctions," Newsfactor Technology News, March 28, 2001.


The CompuServe case
In December of 1995, German law enforcement officials presented CompuServe Germany with a list of 282 newsgroups hosted on CompuServe Inc.'s servers in the United States. They alleged that the groups contained child pornography and extreme violence in violation of German law and warned that further distribution could result in criminal sanctions.582 At the time, CompuServe Germany was a subsidiary owned by the American CompuServe parent company. CompuServe Germany did the marketing and sales for the parent company and ensured that the German customers were connected, but it did not provide or host content (mainly newsgroups), as those tasks were performed by the parent company in the United States.583 There was also no contractual relationship between the German subsidiary and the customers, whose contract was exclusively with the American company. After being informed of the illegal content, CompuServe Germany's managing director, Felix Somm, asked the American parent company to block the newsgroups. Initially CompuServe complied with Somm's request, but after an outcry from the Internet community in the United States, it made the newsgroups available again in February of 1996.584 The German subsidiary argued that pulling the newsgroups from the Internet was no longer necessary because it now offered a free filter tool to its subscribers. But the German prosecutor deemed this measure insufficient because the program did not block public access to hard pornography and pedophilia. In 1998, a court in Munich sentenced Somm to two years of probation,585 a decision that was overturned by the Munich Superior Court in 1999.586 The higher court argued that Somm was not in a position to remove the newsgroups and that the only thing he could do was to shut down the connection between CompuServe servers and German gateways, which would have deprived CompuServe Germany users of access to the Internet

582 Lothar Determann, "The New German Internet Law," 22 Hastings Int'l & Comp. L. Rev. 111 (1998) at 118.
583 Ibid. at 118.
584 Ibid. at 119-120.
585 Ibid. at 121.
586 Ibid. at 122.


altogether.587 The CompuServe case displays a somewhat different approach from the one followed in the Yahoo! case, because it targeted not the country where the content originated and was legal, but the local company that provided access to this content. However, by targeting an individual who was not responsible for the content and was in no position to remove or even block it, this approach was not popular and also did little to limit the availability of banned speech in Germany.588 Even though the ruling of the lower court seemed to display a lack of insight into the nature of the Internet, the general philosophy behind this approach, censoring content at the level of local ISPs, is one that would re-emerge years later in the form of blocking orders.


The Toben case
Whatever the regulatory difficulties of banning illegal content from across borders may be, once a known content provider of illegal material is apprehended in a jurisdiction where this content is illegal, he or she can be held liable for that content. In 1999, Fredrick Toben, a German-born Australian citizen, was sentenced to ten months in jail for distributing leaflets and maintaining a Web site on which he denied the Holocaust. He challenged the latter part of his conviction, stating that his Web site was based in Australia589 and was therefore not governed by German law. Initially, a lower court agreed with Toben and ruled that Germany's hate speech law applied only to German-based Web

587 Ibid. at 122-123.
588 A similar action was taken in the Yahoo! case. After the ruling from the California court came down, several French anti-racism groups filed a criminal complaint arguing that Yahoo! and its former president, Timothy Koogle, were guilty of justifying war crimes. The claim was rejected by the French court, because the company's actions did not fit the description "justifying war crimes." (Peter Piazza, "A Legal Victory for Yahoo!" Security Management Online, May 2003.)
589 Though it is reportedly hosted on an American server. (Terry Lane, "Censoring the Adelaide Institute's Web Site is Futile," On Line Opinion, Nov 30, 2000.)


sites, but in December of 2000, Germany's highest court, the Bundesgerichtshof, overruled this decision.590


Statutory law
In 1997, two media laws were adopted in Germany, designed to, among other things, keep content criminalized by the German Penal Code, especially hate speech, off the Internet.591 At the federal level, the "Information and Communication Services Law" (ICS)592 was enacted, while the states (Länder) enacted the "States Treaty covering Media Services" (STCM).593 The media law tradition in Germany dictates that the federal government is responsible for regulations regarding the infrastructure of communication systems, while the Länder are responsible for regulating media content.594 Although the language of the laws and the untraditional architecture of the Internet have caused a heated debate in Germany about who should be responsible for content regulation of the Internet, tradition suggests that the Länder would be responsible for Internet content.595 The STCM stipulates that Internet Service Providers are liable only for their own content, and for third-party content only when they have knowledge of this content and have the ability to bar access to it. When an ISP acts merely as a conduit, it is not responsible for third-party content.596 However, if an ISP gains knowledge of illegal content that is being accessed through its service, it can be required to block access to that content.597 The section of the ICS dealing with

590 Robyn Weisman, "Germany Bans Foreign Web Site for Nazi Content," Newsfactor Technology News, December 14, 2000.
591 Yulia A. Timofeeva, "Hate Speech Online: Restricted or Protected? Comparison of Regulations in the United States and Germany," 12 J. Transnat'l L. & Pol'y 253 (2003) at 262.
592 Gesetz zur Regelung der Rahmenbedingungen für Informations- und Kommunikationsdienste (BT-Drs. 13/7934 vom 11.06.1997).
593 Eric T. Eberwine, "Sound and Fury Signifying Nothing? Jürgen Büssow's Battle Against Hate Speech on the Internet," 49 N.Y.L. Sch. L. Rev. 353 (2004/2005) at 380.
594 Ibid. at 360-364.
595 Ibid. at 384.
596 Ibid. at 387.
597 Idem.


teleservices has an almost identical provision.598 However, these provisions have been criticized because they are vague as to what exactly the responsibility of ISPs is, and what the consequences of noncompliance are.599


Application of the STCM
In February 2002, Jürgen Büssow, President of the Düsseldorf District Government for Nordrhein-Westfalen, invoked the STCM to issue a blocking order against local ISPs. Büssow forced the seventy-six access providers operating in Nordrhein-Westfalen at the time of the order to block access to stormfront.org and Nazi-Lauck-nsdapao.com. Stormfront is one of the most popular extreme right-wing sites on the Internet; it offers free server space and email addresses and functions as a portal site to all kinds of hate sites. Most of its content is in English, but the site also contains a German-language section.600 The Nazi Lauck site contains a wide range of Nazi propaganda and is maintained by Gary Lauck, a German-American who served a four-year sentence in Germany for inciting racial hatred and for disseminating illegal propaganda.601 Despite protests and legal challenges, the blocking order has been found constitutional by courts.602 However, the order applies only to ISPs located in Nordrhein-Westfalen and does not apply to other German Länder. The implications of this blocking order will be discussed in greater detail in the final chapter of this work.
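The text does not say how the seventy-six access providers implemented the order. One common way for an access provider to block a named site is to refuse to resolve its domain name at the provider's own DNS servers; the sketch below illustrates that approach purely as an assumption, not as a description of how the Nordrhein-Westfalen ISPs actually complied.

```python
# Sketch of DNS-level blocking by an access provider (an assumed
# implementation of a blocking order, not the providers' actual setup).
BLOCKED_DOMAINS = {"stormfront.org", "nazi-lauck-nsdapao.com"}

class BlockedDomainError(Exception):
    """Raised instead of returning an address for a blocked name."""

def resolve(hostname: str, upstream_lookup) -> str:
    """Resolve a hostname, refusing blocklisted names and their subdomains."""
    name = hostname.lower().rstrip(".")
    if any(name == d or name.endswith("." + d) for d in BLOCKED_DOMAINS):
        raise BlockedDomainError(f"{hostname} blocked by administrative order")
    return upstream_lookup(name)

if __name__ == "__main__":
    fake_upstream = {"example.org": "93.184.216.34"}.get  # stand-in for a real resolver
    print(resolve("example.org", fake_upstream))
    try:
        resolve("www.stormfront.org", fake_upstream)
    except BlockedDomainError as err:
        print("refused:", err)
```

A block of this kind only affects customers who use the provider's own resolver, which is one reason such orders are often criticized as easy to circumvent.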

The European Union: Following the German Example?
At the European level, a similar regime towards ISP liability, and not exclusively in the context of hate speech, is emerging. The European

598 Gesetz zur Regelung der Rahmenbedingungen für Informations- und Kommunikationsdienste, Artikel 1 §5.
599 Amy Oberdorfer Nyberg, "Is all Speech Local? Balancing Conflicting Free Speech Principles on the Internet," 92 Geo. L.J. 663 (2004) at 668.
600 Eric T. Eberwine, "Sound and Fury Signifying Nothing? Jürgen Büssow's Battle Against Hate Speech on the Internet," 49 N.Y.L. Sch. L. Rev. 353 (2004/2005) at 354.
601 Idem.
602 Ibid. at 395-408.


Union's Directive on E-commerce,603 a directive requiring member states to harmonize their laws on e-business in order to reduce regulatory burdens and stimulate business, establishes under what circumstances ISPs are responsible for content they host or provide access to. The E-Commerce Directive was modeled after the German Teleservices Act604 (which would be amended in 2002 by the Electronic Commerce Law, Germany's law that implemented the EU Directive).605 Articles 12 through 15 of the Directive establish, among other things, that (in paraphrase):
- ISPs should not be burdened with monitoring the content they store or transmit.
- ISPs may be required to inform the appropriate authorities when illegal activity is undertaken or illegal information is provided by recipients of their services.
- ISPs can be asked to remove or prevent access to certain information.
- If an ISP is a mere conduit, it will not be held liable for information provided that 1) it does not initiate the transmission, 2) it does not select the receiver of the transmission, and 3) it does not select or modify the information contained in the transmission.
- If an ISP is acting as a host of information, it will not be liable for that information provided that it does not know the information is illegal or, upon gaining such knowledge, acts expeditiously to remove it or to disable access to it. A similar regime is in place for cached information.
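Read as a decision procedure, the conditions just listed can be rendered schematically. The sketch below is only an illustration of that logic: the data model and field names are invented for this example, and the Directive's actual provisions are considerably more detailed.

```python
# Schematic rendering of the conduit/host liability conditions paraphrased
# above. Field names and the data model are invented for illustration.
from dataclasses import dataclass

@dataclass
class Transmission:
    role: str                   # "conduit" or "host"
    initiated_by_isp: bool
    receiver_selected_by_isp: bool
    content_modified_by_isp: bool
    knows_content_is_illegal: bool
    acted_expeditiously: bool   # removed or disabled access once it knew

def isp_is_liable(t: Transmission) -> bool:
    if t.role == "conduit":
        # A mere conduit is liable only if it initiates the transmission,
        # selects the receiver, or selects/modifies the information.
        return (t.initiated_by_isp or t.receiver_selected_by_isp
                or t.content_modified_by_isp)
    if t.role == "host":
        # A host escapes liability while it lacks knowledge of illegality,
        # or if it acts expeditiously once it gains such knowledge.
        return t.knows_content_is_illegal and not t.acted_expeditiously
    raise ValueError(f"unknown role: {t.role}")

if __name__ == "__main__":
    passive_carrier = Transmission("conduit", False, False, False, False, False)
    notified_host = Transmission("host", False, False, False, True, False)
    print(isp_is_liable(passive_carrier))  # False: mere conduit
    print(isp_is_liable(notified_host))    # True: knew and did not act
```

As the discussion below notes, the hard question in practice is what counts as "knowledge" in the first place; the boolean field above simply assumes that question away.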

603 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market.
604 Benoît Frydman and Isabelle Rorive, "Regulating Internet Content through Intermediaries in Europe and the USA," 23 Zeitschrift Für Rechtssoziologie 41 (2002).
605 Oliver Köster and Uwe Jürgens, "Liability for Links in Germany, Liability of Information Location Tools Under the German Law After the Implementation of the European Directive on E-Commerce," Working Papers of the Hans Bredow Institute, no. 14 (2003).


One of the crucial questions regarding ISP liability is what will constitute “knowledge” of illegal content. This is not described well in the E-Commerce Directive, nor is it clear how a notice-and-take-down or notice-and-block regime could work. The implications of this liability regime will be discussed in detail in chapter six.


United Kingdom: Hotlines
While law journals have paid much attention to the spectacular but rather ineffective French attempts to limit access to online hate speech, efforts to curb hate speech in the United Kingdom have generated far less attention. But whereas some of the approaches discussed above were based upon regulation through legal intervention, the United Kingdom has relied more on a low-key approach based upon industry self-regulation. Following a Dutch example, an Internet hotline, the Internet Watch Foundation606 (IWF), was established in 1996 as an independent non-profit607 that would work together with government, police and the Internet industry to minimize the availability of child pornography online.608 In doing so, the industry hoped to avoid government regulation. The IWF operates a free hotline where the general public can report illegal content found online. The IWF then investigates the claim and, if needed, notifies the ISP that there is illegal content on its servers. Usually, the ISP then removes the material. While the IWF was set up to deal specifically with child pornography and still seems to focus mainly on this type of illegal content, since 2000 it has also become more proactive in the field of hate speech.609 Currently, three kinds of illegal content can be reported to the hotline: child abuse images610 from anywhere in the world, criminally obscene content and

606
607 Funded by the industry and the European Union.
608 "About the IWF," Internet Watch Foundation.
609 Alan Docherty, "Now You See, Now You Don't," Internet Freedom, February 20, 2000. (link no longer active)
610 The term "child abuse images" is used in a rather narrow meaning and only refers to "any images of children, apparently under 18 years old, involved in sexual activity or posed to be sexually provocative."


criminally racist content hosted in the United Kingdom.611 The hotline can only take action against racist content hosted on servers located in the United Kingdom; ISPs outside the United Kingdom are not affected by the IWF. However, when illegal child abuse content is hosted outside the United Kingdom, the IWF will pass on the information to the hotline in that country, provided one exists, or to Interpol.612 The IWF also compiles a list of Web sites containing illegal child abuse content hosted on foreign servers. In 2004, British Telecom, the United Kingdom's biggest high-speed ISP, launched a project under the name of "Cleanfeed" that uses a list compiled by the IWF to filter out the Web sites that are on the IWF blacklist.613 Users trying to access these sites receive an error message. However, this system only applies to Web sites containing illegal images of child abuse,614 not to hate speech. The hotline approach seems to make effective blocking much easier. Asking ISPs to remove illegal content hosted on their servers seems to be an appropriate enforcement of local laws. Compiling a blacklist of Web sites that ISPs can block on a voluntary basis also seems to be a reasonable way to make illegal content unavailable. Because it is a voluntary regime and the IWF is a private entity, there are fewer legal and practical hurdles to be cleared than when the government orders these kinds of blocks. Still, the IWF has its detractors. While it is technically an independent body, it works closely with the government and has received government funding in the past.615 The IWF is seen by some as a knee-jerk reaction of the industry

611 "The Hotline and the Law," Internet Watch Foundation.

612 Idem.
613 S.A. Mathieson, "Back Door to the Black List: BT's System to Block Access to Child Pornography could Actually be Manipulated to Search for Illegal Material, According to New Research," The Guardian, May 26, 2005, p. 19.
614 "Any images of children, apparently under 18 years old, involved in sexual activity or posed to be sexually provocative." ("The Hotline and the Law," Internet Watch Foundation.)
615 Michael D. Birnhack and Jacob H. Rowbottom, "Do Children have the Same First Amendment Rights as Adults? Shielding Children: The European Way," 79 Chi.-Kent. L. Rev. 175 (2004) at 224.


to threats of government regulation.616 Netfreedom, a British Internet free speech organization, has criticized the IWF for being a de facto censor of the Internet and for being a non-democratic and unaccountable organization without transparency.617 The IWF's focus has also largely been on (child) pornographic content; in 2007 it received over 28,000 reports of child pornographic content compared to only 843 reports of incitement to racial hatred. The majority of the latter were deemed not to amount to illegal content or were hosted outside of the U.K.618
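The Cleanfeed-style filtering described above can be illustrated in outline. The sketch below is a toy version, not BT's actual design (which the research cited in the footnotes examined): the blacklist entry is hypothetical, and real lists are kept confidential and are matched at the level of individual URLs so that a single page can be blocked without blocking an entire site.

```python
# Toy illustration of URL-blacklist filtering by an access provider:
# requests for blacklisted pages get an error response instead of the page.
from urllib.parse import urlsplit

# Hypothetical entry; real blacklists are confidential and hotline-supplied.
BLACKLIST = {("blocked.example", "/gallery/unlawful")}

def handle_request(url: str) -> tuple[int, str]:
    """Return an (HTTP status, body) pair for a requested URL."""
    parts = urlsplit(url)
    if (parts.hostname, parts.path) in BLACKLIST:
        return 404, "Not found"            # the user only sees an error message
    return 200, f"(content of {url})"      # otherwise the request is passed through

if __name__ == "__main__":
    print(handle_request("http://blocked.example/gallery/unlawful"))
    print(handle_request("http://blocked.example/about"))
```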


Hotlines at the European Level
In 1997, the European Union launched its Action Plan on Promoting the Safe Use of the Internet, with the aim of formulating and stimulating self-regulatory approaches, following the British model.619 Many countries now have hotlines that one can notify about the presence of illegal online content. An organization called Inhope coordinates and facilitates the work of hotlines in different countries. Inhope counts over thirty members, including the United States.620 Although its focus is mainly on banning child pornography, Inhope also deals increasingly with instances of hate speech. However, in the absence of a globally harmonized legal framework, hotlines will necessarily be local solutions. Although there are still legal differences between laws on child pornography, these laws come close to being a

616 Alan Docherty, "Now You See, Now You Don't," Internet Freedom, February 20, 2000. (link no longer available)
617 Idem.
618 "2007 Annual Report," Internet Watch Foundation, p. 4. Available at:
619 Kerem Batir, "Regulating Hate Speech on the Internet, Unilateralism v. Multilateralism, Technique v. Law," European Community Institute.
620 "Inhope: Facts by Country: United States-Overview," Inhope - The Association of Internet Hotline Providers.


"legislation across the globe,"621 making international cooperation more feasible. In the area of hate speech, this level of harmonization is not present. The United States stands out because of its radical free speech approach towards hate speech; nevertheless, significant differences in hate speech law still exist within Europe. In order to harmonize hate speech laws and provide a unified legal framework for combating cybercrime in general, the Council of Europe drafted the Additional Protocol to the Convention on Cybercrime, Concerning the Criminalisation of Acts of a Racist and Xenophobic Nature Committed through Computer Systems.622

Additional Protocol to the Convention on Cybercrime


The Council of Europe
The Council of Europe was established in 1949 as an intergovernmental European organization dedicated to defending human rights and democratic values in Europe and to developing continent-wide agreements and standardized legal norms. It is mainly a body that drafts treaties623 and currently counts forty-six members. The Council of Europe is not to be confused with the European Union, which has fewer members and whose decisions are more binding on its members. The Council of Europe's actions, on the other hand, have no effect unless they are ratified by the member countries. The United States is not a member of the Council of Europe, but has been granted observer status. It has a representative in the Council, but is not represented in the Committee of Ministers or in the Council's Parliamentary Assembly. Treaties which the Council seeks to extend past its borders are also

621 Cormac Callanan, "Press Release: The Operational Experience of the European Internet Complaint Hotlines," Inhope: The Association of Internet Hotline Providers, June 16, 2004. (link no longer active)
622 "Explanatory Report, Additional Protocol to the Convention on Cybercrime, Concerning the Criminalisation of Acts of a Racist and Xenophobic Nature Committed through Computer Systems," at 3.
623 Christopher D. Van Blarcum, "Internet Hate Speech: The European Framework and the Emerging American Haven," 62 Wash & Lee L. Rev. 781 (2005) at 788.


open for signature and ratification to non-member countries.624 The United States has signed Council of Europe treaties on several occasions.625 The United States is a signatory to the original Convention on Cybercrime (completed in 2001), which focused on effective cooperation between signatories in combating online child pornography, violation of intellectual property and piracy online, hacking, virus transmission and organized crime. During the drafting process of the Convention on Cybercrime, several delegations suggested that the document should also include a provision on hate speech. But it soon became clear that, because of First Amendment concerns, the United States would not sign the Convention if this were the case. Therefore, it was decided to move the hate speech provisions to a separate protocol. The United States signed the Convention on Cybercrime in November 2001 and ratified it in 2006, and it entered into force for the United States in January of 2007. However, the United States also declared that it would not sign the separate protocol because it is inconsistent with American constitutional guarantees.626 The Additional Protocol to the Convention on Cybercrime Concerning the Criminalisation of Acts of a Racist and Xenophobic Nature Committed through Computer Systems (the Protocol) was opened for signature on January 28, 2003. As of July 2008, thirty-one member countries as well as two non-members had signed the Protocol, and it had entered into force in twelve countries. The United Kingdom has neither signed nor ratified the Protocol.627

624 Other non-member states with observer status are Canada, Japan, Mexico and The Vatican.
625 Christopher D. Van Blarcum, "Internet Hate Speech: The European Framework and the Emerging American Haven," 62 Wash & Lee L. Rev. 781 (2005) at 788-789.
626 Amy Oberdorfer Nyberg, "Is all Speech Local? Balancing Conflicting Free Speech Principles on the Internet," 92 Geo. L.J. 663 (2004) at 670.
627 For an up-to-date list, see:


The Protocol


The Protocol628 follows the approach spelled out by numerous international treaties.629 It advocates a low tolerance of hate speech, but it leaves the nation states considerable freedom in how they implement its provisions into national law. The Protocol only deals with racist/xenophobic material made available to the public instead of with directed communication (except in the case of threats), suggesting that the purpose of the Protocol is more to maintain public order and societal peace than to protect individuals from hurtful expression.630 It requires that signatories criminalize "distributing, or otherwise making available, racist and xenophobic631 material to the public through a computer system."632 However, a party can choose not to criminalize this behavior in certain circumstances. "[I]f due to established principles in its national legal system concerning freedom of expression, it cannot provide for effective remedies,"633 it can opt not to implement this provision, but only if it concerns "the advocating, promoting or inciting to discrimination, which is not associated to hatred or violence."634 So while the Protocol leaves some room to signatories, it requires that, at a minimum, they ban racially motivated hate speech that is associated with violence. It is not clear how strong

628 Available at:
629 See: "Explanatory Report Additional Protocol to the Convention on Cybercrime, Concerning the Criminalisation of Acts of a Racist and Xenophobic Nature Committed through Computer Systems," preamble and commentary/explanation on Article 2.
630 Yulia A. Timofeeva, "Hate Speech Online: Restricted or Protected? Comparison of Regulations in the United States and Germany," 12 J. Transnat'l L. & Pol'y 253 (2003) at 266.
631 Defined in Article 2.1. as "any written material, any image or any other representation of ideas or theories, which advocates, promotes or incites hatred, discrimination or violence, against any individual or group of individuals, based on race, colour, descent or national or ethnic origin, as well as religion if used as a pretext for any of these factors."
632 Article 3.2.
633 Article 3.3.
634 "Explanatory Report Additional Protocol to the Convention on Cybercrime, Concerning the Criminalisation of Acts of a Racist and Xenophobic Nature Committed through Computer Systems," at 32.


this "association" has to be, so it appears that countries could be given leeway to interpret this according to their own laws. In the United States (had it signed the Protocol, which it did not and will not), this would mean that the connection between violence and speech would have to be very strong, as spelled out in Brandenburg. But even if a statute based on article three could meet this imminence requirement, it would still be constitutionally suspect because it singles out specific types of incitement.635 Article four requires each nation state to establish as a criminal offense the conduct of threatening individuals or groups of people, through a computer system, with the commission of a serious criminal offense (as defined by domestic law) because "they belong to a group, distinguished by race, colour, descent or national or ethnic origin, as well as religion, if used as a pretext for any of these factors."636 As discussed above, threats are not constitutionally protected in the United States, but this article would still bring up First Amendment concerns because it singles out certain types of threats based on the viewpoint they express. Content-based restrictions of speech can be constitutional, for example, when they are "based on the very reasons why the particular class of speech at issue … is proscribable,"637 as was the case in Virginia, but viewpoint-based restrictions are not. Even though threats are not protected by the First Amendment, banning threats solely on the basis of the criteria set out in article four would bring up the same underinclusiveness concerns that sank the St. Paul statute at issue in R.A.V. Article five requires parties to the treaty to criminalize the making of public insults over a computer system on the basis of the same characteristics spelled out in article four. This article would clearly not be constitutional in the United States, where a mere insult, as long as it does not amount to fighting words, harassment or libel, is protected speech. However, the Protocol gives signatories the possibility to opt out of article five.638 Article six requires states to

635 See also: Amy Oberdorfer Nyberg, "Is all Speech Local? Balancing Conflicting Free Speech Principles on the Internet," 92 Geo. L.J. 663 (2004) at 674.
636 Article 4.
637 505 U.S. 377 at 393.
638 Article 5.2.


criminalize public distribution through a computer system of "material which denies, grossly minimises, approves or justifies acts constituting genocide or crimes against humanity."639 Article 6.2 allows nation states to require that this minimization or denial be done with the intent to incite hatred, violence or discrimination against (groups of) individuals based on certain characteristics, or to opt out of this provision altogether. Even in its weakened form, this article would still pose First Amendment concerns, since it is viewpoint-based and because it does not stipulate the likelihood that the said illegal behavior would result. Because the Protocol leaves so many opt-out options open, it is doubtful that it will actually harmonize hate speech laws across Europe. For example, nations can still decide whether or not to criminalize Holocaust denial. However, once more countries sign and enact the Protocol in national law, there would at least be some accepted common standards. For example, racially motivated calls for violence online would be banned in all countries that signed on to the Protocol, and it would enable authorities to work together across borders to prosecute these types of hate speech, for example through a network of hotlines.

Conclusion
This overview shows that during the years the Internet rose to prominence, different European countries have explored different strategies to combat online hate speech originating from inside and outside their own borders. The E-Commerce Directive opens the door for a notice-and-take-down regime in European countries, a regime that could be even more effective if European hotlines worked together more closely. However, an efficient hotline regime depends upon a harmonized legal framework. The Protocol tried to create such a framework, but given the many opt-out provisions it contains, it is doubtful that it will be able to provide this harmonized legal framework. Even then, such an approach would be powerless against hate sites hosted abroad. Many European Web masters or providers of speech that is banned in their home country host their sites in the United States, a situation

639 Article 6.1.


that existed even before the Protocol.640 As Europe cracks down on Internet hate speech, this trend will most likely only get stronger. Reliable figures on the extent of this phenomenon are hard to come by, but the United States is often referred to as the country from which most online hate emanates. A former French minister of Justice said that 2,500 of the 4,000 racist sites counted worldwide were hosted in the United States.641 It has been estimated that computers in the United States host 800 German-language hate sites alone.642 The last chapter will discuss and evaluate solutions for European nations to ban hate speech originating both from outside and from inside their borders. However, before such an evaluative effort can be undertaken, a normative framework is needed. The next chapter, based on a normative theory of the Internet and appropriate regulatory approaches to this new medium, will develop a framework against which possible solutions for this "problem" can be evaluated.

640 Brian Levin, "Because of the Constitution's First Amendment, the United States Now Hosts Hundreds of European Language Hate Sites," Southern Poverty Law Center: Intelligence Report (2003).
641 "Statement by Mr. Gérard Kerforn, Introducer at the Fourth Session of the Conference on Racism, Xenophobia and Discrimination," Vienna, 4-5 September, 2003.
642 Russell Working, "Illegal Abroad, Hate Web Sites Thrive here," Chicago Tribune, November 13, 2007, p. 1.


CHAPTER 5

Evaluating Cross-Border Internet Content Regulation


Introduction
The previous chapters described the different regulatory approaches towards hate speech in the United States and Europe and how these approaches have been applied to the Internet. As a consequence of these different regimes, European courts, anti-racism organizations and lawmakers are struggling with the problem of how to regulate hate speech that reaches Europe via the Internet but originates in the United States, where it is constitutionally protected. The final chapter will discuss some regulatory attempts and possibilities that exist to do this and evaluate them on the basis of a set of criteria that will be developed in this chapter. "Regulation" is a broad term; as Lessig observed, regulation of the Internet can come from multiple sources: the law, social norms, architecture, and the market.643 Though the focus in this work has mainly been on the law as a source of Internet regulation, other sources of regulation, for example, self-regulation through acceptable use policies, are also solutions that ought to be considered. In this chapter we will develop a set of normative criteria that should guide attempts to regulate hate speech originating from across borders. The normative model outlined below allows critical evaluation of proposed solutions to the issue of transborder content regulation and can be applied not only to hate speech but also to other types of speech. Three fundamental principles underlie these criteria: respect for the open, layered structure of the Internet, a representational concept of sovereignty, and effectiveness.

643 Lawrence Lessig, Code and Other Laws of Cyberspace (1999) at 88.


Maintaining the Layered Structure of the Internet


Regulating the Internet: Normative and practical considerations
Before proposing a normative framework to evaluate content regulation, it is important to ask whether or not the Internet can and should be regulated. The answer to this question depends on how the Internet is conceived. If one sees the Internet as a "machine," as the French Judge Gomez did in the Yahoo! case, there is little reason to assume that the Internet should not be subjected to the same kinds of regulations as any other "machine." If, on the other hand, one sees the Internet not as a tool, but as a separate sovereign place -a "cyberspace" with its own norms and values- a distinct regulatory approach for the Internet is more likely to be advocated. Claims of "cyber independence" were particularly strong up until the early-to-mid-nineties, when the Internet was used or "inhabited" by a relatively small community with its own specific set of values and norms. For example, unlike on the Internet as we know it now, commercial activity was not tolerated in those earlier days.644 In 1994, "the Internet was viewed not as a space for commercial advertising but as a place for community, sharing, and public discourse."645 Out of this maverick ethos emerged a strong feeling of autonomy or "cybersovereignty." Fundamental to this ethos was the conviction that cyberspace is a space separate and different from "real space," and that the governments of the real world should respect cyberspace's autonomy and not try to impose their rules on this new space. John Perry Barlow articulated this sentiment in his A Declaration of the Independence of Cyberspace: "Governments of the Industrial World,

644 When two lawyers used the Internet to advertise their services in 1994 through mass emails and mailing lists they subscribed to, the reaction of the "Internet community" was one of outrage and they were forced to stop their online commercial soliciting. See: Laura J. Gurak, Persuasion and Privacy in Cyberspace: The Online Protests over Lotus Market Place and the Clipper Chip (1997).
645 Laura J. Gurak, Cyberliteracy: Navigating the Internet with Awareness (2001) at 130.


you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather."646 These "cyber-independents" believed in the liberating potential of this alternative reality, and therefore argued that the "real world" should not impose its laws upon this new "space" where different norms and values apply. Cyberspace was conceived to be the new Western frontier, a place of relative lawlessness, a lawlessness that does not lead to chaos but that is conducive to prosperity and development.647 This cyberlibertarian claim is based upon both factual and normative arguments that ought to be distinguished from each other. The factual argument states that attempts to regulate the Internet are futile because of certain characteristics of the Internet. Those who make this argument see the Internet as a separate place that cannot be intruded upon. They point out that because the identity and location of Internet users are oftentimes unknown and because the Internet is decentralized and borderless, traditional local laws cannot be applied to it.648 They state that because of the geographical indeterminacy of the network architecture (an IP address tells you nothing about the geographic location or identity of the user649) the Internet was designed not to permit the flow of geographical information.650 Therefore, regulation critics have argued that local governments should accept the fact that they are unable to control information flow across their borders. This argument states that the Internet poses problems of choice of law as well as enforcement of these laws of such magnitude that Internet regulation through law is nearly impossible. The normative argument, on the other hand, claims that governments should not regulate the Internet because doing so would

646 John Perry Barlow, "A Declaration of Independence of Cyberspace," February 8, 1996.
647 Alfred C. Yen, "Western Frontier Or Feudal Society?: Metaphors and Perceptions of Cyberspace," 17 Berkeley Tech. L.J. 1207 (2002) at 1224.
648 Ibid. at 1225-1226.
649 However, as will be discussed in chapter six, geolocation software has made it possible to determine the location of a user.
650 Matthew Fagin, "Regulating Speech Across Borders, Technology vs. Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 404.


be detrimental to its development and to the libertarian ideals upon which it was built. The normative argument is rooted in the belief that the Internet is a separate, autonomous place where local governments have no authority. Internet regulation, those pioneers believed, should emerge from within and be enforced by the Internet community and not imposed from above. This could be done through, for example, contractual relationships and user preferences.651 A number of instances of collective action on the Internet in the mid-nineties were successful in enforcing these values against governmental and commercial players.652 The scholarly translation of this idea can be found in an important 1996 article by David Johnson and David Post653 in which they argued that, because of the disconnect between the Internet and any specific jurisdiction, the Internet had challenged the ability and legitimacy of nation states to regulate. They maintained that "[t]he Net thus radically subverts the system of rule-making based on borders between physical spaces, at least with respect to the claim that Cyberspace should naturally be governed by territorially defined rules."654 (Note the use of a capital "C" in the word "Cyberspace.") The challenge for these scholars was to formulate what law would apply in this new sovereign jurisdiction. During numerous symposia held in 1996 and 1997, theorists formulated proposals for distinct regulatory approaches best suited to the Internet, and analyzed legal doctrines that could work in cyberspace. This scholarship was unified by the belief that cyberspace is and should be an autonomous place, regulated independently from the "sovereigns" of the real world.655 At that time, the idea of cybersovereignty was well established. A substantial amount of scholarship did not question it, but was dedicated to establishing a

651 David R. Johnson and David G. Post, "The New 'Civic Virtue' of the Internet," 1998.
652 See for example: Laura J. Gurak, Persuasion and Privacy in Cyberspace: The Online Protests Over Lotus Market Place and the Clipper Chip (1997).
653 David R. Johnson and David R. Post, "Law and Borders - the Rise of Law in Cyberspace," 48 Stanford L. Rev. 1367 (1996).
654 Ibid. at 1370.
655 Dan Hunter, "Cyberspace as Place and the Tragedy of the Digital Anticommons," 91 Calif. L. Rev. 439 (2003) at 448-449.


regulatory model for the Internet, independent of the laws of the "real world." However, as the Internet became more popular and more commercial in the second half of the nineties, this view would come under attack. Jack Goldsmith criticized the factual argument of the "regulation skeptics," arguing that there is no reason why local laws could not apply to cyberspace.656 The Internet, according to Goldsmith, may present some new legal challenges, but he disagreed with the notion that transactions and interactions taking place on the Internet are qualitatively different from transactions taking place offline. Goldsmith acknowledged that in cyberspace transactions between people from different jurisdictions can take place easily and that these kinds of transactions may pose jurisdictional problems, but he argued that this choice-of-law problem is not unique to Internet transactions and does not mean that these transactions cannot be governed by traditional law. At around the same time, scholars also began to chip away at the normative argument supporting cyber-independence. They argued that there is no reason to believe that the liberal democratic ideals of individual liberty, popular sovereignty and consent of the governed could be guaranteed by cyber-governance any better than by traditional law.657 Still, even though the cyberlibertarians' values emerged at a time when the Internet's uses were not as varied as they are today, the open structure of the Internet they envisioned has advantages. Many observers have noted that the open and free flow of information, with few rules and no central authority, has been crucial to the development of the Internet because it allows, among other things, for a low cost of innovation.658 On the Internet, innovators can design new applications

656 Jack L. Goldsmith, "The Internet and the Abiding Significance of Territorial Sovereignty," 5 Ind. J. Global Leg. Stud. 475 (1998) at 1199-1200. See also Jack L. Goldsmith, "The Internet and the Abiding Significance of Territorial Sovereignty," 5 Ind. J. Global Leg. Stud. 475 (1998); Jack Goldsmith, "Unilateral Regulation of the Internet: A Modest Defense," 11 Eur. J. Int'l L. 135 (2000); Allan R. Stein, "The Unexceptional Problem of Jurisdiction in Cyberspace," 32 Int'l Law. 1167 (1998).
657 Dan Hunter, "Cyberspace as Place and the Tragedy of the Digital Anticommons," 91 Calif. L. Rev. 439 (2003) at 450-451.
658 Matthew Fagin, "Regulating Speech across Borders, Technology vs. Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 405.


or create new content without having to worry about their compatibility with the network, without having to obtain permission, and without needing significant financial resources. From a technological perspective, the Internet is a far more egalitarian medium than traditional mass media. It treats all users and messages alike, regardless of the content of the messages or the identity of the users. Once the information is broken down and transported, it does not matter whether the data packets are part of the New York Times Web site or a local blog about ice fishing. Naturally, at the content level, not all information is equal, and the New York Times Web site is not equal in power, resources and popularity to an amateur Web site. But as far as the medium is concerned, it takes few resources to have one's content distributed. This egalitarian nature has facilitated the emergence of, for example, blogs as important voices in national and international debates and news stories. But the low-cost-of-innovation argument applies mainly to technological innovation. A 16-year-old whiz kid can develop a new application in his bedroom and make it available to the Internet community by means of a couple of mouse clicks. This simplicity, ease of access and low cost of innovation have been contributing factors to the Internet's development and have created an environment conducive to innovation, both technological and cultural:

Not innovation in just the dotcom sense, but innovation in the ways humans interact, innovation in the ways that culture is spread, and most importantly, innovation in the ways in which culture gets built. The innovation of the Internet --built into its architecture-- is an innovation in the ways in which culture gets made. Let the dotcom era flame out. It won't matter to this innovation one bit. The crucial feature of this new space is the low cost of digital creation, and the low costs of delivering what gets created.659

But the Internet has become much more than a place to exchange ideas or create culture. It is now a medium through which one can offer goods for sale, shop, file taxes, register for classes, rent movies, play

659 Lawrence Lessig, "The Architecture of Innovation," 51 Duke L.J. 1783 (2002).


fantasy football, and book trips. With this proliferation of functions, complexity may have to be built into the network to ensure security and reliability of the network, making some form of (government) oversight necessary. However, a balanced approach is needed, one which “would retain some measure of the original network's simplicity and lack of structure, and aspire toward alternatives that do not needlessly fragment on-line users and their communities.”660 One does not need to subscribe to the notion that cyberspace is a distinct geographical location with unique rules to agree with the proposition that any kind of regulation should try to respect and maintain the structure of the Internet that has made the development of the medium possible. Solum and Chung, in a very expansive law review article,661 develop this principle further, proposing some specific guidelines for Internet regulation.


Code
Solum and Chung rely on the work of Lawrence Lessig in developing criteria to determine how this balance between openness and regulation can be struck, and how a set of principles can be developed. They draw on Lessig's notion that "code" is what operates and determines activity on the Internet.662 Lessig launched his code concept in an influential 1999 book,663 in which he argues and explains how computer code determines behavior on the Internet, capturing this idea in the catchphrase that "code is law" on the Internet.664 Code, according to Lessig, determines the parameters for behavior and action on the Internet. Because of the particular way the Internet is designed, because of its architecture, which is determined by code,665 certain things can be

660 Matthew Fagin, "Regulating Speech Across Borders, Technology vs. Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 406.
661 Lawrence B. Solum and Minn Chung, "The Layers Principle: Internet Architecture and the Law," 79 Notre Dame L. Rev. 815 (2004).
662 Ibid. at 827-829.
663 Lawrence Lessig, Code and other Laws of Cyberspace (1999) at 297.
664 Ibid. at 6.
665 The exact meaning of the term "code" and its relationship to "architecture" is not always clear in Lessig's book. At some times, his use of the term code seems to be the meaning it has amongst computer programmers, at other times its meaning seems to be more metaphorical. "Architecture" seems to refer to the hardware, software and protocols on which the Internet is run and "code" to the computer languages and the software and hardware environment that make up the Internet. The terms do overlap in meaning, but code seems to be a more fundamental concept, underpinning the architecture of the Internet.


done with ease, while others are hard or impossible. The Internet's architecture is analogous to the road system in a city. Just like the road system of a city determines where you can go with your car, the Internet architecture determines how you can move around in cyberspace. Cities can decide to make their city center accessible only for buses, to levy tolls to use certain lanes at certain times, or to require permits to drive in certain areas of town at certain times. Cities may try to reward certain behaviors such as carpooling or using public transportation and they may try to discourage or punish others, such as speeding or driving during rush hour. Code, according to Lessig, does the same thing for the Internet; it determines how we can move around on the Internet. The Internet was designed to allow a rapid flow of information, uninhibited by physical borders and without central authority. This "design" has consequences for how you can "move around" on the Internet. Because of these architectural features, users in France or Germany have access to Web sites hosted on American servers that may be illegal in their own country. Currently, the options for governments wanting to regulate Internet traffic or speech are limited. This has led Lessig to state that the architecture of the Internet has exported a First Amendment in code "more extreme than our own First Amendment in law."666 This does not mean that code always enables free flows of information. For example, code also enables content providers to make material available only for viewing and not for downloading, or to allow access to certain content only to paying subscribers or to people willing to divulge personal information. The French order in the Yahoo! case serves as a good illustration of the tenuous relationship between architecture and regulation of the Internet. As discussed in chapter five, the French judge ordered Yahoo! to use geolocation software in order to determine whether or not surfers accessing its auction site were French and block their access if they were. Experts testified that 70% of the French users could be identified this way. Part of the reason that only 70% of the users could be identified as French through geolocation software was that French AOL

666 Lawrence Lessig, Code and other Laws of Cyberspace (1999) at 224.



subscribers' IP addresses came up as being located in the United States. This is because AOL uses the services of the UUNET network, a commercial Internet Service Provider. As a result of this setup, dynamic IP addresses (IP addresses assigned to a user for the length of his connection) attributed by AOL appeared to be located in Virginia, where UUNET is located.667 However, this does not necessarily have to be so. It is conceivable that AOL would make changes in the way it assigns IP addresses or would organize its network differently. Geolocation software could (and did) become more sophisticated in finding ways to distinguish French users. In other words, the cyberlibertarian claim that location does not matter on the Internet is not one that captured an inherent characteristic of the Internet,668 but one that described the Internet architecture and code at a certain point in time. A central idea in Lessig's work is that this architecture can be changed, and that these changes may have consequences for the way information and culture are distributed over the Internet. In this respect, Lessig distances himself from the factual claims (though he certainly shares some of their normative claims) of the cyberlibertarians who stated that the essential nature of the Internet is that it is free.669 While the architecture of the Internet as it is now (or as it was when Lessig's book first appeared in 1999) makes it hard for governments to regulate behavior on the Internet, it is not impossible for lawmakers to regulate the architecture of the Internet.670 Lessig's attitude towards government regulation of the Internet is somewhat ambiguous. He seems to accept that governments need to impose their laws on the online environment and does not share the notion that the Internet is a sovereign space the government has no business regulating, but he warns against governmental attempts to regulate the architecture of the Internet and the stifling consequences this would have for innovation and creativity. When developing a normative framework for Internet regulation, Solum and Chung take this open architecture that has been conducive to innovation and development as

November Order. Matthew Fagin, "Regulating Speech across Borders, Technology vs. Values." 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 412-415. 669 Lawrence Lessig, Code and other Laws of Cyberspace (1999) at 5. 670 Ibid. at 43. 668


The fact that the code of the Internet reflects American First Amendment values is not by itself a sufficient argument for maintaining this openness, since it is only convincing to those who subscribe to the American First Amendment concept to begin with. A normative theory that instead takes the open structure of the Internet and its importance to innovation as its starting point has a broader appeal. Solum and Chung further develop this argument by borrowing another concept from Lessig: the end-to-end principle.


The end-to-end principle and the layered structure of the Internet

The end-to-end principle671 means that on the Internet, intelligence is kept at the ends (the applications) of the network, while the network itself is kept simple and basic. In order to apply this concept and arrive at their normative principle for Internet regulation, Solum and Chung use the concept of a layered Internet.672 They argue that a key feature of the Internet is that it is comprised of different "layers," each with its own function in the information processing and transporting process that constitutes Internet activity.673 The layer model is a spatial metaphor in which information is passed on from the top layer to the lower layers. Data contain a header with instructions for each layer, and when a layer receives data from a different layer, it performs its function according to the information found in the header and passes it on to the next layer.674

Solum and Chung's model675 describes the layered nature of the TCP/IP protocol, the protocol that enables the connection of networks and constitutes the code of the Internet.676 The first layer they identify677 is the (1) content layer, referring to the content of a given Web site, email or other Internet application; the specific signs that constitute the message. The second layer is the (2) application layer, the layer that handles the details of a certain Internet application.

671 Lawrence B. Solum and Minn Chung, "The Layers Principle: Internet Architecture and the Law," 79 Notre Dame L. Rev. 815 (2004) at 829.
672 Ibid. at 845.
673 Ibid. at 816.
674 Ibid. at 842.
675 Ibid. at 839-840.
676 Ibid. at 838.
677 Ibid. at 852.


Examples of application layer protocols are the HTTP protocol, which enables the World Wide Web, the FTP protocol for file transfer, the Domain Name System (DNS) for connecting domain names to IP addresses, and the SMTP protocol for email. The (3) transport layer (TCP) provides the flow of data between two hosts. It is where the data from the application layer are broken up into data packets and handed to the network (IP layer). At the receiving end, data packets received from the IP layer are assembled and delivered to the application layer. The (4) IP layer (Internet Protocol), or the network layer, handles the movement of data packets around the network and the encoding of IP addresses (to figure out where data need to be sent). The (5) link layer is the layer handling the physical linking of interfacing computers' hardware with the network hardware. When new hardware is used for Internet communication, all that needs to be done is finding a way to hook it up to the Internet. Typically, this is accomplished by a device driver for the specific piece of hardware. This way, the upper layers are independent from the hardware, and the TCP/IP protocols are not burdened by having to adapt to new hardware. The (6) physical layer is the last layer, the physical infrastructure over which the actual transfer of bits takes place.

Solum and Chung claim that this layer separation is fundamental to the design of the Internet. The lower layers do not "know" what happens in the layers above them; their functioning is not determined by the upper layers. Solum and Chung clarify this point by explaining the workings of the Internet in some more detail. Most Internet users know the HTTP protocol as the application (layer 2) that enables the World Wide Web, yet other applications are also commonly used. For example, the Simple Mail Transfer Protocol (SMTP) is the application that makes email possible, and the File Transfer Protocol (FTP) is the protocol that usually is used when one downloads a file from a server on the Internet. So when we state that the lower layers do not "know" what happens in the layers above, or that intelligence is kept at the "ends" of the network, this simply means that the lower layers cannot distinguish between data packets that are part of an email, a Web page or an MP3 file that is being shared through a file sharing network. Once the transport layer (3) has broken the data into several small data packets, the IP layer (4) cannot differentiate between PDF and MP3 files.


The Internet also cannot ensure that all data packets arrive at a receiver's computer at the same time; the majority of the data packets making up a network communication may arrive almost immediately, while the remaining data packets may take much longer because they are being rerouted.678 Once it is traveling over the Internet, all information is equal.679

678 Ibid. at 829-831.
679 Ibid. at 846.
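The division of labor among the layers can be sketched in a few lines of code. This is a deliberately minimal model, not the real TCP/IP stack: actual protocols add checksums, sequence numbers, retransmission and much else. The only point illustrated here is that once data leave the application layer, the layers below handle opaque bytes and cannot tell an email from a Web page or an MP3 file.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    ip_header: dict     # network-layer information: source and destination addresses
    tcp_header: dict    # transport-layer information: segment number, etc.
    payload: bytes      # application data, opaque to the layers below

def application_layer(message: str) -> bytes:
    # An HTTP request, an SMTP message or an FTP transfer: below this point, just bytes.
    return message.encode("utf-8")

def transport_layer(data: bytes, segment_size: int = 8) -> list:
    # Break the application data into small segments (toy "packets").
    return [data[i:i + segment_size] for i in range(0, len(data), segment_size)]

def ip_layer(segment: bytes, number: int, src: str, dst: str) -> Packet:
    # The IP layer reads only the addresses in the header; it cannot tell a PDF from an MP3.
    return Packet(ip_header={"src": src, "dst": dst},
                  tcp_header={"segment": number},
                  payload=segment)

data = application_layer("GET /index.html HTTP/1.0")
packets = [ip_layer(seg, n, "192.0.2.1", "198.51.100.7")
           for n, seg in enumerate(transport_layer(data))]
print(len(packets), "packets, each carrying bytes the network does not interpret")
```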


Guidelines for regulators

Solum and Chung argue that regulation of the Internet should respect the layered nature of the Internet, because it guarantees that the cost of innovation is kept low. Based on this end-to-end principle, the layered structure of the Internet and the code thesis, Solum and Chung derive two specific guidelines for regulators of the Internet: (1) They "should not adopt any regulation that would require one layer of the Internet to differentiate the handling of data on the basis of information available only at another layer, absent a compelling regulatory interest."680 The second guideline requires that "if compelling regulatory interests require a layer-crossing regulation, public Internet regulators should adopt the feasible regulation that minimizes the distance between the layer at which the law aims to produce an effect and the layer targeted by legal regulation."681 In other words, this principle states that layer crossing should only be done if no other regulatory alternative is available, and if the interest that is furthered by regulation is a compelling one. Even then, the layer crossing should be minimized. The problem of regulating hate speech is manifested at the level of the content layer, so according to these rules, regulations of content should be dealt with at the content (1) layer or, if this is impossible, at a layer as close as possible to this layer.

680 Ibid. at 866.
681 Idem.
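The two guidelines can be restated as a simple decision rule. The sketch below is only a toy rendering for illustration: the layer names follow Solum and Chung's model, but the idea of counting "distance" as the number of layers crossed, and the labels returned, are illustrative shorthand rather than part of their framework.

```python
# Solum and Chung's six layers, ordered from content (top) down to the physical layer.
LAYERS = ["content", "application", "transport", "ip", "link", "physical"]

def assess_regulation(effect_layer: str, target_layer: str,
                      compelling_interest: bool) -> str:
    """Toy decision rule paraphrasing the two guidelines for layer-crossing regulation."""
    distance = abs(LAYERS.index(effect_layer) - LAYERS.index(target_layer))
    if distance == 0:
        return "no layer crossing: regulation stays at the layer where the problem arises"
    if not compelling_interest:
        return "layer-crossing regulation without a compelling interest: impermissible"
    return (f"layer crossing of distance {distance}: permissible only if no alternative "
            "exists, and only in the form that minimizes this distance")

# Hate speech is a content-layer problem; blocking it at the IP layer crosses three layers.
print(assess_regulation("content", "ip", compelling_interest=True))
print(assess_regulation("content", "content", compelling_interest=False))
```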


In their article, Solum and Chung then apply this layer principle to a variety of examples. They discuss the Burmese government's policy of strictly controlling the physical layer by limiting access to communication lines, equipment and network hardware (physical layer (6)) in order to prevent government criticism (content layer (1)) as an example of a layer-crossing violation.682 China's Internet policy is also discussed in this context. The Chinese government exerts control over the physical layer to control content; for example, it has a monopoly over all Internet connections going in and out of the country, and ISPs are required to register with the government.683 China also blocks numerous foreign sites, such as the New York Times or sites dealing with human rights issues.684 ISPs are required to block access to all sites originating from a certain IP address, so this is regulation of content at the level of the IP layer (4). According to the model of Solum and Chung, this is also a layer-crossing violation, but a less serious one than trying to regulate content through controlling the physical layer, as it does not cross as many layers. These kinds of regulations, according to Solum and Chung, do have negative consequences for the Internet. For example, the IP blocking interferes with the fluidity of the network. IP blocking usually (though there are various ways that Web sites can be blocked, as will be discussed in the next chapter) means that a router (a device that, upon receiving a data packet, determines the next network point to which the packet should be forwarded on its way towards its destination) will drop the data packet. The router accepts the data packet but is programmed to drop it if it originates from a blocked IP address, even if the final destination of the information is outside China. This slows down and complicates Internet traffic.685 If every country had these kinds of measures, the Internet would be reduced to a set of interconnected intranets. By eliminating the "stupidity" of the network and by making the IP layer intelligent and discriminatory, the end-to-end principle is violated.

682 Ibid. at 878-888.
683 Ibid. at 896.
684 However, since September 2002 China has adopted a more sophisticated filtering system, packet filtering, that enables it to block Web sites only partly. For example, rather than blocking the whole BBC Web site, it might block only those pages containing criticism of the Chinese regime, while still allowing access to other sections. See: Lindsey Eastwood, "'Don't Be Evil': Google Faces the Chinese Internet Market and the Global Online Freedom Act of 2007," 9 Minn. J.L. Sci. & Tech. 2 (2008) at 296.
685 Lawrence B. Solum and Minn Chung, "The Layers Principle: Internet Architecture and the Law," 79 Notre Dame L. Rev. 815 (2004) at 908-909.
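A sketch of the router behavior just described shows why this kind of blocking is both overbroad and disruptive. The example is hypothetical: the blocked prefix and the routing table are invented, and real routers are configured through vendor-specific rules rather than Python. The logic, however - dropping everything from a listed address, payload unseen - is the layer-crossing move Solum and Chung criticize.

```python
BLOCKED_SOURCE_PREFIXES = ["198.51.100."]   # invented prefix standing in for a banned site's host

def route(packet: dict, routing_table: dict):
    """Toy router: an IP-layer device made to enforce a content-layer policy."""
    src = packet["ip_header"]["src"]
    dst = packet["ip_header"]["dst"]
    if any(src.startswith(prefix) for prefix in BLOCKED_SOURCE_PREFIXES):
        # The router never sees what the page says; it drops every packet from the address,
        # including packets that are merely transiting on their way to a third country.
        return None
    return routing_table.get(dst, "default-gateway")

table = {"203.0.113.9": "next-hop-A"}
packet = {"ip_header": {"src": "198.51.100.20", "dst": "203.0.113.9"}}
print(route(packet, table))   # None: dropped, regardless of destination or content
```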


Solum and Chung also discuss the Yahoo! order in this context, pointing out that the order of the French judge also constitutes a layer violation, because it would require blocking at the IP layer. However, the authors suggest that in this case there may have been no alternative and that a layer violation may have been justified, since France was pursuing a compelling government interest.686

The model provided by Solum and Chung is applicable to the issue of hate speech. It suggests that hate speech, which takes place at the content layer, should be addressed at the content layer level. If this cannot be done, regulation should be targeted at layers closer to the content layer. Solum and Chung's model allows for layer violations if there is no alternative available and if there is a compelling government interest in regulating. What exactly constitutes a compelling government interest, and how one can determine whether or not there is a non-layer-violating alternative available, is something that Solum and Chung fail to adequately clarify. This is somewhat troublesome because this is exactly the crux of the problem with Internet regulation; there are many local interests that may all be very compelling, but accommodating them all would be detrimental to the open structure of the Internet. The condition that layer violations are permissible only if there are no non-layer-violating solutions available is too vague. As will be discussed in the next chapter, there are numerous possible ways content can be regulated, none of which is without flaw. They may not be feasible technologically or politically, only partially effective, or easily circumvented. More specific criteria about what qualifies as an available alternative solution are needed. Solum and Chung's argument against layer violation is powerful, and layer violation is a concern that needs to be taken into consideration when regulating content online. But we also need clearer guidelines to determine under which circumstances and to what extent a government is justified in trying to further its compelling interest on the Internet, as doing so may have an effect on other countries. The next section will address this question in more depth.

686 Ibid. at 920.


Sovereignty and Jurisdiction


Imperialism or democracy?

One of the main issues of contention in the Yahoo! case was whether or not the French court erred in asserting jurisdiction over Yahoo!, an American company located in the United States with no assets in France. Did the French court act in an imperialistic way, or was it merely trying to uphold its own laws, as a sovereign nation is entitled to do? Should the French court have asserted jurisdiction over Yahoo! merely on the basis that its Web sites could be accessed in France? Did it overreach in ordering Yahoo! to remove content that would thereby also become unavailable to American users? Did it overreach in ordering Yahoo! to block access by French users? (These two questions are often conflated, while they ought to be considered separately.) Guidelines are needed that spell out in which cases, if ever, regulators from a nation-state should attempt to regulate illegal speech that originates outside their country but that can be accessed within their borders. These guidelines will be inspired by what one considers to be the prerogatives of a sovereign nation in enforcing its laws on the Internet, or, to put it differently, by what one understands the term "sovereignty" to mean.

In a 1999 Harvard Law Review note,687 the problem of nation states and their desire to have their laws respected on the Internet is discussed in the context of the "sovereignty" concept. The author tries to devise a normative sovereignty concept that could guide policy regarding Internet regulation. A distinction is made between three conceptions of state sovereignty: the realist, the representational and the postmodern. The postmodern sovereignty concept relies on the factual assumption that cyberspace is an independent, sovereign space responsible for its own regulation, an assumption whose factual basis has become highly questionable. Therefore, we will focus mainly on the realist and representational sovereignty concepts as normative anchor points for Internet regulation.

687 "Cyberspace Regulation and the Discourse of State Sovereignty, Developments; the Law of Cyberspace," 112 Harv. L. Rev. 1680 (1999) at 1680-1697.


The realist sovereignty concept

The realist conception of sovereignty is derived from international relations' realist theory.688 This theory states that nation states are the primary actors in international politics and that states' main and rational concern is the maximization of their power. According to this theory, states will try to have their laws respected at all costs, even at the expense of (actors in) other states. Sovereignty means that the state, and the state alone, is the supreme authority that has exclusive jurisdiction over its citizens and internal affairs. Any limitation of this authority is seen as a limit on this sovereignty. The author points out that states attempting to regulate the Internet often do so on the basis of realist assumptions. They see the Internet as a threat to their sovereignty and try to impose their laws on the medium to fight off that threat. States operating within the realist paradigm rely on two principles for asserting jurisdiction: the effects principle and the territoriality principle.689 The territoriality principle (applied to Internet communication) states that "a state has authority to regulate the transmittal of information across its borders and the use of that information by individuals within its territory."690 The effects principle is invoked when states impose their domestic rules and laws upon out-of-state actors based on the fact that their speech or actions, even though originating in a forum where they are legal, have effects in a place where they are not.691 The author mentions the CompuServe case (see chapter four) and China's attempts to block certain information from entering its territory as examples of this approach.692 The claims of the cyberlibertarians, who argue that cyberspace is separate from the "real world," also rely on the realist assumption of territorial control, because they think that cyberspace should be regulated from within and not by outside laws.693

688 Ibid. at 1683.
689 Ibid. at 1683.
690 Idem.
691 Ibid. at 1684.
692 Ibid. at 1683-1684.
693 Ibid. at 1685.


Reliance on realist assumptions cannot provide a normative framework of sovereignty in which one can anchor an approach towards Internet regulation. First of all, it is unlikely to be successful. In this respect, the cyberlibertarians were right in arguing that it may be hard for nations to enforce their local rules on a global medium. States can assert jurisdiction, but as long as the person (or business) over which they assert jurisdiction does not reside within their territory, does not have assets there or is not subject to extradition, any judgment rendered against him will be unenforceable (unless of course, a local court enforces the judgment). However, the effectiveness of these kinds of measures is not a consideration at this point. A more important reason to reject this approach is that it would lead to a situation in which every nation would try to impose its norms on the Internet, which if successful would lead to an Internet governed by the lowest common denominator. In a realist paradigm, the attempts of a country like France to uphold its democratically adopted laws are not distinguishable from non-democratic governments trying to limit citizens' ability to gather and spread information critical of the government. In both instances, countries are trying to unilaterally impose their laws on the Internet. A normative model of sovereignty in which to anchor guidelines for Internet regulation needs to be more finely tuned than the one provided by the realist concept of sovereignty.

The representational sovereignty concept

The second model of sovereignty the author discusses is the "representational conception,"694 which has its roots in liberal ideology. According to this view, the individual, not the state, is the most important unit of analysis in the international system. In this model, the state derives its sovereignty from the fact that it represents the general will of its people. Under this sovereignty concept, citizens in any state should have the right to have their democratically passed laws enforced, granting the state the power to ensure that their will is followed. States can regulate Internet content such as pornography, gambling or hate speech provided that a democratic consensus exists that these kinds of speech are not to be tolerated. This is the argument that Reidenberg makes in his article in which he defends the French court's decision in the Yahoo! case.695

694 Ibid. at 1686.


This theory distinguishes countries like France from repressive regimes that try to restrict their citizens' access to the Internet or that try to control, block and/or filter content because it could challenge their power or because it is contrary to state-imposed doctrine.696 Because these countries do not represent the will of their people, they cannot make claims of sovereignty. However, a state has to recognize that other democratic states also try to uphold their democratically passed laws and have the right to have their laws respected within their borders:


[T]he legitimacy of applying a state's laws to conduct that occurs in another state's territory depends on whether such laws 'would prevent [that] State from functioning as a sovereign; that is, the extent to which such generally applicable laws would impede a state government's responsibility to represent and be accountable to citizens of the State.'697

In the context of hate speech this is important. Engaging in (most types of) hate speech is a constitutionally protected activity in the United States; therefore, trying to limit this right would impede the United States' responsibility to uphold its constitutional principles. The representational concept of sovereignty suggests that nations have to be aware of the fact that other states that respect human rights and the principle of democratic representation can also make claims of sovereignty,698 and that restraint should be exercised when attempts to enforce local policies encroach upon other nations' ability to represent the will of their people. Of course this is a vague guideline.

695 Joel R. Reidenberg, "The Yahoo Case and the International Democratization of the Internet," Fordham Law & Economics Research Paper no. 11 (2001) at 4.
696 A distinction also made by legislators who drafted the Global Online Freedom Act of 2007 [H.R.275.IH], a bill that would prohibit American companies from cooperating with certain repressive regimes that restrict information about human rights and democracy.
697 "Cyberspace Regulation and the Discourse of State Sovereignty, Developments; the Law of Cyberspace," 112 Harv. L. Rev. 1680 (1999) at 1687, citing: New York v. United States, 505 U.S. 144, 177 (1992).
698 Idem.


In the Yahoo! case, for example, one could argue that the French court "exercised restraint": it addressed questions of jurisdiction and tried to come up with a solution (filtering based on location) that would not affect American Internet users. Some have argued that therefore the French judgment should have been enforced in the United States.699 Arguments that the French court did not use caution can also easily be made, as the many articles criticizing the French decision illustrate. If this concept is to be of any use, it needs to be established more clearly what is meant by "exercising restraint."

The representational sovereignty concept can provide us with some insights in this matter. First of all, as stated above, it allows for a distinction to be made between democratic and non-democratic states' attempts to regulate the Internet. If we add to this the notion that a state should be less inclined to pursue its policy goals if doing so is likely to interfere with other governments' abilities to represent the will of their citizens and act in a sovereign way, we have the beginning of a broad normative framework. According to this normative framework, states should not pursue their interests when doing so has negative consequences for actors in other states with representative governments. An important practical consequence of this normative framework in the context of Internet regulation is that it requires abandoning the effects test. Countries should not assert jurisdiction or regulatory power over out-of-state actors merely because their speech can be accessed within their borders, as this would not always respect other nations' sovereignty. If two democratic regimes have different ideas about the permissibility of Internet content, the representational model of sovereignty argues that a nation should try to have its laws respected because they represent the will of the people, without imposing "negative externalities" upon actors in other states. In other words, whereas the realist paradigm advocates a unilateral approach towards Internet regulation, this version of the representational conception proposes a multilateral approach in which various interests are balanced against each other. This is still a general principle that needs to be translated into more specific guidelines. In order to do so, it is illuminating to look at how jurisdictional issues have been dealt with in the United States.

699 See for example: Gregory S. Cooper, "A Tangled Web We Weave: Enforcing International Speech Restrictions in an Online World," 8 PGH. J. Tech. L. & Pol'y 2 (2007) at 21-21.

Assessing jurisdiction online: Lessons from the American approach

The federal structure of the United States has forced the legal system to deal extensively with Internet jurisdiction issues. The American experience provides some insights that can also be useful in the international context. Although jurisdictional issues arising in a federal nation such as the United States are substantially different from those arising in an international context, some of the rationales applied by the courts with regard to Internet jurisdiction are instructive in an international context as well.


The "traditional approach"

Traditionally, jurisdiction over non-residents of a state within the United States has been based on International Shoe v. Washington.700 Absent a traditional basis for jurisdiction (such as residence), courts will assert jurisdiction on the basis of whether a state's long-arm statute extends to the non-resident defendant in light of the specific facts. Courts must determine whether or not exercising personal jurisdiction in a particular case comports with the due process guaranteed by the Constitution.701 This requires examining whether or not the defendant has had minimum contacts with the forum state, i.e., whether he has "purposefully availed" himself of the laws of the forum state through his activities. Lastly, courts need to determine whether or not asserting jurisdiction comports with "traditional notions of fair play and substantial justice." In order to establish whether or not the minimum contacts requirement has been met, a three-pronged test was developed following International Shoe:702 (1) there must be an act by which the defendant purposefully avails himself of the laws of the forum state (purposeful availment includes the conduct of the defendant and whether he intends this conduct to have an effect in the forum state); (2) the claim must arise as a result of the defendant's activities; and (3) the exercise of personal jurisdiction must be reasonable.703

700 326 U.S. 310 (1945).
701 Dennis T. Yokoyama, "You can't always use the Zippo Code: The Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54 DePaul L. Rev. 1147 (2005) at 1152.
702 Titi Nguyen, "A Survey of Personal Jurisdiction Based on Internet Activity: A Return to Tradition," 19 Berkeley Tech. L.J. 519 (2004) at 521.
703 Ibid. at 522.


With the rise of the Internet, courts were faced with the question of whether maintaining a Web site constitutes purposeful availment. Initially, the answer of American courts to this question was that it did. In Inset Systems, Inc. v. Instruction Set,704 a trademark infringement case, the Federal District Court in Connecticut held that a Massachusetts-based company using a Web site to advertise its products had manifested purposeful availment in all jurisdictions where Internet access is available. Other courts followed this rationale,705 stating that having a Web site constitutes purposeful availment, implying that everyone who operates a Web site could be hauled into any courtroom in the country. Web advertising was seen as advertising to the whole nation, and by trying to reap sales nationwide (even if the business is clearly local), Web site operators should assume the risk of being sued outside their home states, the argument went.706 The court likened advertising over the Internet to a continuous advertisement, and ruled that Inset had purposefully directed its activities towards Connecticut and should therefore have anticipated being hauled into court there.707 The court provided no in-depth analysis of the Internet as a medium. Instead, it made an analogy with traditional media forms and applied existing law. In doing so, it proceeded much in the same way as the French court in the Yahoo! case had. It considered only the effects of the Internet communication, and asserted jurisdiction based on the Internet's reach, without considering the actual intent of the content provider.

704 937 F. Supp. 161 (D. Conn. 1996).
705 Dennis T. Yokoyama, "You can't always use the Zippo Code: The Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54 DePaul L. Rev. 1147 (2005) at 1157.
706 Idem.
707 937 F. Supp. 161 at 165.


In the wake of this decision, many courts followed this approach,708 though some deviated.709


The Zippo test

The Inset rationale clearly cast too wide a net by establishing that everyone operating a Web site purposefully avails himself of the laws of every jurisdiction where the Web site can be accessed. The next line of cases would present a more nuanced test by considering the kind of Web site that is being operated. This new test was developed in Zippo Manufacturing Co. v. Zippo Dot Com, Inc.,710 an infringement case in which the Pennsylvania-based manufacturer of the famous lighters sued a California-based company for various trademark infringements.711 Zippo Dot Com operated a Web site advertising and offering paid and free access to its Internet news service. To become a paid subscriber, prospective members had to submit names and addresses via an online form and pay for their memberships via credit cards, either over the phone or online. They were then sent a password to access Internet news group messages. At the time of the lawsuit, about 3,000 Pennsylvanians had subscribed to the service. The court had to decide whether personal jurisdiction arose out of the contacts Zippo Dot Com had established with Pennsylvania through its Web site. Rather than engaging in the analysis established by Inset, the court decided that the likelihood that personal jurisdiction can be exercised is "directly proportionate to the nature and quality of commercial activity that an entity conducts over the Internet."712

708 Dennis T. Yokoyama, "You can't always use the Zippo Code: The Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54 DePaul L. Rev. 1147 (2005) at 1157.
709 For example, Bensusan Restaurant Corporation v. King, 937 F. Supp. 295 (S.D.N.Y. 1996), aff'd 126 F.3d 25 (2d Cir. 1997), a case in which a New York City club called "The Blue Note," which owned the federal trademark in that name, brought a trademark infringement and dilution action against a club in Missouri with the same name. The Missouri club had a Web site on which one could find general information, a calendar of events and ticket information. However, tickets could not be bought online and also were not sent through the mail. The court had to rule whether or not the presence of this Web site constituted purposeful availment and justified hauling the Missouri club owner into a New York courtroom. Given the fact that the Web site was merely passive, the court ruled that it did not cause any infringing activity in New York.
710 952 F. Supp. 1119 (W.D. Pa. 1997).
711 Ibid. at 1124.


In what came to be known as the Zippo test, courts weighed the relative interactivity of a Web site to determine whether the assertion of jurisdiction is appropriate. At one end of the spectrum are the active Web sites, where "a defendant clearly does business over the Internet. If the defendant enters into contracts with residents of a foreign jurisdiction that involve the knowing and repeated transmission of computer files over the Internet, personal jurisdiction is proper."713 At the other end of the spectrum are passive Web sites, where someone merely posts information that is accessible to users in other jurisdictions, which is not a sufficient basis for asserting jurisdiction.714 The middle of the spectrum is occupied by Web sites where a user can exchange information with a host computer, in which case the level of interactivity and the commercial nature of the activity need to be taken into consideration when making decisions about exercising jurisdiction.715 In the Zippo case, the court ruled that, given the nature of the contacts Zippo Dot Com had forged through its Web site with subscribers from Pennsylvania and ISPs based in Pennsylvania, jurisdiction was proper.716 In the years following, the Zippo test was used in numerous cases.717

The Zippo test was clearly a step forward from the analysis presented by the Inset court, as it looked to find a balance between a lawless Internet and an over-regulated one.718 Yet it has come under increasing criticism in recent years. While the Zippo test provided a more fine-tuned tool than the test in the Inset case (which was not really a "test"), it still suffers from some shortcomings. It does not provide a clear standard that allows a business to gauge the risk of out-of-state lawsuits to which it exposes itself by taking its business online.

712 Idem.
713 Idem.
714 Idem.
715 Idem.
716 Ibid. at 1126-1127.
717 See: Michael A. Geist, "Is there a there there? Toward Greater Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345 (2001) at footnote 114.
718 Ibid. at 1370.


Most Web sites are neither completely active nor completely passive, and fall in the "middle zone." In that case, the court's analysis of the specific facts of the case will determine whether the online activity constitutes purposeful availment.719 It remains unclear how much interactivity or commercialism is required to assert personal jurisdiction (is a site that does not offer anything for sale but makes money from banner advertising a site that "does business"?).720 The Zippo test also seems to be inapplicable to libel cases, as it applies mainly to commercial and interactive sites. It seems to suggest that jurisdiction claims in cases where defamatory statements are made on a passive non-commercial Web site with the knowledge and purpose to harm the plaintiff in the forum state are to be dismissed.721 Nevertheless, the Zippo test has been applied in libel cases.722 It is also not always easy to distinguish an active Web site from a passive one. A Web site may seem passive, but use cookies or data-collecting technologies.723 Standards for what is considered active and what is passive may shift constantly as technology changes.724 These weaknesses of the Zippo test have caused some scholars to propose abandoning the test altogether, and applying classical standards to determine purposeful availment instead of designing a test specifically for the Internet.

719 Ibid. at 1377-1379.
720 Titi Nguyen, "A Survey of Personal Jurisdiction Based on Internet Activity: A Return to Tradition," 19 Berkeley Tech. L.J. 519 (2004) at 529-530.
721 Ibid. at 538.
722 Dennis T. Yokoyama, "You can't always use the Zippo Code: The Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54 DePaul L. Rev. 1147 (2005) at 1176.
723 Michael A. Geist, "Is there a there there? Toward Greater Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345 (2001) at 1379.
724 Ibid. at 1379-1380.

Beyond the Zippo test

In recent years, some courts have moved away from Zippo's active-passive test and have started to adopt the effects doctrine725 established in the Supreme Court decision Calder v. Jones.726

725 Ibid. at 1371.
726 465 U.S. 783 (1984) at 789.


doctrine” is somewhat misleading in this context, as it is not the same as the blanket effects rationale applied in the Yahoo! case or the Inset case, where jurisdiction was asserted on the basis that Web sites could be accessed in a certain territory. It has come to be interpreted more as a targeted test, in which the intention of the sender to “target” a specific jurisdiction is taken into consideration. The Calder doctrine holds that jurisdiction over a defendant is proper when “the defendant's actions are expressly aimed at, and the brunt of the injury is felt in, the forum state.”727 Calder v. Jones and its companion, Keeton v. Hustler Magazine,728 both dealt with jurisdiction issues in libel cases, and in both cases defendants were hauled into out-of-state courts based on the fact that their publications had substantial circulations in those jurisdictions.729 This seems to imply that Internet publishers could also be hauled in every courtroom in the country, as they can be considered to have a country-wide publication. But this is not how some courts have applied Calder to jurisdiction issues in online libel cases. In Griffis v. Luban,730 a target-based test was applied in an online defamation case. The case resulted from an argument between two Egyptologists on an Internet news group in which Marianne Luban, a Minnesota resident, questioned the credentials of Katherine Griffis, an Alabama resident. Among other things, Luban had stated that Griffis had received her degree from “a box of crackerjacks.”731 Griffis sued for libel in an Alabama court and was awarded $25,000 in a default judgment.732 Luban fought the enforcement of the judgment in Minnesota. The state trial court and the appellate court ruled that Luban had had minimum contact with Alabama,733 but the Minnesota Supreme Court reversed, holding that the judgment was not enforceable

727

Titi Nguyen, "A Survey of Personal Jurisdiction Based on Internet Activity: A Return to Tradition," 19 Berkeley Tech. L.J. 519 (2004) at 351. 728 465 U.S. 770 (1984). 729 Patrick J. Borchers, "Personal Jurisdiction in the Internet Age: Internet Libel: The Consequences of a Non-Rule Approach to Personal Jurisdiction," 98 Nw. U.L. Rev. 473 (2004) at 478. 730 646 N.W.2d 527 (Minn. 2002). 731 Ibid. at 530. 732 Ibid. at 529. 733 Ibid. at 531.


It argued that the message posted in the news group did not specifically target Alabama, as the forum was nationwide in scope. Even though the defamatory statements could be read in Alabama, this did not demonstrate that Alabama was the focal point of Luban's tortious conduct, the court argued. It rejected the view that Calder supports a broad effects-based test in which jurisdiction is supported merely because the effects of a tort committed in another jurisdiction can be felt in a given forum.734 The standard expressed by the Minnesota Supreme Court supports jurisdiction in libel cases only if the statements are expressly aimed at the forum state:

While the record supports the conclusion that Luban's statements were intentionally directed at Griffis, whom she knew to be an Alabama resident, we conclude that the evidence does not demonstrate that Luban's statements were 'expressly aimed' at the state of Alabama. The parties agree that Luban published the allegedly defamatory statements on an internet newsgroup accessible to the public, but nothing in the record indicates that the statements were targeted at the state of Alabama or at an Alabama audience beyond Griffis herself.735

Young v. New Haven Advocate736 presents another example in which a narrow target test is applied in an online defamation case. In this case, a Virginia prison warden sued two Connecticut newspapers, which had very limited or nonexistent circulations in Virginia, for libel in the federal District Court for the Western District of Virginia.737 The newspapers had printed articles and columns describing poor conditions in the prison where Young was the warden, and allegedly implied that he held racist beliefs. Although the papers had very limited circulations in Virginia, they did maintain online versions of their papers that could be accessed in Virginia.

734 Ibid. at 533.
735 Ibid. at 535.
736 315 F.3d 256 (4th Cir. 2002).
737 Ibid. at 259.


The Fourth Circuit Court of Appeals applied Calder and held that the papers had not had minimum contacts with Virginia, since most of the content of the papers was directed at a Connecticut readership and most of the advertising in the papers was also clearly aimed at Connecticut residents.738 The court here interpreted the Calder test as a targeted test: "We thus ask whether the newspapers manifested an intent to direct their website content - which included certain articles discussing conditions in a Virginia prison - to a Virginia audience."739 Conducting this analysis, the court found that, as a whole, the papers' sites would not be of interest to residents of Virginia.

In Winfield Collection, Ltd. v. McCauley,740 a copyright infringement case, the United States District Court for the Eastern District of Michigan mounted a pointed criticism of the Zippo test and suggested that in the absence of a specific test to establish what constitutes "minimum contacts" on the Internet, traditional legal principles can be applied:

However, the distinction drawn by the Zippo court between actively managed, telephone-like use of the Internet and less active but 'interactive' web sites is not entirely clear to this court. Further, the proper means to measure the site's 'level of interactivity' as a guide to personal jurisdiction remains unexplained. Finally, this court observes that the need for a special Internet-focused test for 'minimum contacts' has yet to be established. It seems to this court that the ultimate question can still as readily be answered by determining whether the defendant did, or did not, have sufficient 'minimum contacts' in the forum state. The manner of establishing or maintaining those contacts, and the technological mechanisms used in so doing, are mere accessories to the central inquiry.741

In this case, the court ruled that selling items over eBay does not automatically mean that one purposefully avails himself of the governing laws of the buyer's jurisdiction.

738 Ibid. at 259-260.
739 Ibid. at 263.
740 105 F. Supp. 2d 746 (E.D. Mich. 2000) at 750.
741 Idem.


The seller cannot be expected to have advance knowledge of where the items will be sold. The court was not prepared to hold that "the mere act of maintaining a Web site that includes interactive features ipso facto establishes personal jurisdiction over the sponsor of that Web site anywhere in the United States."742 Whether the cases discussed above are based on a misreading of the Calder test (which in its original formulation was an effects-based test) or represent a logical return to first principles after the contrived Zippo test is not a question that needs to be addressed here, but the shift has inspired some scholars to propose a similar target test in an international context.

742 Ibid. at 751.


A target-based test for Internet transactions

Observing this shift away from Zippo, Geist proposes a new test based on a target-based analysis for asserting jurisdiction in Internet transactions.743 The test proposed by Geist would have courts take three factors into consideration when determining whether jurisdiction is proper in Internet cases. The first factor considers whether or not either party utilized a contractual provision to specify which law should govern their transactions.744 The second factor would consider whether or not the Web page sponsor used technology on the Web site to either target or avoid jurisdiction.745 Geist focuses here mainly on whether or not the sender of the information used technology to target a certain geographical area. The third factor assesses whether or not a party has or ought to have knowledge about the geographic location of the online activity.746 By this, Geist means that when a Web site has no technology in place to target a specific region, or when there is no contractual arrangement between parties, a Web site sponsor may still have sufficient knowledge that his Web site targets a specific jurisdiction to establish jurisdiction. Geist gives the example of a gambling Web site whose owners claimed they did not know that they were taking bets from New York residents (which would be illegal), as the site required people to give their residence before accepting bets and would not let New Yorkers place bets. However, the court747 ruled that the casino operators knew that most people could easily circumvent this by entering a fake out-of-state address.748

743 Michael A. Geist, "Is there a there there? Toward Greater Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345 (2001) at 1386.
744 Ibid. at 1386-1392.
745 Ibid. at 1393-1402.
746 Ibid. at 1402-1404.


Asserting jurisdiction

Henn749 builds upon Geist's proposal to formulate a target-based test for determining whether minimum contacts have taken place on the Internet in an international context. She points out that Geist's proposal is applicable only to active commercial Web sites, whereas in many instances, hate speech or other controversial speech is conveyed through "passive" Web sites.750 Henn suggests another three-pronged test to determine if a Web site's content provider purposefully avails himself of the laws of a certain jurisdiction. A primary test that could be applied to a non-interactive Web site, according to Henn, is to consider whether or not the site uses a foreign language. For example, neo-Nazi sites based in the United States but written in German could be considered to be targeting Germany. Secondly, Henn argues that a content provider has targeted a foreign jurisdiction if the information that is available on that Web site directs the viewer to local information.751 For example, if a Web site provides links to Web sites that are clearly local, or if users are directed to physical locations in that forum, this requirement would be met. It is surprising that Henn does not include in her proposal the requirement that the information is clearly of a local character and that it deals with local topics. She proposes a slightly stricter criterion: that the information contained on the site "directs the viewer to local information."752

747 People v. World Interactive Gaming, 714 N.Y.S.2d 844 (Sup. Ct. 1999).
748 Michael A. Geist, "Is there a there there? Toward Greater Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345 (2001) at 1392.
749 Julie L. Henn, "Targeting Transnational Internet Content Regulation," 21 B.U. Int'l L.J. 157 (2003).
750 Ibid. at 174-175.
751 Ibid. at 175.
752 Idem.


Finally, Henn also suggests that a Web site that uses software to target its advertising to users in a specific jurisdiction also avails itself of the laws of that forum: "If the content provider has tools that are sophisticated enough to target advertising, then they should also have the ability to monitor what country's citizens are accessing their Web site and, thus, have reason to know the laws to which they could potentially be subjected."753 However, in many cases a Web site owner may simply sell advertising space to an Internet advertising firm. This firm might have the software to deliver advertising content in the language of the Web site visitor, but this does not necessarily mean that the site targets these visitors.

So what would be the practical application of this proposal? The proposal is designed to single out Web sites hosted on American servers that clearly target other jurisdictions. Henn's test is designed to assess the intent behind a Web site, as an alternative to basing jurisdiction on an effects doctrine. A German-language neo-Nazi site discussing German policy on immigrants, for example, or a German-language message board discussing similar topics would, according to this proposal, avail itself of the laws of Germany, even if it is hosted in the United States. The test proposed by Henn would bar courts from asserting jurisdiction over content that is illegal in their country but has not targeted it. Only content providers who explicitly target certain jurisdictions would open themselves up to prosecution in those jurisdictions. This proposal is of course rather vague and not without problems. Henn's test lacks clarity, would need to be accepted on an international level through a treaty, and could still raise First Amendment concerns if applied to hate speech. (Does one forego one's First Amendment rights if one addresses non-Americans over the Internet?) The effectiveness and feasibility of having such a test for determining jurisdiction will be discussed in greater depth in the next section. As a normative concept, however, the target approach can fine-tune the representational model of sovereignty in its rejection of the effects-based test. The test proposed by Henn contains some elements that can help to build a normative framework for Internet regulation across borders.

753 Idem.
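For illustration only, Henn's three factors could be imagined as inputs to a rough heuristic like the one below. The field names, the site metadata and the way the prongs are combined are all invented for the sketch; an actual targeting determination is a legal judgment about intent, not a computation.

```python
def appears_to_target(site: dict, forum: dict) -> bool:
    """Rough, invented heuristic loosely following Henn's three factors."""
    # Factor 1: the site is written in the forum's language (e.g., a U.S.-hosted site in German).
    language_match = site.get("language") in forum["languages"]
    # Factor 2: the site directs viewers to local information in the forum (links, addresses, events).
    local_pointers = any(link.get("country") == forum["country"]
                         for link in site.get("outbound_links", []))
    # Factor 3: advertising on the site is geo-targeted at the forum's users.
    targeted_ads = forum["country"] in site.get("ad_targeting", [])
    # How the prongs combine is itself a judgment call; here, purely for illustration,
    # any two of the three are treated as sufficient.
    return sum([language_match, local_pointers, targeted_ads]) >= 2

germany = {"country": "DE", "languages": ["de"]}
us_hosted_site = {"language": "de",
                  "outbound_links": [{"country": "DE"}],
                  "ad_targeting": []}
print(appears_to_target(us_hosted_site, germany))   # True: German-language site pointing to German venues
```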


Based on Henn's test and the representational concept of sovereignty, a guideline can be developed that states that any kind of Internet regulation in which an actor, whether a government actor or a private actor, tries to regulate speech originating across borders in a way that puts a burden upon actors outside its borders should be based on this targeted approach. For example, European regulators wanting to bar their citizens from accessing an American site on which the Holocaust is denied but that does not specifically target other nations through content or advertising should do so in a way that does not burden out-of-state actors. This does not mean that states have no power to restrict their citizens' access to proscribed speech, but it means that they should do so without burdening out-of-state actors, such as foreign ISPs or content providers. "Burdening" could include unilaterally imposing any kind of restrictions or pressure on out-of-state actors that, if successful, would limit their ability to exercise their free speech rights. Or it could mean restricting access to this kind of speech for people living in jurisdictions where it is protected. The Supreme Court decision in Reno made clear that under the First Amendment, government cannot restrict access to speech for those who have a constitutional right to this speech in order to restrict another group's access.

The French Yahoo! order obviously failed this test, but this would not mean that a French anti-racism organization could not attempt to enter into a debate with content providers of hate speech in the United States in order to convince them to remove certain content, as this approach is based on dialogue and cooperation rather than unilateral enforcement. However, this approach would avoid attempting to drag actors into a foreign courtroom for engaging in speech protected by their laws. Trying to convince ISPs in the United States to remove content or to change their terms of service to ban certain materials would be permissible. As discussed above, the representational concept is against unilateralism, but supports dialogue and mutual agreement. Therefore, this kind of informed self-regulation is not at odds with this sovereignty concept. However, this may raise concerns that private groups would become de facto censors of the Internet, a concern that will be addressed later in this chapter.

Loci of content control on the Internet

The still somewhat vague and general requirement articulated above can be further clarified by defining the concept of out-of-state actors as it relates to the Internet. Who are the players in Internet communication, and what is their role in the communication process?


Where and what are the different loci of control that exist on the Internet? Zittrain754 identifies four loci of control: the source, the source ISP, the destination and the destination ISP. For regulators, it is important to distinguish between these actors as "loci of control."


The source755 The easiest way to control content is at the level of the source instigating the transfer of information, the person who uploads the information to the Internet. The sender of information can restrict access to the material by requiring passwords, remove the material or can choose to do nothing at all.756 This is the level at which control of content can be exercised most effectively. The most effective way to prevent French Internet users from exposure to Nazi materials was to have these items removed from Yahoo!’s servers. However, most people, or at least most purveyors of hate speech, use the Internet mainly because they can have a potential global audience at a low cost. They have little incentive to limit their potential audience. Source ISP757 ISPs serve as a link between a client and the Internet, allowing an individual to connect to the Internet; as such, ISPs pass along packets of information to and from an individual’s computer. In addition, ISPs also sometimes host content placed on their servers by subscribers. (Zittrain calls these ISP hosting content OSPs or Online Service Providers, a distinction we will not make here. However, in our discussion we will deal mainly with OSPs as they are responsible for hosting content and have the ability to remove it.) ISPs can remove content from their servers if they choose to do so; for example, if the hosted content violates acceptable use policies or because they are ordered to do so by the authorities. When Yahoo! decided to remove certain Web pages from its Geocities Web hosting service following the French court order, it exercised this power. The legal responsibilities of ISPs as hosts of content will be further discussed in the final chapter.

754 Jonathan Zittrain, "Internet Points of Control," 44 B.C. L. Rev. 653 (2003).
755 Ibid. at 659.
756 Idem.
757 Ibid. at 664.


It is important to make this distinction between source and source ISP. A source does not need to be located in the same place as the source ISP. One can maintain a Web site hosted on an American server without being in the United States oneself. One can upload content anonymously to an American server from Germany. While this American server cannot be regulated by German authorities, the provider of the content can be subject to German law, provided that German authorities know the identity of the content provider. This will not always be the case, though, as sometimes the ISPs do not even know who the provider of the content is. Gary Lauck, for example, a Chicago resident who hosts about eighty foreign-language Web sites, often does not even know who the European individuals whose content he hosts are, as contact occurs through anonymous e-mails and payment arrives in an envelope of bank notes.758

Destination759
Content control can also occur at the destination, the recipient of the information, at the moment prior to an Internet user's exposure to this content. This requires action at the level of the Internet user's computer, through installing software or changing browser settings. For example, libraries may install filtering software that blocks pornographic content. Classrooms can have browsers configured so that only a limited set of pre-approved content can be accessed. However, this kind of content regulation can be successfully achieved only if the owner of the computer agrees to take the necessary steps.

Destination ISP760
Unlike a source ISP, a destination ISP does not have the benefit of a relationship with the content provider, and it cannot remove his content from its server or cancel his account. As Zittrain describes, destination ISPs are merely "off ramps" for data solicited by the destination ISPs' customers. An ISP can be, and usually is, both a source and a destination ISP, depending on the specific data transfer.761

758 Russell Working, "Illegal Abroad, Hate Web Sites Thrive Here," Chicago Tribune, November 13, 2007, p. 1.
759 Ibid. at 669.
760 Ibid. at 672.
761 ISPs can also be neither, if they merely transfer and reroute packets.


When performing the function of a destination ISP, a provider cannot remove material or make it unavailable to the whole Internet, but it can make material unavailable to its own subscribers if it wants to do so. However, this is not always simple. The technological and legal issues related to this approach will be discussed in the next chapter. In the context of Internet regulation, it is important to note that governments and regulators often do not have any power over source ISPs that are not located in their jurisdictions, whereas destination ISPs usually are located in the same jurisdiction as their subscribers and are therefore easier to regulate.
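The destination and destination-ISP loci both come down to checking a requested address against a list of proscribed sites before the content reaches the user. The sketch below is illustrative only and is not drawn from any actual filtering product; the blocklist entries, hostnames, and function names are hypothetical. The same check could run in filtering software on the user's own computer or in a filtering proxy operated by the destination ISP.

```python
from urllib.parse import urlparse

# Hypothetical blocklist of hostnames flagged as illegal under local law.
# A real filter would load and update such a list from an external source.
BLOCKED_HOSTS = {
    "example-hate-site.test",
    "another-banned-host.test",
}

def is_blocked(url: str) -> bool:
    """Return True if the URL's hostname, or any parent domain, is on the blocklist."""
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    # Check "forum.example-hate-site.test", "example-hate-site.test", "test", ...
    candidates = {".".join(parts[i:]) for i in range(len(parts))}
    return bool(candidates & BLOCKED_HOSTS)

def fetch(url: str) -> str:
    """Stand-in for the browser, filtering proxy, or ISP gateway retrieving a page."""
    if is_blocked(url):
        return "This page has been blocked by your filter settings."
    return f"(contents of {url})"

if __name__ == "__main__":
    print(fetch("http://forum.example-hate-site.test/thread/1"))  # blocked
    print(fetch("http://www.example.org/news"))                   # served
```

As the text notes, whether such a check is effective depends on who controls the machine on which it runs: a user can uninstall destination-side software, while a destination ISP can only affect its own subscribers.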


Summary
This section began with an attempt to establish a normative framework for evaluating Internet regulation based on a normative sovereignty concept. The representational concept of sovereignty provided the most appropriate model, as it did not rely on the notion that the Internet is its own sovereign domain. Nor did it assume that nations should try to adopt an effects-based approach, as the realist sovereignty model suggests. However, avoiding these two extremes did not provide any specific guidelines. By combining Henn's target test with Zittrain's loci of control for the Internet, more specific normative guidelines can be developed. The normative model we will adopt here demands that, when trying to regulate hate speech, regulators do not target out-of-state content providers or out-of-state source ISPs that do not specifically target the jurisdiction of the regulator. Rather, solutions should be sought at the level of the destination ISP, at the destination of the content, or at the source of the content if it is located within the jurisdiction of the regulator.762 However, when speech specifically targets a forum, and Henn's test provides guidance in determining this, attempts to regulate content at the level of the content provider (source) and source ISP can be made, even if they are located abroad. Even though Henn's test is not without problems, in many instances the intent of the content provider to target a specific jurisdiction will be obvious. The fact that regulatory attempts that fulfill this criterion may be unsuccessful does not matter at this point. This criterion only tries to outline under what circumstances what kinds of regulatory attempts are appropriate; their effectiveness will be discussed as a separate criterion below.

762 The source, or content provider, can be located in a different forum than the source ISP, for example if a German citizen uploads content to an American server from his home computer.

Effectiveness
A last requirement is that a regulation be effective. "Effectiveness" does not only mean that the regulation works, but also that the regulation is not overly broad, that it is feasible, and that the content the regulation tries to affect is in fact illegal.


Efficaciousness
Any kind of measure should fulfill the regulatory goal that is set. In the Yahoo! case, for example, it is not clear how successful the order of the French judge ultimately was in enforcing French law. One could argue that its effect has been minimal, since it has not been enforced in the United States, but it may have been an effective strategy for trying to change the behavior of foreign ISPs. In order to assess the effect of a regulatory measure, one needs to assess its regulatory goals. A measure does not always need to be 100% efficacious to fulfill a regulatory goal. For example, the 70% accuracy with which French users could be identified and blocked may seem low, but it may be an acceptable number for the policy this order was supposed to serve. Solutions do not have to be perfect in order to be effective. When evaluating Internet regulation, efficacy needs to be considered.

Against overinclusiveness
However, regulations should not be too efficacious. They should not be "overinclusive" and affect more speech than the speech that is targeted. In the example of hate speech, regulations should only affect speech that is illegal under the hate speech laws of the regulators' country. In addition, no more people should be affected by the regulation than necessary. For example, regulations should not have the effect of barring American residents from accessing certain types of speech merely because it is illegal in another part of the world (as long as the speech does not specifically target that jurisdiction). By demanding that regulations do not unilaterally affect source ISPs or content providers, the risk that regulations affect more people than strictly necessary is reduced.

Accountability
Related to this concept of overinclusiveness and to the representational concept of sovereignty is the demand that regulations reflect the will of the people. Therefore, there should be safeguards in place to ensure that content that is regulated (banned, removed, blocked, etc.) because it is in violation of certain laws is, in fact, in violation of those laws. The provider of the content should be able to appeal the measure, and the institution responsible for flagging that content should be transparent and accountable to the general public, in order to prevent a private organization from becoming a de facto censor of the Internet.


Feasibility
A last component of effectiveness is the practicality or feasibility of a measure. How easy or complicated it is to implement a certain regulation or measure will also determine its success. For example, having all providers of hate speech voluntarily label their speech as hate speech so that it could be filtered more easily might be an effective measure, but it is not likely that this would happen. For any kind of measure, practicality needs to be a consideration. Although it is hard to establish fixed evaluative criteria for practicality or feasibility, a measure will usually be more feasible if its success depends on the efforts of a few rather than many, and if the cost and time it requires are relatively low.

Conclusion
In this chapter, a set of criteria was developed to evaluate European regulatory approaches towards online hate speech in general and hate speech originating in the United States in particular. This set is based on three general principles: (1) Internet regulation should respect the open, layered structure of the Internet; (2) it should be based on a representational concept of sovereignty; (3) it should be effective. The first principle led us to adopt Solum and Chung's guideline that Internet regulations should cross layers only if no other solution is possible, and even then the distance between the layer at which the regulation aims to produce an effect and the layer targeted by that regulation should be minimized. The representational concept of sovereignty adopted here demands that regulation should not target out-of-state content providers or out-of-state source ISPs when trying to regulate hate speech that does not target their jurisdiction, but that solutions should be sought at the level of the resident destination ISP, resident content providers, or the destination of the content. However, when the speech involved specifically targets a forum, attempts to regulate content at the level of the content provider and source ISP may be appropriate, though these attempts may not be successful. The normative framework also demands that regulatory measures be efficacious without being overinclusive. This means that regulations should fulfill their regulatory goal without targeting more speech than needed or making it unavailable to more people than necessary. Effective regulation of hate speech also demands that the people or body responsible for determining what speech to regulate be accountable to the public, and that their decisions can be appealed. Lastly, effective regulation demands that the proposed methods be feasible. In this model, the fact that effectiveness is separated from the demands of the representational sovereignty concept is important. It may be that in most cases attempts to regulate out-of-state actors will also not be efficacious or feasible, but that is not necessarily always the case. Even if out-of-state actors could be regulated effectively (because they have assets in other jurisdictions or because changes in the technological and legal landscape make it easier), the second criterion demands that this be done only in specific cases. The guidelines developed here are not limited to the issue of hate speech, but also apply to other kinds of speech about whose legality there is no consensus among democratic nations.


CHAPTER 6

Evaluating Regulatory Options: Where to Go from Here?

Using the normative framework developed in chapter five, this chapter will discuss some of the proposed ways European countries could regulate hate speech originating in the United States. It is not to be expected that a single solution will satisfy all the criteria spelled out in the previous chapter, and it is not the ambition of this work to propose or develop one. However, by discussing the pros and cons of a number of possibilities, some guidelines can be developed for European lawmakers wishing to execute the requirements of the Additional Protocol to the Convention on Cybercrime.


Agreement on Common Internet Laws
Some observers have proposed that European countries and the United States (and other countries as well) should agree on a common set of laws and norms applying to Internet content. Sayle argues that nations should agree on a set of common laws and provisions that would govern the Internet, or "Net Nation."763 She claims that nations should adopt a double standard for Internet content and offline content:

For instance, child pornography could still be lawful in Japan, should its lawmakers decide so, but child pornography on the Internet could be illegal, even if it originates in Japan, since it also, correspondingly, originates on the Net Nation and Net Nation law could prohibit such content. In this way the purpose of the law, to eradicate child pornography on the Net, would be served. Thus, every Net citizen would be held to certain minimum standards of conduct.764

She argues that whenever there is uncertainty about which local laws should govern activity on the Internet, Internet-specific law should apply.

763 Amber Jane Sayle, "Net Nation and the Digital Revolution: Regulation of Offensive Material," 18 Wis. Int'l L.J. 257 (2000).
764 Ibid. at 282.


She calls for "an international agreement which would allow a guiding, unified, Net law to take precedence in disputes arising from on-line content between citizens from different nations or, in criminal proceedings, between one nation and citizen(s) from a different nation."765 Sayle's argument is based on the notion that since local laws cannot be enforced on the Internet, an agreed-upon set of laws specific to the Internet is the only solution. Even though she does not totally accept the normative argument of cyberlibertarians that the Internet ought not to be governed by laws from the "real" world, she suggests that trying to apply existing law to the Internet would be impractical.766 Whereas she agrees with the cyberlibertarians that the Internet should have a distinct set of laws that are different from those of the real world, she disagrees that these norms should emerge from within the Internet. Instead, she argues that governments would have to come to an agreement on these common laws and impose and enforce them. If there were an international agreement about how, for example, hate speech online ought to be regulated, and if this agreed-upon law were violated, steps could be taken by law enforcement in the country where the content was uploaded to have it removed and/or hold the providers of the content accountable. This would not be layer violating and would also not raise any sovereignty issues, as it would be based on mutual agreement. However, this solution fails the feasibility criterion. Sayle's solution raises just as many problems as it claims to resolve. As argued before, nations have shown determination to have their laws enforced on the Internet. This proposal would not really solve the problem caused by having different legal standards across borders, because it is unclear how and on what basis these specific Internet laws should be drafted. The same differences between the United States, Europe, and other countries would arise when they had to draft an "Internet law." Whose standards should apply? The Additional Protocol to the Convention on Cybercrime has shown that the Internet does not function as a homogenizing force between countries that have fundamentally different legal approaches towards regulating content.

765 Idem.
766 Ibid. at 285.


So even though this proposal would not be a layer violation767 and also respects the representational sovereignty concept, it is neither feasible nor efficacious.

Przybylski has a more modest proposal and advocates the establishment of an international organization to regulate Internet content. Rather than creating a uniform law, it would function as a forum in which countries could meet with big international ISPs and big Internet players such as Amazon and eBay.768 In this proposal, countries and internationally operating Internet businesses would agree on how to accommodate requests for content regulation. However, as Przybylski acknowledges, smaller servers in the United States that exist for the purpose of hosting racist content would not participate in such an agreement.769 But even then, he argues, getting the big players on board would still do much to purge hate from the Internet. He also rejects the notion that such a self-regulatory body would essentially be nothing more than censorship:

In many countries, displaying certain content is simply illegal, and the international nature of the Internet undermines the enforcement of these laws. Against this background, an international organization arguably provides a useful method to allow different countries to implement their preferred modes of regulation. The solution advocated in this note tackles this reality, rather than taking sides in the debate on free speech.770

This solution depends on self-regulation on the part of ISPs and not on a mandatory common Internet law, as Sayle suggested.

767 If we assume for the sake of argument that all nations would subscribe to this "Internet law," the nation where the information is uploaded or where the hosting provider is located could require the content to be removed from the Internet.
768 Paul Przybylski, "A Common Tool for Individual Solutions: Why Countries Should Establish an International Organization to Regulate Internet Content," 9 Vand. J. Ent. & Tech. L. 927 (2007) at 942-943.
769 Ibid. at 952-953.
770 Ibid. at 955.


But in fact, this is not really a solution; Przybylski proposes a sort of "ideal speech situation" in which the big players could come to an agreement. As such, it is hard to evaluate this "solution," as it could lead to different outcomes. But Przybylski seems to suggest that such an agreement would lead to American ISPs voluntarily adopting the standards of more speech-restrictive countries (the author seems to have European democracies in mind, but it is unclear whether he also expects authoritarian regimes to be invited to this agreement) and making certain content unavailable.771 If this is the case, the agreement would in fact have the same effect as the French court order and would likely make constitutionally protected speech unavailable in the United States. However, there would be one big difference: whereas in the Yahoo! case the order specified, to all who cared to know, which content was to be blocked or removed and for what reason, in Przybylski's proposal countries and ISPs would strike agreements behind closed doors about what speech would be available online. While this proposal is not layer violating and respects the representational sovereignty concept, as it is based on dialogue, it fails to meet the effectiveness requirement: it could be overinclusive (making speech unavailable in the United States), might not be efficacious (as hate portals would not participate in such an organization), or might lack accountability.

Asserting Jurisdiction over Content Providers
Another possibility for European authorities, which was also discussed previously (the Yahoo! case), is simply to assert jurisdiction over foreign content providers (or "sources," to use Zittrain's terminology). A notorious example in this context is Dow Jones & Co. v. Gutnick,772 a case in which the High Court of Australia held that a court in Victoria could assume jurisdiction over Dow Jones, publisher of Barron's, an American magazine published in the United States that is available online through subscription to the Dow Jones Web site. The case arose out of a libel claim by Mr. Gutnick, about whom the magazine had published an article stating that he was a tax evader and money launderer.

771 Ibid. at 947.
772 HCA 52 (2002).


The Australian High Court, in assessing whether or not it had jurisdiction, ruled that the place where "publication" occurs in an Internet libel case (the locus delicti) is the place where the content is downloaded from the net, provided the defamed party has a reputation there.773 Though the literature is not entirely clear on this point, Dow Jones did seem to have assets in Australia,774 which may explain why the two parties reached a settlement in October 2004.775 While this practice is not layer violating, it clearly does not respect the principles spelled out under the representational sovereignty concept, as it blindly applies an effects-based doctrine. Moreover, this approach would not work against hate speech, as conveyors of hate speech are unlikely to have assets abroad and do not have any incentive to appear or to comply with a foreign court's judgment. In that case, a foreign court would need to rely on a local court to enforce its judgment.


Enforcing Foreign Judgments
As discussed above, it is doubtful that an order such as the French order in the Yahoo! case will be enforced in the United States, making this approach to curbing hate speech not efficacious. The en banc decision of the Ninth Circuit Court of Appeals may seem to cast some doubt on this, but it is important to note that in that case the court did not rule on the enforceability of the French order, but decided the case on procedural grounds. However, the en banc decision still puts Yahoo! in a somewhat unenviable situation, as Judge Fisher, who wrote the dissenting opinion, observed: either Yahoo! complies with the French order, or it has to accept the risk of having to pay the constantly accruing fines if the French organizations ever do make a successful attempt at having the French order enforced in the United States.776

773 Aukje van Hoek, "Australia's High Court Upholds Local (Private Law) Jurisdiction in Case of Defamation Over the Internet," International Enforcement Law Reporter 19(5), 2003.
774 Nathan W. Garnett, "Dow Jones & Co v. Gutnick: Will Australia's Long Jurisdictional Reach Chill Internet Speech World-Wide?" 13 Pac. Rim L. & Pol'y 61 (2003) at footnote 531.
775 "Dow Jones Settles Australian Online Libel Lawsuit," Entertainment Law Reporter 26(6), November 2004.
776 433 F.3d 1199 at 1234.


Because the court did not create clarity on this issue, it is useful to address the enforceability of foreign judgments in the United States in more depth by analyzing Mark Rosen's 2004 law review article on this issue.777 Rosen argues that the United States could constitutionally enforce judgments such as the French order in the Yahoo! case.778 He claims that courts have misinterpreted the state action doctrine in denying enforcement of foreign judgments on First Amendment grounds. Rosen begins his argument by discussing Shelley v. Kraemer,779 a 1948 case in which a group of owners of neighboring parcels of land had signed a private contract requiring, as a provision to any sale of part of the land, that the land not be occupied by persons not of the "Caucasian race." When an African-American couple received a warranty deed from one of the owners, other owners sued and asked the court to enforce the agreement and divest title from the couple.780 The Supreme Court ruled that enforcing the covenant would violate the Equal Protection clause of the Fourteenth Amendment. Even though the contract was made between private parties, and Fourteenth Amendment violations can occur only when they originate from state action, the court argued that Fourteenth Amendment concerns were raised because enforcement of the contractual provisions would have required full cooperation from the government to deny the petitioners their property rights on the basis of their race. The court, therefore, would have been operating as a state actor, it argued.781 Because the terms of the contract could never have been enacted into law, the court ruled that enforcing the covenant would have violated the Equal Protection clause. As Rosen points out, many courts, including the District Court for the Northern District of California in the Yahoo! case, have relied on this precedent to declare foreign judgments unenforceable in the United States.782

777 Mark D. Rosen, "Exporting the Constitution," 53 Emory L. J. 171 (2004).
778 Ibid. at 186.
779 334 U.S. 1 (1948).
780 Ibid. at 6.
781 Ibid. at 20.
782 "The French order prohibits the sale or display of items based on their association with a particular political organization and bans the display of websites based on the authors' viewpoint with respect to the Holocaust and anti-Semitism. A United States court constitutionally could not make such an order. Shelley v. Kraemer, 334 U.S. 1, 92 L. Ed. 1161, 68 S. Ct. 836 (1948). The First Amendment does not permit the government to engage in viewpoint-based regulation of speech absent a compelling governmental interest, such as averting a clear and present danger of imminent violence." (169 F. Supp. 2d 1168 at 1189.)


In Bachchan v. India Abroad Publications783 and Telnikoff v. Matusevich,784 for example, American courts refused to enforce English libel judgments on the basis that English defamation law is contrary to the policy of freedom of the press and that enforcing these judgments would jeopardize First Amendment protection of free speech. Rosen argues that these decisions ignore the fact that subsequent jurisprudence has greatly narrowed Shelley's application785 and that courts regularly issue orders that would be unconstitutional if they had been enacted by a legislature as general law.786 Rosen clarifies his point by means of a hypothetical in which the principal of a private school contracts with a bookseller, stipulating that books about the Nazis787 are not to be shipped to the school. Even though lawmakers could never pass such a viewpoint-discriminating provision into law, if the bookseller breached the contract and sent Nazi materials to the private school, judicial enforcement of the contract would not raise First Amendment issues, Rosen argues. If it did, it would mean that only contractual provisions that could be passed as general law would be enforceable, which is not the case. Contrary to what Shelley dictates, Rosen points out, judicial enforcement of private agreements that restrict speech has not triggered constitutional review.788 Having established that contemporary doctrine has rejected the Shelley approach, Rosen makes the argument that private contracts and foreign countries' regulations are analogous for purposes of a state action analysis.

783 585 N.Y.S.2d 661 (N.Y. App. Div. 1992).
784 702 A.2d 230 (Md. 1997).
785 Mark D. Rosen, "Exporting the Constitution," 53 Emory L. J. 171 (2004) at 190.
786 Ibid. at 192.
787 Though Rosen does not state what kind of "Nazi materials" he has in mind, his example would work better if it referred to books promoting Nazi ideology.
788 Mark D. Rosen, "Exporting the Constitution," 53 Emory L. J. 171 (2004) at 193.


Rosen argues that, for enforcement purposes, it would not make a difference whether the New York bookseller from his hypothetical had a contract with a private school in the Netherlands or with one in New York. In both cases, enforcement of the speech-restrictive terms of the contract would not constitute state action.789 If enforcement does not constitute state action in the case of the New York school, and post-Shelley cases suggest it does not, then it also would not in the case of a Dutch school. Rosen argues that the same would be true if the New York bookseller and a French private school principal had a contract for selling books, and it was not the contract but French civil law that stipulated that booksellers should not ship Nazi materials to French schools and provided a civil remedy in case of noncompliance. He states that, from a state action perspective, these hypotheticals are indistinguishable. In none of them was the law that would have motivated a court's action enacted by the American polity.790 Rosen then argues that enforcing foreign judgments also would not trigger constitutional review under the Cloud Books test.791 This test states that a statute should receive constitutional review only if it "has the inevitable effect of singling out those engaged in expressive activity" and if it regulates "non speech" conduct that is "intimately related to expressive conduct protected under the First Amendment."

789 Ibid. at 208.
790 Ibid. at 208-209. One could argue that, if this were true, it would open the floodgates and force the United States to enforce every foreign judgment. In his first chapter, Rosen explains how foreign judgments are usually enforced, unless personal jurisdiction was not properly asserted or the foreign court used procedures that were fundamentally unfair. If these conditions are met, the foreign judgment is enforced unless the original claim is contrary to fundamental notions of justice in the state where enforcement is sought or would be repugnant to its public policy. Rosen explains that these caveats have been interpreted very narrowly and that in general the United States enforces foreign judgments. (Ibid. at 175-179.) Other authors as well have pointed out that foreign judgments are usually enforced in the United States, even more than American judgments are enforced abroad. See: Franklin Ballard, "Turnabout is Fair Play: Why a Reciprocity Requirement Should Be Included in the American Law Institute's Proposed Federal Statute," 28 Hous. J. Int'l L. 199 (2006).
791 Mark D. Rosen, "Exporting the Constitution," 53 Emory L. J. 171 (2004) at 214-215.


Rosen argues that enforcing foreign judgments would not trigger constitutional review under the Cloud Books test, because a rule to enforce foreign judgments would be content-neutral and would not be disproportionate in its application to expressive activity. Obviously, it would also not trigger constitutional review under the second exception, as such a rule would not regulate any primary conduct and therefore does not regulate expressive activity that is intertwined with non-expressive activity.792 Applying this rationale, a foreign judgment could never trigger constitutional scrutiny, regardless of the effects such judgments would have on the free flow of information in the United States. But Rosen is not willing to go that far, and argues for a doctrine in which a general law (such as the demand that foreign judgments be enforced) triggers constitutional scrutiny only when its incidental effect would be an absolute deprivation of a constitutionally protected right.793 However, Rosen argues that it is unlikely that enforcement of a foreign judgment could ever have such an effect.794 For example, if a publisher were forced out of business as a result of the enforcement of a foreign judgment, this would not trigger constitutional review, as it does not amount to the "evisceration of a constitutional right." In the Yahoo! case, for example, this condition would not be met, according to Rosen. Only if the French judgment had entailed the closure of the Internet altogether would his threshold have been met.795 Rosen's criticism is directed at the courts, where, he argues, the "invocation of the 'big gun' of the First Amendment led [them] to undertake a wholly America-centric analysis."796 He argues that the political branches of the government should weigh the interests of all parties involved in enforcing "un-American" foreign judgments and create a general policy for enforcement.797 Rosen seems to suggest that it should be United States policy to enforce judgments such as the French order in the Yahoo! case. Currently, federal courts apply state law to decide whether or not to enforce a foreign judgment.798

792 Arcara v. Cloud Books, Inc., 478 U.S. 697 (1986).
793 Ibid. at 219-220.
794 Ibid. at 221.
795 Idem.
796 Ibid. at 229.
797 Idem.
798 John Spanogle, "The Enforcement of Foreign Judgments in the U.S. - A Matter of State Law in Federal Courts," 13 U.S.-Mex. L.J. 85 (2005).


Rosen's bookseller hypothetical is not totally analogous to the Yahoo! case, though, since Yahoo! never entered into a formal contract with anyone in France; it merely uploaded content, not directed at French users, that could be accessed in France. Posting content on the Internet hardly amounts to entering into a contractual agreement with everyone who has access to the Internet. In his hypothetical, Rosen looked only at whether or not the source of legal action is the American polity; if it is not, judgments should be enforced, he argues.799 In doing so, he makes no distinction between the French government and a private actor, since neither is the United States government, and for a state action doctrine analysis this is all that matters. But as Sutton points out, it would subvert the intent of the First Amendment to allow other countries' governments to prohibit American speech that the American government cannot.800 The idea that the American government should not restrict speech, but that foreign governments can, is contrary to the intent behind the First Amendment. Unlike many of the cases that are discussed in the context of enforcing foreign judgments, the French judgment did not merely require the American courts to recognize a foreign judgment they could not have issued themselves by enforcing a monetary judgment. It also required American courts to help effectuate the French court's injunction of protected speech in the United States. This is more troublesome than merely enforcing a money judgment, and courts have traditionally been more reluctant to enforce injunctions than money judgments.801 If the French order were enforced in the United States, this could have a chilling effect on speech. Let us assume that an American court had ordered Yahoo! to either change its auction policy or block access for French users and/or pay a fine to LICRA.

799 Mark D. Rosen, "Exporting the Constitution," 53 Emory L. J. 171 (2004) at 209.
800 Michael F. Sutton, "Legislating the Tower of Babel: International Restrictions on Internet Content and the Marketplace of Ideas," 56 Fed. Comm. L.J. 417 (2004) at 420-421.
801 Allan R. Stein, "Current Debates in the Conflict of Laws: Choice of Law and Jurisdiction on the Internet: Parochialism and Pluralism in Cyberspace Regulation," 153 U. Pa. L. Rev. 2003 (2005) at 2010-2011.


This would have sent a message to content providers that they should be expected to be aware of the legality of their speech in France (or in any other country with speech-restrictive laws) and, if it is not legal, either censor themselves or invest in resources to block access for certain users. It is hardly a stretch to assume that many content providers would self-censor in the wake of such a judgment. The French judgment, if enforced, would have had a chilling effect on speech in the United States, draining the marketplace of ideas. The breathing room allowed for speech in American jurisprudence would vanish if foreign courts' judgments were enforced here. This would have a detrimental effect on free and robust online discourse. Of course, other considerations could also play a role if the government had to determine whether or not to enforce foreign judgments contrary to American jurisprudence, such as maintaining good relationships with other nations. But the detrimental effects this could have on the free flow of information on the Internet, coupled with the commitment of American policy makers and courts, including the Supreme Court, to maintain this free flow, make it unlikely that lawmakers would be inclined to press for enforcement of foreign, speech-restricting court orders. Rosen's criticism of the application of Shelley and the state action doctrine in refusing to enforce foreign judgments restricting speech is a valid one, but true as this may be, it does not mean that there are no other good arguments to decline enforcement of rulings such as the one in the Yahoo! case. The problem with relying on enforcement by American courts of speech-restricting orders such as the Yahoo! one is that they may not be efficacious, as American courts might not enforce them on First Amendment grounds because of the chilling effects they would have on American speech. We should resist the temptation to read into the en banc decision of the Ninth Circuit Court of Appeals the enforceability of the Yahoo! order in the United States, as the court did not rule on that issue. But it is telling in this matter that one of the issues that divided the majority and the dissent was whether or not the French order would have implications for speech in America. The dissent claimed it did; the three judges arguing that the case was not ripe stated that it did not. But they did seem to agree that this order would not be enforceable if it would affect American users, because in that case enforcement would be repugnant to California public policy.802

Search Engine Filtering
If one enters the search term "Jew" into Google,803 one of the first sites in the returned results will be Jew Watch,804 an anti-Semitic Web site. After criticism about this search result,805 Google provided a link on top of the page where the "sponsored links" are usually listed, entitled "offensive search results."806 It contains a statement in which Google explains the search method that caused the site to rank so highly, but also stresses that it will not remove this site from its results or alter its rank in the search results:


Our search results are generated completely objectively and are independent of the beliefs and preferences of those who work at Google. Some people concerned about this issue have created online petitions to encourage us to remove particular links or otherwise adjust search results. Because of our objective and automated ranking system, Google cannot be influenced by these petitions. The only sites we omit are those we are legally compelled to remove or those maliciously attempting to manipulate our results.

802 Even the three judges siding with LICRA indicated that if the order would affect American users it might not be enforceable: "The possible -- but at this point highly speculative -- impact of further compliance with the French court's orders on access by American users would be highly relevant to the question whether enforcement of the orders would be repugnant to California public policy." Yahoo v. LICRA 433 F.3d 1199 at 1257.
803 Search conducted on January 15, 2006.
804
805 David Becker, "Google Caught in Anti-Semitism Flap," CNet, April 7, 2004.
806 "Google: An Explanation of our Search Results." Google explains how sites' rankings in its search results are based on complex computer algorithms that take thousands of factors into consideration when assessing a site's rank. Sometimes linguistic subtleties can provoke unexpected results. The term "Jew" is often used in an anti-Semitic context, while organizations or individuals who are not anti-Semitic are more likely to use "Jewish people" or "Jews." This propensity not to use the term "Jew," because it has become somewhat charged, has caused it to be claimed by anti-Semitic organizations: "Someone searching for information on Jewish people would be more likely to enter terms like 'Judaism,' 'Jewish people,' or 'Jews' than the single word 'Jew.' In fact, prior to this incident, the word 'Jew' only appeared about once in every 10 million search queries. Now it's likely that the great majority of searches on Google for 'Jew' are by people who have heard about this issue and want to see the results for themselves."


This example shows the importance of search engines in making information available on the Internet. In the United States, the Anti-Defamation League has made clear that it does not favor an approach in which hate sites are deleted from searches,807 but Frydman and Rorive point out that regulating search engines may be a possibility for European countries to limit access to hate sites located abroad.808 Omitting search results does not block access to the deleted sites, but it would certainly reduce the potential of these sites to distribute their message. The search engine market is dominated by a relatively small number of players, of which Google is the biggest. If they decided to exclude sites from their search results, this would greatly reduce these sites' visibility on the Web. Whether or not European laws allow forcing search engines to eliminate certain search results is unclear. Search engines are not covered by the provisions of the E-Commerce Directive, and it remains unclear what the liability of linking services and search engines is in European countries.809 Tort law developed by national courts in European countries is still shaping the legal landscape determining the legal liability of search engines.810

807 Brian Marcus, "Search Engines and Individual Rights," Regulating Search? A Symposium on Search, Yale Law School, December 3, 2005.
808 Benoît Frydman and Isabelle Rorive, "Strategies to Tackle Racism and Xenophobia on the Internet--Where are we in Europe?" 7 Int'l J. Comm. L. & Pol'y 8 (2002/2003).
809 Stephan A. Ott, "Hyperlinks, Search Engines - First Report on the Application of the E-Commerce-Directive," Links and Law (2003).
810 Steven Bechtold, "In Search of Search Law," Regulating Search? A Symposium on Search Engines, Law, and Public Policy, Yale Law School, December 3, 2005.


A German court ruled in February 2005 that a meta search engine could be held liable if it does not prevent users from accessing illegal content,811 but in the Netherlands a judge ruled that an MP3 search engine that provided information about where users could find (copyrighted) MP3 files did not engage in copyright infringement.812 However, these cases dealt with tort liability. In the context of criminal hate speech, local versions of search engines can be, and have been, ordered to eliminate search results. Technologically, this solution is feasible. Search engines already remove search results if legally compelled to do so or if sites try to manipulate the search results.813 For example, Google at one point removed the home page of BMW from its search results for this reason.814 This solution also is not layer violating, as it takes place at the content layer and does not interfere with Internet traffic. Local versions of Google are already eliminating sites from their search results. In a 2002 article, Zittrain and Edelman noted how search results were omitted from local Google pages.815 The authors found various discrepancies between the search results generated by google.com, google.de (Germany), and google.fr (France). About 113 sites were excluded from google.fr and/or google.de that were listed by google.com.816 The authors suggest that Google, under pressure from the French and German governments, quietly removed sites from its local search results.

811 "Germany: Meta Search Engines Responsible for Hyperlinks," Digital Civil Rights in Europe: EDRI-Gram Newsletter 3(7), April 6, 2005.
812 "Er Bestaan Ook Legale Muziekzoekmachines" [There Are Also Legal Music Search Engines], De Standaard, May 13, 2004, p. 20.
813 "Preferences, Settings and Languages: Redirecting to another country domain?"
814 Will Sturgeon, "Google Hands BMW its 'Death Sentence'," Silicon.com, February 6, 2006.
815 Jonathan Zittrain and Benjamin Edelman, "Localized Google Search Exclusions," Berkman Center for Internet and Society, October 26, 2002.
816 Idem.


However, Google claims that when it is ordered to do so by a foreign country in which it operates, it removes the listing from the relevant national version of the Google search engine and provides a notice to users that it has done so. This disclosure, Google argues, allows users to keep their legal system accountable.817 For example, a search through google.fr for "holocaust mensonge" (holocaust lie) indeed results in a notice at the bottom of the page: "En réponse à une demande légale adressée à Google, nous avons retiré 2 résultat(s) de cette page. Si vous souhaitez en savoir plus sur cette demande, vous pouvez consulter le site ChillingEffects.org." (In response to a legal request we have pulled two search results from this page. If you want to know more about this, please consult ChillingEffects.org.)818 Frydman and Rorive point out that Google seems to play it safe, acceding to requests from European authorities and removing allegedly illegal content from its local search results.819 As the leading search engine and a major business, Google has an interest in complying with local authorities.820 In China, for example, Google and Yahoo! censor results from their Chinese products, and MSN's blog tools for China do not allow certain words such as "Dalai Lama" or "human rights" to be used in the title of a post.821

Since this solution targets local versions of Google, it only makes content unavailable (or more difficult to reach) for users of these local versions. This solution would not be overinclusive because it does not affect more people than necessary. It does not affect the American version of Google and has no consequences for its users; it affects only local Google versions that specifically target certain countries. It respects the representational sovereignty concept by not targeting out-of-state actors. However, this solution suffers from a lack of efficacy.

817 Lindsey Eastwood, "'Don't Be Evil': Google Faces the Chinese Internet Market and the Global Online Freedom Act of 2007," 9 Minn. J.L. Sci. & Tech. 2 (2008) at 302.
818 Search conducted in June 2008.
819 Benoît Frydman and Isabelle Rorive, "Strategies to Tackle Racism and Xenophobia on the Internet--Where are we in Europe?" 7 Int'l J. Comm. L. & Pol'y 8 (2002/2003).
820 See: Jason Krause, "Casting A Wide Net: Search Engines Yahoo and Google Tussle with Foreign Courts Over Content," 88 ABAJ 20 (2002).
821 Jason Dean and Kevin J. Delaney, "As Google Pushes into China, it Faces Clashes with Censors," Wall Street Journal, December 16, 2005, p. A1.


Having a Web site excluded from a list of search results does not mean that this Web site is blocked or inaccessible to Web users. If one knows the site's URL, one can still access it, or other sites can link to it. Also, users in Germany do not have to use google.de, but can also perform their searches through google.com. Google automatically redirects users located in Germany (or in another country for which there is a local version of Google) who type "google.com" in the URL window to the local version of Google, but local users can access google.com if they want to by clicking on a link to google.com on their local Google page.822

Search engines are private organizations and are not accountable to the public. From an accountability perspective, it becomes troublesome when a couple of private gatekeepers act as de facto censors of the Internet. It is one thing for a search engine to comply with national laws and demands from authorities not to list certain illegal sites, but it becomes more problematic when search engines take the initiative not to list content because it could be illegal. That this argument is not merely academic was illustrated in 2003, when Google removed an allegedly illegal page from its index worldwide after an outcry in the United Kingdom. However, when Internet free speech scholar and activist Seth Finkelstein investigated the case, he found that the site in question was merely a site written in Finnish containing humor and links, admittedly in poor taste, but by no means the "guide to picking up little girls" it was touted to be.823 The German version of Google and other major search engines such as the German version of Yahoo! have recently formed a self-regulation organization in order to more effectively eliminate search results pointing to sites forbidden by German law.824

822 "Google Help Center: How do I Stop Google.com from Redirecting to another Google Domain?"
823 Seth Finkelstein, "Chester's Guide to Molesting Google (Version 1.5)," February 28, 2003.
824 Monika Ermert, "Selbstregulierung der Suchmaschinenanbieter" [Self-Regulation by Search Engine Providers], Heise Online, February 25, 2005.


Critics of this approach have argued that search engines should not operate as censors of the Internet, and have criticized the lack of clarity surrounding the standards being applied for removing sites from search results, as well as the lack of information and accountability surrounding the process: "As it is, we have private companies like Google deciding what we can and can't see based on their self-interested readings of poorly-drafted national laws, taking advice from unnamed and unaccountable Government agencies and telling nobody what is going on. Anything has to be better than that, surely?"825 True as this may be, for reasons of efficaciousness a system of self-regulation might be preferable over a notice-and-takedown regime, as it allows the search engines to react to mirror sites or to disable access to hate sites that have not yet been flagged by authorities. Based on the data provided by ChillingEffects.org, European authorities have scaled down their requests to Google to remove search results: between June 2007 and June 2008, Google did not receive any requests to remove search results, whereas in the preceding years it received dozens and dozens of such removal notices.826

Regulating search engines could prevent European Internet users from stumbling upon hate speech on the Internet or finding it through searches. It would not prevent people who already access hate sites, and who have formed communities in which they engage in speech forbidden in their own country, from reaching those sites. In an age of social networking, the importance of search engines as portals to information has also been reduced. Search engines cannot remove hate sites, but they can make them harder to reach; these sites and communities would still be accessible online. Still, search engines could play an important role in reducing the amount of hate speech available at the fingertips of European Internet users. It is therefore a missed opportunity that the E-Commerce Directive did not address the liability of search engines more clearly.
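As a rough illustration of the country-specific removal practice discussed in this section, the sketch below shows how a local version of a search engine might drop flagged results and append a removal notice of the kind displayed on google.fr. It is a sketch under stated assumptions, not a description of any engine's actual implementation: the per-country lists, URLs, and helper names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    title: str

# Hypothetical per-country lists of URLs whose removal local authorities have ordered.
REMOVAL_LISTS = {
    "fr": {"http://example-denial-site.test/"},
    "de": {"http://example-denial-site.test/", "http://example-hate-portal.test/"},
    "us": set(),  # nothing removed from the .com version in this sketch
}

def filter_results(results, country):
    """Drop results flagged for this country's local version; report how many were dropped."""
    removed = REMOVAL_LISTS.get(country, set())
    kept = [r for r in results if r.url not in removed]
    return kept, len(results) - len(kept)

def render_page(results, country):
    kept, dropped = filter_results(results, country)
    lines = [f"{r.title} - {r.url}" for r in kept]
    if dropped:
        # Mirror the disclosure shown on google.fr: tell users that results
        # were removed in response to a legal request.
        lines.append(f"In response to a legal request, {dropped} result(s) were removed from this page.")
    return "\n".join(lines)

if __name__ == "__main__":
    hits = [
        Result("http://example-denial-site.test/", "Revisionist portal"),
        Result("http://example.org/history", "History archive"),
    ]
    print(render_page(hits, "fr"))  # one result dropped, notice appended
    print(render_page(hits, "us"))  # nothing dropped
```

Even in this simplified form, the sketch makes the accountability point visible: everything turns on who maintains the removal lists and whether their contents are disclosed to users.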

Geolocation
While not a solution as such, geolocation software can be used in ways that restrict certain content on the Internet to certain users. For example, the French order in the Yahoo! case required Yahoo! to identify French users through geolocation software.

825 Bill Thompson, "Google Censoring Web Content," BBC News, October 25, 2002.
826 See:


At the time, the experts appointed by the court stated that using geolocation software would enable Yahoo! to identify about 70% of the French users logging on as French. Geolocation software, because of its many commercial and practical applications, has been further developed and has become more sophisticated since then, and its accuracy rate would likely be higher now. For example, if a Minneapolis resident types "hair salon" into Google, sponsored links will be returned listing featured local hair salons. Online casinos restrict access for users from jurisdictions where gambling is outlawed; Web sites that stream sports games ban access for users in markets where the broadcasting rights belong exclusively to a local TV station; Web sites targeting an international audience show prices in the currency of the visitor. There are countless examples of content being adapted based on the location of the surfer.827

Geolocation usually happens through IP address recognition. IP addresses are assigned to service providers, who then allocate these addresses to their users. Through a database, one can match an IP address with the network that has registered it, and infer the location of the user of that IP address. However, the IP addresses a service provider is assigned may be registered to its headquarters even if it has branches worldwide. AOL, for example, routes its users' traffic through a single gateway in Virginia, which makes it seem as if all AOL users live in Virginia. Geolocation services therefore figure out how service providers and cable companies set up their networks and route traffic.828 Geolocation services have become specialized in finding methods to create databases that match IP addresses with specific locations.829 They claim accuracy levels of 96% for U.S. state data and 99.9% accuracy at the country level.830
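The lookup described above can be illustrated with a short sketch: the visitor's IP address is matched against a database of address blocks registered to networks in particular countries, and the site then decides whether to serve or block the request. The address blocks, country assignments, and blocking policy below are hypothetical; commercial services such as Quova work from far larger, continuously updated databases.

```python
import ipaddress

# Hypothetical database mapping address blocks to the country in which the
# registering network operates. Real geolocation databases contain millions
# of such entries and are updated continuously.
IP_BLOCKS = [
    (ipaddress.ip_network("192.0.2.0/24"), "FR"),
    (ipaddress.ip_network("198.51.100.0/24"), "US"),
    (ipaddress.ip_network("203.0.113.0/24"), "DE"),
]

# Countries in which, under this sketch's assumptions, the hosted content
# may not lawfully be displayed.
BLOCKED_COUNTRIES = {"FR", "DE"}

def country_of(ip):
    """Return the country code of the first block containing this address, if any."""
    addr = ipaddress.ip_address(ip)
    for network, country in IP_BLOCKS:
        if addr in network:
            return country
    return None  # unknown address: roughly the share the Yahoo! court experts could not place

def may_serve(ip):
    """Serve the page unless the visitor appears to be located in a blocked country."""
    return country_of(ip) not in BLOCKED_COUNTRIES

if __name__ == "__main__":
    for visitor in ["192.0.2.17", "198.51.100.5", "10.0.0.1"]:
        print(visitor, "->", country_of(visitor), "serve" if may_serve(visitor) else "block")
```

The gaps discussed in the text are visible even here: addresses routed through a gateway registered elsewhere, or not present in the database at all, are assigned the wrong country or none, which is why the court experts estimated only about 70% of French users could be identified.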

827 "Geolocation, Don't Fence the Web in," Wired News, July 13, 2004.

828 Idem. 829 For a technical explanation of the various methods to locate IP addresses, see: James A. Muir and P.C. Van Oorschot, "Internet Geolocation and Evasion" School of Computer Science, Carleton Ottawa (2006).

830 This claim is made on the Web site of one of the leading geolocation services, Quova: < http://www.quova.com/page.php?id=175>


As geolocation software becomes more prevalent, so will anonymizer software that circumvents geolocation and enables users to mask their geographic location.831 However, using anonymizer software requires a certain amount of skill and knowledge that most Internet users may not possess, so this method can still be considered relatively efficacious.

In September 2006, the New York Times published a story on a terror case in the United Kingdom and blocked the article from readers in the UK for legal reasons. However, some UK readers who accessed the web through their workplaces' multinational corporate networks could read the story, while some readers in Singapore and Hong Kong could not.832 While 99.9% is an impressive accuracy rate if one is trying to target advertising, the margin of error could still be too large if it means that Internet users are inadvertently blocked. Enforcing a foreign geolocation order in the United States would pose constitutional problems if the geoblocking blocked access for even a minuscule percentage of American Web surfers.

But even if geolocation software is efficient and accurate, using it to block hate speech would be prohibitively expensive and burdensome for the provider of the speech. As Yahoo!'s senior counsel remarked regarding the French order that required Yahoo! to use geolocation software: "To comply with the court order we'd have to do advance review of all content uploaded by hundreds of millions of users to see if they violate any law anywhere on Earth. It's just not possible."833 Geolocation software also does not come free. If one of the conditions for placing content on the Internet were that one has to engage in geoblocking when required to do so, only those who are able to invest in geolocation software and filters would likely remain. Geolocation software is marketed to businesses, banks, fraud prevention agencies, and similar institutions that have a commercial need for geolocation. These market forces are likely to keep the price of accurate geolocation software high. But even if technological measures such as geographic filtering were to become relatively inexpensive and effective to the extent that the chilling effect upon

831 Seth Finkelstein, "Expert Report in Nitke v. Ashcroft (253 F. Supp. 2d 587)," November 10, 2003.
832 Danny O'Brien, "Internet Media 'Routes Around' the News Censors," The New York Times, September 15, 2006, p. 9.
833 Jason Krause, "Casting A Wide 'Net: Search Engines Yahoo and Google Tussle with Foreign Courts Over Content," 88 ABAJ 20 (2002).


speech would disappear, Van Houweling argues that there still are free speech justifications for not enforcing foreign judgments such as the one in the Yahoo! case. She argues that such judgments would lead to a sort of regionalized Internet, on which there would be a restricted flow of information between the United States and (in the example of the Yahoo! case) France, or other countries with more restrictive speech laws that seek enforcement of their courts' decisions in the United States. Domestic dialogue can be improved by having access to global viewpoints. Speech, Van Houweling argues, is to some extent a networked good; it is more effective when more people participate,834 and when more viewpoints enter the marketplace of ideas.

The problem with solutions based on geolocation is that, in many instances, they try to restrict access for certain groups of people and make it the responsibility of the senders to figure out whether these groups can be exposed to their messages, even if they do not target them. This violates the sovereignty requirement developed in the previous chapter. It is impossible for a provider of content to foresee where his speech may be illegal and to block access to all users in those places. If content providers were expected to do this, the logical result would be that most of them, in most instances, would simply ban access for all foreign users by default, rather than figuring out which national laws they may or may not violate. Take, for example, how eBay has dealt with providing access to the "mature audiences" section of its auction site:

eBay's site is used by people from around the world. Many countries have laws regulating adult material that are more strict than those in the United States, and the law regulating the sale of such items through the Internet across national borders is not always clear. eBay is concerned that some items listed in Mature Audiences may raise legal issues in some of these countries, and for protection of its members does not

834 Molly Shaffer Van Houweling, "Cyberage Conflicts of Law: Enforcement of Foreign Judgments, the First Amendment, and Internet Speech: Notes for the Next Yahoo! v. LICRA," 24 Mich. J. Int'l L. 697 (2003) at 714.


permit our international members to access the adult portion of the site.835

The type of material that is on display in eBay's adult section may very well be legal in many countries, yet eBay chooses to close that part of its auction to all international visitors altogether (and to all American users unwilling and/or unable to provide a credit card number).836 If companies are faced with a choice between overblocking and risking legal proceedings in a foreign court, one can expect that the former option will be preferred. Perhaps Yahoo! could have done the same thing as eBay with its mature audiences section, and made its auction for Nazi memorabilia a separate category to which one needs an American credit card to be granted access. But this solution is overinclusive (it affects more users than necessary), since it requires Americans to jump through hoops in order to access the material. It is interesting to note, though, that eBay relies on credit card verification, and not on geolocation software, to block access by foreign users.

As stated, geolocation may have interesting commercial applications, but requiring content providers to take responsibility for ensuring that their speech does not reach people living in jurisdictions where this speech is illegal, especially if the speech does not target that forum, puts too much of a burden on foreign actors and violates the principle that foreign content providers should only be regulated if they direct their speech specifically at a certain jurisdiction. Although the development of geolocation software means that this solution could be efficacious (geolocation at the country level has become relatively accurate), it is unlikely that content providers could be forced to use filters that block access on the basis of location. It is also unlikely that content providers of hate speech would voluntarily install a geolocation filter. Lastly, because geolocation filtering implies blocking based on IP address (IP layer), it would also constitute a layer violation.

835 "Ebay Rules and Policies: Mature Audiences."
836 eBay uses credit card verification to make sure its users are over 18 years old, so without an American credit card one will not be granted access to the site.


Source ISP Liability

An alternative to blocking end user access at the content provider level (or at the "source") would be to somehow interrupt the communication of hate speech to Europe at the level of the Internet Service Providers. To assess the possibilities that exist to do so, it is useful to give an overview of the legal liability faced by ISPs in the United States and Europe.

In Cubby, Inc. v. CompuServe, Inc.,837 the United States District Court for the Southern District of New York ruled that a service provider hosting a bulletin board could not be held liable for an allegedly defamatory message posted on that board. The court granted the company summary judgment, arguing that it had acted as a distributor, not a publisher. The liability standard for publishers is more stringent than the one for distributors; one who republishes defamatory statements is subject to liability as if he had originally published them, whereas those who merely distribute defamatory statements (bookstores, news vendors, libraries, …) are liable only if they know or have reason to know of the defamation.838 An ISP, the court argued, should not be held to a stricter standard of liability than a news stand, book store or library.839

In a 1995 ruling,840 however, the Supreme Court of the State of New York, at the trial court level, held that an ISP was liable for an anonymous defamatory statement that was made on a widely popular financial bulletin board it maintained. The court ruled that since the service provider monitored the content of the bulletin board, it was not a mere conduit or distributor, but the actual publisher of the libelous statement. Because the ISP had claimed publicly to control the content of its board through screening software as well as through "bulletin board leaders," who could delete certain material, it was to be held liable under the publisher standard, the court ruled. These decisions suggested that ISPs were considered to be publishers if they exercised any kind of editorial control and were considered to be mere distributors if they did not. For ISPs hosting

837 Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135 (S.D.N.Y. 1991).
838 Ibid. at 139.
839 Ibid. at 140.
840 Stratton Oakmont, Inc. v. Prodigy Services Company, 1995 N.Y. Misc. LEXIS 229 (Sup. Ct., Suffolk County, May 24, 1995).


content, it made sense not to control or monitor content at all in order to avoid liability. The decisions removed any incentive for ISPs to try to detect and remove objectionable content, or to engage in any form of self-regulation. This worried members of Congress who were trying to limit access to and availability of obscene material online. With ISPs having an incentive not to monitor any content they hosted, this goal would be harder to attain. As a reaction, Congress enacted § 230(c)(1) of the Communications Decency Act (a section of the CDA that would not be struck down by the Supreme Court later on), which stated that publisher liability is inappropriate for a "provider or user of an interactive computer service"841 (including hosting services or sites that publish third party content). It also stated that providers were exempt from civil liability if they made attempts to restrict access to objectionable content.842

American courts have applied the immunity provision of section 230 broadly. Too broadly, according to some, as the courts seem to have given greater protection to ISPs than Congress' legislative intent warranted.843 Even in cases where ISPs were made aware by the person affected that they hosted potentially illegal content,844 or had a contractual agreement with the provider of the illegal content,845 they were not held liable by the courts. It has been argued that in these decisions the courts went well beyond the legislative intent of Congress, and that in these cases the ISPs should have been held liable under the distributor standard, since they knew or had reason to know about the presence of illegal content.

841 Emily K. Fritts, "Internet Libel and the Communications Decency Act: How the Courts Erroneously Interpreted Congressional Intent with Regard to Liability of Internet Service Providers," 93 Ky. L.J. (2004/2005).
842 230(c)(2)(A). "(2) Civil Liability No Provider or user of an interactive computer service shall be held liable on account of- (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."
843 Emily K. Fritts, "Internet Libel and the Communications Decency Act: How the Courts Erroneously Interpreted Congressional Intent with Regard to Liability of Internet Service Providers," 93 Ky. L.J. (2004/2005).
844 Zeran v. America Online, 129 F.3d 327 (4th Cir. 1997).
845 Blumenthal v. Drudge, 992 F. Supp. 44 (D.D.C. 1998).


It lies outside the scope of this work to assess whether or not that is the case, but it should be noted that a 2008 case chipped away at the broad immunity granted to ISPs by the CDA.846 In this case, an online service that tried to match roommates allowed its users to state roommate preferences on the basis of gender and whether or not a prospective roommate would bring children into the household. The Fair Housing Councils of various California communities sued the site for violating the Fair Housing Act. The Web site claimed immunity under the CDA, but the Ninth Circuit Court of Appeals held that since the site actively solicited information on discriminatory preferences, it had to be considered a content provider and could no longer enjoy the immunity granted to providers under the CDA. In a similar suit brought against Craigslist because of discriminatory housing ads posted on the popular classified advertisement Web site, the Seventh Circuit held that the Web site did enjoy CDA immunity and could not be held liable for the ads it hosted.847 Cases in the California court system have also narrowed the interpretation of the ISP immunity provision of the CDA.848

One of the main arguments for shielding ISPs from liability for content hosted on their servers is to avoid putting a chill on speech. The distributor standard was designed to exempt libraries, bookstores and news stands from liability for distributing defamatory statements. These distributors usually distribute a limited number of professionally edited publications, but this is hardly the situation ISPs find themselves in. They host chat rooms, discussion boards, personal Web pages, blogs and other relatively unedited forums. If distributor liability applied to ISPs, they would likely prefer to err on the side of caution and remove all content about which they receive complaints. This also seems to have been a consideration for the Zeran court in rejecting the plaintiff's claim that AOL was to be held liable under a distributor

846 Fair Housing Council v. Roommates.com, 521 F.3d 1157 (9th Cir. 2008).
847 Chicago Lawyers' Committee v. Craigslist, 519 F.3d 666 (7th Cir. 2008).
848 In Grace v. eBay, the Second District Court of Appeals (120 Cal. App. 4th 984 (2004)) ruled that eBay was not an ISP but a distributor and faces distributor liability for posting defamatory comments. The case arose from a person having been denied his requests to have negative feedback about him removed. In Barrett v. Rosenthal (114 Cal. App. 4th 1379 (2006)), the appeals court also ruled that someone who publishes third party defamatory content is not granted absolute immunity.


standard (it argued that distributor liability is a subset of publisher liability and therefore also covered by section 230):


If computer service providers were subject to distributor liability, they would face potential liability each time they receive notice of a potentially defamatory statement -- from any party, concerning any message. Each notification would require a careful yet rapid investigation of the circumstances surrounding the posted information, a legal judgment concerning the information's defamatory character, and an on-the-spot editorial decision whether to risk liability by allowing the continued publication of that information. Although this might be feasible for the traditional print publisher, the sheer number of postings on interactive computer services would create an impossible burden in the Internet context.849

The court further argued that distributor liability would remove all incentives for ISPs to monitor content or to open themselves to feedback from surfers about objectionable material, and would make self-regulation less likely, eroding the legislative intent behind section 230.

However, this immunity has limits. For example, section 230 does not cover copyright violations, which are governed by the Digital Millennium Copyright Act (DMCA). Section 512 of the DMCA limits ISP liability for a number of activities typical for ISPs (acting as a conduit, caching, storing), including providing access to infringing material through links or reference services. This means that search engines are also covered by the DMCA. ISPs avoid liability if they remove or disable access to infringing material upon receiving notification of alleged infringement, if they do not have actual knowledge of the infringement, and if they do not profit from the infringement. This liability regime for copyrighted material under the DMCA is thus more strict than the regime provided by Section 230 of the CDA. The DMCA clearly describes the procedure for notice and take down of allegedly copyright infringing content in an attempt to balance the rights of the copyright holder and the alleged copyright infringer. Under the DMCA, the alleged infringer is given an opportunity to challenge the take down. Still, a study found considerable flaws with at

849 129 F.3d 327 at 333.


least 30% of the notices studied that led to a take down (the data set consisted of notices submitted to Google and notices that the researchers had obtained from the people whose materials had been taken down). They dealt with obviously uncopyrightable material, did not follow proper procedure, or asked ISPs to remove material from peer-to-peer networks, which ISPs can only do by terminating an account.850 These findings, admittedly based on limited data, are hardly surprising; copyright law is complicated, and requiring ISPs to be able to assess the validity of a notice or face liability creates a system in which ISPs will favor a take down. A notice and take down system, even one that is designed to balance the interests of all parties involved, risks being overinclusive if it relies on ISPs' ability to assess the validity of claims to take down allegedly unlawful material.

As discussed in chapter five, the European Union also established a liability regime for ISPs in its E-Commerce Directive. The Directive strives for a regime of co-regulation, a regime in which public authorities and ISPs work together.851 It allows courts and authorities to force hosting services (source ISPs) and access providers to remove illegal material (not restricted to copyright infringing material) or block access to those materials. The Directive states that ISPs cannot be held liable if they act as mere conduits (as destination ISPs), but that they can be held liable for hosting illegal material if they know that the information is illegal and if they do not act expeditiously to remove or disable access to the information upon acquiring knowledge of its presence. This knowledge can be acquired formally through authorities or informally through a watchdog or private person.852 In some respects, the regime set up by the E-Commerce Directive and the DMCA is similar, with the very important caveat that the E-

850 Jennifer M. Urban and Laura Quilter, "Efficient Process Or 'Chilling Effects'? Takedown Notices Under Section 512 of the Digital Millennium Copyright Act. Summary Report" (2005).
851 Benoît Frydman and Isabelle Rorive, "Strategies to Tackle Racism and Xenophobia on the Internet--Where are we in Europe?" 7 Int'l J. Comm. L. & Pol'y 8 (2002/2003).
852 Benoît Frydman and Isabelle Rorive, "Regulating Internet Content through Intermediaries in Europe and the USA," 23 Zeitschrift Für Rechtssoziologie 41 (2002) at 55.


Commerce Directive covers all kinds of illegal materials, including hate speech, and that the DMCA applies only to copyright infringements (ISP liability for most other illegal materials is covered by section 230 of the CDA). Some European scholars see this as a promising situation in the fight against hate speech in Europe and abroad:


The new tool provided by the E-Commerce Directive is very efficient for combating hate speech in particular, since ISPs are eager to ensure the benefit of immunity. This is also true with respect to the U.S. ISPs which are international business operators and, as such, often have assets in Europe beside reputation to care for. … The combination of the E.U. Directive provisions, on the one hand, and the U.S. "Good Samaritan" provision, on the other hand, allows the Europeans to play behind the back of the Constitution of the United States of America. It strongly incites U.S. based ISPs operating internationally to apply an anti-hate speech policy consistent with the standards of International law.853

The authors mention how eBay complied with Germany's request to pull Nazi materials from its site as an example of how this approach could be successful. The German Federal Officer for Protection of the Constitution contacted eBay twice and required that the online auction service pull Nazi-related items from its site. In both cases eBay honored this request.854 As it stands, some ISPs already do not allow hateful speech to be put on their servers, but there will always be ISPs on the market that are willing to host hateful speech, and it seems unlikely that every ISP in the United States, especially one without a presence or assets outside of the United States, would refuse to host hate speech sites. Also, the American "Good Samaritan" provisions the authors refer to only apply to copyright infringing material hosted by ISPs, not to other forms of "illegal" speech.

853 Idem.
854 Benoît Frydman and Isabelle Rorive, "Regulating Internet Content through Intermediaries in Europe and the USA," 23 Zeitschrift Für Rechtssoziologie 41 (2002) at 55.


Unlike the DMCA, the E-Commerce Directive does not establish clear procedural rules for notice and take down, nor does it establish clearly when an ISP is to be considered properly informed that it is hosting illegal content. It does not provide a procedure through which those whose content has been taken down can appeal this decision, or that establishes when ISPs can be held liable for taking down content that turned out not to be illegal after all.855 However, member countries can of course include such a provision when they adopt the Directive into national law. Given the regime put in place by the Directive, ISPs risk becoming de facto censors.856 Reporters sans Frontières even mentioned the European Union in the "Countries to Watch" section of its 2005 report on threats to Internet freedom. It singled out the E-Commerce Directive as a threat to freedom of expression "by making ISPs responsible for the content of websites they host and requiring them to block any page they consider illegal when informed of its existence. This creates a private system of justice, where the ISP is called on to decide what is illegal or not. Technicians thus do the job of a judge."857

A 2007 decision by a Belgian court established that an ISP could be ordered to install software to block illegal file sharing of copyrighted material. The ISP in the case had argued that this would amount to monitoring, which Article 15 of the Directive says states cannot require from ISPs. The court also rejected the claim that the ISP acted as a mere conduit in this case.858 Whereas in the United States the section of the law establishing ISP immunity has been interpreted in a way favoring ISPs, in Europe the opposite seems to be happening: the provisions shielding ISPs from liability are interpreted very narrowly.

Just as in the United States, European ISPs have been eager to remove legal content from their servers based on ill-founded complaints. Researchers from Oxford University conducted a revealing

855 Pablo Asbo Baistrocchi, "Liability of Intermediary Service Providers in the EU Directive on Electronic Commerce," 19 Santa Clara Computer & High Tech. L.J. 111 (2002) at 124-125.
856 Ibid. at 130.
857 "The 15 Enemies of the Internet and Other Countries to Watch," Reporters Without Borders, November 17, 2005.
858 Stephen W. Workman, "Developments in ISP Liability in Europe, Part II," Internet Business Law Services, June 18, 2008.


experiment that offers some insightful, albeit anecdotal, evidence of this phenomenon.859 They designed a Web site on which they published On Liberty, the treatise by John Stuart Mill first published in 1859 and thus no longer protected by copyright provisions. The researchers created a Web page at a major ISP in the UK and a major ISP in the United States on which they published a section of the book. They then submitted an artificial complaint to the ISPs, posing as representatives of the made-up "John Stuart Mill Heritage Foundation," asking that the material be removed from the server because it violated copyright. They found that the British ISP complied immediately with this blatantly unfounded request from a phony organization, but that the American ISP peppered them with specific questions, as required by the DMCA, at which point the researchers aborted the experiment.860 The researchers blamed the lack of clear procedures for notice and take down in the E-Commerce Directive for this over-eagerness on the part of the British ISP to take down perfectly legal material.

A similar experiment by a Dutch digital rights group found that seven of the ten service providers hosting an 1871 text by Dutch author Multatuli complied with a request by a fictitious organization to remove the content. Only one ISP replied that the copyright had clearly expired, one replied that further identification was needed, and a third did not reply at all.861

859 Christian Ahlert, Chris Marsden, and Chester Yung, "How Liberty Disappeared from Cyberspace: The Mystery Shopper Tests Internet Content Self-Regulation" (2004) at 20.
860 Ibid. at 23.
861 Sjoera Nas, "The Multatuli Project ISP Notice & Take Down" (2004).

862 "Le Conseil Constitutionnel Fait Écho Aux Souhaits Du Gouvernement," Iris, June 15, 2004.


court came down with its ruling, the French ISPs signed a code of conduct under the guidance of the government in which they commit themselves to expeditiously remove or disable access to illegal (racist and child-pornographic) content upon receiving information about the presence of this content. However, how and by whom this illegality is to be determined remains unclear.863 The providers also agreed to open up a point of contact where users can report racist and child-pornographic content, which may mean that ISPs will remove content about which complaints come in without proper investigation. In a more recent announcement, the French government and the ISPs stated that France would block access to hate and child porn sites, based on a list drawn up by the state. According to this plan, the ISPs would no longer be responsible for determining what content needs to be blocked, alleviating concerns about private censorship. The system seems to be similar to the one in the United Kingdom.864 The focus also seems to be on blocking sites with child-pornographic content, more so than hate sites.865

ISPs are important actors in the Internet communication chain, and because they are businesses with a physical presence, they can be regulated by local law. Both the United States and the European Union have established the extent of the liability of ISPs when it comes to hosting illegal content. Contrary to what Frydman and Rorive asserted, the legal regime in the United States seems to leave little room for European anti-hate speech organizations to curb hate speech hosted in the United States. Certain ISPs in the United States may, on a voluntary basis, comply with requests from European authorities or anti-racism organizations, but it is unlikely that all ISPs in the United States would do so all the time. A recent case discussed in the next section illustrates how some, but not all, American ISPs will comply with such requests. From the perspective of those trying to combat hate

"French Draft Law Obliges Providers to Monitor Content," Digital Civil Rights in Europe: EDRI-Gram Newsletter, 3(2), January 15, 2004.

864 "France Obtained ISPs Support in Blocking Illegal Sites," Digital Civil Rights in Europe: EDRI-Gram Newsletter, 6(12), June 18, 2008.

865 "Les FAI devront filtrer les sites pédopornographiques," ZDNet.fr, June 10, 2008.


speech, putting pressure on American ISPs to adopt and enforce acceptable use policies that ban hate speech could still be an effective tactic. If only a limited number of ISPs were willing to host hate speech, it would in the long run be easier to block access to these sites through blocking orders (see below). When considering the guideline that regulation should not be layer violating, forcing host ISPs (source ISPs) to remove certain content is a preferable solution: removing content in a notice and take down regime does not require interference with the Internet communication process. However, as discussed at length before, if lawmakers or courts do not have the legal means to assert jurisdiction over a foreign ISP, they will not be able to force it to take down content. But if lawmakers and/or anti-hate speech organizations can convince foreign ISPs that they host content that specifically targets their territory and that the content is illegal there, some of those source ISPs may voluntarily comply with such requests. In the Yahoo! case, for example, Yahoo! did remove the "Protocols des Sages de Sion," which it acknowledged was targeting French users.866 However, as the examples described above illustrated, requests to remove materials from host ISPs risk leading to overblocking if these source ISPs do not have the means to assess the validity of those claims. Especially if the content is in a foreign language, assessing this can be difficult.

Regulating the Destination ISP

Blocking orders in Europe

If the source ISPs or the providers of content cannot be persuaded to remove content, the next logical step is to regulate destination ISPs. In June 2005, a French court ordered ten French ISPs to block access to an anti-Semitic and revisionist Web site hosted in the United States867 called AAARGH (L'Association des Anciens Amateurs de Récits de

November Order: "That to prove its good faith, it indicates that it has stopped the hosting of the protocol of the Sages of Sion, considering sufficient the link of attachment of such document with France due to the language of the work." 867 "French Court Issues Blocking Order to 10 ISPs," Digital Civil Rights in Europe: EDRI-Gram Newsletter, 3(12), June 15, 2005.


Guerres et d'Holocaustes).868 The case was brought forward by a number of French anti-racism organizations. When they had initially asked the court to order French ISPs to block access to three Web sites (actually one Web site, but hosted on three different host servers with three different URLs), the judge had advised them to first try to convince the host providers in the United States to remove the content. After being contacted by the organizations, two of them did indeed remove the sites from their servers (OLM and Globat), but one refused to do so (theplanet.com).869 Le Tribunal de Grande Instance de Paris ordered the American hosts to remove the site and identify the authors (the source, in Zittrain's terminology) behind the French-language site with a Post Office Box in Chicago,870 or face a 2,000-euro-a-day fine for each day they were late in complying with the order. The order was mainly directed at the host service that had not taken down the content.871 When it became clear that theplanet.com would not remove the site, the judge ordered the French service providers to block access to the site. This order was a first application of the E-Commerce Directive as it was incorporated into French law in the "Loi pour la Confiance dans l'Economie Numérique" (LCEN), which states that access providers (destination ISPs) should comply with a legal order to block content.872 The French ISPs have argued that blocking would be technologically difficult873 and have unsuccessfully appealed this

"L'Association Des Anciens Amateurs De Récits De Guerres Et d'Holocaustes." 869 Estelle Dumout, "Affaire AAARGH: Les Hébergeurs Américains Sommés De Bloquer Le Site," ZDNet.fr, April 22, 2005. . 870 "L'Association Des Anciens Amateurs De Récits De Guerres Et d'Holocaustes." 871 Estelle Dumout, "Affaire AAARGH: Les Hébergeurs Américains Sommés De Bloquer Le Site," ZDNet.fr, April 22, 2005. . 872 "French Court Issues Blocking Order to 10 ISPs," Digital Civil Rights in Europe: EDRI-Gram Newsletter, 3(12), June 15, 2005.

873 Estelle Dumout, "Stéphane Marcovitch, AFA: ‘Une Mesure De Filtrage Est Simplissime à Contourner’," ZDNet.fr, June 15, 2005.


decision.874 Counsel for the French ISPs has stated that the blocking order would be technologically impossible for the ISPs.875 This case is interesting in its dissimilarity from the approach taken in the Yahoo! case and the "J'accuse" case (see chapter five) in 2001 (before the LCEN was adopted), where the court had stated that ISPs are mere conduits that could not be burdened with the task of filtering content. Now, the court ruled that this burden could be put on local ISPs. In this case, the judge first asked the organizations to try to persuade the American host providers to remove the content, and when this was only partly successful, the court ordered the American service providers to remove the content and reveal the identity of the (French?) creators of the site. But as the court realized that this order did not have the desired effect, it had to move to plan C, and order French service providers to block the content instead.

Another important distinction between this case and other cases discussed in previous chapters is that, in this case, the site clearly targeted France with its content, which could be inferred from the language and the reference to local information (articles from French papers). Based on the representational sovereignty concept, France is justified in trying to regulate out-of-state actors targeting its citizens with illegal content. However, this approach is not efficacious. Even though two of the three contacted host providers in the United States complied, one did not, so the content of the Web site is still available on the Internet. However, the fact that the site was only hosted on one server in the United States made it easier to have the site blocked by French destination ISPs. Blocking orders also pose problems, mainly because they are layer violating, as will be discussed below. In Germany, blocking orders also have been the topic of much legal and political debate since the blocking order in Nordrhein-

874 "Affaire AAARGH: pas de révision de la solution," Le Forum des Droits sur l'Internet, November 28, 2006.

875 Philippe Crouzillacq, "Les FAI Mis En Cause Pour l'Accès à Un Site Révisionniste," 01.net, May 31, 2005.


Westfalen discussed in chapter five was issued. Just like the French authorities in the case discussed above, German authorities had unsuccessfully tried to convince the American source ISPs to remove the Web sites and to identify the content providers, who used code names and trick addresses.876 Like its French counterpart, a court in Germany ordered German destination ISPs to block access to certain hate sites hosted in the United States. ISPs located in Nordrhein-Westfalen have argued that blocking requires technical, economic and personnel expenditure on their part, but attempts to overturn the blocking order in court have been unsuccessful.877 In 2004, the German parliament acknowledged that blocking is bad policy, but it was unwilling to enact a motion that would ban blocking orders.878 No other German state has issued a similar blocking order since. In 2002, the European Parliament also declared almost unanimously that blocking is not the solution to regulating Internet content, because it could lead to a fragmented Internet or the blocking of legal content.879

The Safer Internet Programme also does not discuss destination ISP blocking as an option.880 The Safer Internet Action Plan, now called the Safer Internet Plus Programme, is a European Union sponsored program that "aims to promote safer use of the Internet and new online technologies, particularly for children, and to fight against illegal content and content unwanted by the end-user, as part of a coherent approach by the European Union."881 Under the plan, numerous studies have been financed, mainly on filtering technologies that could be used by end users in combination with education, awareness raising, self-regulation and rating schemes.

Pascal H. Schumacher, "Fighting Illegal Internet Content -- May Access Providers be Required to Ban Foreign Websites? A Recent German Approach," 8 Int'l J. Comm. L. & Pol'y (2003/2004) at §4. 877 Eric T. Eberwine, "Sound and Fury Signifying Nothing? Jürgen Büssow's Battle Against Hate Speech on the Internet," 49 N.Y.L. Sch. L. Rev. (2004/2005). 878 "German Parliament Debates Filtering Or Blocking," Digital Civil Rights in Europe: EDRI-Gram Newsletter, 2 (22), November 17, 2004.

879 Idem. 880 "Safer Internet Programme: What is Safer Internet?" February 28, 2006.

881 Idem.


Toward the creation of E-borders?

It would be inaccurate to assert that blocking at the level of the destination ISP is the one and only way that the European legal and political system has dealt with illegal content hosted in the United States. However, some have argued that such a blocking regime at the ISP level could be a way to regulate illegal content. In a law review note, Gerlach proposes a blocking regime in order to create so-called E-borders, which could resolve the problem of jurisdiction and enforcement of local laws by installing filters at the destination ISP level. This way, Gerlach argues, each country could erect its own E-borders within which it could enforce its laws. This proposal has the advantage that publishers could freely publish whatever they want, and it would be the responsibility of the countries banning certain speech to figure out a way to prevent illegal speech from "entering" the country: "Web publishers will be free to assume that if their website violates a foreign nation's laws, that nation's filters will block access to its website, and if that nation does not block access then the publisher is free to assume that the content is not illegal."882

Gerlach gives two suggestions for how this can be done. A filter could be installed at a central server, controlled by the government, through which all Internet traffic in the country flows. The other, more likely, solution he proposes is that filtering would be done by local ISPs, as most countries have a decentralized network with multiple access points.883 He argues that once these filters are set up and installed, the laws of each country would determine the scope of a nation's E-borders. This solution clearly satisfies the requirement that regulation not target out-of-state actors. Gerlach is correct in pointing out the increasing importance of ISPs in blocking illegal content, but as discussed above, making ISPs responsible for filtering may lead to overblocking. In addition, it also raises some practical issues.

882 Tim Gerlach, "Using Internet Content Filters to Create E-borders to Aid in International Choice of Law and Jurisdiction," 26 Whittier Law Review 899 (2005) at 913.
883 Ibid. at 914-915.


Destination ISP blocking: technological issues


In the United States as well, authorities have tried to force destination ISPs to block illegal content.884 In 2002, the Pennsylvania legislature enacted a law requiring an ISP to remove or disable access to child pornography “residing on or accessible through its server”885 upon receiving notice from the Pennsylvania attorney general. The Center for Democracy and Technology, the American Civil Liberties Union of Pennsylvania and Plantagenet (an ISP) challenged the law in court. They argued, among other things, that efforts to disable access to child pornography had led to overblocking in violation of the First Amendment and that the procedure spelled out in the law amounted to an unconstitutional prior restraint on speech, an argument with which the court would agree in striking down the law.886 The decision also contains an in-depth discussion of the different filtering techniques that exist. In order to be able to assess blocking at the level of the destination ISP as a method to eliminate illegal content, it is necessary to have a basic understanding of the main existing blocking methods. The court’s decision discusses the technological issues related to filtering in great depth, and some of these points are summarized below.

Filtering methods

DNS filtering. The domain name system (DNS) is the Internet service which, among other things, translates domain names into IP addresses. Domain names are usually easy-to-remember words identifying a Web site (for example www.sjmc.umn.edu). However, this is mainly for the convenience of the user, as Internet communication is based on numerical IP addresses, not words. Each time an Internet user enters a domain name, it needs to be translated into the IP address that corresponds with this domain name for the information to be sent over the network. A user's ISP provides servers that perform this DNS lookup function and match the URL to a specific IP address by looking

Center for Democracy & Technology v. Pappert 337 F. Supp. 2d 606 (E.D. Pa. 2004). 885 Ibid. at 610. 886 Ibid. at 611.


through a series of global databases (the DNS system) to determine the IP address of the Web server that can provide the requested Web pages. DNS filtering occurs when an ISP makes entries into the DNS servers under its control so that when a URL is received, the DNS server does not give the IP address corresponding with that URL, but returns an error message instead. The translation from URL to IP address is disrupted.

For the filtering issue, the important aspect of this set-up is that the relationship between URL and IP address is not one-to-one. A specific URL can only refer to one IP address, but many URLs can share the same IP address. Geocities, for example, like many other free hosting services, allows people to have Web sites on subpages of its domain. URLs of sites hosted by Geocities might have a format like this: www.geocities.com/subpage/homepage.html. When a request reaches a server that carries multiple sites, the server reads both the IP address and the URL, selects the requested document and returns it to the requesting computer.887 Plantagenet, one of the plaintiffs in this case, hosts about 160 to 170 Web sites on a single Web server with the same IP address.888 Research by the plaintiffs' expert in the case suggested that at least 50% of the domains on the Internet share an IP address.889

DNS blocking cannot make a distinction between subpages when they are all hosted on the same host machine with a shared IP address. As a result, DNS blocking blocks all subpages of a certain domain. It is obvious that this would lead to overblocking. For example, the University of Minnesota offers its students and staff the opportunity to use its server space to post a blog. The domain name of the site is "blog.lib.umn.edu." The blogs of students and staff are subpages of this domain, for example: http://blog.lib.umn.edu/vana0047/baslog/. Because DNS blocking occurs at the domain level, if a destination ISP were to attempt to block one specific University of Minnesota blog, it could block all the blogs on the system. In some instances individuals can also create a page as a subdomain of the main domain; for example, the Web site of the School

887 Ibid. at 618.
888 Idem.
889 Idem.


of Journalism at the University of Minnesota (“www.sjmc.umn.edu”) is a subdomain of the University’s Web site (“www.umn.edu.”) Unlike subpages, subdomains can be effectively blocked through DNS filtering without overblocking. DNS filtering stops requests for a specific domain name (including all the subpages), but pages that share an IP address but not a domain name are not blocked.890
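The resolver behavior described above can be sketched in a few lines of Python. The function below stands in for the DNS servers an ISP operates for its subscribers: it refuses to translate blacklisted host names, which blocks every page under that name, while a subdomain, being a different host name, still resolves. The host names, addresses and blacklist are invented for the illustration; a real resolver would query the global DNS system rather than a local table.

# Stand-in for DNS data; the address comes from a reserved documentation range.
FAKE_DNS_ZONE = {
    "www.example-host.org": "203.0.113.10",   # also serves /subpage/... documents
    "kids.example-host.org": "203.0.113.10",  # a subdomain sharing the same server
}

BLACKLISTED_NAMES = {"www.example-host.org"}  # entries added under a blocking order

def resolve(hostname):
    """Act like an ISP resolver that has been ordered to block certain domain names."""
    if hostname in BLACKLISTED_NAMES:
        raise LookupError("blocked by resolver policy")  # the browser just sees an error
    return FAKE_DNS_ZONE[hostname]

# Every URL under the blocked name fails, whatever the path:
#   http://www.example-host.org/anything  ->  LookupError
# but the subdomain, being a different name, still resolves:
print(resolve("kids.example-host.org"))   # prints 203.0.113.10

The two weaknesses noted in the decision are also visible here: a user who already knows the IP address, or who points her computer at a different DNS server, is not affected at all, while every subpage of a blocked name disappears along with the offending one.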


IP filtering. With IP filtering, on the other hand, the ISP does resolve the IP address corresponding with a specific URL, but after having done so, makes entries in routing equipment under its control so that all outgoing requests for that IP address are stopped.891 The difference with DNS filtering is that this filtering takes place at a "deeper" layer. Whereas DNS filtering can be circumvented if one knows the IP address of a certain site and types it into the browser, IP filtering effectively makes it impossible for a user to request certain information. IP blocking could also lead to overblocking: not only the subpages of a blocked site, but also the subdomains of a site would be blocked. For example, with DNS filtering one could block www.umn.edu without blocking www.sjmc.umn.edu; with IP filtering this would not be possible.

URL filtering. URL filtering is a method of filtering that allows for more targeted filtering at the HTTP application layer, so that specific URLs (including subpages) can be blocked. It requires a reconfiguration of the ISP's network so that it can reroute the packets flowing through its network, read each HTTP Web request (the URL) and block or discard the info if the URL is on a blacklist.
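As a rough illustration of the difference in granularity, the Python sketch below shows the check a rerouting proxy of the kind just described might perform on each HTTP request: the full URL, not just the host name or its IP address, is compared against a blacklist, so a single document can be refused while the rest of the site passes through. The blacklist entry is hypothetical, and a production system would have to apply this test to every request in transit, which is exactly the computational burden discussed below.

from urllib.parse import urlsplit

# Hypothetical blacklist of individual documents (host name plus path).
BLACKLISTED_URLS = {
    ("www.example-host.org", "/forum/banned-page.html"),
}

def allow_request(url):
    """Per-request decision made by the ISP's filtering proxy."""
    parts = urlsplit(url)
    return (parts.hostname, parts.path) not in BLACKLISTED_URLS

print(allow_request("http://www.example-host.org/forum/banned-page.html"))  # False
print(allow_request("http://www.example-host.org/index.html"))              # True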

890 Ibid. at 633.
891 Ibid. at 628.


Discussion

Most ISPs already use IP filtering as part of their security systems, for example to protect themselves against denial of service attacks. This kind of filtering would put relatively few burdens on ISPs. DNS filtering, on the other hand, would be somewhat more complicated and require ISPs to design new procedures to make entries into their DNS servers. URL filtering would be even more burdensome for most ISPs.892 While it is true that some ISPs already offer URL filtering as part of a parental control package, these packages are usually designed for only a small number of users. Expanding this filtering to all users would be very burdensome, pose huge practical problems for most ISPs and could slow the network down (unless extra investment in equipment were made): "comparing all of the URLs in the web traffic flowing through an ISP's network with a list of URLs to be blocked is 'expensive' in the computational sense - it requires a significant amount of computing power."893

DNS filtering can be circumvented, in some cases, if the user knows the IP address of a site and types that number directly into the browser. Also, not all users use their ISPs' DNS servers and therefore could be unaffected by DNS filtering. For example, big companies often have their own DNS servers, and individual users can redirect their computers to use DNS servers other than those of their ISP. DNS blocking also will overblock all subpages of a blocked domain name. IP blocking is even more likely to lead to overblocking, since it would overblock both subdomains and subpages.894 URL blocking, on the other hand, is much more precise, as one can set filters to block a specific document on any given Web site. When engaging in URL filtering, one is blocking specific documents, not a host.895 So it is possible to block a particular page on a Web site through URL blocking, without blocking the whole Web site.

This is important in the context of Solum and Chung's model, which argues against layer violations when engaging in Internet content regulation. The deeper the level at which one regulates (or better: the farther removed from the content layer), the more disruptive for Internet communication this is. For example, "cut the wire" regulation (at the hardware layer) blocks all data from all machines in a country or

892 Schumacher, discussing the German blocking order, seems to disagree and argues that IP blocking would be more burdensome than DNS blocking. (Pascal H. Schumacher, "Fighting Illegal Internet Content -- May Access Providers be Required to Ban Foreign Websites? A Recent German Approach," 8 Int'l J. Comm. L. & Pol'y (2003/2004) at §5.1.)
893 337 F. Supp. 2d 606 at 630.
894 Ibid. at 633.
895 Lawrence B. Solum and Minn Chung, "The Layers Principle: Internet Architecture and the Law," 79 Notre Dame L. Rev. 815 (2004) at 892.


geographical region and is the most layer violating way of regulating content; blocking an IP address (IP layer) blocks all data from a host machine or server; and blocking a URL (application layer) blocks all data contained in the document that is hosted on a server. When addressing offending conduct at the content layer, transparency impairment is most severe when the regulation is directed at the physical layer, less severe at the level of the IP layer, and lesser still when it happens at the application layer.896 But while URL blocking seems to be the least layer violating, it also seems to be the least feasible, because ISPs do not seem to be equipped to conduct this kind of filtering without incurring very high costs and without slowing down Internet communication. None of the service providers in Pennsylvania used URL filtering to comply with government blocking requests, and the blocking order issued in Nordrhein-Westfalen also suggested that ISPs use DNS filtering, and did not require that they employ URL or IP blocking.897

Web sites affected by a European blocking order, especially if they are located in the United States in order to avoid local laws, may take measures to nullify the effects of this order. On the Web site that the French court ordered blocked,898 links are listed to unblocked mirror sites and to anonymizer services that can be used to circumvent the blocking order. Naturally, the site itself is blocked to French users, or any users using French ISPs affected by the blocking order. It is possible that people who were already visiting the site still find ways to do so, as it is still online, but the exposure of this site has certainly been greatly reduced by these measures.

At first sight, despite the obvious problems regarding feasibility, layer violation and overblocking, blocking at the destination ISP seems to fulfill the representational concept of sovereignty, because it does not necessarily target out-of-state actors. However, this kind of blocking at the ISP level can have an indirect effect on out-of-state actors. The IP packet switching that makes Internet communication

896 Ibid. at 894.
897 Pascal H. Schumacher, "Fighting Illegal Internet Content -- May Access Providers be Required to Ban Foreign Websites? A Recent German Approach," 8 Int'l J. Comm. L. & Pol'y (2003/2004) at §5.1.
898 "L'Association Des Anciens Amateurs De Récits De Guerres Et d'Holocaustes."


possible relies on the availability of public routers all around the world; oftentimes data packets are routed through many countries before they arrive at their destination.899 ISPs do not only function as on- or off-ramps for data; they also serve as conduits of information that is neither hosted on one of their servers nor requested by their subscribers. However, if an ISP engages in IP filtering, a document requested from a host computer at a blocked IP address would get dropped. China, for example, blocks sites based on their IP address, and this could potentially have complications for Internet users outside China:

For example, any packet that is routed through the sites or networks blocked by Chinese authorities could get dropped at the gateways or blocking points, even if the final destination of the packet is not located inside China. This is because the routing is done on a hop-by-hop basis -i.e., for any router, the only thing it knows about a received packet is its destination address and the address of the immediately preceding router. A router typically is not aware of the address of the origination point. Since it is difficult to tell the geographical location of the destination on the basis of the IP address, the blocking might be simply based on the source IP address, in which case, packets routed through the blocked sites or networks will get dropped, even if the final destination of the packets is not located inside China. In effect, the routers in China would be committing a sort of routing fraud, by agreeing to route the packet, then dropping it in the middle. Although the routers themselves may not have been programmed intentionally to mislead, the resulting effect would be the same and this effect is a foreseeable consequence of the policy China would adopt. To the extent that China claims a sovereign right to control routers and networks located within its borders, China cannot avoid its culpability for the results of its specific blocking policy on the global Internet.900
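A toy simulation makes the hop-by-hop problem in this passage concrete. In the Python sketch below, with an invented three-router topology and documentation-range addresses, a transit router applies a source-address blocklist; a packet that merely passes through it on the way to a third country is silently dropped, which is exactly the collateral effect the quotation describes.

# Invented topology: A -> CN -> B, where the middle router applies a blocklist.
NEXT_HOP = {
    "router-A": "router-CN",
    "router-CN": "router-B",
    "router-B": None,          # delivers the packet locally
}
BLOCKLIST_AT = {"router-CN": {"203.0.113.10"}}  # source addresses dropped at this hop

def forward(packet, hop="router-A"):
    """Follow the packet hop by hop until it is delivered or dropped."""
    while hop is not None:
        if packet["src"] in BLOCKLIST_AT.get(hop, set()):
            return "dropped at " + hop   # transit traffic is lost here, unseen by the sender
        hop = NEXT_HOP[hop]
    return "delivered"

print(forward({"src": "203.0.113.10", "dst": "198.51.100.9"}))   # dropped at router-CN
print(forward({"src": "198.51.100.77", "dst": "198.51.100.9"}))  # delivered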

899 Lawrence B. Solum and Minn Chung, "The Layers Principle: Internet Architecture and the Law," 79 Notre Dame L. Rev. 815 (2004) at 907.
900 Ibid. at 908.
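The routing behavior described in this passage can be made concrete with a small illustration. The sketch below is purely hypothetical: the addresses, the blocklist and the filtering rule are invented for the example and are not taken from any actual router configuration. It simply shows how a gateway that filters on the source address alone will drop a packet that is merely transiting the country, even though the packet's final destination lies elsewhere.

    # Purely illustrative sketch of source-address filtering at a gateway.
    # The addresses and the blocklist are invented; real routers are
    # configured quite differently.
    BLOCKED_SOURCES = {"203.0.113.10"}   # an address blacklisted by the regulator

    def forward(packet):
        # Hop-by-hop forwarding: the gateway sees only the packet's source
        # and destination addresses, never its full path or country of origin.
        if packet["src"] in BLOCKED_SOURCES:
            return "dropped at gateway"
        return "forwarded to next hop"

    # A packet merely passing through the filtering country on its way to a
    # third country is dropped all the same.
    transit_packet = {"src": "203.0.113.10", "dst": "198.51.100.7"}
    print(forward(transit_packet))   # -> dropped at gateway

The "routing fraud" described in the quotation is visible in the last line: the gateway accepts the packet for forwarding and then silently discards it, regardless of where it was ultimately headed.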


DNS filtering does not seem to suffer from this drawback. DNS filtering is done by an ISP only for requests from its own subscribers, and it has no effect on Internet users who do not rely on that specific ISP's DNS servers. In the model of Solum and Chung, DNS-level blocking is also considered blocking at the application layer, and it is therefore preferable to IP blocking. DNS blocking prevents the correct translation of the domain name into the numerical IP address; it prevents the request from being delivered to the IP layer and does not affect the IP layer as such. As this short overview demonstrates, blocking can be effective to an extent, but it certainly is not an easy solution to implement. Research has shown that the blocking order in Germany led to overblocking: ISPs blocked domains that they were not required to block (such as kids.stormfront.org, a link that is no longer active) or blocked e-mail, which was also not required by the blocking order. On the other hand, some ISPs blocked only www.stormfront.org and forgot to block stormfront.org, as they were supposed to do.901 Some ISPs also seem to be better equipped than others to block at the destination ISP level. As discussed in chapter four, British Telecom, for example, voluntarily blocks access to blacklisted sites for all its subscribers, and does not seem to incur great technological difficulties in doing so. Blocking at the destination ISP can be relatively effective in making illegal content unavailable, though it is not a perfect solution. In some instances it is relatively easy to do (IP blocking), but in that case the blocking may be too broad and may interfere with the proper functioning of the network. In other instances (URL blocking) it can be more precise, but this seems to put too much of a burden on ISPs. DNS-level blocking, according to the sources consulted for this overview, seems to offer a third way. It does not interfere with the proper functioning of the network, but it does lead to some overblocking and requires some effort on the part of the ISPs, who would need to reconfigure their DNS servers.
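A minimal sketch may help explain why the misconfigurations reported in the German case are easy to make. It assumes a resolver that checks requested hostnames against a literal blacklist; the hostnames and addresses are used for illustration only, and real resolver software is configured through zone and policy files rather than written this way. The point is simply that a literal list must name every variant of a domain or requests will slip through, while listing too generously produces the overblocking described above.

    # Minimal sketch of DNS-level blocking at an ISP's resolver, assuming a
    # literal hostname blacklist. Hostnames and addresses are illustrative only.
    BLACKLIST = {"www.stormfront.org"}   # note: the bare domain is not listed

    ZONE = {
        "stormfront.org": "192.0.2.1",
        "www.stormfront.org": "192.0.2.1",
    }

    def resolve(hostname):
        # Refuse to translate blacklisted names; the request then never
        # reaches the IP layer at all.
        if hostname in BLACKLIST:
            return None
        return ZONE.get(hostname)

    print(resolve("www.stormfront.org"))   # None: blocked
    print(resolve("stormfront.org"))       # "192.0.2.1": slips through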

901 Richard Clayton, "Failures in a Hybrid Content Blocking System," PET 2005: 5th Workshop on Privacy Enhancing Technologies, Cavtat, Croatia, May 30-June 1, 2005.


Summary

The solutions discussed above are not intended to constitute an exhaustive list; not all conceivable solutions have been discussed. Some other approaches, such as requiring the voluntary participation of the content provider (for example, by rating content so it can be filtered out more easily) or of the recipient (installing filtering software on a voluntary basis), were not discussed here. The Council of Europe's Protocol, as well as numerous court decisions in Europe, seems to indicate that regulation through self-regulation or voluntary participation in rating schemes will not be accepted as adequate by European governments, despite the initiatives taken in this area by the European Union. The focus in this overview was on solutions that, if accepted, could be relatively successful. However, as discussed above, this success, or efficaciousness, is but one of the factors that ought to be considered. Solutions should be effective, should not be layer violating, and should be consistent with the representational concept of sovereignty. By "effective," we mean not only that a solution meets its goals (is efficacious), but also that it is not overinclusive, that it is feasible, and that the institution responsible for determining what content should be subjected to regulation is accountable to the public. Not surprisingly, no single solution will meet all of these criteria. However, based on the discussions in the previous chapters, some suggestions can be formulated about the kind of approach European countries could follow to enforce their hate speech laws on the Internet while respecting the layered structure of the Internet and the demands of the representational concept of sovereignty. Even though the Ninth Circuit Court of Appeals declined to rule on the enforceability of the French Yahoo! order, it seems unlikely that an order that affects the availability of speech in the United States would ultimately be enforceable, when or if the French anti-racism organizations make an effort to actually enforce it.902

902 Actually, LICRA appealed to the Supreme Court, arguing that the decision left the door open for Yahoo! to use American courts to avoid judgments from other countries. The Supreme Court denied their petition for writ of certiorari on May 31, 2006. ("Supreme Court Sidesteps International Yahoo! Case," USA Today, May 30, 2006.)


However, from their point of view, not trying to do so may be tactically wiser than overplaying their hand. The legal uncertainty perpetuated by this decision may create an environment in which ISPs in the United States are more willing to comply with requests from groups such as LICRA than would have been the case had the court unequivocally stated that the French judgment is unenforceable. Even though this decision should not be seen as a defeat for Yahoo!, it could potentially have a chilling effect on speech in the United States. As discussed in the previous chapter, when contacted by French anti-racism organizations, two out of the three American ISPs removed the revisionist AAARGH site. It is important to point out, however, that this site was in French and was clearly attempting to avoid French law. It is not inconceivable that this fact played a role in the decision of the two source ISPs to remove the site. The AAARGH case and the Yahoo! case are very different, because the AAARGH site clearly targets French users. ISPs may be more willing to comply with requests to remove material that is placed on their servers only to avoid local hate speech laws. European anti-racism organizations could appeal to American ISPs to include in their user agreements that content placed on their servers with the intent to avoid local laws is subject to removal. Given the legal ambiguity created by the Ninth Circuit Court of Appeals' decision, it is not unlikely that anti-racism groups would be successful in convincing some ISPs to do this. But if such a policy were in place, it could also mean that a blog from a Chinese human rights activist hosted in the United States would be subject to removal. However, ISPs could decide to narrow this requirement and, for example, state that only content designed to avoid local hate speech laws, or the laws of certain countries, would be subject to removal. (If the Global Online Freedom Act were to become law, American Internet services would not even be allowed to participate in repression by authoritarian regimes.) The problematic aspect of this approach is that ISPs could be overly eager to remove content on the mere basis that an anti-racism organization claims it is illegal. As we have discussed above, the weakness of notice and take down procedures can be that the person or entity providing notice may have a hidden agenda or may not be accountable to the public.


Therefore, ideally, ISPs should remove content only if they receive notice from authorities or from government-sanctioned, publicly accountable organizations that a site they host violates local laws, similar to the way Google decides to remove sites from its search results.903 ISPs could then determine, by looking at the language and content of the site, for example, whether it targets a given jurisdiction. This solution is hypothetical and depends on the willingness of American source ISPs to cooperate. But if European anti-racism organizations and/or European authorities could convince American ISPs that they are interested only in speech specifically targeting their country, then, in light of the status of the Yahoo! case, they may find that American ISPs would be willing to cooperate. To an extent, this is already happening. In the United States, the Anti-Defamation League, though not advocating a ban on hate speech, advocates that ISPs adopt "Terms of Service" or "Acceptable Use" policies that ban hate speech. The ADL has on numerous occasions informed hosts when they were hosting information in violation of those acceptable use policies,904 and the Simon Wiesenthal Center does the same, resulting in the removal of thousands of pages.905 By doing this, the ADL remains loyal to its pledge that it does not want to ban hate speech, while reducing its availability. So even for American hate sites it has become harder, though by no means impossible, to find server space. Hosting services such as http://1stamendment.org/index.php provide hosting specifically to controversial sites. Gerhard Lauck, the American neo-Nazi and Holocaust denier whose site was subjected to the blocking order in Nordrhein-Westfalen, even operates his own hosting company called zensurfrei.com (censor-free).

903
904 Brian Marcus, "Public and Private Partnership in the Fight Against Racism, Xenophobia and Anti-Semitism on the Internet--Best Practices," OSCE Meeting on the Relationship Between Racist, Xenophobic and Anti-Semitic Propaganda on the Internet and Hate Crimes, Paris, June 16, 2004. See also: Jane Bailey, "Private Regulation and Public Policy: Toward Effective Restriction of Internet Hate Propaganda," 49 McGill L.J. 59 (2004) at 92-93, for an example of ISPs taking a stance against hate speech.
905 David B. Caruso, "Report: Hate Groups Favor U.S. Internet Servers to Spread Bile," www.findlaw.com, May 4, 2006.


Lauck not only hosts European sites but also American hate sites that are being kicked off the servers of other ISPs for violating the "no hate" terms of service that numerous ISPs have in place.906 Lauck targets European (specifically German) clients who want to set up a Web site unfettered by German law:

ANONYMOUS WEB-SITES ARE POSSIBLE! The domain name is registered in the name of a U.S. firm. Even our firm does not need to know your identity. (Payment can be sent with an anonymous letter with reference to your web-site.) Political repression is increasing in Europe! European webmasters can reduce their risk by moving their websites to the USA! ZENSURFREI establishes your website with one of the largest and most reliable servers in the USA. Pay by the quarter or by the year. We accept Euro banknotes or US Dollar banknotes, no coins.


It's fast, easy and convenient! On request we will - without cost or obligation - check to see if your desired domain name (ending with .com, .net or .org) is still available. (It is best to tell us a few alternative domain names to check out at the same time.)907

However, even if this approach were to work, and if, as in the AAARGH case, two out of three American ISPs were willing to cooperate with European authorities (or with other countries with similar laws, such as Canada or Australia), or were in fact to ban all types of hate speech from their servers under pressure from local anti-hate speech groups, this would not mean that the amount of hate speech would decrease by 66%. More than likely, all those sites would move to the ISPs that still allow this kind of speech, and this speech would still be available online. (As discussed above, this already seems to be happening to an extent.) However, despite these drawbacks, this approach may still be worthwhile.

906 Brian Levin, "Because of the Constitution's First Amendment, the United States Now Hosts Hundreds of European Language Hate Sites," Southern Poverty Law Center: Intelligence Report (2003).

907 The site is in English and German.


First of all, it would make it harder for people who try to circumvent their own local laws by hosting their sites in the United States to find server space. And secondly, it could also be beneficial if most hate sites targeting other jurisdictions were concentrated on a limited number of servers, such as Lauck's server space. This could make blocking at the destination ISP level more efficacious (see below). The availability of hate speech can also be greatly reduced by regulating search engines. As discussed above, local versions of search engines can be regulated, and Google and Yahoo!, for example, comply with demands of authorities to remove illegal content. Of course, this does not eliminate Web sites from the Internet. Users do not access Web sites only through search engines but also through social networks, hyperlinks or by typing in a URL. However, European hate speech law mainly tries to prevent the spread of hate speech among the population. For European lawmakers, it is not (as) troublesome if, for example, some neo-Nazis come together in a private setting to engage in hate speech, as this is not likely to spread racism or incite racial hatred. But it would be more troublesome if they did the same in a public park. By the same token, if a limited number of die-hard neo-Nazis find an isolated place on the Internet (isolated because it would be filtered out of search engine results) that is effectively unavailable to most other Internet users, an important policy goal would be met. Regulating local search engines could prevent a fourteen-year-old student writing a report on Anne Frank from being linked to a Web site stating that her diary and the Holocaust are Zionist myths. Even if Google's non-local search engine remains available, the default setting is to guide users to the local version of Google. From this perspective, removing hate sites from local search engine results would further important regulatory goals. As mentioned before, given the important role search engines can play in furthering European regulatory goals, it is regrettable that the E-Commerce Directive did not address the legal liability of search engines, as the DMCA did for search engines in the United States. This issue also shows that geolocation can play an important role in the fight against hate speech. Google automatically redirects users from certain countries to their local versions based on IP recognition. However, it is still possible for those users to go back to google.com by clicking on a link.
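The redirection described in the preceding paragraph can be sketched in a few lines. The example below is a simplification offered for illustration only: the address ranges, country assignments and portal names are assumptions, and commercial geolocation services rely on large, regularly updated IP databases rather than a hand-written table.

    # Hypothetical sketch of IP-based redirection to a local search portal.
    # Address ranges and portals are invented for the example.
    import ipaddress

    COUNTRY_BY_PREFIX = {
        "192.0.2.0/24": "FR",
        "198.51.100.0/24": "DE",
    }
    LOCAL_PORTAL = {"FR": "google.fr", "DE": "google.de"}

    def portal_for(client_ip):
        # Map the visitor's address to a country, then to the local portal
        # whose results have been filtered to comply with local law.
        addr = ipaddress.ip_address(client_ip)
        for prefix, country in COUNTRY_BY_PREFIX.items():
            if addr in ipaddress.ip_network(prefix):
                return LOCAL_PORTAL.get(country, "google.com")
        return "google.com"

    print(portal_for("192.0.2.55"))    # -> google.fr
    print(portal_for("203.0.113.9"))   # -> google.com (no match, default)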


European lawmakers could also force their ISPs to block certain sites at the destination ISP level. As discussed above, these kinds of blocking orders are not unproblematic. Attempts to do so have been met with great resistance by ISPs in Germany and France. However, in the United Kingdom, one ISP has voluntarily set up a system to block specific URLs of Web pages containing child pornography, apparently without raising technological problems. But with this kind of voluntary filtering regime, it remains unclear on what basis the sites that are to be blocked are selected. It is important that the watchdog organization flagging content works in a transparent way and is accountable. The German and French blocking orders, as flawed as they may be, are at least limited to sites that public authorities have determined to violate the law. It lies outside the scope of this work to assess the costs for ISPs of implementing the technology that would make URL blocking feasible. But the fact that the order in Germany did not require ISPs to block at the URL level seems to indicate that authorities acknowledge that this is not a realistic option for most ISPs. Some Web sites contain hundreds of pages, and blocking at the URL level would mean that every page would have to be blocked separately. The findings of the court in the case of the Pennsylvania blocking order also seem to suggest that for most ISPs, URL blocking would be extremely burdensome. As discussed above, DNS blocking may be a more feasible blocking method, but one that could lead to overblocking, as it cannot block at as specific a level (a particular URL) as URL blocking. However, if, through negotiation and/or pressure, anti-hate speech organizations and authorities could convince American ISPs not to host hate sites, or at least to remove hate sites that specifically target a European country, the result would be that more and more hate sites would reside on a limited number of servers. As discussed, this is already the case to an extent. While overblocking is a concern with IP and DNS blocking, and should be avoided, it would be less of a problem if more hate sites were found on the same servers. The reason that overblocking is a main concern is that in many cases, in order to block a specific site, many innocent sites that share the same domain would also be blocked. This would be especially troublesome if hate sites resided on domains such as geocities.com, where they share server space with many "innocent" sites.
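The difference in granularity at issue here can be shown schematically. The hostnames and pages below are invented; the sketch merely contrasts blocking a single URL with blocking an entire domain on which unrelated pages also reside.

    # Invented example contrasting URL-level and domain-level (DNS) blocking.
    PAGES = {
        "geocities.example.com/poetry": "innocent",
        "geocities.example.com/recipes": "innocent",
        "geocities.example.com/hate-page": "illegal under local law",
        "dedicated-hate-host.example.net/page-1": "illegal under local law",
        "dedicated-hate-host.example.net/page-2": "illegal under local law",
    }

    def blocked_by_url(target_url):
        # URL blocking removes exactly one page.
        return [p for p in PAGES if p == target_url]

    def blocked_by_domain(domain):
        # Domain-level blocking removes every page on the host, innocent or not.
        return [p for p in PAGES if p.split("/")[0] == domain]

    print(blocked_by_url("geocities.example.com/hate-page"))       # 1 page, no collateral damage
    print(blocked_by_domain("geocities.example.com"))              # 3 pages, 2 of them innocent
    print(blocked_by_domain("dedicated-hate-host.example.net"))    # 2 pages, none innocent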


However, if hate sites are being "chased" away from those common areas because of acceptable use policies and/or pressure from anti-racism groups, and congregate on server space provided by host providers that "specialize" in controversial speech, the "collateral damage" of blocking orders may be smaller. Blocking orders against domains or ISPs that act as portals of hate speech and that advertise themselves specifically to European Web masters as providing a safe haven for content that would be illegal under local laws would arguably not lead to troublesome overblocking. Consider the German blocking order against stormfront.org, for example. Stormfront functions more or less as a gigantic bulletin board with thousands of threads and hundreds of thousands of messages about all kinds of issues relating to white power. Not every message or thread is likely to be in direct violation of German law. So in the strict sense, this (DNS) blocking order may block more content than needed. But this is the kind of overblocking that may be unavoidable. It would simply be impossible to monitor all the messages and threads posted on Stormfront and to determine one by one whether each should be blocked. Given that the theme of this board is white power, however, it seems likely that most of the topics and content would violate German law. But if, instead of being contained within its own community, these discussions regarding white power and racial inequality were held on a general discussion board about politics, it would be much harder to target the specific discussion threads with blocking orders at the destination ISP level without also blocking valuable speech that is totally unrelated to white power. The fact that many online communities do not tolerate the presence of hate speech, forcing such speakers to isolate themselves on specific servers, may make blocking at the destination ISP level a more feasible option. If hate sites could be isolated on specific servers or locations on the Internet, other measures, such as filtering technology, could also become more efficient. Blocking orders should not be used gratuitously, but if European countries are serious about restricting hate speech, blocking orders at the destination level could be a valuable option in certain instances. Such orders could be burdensome for ISPs, but content regulation always imposes a burden on those who are responsible for implementing it. That is not a reason in itself not to issue such an order.


It was not a consideration in the Yahoo! case when the French judge ordered Yahoo! to install geolocation blocking software. The E-Commerce Directive explicitly states that ISPs can be required to perform such blocking. So what strategy should European governments and anti-racism groups follow to reduce the influx of online hate speech onto the computer screens in their countries? First of all, they should continue to regulate hate speech originating from within their own borders. Hotlines may be a good way to do this. Hotlines allow Internet users to report illegal content on the Internet to a watchdog organization, which then investigates the content. If the material is in fact found to be illegal and is hosted on a local ISP, this ISP is notified and will then remove the content. This system is based on self-regulation of the industry and depends on the willingness of ISPs to participate. However, given the liability regime provided for ISPs by the E-Commerce Directive, it seems likely that they would respond to such a notice with a takedown if they want to escape liability. As discussed elsewhere in this work, this could lead to overblocking. It is therefore important that the organization that serves notices to the ISPs does so only if the content is in fact likely to be illegal. The British Internet Watch Foundation has been criticized for its lack of transparency and the lack of clear criteria that it uses. However, this does not have to be so, and hotlines could be set up so that they are accountable and their criteria are clear. External reviews could be performed to assess whether they flag more content than is likely to be illegal. Installing hotlines at the local level, regulating search engines, relying on the efforts of American anti-hate speech organizations to have ISPs enact and enforce policies banning hate speech, continuing to seek cooperation with major American ISPs to remove content that violates local laws, and in some instances blocking hate sites at the destination ISP level: this strategy could in the short term prove successful for European countries in upholding their hate speech laws online without infringing on other countries' sovereignty or interfering with the network's layered structure. It will not guarantee that every site, every blog and every discussion board on which content is placed that is illegal in a European country will be made inaccessible to that country's Internet users, but it could mean a great reduction in the availability of hate speech in these countries. This approach is also directed towards limiting exposure to hate sites specifically targeting Europe from American servers.


As discussed in chapter three, most European hate speech law sees hate speech as a corrupting poison that could destabilize the democratic order and the public peace. It is more likely that this kind of targeted speech (speech targeting a specific population) would bring that effect about, provided one accepts the notion that speech can in fact be responsible for those outcomes. This approach also requires that European nations select their battles more carefully. Part of the criticism of the French Yahoo! decision may have arisen from the fact that most people felt that, ultimately, the mere display of Nazi memorabilia, in an auction context, mostly for American users, did not have much to do with France and did not have much of a negative effect there, even though it was illegal according to French law. Had the Yahoo! case dealt with a more virulent type of hate speech targeting France, hosted by an ISP refusing to remove the content, the debate might have been totally different. As a result, the much more compelling AAARGH case never received the attention it deserved from media and observers. European authorities and anti-hate speech groups will have to realize that it is simply impossible to remove all content that violates hate speech law, acknowledge that certain hate sites are more troublesome than others, and focus on the gravest offenders of local hate speech laws. The order in the Yahoo! case was unclear, so that even judges and commentators disagreed about what the effects of its implementation would be on American audiences. Also, Yahoo! was targeted by the order because it hosted content posted by third-party users. Given the protection given to ISPs by section 230 of the CDA, Yahoo! could claim that it cannot be held responsible for third-party content if attempts were made to enforce the order in the United States. When American legislators unsuccessfully tried to limit pornographic material to adults through the Communications Decency Act and the Child Online Protection Act, they always targeted the content providers (source) and not the service providers (source ISP). Even though a certain reading of its decision in the Yahoo! case might suggest that, for the Ninth Circuit Court of Appeals, a blocking order that would not affect Americans might not be unconstitutional, the problem remains that actual blocking of users from certain countries has to be done by system administrators at the IP layer. A blogger using a free blogging service, for example, cannot block users based on geography through IP-based geoblocking.
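To make this last point concrete: geoblocking of the kind the French court required has to be implemented by whoever administers the server or the network in front of it, roughly along the lines sketched below. The address range attributed to France is an assumption made for the example; real deployments consult geolocation databases of limited accuracy, as discussed in earlier chapters.

    # Illustrative sketch of server-side geoblocking at the IP layer.
    # The "French" address range is an assumption made for the example;
    # real systems consult commercial geolocation databases.
    import ipaddress

    BLOCKED_RANGES = [ipaddress.ip_network("192.0.2.0/24")]   # assumed French range

    def handle_request(client_ip):
        # Only an operator with control over the server (or the network in
        # front of it) can insert a check like this one.
        addr = ipaddress.ip_address(client_ip)
        if any(addr in net for net in BLOCKED_RANGES):
            return "403 Forbidden: content unavailable in this jurisdiction"
        return "200 OK: content served"

    print(handle_request("192.0.2.14"))     # refused
    print(handle_request("198.51.100.3"))   # served

A user of a hosted blogging service has no comparable hook at which to insert such a check, which is why orders of this kind necessarily fall on the operator of the service rather than on the individual speaker.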


On the other hand, perhaps a European country's order forcing a hate portal such as stormfront.org to prevent users from that country from accessing the portal could be enforceable in the United States. However, this is far from certain and would probably give rise to numerous court battles. The Yahoo! case took five years to make its way through the court system and never even made it to the enforcement stage. It goes without saying that this is an eternity in Internet years. Some have argued for international adoption of a targeting approach on the Internet, under which countries would recognize judgments of foreign courts against local Web sites if jurisdiction is claimed on the basis of targeting.908 However, absent such an agreement, European nations and anti-hate groups trying to keep hate away from their borders might face an uphill battle in the American court system. Lastly, European lawmakers should also make a greater effort to define the problem. There is no question that many foreign sites are located on American servers to find protection from their local laws behind the First Amendment. At the European level, however, few efforts seem to have been made to define or map the exact extent of hate on the Internet, specifically of hate speech targeting Europe from abroad. In the literature reviewed for this work, very few specifics were found as to the extent, impact and nature of hate sites hosted in the United States targeting European countries. Although organizations such as the Anti-Defamation League, the Simon Wiesenthal Center and the Southern Poverty Law Center keep track of American Internet hate sites, no such tracking seems to exist for European hate sites hosted in the United States or elsewhere. As a result, the Protocol to the Convention on Cybercrime risks being a very imprecise solution to a problem whose scope is unclear. More research should be done by European institutions and organizations on the extent of hate speech on the Internet. More research should also be devoted to describing and defining trends in online hate speech, and a specific strategy should be developed in response to these trends. The suggestions made above are an attempt to do so, but a more specific and effective strategy could be developed if more research were done into the specific nature of online hate speech hosted abroad and the effects it has in European countries.

908 Holger P. Hestermeyer, "Personal Jurisdiction for Internet Torts: Towards an International Solution?" 26 NW. J. Int'l L. & Bus. 267 (2006) at 286-287.


Concluding Remarks

This book dealt with Internet law. As discussed in the opening sections of chapter five, there was a time when Internet law referred to the law of the Internet, implying that the Internet mandated a separate law, one based on a different set of assumptions and philosophies than the law of the "outside world." That is not the sense in which this research project falls under the category of Internet law. The bigger legal question guiding this research is how the Internet affects the law. While the Internet may not mandate a separate jurisprudence, it does present some unique challenges that sometimes make this medium difficult to regulate. One of these challenges, its global reach, makes it hard for any particular country to enforce its laws on this medium. While enforcement of local laws on a global medium can be cumbersome, this is a problem of a practical nature that has not challenged countries' interpretation of what constitutes legal and illegal content, at least not in the area of hate speech. Hate speech is an issue on which, as many observers have noted, the United States has taken a very different approach than other Western democracies. It therefore serves as an excellent case study to illustrate the unique characteristics of American free speech doctrine, even when compared to countries that have similar political and cultural traditions. It is also a good test case to determine whether or not the Internet forces a rethinking of traditional speech law. In chapter two, we discussed how attempts to make this argument in the United States are not convincing within a First Amendment paradigm. The reasons for this are manifold and will not be reiterated here, but the features of the Internet that make it a more "dangerous" vehicle of hate speech are exactly the characteristics that give this medium a high level of First Amendment protection under traditional jurisprudence. Proponents of hate speech restriction who refer to the accessibility, anonymity, lack of control, reach, availability and "amplifying effect" of the Internet as arguments for strict regulation of the dissemination of hate online are fighting a losing battle.


As long as they cannot convincingly show how these features fundamentally change the nature of the information communicated, courts will be reluctant to put otherwise constitutionally protected speech outside the scope of the First Amendment on the basis of the medium through which it is communicated. As discussed in chapters one and two, some slight shifts in American hate speech regulation are taking place, but these shifts have to do with the effects and the intent/context of hate speech, not so much with the medium through which the speech is communicated or with the content of the speech. In fact, it is somewhat ironic that in high-profile "Internet hate" cases such as the Planned Parenthood and James Baker cases, which generated numerous scholarly articles regarding online hate, the courts mentioned only in passing the fact that this speech was taking place on the Internet. By the same token, as discussed in chapters three and four, the European countries discussed here have also not changed their approach to hate speech, despite the problems they face in enforcing their laws on the Internet. The Additional Protocol to the Convention on Cybercrime, while perhaps too watered down to be effective in harmonizing hate speech laws among Council of Europe member states, can at the very least be seen as a sign of the resolve of European countries to enforce their hate speech laws on the Internet. The first four chapters of this work served as a descriptive account of how hate speech laws have been carried over from the offline to the online environment in the United States and Europe. This did not always happen seamlessly, and at times lawmakers and courts displayed a lack of understanding of the new medium in their policies and rulings. But hate speech law on both sides of the Atlantic is rooted in fundamental values and political beliefs that were not fundamentally challenged or uprooted by the emergence of the Internet. Whereas the first four chapters dealt mainly with the question of how, if at all, the Internet challenges law, the last two chapters shifted focus to the question of how law may affect the Internet. As discussed in chapter four, European countries have made various attempts during the last ten years to have their laws enforced. While the Internet has not changed these countries' approach towards hate speech, it has made it difficult for them to enforce their laws. These attempts to regulate the Internet, as Lessig pointed out, are not without consequence and could alter the architecture of the Internet. However, the Internet does possess characteristics that, even if they do not constitute the "nature" of the Internet, have been important in its development. Regulatory attempts could alter these architectural features.


The normative framework and subsequent analysis in chapters five and six reflect this concern with how these regulatory attempts could alter the Internet as we know it. Hate speech is but one area, and not even the most prominent one, in which Internet regulation could alter the structure of the Internet. Change is not always bad, but sometimes it is. It is the task of communication researchers and legal scholars to critically evaluate the changes the Internet is undergoing because of legal, political and business influences. This book contributes to that endeavor.


Bibliography


Academic sources

"A Communitarian Defense of Group Libel Laws." Harvard Law Review 101 (January 1988): 682-701. "Cyberspace Regulation and the Discourse of State Sovereignty, Developments; the Law of Cyberspace." Harvard Law Review 112 (May 1999): 1680-1704. Baistrocchi, Pablo Asbo. "Liability of Intermediary Service Providers in the EU Directive on Electronic Commerce." Santa Clara Computer and High Technology Law Journal 19 (December 2002): 111-130. Balkin, Jack M. "Some Realism about Pluralism: Legal Realist Approaches to the First Amendment." Duke Law Journal 1990 (June 1990): 375-429. Ballard, Franklin. "Turnabout is Fair Play: Why a Reciprocity Requirement should be Included in the American Law Institute's Proposed Federal Statute." Houston Journal of International Law 28 (Spring 2006): 199-238. Belmas, Genelle Irene. "Cyberliberties and Cyberlaw: A New Approach to Online Legal Problem-Solving." Dissertation (2002). Birnhack, Michael D., and Jacob H. Rowbottom. "Do Children have the Same First Amendment Rights as Adults? Shielding Children: The European Way." Chicago-Kent Law Review 79 (2004): 175-227. Blakey, Robert G., and Brian J. Murray. "Threats, Free Speech, and the Jurisprudence of the Federal Criminal Law." Brigham Young University Law Review (2002): 829-1119. Blasi, Vincent. "The Checking Value in First Amendment Theory." American Bar Foundation Research Journal 1977 (1977): 521-649. Bollinger, Lee C. The Tolerant Society: Freedom of Speech and Extremist Speech in America. New York: Oxford University Press, 1986. Borchers, Patrick J. "Personal Jurisdiction in the Internet Age: Internet Libel: The Consequences of a Non-Rule Approach to Personal Jurisdiction." Northwestern University Law Review 98 (Winter 2004): 473-492. Brannon, Chris L. "Constitutional Law--Hate Speech--First Amendment Permits Ban on Cross Burning when done with the Intent to Intimidate." Mississippi Law Journal 73 (Fall 2003): 323-345. Brugger, Winfried. "Protection of Hate Speech? Some Observations Based on German and American Law." The Tulane European and Civil Law Forum 17 (2002): 1-21. Brugger, Winfried. "The Treatment of Hate Speech in German Constitutional Law (Part II)." German Law Journal (4) No. 1 (January 2003). Available from http://www.germanlawjournal.com/article.php?id=225


Brugger, Winfried. “The Treatment of Hate Speech in German Constitutional Law (Part I).” German Law Journal (3) No. 12 (December 2002). Available from http://www.germanlawjournal.com/article.php?id=212 Bunker, Matthew. Critiquing Free Speech: First Amendment Theory and the Challenge of Interdisciplinarity. Mahwah, N.J.: Lawrence Erlbaum Associates, 2001. Burch, Edgar. "Censoring Hate Speech in Cyberspace: A New Debate in a New America." North Carolina Journal of Law & Technology 3 (Fall 2001): 175-192. Catlin, Scott J. "A Proposal for Regulating Hate Speech in the United States: Balancing Rights under the International Covenant on Civil and Political Rights." Notre Dame Law Review 69 (1994): 771-813. Congdon, Amanda J. "Burned Out: The Supreme Court Strikes Down Virginia's Cross Burning Statute in Virginia v. Black." Loyola University Chicago Law Journal 35 (Summer 2004): 1049-1116. Delgado, Richard, Charles R. Lawrence III, Mari J. Matsuda, and Kimberlè Crenshaw. "Introduction." In Words that Wound : Critical Race Theory, Assaultive Speech, and the First Amendment. Boulder, Col.: Westview Press, 1993, 17-52. Delgado, Richard, and Jean Stefancic. Must we Defend Nazis?: Hate Speech, Pornography, and the New First Amendment. New York: New York University Press, 1997. Delgado, Richard. "Words that Wound: A Tort Action for Racial Insults, Epithets, and Name Calling." In Words that Wound: Critical Race Theory, Assaultive Speech, and the First Amendment. Bolder, Col.: Westview Press, 1993: 89-110. Douglas-Scott, Sionaidh. "The Hatefulness of Protected Speech, a Comparison of the American and European Approaches." William & Mary Bill of Rights Journal 7 (February 1999): 305-346. Downs, Donald A. Nazis in Skokie : Freedom, Community, and the First Amendment. Notre Dame, Ind.: University of Notre Dame Press, 1985. Eberwine, Eric T. "Sound and Fury Signifying Nothing? Jürgen Büssow’s Battle Against Hate Speech on the Internet." New York Law School Law Review 49 (2004/2005): 353-410. Eko, Lyombe. “New Medium, Old Free Speech Regimes: The Historical and Ideological Foundations of French & American Regulation of BiasMotivated Speech and Symbolic Expression on the Internet.” Loyola International and Comparative Law Review 28.3 (2006): 69-127. Elrod, Jennifer. "Expressive Activity, True Threats, and the First Amendment." Connecticut Law Review 36 (Winter 2004): 541-608. Etzioni, Amitai. The Limits of Privacy. New York: Basic Books, 1999. Fagin, Matthew. "Regulating Speech Across Borders, Technology vs. Values." Michigan Telecommunications and Technology Law Review 9 (Spring 2003): 395-455.


Farrior, Stephanie. "Molding the Matrix: The Historical and Theoretical Foundations of International Law Concerning Hate Speech." Berkeley Journal of International Law 14 (1996): 3-98. Fisch, William B. "American Law in a Time of Global Interdependence: U.S. National Reports to the XVIth International Congress of Comparative Law: Section IV Hate Speech in the Constitutional Law of the United States." American Journal of Comparative Law 50 (Fall 2002): 463-492. Freedman, Monroe H., and Eric M. Freedman. Group Defamation and Freedom of Speech: The Relationship between Language and Violence. Westport, Conn.: Greenwood Press, 1995. Friedman, Lawrence. "Regulating Hate Speech at Public Universities After R.A.V. v. City of St. Paul." Howard Law Journal 93 (1993): 1-30. Fritts, Emily K. "Internet Libel and the Communications Decency Act: How the Courts Erroneously Interpreted Congressional Intent with Regard to Liability of Internet Service Providers." Kentucky Law Journal 93 (2004/2005): 765-785. Frydman, Benoît, and Isabelle Rorive. "Strategies to Tackle Racism and Xenophobia on the Internet--Where are we in Europe?" International Journal of Communications Law and Policy 23 (Winter 2002/2003): 8 (no end page noted in Lexis-Nexis document). Frydman, Benoît, and Isabelle Rorive. "Regulating Internet Content through Intermediaries in Europe and the USA." Zeitschrift Für Rechtssoziologie 23, no. 1 (2002): 41-59. Garnett, Nathan W. "Dow Jones & Co v. Gutnick: Will Australia's Long Jurisdictional Reach Chill Internet Speech World-Wide?" Pacific Rim Law & Policy Journal 13 (January 2004/2003): 61-89. Geist, Michael A. "Is there a there there? Toward Greater Certainty for Internet Jurisdiction." Berkeley Technology Law Journal 16 (Fall 2001): 1345-1407. Gerlach, Tim. "Using Internet Content Filters to Create E-Borders to Aid in International Choice of Law and Jurisdiction." Whittier Law Review 26 (Spring 2005): 899-927. Garibian, Sevane. "Law, Identity and Historical Memory in the Face of Mass Atrocity Conference: Taking Denial Seriously: Genocide Denial and Freedom of Speech in the French Law." Cardozo Journal of Conflict Resolution 9 (Spring 2008): 479-488. Goldsmith, Jack L. "The Internet and the Abiding Significance of Territorial Sovereignty." Indiana Journal of Global Legal Studies 5 (Spring 1998): 475-491. Graber, Mark A. "Old Wine in New Bottles: The Constitutional Status of Unconstitutional Speech." Vanderbilt Law Review 48 (March 1995): 349-389. Greenawalt, Kent. Speech, Crime, and the Uses of Language. New York: Oxford University Press, 1989.


Greenberg, Marc H. "A Return to Lilliput: The LICRA v. Yahoo! Case and the Regulation of Online Content in the World Market." Berkeley Technology Law Journal 18 (Fall 2003): 1191-1258. Gurak, Laura J. Cyberliteracy: Navigating the Internet with Awareness. New Haven, Conn.: Yale University Press, 2001. Gurak, Laura J. Persuasion and Privacy in Cyberspace: The Online Protests Over Lotus Marketplace and the Clipper Chip. New Haven, Conn.: Yale University Press, 1997. Hagan, Melanie C. "The Freedom of Access to Clinic Entrances Act and the Nuremberg Files Web Site: Is the Site Properly Prohibited Or Protected Speech?" Hastings Law Journal 51 (January 2000): 411-444. Hammack, Scott. "The Internet Loophole: Why Threatening Speech on-Line Requires a Modification of the Courts' Approach to True Threats and Incitement." Columbia Journal of Law and Social Problems 36 (Fall 2002): 65-102. Haupt, E. Claudia. "Regulating Hate Speech: Damned if You Do and Damned if You Don’t : Lessons Learned from Comparing the German and U.S. approaches." Boston University International Law Journal 23 (Fall 2005): 299-335. Henn, Julie L. "Targeting Transnational Internet Content Regulation." Boston University International Law Journal 21 (Spring 2003): 157-177. Heyman, Steven J. Hate Speech and the Constitution. New York: Garland Publishing, 1996 Hunter, Dan. "Cyberspace as Place and the Tragedy of the Digital Anticommons." California Law Review 91 (March 2003): 439-519. Hurdle, Melody L. "R.A.V. v. City of St. Paul: the Continuing Confusion of the Fighting Words Doctrine." Vanderbilt Law Review 47 (May 1994): 11431174. Koenig, Thomas H., and Michael Rustad. In Defense of Tort Law. New York: New York University Press, 2001. Lasson, Kenneth. "To Stimulate Provoke Or Incite, Hate Speech and the First Amendment." In Group Defamation and Freedom of Speech. Westport, Conn: Greenwood Press, 1995, 267-307. Johnson, David R., and David Post. "Law and Borders - the Rise of Law in Cyberspace." Stanford Law Review 48 (May 1996): 1367-1402. Jones, Thomas David. Human Rights: Group Defamation, Freedom of Expression, and the Law of Nations. The Hague; Boston; Cambridge, Mass.: Martinus Nijhoff Publishers; Sold and distributed in the U.S.A. and Canada by Kluwer Law International, 1997. Komasara, Tiffany. "Planting the Seeds of Hatred, Why Imminence should no Longer be Required to Impose Liability on Internet Communication." Capital University Law Review 29 (2002): 835-855. Krause, Jason. "Casting A Wide Net: Search Engines Yahoo and Google Tussle with Foreign Courts Over Content." ABA Journal 88 (November 2002): 20.


Krotoszynski, Ronald J. Jr. "A Comparative Perspective on the First Amendment: Free Speech, Militant Democracy, and the Primacy of Dignity as a Preferred Constitutional Value in Germany." Tulane Law Review 78 (May 2004): 1549-1609. Lawrence, Charles III. "If He Hollers, Let Him Go, Regulating Racist Speech on Campus." Duke Law Journal 1990 (June 1990): 431-483. Lessig, Lawrence. "The Architecture of Innovation." Duke Law Journal 51 (April 2002): 1783-1801. Lessig, Lawrence. Code and Other Laws of Cyberspace. New York, N.Y.: Basic Books, 1999. MacKinnon, Catharine A. Only Words. Cambridge, Mass: Harvard University Press, 1993. Mailland, Jullien. "Freedom of Speech, the Internet and the Cost of Control: The French Example." Journal of International Law and Politics 33 (Summer 2001): 1179-1234. Mannheimer, Michael J. "The Fighting Words Doctrine." Columbia Law Review 93 (October, 1993): 1527-1571. Marsh, Elizabeth P. "Purveyors of Hate on the Internet: Are we Ready for Hate Spam?" Georgia State University Law Review 17 (Winter 2000): 379-407. Massaro, Toni M. "Free Speech and Religious, Racial, and Sexual Harassment: Equality and Freedom of Expression: The Hate Speech Dilemma." William & Mary Law Review 32 (Winter 1991): 211-265. Massey, Calvin R. "Hate Speech, Cultural Diversity, and the Foundational Paradigms of Free Expression." Universtiy of California Law Review 40 (October 1992): 103-197. Matsuda, Mari J. "Public Response to Racist Speech: Considering the Victim’s Story." In Words that Wound. Boulder, Colo: Westview Press, 1993, 1752. Meiklejohn, Alexander. Political Freedom; the Constitutional Powers of the People. New York: Harper, 1960. Meiklejohn, Alexander. Free Speech and its Relation to Self-Government. New York: Harper, 1948. McGonagle, Tarlach. "Wresting (Racial) Equality from Tolerance of Hate Speech." Dublin University Law Journal 23, no. 21 (2001): 21-54. Nguyen, Titi. "A Survey of Personal Jurisdiction Based on Internet Activity: A Return to Tradition." Berkeley Technology Law Journal 19 (2004): 519542. Nies, Eric John. "The Fiery Cross: Virginia v. Black: History and the First Amendment." South Dakota Law Review 50 (2005): 182-217. Oberdorfer Nyberg, Amy. "Is all Speech Local? Balancing Conflicting Free Speech Principles on the Internet." Georgetown Law Journal 92 (March 2004): 663-688. Polelle, Michael J. "Racial and Ethnic Group Defamation: A Speech-Friendly Proposal." Boston College Third World Law Journal 23 (Spring 2003): 213-273.


Przybylski, Paul. "A Common Tool for Individual Solutions: Why Countries Should Establish an International Organization to Regulate Internet Content." Vanderbilt Journal of Entertainment and Technology Law 9 (Spring 2007): 928-956. Reidenberg, Joel R. "The Yahoo Case and the International Democratization of the Internet." Fordham Law & Economics Research Paper no. 11 (April 2001). Rosenfield, Michael. "Hate Speech in Constitutional Jurisprudence: A Comparative Analysis." Cardozo Law Review 24 (April 2003): 15231567. Ruedy, C. Matthew. "Repercussions of a MySpace Teen Suicide: Should AntiCyberbullying Laws Be Created?" North Carolina Journal of Law & Technology 9 (Spring 2008): 323-346. Sadurski, Wojciech. Freedom of Speech and its Limits. Dordrecht; Boston: Kluwer Academic Publishers, 1999. Sayle, Amber Jane. "Net Nation and the Digital Revolution: Regulation of Offensive Material." Wisconsin International Law Journal 18 (2000): 257285. Schlosberg, Jason. "Judgment on ‘Nuremberg’: An Analysis of Free Speech and Anti-Abortion Threats made on the Internet." Boston University Journal of Science and Technology Law 7 (Winter 2001): 52-79. Schumacher, Pascal H. "Fighting Illegal Internet Content -- may Access Providers be Required to Ban Foreign Websites? A Recent German Approach." International Journal of Communications Law and Policy 8 (Winter, 2003/2004): 3. (no end page listed in Lexis-Nexis document) Schwartz, Deborah. "A First Amendment Justification for Regulating Racist Speech on Campus." Case Western Reserve Law Review 40 (1990): 733779. Shiffrin, Steven. Dissent, Injustice, and the Meanings of America. Princeton, NJ: Princeton University Press, 1999. Silversten, Matthew. "What's Next for Wayne Dick? The Next Phase of the Debate Over College Hate Speech Codes "Ohio State Law Journal 61 (2000): 1247-1299. Smith, Catherine E. "Intentional Infliction of Emotional Distress, an Old Arrow Targets the New Head of the Hate Hydra." Denver University Law Review 80 (2002): 1-61. Solum, Lawrence B., and Minn Chung. "The Layers Principle: Internet Architecture and the Law." Notre Dame Law Review 79 (April 2004): 815-948. Spanogle, John. "The Enforcement of Foreign Judgments in the U.S. - A Matter of State Law in Federal Courts." United States-Mexico Law Journal 13 (Spring 2005): 85-95. Stein, Allan R. "Current Debates in the Conflict of Laws: Choice of Law and Jurisdiction on the Internet: Parochialism and Pluralism in Cyberspace


Regulation." University of Pennsylvania Law Review 153 (June 2005): 2003-2016. Strum, Philippa. When the Nazis Came to Skokie: Freedom for Speech we Hate. Lawrence, Kan.: University Press of Kansas, 1999. Sutton, Michael F. "Legislating the Tower of Babel: International Restrictions on Internet Content and the Marketplace of Ideas." Federal Communications Law Journal 56 (March 2004): 417-438. Timofeeva, Yulia A. "Hate Speech Online: Restricted Or Protected? Comparison of Regulations in the United States and Germany." Journal of Transnational Law & Policy 12 (Spring 2003): 253-285. Topper, Prana A. "The Threatening Internet: Planned Parenthood v. ACLA and a Context-Based Approach to Internet Threats." Columbia Human Rights Law Review 33 (Fall 2001): 191-240. Tsesis, Alexander. "Hate in Cyberspace: Regulating Hate Speech on the Internet." San Diego Law Review 38 (Summer 2001): 817-874. Tsesis, Alexander. Destructive Messages : How Hate Speech Paves the Way for Harmful Social Movements. New York: New York University Press, 2002. Turkle, Sherry. Life on the Screen: Identity in the Age of the Internet. New York: Simon & Schuster, 1997. Turner, Ronald. "Hate Speech and the First Amendment: The Supreme Court's R.A.V. Decision." Tennessee Law Review 61 (Fall 1993): 197-236. Van Blarcum, Christopher D. "Internet Hate Speech: The European Framework and the Emerging American Haven." Washington & Lee Law Review 62 (Spring 2005): 781-830. Van Houweling Shaffer, Molly. "Cyberage Conflicts of Law: Enforcement of Foreign Judgments, the First Amendment, and Internet Speech: Notes for the Next Yahoo! v. LICRA." Michigan Journal of International Law 24 (Spring 2003): 697-717. Vance, Susannah C. "The Permissibility of Incitement to Religious Hatred Offenses Under European Convention Principles " Transnational Law & Contemporary Problems 14 (Spring 2004): 201-251. Walker, Samuel. Hate Speech: The History of an American Controversy. Lincoln: University of Nebraska Press, 1994. Weinstein, James. Hate Speech, Pornography, and the Radical Attack on Free Speech Doctrine. Boulder, Colo.: Westview Press, 1999. Weinstock, Netanel Neil. "Cyberspace Self-Governance: A Skeptical View from Liberal Democratic Theory." California Law Review 88 (March 2000): 395-498. Weintraub-Reiter, Rachel. "Hate Speech over the Internet, a Traditional Constitutional Analysis or a New Cyber-Constitution?" The Boston Public Interest Law Journal 8 (Fall 1998): 145-173. White, Edward G. "The First Amendment Comes of Age, the Emergence of Free Speech in Twentieth-Century" Michigan Law Review 95 (November 1996): 299-392.


Whitman, James Q. "Enforcing Civility and Respect: Three Societies." Yale Law Journal 109 (April 2000): 1279-1398. Yen, Alfred C. "Western Frontier Or Feudal Society?: Metaphors and Perceptions of Cyberspace." Berkeley Technology Law Journal 17 (Fall 2002): 1207-1263. Yokoyama, Dennis T. "You can't always use the Zippo Code: The Fallacy of a Uniform Theory of Internet Personal Jurisdiction." DePaul Law Review 54 (Summer 2005): 1147-1196. Zingo, Martha T. Sex/Gender Outsiders, Hate Speech, and Freedom of Expression: Can they Say that about Me? Westport, Conn.: Praeger, 1998. Zittrain, Jonathan. "Internet Points of Control." Boston College Law Review 44 (March 2003): 653-688.

Online news sources and newsletters. (Links active as of 05/15/2008, unless noted otherwise.)


Crouzillacq, Philippe. "Les FAI Mis En Cause Pour l'Accès à Un Site Révisionniste." 01.net (May 31, 2005). Available from http://www.01net.com/article/279716.html Dixit, Jay. "A Banner Day for Neo-Nazis." Salon (2001). Available from http://www.jaydixit.com/writing/archives/banner%20day/Salon_com%20Technology%20%20A%20banner%20day%20for%20neo-Nazis.htm Dumout, Estelle. "Stéphane Marcovitch, AFA: 'Une Mesure De Filtrage est Simplissime à Contourner'." ZDNet.fr (June 15, 2005). Available from http://www.zdnet.fr/actualites/internet/0,39020774,39233051,00.htm Dumout, Estelle. "Affaire AAARGH: Les Hébergeurs Américains Sommés De Bloquer le Site." ZDNet.fr (April 22, 2005). Available from http://www.zdnet.fr/actualites/internet/0,39020774,39219410,00.htm Docherty, Alan. "Now You See, Now You Don't." Internet Freedom (February 20, 2000). Available from http://www.netfreedom.org/news.asp?item=109 Ermert, Monika. "Selbstregulierung Der Suchmaschinenanbieter." Heise Online (February 25, 2005). Available from http://www.heise.de/newsticker/meldung/56770 Kaplan, Carl S. "Experts See Online Speech Case as Bellwether." The New York Times on the Web (January 5, 2001). Available from http://www.nytimes.com/2001/01/05/technology/05CYBERLAW.html?ex=1123732800&en=ccec82d20ca2bfa7&ei=5070 (login required) Lane, Terry. "Censoring the Adelaide Institute's Web Site is Futile." Online Opinion (November 30, 2000). Available from http://www.onlineopinion.com.au/view.asp?article=1126 Lyman, Jay. "German Court Rules Yahoo! Not Liable for Nazi Auctions." Newsfactor Technology News (March 28, 2000). Available from http://www.newsfactor.com/perl/story/8500.html

Saslow, Eli. "Hate Groups' New Target." The Washington Post (June 22, 2008), p. A6. Available from http://www.washingtonpost.com/wpdyn/content/article
Sturgeon, Will. "Google Hands BMW its 'Death Sentence'." Silicon.com (February 6, 2006). Available from http://networks.silicon.com/webwatch/0,39024667,39156216,00.htm
Terryn, Tom. "iPod Back on French Shelves, Apple Claims 20% of European Digital Music Market, & More Dutch Awards." The Mac Observer (October 2, 2002). Available from http://www.themacobserver.com/article/2002/10/02.5.shtml
Terryn, Tom. "iPod Pulled from French Shelves due to Sound Output." The Mac Observer (October 1, 2002). Available from http://www.themacobserver.com/article/2002/10/01.4.shtml
Thompson, Bill. "Google Censoring Web Content." BBC News (October 25, 2002). Available from http://news.bbc.co.uk/1/hi/technology/2360351.stm
Weisman, Robyn. "Germany Bans Foreign Web Site for Nazi Content." Newsfactor.com (Dec. 14, 2000). Available from http://www.newsfactor.com/story.xhtml?story_id=6063
"Affaire AAARGH : pas de révision de la solution." Le Forum des Droits sur l'Internet (November 28, 2006). Available from http://www.foruminternet.org/specialistes/veillejuridique/actualites/affaire-aaargh-pas-de-revision-de-la-solution.html
"France Obtained ISPs Support in Blocking Illegal Sites." Digital Civil Rights in Europe: EDRI-Gram Newsletter 6, no. 12 (June 18, 2008). Available from http://www.edri.org/edrigram/number6.12/isp-france-block-sites
"Finnish ISPs must Voluntarily Block Access." Digital Civil Rights in Europe: EDRI-Gram Newsletter 3, no. 18 (September 8, 2005). Available from http://www.edri.org/edrigram/number3.18/censorshipFinland
"French Court Issues Blocking Order to 10 ISPs." Digital Civil Rights in Europe: EDRI-Gram Newsletter 3, no. 12 (June 15, 2005). Available from http://www.edri.org/edrigram/number3.12/blocking
"French Draft Law Obliges Providers to Monitor Content." Digital Civil Rights in Europe: EDRI-Gram Newsletter 2, no. 1 (January 15, 2004). Available from http://www.edri.org/edrigram/number2.1/LEN
"Geolocation, Don't Fence the Web in." Wired News (July 13, 2004). Available from http://www.wired.com/techbiz/it/news/2004/07/64178
"German Parliament Debates Filtering or Blocking." Digital Civil Rights in Europe: EDRI-Gram Newsletter 2, no. 22 (November 17, 2004). Available from www.edri.org/edrigram/number2.22/filtering
"Germany: Meta Search Engines Responsible for Hyperlinks." Digital Civil Rights in Europe: EDRI-Gram Newsletter 3, no. 7 (April 6, 2005). Available from http://www.edri.org/edrigram/number3.7/hyperlinks
"Google Caught in Anti-Semitism Flap." C.net (April 7, 2004). Available from http://news.com.com/2100-1038_3-5186012.html

"Les FAI devront filtrer les sites pédopornographiques." ZDNet.fr (June 10, 2008). Available from http://www.zdnet.fr/actualites/internet/0,390207 74,39381639,00.htm "Press Release Issued by the Luxembourg Presidency of the European Union: No Agreement on the Framework Decision on Combating Racism and Xenophobia at the Justice and Home Affairs Council." (June 2, 2005). http://www.eu2005.lu/en/actualites/communiques/2005/06/02jai-rx/ “Press Release Issued by the Germany Presidency of the European Union EU: Common Criminal Provisions against Racism and Xenophobia.” (April 20, 2007). Avaialble from http://www.eu2007.de/en/News/Press_R eleases/April/0420BMJRassismus.html

Web site documents, online studies and reports. (Links active as of 05/15/2006, unless noted otherwise.)

Ahlert, Christian, Chris Marsden, and Chester Yung. "How Liberty Disappeared from Cyberspace: The Mystery Shopper Tests Internet Content Self-Regulation." (2004): 1-39. Available from http://pcmlp.socleg.ox.ac.uk/text/liberty.pdf
Batir, Kerem. "Regulating Hate Speech on the Internet, Unilateralism v. Multilateralism, Technique v. Law." Available from inettr.org.tr/inetconf8/sunum/77.pdf
Bischof, Dan. "Criminal Defamation Laws are 19th Century Holdover." The News Media and the Law 25, no. 2 (Spring 2001). Available from http://www.rcfp.org/news/mag/25-2/lib-crimhist.html
Caruso, David B. "Report: Hate Groups favor U.S. Internet Servers to Spread Bile." Findlaw (May 4, 2006). Available from http://news.lp.findlaw.com/ap/o/51/05-05-2006/29a3000ffed2fa0f.html
Finkelstein, Seth. "Chester's Guide to Molesting Google (Version 1.5)." (Feb 28, 2003). Available from http://sethf.com/anticensorware/general/chester.php
Goldman, Dave. "HateWatch Says Goodbye." (January 16, 2001). Available from http://www.hiddenmysteries.org/conspiracy/reststory/hatewatch.html
Huff, Chuck et al. "Machado Case History." computingcases.org. Available from http://www.computingcases.org/case_materials/machado/case_history/case_history.html
Johnson, David R., and David G. Post. "The New 'Civic Culture' of the Internet." Cyberspace Law Institute (February 1998). Available from http://www.cli.org/paper4.htm
Köster, Oliver, and Uwe Jürgens. "Liability for Links in Germany, Liability of Information Location Tools Under the German Law after the Implementation of the European Directive on E-Commerce." Working Papers of the Hans Bredow Institute no. 14 (July 2003). Available from www.hans-bredow-institut.de/publikationen/apapiere/14liability.PDF

Levin, Brian. "Because of the Constitution's First Amendment, the United States Now Hosts Hundreds of European Language Hate Sites." Southern Poverty Law Center (Winter 2003). Available from http://www.splcenter.org/intel/intelreport/article.jsp?aid=155
Muir, James A., and P.C. Van Oorschot. "Internet Geolocation and Evasion." School of Computer Science, Carleton University, Ottawa (2006). Available from cs.smu.ca/~jamuir/papers/TR-06-05.pdf
Nas, Sjoera. "The Multatuli Project ISP Notice & Take Down." Bits of Freedom (October 1, 2004): 1-14. Available from http://www.bof.nl/docs/researchpaperSANE.pdf
Ott, Stephan. "Hyperlinks, Search Engines - First Report on the Application of the E-Commerce-Directive." Links and Law. Available from http://www.linksandlaw.com/news-update15-directive.htm
Potok, Mark. "The Year in Hate." Southern Poverty Law Center: Intelligence Report, no. 117 (Spring 2005). Available from http://www.splcenter.org/intel/intelreport/article.jsp?aid=529
Troper, Michel. "Droit Et Négationnisme, Extraits d'un Article De Michel Troper." (December 2, 2002). Available from http://www.phdn.org/negation/gayssot/troper.html
Urban, Jennifer M., and Laura Quilter. "Efficient Process Or 'Chilling Effects'? Takedown Notices Under Section 512 of the Digital Millennium Copyright Act. Summary Report." University of Southern California Law Portal (November 2005). Available from http://mylaw.usc.edu/documents/512Rep/
Wolf, Christopher. "Racists, Bigots and the Law on the Internet." Anti-Defamation League (July 2000). Available from http://www.adl.org/internet/internet_law1.asp
Zittrain, Jonathan. "Localized Google Search Exclusions." Berkman Center for Internet and Society (October 26, 2002). Available from http://cyber.law.harvard.edu/filtering/google/#intro
"About the Council of Europe." Council of Europe (January 2005). Available from http://www.coe.int/T/e/Com/about_coe/
"About the IWF." Internet Watch Foundation. Available from http://www.iwf.org.uk/public/page.103.htm
"Ashcroft v. ACLU, the Legal Challenge to the Child Online Protection Act." Electronic Privacy Information Center (June 29, 2004). Available from http://www.epic.org/free_speech/copa/
"Combating Extremism in Cyberspace, the Legal Issues Affecting Internet Hate Speech." Anti-Defamation League (2000). Available from www.adl.org/Civil_Rights/newcyber.pdf
"Digital Hate and Terrorism 2007." Simon Wiesenthal Center (2007). Available from http://www.wiesenthal.com/site/apps/s/content.asp?c=fwLYKnN8LzH&b=253162&ct=387686

"eBay Rules and Policies: Mature Audiences." eBay. Available from http://pages.ebay.com/help/policies/mature-audiences.html
"European Court of Human Rights - Historical Background." Council of Europe, European Court of Human Rights. Available from http://www.echr.coe.int/ECHR/EN/Header/The+Court/The+Court/History+of+the+Court/
"Google Help Center: How do I Stop google.com from Redirecting to another Google Domain?" Google Help Center (2006). Available from http://www.google.com/support/bin/answer.py?answer=873
"Google: An Explanation of our Search Results." Google (2004). Available from http://www.google.com/explanation.html
"Inhope: Facts by Country: United States - Overview." Inhope - The Association of Internet Hotline Providers (2004). Available from http://www.inhope.org/content/details.php?lang=en&countryid=18
"2007 Annual Report." Internet Watch Foundation (2008). Available from http://www.iwf.org.uk/documents/20080417_iwf_annual_report_2007_(web).pdf
"LEN - Le Conseil Constitutionnel Fait Écho Aux Souhaits Du Gouvernement." Iris (June 15, 2004). Available from http://www.iris.sgdg.org/infodebat/comm-decisionCC0604.html
"Racially Inflammatory Material on the Internet." London, United Kingdom: United Kingdom Home Office, 2002. Available from www.iwf.org.uk/documents/20041020 (link expired)
"Safer Internet Programme: What is Safer Internet." European Union (February 28, 2006). Available from http://europa.eu.int/information_society/activities/sip/index_en.htm
"Simon Wiesenthal Center's Digital Hate and Terrorism 2005 Report Reveals 25% Spike in Hate Sites." Simon Wiesenthal Center (March 23, 2005). Available from http://www.wiesenthal.com/site/apps/s/content.asp?c=fwLYKnN8LzH&b=253162&ct=546389
"The 15 Enemies of the Internet and Other Countries to Watch." Reporters Without Borders (November 17, 2005). Available from http://www.rsf.org/article.php3?id_article=15613
"The Hotline and the Law." Internet Watch Foundation (Created: September 10, 2004. Updated: April 12, 2006). Available from http://www.iwf.org.uk/public/page.31.htm
"Violence & Harassment at U.S. Abortion Clinics." religioustolerance.org (Nov 9, 2004). Available from http://www.religioustolerance.org/abo_viol.htm
"Yahoo! Auctions Guidelines." Available from http://user.auctions.yahoo.com/html/guidelines.html

Newspapers, magazines and legal news sources

"Dow Jones Settles Australian Online Libel Lawsuit." Entertainment Law Reporter 26, no. 6 (November 2004).
"Er Bestaan Ook Legale Muziekzoekmachines." De Standaard, May 13, 2004.
Mathieson, SA. "Back Door to the Black List: BT's System to Block Access to Child Pornography could Actually be Manipulated to Search for Illegal Material, According to New Research." The Guardian, May 26, 2005, p. 19.
Mauro, Tony. "It's a Mad, Mad, Mad, Mad Court. Justices Upended Expectations in 2002-2003 Term." Texas Lawyer, July 7, 2003, p. 18.
Muyl, Catherine. "French Yahoo Judge Issues a Decision Concerning a Racist US-Based Portal." Intellectual Property Today, March 2002.
van Hoek, Aukje. "Australia's High Court Upholds Local (Private Law) Jurisdiction in Case of Defamation Over the Internet." International Enforcement Law Reporter 19, no. 5 (May 2003).
Working, Russell. "Illegal Abroad, Hate Web Sites Thrive Here." Chicago Tribune, November 13, 2007, p. 1.
Yardley, Jim, and David Rohde. "Abortion Doctor in Buffalo Slain; Sniper Attack Fits Violent Pattern." The New York Times, October 25, 1998, p. 1.

Conference proceedings

"Statement by Mr. Gérard Kerforn, Introducer at the Fourth Session of the Conference on Racism, Xenophobia and Discrimination." Conference on Racism, Xenophobia and Discrimination. Vienna, Austria, September 4-5, 2003. Available from http://www.osce.org/documents/sg/2003/09/612_en.pdf
Bechtold, Steven. "In Search of Search Law." Regulating Search? A Symposium on Search Engines, Law, and Public Policy. Yale Law School, Dec. 3, 2005. Available from http://islandia.law.yale.edu/isp/regulatingsearch.html
Clayton, Richard. "Failures in a Hybrid Content Blocking System." 5th Workshop on Privacy Enhancing Technologies. Cavtat, Croatia, May 30-June 1, 2005. Available from www.cl.cam.ac.uk/~rnc1/cleanfeed.pdf
Marcus, Brian. "Public and Private Partnership in the Fight Against Racism, Xenophobia and Anti-Semitism on the Internet--Best Practices." OSCE Meeting on the Relationship Between Racist, Xenophobic and Anti-Semitic Propaganda on the Internet and Hate Crimes. Paris, France, June 16, 2004. Available from http://www.adl.org/osce/session3_best_practices.pdf
Marcus, Brian. "Search Engines and Individual Rights." A Symposium on Search Engines, Law, and Public Policy. Yale Law School, Dec. 3, 2005. Available from http://islandia.law.yale.edu/isp/regulatingsearch.html

Frydman, Benoît, and Isabelle Rorive. "Fighting Nazi and Anti-Semitic Material on the Internet: The Yahoo! Case and its Global Implications." Hate and Terrorist Speech on the Internet: The Global Implications of the Yahoo! Ruling in France. Cardozo School of Law, February 11, 2002. Available from http://www.isys.ucl.ac.be/etudes/cours/linf2202/Frydman=2002.pdf

Index

AAARGH case, 223–25
Abrams v. United States, 9–11
Additional Protocol to the Convention on Cybercrime, 149–53
ALPHA HQ case, 74–75
Bachchan v. India Abroad Publications, 198
Barlow, John Perry, 156
Beauharnais v. Illinois, 26
Calder v. Jones, 178
Chaplinsky v. New Hampshire, 19
Child Online Protection Act, 71
clever inciter, 91–93
code, 161–64
Cohen v. California, 21
Collin v. Smith, 32–35
Communications Decency Act, 70
CompuServe case, 142
cross burning cases
    R.A.V. v. St. Paul, 36–40
    Virginia v. Black, 41–42
Cubby, Inc. v. CompuServe, Inc., 214
cyberlibertarianism, 156–60
Delgado, Richard, 60–62
Dennis v. United States, 15
Digital Millennium Copyright Act, 217–18
Doe v. University of Michigan, 29
Dow Jones & Co. v. Gutnick, 196
E-Commerce Directive, 144–46
end-to-end principle, 164–66
EU Framework Decision on Combating Racism and Xenophobia, 108
European Convention on Human Rights, 109–110
Faurisson v. France, 105
fighting words, 18–23
First Amendment - theories of
    autonomy theory, 52–56
    dissent theory, 56–58
    marketplace theory, 49
    self governance theory, 49–52
    tolerance theory, 58–60
France - hate speech law
    free speech tradition, 118–20
    legal framework, 120–21
Garrison v. Louisiana, 26
geolocation, 209–13
Germany - hate speech law, 118
    Basic Law, 112–13
    false speech, 114–15
    free speech as positive right, 113–14
    individual protection, 115–16
    militant democracy, 116
Gitlow v. New York, 11, 12
Glimmerveen and Hagenbeek v. The Netherlands, 111
Goldman, Dave, 68
Gooding v. Wilson, 22
Griffis v. Luban, 179
group libel, 23–32
hotlines, 146–49
incitement
    Brandenburg v. Ohio, 16–18
    Cold War cases, 16
    Red Scare cases, 11–14
    World War I cases, 7–11
Inset Systems, Inc. v. Instruction Set, 175
International Shoe v. Washington, 174
ISP blocking
    legal issues, 223–27
    technological issues, 228–34
ISP liability, 215–17, 218–22
Jersild v. Denmark, 110
Keeton v. Hustler Magazine, 179
Lessig, Lawrence, 155, 160, 161, 162, 163, 164, 246, 253
Matsuda, Mari, 62–65
NAACP v. Claiborne Hardware Co., 80
New York Times v. Sullivan, 27
Planned Parenthood of Columbia/Willamette v. American Coalition of Life Activists, 79–87
realist sovereignty concept, 170–71
Reno v. ACLU, 70–72
Renton v. Playtime Theatres, 70
representational sovereignty concept, 171–74
Rice v. Paladin Enterprises, 88
Schenck v. U.S., 8
search engine filtering, 204–9
Shelley v. Kraemer, 198
Telnikoff v. Matusevitch, 199
Terminiello v. Chicago, 20
Toben case, 142
United Kingdom - hate speech law, 121–25
United Nations Universal Declaration of Human Rights, 104
United States v. Jake Baker and Arthur Gonda, 75–79
United States v. Kammersell, 72
United States v. Machado, 73–74
UWM Post, Inc. v. Board of Regents of University of Wisconsin System, 30
Whitney v. California, 14
Winfield Collection, Ltd. v. McCauley, 181
Yahoo! case
    criticism of, 134–37
    French proceedings, 127–30
    proceedings in America, 131–34
    support of, 137–39
Yates v. United States, 15
Young v. New Haven Advocate, 180
Zippo test, 176–78