Data Security and Privacy in Massachusetts [2 ed.] 1683450639

English Pages [629] Year 2018

Table of contents:
Preliminary Pages
ACKNOWLEDGMENTS
ABOUT THE EDITOR
ABOUT THE AUTHORS
TABLE OF CONTENTS
TABLE OF EXHIBITS
Chapter 1
Conceptions of Privacy and Security in a Digital World
§ 1.1 INTRODUCTION
§ 1.2 PRIVACY AND SECURITY: PROMISES AND DANGERS OF “BIG DATA”
§ 1.2.1 Distinguishing Privacy and Security
(a) Privacy
(b) Security
§ 1.2.2 The Challenges of Cyberspace
(a) Security Challenges
(b) Privacy Challenges
§ 1.3 SOURCES AND HISTORY OF THE LAW OF INFORMATION PRIVACY AND SECURITY
§ 1.3.1 Foundations
§ 1.3.2 Civil Rights and Consumer Protection
§ 1.3.3 Credit Expansion
§ 1.3.4 Computerization and Security
§ 1.3.5 Electronic Health Records
§ 1.3.6 Financial Institution Reform
§ 1.3.7 Addressing Web Abuses
§ 1.3.8 Identity Theft
§ 1.3.9 Summary of the Framework
Chapter 2
The Electronic Communications Privacy Act
§ 2.1 INTRODUCTION
§ 2.2 THE ELECTRONIC COMMUNICATIONS PRIVACY ACT
§ 2.3 THE WIRETAP ACT
§ 2.3.1 Interception of Communications
§ 2.3.2 Exceptions
§ 2.3.3 Communications in Transit Versus Stored Communications
§ 2.3.4 Video Surveillance
§ 2.4 MASSACHUSETTS ELECTRONIC SURVEILLANCE STATUTE
§ 2.4.1 Oral Communications—No “Expectation of Privacy” Requirement
§ 2.4.2 Consent of All Parties
§ 2.4.3 Video Surveillance
§ 2.5 THE STORED COMMUNICATIONS ACT
§ 2.5.1 Why the SCA Exists
§ 2.5.2 Prohibition on Unauthorized Access
§ 2.5.3 Entities Regulated by the SCA
§ 2.5.4 Providers “to the Public” Versus Nonpublic Providers
§ 2.5.5 Content Information Versus Noncontent Information
§ 2.5.6 The Privacy Protections of the SCA
(a) Compelled Disclosure Rules
(b) Voluntary Disclosure Rules
(c) Voluntary Disclosure Versus Compelled Disclosure
§ 2.5.7 Interplay with Massachusetts Constitution
§ 2.5.8 Interplay with Fourth Amendment
§ 2.5.9 Consent and Discovery Under the SCA
§ 2.6 CONCLUSION
EXHIBIT 2A—Wiretap Report 2016
Chapter 3
Computer Fraud and Abuse Act
§ 3.1 INTRODUCTION
§ 3.2 KEY TERMINOLOGY
§ 3.2.1 Protected Computer
§ 3.2.2 Computer
§ 3.2.3 Access
§ 3.2.4 Unauthorized Access and Exceeding Authorized Access
§ 3.2.5 Damage or Loss
§ 3.3 CRIMINAL PROHIBITIONS
§ 3.3.1 18 U.S.C. § 1030(a)(1)—Espionage
§ 3.3.2 18 U.S.C. § 1030(a)(2)—Confidentiality
§ 3.3.3 18 U.S.C. § 1030(a)(3)—Trespass to Government Computers
§ 3.3.4 18 U.S.C. § 1030(a)(4)—Computer Fraud
§ 3.3.5 18 U.S.C. § 1030(a)(5)—Malware, Viruses, and Damage to Data
§ 3.3.6 18 U.S.C. § 1030(a)(6)—Password Trafficking
§ 3.3.7 18 U.S.C. § 1030(a)(7)—Extortion
§ 3.4 PRIVATE RIGHT OF ACTION
§ 3.5 CRITICISM AND CALLS FOR REFORM
§ 3.5.1 Internet Advocacy and Hacktivism
§ 3.5.2 Cyber Bullying
§ 3.5.3 Use of Employer Resources for Personal Use
§ 3.5.4 Proposed Reforms
§ 3.6 LEGISLATIVE HISTORY
§ 3.6.1 H.R. 5616
§ 3.6.2 The 1984 Act: The Counterfeit Access Device and Computer Fraud and Abuse Act
§ 3.6.3 The 1986 Amendments
§ 3.6.4 The 1994 Amendments: The Computer Abuse Amendments Act of 1994
§ 3.6.5 The 1996 Amendments: The National Information Infrastructure Protection Act of 1996
§ 3.6.6 The 2001 Amendments: The USA PATRIOT Act
§ 3.6.7 The 2002 Amendments: The Cyber Security Enhancement Act of 2002
§ 3.6.8 The 2008 Amendments: The Identity Theft Enforcement and Restitution Act of 2008
§ 3.6.9 Scope and Application of the CFAA Today
EXHIBIT 3A—18 U.S.C. § 1030, Fraud and Related Activity in Connection with Computers
Chapter 4
Fair Credit Reporting Act; Fair and Accurate Credit Transactions Act
§ 4.1 OVERVIEW: FCRA AND FACTA
§ 4.1.1 History of the FCRA and FACTA
§ 4.1.2 Addressing Incorrect Information on Consumer Reports and CRA Liability
§ 4.2 WHAT THE FCRA AND FACTA PROTECT AND RENDER PRIVATE
§ 4.2.1 Truncation of Credit and Debit Card Numbers and Expiration Dates on Printed Receipts
§ 4.2.2 Consumers Can Request a Truncation of the First Five Digits of Their Social Security Numbers
§ 4.2.3 Restrictions on Medical Information Appearing in Consumer Reports
§ 4.2.4 Restrictions on Employers’ Use of a Consumer Report
§ 4.2.5 Red Flag Guidelines
§ 4.2.6 Restrictions on Issuance of an Additional or New Card when Change of Address Notification Filed
§ 4.2.7 Address Discrepancies
§ 4.2.8 Reports Must Be Used for a Permissible Purpose
§ 4.2.9 CRAs Must Use Reasonable Procedures to Assure Maximum Possible Accuracy
§ 4.3 WHAT HAPPENS IF THE PRIVATE DATA IS USED AND HOW TO DEAL WITH CONSUMER REPORTING PROBLEMS
§ 4.3.1 What Identity Theft Is and Why It Is a Problem
§ 4.3.2 Information Available to Identity Theft Victims
§ 4.3.3 Fraud Alerts
(a) Initial Fraud Alert or One-Call Fraud Alert
(b) Extended Fraud Alert
(c) Active Duty Alert
(d) Effects of the Three Alerts on Users
(e) Liability of CRAs and Users and Preemption
(f) Security Freezes
§ 4.3.4 Blocking Fraudulent Information
(a) CRAs
(b) Furnishers
Chapter 5
The Gramm-Leach-Bliley Act (GLBA)
§ 5.1 OVERVIEW OF THE GRAMM-LEACH-BLILEY ACT
§ 5.2 RULE MAKING
§ 5.2.1 The Safeguards Rule
§ 5.2.2 The Financial Privacy Rule
§ 5.2.3 The Pretexting Rule
§ 5.3 SCOPE OF THE GLBA
§ 5.3.1 Entities Subject to the GLBA
§ 5.3.2 Information Protected Under the GLBA—Nonpublic Personal Information (NPPI)
§ 5.3.3 Individuals Protected Under the GLBA—Consumers and Customers
§ 5.4 NOTICE, DISCLOSURE, AND SAFEGUARDING REQUIREMENTS UNDER THE GLBA
§ 5.4.1 Notice, Disclosure, and Safeguarding Requirements
§ 5.4.2 Consent Requirements
§ 5.4.3 Limits of Reuse of NPPI
§ 5.4.4 Restrictions on Disclosing NPPI to Third Parties
§ 5.4.5 Data Security Requirements
§ 5.4.6 Data Breach Notification Requirements
(a) FFIEC Guidance on Response Programs for Unauthorized Access to Customer Information and Customer Notice
Components of a Response Program
When Customer Notice Should Be Provided
Customer Notice
Delivery of Customer Notice
(b) FTC Guidance on Breach Notification
§ 5.5 PRETEXTING
§ 5.5.1 Overview
§ 5.5.2 Prohibition on Obtaining or Soliciting Customer Information Using False Pretenses
§ 5.5.3 Exceptions to Pretexting Rule
§ 5.5.4 Enforcement of Pretexting Rule
§ 5.5.5 Criminal Penalty for Violation of Pretexting Rule
§ 5.6 GLBA ENFORCEMENT
§ 5.6.1 Enforcement
§ 5.6.2 Sanctions and Other Liability
(a) Judicial and CFPB Enforcement
(b) FTC Enforcement
EXHIBIT 5A—Regulatory Enforcement Authority for the Privacy, Safeguards, and Pretexting Rules
EXHIBIT 5B—Model Privacy Notice Form: Opt Out
EXHIBIT 5C—Model Privacy Notice Form: No Opt Out
Chapter 6
Information Security in Health Care: HIPAA and HITECH
§ 6.1 INTRODUCTION
§ 6.2 STATUTORY AUTHORITY FOR THE ORIGINAL PRIVACY AND INFORMATION SECURITY RULES AND THE HITECH AMENDMENTS
§ 6.3 BRIEF OVERVIEW OF THE HIPAA PRIVACY RULE AND HOW IT RELATES TO INFORMATION SECURITY
§ 6.4 THE HIPAA INFORMATION SECURITY RULE
§ 6.4.1 Key Concepts
(a) Protected Health Information
(b) Covered Entities and Business Associates
(c) The Security Rule Applies Only to Electronic PHI
(d) Scalable
(e) Technology Neutral
(f) Relationship to Industry Standards
(g) Required and Addressable Specifications
§ 6.4.2 The Core Requirement
§ 6.4.3 The Centrality of Risk Assessment
§ 6.4.4 Administrative Safeguards
§ 6.4.5 Physical Safeguards
§ 6.4.6 Technical Safeguards
§ 6.4.7 Information Security Guidance
§ 6.5 ENFORCEMENT AND PENALTIES: THE IMPACT OF HITECH
§ 6.5.1 Overview of the Enforcement Process
§ 6.5.2 Penalties Under the Original HIPAA Statute
§ 6.5.3 Penalties After HITECH
§ 6.5.4 Vicarious Liability for the Acts of Business Associates
§ 6.5.5 Enforcement of the HIPAA Regulations by the Office of Civil Rights
§ 6.5.6 Open Issues and Questions Under HITECH and the Enforcement Rule
§ 6.6 BREACH NOTIFICATION AND INFORMATION SECURITY
§ 6.6.1 Brief Overview of the Breach Notification Rule
§ 6.6.2 Impact on Information Security and Enforcement
(a) Are Breaches Violations of the Security Rule?
(b) Publicity, Enforcement, and Private Litigation
EXHIBIT 6A—HIPAA Security Standards Matrix
Chapter 7
Congressional Response to the Internet
§ 7.1 INTRODUCTION
§ 7.2 COMMUNICATIONS DECENCY ACT
§ 7.2.1 Anti-indecency and Antiobscenity Provisions
§ 7.2.2 Section 230—Online Service Providers Sheltered from Liability
§ 7.2.3 Defamation (as Opposed to Intellectual Property) Liability Shield
§ 7.2.4 Who Qualifies for Protection Under the Statute?
§ 7.2.5 “Source of the Content at Issue”—Room for Argument
§ 7.2.6 Other Illegal Content
§ 7.3 DIGITAL MILLENNIUM COPYRIGHT ACT
§ 7.3.1 DMCA Safe Harbor
§ 7.3.2 Who Qualifies?
§ 7.3.3 Designation of Agent
§ 7.3.4 Requirements for Valid Takedown Requests
§ 7.3.5 Responding to Takedown Requests
§ 7.3.6 Clarification About Secondary Liability Safe Harbor
§ 7.3.7 Anticopyright Management Circumvention
§ 7.3.8 DMCA Anticircumvention Exemptions
§ 7.3.9 Notable Anticircumvention Cases
§ 7.4 CHILDREN’S ONLINE PRIVACY PROTECTION ACT
§ 7.4.1 Privacy Policy
§ 7.4.2 Not Just Notice, but Choice and Access
§ 7.4.3 Initial Notice to Parent Required
§ 7.4.4 What Are the Consequences for Failure to Comply?
§ 7.4.5 Security and Data Retention
§ 7.4.6 Student Privacy
§ 7.5 CAN-SPAM ACT
§ 7.5.1 E-mails Covered by CAN-SPAM Act
§ 7.5.2 Compliance Requirements
§ 7.5.3 Consequences for Noncompliance
Chapter 8
The Federal Trade Commission
§ 8.1 THE FEDERAL TRADE COMMISSION
§ 8.2 OPERATION OF THE FTC (CONSUMER PROTECTION)
§ 8.3 FTC IMPLEMENTATION OF DATA SECURITY STATUTES
§ 8.3.1 Fair Credit Reporting Act (FCRA)
(a) Original Regulation of Dissemination of Credit Reports
(b) Fair and Accurate Credit Transactions Act
§ 8.3.2 Gramm-Leach-Bliley Act
(a) Subtitle A: Safeguarding of Nonpublic Personal Information
(b) Subtitle A: Disclosure of Nonpublic Personal Information
(c) Subtitle B: Pretexting
§ 8.3.3 Health Insurance Portability and Accountability Act (HIPAA)
§ 8.4 OTHER PRIVACY STATUTES ADMINISTERED OR ENFORCED BY THE FTC
§ 8.4.1 Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act, 15 U.S.C. §§ 7701–7713
§ 8.4.2 Telemarketing and Consumer Fraud and Abuse Prevention Act
§ 8.4.3 Do-Not-Call Legislation
§ 8.4.4 The Children’s Online Privacy Protection Act (COPPA)
§ 8.5 FTC ACTIONS AGAINST “UNFAIR OR DECEPTIVE ACTS OR PRACTICES”
EXHIBIT 8A—Red Flags Rule Guidelines
EXHIBIT 8B—GLBA Safeguards Rule Information Protection Program Elements
EXHIBIT 8C—FTC Health Information Breach Notification Rule
Chapter 9
Survey of State Data Security and Privacy Law
§ 9.1 STATE LAW FOUNDATIONS
§ 9.1.1 Federal and State Constitutions
§ 9.1.2 State Common Law
§ 9.2 STATE STATUTES
§ 9.2.1 State Data Security Breach Statutes
§ 9.2.2 State Personal Data Security Statutes
(a) Driver’s License Information
(b) Social Security Numbers
(c) Security and Disposal of Records
(d) Internet Collection of Personal Information
§ 9.2.3 State Laws Against Internet Intrusion
(a) State Antispam Legislation
(b) State Antispyware and Antiphishing Statutes
(c) Employer and School Coercion of Access to Social Media
(d) Cyber-Bullying
(e) “Revenge Porn”
§ 9.3 DATA SECURITY AND PRIVACY LITIGATION
Chapter 10
Massachusetts Data Security Law and Regulations*
§ 10.1 OVERVIEW
§ 10.2 THE BREACH NOTICE LAW
§ 10.2.1 Personal Information
§ 10.2.2 Covered Entities
§ 10.2.3 Triggers for Notice
§ 10.2.4 Timing of Notice
§ 10.2.5 Content of the Notice
(a) Licensors and Owners
(b) Maintainers/Storers
§ 10.2.6 Penalties and Enforcement
§ 10.2.7 Compliance with Federal Law
§ 10.2.8 Court Cases
§ 10.2.9 Practical Aspects of Responding to a Breach
(a) Secure and Preserve
(b) Investigate
(c) Prepare Required Notices
(d) Develop a Public Relations Approach
(e) Evaluate Relevant Contracts
(f) Consider Offering Credit Monitoring
(g) Conduct a Postincident Review
§ 10.3 THE DATA DESTRUCTION/DISPOSAL LAW
§ 10.3.1 Basic Requirements
§ 10.3.2 Penalties and Enforcement
§ 10.3.3 Practice Tips
§ 10.4 THE DATA SECURITY REGULATIONS
§ 10.4.1 Major Requirements
§ 10.4.2 The WISP
(a) Data Security Coordinator
(b) Risk Assessment and Improvement of Safeguards
(c) Employee Policies, Training, and Discipline
(d) Terminated Employees
(e) Third-Party Service Providers
(f) Overall Restrictions
§ 10.4.3 Computer System Security Requirements
(a) “Secure User Authentication Protocols”
(b) “Secure Access Control Measures”
(c) Encryption
(d) Monitoring
(e) Security Protections
§ 10.4.4 Practice Tips
(a) The Assessment
(b) Create a Data Security Binder
EXHIBIT 10A—Frequently Asked Questions Regarding 201 C.M.R. § 17.00
EXHIBIT 10B—A Small Business Guide: Formulating a Comprehensive Written Information Security Program
EXHIBIT 10C—Standards for the Protection of Personal Information of Residents of the Commonwealth—201 C.M.R. § 17.00
EXHIBIT 10D—OCABR 201 C.M.R. § 17.00 Compliance Checklist
EXHIBIT 10E—Personal Information: A Graphical Representation
EXHIBIT 10F—Action Plan for Complying with Massachusetts Data Security Regulations
EXHIBIT 10G—Twelve Practical Tips for Employers
EXHIBIT 10H—Helpful Websites
Chapter 11
Consumer/Retail Issues
§ 11.1 INTRODUCTION
§ 11.2 CONTRACTING CONSIDERATIONS
§ 11.3 DRAFTING TERMS
§ 11.3.1 Privacy Policy or Notice
§ 11.3.2 Acceptable Use Policy
§ 11.3.3 Intellectual Property Notices
§ 11.4 PRIVACY ISSUES—NOTICE CONTENTS
§ 11.4.1 Data Collected
§ 11.4.2 Use of Data (Including Sharing)
§ 11.4.3 User Access and Options
§ 11.5 CONCLUSION
Chapter 12
Network Service Providers*
§ 12.1 INTRODUCTION
§ 12.1.1 What Is a Network Service Provider?
§ 12.1.2 Representing a Business Consumer of Network Services
§ 12.1.3 Representing a Network Service Provider
§ 12.2 RELEVANT STATUTES AND REGULATIONS
§ 12.2.1 Federal Law—The Telecommunications Act of 1996 and the Communications Act of 1934
(a) Common Carriers Under Title II of the Communications Act
(b) The Telecommunications Act of 1996—Distinguishing Between and Among the Different Types of Service Providers
(c) The FCC and Regulation of the Internet
The FCC’s Comcast Order
The 2010 Open Internet Order
FCC’s Solution—Reclassification of Internet Service Providers as Common Carriers
(d) Provisions of the Telecommunications Act Related to Privacy and Security
§ 12.2.2 Federal Law—Title II of the Electronic Communications Privacy Act, the Stored Communications Act
§ 12.2.3 Federal Law—Communication Decency Act
§ 12.2.4 Massachusetts State Law and Regulations
§ 12.3 NEGOTIATING NETWORK SERVICE PROVIDER CONTRACTS
§ 12.3.1 Defining Privacy and Data Security Obligations in Service Provider Relationships
§ 12.3.2 Representing a Business Engaging a Service Provider—Setting the Framework for the Deal
(a) Conduct a Data Audit and Engage in Data Mapping
(b) Analyze and Understand Relevant Laws, Regulations, and Company Policies
(c) Conduct and Document a Technical Assessment and Due Diligence of Service Provider
EXHIBIT 12A—Service Provider Data Security Questionnaire
Chapter 13
Vendors and Business Associates
§ 13.1 INTRODUCTION
§ 13.2 HIPAA OMNIBUS RULE
§ 13.3 “RED FLAGS” RULE
§ 13.4 GRAMM-LEACH-BLILEY SAFEGUARDS RULE
§ 13.5 FEDERAL INFORMATION SECURITY MANAGEMENT ACT
§ 13.6 MASSACHUSETTS DATA SECURITY
§ 13.7 PAYMENT CARD INDUSTRY RULE
EXHIBIT 13A—HIPAA Privacy and Security Compliance Agreement
Chapter 14
Cyber Risk Insurance and Risk Management
§ 14.1 INTRODUCTION
§ 14.2 CYBER RISK INSURANCE FORMS
§ 14.3 FIRST-PARTY CYBER RISK COVERAGES
§ 14.3.1 Data Breach Response Costs
§ 14.3.2 Data Loss
§ 14.3.3 Network Service Interruption / Business Interruption
§ 14.3.4 Cyber Extortion and Ransom Coverage
§ 14.3.5 Reputational Damage
§ 14.3.6 Physical Loss or Damage to Property
§ 14.3.7 Crime
§ 14.4 THIRD-PARTY CYBER RISK COVERAGES
§ 14.4.1 Data Security Liability
§ 14.4.2 Privacy Liability
§ 14.4.3 Media Content Liability
§ 14.4.4 Products Liability
§ 14.5 FIRST-PARTY CYBER RISK INSURANCE CASE LAW
§ 14.5.1 American Guarantee & Liability Insurance Co. v. Ingram Micro, Inc.
§ 14.5.2 Lambrecht & Associates v. State Farm Lloyds
§ 14.5.3 Apache Corp. v. Great American Insurance Co.
§ 14.5.4 Principle Solutions Group, LLC v. Ironshore Indemnity, Inc.
§ 14.5.5 InComm Holdings Inc. v. Great American Insurance Co.
§ 14.5.6 Taylor & Lieberman v. Federal Insurance Co.
§ 14.5.7 State Bank of Bellingham v. BancInsure Inc.
§ 14.5.8 Pestmaster Services Inc. v. Travelers Casualty and Surety Co. of America
§ 14.5.9 Aqua Star (USA) Corp. v. Travelers Casualty and Surety Co. of America
§ 14.5.10 American Tooling Center, Inc. v. Travelers Casualty and Surety Co. of America
§ 14.5.11 Medidata Solutions, Inc. v. Federal Insurance Co.
§ 14.6 THIRD-PARTY CYBER RISK INSURANCE CASE LAW
§ 14.6.1 First Bank of Delaware, Inc. v. Fidelity & Deposit Co. of Maryland
§ 14.6.2 Retail Ventures, Inc. v. National Union Fire Insurance Co.
§ 14.6.3 State Auto Property & Casualty Insurance Co. v. Midwest Computers & More
§ 14.6.4 Eyeblaster, Inc. v. Federal Insurance Co.
§ 14.6.5 First Commonwealth Bank v. St. Paul Mercury Insurance Co.
§ 14.6.6 Zurich American Insurance Co. v. Sony Corp. of America
§ 14.6.7 Recall Total Information Management, Inc. v. Federal Insurance Co.
§ 14.6.8 Travelers Indemnity Co. of America v. Portal Healthcare Solutions, LLC
§ 14.6.9 Hartford Casualty Insurance Co. v. Corcino & Associates
§ 14.6.10 Travelers Indemnity Co. of Connecticut v. P.F. Chang’s China Bistro, Inc.
§ 14.6.11 Travelers Property Casualty Co. of America v. Federal Recovery Services, Inc.
§ 14.6.12 P.F. Chang’s China Bistro, Inc. v. Federal Insurance Co.
§ 14.6.13 Camp’s Grocery, Inc. v. State Farm Fire & Casualty Co.
§ 14.7 CYBER RISK MANAGEMENT
§ 14.7.1 Evaluation of the Types of Electronically Stored Information
§ 14.7.2 Evaluation of Current Network Infrastructure and Security
§ 14.7.3 Media Publication Risks
§ 14.7.4 Development of Data Security Protocols
§ 14.7.5 Employee Devices
§ 14.7.6 Creating a Data Breach Response and Notification Protocol
§ 14.7.7 Evaluation of Current Insurance Programs and Coverage Gaps
§ 14.7.8 Employee Training
§ 14.7.9 Regulatory Compliance
Chapter 15
Privacy and Security in M&A Transactions
§ 15.1 INTRODUCTION
§ 15.2 DUE DILIGENCE
§ 15.3 TRANSFER OF PERSONAL DATA IN BANKRUPTCY
§ 15.4 RISKS OF DISCLOSING PERSONAL DATA DURING DUE DILIGENCE
§ 15.5 REPRESENTATIONS AND WARRANTIES
§ 15.6 INDEMNIFICATION AND INSURANCE
§ 15.7 DEAL STRUCTURE
§ 15.8 CONCLUSION
EXHIBIT 15A—Sample Initial Due Diligence Request List for Privacy and Security M&A
EXHIBIT 15B—Sample Agreement Provisions for Privacy and Security
Chapter 16
Current Developments
§ 16.1 INTRODUCTION
§ 16.2 CONGRESSIONAL ACTIVITY
§ 16.3 THE FTC FRAMEWORK
§ 16.4 THE STATES
§ 16.5 IDENTITY MANAGEMENT
§ 16.6 CONCLUSION
EXHIBIT 16A—House of Representatives Resolution 835
Table of Cases
Table of Statutes, Rules & References
FEDERAL
MASSACHUSETTS
OTHER STATES
ADDITIONAL REFERENCES AND RESOURCES
Index


Data Security and Privacy in Massachusetts

2nd Edition 2018

Business & Commercial Law PRINT & EBOOKS

Stephen Y. Chow et al.

MCLE

Massachusetts Continuing Legal Education, Inc.
Ten Winter Place, Boston, MA 02108-4651
1-800-966-6253 | www.mcle.org


NEW ENGLAND

Keep raising the bar.®

Data Security and Privacy in Massachusetts 2ND EDITION 2018

EDITOR Stephen Y. Chow AUTHORS Sara Yevics Beccia Alexandra Capachietti Stephen Y. Chow Patrick J. Concannon Ellen Marie Giblin Peter J. Guffin James A. Kitces Gary Klein Andrea C. Kramer Brooke A. Penrose C. Max Perlman Kathleen M. Porter Corinne A. Reed Jared Rinehimer Michael V. Silvestro David S. Szabo Merton E. Thompson IV


© 2018 by Massachusetts Continuing Legal Education, Inc. All rights reserved. Published 2018.

Permission is hereby granted for the copying of pages or portions of pages within this book by or under the direction of attorneys for use in the practice of law. No other use is permitted without prior written consent of Massachusetts Continuing Legal Education, Inc.

Printed in the United States of America

This publication should be cited: Data Security and Privacy in Massachusetts (MCLE, Inc. 2d ed. 2018)

Library of Congress Control Number: 2017959090
ISBN: 1-68345-063-9

All of Massachusetts Continuing Legal Education, Inc.’s (“MCLE’s”) products, services, and communications (“MCLE Products”) are offered solely as an aid to developing and maintaining professional competence. The statements and other content in MCLE Products may not apply to your circumstances and no legal, tax, accounting, or other professional advice is being rendered by MCLE or its trustees, officers, sponsors, or staff, or by its authors, speakers, or other contributors. No attorney-client relationship is formed by the purchase, receipt, custody, or use of MCLE Products. The statements and other content in MCLE Products do not reflect a position of and are not ratified, endorsed, or verified by MCLE or its trustees, officers, sponsors, or staff. Contributors of statements and other content in MCLE Products are third-party contributors and are not agents of MCLE. No agency relationship, either express, implied, inherent or apparent, exists between MCLE and any third-party contributor to MCLE Products.

Due to the rapidly changing nature of the law, the statements and other content in MCLE Products may become outdated. Attorneys using MCLE Products should research original and current sources of authority. Nonattorneys using MCLE Products are encouraged to seek the legal advice of a qualified attorney.

By using MCLE Products, the user thereof agrees to the terms and conditions set forth herein, which are severable in the event that any provision is deemed unlawful, unenforceable, or void. To the fullest extent permitted by applicable law, MCLE Products are provided on an “As Is,” “As Available” basis and no warranties or representations of any kind, express or implied, with respect to MCLE Products are made by MCLE or its trustees, officers, sponsors, or staff, individually or jointly. To the fullest extent permitted by applicable law, neither MCLE nor its trustees, officers, sponsors, or staff are responsible for the statements and other content in MCLE Products or liable for any claim, loss, injury, or damages of any kind (including, without limitations, attorney fees and costs) arising from or involving the use of MCLE Products.

Failure to enforce any provision of these terms and conditions will not be deemed a waiver of that provision or any other provision. These terms and conditions will be governed by the laws of the Commonwealth of Massachusetts, notwithstanding any principles of conflicts of law. These terms and conditions may be changed from time to time without notice. Continued use of MCLE Products following any such change constitutes acceptance of the change.

IRS Circular 230 Notice: Any U.S. tax advice found to be included in MCLE Products (including any attachments) is not intended or written to be used, and cannot be used, for the purpose of avoiding U.S. tax penalties or for promoting, marketing, or recommending to another party any tax-related matter or any other transaction or matter addressed therein.
Massachusetts Continuing Legal Education, Inc.
Ten Winter Place, Boston, MA 02108-4751
800-966-6253 | Fax 617-482-9498 | www.mcle.org

ACKNOWLEDGMENTS

Now in its second edition, Data Security and Privacy in Massachusetts has won international recognition as an Outstanding Publication of 2016 by the Association for Continuing Legal Education. This is thanks to the generous contribution of both substantive and editorial expertise by Stephen Y. Chow, Esq., the book’s chief editor. This primary source for commentary and analysis of state-specific practice and controlling federal authority regarding data security and data privacy has been updated to provide readers with current case decisions, regulatory and statutory developments, and insightful commentary on the impact of a set of ever-evolving laws on individuals and institutions. We thank Mr. Chow for sharing his expertise so graciously.

The many data security and privacy experts who joined Mr. Chow on this book are to be commended and thanked for clarifying important aspects of this multi-faceted topic, as well as for helping the business and commercial law community in Massachusetts to better understand the requirements, cresting issues, and nuances of the complex subject matter.

Thanks are in order, as well, to MCLE’s Board of Trustees, for its enthusiastic support of the publishing program, as well as to the many publications staff members who shared their publishing expertise in the production of these pages.

Maryanne G. Jensen, Esq.
Director of Publications, MCLE Press
January 2018


ABOUT THE EDITOR

STEPHEN Y. CHOW is a partner of Burns & Levinson LLP in Boston. His practice includes intellectual property litigation and transactions, business litigation and dispute resolution, business and finance transactions, life sciences, patent, and privacy and data security. He is a member of the American Law Institute, American Intellectual Property Law Association, International Association of Privacy Professionals, Massachusetts Uniform Law Commission, and the American, Massachusetts, and Boston Bar Associations. Mr. Chow is a graduate of Columbia University School of Law, Harvard Graduate School of Arts and Sciences, and Harvard College.

ABOUT THE AUTHORS

SARA YEVICS BECCIA is with Hasbro, Inc., in Pawtucket, RI. Previously, she was with the intellectual property and privacy/data security groups at Burns & Levinson LLP. She is a graduate of Boston University School of Law and Lafayette College.

ALEXANDRA CAPACHIETTI is with the Boston Trial Court in Boston. Previously, she was with Burns & Levinson LLP, where she concentrated in intellectual property and commercial litigation. She is a graduate of Roger Williams University School of Law and Roger Williams University.

PATRICK J. CONCANNON is a partner in the intellectual property and corporate and transactions departments of Nutter McClennen & Fish LLP in Boston. He focuses on US and international trademark clearance, trademark portfolio management and protection, and trademark and copyright licensing. He drafts commercial agreements and provides counseling on advertising, unfair competition, consumer protection, internet, and privacy law. Mr. Concannon is a member of the International Association of Privacy Professionals, International Trademark Association, Copyright Society of the USA, and the American and Boston Bar Associations. He is a graduate of Suffolk University Law School and Marquette University.

ELLEN MARIE GIBLIN is North American privacy officer at Philips Healthcare in Andover. She provides compliance advice and guidance on privacy law, data protection, data security, cybersecurity, and data breach response laws. Previously, she was with Boston Children’s Hospital and Locke Lord LLP. Ms. Giblin is a graduate of Boston College, Suffolk University School of Law, and the University of Massachusetts, Amherst.

PETER J. GUFFIN is a partner in the intellectual property and technology group and head of the privacy and data security practice at Pierce Atwood LLP in Portland, ME. His practice includes technology procurement and outsourcing arrangements; privacy, information security, and breach notification; protection and enforcement of IP rights; trademark, patent, copyright, and software licensing; and internet law and e-commerce initiatives. He is a visiting professor at the University of Maine School of Law. Mr. Guffin is a graduate of the University of Pennsylvania Law School and Rutgers College.


JAMES A. KITCES is a principal of Robins Kaplan LLP in Boston. He focuses on representing business insurers in first-party coverage and industrial subrogation claims and insurers in third-party liability disputes concerning construction defect claims, environmental claims, intellectual property disputes, and bad faith claims. Previously, he was in private practice and with the Office of the Georgia Attorney General. Mr. Kitces is a member of the American, Boston, and Atlanta Bar Associations and the Massachusetts Reinsurance Bar Association. He is a graduate of the University of Texas School of Law and Emory University.

GARY KLEIN is senior trial counsel for public protection and advocacy in the Office of the Massachusetts Attorney General in Boston. Previously, he was with Klein Kavanagh Costello LLP, specializing in consumer class actions and litigation, and Roddy Klein & Ryan. He is a graduate of Rutgers School of Law and Yale University.

ANDREA C. KRAMER is with Kramer Frohlich LLC in Boston. Her practice includes business litigation, employment litigation, civil rights cases, noncompetition and nonsolicitation agreements, real estate litigation, employee handbooks and policies, appellate advocacy, and data security. Previously, she was with the Office of the Massachusetts Attorney General and Hirsch Roberts Weinstein LLP. She is a member of the Women’s Bar Association of Massachusetts and Boston Bar Association. Ms. Kramer is a graduate of Harvard Law School and Wellesley College.

BROOKE A. PENROSE is an associate of Burns & Levinson LLP in Boston. She concentrates in counseling creators and innovators in intellectual property transactional matters. In her trademark practice, she is involved in clearing, prosecuting, and maintaining applications and registrations as well as drafting and negotiating licensing opportunities. She is a member of the Boston Bar Association, International Trademark Association, and Copyright Society of the USA. Ms. Penrose is a graduate of the University of Connecticut School of Law and the University of Michigan.

C. MAX PERLMAN is with Hirsch Roberts Weinstein LLP in Boston. He specializes in employment and shareholder relationships, including restrictive covenant litigation, close corporation shareholder disputes, wage and hour actions, and discrimination matters. He also provides representation in disputes arising out of corporate mergers and acquisitions and advises on negotiation of terms of executive compensation, employee terminations, and formulation of employment policies. Mr. Perlman is a member of the Massachusetts, Boston, and Federal Bar Associations, and the board of editors of Massachusetts Lawyers Weekly. He is a graduate of Boston University School of Law and Binghamton University.

KATHLEEN M. PORTER is a partner of Robinson & Cole LLP in Boston. Her practice includes intellectual property, business transactions, trade regulation, and internet law. She is a member of the American Bar Association, International Association of Privacy Professionals, Licensing Executives Society, and Women’s Competition Network. Ms. Porter is a graduate of Western New England University School of Law and Fitchburg State University.


CORINNE A. REED is an associate of Burns & Farrey, PC, in Boston, where her practice includes general liability, premises liability, and products liability. Previously, she was with Klein Kavanagh Costello LLP. She is a member of the Boston Bar Association. She is a graduate of Suffolk University Law School and George Washington University.

JARED RINEHIMER is an assistant attorney general with the Office of the Massachusetts Attorney General in Boston. Previously, he clerked for the Massachusetts Appeals Court and the US District Court for the Southern District of West Virginia. He is a graduate of Harvard Law School, the University of Washington, and Carnegie Mellon University.

MICHAEL V. SILVESTRO is a principal of Skarzynski Black LLC in Chicago, IL. He provides coverage and litigation counsel on first- and third-party policy programs for domestic and international insurers and reinsurers, focusing on complex property, construction, energy, builder risk, and cyber risk. He also advises insurance markets on significant losses concerning catastrophic events. Previously, he was with Robins Kaplan LLP and Mound Cotton Wollan & Greengrass LLP. Mr. Silvestro is a member of the American Bar Association, Western Loss Association, and Professional Liability Underwriting Society. He is a graduate of Boston College Law School and Union College.

DAVID S. SZABO is with Locke Lord LLP in Boston, where he is cochair of the healthcare practice, a partner in the corporate and transactional department, and a member of the privacy security group. His practice includes healthcare licensing and regulation, reimbursement, fraud and abuse compliance, structuring of joint ventures, Stark Law and Anti-Kickback compliance, and privacy and information security law. He is a member of the Healthcare Financial Management Association, Health Information and Management Systems Society, and the American and Boston Bar Associations. Mr. Szabo is a graduate of Boston University School of Law and the University of Rochester.

MERTON E. THOMPSON IV is a partner of Burns & Levinson LLP in Boston. He concentrates in patent, copyright, and trademark disputes and IT issues, including data security, privacy, CFAA claims, open source, SaaS, and cloud computing. Previously, he was with Fish & Richardson, PC. He is a graduate of Suffolk University Law School and Northeastern University.


TABLE OF CONTENTS

Chapter 1
Conceptions of Privacy and Security in a Digital World
Stephen Y. Chow, Esq., Burns & Levinson LLP, Boston

Chapter 2
The Electronic Communications Privacy Act
Peter J. Guffin, Esq., Pierce Atwood LLP, Portland, ME

Chapter 3
Computer Fraud and Abuse Act
Alexandra Capachietti, Esq., Boston Trial Court, Boston
Brooke A. Penrose, Esq., Burns & Levinson LLP, Boston
Merton E. Thompson IV, Esq., CIPT, Burns & Levinson LLP, Boston

Chapter 4
Fair Credit Reporting Act; Fair and Accurate Credit Transactions Act
Gary Klein, Esq., Boston
Corinne A. Reed, Esq., Burns & Farrey, PC, Boston
Jared Rinehimer, Esq., Boston

Chapter 5
The Gramm-Leach-Bliley Act (GLBA)
Ellen Marie Giblin, Esq., Philips Healthcare, Andover

Chapter 6
Information Security in Health Care: HIPAA and HITECH
David S. Szabo, Esq., Locke Lord LLP, Boston

Chapter 7
Congressional Response to the Internet
Patrick J. Concannon, Esq., Nutter McClennen & Fish LLP, Boston

Chapter 8
The Federal Trade Commission
Stephen Y. Chow, Esq., Burns & Levinson LLP, Boston

Chapter 9
Survey of State Data Security and Privacy Law
Stephen Y. Chow, Esq., Burns & Levinson LLP, Boston

Chapter 10
Massachusetts Data Security Law and Regulations
Andrea C. Kramer, Esq., Kramer Frohlich LLC, Boston
C. Max Perlman, Esq., Hirsch Roberts Weinstein LLP, Boston

Chapter 11
Consumer/Retail Issues
Stephen Y. Chow, Esq., Burns & Levinson LLP, Boston

Chapter 12
Network Service Providers
Sara Yevics Beccia, Esq., Hasbro, Inc., Pawtucket, RI

Chapter 13
Vendors and Business Associates
Stephen Y. Chow, Esq., Burns & Levinson LLP, Boston

Chapter 14
Cyber Risk Insurance and Risk Management
James A. Kitces, Esq., Robins Kaplan LLP, Boston
Michael V. Silvestro, Esq., Skarzynski Black LLC, Chicago, IL

Chapter 15
Privacy and Security in M&A Transactions
Kathleen M. Porter, Esq., Robinson & Cole LLP, Boston

Chapter 16
Current Developments
Stephen Y. Chow, Esq., Burns & Levinson LLP, Boston

Table of Cases
Table of Statutes, Rules, and References
Index


TABLE OF EXHIBITS

EXHIBIT 2A—Wiretap Report 2016
EXHIBIT 3A—18 U.S.C. § 1030, Fraud and Related Activity in Connection with Computers
EXHIBIT 5A—Regulatory Enforcement Authority for the Privacy, Safeguards, and Pretexting Rules
EXHIBIT 5B—Model Privacy Notice Form: Opt Out
EXHIBIT 5C—Model Privacy Notice Form: No Opt Out
EXHIBIT 6A—HIPAA Security Standards Matrix
EXHIBIT 8A—Red Flags Rule Guidelines
EXHIBIT 8B—GLBA Safeguards Rule Information Protection Program Elements
EXHIBIT 8C—FTC Health Information Breach Notification Rule
EXHIBIT 10A—Frequently Asked Questions Regarding 201 C.M.R. § 17.00
EXHIBIT 10B—A Small Business Guide: Formulating a Comprehensive Written Information Security Program
EXHIBIT 10C—Standards for the Protection of Personal Information of Residents of the Commonwealth—201 C.M.R. § 17.00
EXHIBIT 10D—OCABR 201 C.M.R. § 17.00 Compliance Checklist
EXHIBIT 10E—Personal Information: A Graphical Representation
EXHIBIT 10F—Action Plan for Complying with Massachusetts Data Security Regulations
EXHIBIT 10G—Twelve Practical Tips for Employers
EXHIBIT 10H—Helpful Websites
EXHIBIT 12A—Service Provider Data Security Questionnaire
EXHIBIT 13A—HIPAA Privacy and Security Compliance Agreement
EXHIBIT 15A—Sample Initial Due Diligence Request List for Privacy and Security M&A
EXHIBIT 15B—Sample Agreement Provisions for Privacy and Security
EXHIBIT 16A—House of Representatives Resolution 835


CHAPTER 1

Conceptions of Privacy and Security in a Digital World

Stephen Y. Chow, Esq.
Burns & Levinson LLP, Boston

§ 1.1 Introduction
§ 1.2 Privacy and Security: Promises and Dangers of “Big Data”
    § 1.2.1 Distinguishing Privacy and Security
        (a) Privacy
        (b) Security
    § 1.2.2 The Challenges of Cyberspace
        (a) Security Challenges
        (b) Privacy Challenges
§ 1.3 Sources and History of the Law of Information Privacy and Security
    § 1.3.1 Foundations
    § 1.3.2 Civil Rights and Consumer Protection
    § 1.3.3 Credit Expansion
    § 1.3.4 Computerization and Security
    § 1.3.5 Electronic Health Records
    § 1.3.6 Financial Institution Reform
    § 1.3.7 Addressing Web Abuses
    § 1.3.8 Identity Theft
    § 1.3.9 Summary of the Framework

Scope Note
This chapter provides an overview of cresting issues impacting privacy and data security. Sources of the law of information privacy, from the late nineteenth century onward, are discussed. The chapter draws an interesting distinction between privacy and security in the age of “Big Data.”

§ 1.1 INTRODUCTION

In “The Right to Privacy,” 4 Harv. L. Rev. 193 (1890), Boston law partners Samuel Warren and Louis Brandeis painted a picture of privacy when printed newspapers were the only “mass media.” They contended that there should be a right of action against the gossip press even where there was no injury to reputation but only to “feelings.” Such a right, they argued, stemmed from the principle that “[t]he common law secures to each individual the right of determining, ordinarily, to what extent his thoughts, sentiments, and emotions shall be communicated to others.” Warren & Brandeis, “The Right to Privacy,” 4 Harv. L. Rev. at 198 (citing Millar v. Taylor, 4 Burr. 2303, 2379, n.2 (1769) (Yates, J.) (“It is certain every man has a right to keep his own sentiments, if he pleases. He has certainly a right to judge whether he will make them public, or commit them only to the sight of his friends.”)). The precedents, however, were limited to actions against threatened publications of unpublished letters and of images appropriating commercial value or threatening reputational damage in defamation.

Practice Note
The common law tort of misappropriation of unpublished letters was brought within federal copyright law in 1976, 17 U.S.C. § 102(a), Pub. L. No. 94-553, § 102(a), 90 Stat. 2544–45 (Oct. 19, 1976) (extending federal copyright to all “original works of authorship fixed in a tangible medium of expression”), and the former tort was preempted, 17 U.S.C. § 301, Pub. L. No. 94-553, § 301, 90 Stat. 2572. A commercial right to “publicity” is recognized as a “trade value,” along with “trade secrets,” protected against misappropriation under state law, as reviewed in the Restatement (Third) of Unfair Competition, ch. 4 (1995). See also Zacchini v. Scripps-Howard Broad. Co., 433 U.S. 562 (1977) (publicity); Kewanee Oil Co. v. Bicron Corp., 416 U.S. 470 (1974) (trade secrets). In a real sense, these protections may be considered protection of “commercial information privacy.”

Do we care today? On the eve of the current millennium, Scott McNealy, CEO of Sun Microsystems (the company that created and popularized the Web application language Java), told a group of reporters in reference to privacy, “Get over it.” Polly Sprenger, “Sun on Privacy: ‘Get Over It,’” Wired, Jan. 26, 1999, available at http://archive.wired.com/politics/law/news/1999/01/17538. With ubiquitous social media posting, blogging, messaging, and tweeting, built on a reality-television culture of exhibitionism, earlier-generation conceptions of privacy seem quaint, to say the least. Those who do care about “privacy” often do not sympathize with those who blithely posted verbal and pictorial images of themselves that they later wish the Internet would “forget,” as background checks increasingly are done by web search rather than personal interview, or who allowed partners to record intimate images only to find them posted in revenge or extortion.

What many of us do care about are the seemingly endless breaches of security of information collected about us that may allow others to appropriate our identities in “cyberspace,” where those identities are no longer anchored in face-to-face or voice conversations (and even those may soon be credibly impersonated based on our behavioral patterns collected and correlated by service providers in return for our “free” access). Fourteen years after the call to arms echoed by Stephen Mihm, “Dumpster-Diving for Your Identity,” N.Y. Times, Dec. 21, 2003, available at http://www.nytimes.com/2003/12/21/magazine/dumpster-diving-for-your-identity.html, we have become increasingly inured to reports of breaches exposing enormous numbers of individuals, such as Nicole Perlroth, “All 3 Billion Yahoo Accounts Were Affected by 2013 Attack,” N.Y. Times, Oct. 3, 2017, available at https://www.nytimes.com/2017/10/03/technology/yahoo-hack-3-billion-users.html, and Stacy Cowley, “2.5 Million More People Potentially Exposed in Equifax Breach,” N.Y. Times, Oct. 2, 2017 (145.5 million total Americans), available at https://www.nytimes.com/2017/10/02/business/equifax-breach.html. Unlike setting up a Yahoo account, the affected individuals generally did not provide their information to Equifax, which collects the information from merchants and banks to compile individual credit histories to sell to creditors.

The cost to businesses of a breach of security of customer information is substantial to extreme. In its most recent annual survey, of 419 businesses in thirteen countries, the Ponemon Institute found an average cost per breach of $3.62 million (including the loss of customers, detection and escalation, and notification), or $141 per record compromised, as well as larger breaches and a 2.1 percentage-point increase, to 27.7 percent, in the probability of another breach within two years. Ponemon Institute LLC, 2017 Cost of Data Breach Study: Global Overview at 1 (June 2017), downloadable at https://www.ibm.com/security/infographics/data-breach/. Equifax, breached in 2017, expects costs of more than $150 million in 2017 and is facing 240 purported class actions, as well as state attorneys general and federal administrative agency actions. Tom Buerkle & Richard Bels, “Cyberattack Casts a Long Shadow on Equifax’s Earnings,” N.Y. Times, Nov. 10, 2017, available at https://www.nytimes.com/2017/11/10/business/dealbook/equifax-cyberattack-earnings.html.

Practice Note
In late 2017, the Judicial Panel on Multidistrict Litigation transferred, under 28 U.S.C. § 1407, ninety-seven of the private cases to the federal district court for the Northern District of Georgia, where Equifax is headquartered, over the objection of a few parties to centralization. In re Equifax, Inc., Customer Data Security Breach Litigation, 2017 U.S. Dist. LEXIS 200507 (J.P.M.L. Dec. 6, 2017).

This volume lays out the broad framework of personal information privacy in Massachusetts, in its United States-focused Internet context, and the security-directed law that defines and protects that privacy, as well as that of commercial information and operations.

§ 1.2 PRIVACY AND SECURITY: PROMISES AND DANGERS OF “BIG DATA”

§ 1.2.1 Distinguishing Privacy and Security

It is often said in the cybersecurity world that “privacy” and “security” are “two sides of the same coin.” Indeed, the two overlap historically in the common law and in the statutory law that underlies the operational requirements and best practices discussed in this volume. However, they are distinct concepts, with “privacy” being the broader, policy-based notion and “security” the more pragmatic notion, executed in specific administrative, physical, and technological protections, failures of which are noted above.

(a) Privacy

Dictionary definitions of “privacy” include “the quality or state of being apart from company or observation: seclusion” or “freedom from unauthorized intrusion” or “secrecy.” E.g., Merriam-Webster Online Dictionary “privacy” definitions 1a, 1b, and 3a, https://www.merriam-webster.com/dictionary/privacy. Privacy is culturally dependent: a recognition of certain rights of an individual, or even an entity, to enjoy some boundaries—that is, to be autonomous in some space, free from intrusion or interference. In the United States, the Bill of Rights preserves such a space against unreasonable search and seizure by the government (see Section 9.1.1 of this book), a principle extended by Congress to limit government interception of electronic communications (see chapter 2 of this book).

Historical Note
In 1968, Congress enacted what is known as the “Federal Wiretap Law,” 18 U.S.C. §§ 2510–2520, Act of June 19, 1968, Pub. L. No. 90-351, Title III, § 802, 82 Stat. 212. Nearly two decades later, Congress enacted the Electronic Communications Privacy Act of 1986 (“ECPA”), Act of Oct. 21, 1986, Pub. L. No. 99-508, 100 Stat. 1848–71, “to update and clarify Federal privacy protections and standards in light of dramatic changes in new computer and telecommunications technologies,” S. Rep. 99-541, reprinted in 1986 U.S. Code & Cong. News 3555. See chapter 2 of this book.

The Supreme Court has recognized associative and reproductive autonomy as protected privacy. E.g., NAACP v. Alabama, 357 U.S. 449 (1958); Griswold v. Connecticut, 381 U.S. 479 (1965); Roe v. Wade, 410 U.S. 113 (1973).

Practice Note
“Information privacy” is distinguished from “bodily privacy,” which again includes state protection against assault, and “territorial privacy,” which includes state protection against trespass but often overlaps in the application of constitutional protections against unreasonable search and seizure, and their extensions. Bodily privacy is also extended to “autonomy” in sexual or reproductive “choice.” “Communication privacy,” which is a subset of information privacy, extends also to autonomy in “freedom” of speech, expression, or association.

Congress has enacted legislation to limit the release of information about individuals by, and matching between, federal agencies (e.g., Privacy Act, 5 U.S.C. § 552a, added by Act of Dec. 31, 1974, Pub. L. No. 93-579, § 3, 88 Stat. 1897) and to limit the use of certain “personal information” collected by certain industry sectors (chapters 4, 5, and 6 of this book) and of telecommunication content stored by certain service providers (the Stored Communications Act (“SCA”), 18 U.S.C. §§ 2701–2710, a portion of ECPA; see Section 2.5 of this book).


Several states have enacted varying specific protections for personal information privacy (chapter 9) and have recognized, with some variations, the four “Prosser” torts of invasion of privacy (descended from the Brandeis propositions; see §§ 1.3 and 9.1.2 of this book):

• “highly offensive to a reasonable person” intrusion upon the seclusion of another;
• appropriation (for “benefit”) of the other’s name or likeness;
• “highly offensive to a reasonable person” publicity given to the other’s private life; or
• publicity that places the other in a false light “highly offensive to a reasonable person.”

Restatement (Second) of Torts §§ 652B to 652E (1977). See William Prosser, “Privacy,” 48 Calif. L. Rev. 383 (1960); William Prosser, Law of Torts 831–32 (West Publishing 3d ed. 1964).

Of these, the “appropriation of name or likeness” tort does not require the judgment (fact-finding) of being “highly offensive to a reasonable person,” and in that way is similar to a violation of a property right or right to exclude (against the “world”) rather than breach of an individual duty to act reasonably in a particular circumstance. The appropriation tort is extended under the right of publicity against unconsented appropriation (or “misappropriation”) of the commercial value of a name, image, or “other indicia of identity” (see Restatement (Third) of Unfair Competition §§ 46 and 47 (1995), quoted at Section 9.1.2 of this book). As noted in the first Practice Note in § 1.1 above, other misappropriation causes protect trade secrets and, before the 1976 Copyright Act, unpublished works.

In this author’s view, protection of personal information privacy can be mapped to the protection of commercial information privacy using a misappropriation theory. The Uniform Trade Secrets Act § 1(2) (1985) (“UTSA”) definition of “misappropriation,” adopted in relevant form by forty-two states and by the federal Defend Trade Secrets Act of 2016, 18 U.S.C. § 1839(5), Pub. L. No. 114-153, § 2(b)(3), 130 Stat. 380 (May 11, 2016), provides for two main branches of misappropriation:

• acquisition of the protected information by “improper means” (UTSA § 1(2)(i); 18 U.S.C. § 1839(5)(A)); and
• disclosure or use in breach of a duty to maintain secrecy or limit use, that is, a confidential relationship (UTSA § 1(2)(ii)(B)(II); 18 U.S.C. § 1839(5)(B)(ii)(II)).

If private personal information (including an image) is valued by American society so as to be protected similarly to commercially valuable and private information, its protection may be mapped onto this misappropriation framework. Brandeis-Prosser “intrusion” can be mapped onto the “acquisition by improper means” misappropriation, while publication of presumed private or confidential information (such as the “intimate images” in Section 9.2.3(e) of this book) can be mapped to the “breach of confidentiality” misappropriation, which has been the approach taken by the United Kingdom to personal information privacy (see Richards & Solove, “Privacy’s Other Path: Recovering the Law of Confidentiality,” 96 Geo. L.J. 123 (2007)).

1–5

§ 1.2

Data Security and Privacy in Massachusetts

Prosser “appropriation” maps to the right of publicity, but, for more private information, may also be mapped to breach of confidence. “False light” publication may be mapped to defamation, but, for more private information, may also be mapped to breach of confidence. However, Americans do not share the European view of personal information privacy as a natural or human right. Instead, our various freedoms are privileges to act— autonomy—rather than the property right of excluding others. A challenge of cyberspace—the intimately informationally connected world—is that while it empowers individuals by providing virtually instant and apparently costless access, it far greater empowers the owners of the networks and the data they collect about individual behavior (even if not bound to an individual by a regulated identifier) who enjoy exclusivity by physically and contractually controlling access.

(b)

Security

Dictionary definitions of “security” include “freedom from danger” and “measures taken to guard against espionage or sabotage, crime, attack, or escape” E.g., MerriamWebster Online Dictionary “security” definitions 1a and 4b, https://www.merriamwebster.com/dictionary/security. Like “privacy,” “security” involves a freedom—an incident of autonomy. However, it bears the connotation of a physical barrier to a physical threat. In each of privacy “rights” explored in the preceding subsection, the notion of “security” is applicable as a physical or administrative barrier to intrusion into a private or secure space or process of communication that would allow unauthorized use or corruption of private information. “Privacy” has the connotation of a lesser physical threat or barrier (e.g., a window shade) and of nonphysical protections, such as custom, ethics, and law. Security is directly implicated for only a subset of the concerns of information privacy; it does not address the collection or voluntary disclosure of private information. However, security extends to information beyond that protected by personal or commercial information privacy to operational and computer functions, compromise of which may affect the other concerns of information privacy, including the normal functioning of an organization or facility. For example, it is implicated in the integrity of online or other remote transactions both for the substance of the communications and whether the parties are who they say they are—authenticated (consider “security questions”). Diversion of authentication credentials may lead to “identity theft.” With the Equifax breach having exposed 145.5 million American identities, many traditional credentials such as social security numbers are subject to question. In the context of digital information or data security, a physical (including electronic) barrier may be manifested in a “firewall” or in encryption, both electronically cordoning off electronically stored or communicated information. Another example is off-line physically secure storage of information media such as a safe or locked room. The “data security” function involves multiple aspects of ensuring protection for an organization’s valuable information and computer functions, including


• providing for safe communication, management, and storage of information, such as through encryption and firewalls (see the sketch at the end of this subsection);

• monitoring and testing safeguards, including audits and records of incidents of breaches;

• keeping up-to-date on vulnerabilities and implementing responses to new exploits;

• educating information technology staff and users as appropriate, including sensitivity to socially engineered exploits;

• documenting and maintaining documentation of procedures; and

• establishing a response program for breaches.

As discussed below, with the pervasive use of digital information, federal laws have been enacted to mandate some of these functions in certain industry sectors, and relatively few states have enacted mandates for some of these functions to protect their citizens.

Generally, “security” in this volume is addressed to the integrity and maintenance of data by a service provider or other “operators,” which data may include the personal information of users/consumers, to be protected against corruption or misappropriation. “Privacy” in this volume is addressed primarily to the collection and use of consumer data by service providers or other operators, which, if wrongful, could be considered misappropriation.
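To make the first of the listed functions concrete, the following is a minimal sketch of encryption electronically cordoning off a stored record, as described above. It assumes the third-party Python “cryptography” package; the record contents are hypothetical, and real key management (secure storage, rotation, and access control of the key) is omitted.

    # A minimal sketch, assuming the third-party Python "cryptography" package.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # the key itself must be guarded (e.g., off-line)
    cipher = Fernet(key)

    record = b"SSN: 000-00-0000"     # hypothetical personal information at rest
    stored = cipher.encrypt(record)  # ciphertext is unreadable without the key

    # Only a holder of the key can recover the plaintext.
    assert cipher.decrypt(stored) == record

A firewall plays the complementary role for information in transit, controlling which network connections may reach the systems holding such records at all.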

§ 1.2.2 The Challenges of Cyberspace

Issues of data security and privacy permeate virtually all commerce today. This is because of our pervasive reliance on inexpensive and publicly available communications, processing, storage, and dissemination of digitized information. The 1990s explosion of public and commercial (originally called “spam”) use of the World Wide Web was facilitated by advertising revenues for providers of “public” services, much as in the earlier growth of commercial broadcast radio and television. Unlike the earlier technology, that of the Web is both inherently and intentionally interactive—the user’s browser on an “edge” (of the network) device necessarily provides “local” information to the various “nodes” on the network in order to function, and can provide much more that is developed and stored in “cookies,” such as information on the history of use (i.e., sites visited) of the browser. This allowed the advance beyond “contextual” (i.e., the “tuned to” program or visited page) advertising to “behavioral” advertising targeted to browsers (i.e., our digital identities) with particular webpage access histories. To summarize:

• targeting may be passive, based on the context of the websites we or others visit and on our web behavior tracked by “cookies” left in our browser software stack that persist between our browsing sessions; or


• it may be active, through the collection and correlation of our transactional behavior with real-time location acquired from the Internet address of the routers most closely feeding our terminals or by location tracking of our portable devices (for example, triangulated by GPS satellites, local Wi-Fi signals, and cellular communications access points).

Many members of the public appreciate the targeting of what they consider more-relevant advertising, in addition to the services it subsidizes. It is also often criticized as “intrusive” or an “invasion of privacy,” wasting time and resources as uninvited advertising e-mail (“spam”). A sketch of the persistent-cookie mechanism behind such tracking follows.
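The persistent-cookie mechanism can be sketched with Python’s standard http.cookies module. The identifier and header strings below are hypothetical; in practice, the cookie is set by an HTTP response from an advertising network’s server and returned by the browser with each later request to any page carrying that network’s content.

    # A minimal sketch using Python's standard http.cookies module.
    from http.cookies import SimpleCookie

    # The ad network's HTTP response sets a long-lived identifier.
    response = SimpleCookie()
    response["tracker_id"] = "user-8c41"                     # hypothetical identifier
    response["tracker_id"]["max-age"] = 60 * 60 * 24 * 365   # persists for a year
    print(response.output())  # prints a Set-Cookie header with Max-Age=31536000

    # On each later visit to a page carrying the network's ad, the browser
    # sends the identifier back, letting the network correlate page views.
    returned = SimpleCookie("tracker_id=user-8c41")
    print("Browsing history keyed to:", returned["tracker_id"].value)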

(a) Security Challenges

Targeting often is not as benign as simply irritating or even offensive advertisements, or deceptive “sponsored” or “fake” news (which may not be benign at all). It often is the vehicle for planting “malware” such as the following overlapping types, described by their functions:

• “viruses,” or computer code that, when executed, replicates itself or “infects” or modifies other code in a computer in a way not intended by the owner of the computer;

• “worms,” or viruses intended for propagation to other computers on a network, often to create “botnets” to provide secret infiltration of the network with “spyware” or to disrupt the operations of a network resource, for example, through denial-of-service attacks overwhelming a network resource or node with excessive requests for service;

• “spyware,” or computer code that reports information accessible to a computer without the owner’s knowledge, which may include tracking cookies reporting on user surfing behavior, “keyloggers” that capture user entry of information, including passwords, or code directing (or allowing remote direction of) a “secure” computer to access “secure” information;

• “adware,” or computer code injected into the browser stack to present advertisements or other messages; and

• “ransomware,” or computer code that locks down a portion of an infected computer, such as by encrypting the data files, and demands a ransom for unlocking.

Such malware may be delivered by “hacking” into a computer system through exploitation of vulnerabilities in the receiving device, such as in how a device handles certain exceptions on its receiving ports. Of greater relevance to readers of this volume are actions of human users who inadvertently accept the following:

• “trojan horses,” or code that masks its true intent with an interface designed to be attractive to the user, including links proffered to remedy a security breach in popular accounts, and


• “web beacons” or “web bugs” that may be embedded in graphics presented by the browser and that a “mouse-over” might activate, or that may report back simply upon display of the graphic.

These entry points are presented in the following types of “social engineering”:

• “pharming,” or using a webpage to which a user may be attracted by an online search or by an advertising link or such devices as partisan news and suggestive or celebrity images;

• “phishing,” or directing to a large group of users messages that appear to be legitimate or benign spam but that include a link activating a trojan horse; the term may also include scams offering business opportunities or rewards that are engineered for the user to be defrauded (“whaling” is the targeting of a “big fish,” such as an executive in a targeted organization); and

• “spearphishing,” or very detailed targeting with messages made attractive and deceptive by researched (but sham) reference to a business associate or personal friend, such that the user may mistakenly reply to the message or execute embedded malware; spearphishing has provided the original entry point in the notorious hacks of the day—alleged Chinese espionage or Russian election tampering.

Our inurement to spam may also lower our guard against mousing over web bugs in images or clicking on interesting-sounding links or attachments, actions we may later regret. These are threats both to “security,” in improper access to and corruption of information, and to “privacy,” in the unauthorized disclosure and use of private information. Realization of these threats has led to theft of the financial and health identity of consumers and of valuable information of businesses and governments, and even compromise of physical infrastructure.

(b) Privacy Challenges

The issue of privacy in today’s cyberspace is even more complex. Not only is the online behavior of those who are connected recorded and correlated, but the “content” of digital messages, including e-mail, is routinely collected from our use of (including communicating with those who use) “free” messaging accounts that are provided under “privacy policies” allowing automatic indexing of every word and image in every message going through the provider’s message processors. Even where this indexing may seem anonymous because of its automation, it may be easily correlated by the collector-provider with the “metadata” about the underlying message, including the parties’ Internet locations, and with the user’s account information held by the provider.

Where such correlation pinpoints our possible current interest in some advertising or other message when we view a webpage or walk near a retailer or restaurant, many of us may be “creeped out.” Those uncomfortable with the collection and use of personal information for such targeting of online advertisement found some hope in Professor Lawrence Lessig’s 1999 proposal for assigning (and trading) a property value to that information. Lawrence Lessig, Code and Other Laws of Cyberspace 142–63 (Basic Books 1999). Such hope was eclipsed by the “Homeland Security” concerns of post–September 11, 2001 America, resulting in such expansion of government surveillance of private activity as in the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act of 2001, Pub. L. No. 107-56 (2001).

Practice Note
On the eve of the expiration of the USA PATRIOT Act, Congress enacted the Uniting and Strengthening America by Fulfilling Rights and Ensuring Effective Discipline Over Monitoring (USA FREEDOM) Act of 2015, Pub. L. No. 114-23 (2015), to limit routine collection of telephone metadata by government agencies through telecommunications companies. The June 2013 revelations of Edward Snowden of such collection were cited in the committee report advancing the legislation. H.R. Rep. No. 114-109, pt. 1, at 2 (2015).

While in-transmission interception of communications by government has been passionately protested, indexing of content by service providers has received less attention. Indeed, it has been compromise of information “at rest” that has caused more concern. What seized and held the attention of consumers in America—and of their legislators—were the increasing reported instances of “identity theft,” in which the thieves or their customers may use stolen personal information to access and use the personal financial, medical, or other accounts of the victims or to set up and use bogus accounts (e.g., open new credit cards) that will reflect on the credit of the victims. Identity theft directly affected the economic interests of consumers. The Identity Theft and Assumption Deterrence Act of 1998, Pub. L. No. 105-318, 112 Stat. 3007 (Oct. 30, 1998), codified at 18 U.S.C. § 1028(a)(7) (“knowingly transfers or uses, without lawful authority, a means of identification of another person with the intent to commit, or to aid or abet, any unlawful activity that constitutes a violation of Federal law, or that constitutes a felony under any applicable State or local law”), in addition to adding a federal crime of identity theft, made the Federal Trade Commission (FTC) a clearinghouse for consumer complaints of identity theft. See “FTC Targets Identity Theft,” FTC Press Release, Feb. 17, 2000 (establishment of clearinghouse), available at https://www.ftc.gov/news-events/press-releases/2000/02/ftc-targets-identity-theft; 18 U.S.C. § 1028 note (“Centralized Complaint and Consumer Education Service for Victims of Identity Theft”—FTC to log and acknowledge complaints, provide information to victims, and refer complaints to appropriate entities such as the national consumer credit reporting agencies and law enforcement agencies). (See chapter 8 of this book.)

Scams evolved quickly on the Internet (not just the browser-based Web) and also in telephone marketing to acquire personal information through mass, or even targeted, e-mailing of fraudulent notices of business opportunities (“phishing,” such as the famous Nigerian scam), as well as through surreptitious deposits in user devices of “malware” or “spyware,” sometimes upon visits to bogus websites (“pharming”), as reviewed at § 1.2.2(a). The far more efficient way for identity thieves to get the personal information of individuals is to acquire that information from large, legitimate enterprises that have already collected that information in one place. Whether such acquisition is made by an insider (by accident or otherwise), by a fraudulent purchaser, or by technological “hacking,” these invasions of the privacy of the custodians (collectors, licensees, etc.) of the information and that of the individuals are called “data breaches”—i.e., breaches of the security measures put in place (or that should have been put in place) by the custodians to protect the individuals.

After investigations by the FTC and after laws were enacted by Congress and many states to require public notice of data breaches—and likely because of these laws—reports began to emerge in the mid-2000s of data breaches putting at risk millions of consumers and employees. See, e.g., In re Eli Lilly & Co., Docket No. 012-312, Decision and Order (FTC May 10, 2002) (unintentional disclosure of e-mail addresses), available at http://www.ftc.gov/os/2002/05/elilillydo.htm; In re Gateway Learning Corp., Docket No. 042-3047, Decision and Order (FTC Sept. 17, 2004) (rented personal information), available at http://www.ftc.gov/os/caselist/0423047/040917do0423047.pdf; In re BJ’s Wholesale Club, Inc., Docket No. 042-3160, Agreement Containing Consent Order (FTC Sept. 23, 2005) (insecure storage of credit card information), available at https://www.ftc.gov/enforcement/cases-proceedings/042-3160/bjs-wholesale-club-inc-matter. A decade later, reports of major data breaches were common:

2014 will be remembered for such highly publicized mega breaches as Sony Pictures Entertainment and JPMorgan Chase & Co. Sony suffered a major online attack that resulted in employees’ personal data and corporate correspondence being leaked. The JPMorgan Chase & Co. data breach affected 76 million households and seven million small businesses.

[T]he average total cost of a data breach for the 350 companies participating in this research increased from $3.52 to $3.79 million. The average cost paid for each lost or stolen record containing sensitive and confidential information increased from $145 in 2014 to $154 in this year’s study.

Ponemon Institute LLC, 2015 Cost of Data Breach Study: Global Analysis at 1 (May 2015), available at http://www-03.ibm.com/security/data-breach. In 2015, there emerged the breach of the U.S. Office of Personnel Management databases with lax security, exposing extensive personal information of federal employees and contractors with security clearances. See, e.g., David E. Sanger, Nicole Perlroth & Michael D. Shear, “Attack Gave Chinese Hackers Privileged Access to U.S. Systems,” N.Y. Times, June 21, 2015, at 1, available at http://www.nytimes.com/2015/06/21/us/attack-gave-chinese-hackers-privileged-access-to-us-systems.html. As reviewed in § 1.1 above, two years later, the Ponemon Institute reported slightly lower costs but more incidents and greater likelihood of recurrence. And the Equifax exposure of 145.5 million Americans was based on human failure to install a patch in response to a security alert. Tara Siegel Bernard & Stacy Cowley, “Equifax Breach Caused by Lone Employee’s Error, Former C.E.O. Says,” N.Y. Times, Oct. 3, 2017, available at https://www.nytimes.com/2017/10/03/business/equifax-congress-data-breach.html.

“Big Data”—including the enormous amount of information collected by Internet service providers from the constant activities of Web-connected devices (and, presumably if not explicitly, their owner-users)—provides the ability to locate consumers in geographic contexts and to target “relevant” messages (e.g., advertisements). Even many of the “Millennials” who have grown up with the interactive Web can find the “pushing” of such messages “creepy.” Yet individuals of all generations continue to post enormous amounts of personal information on social media sites and surrender location information to get free services.

Behavioral targeting is not merely “creepy”; it also feeds messages such as “sponsored news” (sometimes not so labeled) to those the targeter believes would be amenable (biased) to the message, contributing to the seemingly unprecedented polarization of American society. While this is not a privacy issue in the sense addressed by existing laws (such as those reviewed in this volume), the creation of “alternate realities” profoundly affects informed autonomy. The issue of behavioral targeting is not treated in this volume; nor is the much-increased collection of information through our constant connection via smart phones, smart cars, and smart home appliances that are trained by our interactions with them. In many respects, the intelligence (and memory) of these devices knows more about us (at least in precise data points) than we do or can remember perfectly. There is little ability for us to access that data, to correct it if it is wrong, or to erase it from the cloud. Is this autonomy?

Primary issues addressed in this book are the rights and options of the individual as to “information privacy,” including any limitations on the information collected by others with a legitimate relationship; the “data security” of the information so collected; and the law governing the collectors of this Big Data. The individual has limited control over the security of his or her information in the hands of the Big Data collector, through using strong passwords and keeping appropriate privacy settings. However, the realities of convenient use—the attraction of Internet commerce and communication—limit the actual individual use of even these limited controls.

§ 1.3 SOURCES AND HISTORY OF THE LAW OF INFORMATION PRIVACY AND SECURITY

§ 1.3.1 Foundations

Information privacy has been entwined with bodily privacy and territorial (home) privacy in their protection under the Constitution. Thus, the Fourth Amendment provides as follows:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

U.S. Const. amend. IV. Information in “private papers” was recognized by the 1886 Court in Boyd v. United States as protected by the Fourth and Fifth Amendments: “[W]e have been unable to perceive that the seizure of a man’s private books and papers to be used in evidence against him is substantially different from compelling him to be a witness against himself.” Boyd v. United States, 116 U.S. 616, 633 (1886). Communication privacy, however, was not recognized in a wiretap outside the home in the 1928 case of Olmstead v. United States, 277 U.S. 438, 464–65 (1928) (declining to extend to wiretapping the rule against opening U.S. mail). In his dissent, Justice Brandeis implored:

The makers of our Constitution undertook to secure conditions favorable to the pursuit of happiness. They recognized the significance of man’s spiritual nature, of his feelings and of his intellect. They knew that only a part of the pain, pleasure and satisfactions of life are to be found in material things. They sought to protect Americans in their beliefs, their thoughts, their emotions and their sensations. They conferred, as against the Government, the right to be let alone—the most comprehensive of rights and the right most valued by civilized men. To protect that right, every unjustifiable intrusion by the Government upon the privacy of the individual, whatever the means employed, must be deemed a violation of the Fourth Amendment. And the use, as evidence in a criminal proceeding, of facts ascertained by such intrusion must be deemed a violation of the Fifth.

Olmstead v. United States, 277 U.S. at 477–79 (notes omitted). The Court continued for decades to refuse to find a constitutional right to privacy against eavesdropping that did not physically invade the home. See, e.g., Goldman v. United States, 316 U.S. 129 (1942) (listening device on wall outside home). It took nearly a century after the invention of the telephone before a majority of the Court, in the 1960s, applied the Fourth Amendment to protect a “reasonable expectation of privacy” in telephonic communications outside a person’s home. Katz v. United States, 389 U.S. 347, 361–62 (1967); see also Berger v. New York, 388 U.S. 41 (1967).

§ 1.3.2 Civil Rights and Consumer Protection

It was during the Civil Rights era of the 1960s that Congress added a provision to the federal criminal code addressed to “Wire and Electronic Communications Interception and Interception of Oral Communications” (the Wiretap Law), 18 U.S.C. §§ 2510–2520, added by the Omnibus Crime Control and Safe Streets Act of 1968, Pub. L. No. 90-351, § 802, 82 Stat. 212–24 (1968), amended two decades later by the Electronic Communications Privacy Act of 1986 (ECPA), Pub. L. No. 99-508, 100 Stat. 1848 (1986). (See chapter 2 of this book.) During this time, Professor William Prosser compiled from the common law of the states four private torts of invasion of privacy:

• unreasonable intrusion upon the seclusion of another;

• appropriation of the other’s name or likeness;

• unreasonable publicity given to the other’s private life; or

• publicity that unreasonably places the other in a false light before the public.

Restatement (Second) of Torts § 652A (1977).

Also begun in this period was the development of general privacy rights in personal information, particularly health information. Thus, the then Department of Health, Education and Welfare sponsored the development of a report on “Records, Computers and the Rights of Citizens,” released in 1973.

[T]he report recommends the enactment of a Federal “Code of Fair Information Practice” for all automated personal data systems. The Code rests on five basic principles that would be given legal effect as “safeguard requirements” for automated personal data systems.

• There must be no personal data record-keeping systems whose very existence is secret. [Notice]

• There must be a way for an individual to find out what information about him is in a record and how it is used. [Access]

• There must be a way for an individual to prevent information about him that was obtained for one purpose from being used or made available for other purposes without his consent. [Consent]

• There must be a way for an individual to correct or amend a record of identifiable information about him. [Integrity]

• Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuse of the data. [Security]

Records, Computers and the Rights of Citizens—Report of the Secretary’s Advisory Committee on Automated Personal Data Systems xx (U.S. Dep’t of Health, Education & Welfare, July 1973), available at http://www.justice.gov/opcl/docs/rec-com-rights.pdf. The unifying concept was fairness.


Practice Note
These and other considerations of security (analyzed as physical, administrative, and technical controls) were carried forward over a generation in the development of the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Pub. L. No. 104-191, 110 Stat. 1936–2103 (1996), and the Department of Health and Human Services (HHS) “Standards for Privacy of Individually Identifiable Health Information,” 45 C.F.R. pts. 160 (General Administrative Requirements) and 164 (Security and Privacy), promulgated on December 28, 2000. 65 Fed. Reg. 82,462–829 (Dec. 28, 2000). (See chapter 6 of this book.)

§ 1.3.3 Credit Expansion

A uniquely American development in the 1960s–1980s was Congressional encouragement of consumer credit cards through federal consumer protections, such as the limitation of the risk of fraud in the Truth in Lending Act of 1968. Consumer Credit Protection Act, Pub. L. No. 90-321, Title I, §§ 101–103, 82 Stat. 146–59 (1968), particularly Title I, the Truth in Lending Act. The growth of the industry resulted in the central importance of consumer credit reporting (i.e., rating) agencies and their regulation in the 1970 Fair Credit Reporting Act (FCRA). Pub. L. No. 91-508, § 601, 84 Stat. 1127–36 (1970). (See chapter 4 of this book.)

Practice Note
Credit reporting is an early example of collection of consumer information for sale to commercial enterprises, would-be creditors. Many consumers are unaware that they are charged for “credit freezes” not because of the administrative cost, but to compensate for loss of sales revenue. The contribution to privacy was some limitation on the sale of certain collected information and on the sources from whom the information was received.

Similar consumer incentives (limitation on liability and correction procedures) were provided in 1978 when Congress enacted the Electronic Fund Transfer Act (“EFTA”), which amended the Consumer Credit Protection Act of 1968. 15 U.S.C. §§ 1693–1693r, Consumer Credit Protection Act, Act of May 29, 1968, Pub. L. No. 90-321, Title IX, §§ 902–920, added by Act of Nov. 10, 1978, Pub. L. No. 95-630, Title XX, § 2001, 92 Stat. 3728. These consumer protections muted consumer demand for information security: the service providers would pass their cost of individual fraud to aggregate consumers in service fees and higher interest rates.

For nonconsumer transactions not protected or preempted by EFTA, Article 4A of the Uniform Commercial Code (“UCC”) was approved in 1989 by the National Conference of Commissioners on Uniform State Laws (“NCCUSL”) and the American Law Institute, including the concept and use of a “security procedure” to provide a level of assurance (“security”) of the integrity of the transaction:

“Security procedure” means a procedure established by agreement of a customer and a receiving bank for the purpose of (i) verifying that a payment order or communication amending or cancelling a payment order is that of the customer, or (ii) detecting error in the transmission or the content of the payment order or communication. A security procedure may require the use of algorithms or other codes, identifying words or numbers, encryption, callback procedures, or similar security devices. Comparison of a signature on a payment order or communication with an authorized specimen signature of the customer is not by itself a security procedure.

UCC § 4A-201. With the rise of interstate banking (branch banking was still limited in many states in the 1970s) and “non-bank banks” (which did not offer all of the functions required to be a regulated “bank”) taking advantage of the high interest rates prevailing through the 1980s into the 1990s, along with the securitization (“slicing and dicing”) of mortgages for sale to investors and servicing by different entities, there was rapid expansion of credit-rating enterprises and trafficking in prospect information, including the selling of customer lists.
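For a concrete (if toy) illustration of a “security procedure” in the Article 4A sense, consider the following Python sketch, in which a bank and its customer have agreed in advance on a shared code and an algorithm (an HMAC) that both verifies that a payment order is the customer’s and detects errors in transmission. The secret and the order below are the author’s hypotheticals, not anything prescribed by the UCC.

    # A toy "security procedure": an agreed algorithm plus an agreed code.
    import hashlib
    import hmac

    SHARED_SECRET = b"agreed-out-of-band"  # hypothetical code agreed in advance

    def tag(payment_order: bytes) -> str:
        """Compute the verification tag the parties agreed to exchange."""
        return hmac.new(SHARED_SECRET, payment_order, hashlib.sha256).hexdigest()

    order = b"PAY $5,000 TO ACME SUPPLY, ACCT 1234"
    transmitted_tag = tag(order)

    # The receiving bank recomputes the tag; any alteration, transmission
    # error, or forgery by a stranger to the agreement makes this fail.
    assert hmac.compare_digest(transmitted_tag, tag(order))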

§ 1.3.4 Computerization and Security

Despite much litigation and conflicting views on “exceeding authorization” relative to web or employer restrictions on use of properly accessed information, compare United States v. Nosal, 676 F.3d 854, 856 (9th Cir. 2012) (en banc), WEC Carolina Energy Solutions LLC v. Miller, 687 F.3d 199 (4th Cir. 2012), and United States v. Valle, 807 F.3d 508, 524 (2d Cir. 2015), with International Airport Ctrs., LLC v. Citrin, 440 F.3d 418, 420 (7th Cir. 2006), Congress has not resolved the issue or substantively updated the CFAA.

With the increase in online transactions in the final years of the twentieth century, vendors emerged to propose a method of authenticating transactions using public key infrastructure (“PKI”). In PKI, the sender of a message provides a “public key” to the public, which will decrypt a message encrypted by the sender using a matching “private key.” A “one-way hash” (digest) of a transactional message is encrypted with the private key to form a “digital signature” sent with the transactional message. The recipient then uses the public key to decrypt the received digital signature. If applying the one-way hash function to the received transactional message results in a digest identical to the decrypted signature, the recipient is assured both that the transactional message was not altered in transmission and that it was signed by a possessor of the private key. In PKI, a “certification authority” keeps a register of public keys and associated owners of the private keys to authenticate the identity of the signer. In 1995, the Utah Digital Signature Act was enacted to support PKI. Utah Code Ann. §§ 46-3-101 to 46-3-504 (2001), added by 1995 Utah Laws ch. 61, §§ 1–27, repealed by 2006 Utah Laws ch. 21, § 13.

Practice Note
Washington followed with its Electronic Authentication Act, Wash. Rev. Code §§ 19.34.010 to 19.34.903, 1996 Wash. Laws ch. 250, as did Illinois with its Electronic Commerce Security Act, 5 Ill. Comp. Stat. 175/1-101 to 175/99-1, 1999 Ill. Laws 90-759.
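The sign-and-verify flow just described can be sketched in a few lines of Python, assuming the third-party “cryptography” package; the message is hypothetical, and a real PKI deployment would add a certificate from a certification authority binding the public key to the signer’s identity.

    # A minimal PKI sketch, assuming the third-party "cryptography" package.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # The signer generates a key pair and publishes the public key.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    message = b"Ship 100 units to Boston"  # hypothetical transactional message

    # Sign: the library computes a one-way hash (digest) of the message and
    # encrypts it with the private key, yielding the digital signature.
    signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

    # Verify: the recipient recomputes the digest and checks it against the
    # signature using the sender's public key.
    try:
        public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
        print("Message unaltered and signed by the private-key holder")
    except InvalidSignature:
        print("Message altered in transit or not signed with the matching key")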


NCCUSL maintained technology neutrality and did not require PKI, promulgating in 1999 the Uniform Electronic Transactions Act (“UETA”), available at http://www.uniformlaws.org/shared/docs/electronic%20transactions/ueta_final_99.pdf, which has been enacted by forty-seven states, including Utah. Rather than the special “digital signature,” UETA validated the use of the more general “electronic signature,” defined as “an electronic sound, symbol, or process attached to or logically associated with a record and executed or adopted by a person with the intent to sign the record.” UETA § 2(8).

(a) An electronic record or electronic signature is attributable to a person if it was the act of the person. The act of the person may be shown in any manner, including a showing of the efficacy of any security procedure applied to determine the person to which the electronic record or electronic signature was attributable.

(b) The effect of an electronic record or electronic signature attributed to a person under subsection (a) is determined from the context and surrounding circumstances at the time of its creation, execution, or adoption, including the parties’ agreement, if any, and otherwise as provided by law.

UETA § 9. Similarly to UCC Article 4A,

“Security procedure” means a procedure employed for the purpose of verifying that an electronic signature, record, or performance is that of a specific person or for detecting changes or errors in the information in an electronic record. The term includes a procedure that requires the use of algorithms or other codes, identifying words or numbers, encryption, or callback or other acknowledgment procedures.

UETA § 2(14). Within a year after NCCUSL’s promulgation of the UETA, Congress enacted in 2000 the Electronic Signatures in Global and National Commerce Act (“E-Sign”), Act of June 30, 2000, Pub. L. No. 106-229, 114 Stat. 464, codified at 15 U.S.C. §§ 7001 to 7006, taking the same technology-neutral, commerce-facilitating approach as NCCUSL took. Because of E-Sign’s “exemption from preemption” of state enactments of UETA, 15 U.S.C. § 7002(a)(1), those forty-seven enactments provide the current basic approach to electronic signature (and transaction) authentication in the United States.

§ 1.3.5 Electronic Health Records

On the healthcare information side, Congress had enacted the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”), Pub. L. No. 104-191, 110 Stat. 1936–2103, which, in addition to making reforms in insurance coverage, provided for the Department of Health and Human Services (“HHS”) to develop electronic health record standards and rules for privacy and security of protected health information (“PHI”). These were completed in 2000, including many of the considerations outlined in the 1973 HEW report recited above. HIPAA was upgraded in 2009 by the Health Information Technology for Economic and Clinical Health (“HITECH”) Act, American Recovery and Reinvestment Act of 2009, Pub. L. No. 111-5, Div. A, Title XIII (Health Information Technology for Economic and Clinical Health Act, or HITECH Act), 123 Stat. 115, 242–79, to add breach notification and to extend HIPAA’s privacy and security protections to business associates of covered entities, with certain vendors not covered by HIPAA put under the jurisdiction of the FTC. (See chapter 6 of this book.)

§ 1.3.6 Financial Institution Reform

Notwithstanding the “savings and loan crisis” of the late 1980s (owing at least in part to the high interest rates that had been imposed by the Federal Reserve System to combat the rampant inflation of the Vietnam War era, and in part to the relaxation of regulation of thrift institution products and investments), Congress in 1999 eliminated the Depression-era separation of banks and securities firms in legislation that included a Title V addressed to “privacy.” Pub. L. No. 106-102, §§ 501–527, 113 Stat. 1436 (1999). The Gramm-Leach-Bliley Act (“GLBA”) did include consumer protections against the sale of consumer information to nonaffiliated organizations for marketing purposes (prospect lists and “pre-approval” of credit) and against fraudulent (“pretexting”) practices, as well as a mandate for regulatory agencies, led by the FTC, to protect the security and privacy of consumer information. 16 C.F.R. pt. 314 (Standards for Safeguarding Customer Information; Final Rule), 16 C.F.R. §§ 314.1–314.5, added by 67 Fed. Reg. 36,464 (May 23, 2002). (See chapters 5 and 8 of this book.)

§ 1.3.7 Addressing Web Abuses

As Internet commerce ramped up rapidly at the turn of the millennium, California led with consumer protection legislation against spam, identity theft, and malicious software such as that inventoried at § 1.2.2(a) above. In one measure to help consumers, “licensees” or holders of consumer information were required to notify consumers of breaches of licensee security protecting that information and of consumers’ ability to suspend credit reports by credit reporting agencies. Cal. Civ. Code § 1798.82, added by 2002 Cal. Stat. 1054, § 4 (Section 9.2.1 of this book). Although Congress largely preempted the California anti-spam legislation (Section 9.2.3(a) of this book) with the Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003 (“CAN-SPAM”), 15 U.S.C. §§ 7701–7713, added by Pub. L. No. 108-187, §§ 2–15, 117 Stat. 2699 (Section 7.5 of this book), California and other states have further legislated against other unwanted communications—those containing malware, including spyware (e.g., California Consumer Protection Against Computer Spyware Act, Cal. Bus. & Prof. Code §§ 22947–22947.6, added by 2004 Cal. Stat. 843, § 2, and California Anti-Phishing Act of 2005, Cal. Bus. & Prof. Code §§ 22948 through 22948.3, added by 2005 Cal. Stat. 437, § 1). Forty-seven states (Alabama and South Dakota being the exceptions) have enacted breach notification legislation similar to that of California (Section 9.2.1 of this book). As reviewed at that section, the state legislation relatively uniformly provides some aid to consumers after a breach of security, but is less uniform in establishing preventative measures to improve information security (Section 9.2.2 of this book). The states have also attempted to address coerced disclosure of online log-in credentials (Section 9.2.3(c) of this book), cyber-bullying (Section 9.2.3(d) of this book), and unauthorized disclosure of intimate images (Section 9.2.3(e) of this book).

Congress, in addition to enacting CAN-SPAM (Section 7.5 of this book), failed to regulate “offensive” postings, but granted substantial immunity (greater than that of conventional publishers) to interactive computer service providers for publishing activities under the Communications Decency Act, 47 U.S.C. § 230(c)(1) (Sections 7.2 and 12.1.1 of this book), and granted service providers a safe harbor against secondary copyright infringement liability through a take-down procedure under the Digital Millennium Copyright Act, 17 U.S.C. § 512 (Section 7.3 of this book). Congress also regulated Internet marketing to children under the Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501–6506 (Section 7.4 of this book). The more general FTC response is discussed in chapter 8.

§ 1.3.8 Identity Theft

As discussed throughout this chapter, identity theft has become a significant burden on consumers and on collectors of consumer information, and it remains the primary focus of data security and privacy through breach notification. The response was led by California in 2003, requiring notice to the public of data breaches and mechanisms (such as fraud alerts and credit freezes) to prevent damage to credit ratings, a response followed by the great majority of the states. (See Section 9.2.1 of this book.) Congress contemporaneously enacted the Fair and Accurate Credit Transactions Act of 2003 (FACT Act, or FACTA), Pub. L. No. 108-159, 117 Stat. 1952 (2003), adding similar sections to the FCRA for the prevention and remediation of identity theft, as well as charging the FTC with promulgating “red flags” regulations to enhance security. (See chapters 4 and 8 of this book.)

The HITECH Act included amendments to HIPAA that provided for the development of health information technology, along with enhanced protection of privacy. 42 U.S.C. §§ 17921–17954. (See chapter 6 of this book.) Among other things, HHS promulgated a new “Notification in the Case of Breach of Unsecured Protected Health Information,” 45 C.F.R. §§ 164.400–164.414, and the FTC, charged with regulation of non–HHS-covered vendors, promulgated its “Health Breach Notification Rule,” 16 C.F.R. pt. 318. (See chapter 8 of this book.)

§ 1.3.9 Summary of the Framework

The statutory and regulatory structure places the FTC at the center of the response to breaches of Big Data, as well as of promoting fair notice to consumers of its collection. Federal statutes provide some personal information privacy by limiting collection and disclosure of some personal information in credit reporting under FCRA and FACTA, in creditor records under GLBA, and in health records under HIPAA and HITECH.


Only HIPAA provides a comprehensive approach to security, while FACTA and HITECH provide for breach notification and limited remediation.


CHAPTER 2

The Electronic Communications Privacy Act

Peter J. Guffin, Esq.
Pierce Atwood LLP, Portland, ME

§ 2.1	Introduction .......................................................... 2–2

§ 2.2	The Electronic Communications Privacy Act ..................... 2–3

§ 2.3	The Wiretap Act ................................................... 2–6
	§ 2.3.1	Interception of Communications ........................... 2–6
	§ 2.3.2	Exceptions ................................................ 2–12
	§ 2.3.3	Communications in Transit Versus Stored Communications ... 2–15
	§ 2.3.4	Video Surveillance ........................................ 2–20

§ 2.4	Massachusetts Electronic Surveillance Statute ................. 2–21
	§ 2.4.1	Oral Communications—No “Expectation of Privacy” Requirement ... 2–21
	§ 2.4.2	Consent of All Parties .................................... 2–26
	§ 2.4.3	Video Surveillance ........................................ 2–27

§ 2.5	The Stored Communications Act .................................. 2–28
	§ 2.5.1	Why the SCA Exists ........................................ 2–28
	§ 2.5.2	Prohibition on Unauthorized Access ........................ 2–28
	§ 2.5.3	Entities Regulated by the SCA ............................. 2–31
	§ 2.5.4	Providers “to the Public” Versus Nonpublic Providers ...... 2–32
	§ 2.5.5	Content Information Versus Noncontent Information ......... 2–32
	§ 2.5.6	The Privacy Protections of the SCA ........................ 2–33
		(a)	Compelled Disclosure Rules ............................ 2–33
		(b)	Voluntary Disclosure Rules ............................ 2–34
		(c)	Voluntary Disclosure Versus Compelled Disclosure ...... 2–35
	§ 2.5.7	Interplay with Massachusetts Constitution ................. 2–36
	§ 2.5.8	Interplay with Fourth Amendment ........................... 2–44
	§ 2.5.9	Consent and Discovery Under the SCA ....................... 2–48

§ 2.6	Conclusion ........................................................ 2–53

EXHIBIT 2A—Wiretap Report 2016 ......................................... 2–57


Scope Note
This chapter introduces the reader to federal electronic surveillance law, as it has been codified in the Electronic Communications Privacy Act, with a focus on the Wiretap Act and the Stored Communications Act. Also addressed is Massachusetts’ own law surrounding electronic surveillance.

§ 2.1 INTRODUCTION

Federal electronic surveillance law traces its roots as far back as Section 605 of the Federal Communications Act of 1934 (FCA), Pub. L. No. 73-416, 48 Stat. 1064, 1103–04 (codified as amended at 47 U.S.C. § 605 (2015)), enacted by Congress in part as a response to significant criticism of the Supreme Court’s decision in Olmstead v. United States, 277 U.S. 438 (1928). In Olmstead, the Court declared that wiretapping did not constitute a Fourth Amendment violation.

In 1968, in response to the Supreme Court’s then-evolving Fourth Amendment jurisprudence, Congress enacted Title III of the Omnibus Crime Control and Safe Streets Act. Pub. L. No. 90-351, 82 Stat. 197, 211–23 (codified as amended at 18 U.S.C. §§ 2510–2520 (2015)). This Act is commonly referred to as “Title III” or, subsequent to its amendment in 1986, as the “Wiretap Act.” Title III reached beyond Section 605, applying to wiretaps by federal and state officials as well as by private parties. It also required federal agents to apply for a warrant before wiretapping. However, if any party to the conversation consented to the tapping, then Title III was not violated.

In 1986, Congress amended the Wiretap Act by passing the Electronic Communications Privacy Act (ECPA), Pub. L. No. 99-508, 100 Stat. 1848, as a response to developments in computer technology and communication networks. The ECPA is based on what was known at the time about computer technology and computer networks, long before anyone knew about the Internet, e-mail, and the vast array of other new technologies that we use today. The ECPA has remained largely unchanged since 1986, and it is widely acknowledged that the ECPA has significant gaps in need of legislative attention.

Congress and the courts each have played a critical role in the development of federal electronic surveillance law, the one often acting as a check on the other. One of the biggest difficulties for both branches has been trying to create rules that can evolve and keep pace with changes in technology.

This chapter provides an overview of the ECPA, focusing primarily on its statutory structure and key court decisions interpreting the statute, including major decisions of the First Circuit Court of Appeals and the Supreme Judicial Court of Massachusetts. It also discusses the Massachusetts electronic surveillance statute, highlighting the differences between it and the ECPA. Finally, the chapter explores the interplay among the ECPA, the Massachusetts Constitution, and the Fourth Amendment while providing some practical tips for civil litigants seeking to obtain discovery of stored electronic communications.

§ 2.2 THE ELECTRONIC COMMUNICATIONS PRIVACY ACT

The Electronic Communications Privacy Act contains three parts: the Wiretap Act, 18 U.S.C. §§ 2510–2522 (2015), the Stored Communications Act (SCA), 18 U.S.C. §§ 2701–2712 (2015), and the Pen Register Act, 18 U.S.C. §§ 3121–3127 (2015). Like Title III before it, the ECPA applies not only to government officials but to private individuals and parties as well.

The ECPA classifies all communications into three types: “wire communications,” “oral communications,” and “electronic communications.” Each type of communication is protected differently. As a general matter, wire communications receive the most protection and electronic communications receive the least.

A “wire communication,” defined in 18 U.S.C. § 2510(1), involves all “aural transfers” that travel through a wire or a similar medium:

“[W]ire communication” means any aural transfer made in whole or in part through the use of facilities for the transmission of communications by the aid of wire, cable, or other like connection between the point of origin and the point of reception (including the use of such connection in a switching station) furnished or operated by any person engaged in providing or operating such facilities for the transmission of interstate or foreign communications or communications affecting interstate or foreign commerce[.]

An “aural transfer” is a communication “containing the human voice at any point.” 18 U.S.C. § 2510(18). The aural transfer must travel through wire (that is, telephone wires or cable wires) or a similar medium. The entire journey from origin to destination need not take place through wire, as many communications travel through a host of other media, including radio and satellite. Only part of the communication’s journey must be through a wire.

An “oral communication,” defined in Section 2510(2), is a communication “uttered by a person exhibiting an expectation that such communication is not subject to interception under circumstances justifying such expectation, but such term does not include any electronic communication.” 18 U.S.C. § 2510(2). Oral communications are typically intercepted through bugs and other recording or transmitting devices.

Finally, an “electronic communication” consists of all nonwire and nonoral communications:


“[E]lectronic communication” means any transfer of signs, signals, writing, images, sounds, data, or intelligence of any nature transmitted in whole or in part by a wire, radio, electromagnetic, photoelectronic or photooptical system that affects interstate or foreign commerce, but does not include—

(A) any wire or oral communication[.]

18 U.S.C. § 2510(12). Examples of electronic communications are e-mails and text messages. Although electronic communications are protected under the SCA as well as the Wiretap Act, they are treated differently from wire and oral communications. For example, one notable difference is that electronic communications are exempted from the Wiretap Act’s prohibition on the use of intercepted wire or oral communications in evidence. 18 U.S.C. § 2515. The SCA has no such exclusionary rule.

The Pen Register Act governs traditional pen registers and trap and trace devices, as well as their modern analogues. 18 U.S.C. §§ 3121–3127 (2015). Traditionally, a pen register was a device that recorded the telephone numbers of outgoing calls from a particular telephone line, whereas a trap and trace device recorded the telephone numbers of incoming calls. The Pen Register Act defines a pen register more broadly, covering devices that record more than just the telephone numbers of outgoing calls:

[T]he term “pen register” means a device or process which records or decodes dialing, routing, addressing, or signaling information transmitted by an instrument or facility from which a wire or electronic communication is transmitted, provided, however, that such information shall not include the contents of any communication[.]

18 U.S.C. § 3127(3). The Pen Register Act likewise defines a trap and trace device more broadly, covering devices that record more than just incoming telephone numbers:

[T]he term “trap and trace device” means a device or process which captures the incoming electronic or other impulses which identify the originating number or other dialing, routing, addressing, and signaling information reasonably likely to identify the source of a wire or electronic communication, provided, however, that such information shall not include the contents of any communication[.]

18 U.S.C. § 3127(4).


If the government certifies that “the information likely to be obtained by such installation and use is relevant to an ongoing investigation,” 18 U.S.C. § 3123(a)(1), then courts “shall authorize the installation and use of a pen register or a trap and trace device for a period not to exceed sixty days.” 18 U.S.C. § 3123(c). This “relevance” standard is a low threshold relative to the judicial process standards—e.g., probable cause and a “specific and articulable facts” court order—set forth in the Wiretap Act and the SCA. See discussion below in § 2.3 and § 2.5.

In the case In re Application of United States of America for an Order Authorizing the Use of a Pen Register and Trap on [xxx] Internet Service Account, 396 F. Supp. 2d 45 (D. Mass. 2005), the U.S. magistrate judge recognized that in the Internet world certain “dialing, routing, addressing, and signaling information” may reveal the “contents” of communications, which are prohibited from disclosure. In re Application of United States of America for an Order Authorizing the Use of a Pen Register and Trap on [xxx] Internet Service Account, 396 F. Supp. 2d at 47–49. Consequently, it was held that the order authorizing the installation of pen registers and trap and trace devices at issue in the case had to include the following language:

It is ORDERED that the pen register and trap and trace device installed in accordance with the within Order be configured to exclude all information constituting or disclosing the “contents” of any communications or accompanying electronic files. “Contents” is defined by statute as any “. . . information concerning the substance, purport or meaning of that communication.” The disclosure of the “contents” of communications is prohibited pursuant to this Order even if what is disclosed is also “dialing, routing, addressing and signaling information.” Therefore, the term “contents” of communications includes subject lines, application commands, search queries, requested file names, and file paths. Disclosure of such information is prohibited by the within Order. Violation of the within Order may subject an internet service provider to contempt of court sanctions. In implementing the within Order, should any question arise as to whether the pen register and/or trap and trace device should be configured to provide or not to provide any particular category of information over and above those stated, the Trial Attorney and/or the internet service provider are invited to apply to this court for clarification and/or guidance.

In re Application of United States of America for an Order Authorizing the Use of a Pen Register and Trap on [xxx] Internet Service Account, 396 F. Supp. 2d at 49–50 (emphasis added).
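The line drawn by the magistrate judge can be illustrated with a short, hypothetical Python sketch separating header fields a provider might disclose as “dialing, routing, addressing, and signaling information” from fields the order treats as “contents,” such as subject lines. The message, addresses, and field lists below are the author’s assumptions for illustration only, not part of the order.

    # A hypothetical filter separating routing data from "contents."
    from email import message_from_string

    RAW = (
        "From: alice@example.com\n"
        "To: bob@example.com\n"
        "Subject: merger plans\n"
        "Date: Mon, 1 Oct 2018 09:00:00 -0400\n"
        "\n"
        "Let's meet at noon.\n"
    )

    ROUTING_FIELDS = {"From", "To", "Date"}  # treated here as non-content data
    CONTENT_FIELDS = {"Subject"}             # "contents" under the order

    msg = message_from_string(RAW)
    disclosed = {name: value for name, value in msg.items() if name in ROUTING_FIELDS}
    withheld = [name for name in msg.keys() if name in CONTENT_FIELDS]

    print("Disclosed to the device:", disclosed)
    print("Excluded as contents:", withheld, "plus the message body")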


As with the SCA, there is no exclusionary rule for violations of the Pen Register Act. Rather than a suppression remedy, the Pen Register Act provides: “Whoever knowingly violates [the Act] shall be fined under this title or imprisoned not more than one year, or both.” 18 U.S.C. § 3121(d). With that general overview of the ECPA, we will now examine more closely the Wiretap Act and the SCA, as well as the interplay between the two acts.

§ 2.3 THE WIRETAP ACT

§ 2.3.1 Interception of Communications

The Wiretap Act governs the interception of communications. In particular, it provides:

(1) Except as otherwise specifically provided in this chapter any person who—

(a) intentionally intercepts, endeavors to intercept, or procures any other person to intercept or endeavor to intercept, any wire, oral, or electronic communication;

(b) intentionally uses, endeavors to use, or procures any other person to use or endeavor to use any electronic, mechanical, or other device to intercept any oral communication when—

(i) such device is affixed to, or otherwise transmits a signal through, a wire, cable, or other like connection used in wire communication; or

(ii) such device transmits communications by radio, or interferes with the transmission of such communication; or

(iii) such person knows, or has reason to know, that such device or any component thereof has been sent through the mail or transported in interstate or foreign commerce; . . .

(c) intentionally discloses, or endeavors to disclose, to any other person the contents of any wire, oral, or electronic communication, knowing or having reason to know that the information was obtained through the interception of a wire, oral, or electronic communication in violation of this subsection;

(d) intentionally uses, or endeavors to use, the contents of any wire, oral, or electronic communication, knowing or having reason to know that the information was obtained through the interception of a wire, oral, or electronic communication in violation of this subsection; . . .


shall be punished as provided in subsection (4) or shall be subject to suit as provided in subsection (5).

18 U.S.C. § 2511(1). To “intercept” a communication means to acquire its contents “through the use of any electronic, mechanical, or other device.” 18 U.S.C. § 2510(4). The classic example of an activity covered by the Wiretap Act is the wiretapping of a telephone conversation: a device is being used to listen to the conversation as it is occurring, as the words are moving through the wires.

A more modern example of activity that may be covered by the Wiretap Act is the interception of Wi-Fi communications. Under the Wiretap Act it is lawful to intercept “electronic communications made through an electronic communication system” that are “readily accessible to the general public.” 18 U.S.C. § 2511(2)(g)(i). Recall that radio communications are considered electronic communications under the Wiretap Act. 18 U.S.C. § 2510(12); see also discussion in § 2.2, The Electronic Communications Privacy Act, above. However, the Wiretap Act considers radio communications that are “not . . . scrambled or encrypted” to be “readily accessible to the general public,” thereby exempting those who intercept them from liability. 18 U.S.C. § 2510(16)(A). This exclusion was added to alleviate the fears of radio hobbyists that they would be subject to legal claims for their actions. Joffe v. Google, Inc., 746 F.3d 920, 931–33 (9th Cir. 2013).

When capturing pictures for its “Street View” service from 2007 to 2010, Google’s Street View cars also collected data from nearby unencrypted Wi-Fi networks. Joffe v. Google, Inc., 729 F.3d 1262, 1264 (9th Cir. 2013). The information collected included “basic information” identifying these networks, such as “the network’s name (SSID), the unique number assigned to the router transmitting the wireless signal (MAC address), the signal strength, and whether the network was encrypted,” in addition to “payload data.” Joffe v. Google, Inc., 729 F.3d at 1264. Payload data “includes everything transmitted by a device connected to a Wi-Fi network, such as personal emails, usernames, passwords, videos, and documents.” Joffe v. Google, Inc., 729 F.3d at 1264. In total, Google collected nearly 600 gigabytes of data in over thirty countries. Joffe v. Google, Inc., 729 F.3d at 1264. After acknowledging the practice in 2010, Google apologized and rendered the personal data it acquired “inaccessible.” Joffe v. Google, Inc., 729 F.3d at 1264.

Nonetheless, several class-action lawsuits were filed, which were subsequently consolidated into a complaint alleging that Google violated the Wiretap Act, the California Business and Professions Code, and several state wiretap statutes. Joffe v. Google, Inc., 729 F.3d at 1264. The district court granted Google’s motion to dismiss all but the claims under the federal Wiretap Act, finding unpersuasive Google’s argument that the payload data transferred over the unencrypted Wi-Fi networks fell under the Act’s exemption from liability, found in 18 U.S.C. § 2511(2)(g)(i), for intercepting “electronic communication [that] is readily accessible to the general public.” Joffe v. Google, Inc., 729 F.3d at 1264–65. Google appealed. Joffe v. Google, Inc., 729 F.3d at 1265.

MCLE, Inc. | 2nd Edition 2018

2–7

§ 2.3

Data Security and Privacy in Massachusetts

Before diving into its analysis, the Ninth Circuit began by explaining in more detail the exemption at issue and the definitions behind it. Joffe v. Google, Inc., 729 F.3d at 1266. Specifically, § 2511(2)(g)(i) exempts from liability intercepting “an electronic communication made through an electronic communication system that is configured so that such electronic communication is readily accessible to the general public.” Joffe v. Google, Inc., 729 F.3d at 1266. Electronic communications include radio communications. Joffe v. Google, Inc., 729 F.3d at 1266 (citing 18 U.S.C. § 2510(12)). Radio communications that are “not . . . scrambled or encrypted” are considered “readily accessible to the general public.” Joffe v. Google, Inc., 729 F.3d at 1266 (quoting 18 U.S.C. § 2510(16)(A)).

Google advanced two arguments for why, under § 2511(2)(g)(i), its actions did not violate the Wiretap Act: first, that data transmitted over a Wi-Fi network is a “radio communication,” which the Act defines as “readily accessible to the general public,” and therefore exempt, so long as “such communication is not . . . scrambled or encrypted”; and second, that even if data transmitted over an unencrypted Wi-Fi network is not a “radio communication,” it is still an “electronic communication . . . readily accessible to the general public.” Joffe v. Google, Inc., 729 F.3d at 1267 (citations omitted).

The court rejected Google’s assertion that data transferred over Wi-Fi networks constitutes a radio communication. Joffe v. Google, Inc., 729 F.3d at 1267, 1269. Because the Wiretap Act does not define “radio communication,” the court had to discern the phrase’s “ordinary meaning.” Joffe v. Google, Inc., 729 F.3d at 1268–69. Google argued that radio communications were anything transferred using “radio waves, i.e., the radio frequency portion of the electromagnetic spectrum.” Joffe v. Google, Inc., 729 F.3d at 1268 (internal quotation marks omitted). The court responded that radio frequency covers many things beyond Wi-Fi transmissions, including broadcast television and garage door openers. Joffe v. Google, Inc., 729 F.3d at 1268. When interpreting the ordinary meaning of an undefined statutory term, the definition must be understood as it was by the enacting Congress, here the Congress that enacted the Wiretap Act in 1986. Joffe v. Google, Inc., 729 F.3d at 1268. Using provisions of the Wiretap Act for context and comparison, the court explained that in 1986 Congress understood radio communications not to include things like garage door signals and television broadcasts, but rather as “predominantly auditory broadcasts.” Joffe v. Google, Inc., 729 F.3d at 1268–76.

Consequently, because the unencrypted payload data Google intercepted (the usernames, e-mails, passwords, videos, and documents) was not “predominantly auditory,” it could not be considered radio communications for the purposes of the Wiretap Act. Joffe v. Google, Inc., 729 F.3d at 1270. And because the payload data was not a radio communication, it did not fall under the definition of “electronic communications” that may be lawfully intercepted under the Act as “readily accessible to the general public” due to being unencrypted. Joffe v. Google, Inc., 729 F.3d at 1268, 1270.

This did not end the inquiry, however, because the Wiretap Act relates “readily accessible to the general public” only to radio communications. Joffe v. Google, Inc., 729 F.3d at 1277 n.7. The Act does not explain how the phrase pertains to “electronic communications” that are not “radio communications.” Joffe v. Google, Inc., 729 F.3d at 1277 n.7. As stated by the court:

Although payload data transmitted over an unencrypted Wi-Fi network is not “readily accessible to the general public” by definition solely because it is an unencrypted “radio communication,” it is still possible for a transmission that falls outside of the purview of the § 2510(16) definition to be considered “readily accessible to the general public” under the ordinary meaning of that phrase.

Joffe v. Google, Inc., 729 F.3d at 1277.

The court then concluded that the unencrypted Wi-Fi communications at issue were not “readily accessible to the general public” for two reasons. Joffe v. Google, Inc., 729 F.3d at 1277–79. First, unlike traditional radio stations, which can broadcast for hundreds of miles, Joffe v. Google, Inc., 729 F.3d at 1278, Wi-Fi transmissions do not “travel far beyond the walls of the home or office where the access point is located,” Joffe v. Google, Inc., 729 F.3d at 1277 (internal quotation marks omitted). Therefore, because Wi-Fi communications are “geographically limited,” they are “not ‘readily’ available.” Joffe v. Google, Inc., 729 F.3d at 1278. Second, the payload data transferred over the unencrypted Wi-Fi networks was “only ‘accessible’ with some difficulty.” Joffe v. Google, Inc., 729 F.3d at 1278. Though members of the general public can easily connect to unencrypted Wi-Fi networks, they need sophisticated software and hardware to intercept and decode payload data transmitted over those networks. Joffe v. Google, Inc., 729 F.3d at 1278–79. In sum, the court fully affirmed the district court’s holding. Joffe v. Google, Inc., 729 F.3d at 1279.

Google filed petitions for rehearing and rehearing en banc. Rehearing en banc was denied, but the Ninth Circuit did issue an amended opinion in which it retracted its holding that payload data transferred over unencrypted Wi-Fi networks did not fall under the “readily accessible to the general public” exception. Joffe v. Google, Inc., 746 F.3d 920, 922, 926, 936 (9th Cir. 2013). In other words, the court limited its holding exclusively to the determination that, because payload data transmitted over Wi-Fi networks are not “radio communications,” they are protected from unauthorized interception by the Wiretap Act. Joffe v. Google, Inc., 746 F.3d at 926, 936.

The Wiretap Act provides several remedies. First, “[a]ny aggrieved person . . . may move to suppress the contents of any wire or oral communication intercepted pursuant to this chapter, or evidence derived therefrom.” 18 U.S.C. § 2518(10)(a). Violations of the Wiretap Act can also result in statutory damages of a minimum of $10,000 per violation, imprisonment of up to five years, or both. See 18 U.S.C. §§ 2511(4)(a), 2520(c)(2)(B).

Section 2518 sets forth the procedure for obtaining authorization to lawfully intercept communications. An application for a court wiretapping or electronic surveillance order must be made under oath and must contain a variety of information, including details to justify the agent’s belief that a crime has been, is being, or will be committed; a specific description of the place where communications will be intercepted; a description of the type of communication; and the period of time of the interception. The judge must find probable cause, including probable cause to believe that particular communications concerning the offense will be obtained through the interception. Further, the court must find that alternatives to wiretapping were attempted and failed, or reasonably appear to be unlikely to succeed or to be too dangerous. The order can last up to thirty days and can be renewed. 18 U.S.C. § 2518(1), (3), (5).

Under the Wiretap Act, only certain government officials are able to apply to a court for a wiretapping order. Federal law enforcement agencies must receive the approval of the attorney general or a deputy, associate, or assistant attorney general before applying. 18 U.S.C. §§ 2516(1), 2518(11)(a)(i), (b)(i). For state officials, the relevant party is the principal prosecuting attorney of a state or local government. 18 U.S.C. § 2516(2). In other words, the police cannot obtain a wiretap order alone.

The Wiretap Act requires that interception be minimized to avoid sweeping in communications beyond the purpose for which the order was sought:

[e]very order and extension thereof shall contain a provision that the authorization to intercept shall be executed as soon as practicable, shall be conducted in such a way as to minimize the interception of communications not otherwise subject to interception under this chapter, and must terminate upon attainment of the authorized objective.

18 U.S.C. § 2518(5). For example, if law enforcement officials are wiretapping the home telephone of a person suspected of a crime and the person’s child is talking on the line to a friend about going to the movies, the officials should stop listening to the conversation.

After the surveillance is over, copies of the recorded conversations must be turned over to the court issuing the order. 18 U.S.C. § 2518(8)(a). Not later than ninety days after the completion of the surveillance, the court must notify the person named in the surveillance order—and may in its discretion require notification to other parties to intercepted communications—that surveillance was undertaken. 18 U.S.C. § 2518(8)(d).

Section 2519 of the Wiretap Act requires federal judges and state prosecutors to file with the Administrative Office of the U.S. Courts reports pertaining to the intercepted wire, oral, or electronic communications that occur each year. The statute requires that these reports include, among other things, information regarding the grant of intercept orders, the length of time specified by the orders, and any subsequent extensions granted under the orders, as well as information pertaining to the crimes alleged in the intercept requests. 18 U.S.C. § 2519(1), (2). Every year the Administrative Office of the U.S. Courts compiles this information into a report, and each June it presents its findings to Congress. 18 U.S.C. § 2519(3). In addition to presenting the findings to Congress, the office also publishes a yearly report to the general public. See Exhibit 2A below. Some yearly reports may lack information surrounding certain investigations, as intercepts may take place over the span of more than one year. Additionally, prosecutors may choose to withhold wiretap request information if disclosure poses a threat to an ongoing investigation. Interestingly, despite the growing importance of telecommunications attributes collected under the SCA and the Pen Register Act, there is no corresponding reporting requirement—and therefore very little statistical data—about activities under those acts.

The wiretap report covering intercepts concluded between January 1, 2016, and December 31, 2016, the latest report available as of August 2017, is available at http://www.uscourts.gov/statistics-reports/wiretap-report-2016 and is included as Exhibit 2A. Key findings of the 2016 wiretap report include the following:

• The number of federal and state wiretaps reported decreased 24 percent from the previous year;

• California reported the largest decrease, at 50 percent;

• Applications in California, Colorado, Florida, New York, Nevada, and New Jersey accounted for 82 percent of all applications approved by state judges;

• During 2016, the average length of an original authorization was thirty days, the same as in 2015;

• In 2016, there was an average of 10,021 intercepts and 173 persons intercepted per wiretap order;

• The average percentage of incriminating intercepts per wiretap order in 2016 was 20 percent;

• The most frequently noted location in wiretap applications was “portable device,” a category including cellular telephones, application software (i.e., apps), and text messages;

• Drug offenses were the predominant type of criminal offense investigated with wiretaps; narcotics-related offenses were the most serious offenses specified, composing 61 percent of 2016’s intercept applications. The second-largest category was conspiracy. Homicide was specified in less than 5 percent of applications;

• In 2016, for reported intercepts, installed wiretaps were operational for an average of forty-four days, one day above the 2015 average;

• State wiretaps encountered encryption fifty-seven times in 2016, an increase of seven from the previous year; officials could not decrypt the messages’ plain text in forty-eight of these wiretaps;

• The average cost of intercept devices in 2016 was $74,949, an increase of 78 percent from the 2015 average cost;

• Telephone wiretaps accounted for 84 percent of 2016’s installed intercepts, the majority of which involved cellular phones.

§ 2.3.2 Exceptions

There are several notable exceptions under the Wiretap Act. First, the Wiretap Act does not apply if one of the parties to the communication consents. 18 U.S.C. § 2511(2)(c). Thus, secretly recording one’s own telephone conversations is not illegal under the Wiretap Act. Second, an employee of a communications service provider is permitted “to intercept, disclose, or use [a] communication in the normal course of his employment while engaged in any activity which is a necessary incident to the rendition of his service or to the protection of the rights or property of the provider of that service.” 18 U.S.C. § 2511(2)(a)(i). Third, a service provider may disclose a communication’s contents to any intermediary provider, with the originator’s, addressee’s, or intended recipient’s “lawful consent,” or, if the communication was “inadvertently obtained by the service provider,” to law enforcement authorities when the communication “appear[s] to pertain to the commission of a crime.” 18 U.S.C. § 2511(3).

Last, the Wiretap Act contains an exception—often called the “business extension” or “telephone extension” exception—for the monitoring of communications carried out by certain types of telephone equipment and done in the ordinary course of business. This is gleaned from the Act’s definition of “electronic, mechanical, or other device,” which excludes

(a) any telephone or telegraph instrument, equipment or facility, or any component thereof, (i) furnished to the subscriber or user by a provider of wire or electronic communication service in the ordinary course of its business and being used by the subscriber or user in the ordinary course of its business or furnished by such subscriber or user for connection to the facilities of such service and used in the ordinary course of its business[.]

18 U.S.C. § 2510(5). The type of telephone equipment found to satisfy this exception is fairly limited.

In Williams v. Poulos, 11 F.3d 271 (1st Cir. 1993), the First Circuit affirmed the district court’s determination that the business extension exception did not apply to a makeshift system pieced together by the former principal owners of a business, stating as follows:

Simply put, we are at a loss to see how the monitoring system used here, consisting as it did of “alligator clips attached to a microphone cable at one end” and an “interface connecting [a] microphone cable to a VCR and a video camera” on the other, can be considered to be a “telephone or telegraph instrument, equipment or facility, or [a] component thereof.” In so stating, we note that the . . . system is factually remote from the telephonic and telegraphic equipment courts have recognized as falling within the exception at 18 U.S.C. § 2510(5)(a). See, e.g., Epps v. St. Mary’s Hosp., 802 F.2d 412, 415–16 (11th Cir. 1986) (dispatch console installed by telephone company considered telephone equipment); Watkins v. L.M. Berry & Co., 704 F.2d 577, 582–84 (11th Cir. 1983) (standard extension telephone implicitly considered telephone equipment); Briggs v. American Air Filter Co., Inc., 630 F.2d 414, 416–20 (5th Cir. 1980) (same); James v. Newspaper Agency Corp., 591 F.2d 579, 581 (10th Cir. 1979) (monitoring device installed by telephone company implicitly considered telephone equipment). Indeed, we think it self-evident that the . . . system, far from being the type of exempt equipment contemplated by the authors of the business extension exception, is precisely the type of intercepting device Congress intended to regulate heavily when it enacted Title III.

Williams v. Poulos, 11 F.3d at 280 (footnote omitted).

In the workplace context, a number of cases have interpreted the business extension exception to permit employers’ monitoring of employee telephone calls, at least to the point of determining whether a call was personal or for business purposes. See, e.g., Watkins v. L.M. Berry & Co., 704 F.2d 577 (11th Cir. 1983).

As a practice tip, it is important to recognize that nothing in the Wiretap Act exempts attorneys from civil or criminal liability for unlawful disclosure of intercepted communications. Yet, as the authors of one leading treatise point out, there are times when the duty to one’s client may seem to require disclosure, thus creating a conflict. Clifford S. Fishman & Anne T. McKenna, Wiretapping and Eavesdropping § 3.9, Westlaw (database updated Dec. 2016) (hereinafter Fishman and McKenna treatise). Courts have reached conflicting—and sometimes “absurd”—decisions regarding how to resolve the conflict. Fishman and McKenna treatise.

According to the Fishman and McKenna treatise:

As a rule, if a client gives an attorney a tape which the attorney ‘knows or has reason to know’ contains an unlawfully intercepted communication, the safest course of conduct, for client and attorney alike, is to inform the client, (1) that it is a federal felony to disclose the contents of the tape to anyone; (2) that both the attorney and the client would be committing a related federal felony if the attorney used any information obtained from the tape; and (3) that to assure that no one can accuse either the attorney or the client of having done so, the attorney will not listen to the tape and will not want to know its contents.

Fishman and McKenna treatise (footnotes omitted).

Notably, the First Circuit in Poulos declined to decide the issue of whether the Wiretap Act makes it a crime to disclose and use illegally intercepted communications during the course of attorney consultations. The court stated as follows:

The issue here adverted to is an interesting one on which no federal appeals court has yet spoken: namely, do 18 U.S.C. § 2511(1)(c) and (d) . . . , which by their terms prohibit the “disclos[ure] . . . to any other person” and the “use” of illegally intercepted material, make it a crime to disclose and use such material during the course of attorney consultations? Certainly, reasonable arguments might be made on both sides of this question of first impression. And, in accordance with our usual practice, we do not wish to decide it without the benefit of such argumentation and a developed record. Accordingly, we deem the issue to have been waived in this instance.

Williams v. Poulos, 11 F.3d 271, 291–92 (1st Cir. 1993) (footnote and citations omitted).

In a footnote, however, the First Circuit acknowledged that “[a]t least one federal judge, recognizing the inherent tension between the wording of the statute and the need for effective trial preparation, has held that the disclosure of the contents of intercepted recordings to counsel, for the purpose of preparing a defense, is not a crime.” Williams v. Poulos, 11 F.3d at 291 n.44 (citing McQuade v. Michael Gassner Mech. & Elec. Contractors, Inc., 587 F. Supp. 1183, 1188–89 (D. Conn. 1984) (Cabranes, J.)).

Since Poulos, at least one federal appeals court has spoken. In Nix v. O’Malley, 160 F.3d 343, 350–53 (6th Cir. 1998), the Sixth Circuit held that, if a client is being sued for unlawful disclosure of intercepted communications, and the trial court has denied the plaintiff a protective order foreclosing use of the tapes in the defense, the defendant has a right to disclose the contents of the intercepted communications to his or her attorney, and the attorney may use that information to prepare a defense.

§ 2.3.3 Communications in Transit Versus Stored Communications

The Wiretap Act protects the privacy of a wire or electronic communication while it is in transit, while the SCA, as its name implies, protects such a communication “while it is in electronic storage.” This means that, as a communication travels across the Internet, different laws may apply to it at different times. For example, an e-mail message will be protected by the Wiretap Act when in transit but by the SCA when it is stored. The existence of these two legal regimes raises the question: how do we know when the SCA applies to a particular surveillance practice and when the Wiretap Act applies? This issue is important because computer technologies keep the line from being altogether clear. For instance, a digital communication that is primarily in transit may be stored by a computer for just a few milliseconds along the way, and may be stored at intermediate points for longer periods.

The First Circuit Court of Appeals addressed this question in United States v. Councilman, 418 F.3d 67 (1st Cir. 2005) (en banc), which involved a software program designed and covertly installed at an Internet service provider (ISP) to intercept and copy all user e-mail from a competitor company. United States v. Councilman, 418 F.3d at 70–71. The court rejected the defendant’s proposed distinction between “in transit” and “in storage,” holding that the Wiretap Act regulated access to the e-mails and finding the term “electronic communication” to include “transient electronic storage that is intrinsic to the communication process for such communications.” United States v. Councilman, 418 F.3d at 79, 85 (internal quotation marks omitted).

In Councilman, the following facts were stipulated: the [software program] worked only within the confines of [the ISP’s] computer; at all times at which [the software] performed operations affecting the e-mail system, the messages existed “in the random access memory (RAM) or in hard disks, or both, within [the ISP’s] computer system”; and each e-mail message, while traveling through wires, was an “electronic communication” under [the Wiretap Act]. United States v. Councilman, 418 F.3d at 71.

Councilman moved to dismiss the indictment against him for failure to state an offense under the Wiretap Act, arguing that the intercepted e-mail messages were in “electronic storage,” as defined in the SCA, and therefore were not, as a matter of law, subject to the prohibition on “intercept[ing] electronic communication[s].” United States v. Councilman, 418 F.3d at 71 (internal quotation marks omitted). According to the court:

Councilman argues, however, that Congress intended to exclude any communication that is in (even momentary) electronic storage. In his view, “electronic communication[s]” under the Wiretap Act are limited to communications traveling through wires between computers. Once a message enters a computer, he says the message ceases (at least temporarily) to be an electronic communication protected by the Wiretap Act. He claims that Congress considered communications in computers to be worthy of less protection than communications in wires because users have a lower expectation of privacy for electronic communications that are in electronic storage even fleetingly, and that the Act embodies this understanding.

United States v. Councilman, 418 F.3d at 72 (footnote omitted).

The court began its analysis by taking judicial notice of how Internet e-mail works:

The Internet is a network of interconnected computers. Data transmitted across the Internet are broken down into small “packets” that are forwarded from one computer to another until they reach their destination, where they are reconstituted. Each service on the Internet—e.g., e-mail, the World Wide Web, or instant messaging—has its own protocol for using packets of data to transmit information from one place to another.

The e-mail protocol is known as Simple Mail Transfer Protocol (“SMTP”). After a user composes a message in an e-mail client program, a program called a mail transfer agent (“MTA”) formats that message and sends it to another program that “packetizes” it and sends the packets out to the Internet. Computers on the network then pass the packets from one to another; each computer along the route stores the packets in memory, retrieves the addresses of their final destinations, and then determines where to send them next. At various points the packets are reassembled to form the original e-mail message, copied, and then repacketized for the next leg of the journey. Sometimes messages cannot be transferred immediately and must be saved for later delivery. Even when delivery is immediate, intermediate computers often retain backup copies, which they delete later. This method of transmission is commonly called “store and forward” delivery.

Once all the packets reach the recipient’s mail server, they are reassembled to form the e-mail message. A mail delivery agent (“MDA”) accepts the message from the MTA, determines which user should receive the message, and performs the actual delivery by placing the message in that user’s mailbox. One popular MDA is “procmail,” which is controlled by short programs or scripts called “recipe files.” These recipe files can be used in various ways. For example, a procmail recipe can instruct the MDA to deposit mail addressed to one address into another user’s mailbox (e.g., to send mail addressed to “help” to the tech support department), to reject mail from certain addresses, or to make copies of certain messages.

Once the MDA has deposited a message into the recipient’s mailbox, the recipient simply needs to use an e-mail client program to retrieve and read the message. While the journey from sender to recipient may seem rather involved, it usually takes just a few seconds, with each intermediate step taking well under a second.

United States v. Councilman, 418 F.3d at 69–70 (footnotes and citations omitted).

After reviewing the legislative history of ECPA, the court concluded as follows:

If . . . remov[al of] electronic communications from the scope of the Wiretap Act for the brief instants during which they are in temporary storage en route to their destinations—which, as it turns out, are often the points where it is technologically easiest to intercept those communications—neither of the Senate co-sponsors saw fit to mention this to their colleagues, and no one, evidently, remarked upon it. . . . Indeed, we doubt that Congress contemplated the existential oddity that Councilman’s interpretation creates: messages—conceded by stipulation to be electronic communications—briefly cease to be electronic communications for very short intervals, and then suddenly become electronic communications again.

United States v. Councilman, 418 F.3d at 78.

Interestingly, the court found it unnecessary, and therefore declined, to decide the “question of whether the term ‘intercept’ applies only to acquisitions that occur contemporaneously with the transmission of a message from sender to recipient or, instead, extends to an event that occurs after a message has crossed the finish line of transmission (whatever that point may be).” United States v. Councilman, 418 F.3d at 80. It stated:

Because the facts of this case and the arguments before us do not invite consideration of either the existence or the applicability of a contemporaneity or real time requirement, we need not and do not plunge into that morass. We note, however, that even were we prepared to recognize a contemporaneity or real time requirement—a step that we do not take today—we think it highly unlikely that Councilman could generate a winning argument in the circumstances of this case. Any such argument would entail a showing that each transmission was complete at the time of acquisition and, therefore, that the definition of “intercept” does not cover the acquisitions. Such a showing would appear to be impossible since we have concluded that the messages were electronic communications, and it is undisputed that they were acquired while they were still en route to the intended recipients.

United States v. Councilman, 418 F.3d at 80.

Addressing the contemporaneity requirement, in Klumb v. Goan, 884 F. Supp. 2d 644 (E.D. Tenn. 2012), the court held that a wiretap interception occurs when spyware automatically routes a copy of an e-mail, which is sent through the Internet, back through the Internet to a third party’s e-mail address when the intended recipient opens the e-mail for the first time. Klumb v. Goan, 884 F. Supp. 2d at 661. In doing so, the court stated:

Whether the email is rerouted within a “blink-of-an-eye” is not of primary importance . . . . The point is that a program has been installed on the computer which will cause emails sent at some time in the future through the internet to be rerouted automatically through the internet to a third party address when the intended recipient opens the email for the first time.

Klumb v. Goan, 884 F. Supp. 2d at 661.

The issues of “consent” and “in transit” under the Wiretap Act continue to be the subject of litigation. In In re Yahoo Mail Litigation, 7 F. Supp. 3d 1016, 1020, 1023 (N.D. Cal. 2014), four plaintiffs representing a class of people who did not use Yahoo’s e-mail service (Yahoo Mail) but who sent e-mails to Yahoo Mail users alleged that Yahoo violated their “expectation of privacy” by intercepting and scanning e-mail content in contravention of ECPA, the California Constitution, and California’s Invasion of Privacy Act (CIPA). For the purposes of this chapter, only the plaintiffs’ Wiretap Act claims are discussed. The plaintiffs sought declaratory and injunctive relief, statutory damages, “and disgorgement of Yahoo’s revenues from unjust enrichment related to Yahoo’s interception, scanning, and storage of emails from and to non–Yahoo Mail users.” In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1023.

Yahoo argued that the SCA, not the Wiretap Act, governed its conduct because the e-mails were accessed while temporarily stored in its servers en route to the final recipients. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1027. In support, Yahoo cited Konop v. Hawaiian Airlines, Inc., 302 F.3d 868, 878 & n.6 (9th Cir. 2002), where the Ninth Circuit dismissed in a footnote the interpretation that the Wiretap Act applies to electronic communications intercepted while in temporary storage en route to their destination. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1027. Not only is Konop in direct conflict with United States v. Councilman, 418 F.3d 67 (1st Cir. 2005) (en banc), discussed above; the plaintiffs also argued that Konop’s footnote was simply dicta and did not constitute Ninth Circuit precedent binding the Yahoo court. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1027. Nonetheless, the court declined to address Yahoo’s argument, instead noting that, for the purposes of a motion to dismiss, the allegations of a complaint are accepted as true. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1027. Because the plaintiffs alleged that Yahoo intercepted the e-mails while in transit, the court was bound by their interpretation at this stage of the litigation, and it therefore denied Yahoo’s motion to dismiss the plaintiffs’ Wiretap Act claim insofar as the motion rested on the theory that the Wiretap Act did not apply to the e-mails. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1027–28.

The plaintiffs also alleged that Yahoo violated the Wiretap Act by intercepting the e-mails without consent. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1028. Under 18 U.S.C. § 2511(2)(d), only “one party to the communication [must] consent to an interception to relieve the [e-mail] provider of liability,” and in Yahoo’s motion to dismiss, the company claimed it expressly obtained users’ consent. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1028. The court agreed and dismissed the plaintiffs’ Wiretap Act claim based upon language in the Additional Terms of Service (“ATOS”) agreement. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1029. The ATOS was one of three service agreements requiring user consent to create a Yahoo Mail account, and it “explicitly acknowledge[d] that Yahoo scan[ned] and analyze[d] users’ emails.” In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1029. The applicable clause from the ATOS read:

Yahoo’s automated systems scan and analyze all incoming and outgoing communications content sent and received from your account (such as Mail and Messenger content including instant messages and SMS messages) including those stored in your account to, without limitation, provide personally relevant product features and content, to match and serve targeted advertising and for spam and malware detection and abuse protection. By scanning and analyzing such communications content, Yahoo collects and stores the data. Unless expressly stated otherwise, you will not be allowed to opt out of this feature. If you consent to this ATOS and communicate with non-Yahoo users using the Services, you are responsible for notifying those users about this feature.

In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1029.

The plaintiffs argued that, despite the above ATOS clause, Yahoo Mail users did not agree to Yahoo’s practice of scanning and analyzing mail to create “user profiles for both parties to the email communication and sharing content from the emails with third parties” for advertising purposes. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1030. The court nonetheless interpreted the targeted-advertising portion of the clause to allow the practices, because that portion specifically noted that Yahoo scanned and analyzed e-mail to facilitate targeted advertising, and the plaintiffs never alleged that Yahoo created user profiles or shared content with third parties for anything other than targeted advertising. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1030.

The plaintiffs also alleged that Yahoo collected and stored communication contents for future use before users were placed on notice by the addition to the ATOS clause of the line “By scanning and analyzing such communications content, Yahoo collects and stores the data.” In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1030. The court concluded that, even without the addition of the specific reference to collection and storage, “the reasonable user would know that ‘scanning and analyzing’ requires Yahoo to collect and store the email content.” In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1031. In other words, because e-mail analysis necessitates storage, users consented when they agreed to the original language in Yahoo’s ATOS. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1031. Lastly, in terms of using the information gleaned in the “future,” the plaintiffs claimed that Yahoo did not give notice of what “specific use Yahoo would make of the email content in the future.” In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1031. The court pushed back, noting the “explicit statements in the ATOS that Yahoo scans and analyzes email content to provide product features, provide targeted advertising, and detect spam and abuse. It is logical that Yahoo’s future uses would be the same,” especially because the plaintiffs did not allege that Yahoo was planning to use the e-mail content for other reasons. In re Yahoo Mail Litigation, 7 F. Supp. 3d at 1031.

We will never know how the Ninth Circuit would have resolved the plaintiffs’ Wiretap Act claims because the parties settled the case six days before the summary judgment hearing on the plaintiffs’ other claims under the SCA and the CIPA, which had survived Yahoo’s motion to dismiss. In re Yahoo Mail Litigation, Nos. 13-CV-4980-LHK, 13-CV-4989-LHK, 13-CV-5326-LHK, 13-CV-5388-LHK, 2016 WL 4474612, at *2 (N.D. Cal. Aug. 25, 2016). Under the terms of the settlement, Yahoo agreed to a three-year injunction under which it would “no longer intercept, scan, and analyze email that is ‘in transit’”; instead, these practices would occur only once e-mails were on Yahoo’s servers and available for access by Yahoo users. In re Yahoo Mail Litigation, 2016 WL 4474612, at *3. Though Yahoo signaled it did not intend to revert to its previous practices after the injunction expired, the court noted that if it did, the plaintiffs could sue again. In re Yahoo Mail Litigation, 2016 WL 4474612, at *4. Yahoo also agreed to modify its Yahoo Mail agreements and privacy notifications to indicate clearly that incoming and outgoing e-mails are analyzed, that the information is stored, and that certain information is shared. In re Yahoo Mail Litigation, 2016 WL 4474612, at *4.

In its analysis of whether the settlement was “fair, adequate, and reasonable” under Fed. R. Civ. P. 23(e) and Ninth Circuit precedent, one of the criteria the court could consider was “the strength of the plaintiffs’ case.” In re Yahoo Mail Litigation, 2016 WL 4474612, at *5 (internal quotation marks omitted). The court noted that “legal uncertainty favors approval” of a settlement and that, even though the case settled before the summary judgment hearing, “the Court’s initial analysis suggested some vulnerability in the Plaintiffs’ case” under Backhaut v. Apple Inc., 148 F. Supp. 3d 844 (N.D. Cal. 2015). In re Yahoo Mail Litigation, 2016 WL 4474612, at *6. The Backhaut court dismissed claims that Apple violated the Wiretap Act when analyzing users’ text messages because the intercepting “device” (a specific server) fell within the Wiretap Act’s “ordinary course of business” exception, Section 2510(5)(a). In re Yahoo Mail Litigation, 2016 WL 4474612, at *6 (citing Backhaut, 148 F. Supp. 3d at 851–52).
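The “store and forward” mechanics the Councilman court described can be made concrete with a short sketch. The following is a minimal, hypothetical illustration in Python, not the program at issue in the case: an MDA-style delivery function that files an incoming message in the recipient’s mailbox and, like the procmail “recipe” described in Councilman, copies messages from a watched sender while they sit in storage at the provider. All names, paths, and the watched domain are invented for illustration.

# A minimal, hypothetical sketch of MDA-level "store and forward" delivery.
# The watched domain, paths, and function names are illustrative only.
import email
from email.message import Message
from pathlib import Path

MAILBOX_ROOT = Path("/var/mail")      # assumed mailbox location
COPY_DIR = Path("/tmp/intercepted")   # where a copying "recipe" might file duplicates

def deliver(raw_message: bytes, recipient: str) -> None:
    msg: Message = email.message_from_bytes(raw_message)

    # A procmail-style "recipe": copy any message from a watched sender.
    # The copy is made while the message exists only in the provider's
    # RAM or on its disks, before final delivery.
    if "competitor.example.com" in msg.get("From", ""):
        COPY_DIR.mkdir(parents=True, exist_ok=True)
        msg_id = msg.get("Message-ID", "no-id").strip("<>")
        (COPY_DIR / f"{msg_id}.eml").write_bytes(raw_message)

    # Normal delivery: append the message to the recipient's mailbox.
    with (MAILBOX_ROOT / recipient).open("ab") as mbox:
        mbox.write(raw_message + b"\n")

The copying step occurs at a moment when the message exists only in the provider’s memory or on its disks, which is precisely the “transient electronic storage that is intrinsic to the communication process” that the First Circuit held the Wiretap Act still reaches.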

§ 2.3.4 Video Surveillance

The Wiretap Act does not directly address video surveillance. Of course, if a person intercepts a “communication” consisting of video images (such as transmission of an e-mail containing a video clip), the Wiretap Act applies. If a person accesses an individual’s stored video clip, the SCA applies. However, being watched by video surveillance (such as a surveillance camera) does not involve the interception or accessing of stored images—as long as the video is silent. If the video surveillance captures sound as well, it could capture an “oral” communication subject to the Wiretap Act. In sum, silent video surveillance is not covered under either the Wiretap Act or the SCA. See, e.g., United States v. Larios, 593 F.3d 82, 90–91 (1st Cir. 2010); United States v. Falls, 34 F.3d 674, 679–80 (8th Cir. 1994); United States v. Koyomejian, 970 F.2d 536, 538 (9th Cir. 1992); United States v. Biasucci, 786 F.2d 504 (2d Cir. 1986).

§ 2.4 MASSACHUSETTS ELECTRONIC SURVEILLANCE STATUTE

Massachusetts, like most other states, has enacted its own electronic surveillance law prohibiting the interception of wire and oral communications except in certain limited circumstances. G.L. c. 272, § 99.

A key logger program, surreptitiously installed on a computer, that recorded all keystrokes typed into the computer has been held to constitute an “intercepting device” under the Massachusetts statute. Rich v. Rich, No. BRCV200701538, 2011 WL 3672059, at *6 (Mass. Super. Ct. July 8, 2011). In contrast, courts have uniformly held that key logger software does not violate the federal Wiretap Act because “the signal or information captured from the keystrokes is not at that time being transmitted beyond the computer on which the keylogger is installed.” United States v. Barrington, 648 F.3d 1178, 1202 (11th Cir. 2011).

Although the Wiretap Act and the Massachusetts statute share some similarities, they are different in a number of significant respects.
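The technical fact driving the federal cases is that a key logger reads each keystroke from the local input device and records it locally; nothing is acquired while moving over a wire or network. A minimal, hypothetical sketch (using the third-party pynput library; not the software at issue in either case) illustrates the point:

# A minimal sketch, assuming the third-party pynput library; hypothetical,
# not code from Rich or Barrington. Keystrokes are read from the local
# input device and written to a local file, so nothing is captured while
# "being transmitted beyond the computer."
from pynput import keyboard

log = open("keystrokes.log", "a")  # a local file; no network transmission

def on_press(key):
    # Each keystroke is acquired on the machine itself, not in transit.
    log.write(f"{key}\n")
    log.flush()

with keyboard.Listener(on_press=on_press) as listener:
    listener.join()  # block until the listener stops

Under the federal Wiretap Act, that purely local acquisition is not an “interception”; under the Massachusetts statute’s broader notion of an “intercepting device,” Rich reached the opposite result on materially similar technology.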

§ 2.4.1 Oral Communications—No “Expectation of Privacy” Requirement

With respect to oral communications, the Wiretap Act applies to “communication[s] uttered by a person exhibiting an expectation that such communication is not subject to interception under circumstances justifying such expectation.” 18 U.S.C. § 2510(2). The majority of states that have adopted wiretapping statutes have similarly limited the types of oral communication covered to those made with the expectation that they would not be intercepted. See, e.g., Alaska Stat. § 42.20.300 (Alaska; covering only “private communication”); Ga. Code Ann. § 16-11-62 (Georgia; covering only communications made “in a private place”); Mich. Comp. Laws § 750.539a (Michigan; limiting reach to “private discourse of others”).

The Massachusetts statute, however, contains no such “expectation of privacy” requirement. It broadly defines an “oral communication” as “speech, except such speech as transmitted over the public airwaves by radio or other similar device.” G.L. c. 272, § 99(B)(2). The Massachusetts Supreme Judicial Court confirmed the broad reach of the statute’s plain language in Commonwealth v. Hyde, 434 Mass. 594 (2001).


In Hyde, the defendant was pulled over by Abington police officers for a suspected traffic violation. Commonwealth v. Hyde, 434 Mass. at 595. When the traffic stop turned sour, the defendant began secretly recording the officers with a handheld audio recording device. Commonwealth v. Hyde, 434 Mass. at 595–96. The officers were recorded using profanity and accusing the defendant of possessing illegal narcotics. Commonwealth v. Hyde, 434 Mass. at 595. Despite these accusations, the officers decided to let the defendant go with a verbal warning. Commonwealth v. Hyde, 434 Mass. at 595.

Less than a week after the stop, the defendant went to the Abington police station to initiate a formal complaint against the officers for their misconduct during the incident. Commonwealth v. Hyde, 434 Mass. at 596. In support of his claims, the defendant supplied to the police department a copy of the recording he had made during the stop. Commonwealth v. Hyde, 434 Mass. at 595. The police department subsequently conducted an internal investigation, the results of which cleared the officers of any wrongdoing. Commonwealth v. Hyde, 434 Mass. at 595. After the internal investigation, the police department pursued a criminal complaint against the defendant, asserting that he violated the Massachusetts wiretap law by recording the officers without their consent. Commonwealth v. Hyde, 434 Mass. at 595. In response, the defendant moved to dismiss the complaint on the grounds that the officers, in carrying out their duties as public officials, did not have a reasonable expectation that their words would be kept private, and thus the recorded conversation was not an “oral communication” under the statute. Commonwealth v. Hyde, 434 Mass. at 595.

The court, in rejecting the defendant’s argument, noted that the “reasonable expectation of privacy” limitation was absent from the statute and concluded that the Massachusetts legislature “intended . . . strictly to prohibit all secret recordings by members of the public, including recordings of police officers or other public officials interacting with members of the public, when made without their permission or knowledge.” Commonwealth v. Hyde, 434 Mass. at 599–600. The Hyde court thus confirmed the broad interpretation of the definition of “oral communication” under the Massachusetts statute and refused to adopt the “reasonable expectation of privacy” limitation found in the federal statute. Commonwealth v. Hyde, 434 Mass. at 596–97. Specifically, the Hyde court stated as follows:

We reject the defendant’s argument that the statute is not applicable because the police officers were performing their public duties, and, therefore, had no reasonable expectation of privacy in their words. The statute’s preamble expresses the Legislature’s general concern that “the uncontrolled development and unrestricted use of modern electronic surveillance devices pose[d] grave dangers to the privacy of all citizens of the commonwealth” and this concern was relied on to justify the ban on the public’s clandestine use of such devices. While we recognize that G.L. c. 272, § 99, was designed to prohibit the use of electronic surveillance devices by private individuals because of the serious threat they pose to the “privacy of all citizens,” the plain language of the statute, which is the best indication of the Legislature’s ultimate intent, contains nothing that would protect, on the basis of privacy rights, the recording that occurred here. In Commonwealth v. Jackson, supra at 506, this court rejected the argument that, because a kidnapper has no legitimate privacy interest in telephone calls made for ransom purposes, the secret electronic recording of that conversation by the victim’s brother would not be prohibited under G.L. c. 272, § 99: “[W]e would render meaningless the Legislature’s careful choice of words if we were to interpret ‘secretly’ as encompassing only those situations where an individual has a reasonable expectation of privacy.”

....

Further, if the tape recording here is deemed proper on the ground that public officials are involved, then the door is opened even wider to electronic “bugging” or secret audio tape recording (both are prohibited by the statute and both are indistinguishable in the injury they inflict) of virtually every encounter or meeting between a person and a public official, whether the meeting or encounter is one that is stressful (like the one in this case or, perhaps, a session with a tax auditor) or nonstressful (like a routine meeting between a parent and a teacher in a public school to discuss a good student’s progress). The door once opened would be hard to close, and the result would contravene the statute’s broad purpose and the Legislature’s clear prohibition of all secret interceptions and recordings by private citizens.

Commonwealth v. Hyde, 434 Mass. at 600–01, 603 (footnotes and citations omitted).

The court also distinguished its holding in Commonwealth v. Gordon—see § 2.4.3, below—which allowed into evidence routine administrative recordings of a defendant’s booking procedure following an arrest, stating that the Hyde defendant could not claim that his recording was made as a routine administrative procedure. Commonwealth v. Hyde, 434 Mass. at 601.

In dissent, Chief Justice Marshall wrote:

The purpose of G.L. c. 272, § 99, is not to shield public officials from exposure of their wrongdoings. I have too great a respect for the Legislature to read any such meaning into a statute whose purpose is plain, and points in another direction entirely. Where the legislative intent is explicit, it violates a fundamental rule of statutory construction to reach a result that is plainly contrary to that objective. . . . To hold that the Legislature intended to allow police officers to conceal possible misconduct behind a cloak of privacy requires a more affirmative showing than this statute allows.

In our Republic the actions of public officials taken in their public capacities are not protected from exposure. Citizens have a particularly important role to play when the official conduct at issue is that of the police. Their role cannot be performed if citizens must fear criminal reprisals when they seek to hold government officials responsible by recording—secretly recording on occasion—an interaction between a citizen and a police officer.

....

The court’s ruling today also threatens the ability of the press—print and electronic—to perform its constitutional role of watchdog. As the court construes the Massachusetts wiretapping statute, there is no principled distinction to explain why members of the media would not be held to the same standard as all other citizens.

Commonwealth v. Hyde, 434 Mass. at 612, 613–14 (citations omitted).

The Supreme Judicial Court’s holding that the Massachusetts wiretap statute is “intended . . . strictly to prohibit all secret recordings by members of the public, including recordings of police officers or other public officials interacting with members of the public, when made without their permission or knowledge,” Commonwealth v. Hyde, 434 Mass. at 594, has been challenged as unconstitutional in two recent cases before the United States District Court for the District of Massachusetts.

In the first case, Martin v. Evans, No. 16-11362-PBS, 2017 WL 1015000 (D. Mass. Mar. 13, 2017), civil rights activists wanted to secretly record police officers doing their jobs in public but refrained from doing so due to fears they would be prosecuted under Section 99. Martin v. Evans, 2017 WL 1015000, at *1. The activists therefore brought an as-applied constitutional challenge under 42 U.S.C. § 1983 seeking injunctive and declaratory relief, arguing that the law violates the First and Fourteenth Amendments. Martin v. Evans, 2017 WL 1015000, at *1. The defendants, Boston’s police commissioner and the Suffolk County district attorney, moved to dismiss for several reasons, including that the plaintiffs did not state a First Amendment violation because secretly recording police officers is not a right the Amendment provides. Martin v. Evans, 2017 WL 1015000, at *1, *7. The court denied the defendants’ motions to dismiss. Martin v. Evans, 2017 WL 1015000, at *1, *8.

Although not explicitly quoting Chief Justice Marshall’s dissent in Hyde, the Martin court’s analysis of the plaintiffs’ First Amendment claim echoed Marshall’s concerns that Section 99 prevents citizens from playing an important role in our society: holding government officials accountable for their actions. Commonwealth v. Hyde, 434 Mass. at 612 (Marshall, C.J., dissenting). The Martin court wrote that the Amendment protects “information gathering,” including audiovisual and audio recording. Martin v. Evans, 2017 WL 1015000, at *7. When the gathered information pertains to government officials and will be shared with others, the information gathering is a “particularly important First Amendment interest.” Martin v. Evans, 2017 WL 1015000, at *7. Indeed, the court noted, encouraging free discourse around governmental affairs was a major reason for the Amendment. Martin v. Evans, 2017 WL 1015000, at *7.

The court did agree with the defendants’ argument that First Amendment protections can be restricted under reasonable circumstances, such as curtailing the secret recording and broadcasting of conversations between law enforcement and crime victims, or First Amendment actions presenting safety concerns or interfering with law enforcement executing their duties. Martin v. Evans, 2017 WL 1015000, at *8. However, as written, Section 99 does not simply prohibit recordings in these reasonable circumstances; instead, the law proscribes “a significant amount of nondisruptive and safe First Amendment activities.” Martin v. Evans, 2017 WL 1015000, at *8. Writing that “[t]he government does not have a significant interest in protecting the privacy of law enforcement officials discharging their duties in a public place,” the court therefore held that, as applied “to the secret recording of government officials in the performance of their duties in public,” Section 99 violates the First Amendment. Martin v. Evans, 2017 WL 1015000, at *8.

In the second case, Project Veritas Action Fund v. Conley, No. 16-10462-PBS, 2017 WL 1100423 (D. Mass. Mar. 23, 2017), the plaintiff media organization Project Veritas (Veritas) engaged in undercover journalism in which it intercepted and recorded oral communications, often occurring in public, without consent. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *1. Veritas had yet to record in Massachusetts for fear of being prosecuted under Section 99. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *1. Consequently, Veritas brought as-applied and facial challenges to the Massachusetts statute under 42 U.S.C. § 1983 seeking injunctive and declaratory relief, arguing that Section 99’s prohibition of secretly recording the oral conversations of both private and public individuals violates the First and Fourteenth Amendments. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *1. Just as in Martin, the Suffolk County district attorney, as defendant, moved to dismiss for, inter alia, failure to state a First Amendment claim because secretly recording oral communications is not a right the Amendment provides. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *1, *4.

The court’s initial analysis of the as-applied challenge tracked that found in Martin, namely that the First Amendment protects recording government officials as they work in public but that such “information gathering” is subject to reasonable restrictions. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *4. But unlike Martin, this case was not about secretly recording government officials in public. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *4. Instead, the question was “whether Section 99 violates the First Amendment by categorically prohibiting the intentional secret recording of private individuals.” Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *4 (emphasis added).

The plaintiff argued that under the First Amendment it had a right to record private conversations occurring in public because “there is no reasonable expectation of privacy.” Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *5. The court pushed back, writing that “private talk in public places is common,” and though individuals may expect to be overheard, under the First Amendment “there is a significant privacy difference between overhearing a conversation in an area with no reasonable expectation of privacy and recording and replaying that conversation for all to hear.” Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *5 (citations and internal quotation marks omitted). To this end, the court noted that the “express legislative purpose of Section 99’s unequivocal ban of secret audio recording is to protect Massachusetts citizens’ privacy,” Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *4, and, according to Hyde, “the statute is meant to protect individuals independent of their reasonable expectation of privacy,” Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *5. Consequently, Section 99 is a reasonable restriction on information gathering because it protects Massachusetts citizens’ privacy by allowing only open recording of conversations between private individuals. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *6 (emphasis added). As a reasonable restriction, Section 99 does not run afoul of the First Amendment, and Veritas’s as-applied challenge failed. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *6.

As to the facial challenge, the court noted that, in order for a First Amendment facial challenge to succeed, a “substantial number” of a law’s applications must be unconstitutional. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *6 (citation and internal quotation marks omitted). The court held that, apart from Martin’s finding Section 99 unconstitutional as applied to secretly recording government officials as they work in public, “a wide range of legitimate applications remain”: the law “constitutionally protects private conversations in all settings and conversations with government officials in nonpublic settings or about non-official matters.” Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *6. Therefore, the court ruled that Section 99 is neither overbroad nor facially unconstitutional. Project Veritas Action Fund v. Conley, 2017 WL 1100423, at *6.

§ 2.4.2 Consent of All Parties

Unlike the Wiretap Act, the Massachusetts statute imposes liability on a person who “intercepts” a wire or oral communication without first obtaining the consent of all parties to that communication, not just one. G.L. c. 272, § 99(B)(4). In interpreting the consent requirement, Massachusetts courts have held that the parties’ knowledge of the recording activities is sufficient, even if they have not affirmatively authorized or consented to it. Commonwealth v. Jackson, 370 Mass. 502 (1976).

The Massachusetts statute does, however, create an exception for law enforcement officials, under which the consent of one, and not all, of the parties is sufficient to avoid criminal liability. G.L. c. 272, § 99(B)(4). To qualify for the exception, it must be shown that the law enforcement officer was a party to the communication or had obtained prior authorization from one of the parties and that the recording was made “in the course of an investigation of a designated offense.” G.L. c. 272, § 99(B)(4) (emphasis added).


The statute defines the term “designated offense” as one of a number of particularly serious crimes committed in connection with organized crime. G.L. c. 272, § 99(B)(4). Thus, to fulfill the “designated offense” prong of the exception, the intercepting party must “show that the decision to intercept was made on the basis of a reasonable suspicion that interception would disclose or lead to evidence of a designated offense in connection with organized crime.” Commonwealth v. Burgos, 470 Mass. 133, 140 (2014) (quoting Commonwealth v. Thorpe, 384 Mass. 271, 281 (1981) (internal quotation marks omitted)). That the one-party exception applies only when the offense investigated is related to organized crime further limits the circumstances under which a person may lawfully record oral or wire communications without first obtaining the consent of all of the parties to that communication under the Massachusetts statute.
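To make the structure of the Massachusetts rule concrete, the following sketch restates it as a decision procedure. This is an illustration only, not a statement of the statute: the function and parameter names are assumptions invented for the example, and real cases turn on fact-bound questions (such as what counts as knowledge of the recording) that no checklist captures.

    # Illustrative sketch (assumed names) of the consent rule of
    # G.L. c. 272, § 99(B)(4), as summarized above.
    def recording_permitted(all_parties_aware: bool,
                            law_enforcement: bool = False,
                            party_or_prior_authorization: bool = False,
                            investigating_designated_offense: bool = False) -> bool:
        # Knowledge of the recording by all parties suffices; affirmative
        # consent is not required (Commonwealth v. Jackson).
        if all_parties_aware:
            return True
        # One-party exception: the officer is a party (or has one party's
        # prior authorization) and records "in the course of an investigation
        # of a designated offense" connected to organized crime.
        return (law_enforcement
                and party_or_prior_authorization
                and investigating_designated_offense)

For example, recording_permitted(False, law_enforcement=True, party_or_prior_authorization=True, investigating_designated_offense=False) returns False, mirroring Burgos and Thorpe: without the organized crime nexus, one-party consent does not save the interception.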

§ 2.4.3 Video Surveillance

Like the Wiretap Act, the Massachusetts statute does not address video surveillance. In Commonwealth v. Gordon, 422 Mass. 816 (1996), the Massachusetts Supreme Judicial Court had occasion to consider whether a nonsilent videotape of the defendants’ booking at the police station was covered by the Massachusetts statute. In Gordon, the defendants appealed their convictions of murder and armed robbery partly on the grounds that the trial court improperly admitted into evidence video recordings of the defendants being booked at the police station following their arrests. Commonwealth v. Gordon, 422 Mass. at 818–19. The defendants asserted that the video evidence (which included audio) should have been excluded on the grounds that the making of the tape was unlawful under the Massachusetts wiretap law. Commonwealth v. Gordon, 422 Mass. at 832. In upholding the trial court’s decision to admit the video into evidence, the court noted that the Massachusetts legislature enacted the wiretap law with a focus on the “protection of privacy rights and the deterrence of interference therewith by law enforcement officers’ surreptitious eavesdropping as an investigative tool.” Commonwealth v. Gordon, 422 Mass. at 833. Acknowledging the aims of the legislature in enacting the wiretap law, the court held that the video evidence admitted at trial did not fall within the law’s reach, noting that, in enacting the statute, “[t]he Legislature does not appear to have had in mind the recording of purely administrative booking steps following an individual’s arrest.” Commonwealth v. Gordon, 422 Mass. at 833. The court also noted that the protections provided by the statute did not apply to this situation specifically because “the videotape did not capture or reveal the defendants’ thoughts or knowledge about some fact or subject, but at best served only to exhibit the defendants’ bearing and manner of speaking which were relevant on the question of their intoxication or sobriety at the time of the assaults.” Commonwealth v. Gordon, 422 Mass. at 833.


§ 2.5 THE STORED COMMUNICATIONS ACT

§ 2.5.1 Why the SCA Exists

Because of the way the Internet works, an Internet user’s private communications end up being sent to third parties and stored on remote network servers that essentially serve as the user’s “virtual home.” At the time of the SCA’s enactment in 1986, under the third-party doctrine, Fourth Amendment protection did not extend to information that had been revealed to third parties. As a result, there was concern that Internet users would have no Fourth Amendment privacy protection in information sent to network providers, including stored e-mails.

In addition, most ISPs are private actors, not governmental entities. Consequently, under the private search doctrine, the Fourth Amendment “is wholly inapplicable to a search or seizure, even an unreasonable one, effected by a private individual not acting as an agent of the Government or with the participation or knowledge of any governmental official.” United States v. Jacobsen, 466 U.S. 109, 113 (1984) (citation and internal quotation marks omitted); see also United States v. Steiger, 318 F.3d 1039, 1045 (11th Cir. 2003) (concluding that searches of defendant’s computer over the Internet by an anonymous computer hacker did not violate the Fourth Amendment because there was no evidence that the government was involved in the search); United States v. Hall, 142 F.3d 988, 993 (7th Cir. 1998) (Fourth Amendment did not protect against computer technician’s search executed while repairing computer because there was no government involvement); United States v. Kennedy, 81 F. Supp. 2d 1103, 1112 (D. Kan. 2000) (Fourth Amendment did not protect against ISP searching subscriber’s hard drive because there was no government involvement).

Congress enacted the SCA to address these privacy imbalances by offering network account holders a range of statutory privacy protections. First, it prohibits unauthorized access to a wire or electronic communication while it is in electronic storage. Second, it creates limits on the government’s ability to compel providers to disclose information in their possession about their customers and subscribers. Finally, it places limits on the ability of providers to voluntarily disclose information about their customers and subscribers.

§ 2.5.2 Prohibition on Unauthorized Access

Section 2701 of the SCA states as follows:

(a) Offense.—Except as provided in subsection (c) of this section whoever—

(1) intentionally accesses without authorization a facility through which an electronic communication service is provided; or

(2) intentionally exceeds an authorization to access that facility; and thereby obtains, alters, or prevents authorized access to a wire or electronic communication while it is in electronic storage in such system shall be punished as provided in subsection (b) of this section.

18 U.S.C. § 2701(a)(1)–(2) (emphasis added). For these purposes, “electronic storage” means—

(A) any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof; and

(B) any storage of such communication by an electronic communication service for purposes of backup protection of such communication.

18 U.S.C. § 2510(17).

Section 2701(a) does not apply to conduct authorized “by the person or entity providing a wire or electronic communications service” (such as ISPs). 18 U.S.C. § 2701(c)(1). Note that, unlike the service provider exception for the Wiretap Act, which allows interceptions on a limited basis (namely, those necessary to provide the communications service), the SCA’s exception is broader and contains no such limitation. Section 2701(a) also does not apply to conduct authorized “by a user of that service with respect to a communication of or intended for that user.” 18 U.S.C. § 2701(c)(2).

In Konop v. Hawaiian Airlines, Inc., 302 F.3d 868 (9th Cir. 2002), the Ninth Circuit had occasion to construe whether the exception in § 2701(c)(2) applied in a case involving an employer who accessed and viewed an employee’s secure website. The district court found that the exception applied and that there was no violation of the SCA because two of the employee’s coworkers, Wong and Gardner, had consented to the employer’s use of the employee’s website. Konop v. Hawaiian Airlines, Inc., 302 F.3d at 880. Wong and Gardner were both eligible users of the website, but it was unclear whether either of them had ever actually accessed it. Konop v. Hawaiian Airlines, Inc., 302 F.3d at 880. Reversing the district court’s decision granting summary judgment to the employer on the SCA claim, the Ninth Circuit stated that

the plain language of § 2701(c)(2) indicates that only a “user” of the service can authorize a third party’s access to the communication. The statute defines “user” as one who 1) uses the service and 2) is duly authorized to do so. Because the statutory language is unambiguous, it must control our construction
of the statute, notwithstanding the legislative history. The statute does not define the word “use,” so we apply the ordinary definition, which is “to put into action or service, avail oneself of, employ.” Based on the common definition of the word “use,” we cannot find any evidence in the record that Wong ever used Konop’s website. There is some evidence, however, that Gardner may have used the website, but it is unclear when that use occurred. At any rate, the district court did not make any findings on whether Wong and Gardner actually used Konop’s website—it simply assumed that Wong and Gardner, by virtue of being eligible to view the website, could authorize Davis’ access. The problem with this approach is that it essentially reads the “user” requirement out of § 2701(c)(2). Taking the facts in the light most favorable to Konop, we must assume that neither Wong nor Gardner was a “user” of the website at the time he authorized Davis to view it. We therefore reverse the district court’s grant of summary judgment to Hawaiian on Konop’s SCA claim.

Konop v. Hawaiian Airlines, 302 F.3d at 880 (citations omitted).

Of interest to practitioners, the Maine Supreme Judicial Court has held that an attorney who served subpoenas on a cell phone provider to access a person’s cell phone records and text messages did not violate Section 2701(a) and that the cell phone provider’s misconduct in providing such information to the attorney did not constitute an ethical violation by the attorney. Board of Overseers of the Bar v. Ferris, BAR-13-11 (Jan. 31, 2014) (Alexander, J.). The court reasoned that

[the § 2701(a)] prohibitions appear to address hacking, wiretapping and other means for accessing electronic communication systems that are not known to the electronic communications provider. These prohibitions do not apply to attempts to access electronic communication system information by subpoena. Otherwise the subpoena servers addressed in the many opinions cited below would have subjected themselves [to] the criminal penalties of section 2701(b).

See Board of Overseers of the Bar v. Ferris, BAR-13-11 at 22.

The punishment for an offense under Section 2701(a) can be severe. For example, if the offense is committed for purposes of commercial advantage or gain, a first-time violation can result in a fine or imprisonment for up to five years, or both. 18 U.S.C. § 2701(b)(1)(A). For any subsequent offense, the result can be a fine or imprisonment for up to ten years, or both. 18 U.S.C. § 2701(b)(1)(B).
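Stripped to its elements, the Section 2701 analysis described above (wrongful access, effect on a stored communication, and the Section 2701(c) exceptions) can be sketched schematically. The sketch below is a simplification under stated assumptions, not a restatement of the statute: the field names are invented, and contested questions, such as what qualifies as “electronic storage” or who counts as a “user” after Konop, are reduced to booleans.

    # Illustrative sketch (assumed field names) of the elements of
    # 18 U.S.C. § 2701(a) and the § 2701(c) exceptions discussed above.
    from dataclasses import dataclass

    @dataclass
    class AccessFacts:
        without_authorization: bool        # § 2701(a)(1)
        exceeded_authorization: bool       # § 2701(a)(2)
        obtained_altered_or_blocked: bool  # obtained, altered, or prevented access
        in_electronic_storage: bool        # § 2510(17): intermediate or backup storage
        provider_authorized: bool = False  # § 2701(c)(1): provider consent
        user_authorized: bool = False      # § 2701(c)(2): consent of an actual "user"

    def violates_section_2701(f: AccessFacts) -> bool:
        # The § 2701(c) exceptions defeat liability outright.
        if f.provider_authorized or f.user_authorized:
            return False
        wrongful_access = f.without_authorization or f.exceeded_authorization
        return (wrongful_access
                and f.obtained_altered_or_blocked
                and f.in_electronic_storage)

On the Konop facts, user_authorized would be False because neither coworker was shown actually to have used the website, which is precisely why the Ninth Circuit reversed summary judgment.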


Communication service providers who disclose stored communications “in accordance with the terms of a court order, warrant, subpoena, statutory authorization, or certification” under the SCA cannot be held liable for that disclosure. 18 U.S.C. § 2703(e).

§ 2.5.3 Entities Regulated by the SCA

The SCA regulates two types of network service providers: providers of electronic communication service (ECS) and providers of remote computing service (RCS). The statute defines an ECS as “any service which provides to users thereof the ability to send or receive wire or electronic communications.” 18 U.S.C. § 2510(15). An RCS is defined as “the provision to the public of computer storage or processing services by means of an electronic communications system.” 18 U.S.C. § 2711(2). An “electronic communications system” is in turn defined as “any wire, radio, electromagnetic, photooptical or photoelectronic facilities for the transmission of wire or electronic communications, and any computer facilities or related electronic equipment for the electronic storage of such communications.” 18 U.S.C. § 2510(14).

These distinctions dictate the scope of the SCA’s privacy protections, making it necessary to distinguish between providers offering ECS and RCS, as well as those providers offering neither service. What is more, most network service providers are multifunctional, meaning that they can act as providers of ECS in some contexts, as providers of RCS in other contexts, and as neither in still other situations. Therefore, it also is important to recognize the provider’s role with respect to a particular copy of a communication, rather than the provider’s status in the abstract. In the context of a public provider, for example, files held in intermediate “electronic storage” are protected under the ECS rules, 18 U.S.C. §§ 2702(a)(1) and 2703(a), while files held for long-term storage by that same provider are protected by the RCS rules, 18 U.S.C. §§ 2702(a)(2) and 2703(b).

How these distinctions play out in actual practice is not always clear. Some cases are fairly easy. For example, when an e-mail sits unopened on an ISP’s server, the ISP is acting as a provider of ECS with respect to that e-mail. On the other hand, if you send a document via file transfer protocol (FTP) to a commercial long-term storage site for safekeeping, the storage site is acting as a provider of RCS with respect to that file. There are closer cases, however. For example, the proper treatment of opened e-mail is currently unclear. One view is that a copy of an opened e-mail sitting on a server is protected by the RCS rules, not the ECS rules. The thinking is that, when a customer leaves a copy of an already accessed e-mail stored on a server, that copy is no longer “incidental to the electronic transmission [of such e-mail],” 18 U.S.C. § 2510(17); rather, it is just in remote storage like any other file held by an RCS. See Fraser v. Nationwide Mut. Ins. Co., 135 F. Supp. 2d 623, 635–38 (E.D. Pa. 2001), aff’d on other grounds, 352 F.3d 107 (3d Cir. 2003); see also H.R. Rep. No. 99-647 at 64–65 (1986) (noting that opened e-mails stored on a server are protected under provisions relating to remote computing services).


The Ninth Circuit took a different approach, however, in Theofel v. Farey-Jones, 359 F.3d 1066 (9th Cir. 2004). There, the court concluded that all e-mails held by a server are protected under the ECS rules until “the underlying message has expired in the normal course,” regardless of whether the e-mail has been accessed. Theofel v. Farey-Jones, 359 F.3d at 1076. It is unclear what “normal course” the Ninth Circuit has in mind. Although Theofel has been the subject of much criticism, the court in Cheng v. Romo, No. 11-10007-DJC, 2013 WL 6814691 (D. Mass. Dec. 20, 2013), found its reasoning persuasive. Cheng v. Romo, 2013 WL 6814691 at *3. There, the court held that copies of previously opened e-mails that the Yahoo server continued to store were still in “electronic storage” and covered by the SCA at the time they were unlawfully accessed by the defendant. Cheng v. Romo, 2013 WL 6814691 at *3–6.
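Because the governing rules attach to the provider’s role with respect to a particular copy, the analysis above can be restated as a rough decision procedure. The sketch below is illustrative only: the category labels are assumptions invented for the example, and it flags the opened e-mail question as unsettled to reflect the Fraser/Theofel disagreement.

    # Rough, illustrative decision procedure (assumed labels) for which SCA
    # rules govern a particular copy of a communication.
    def sca_rules_for_copy(copy_kind: str, provider_serves_public: bool) -> str:
        if copy_kind == "unopened_email":
            # Temporary, intermediate storage incidental to transmission.
            return "ECS rules (18 U.S.C. §§ 2702(a)(1), 2703(a))"
        if copy_kind == "long_term_stored_file":
            # RCS status requires storage or processing offered "to the public."
            if provider_serves_public:
                return "RCS rules (18 U.S.C. §§ 2702(a)(2), 2703(b))"
            return "outside the RCS definition (not offered 'to the public')"
        if copy_kind == "opened_email":
            return ("unsettled: RCS rules under Fraser; ECS rules under "
                    "Theofel and Cheng until the message expires "
                    "'in the normal course'")
        return "unclassified"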

§ 2.5.4 Providers “to the Public” Versus Nonpublic Providers

The SCA distinguishes between providers that make their services available “to the public” and those that do not. The distinction is important for both the compelled and the voluntary disclosure rules. In the case of the voluntary disclosure rules, the distinction is critical: the SCA’s voluntary disclosure limitations apply only to providers that make services available to the public. 18 U.S.C. § 2702. The distinction also carries importance in the compelled disclosure rules. This is because, to be considered an RCS, a provider must provide computer storage or processing services to the public. 18 U.S.C. § 2711(2). Thus, if the provider provides services to the public, the RCS rules protect opened e-mail held by the provider, whereas if the provider does not provide services to the public, the SCA does not protect opened e-mail.

The line between the two categories is fairly clear. A provider “to the public” makes its services available to the public at large, whether for a fee or without cost. See Andersen Consulting LLP v. UOP, 991 F. Supp. 1041, 1042–43 (N.D. Ill. 1998) (providing e-mail access to a contractor is not the same as providing these services “to the public”). Examples of such commercial ISPs would include Google and Yahoo!. In contrast, providers do not provide services to the public if the services are available only to users with special relationships with the provider. Examples would include a university that provides accounts to its faculty and students or a company that provides corporate accounts to its employees.

§ 2.5.5 Content Information Versus Noncontent Information

Under the SCA, different rules apply to “contents” of communications and noncontent information. The latter, often referred to as “envelope” information, is defined as “a record or other information pertaining to a subscriber to or customer of [an ECS or a RCS] (not including the contents of communications).” 18 U.S.C. § 2703(c)(1). “Contents,” on the other hand, is defined under the Wiretap Act as follows: “‘[C]ontents’, when used with respect to any wire, oral, or electronic communication, includes any information concerning the substance, purport, or meaning of that communication.” 18 U.S.C. § 2510(8).


The line between “contents” and noncontent information is usually clear. In the case of an e-mail, “contents” clearly covers the actual text of the message. It also likely covers the subject line of the e-mail. In contrast, logs of account usage, mail header information minus the subject line, lists of outgoing e-mail addresses sent from the account, and basic customer information all likely would count as noncontent information. Not surprisingly, the SCA gives greater privacy protection to content information, even though, as has been noted, “[e]nvelope information can reveal a lot about a person’s private activities, sometimes as much (and even more) than can content information.” Daniel J. Solove, “Reconstructing Electronic Surveillance Law,” 72 Geo. Wash. L. Rev. 1264, 1287–88 (2004).
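A concrete example may help fix the line. In the sketch below, a single e-mail is split into the two categories; the field names are invented for the illustration, and the subject line is placed on the content side, consistent with the discussion above.

    # Illustrative split (assumed field names) of one e-mail into "contents"
    # (18 U.S.C. § 2510(8)) and noncontent "envelope" information.
    message = {
        "from_address": "sender@example.com",       # noncontent (envelope)
        "to_address": "recipient@example.com",      # noncontent (envelope)
        "session_log": "2018-01-05 09:30, 42s",     # noncontent (usage records)
        "subject": "Quarterly forecast",            # likely content (substance)
        "body": "The draft numbers are attached.",  # content
    }
    ENVELOPE_FIELDS = {"from_address", "to_address", "session_log"}
    envelope = {k: v for k, v in message.items() if k in ENVELOPE_FIELDS}
    content = {k: v for k, v in message.items() if k not in ENVELOPE_FIELDS}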

§ 2.5.6 The Privacy Protections of the SCA

The SCA’s core privacy protections are contained in 18 U.S.C. §§ 2702 and 2703. Section 2703 sets forth the rules that the government must follow when it seeks to compel a provider to disclose information. Section 2702 provides the rules that govern whether a provider can disclose information voluntarily.

(a) Compelled Disclosure Rules

Under Section 2703, different judicial process standards apply, depending on the type of information the government seeks to obtain. To compel a provider of ECS to disclose contents of communications that are in temporary “electronic storage” for 180 days or less, the government must obtain a search warrant. 18 U.S.C. § 2703(a). To compel a provider of ECS to disclose contents in electronic storage for greater than 180 days, or to compel a provider of RCS to disclose contents, the government has three options. 18 U.S.C. § 2703(a)–(b). For one, the government can obtain a search warrant. 18 U.S.C. § 2703(b)(1)(A). Alternatively, if the government gives the “subscriber or customer” prior notice (which can be delayed in certain circumstances), the government can compel disclosure with either a subpoena or a court order issued pursuant to “specific and articulable facts” as described in Section 2703(d). 18 U.S.C. § 2703(b)(1)(B).

The rules governing compelled disclosure of noncontent records are the same for providers of ECS and RCS, and there are several ways the government can compel such records. The government can obtain either a search warrant or a Section 2703(d) order to compel such records. 18 U.S.C. § 2703(c)(1)(A)–(B). Investigators can also compel the disclosure of such records if they obtain the consent of the customer or subscriber to such disclosure. 18 U.S.C. § 2703(c)(1)(C). The government also can obtain the following basic subscriber information with a mere administrative subpoena:

(A) name;

(B) address;

(C) local and long distance telephone connection records, or records of session times and durations;

(D) length of service (including start date) and types of service utilized;

(E) telephone or instrument number or other subscriber number or identity, including any temporarily assigned network address; and

(F) means and source of payment for such service (including any credit card or bank account number) . . . .

18 U.S.C. § 2703(c)(2).
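Section 2703’s sliding scale can be restated as a simple lookup, as in the hedged sketch below. The category strings are simplifying assumptions invented for the example; in practice the classification of a given copy of a communication turns on the ECS/RCS questions discussed in § 2.5.3 above.

    # Simplified, illustrative map (assumed category names) of § 2703's
    # compelled disclosure tiers; edge cases are omitted.
    def sufficient_process(category: str, days_in_storage: int = 0) -> list:
        if category == "contents_ecs" and days_in_storage <= 180:
            return ["search warrant"]                    # § 2703(a)
        if category in ("contents_ecs", "contents_rcs"):
            return ["search warrant",                    # § 2703(b)(1)(A)
                    "subpoena + prior notice",           # § 2703(b)(1)(B)(i)
                    "§ 2703(d) order + prior notice"]    # § 2703(b)(1)(B)(ii)
        if category == "noncontent_records":
            return ["search warrant", "§ 2703(d) order",
                    "customer or subscriber consent"]    # § 2703(c)(1)(A)-(C)
        if category == "basic_subscriber_info":
            return ["administrative subpoena"]           # § 2703(c)(2)
        raise ValueError("unknown category: " + category)

The sketch also makes the sliding scale visible: everything a warrant reaches includes what a § 2703(d) order reaches, which in turn includes basic subscriber information.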

(b) Voluntary Disclosure Rules

Importantly, Section 2702 imposes restrictions only on providers that provide services “to the public.” 18 U.S.C. § 2702(a). By implication, therefore, nonpublic providers can voluntarily disclose information without limitation without violating the SCA. See Andersen Consulting LLP v. UOP, 991 F. Supp. 1041, 1042–43 (N.D. Ill. 1998). Subject to specific exceptions, Section 2702(a) bans the voluntary disclosure of content information by public providers, as well as the disclosure of noncontent records to any governmental entity. In the case of disclosure of contents, a provider can disclose information voluntarily in the following circumstances:

(1) to an addressee or intended recipient of such communication or an agent of such addressee or intended recipient;

(2) as otherwise authorized in section 2517, 2511(2)(a), or 2703 of this title;

(3) with the lawful consent of the originator or an addressee or intended recipient of such communication, or the subscriber in the case of remote computing service;

(4) to a person employed or authorized or whose facilities are used to forward such communication to its destination;

(5) as may be necessarily incident to the rendition of the service or to the protection of the rights or property of the provider of that service;

(6) to the National Center for Missing and Exploited Children, in connection with a report submitted thereto under section 2258A;

(7) to a law enforcement agency—

(A) if the contents—

(i) were inadvertently obtained by the service provider; and

(ii) appear to pertain to the commission of a crime; or

(B) [Deleted] or

(8) to a governmental entity, if the provider, in good faith, believes that an emergency involving danger of death or serious physical injury to any person requires disclosure without delay of communications relating to the emergency.

18 U.S.C. § 2702(b).

The exceptions for the disclosure of noncontent records are similar to those for contents. One important exception is that providers to the public are free to disclose noncontent information to “any person other than a governmental entity.” 18 U.S.C. § 2702(c)(6).
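The structure of Section 2702 lends itself to a short schematic: nonpublic providers face no SCA restriction, while public providers need an enumerated exception. In the hedged sketch below, the shorthand labels are assumptions keyed to the subsections quoted above, not statutory text.

    # Illustrative schematic (assumed labels) of § 2702's rules for the
    # voluntary disclosure of contents by a provider.
    CONTENT_EXCEPTIONS = {
        "addressee_or_agent",          # § 2702(b)(1)
        "otherwise_authorized",        # § 2702(b)(2): §§ 2517, 2511(2)(a), 2703
        "lawful_consent",              # § 2702(b)(3)
        "forwarding",                  # § 2702(b)(4)
        "service_or_self_protection",  # § 2702(b)(5)
        "ncmec_report",                # § 2702(b)(6)
        "inadvertent_and_criminal",    # § 2702(b)(7)
        "emergency_to_government",     # § 2702(b)(8)
    }

    def may_voluntarily_disclose_contents(serves_public, exception=None):
        if not serves_public:
            return True  # § 2702(a) restricts only providers "to the public"
        return exception in CONTENT_EXCEPTIONS

A parallel set could be written for the § 2702(c) noncontent exceptions, the most notable being disclosure to “any person other than a governmental entity.”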

(c) Voluntary Disclosure Versus Compelled Disclosure

One of the SCA’s fundamental distinctions is that between Section 2702’s voluntary disclosure rules and Section 2703’s compelled disclosure rules. In the former, the provider wishes to disclose records to a person or entity; in the latter, the government seeks information from the provider and uses the law to force the provider to disclose the information. With respect to voluntary disclosure, nonpublic providers can disclose without restriction, whereas providers of ECS or RCS to the public ordinarily cannot disclose either content or noncontent information unless an exception applies. In the case of contents, the facts must fit within one of Section 2702(b)’s eight exceptions. Similarly, in the case of noncontent records, the facts must fit within one of Section 2702(c)’s six exceptions.

In compelled disclosure, different levels of process can compel different kinds of information: the more process there is, the more information the government can obtain. For example, a search warrant can compel everything in a stored account. 18 U.S.C. § 2703(a)–(c). A Section 2703(d) order plus prior notice, on the other hand, compels everything except contents in temporary “electronic storage” for 180 days or less. 18 U.S.C. § 2703(b)(1)(B). Finally, only a simple administrative subpoena is needed to compel basic subscriber information. 18 U.S.C. § 2703(c)(2).

Although many interactions between the police and ISPs fall clearly into one of these categories—voluntary or compelled—some fall into a gray zone somewhere in between. There has been little judicial guidance, and the precise line between voluntary and compelled disclosure remains hazy, though Freedman v. America Online, Inc., 303 F. Supp. 2d 121 (D. Conn. 2004), provides some direction.


In Freedman, two police officers investigating a threatening e-mail sent from an America Online (AOL) account filled out a state warrant application and faxed it to AOL seeking the sender’s basic subscriber information. Freedman v. America Online, Inc., 303 F. Supp. 2d at 122–23. The officers did not actually submit the warrant application to a judge, however, rendering the warrant a legal nullity. Freedman v. America Online, Inc., 303 F. Supp. 2d at 123. AOL complied with the terms of the warrant form and faxed the suspect’s subscriber information back to the officers. Freedman v. America Online, Inc., 303 F. Supp. 2d at 123. The suspect later sued AOL and the two police officers for violating Section 2703. Freedman v. America Online, Inc., 303 F. Supp. 2d at 123. The police officers argued that they had merely requested the information, rather than actually requiring it as regulated by Section 2703. Freedman v. America Online, Inc., 303 F. Supp. 2d at 126–27. The court rejected this argument as “disingenuous.” Freedman v. America Online, Inc., 303 F. Supp. 2d at 127. The officers clearly intended AOL to comply with the request, and allowing them to circumvent Section 2703 by merely requesting information would “contradict[ ] Congress’s intent to protect personal privacy.” Freedman v. America Online, Inc., 303 F. Supp. 2d at 126. The court also rejected the argument that the emergency exception of Section 2702(c)(4) applied: AOL’s disclosure was not on its own initiative, the court noted, but was triggered by the officers’ request. Freedman v. America Online, Inc., 303 F. Supp. 2d at 128. Although Freedman leaves many issues unanswered, it suggests that disclosures will be presumed to fall under Section 2703 unless an exception under Section 2702 is affirmatively established.

§ 2.5.7 Interplay with Massachusetts Constitution

The SCA operates independently of the Massachusetts Constitution. For example, even if a search is reasonable under the Massachusetts Constitution, the SCA may bar the evidence. And even if a search is authorized by a judge under the SCA, the Massachusetts Constitution could still prohibit the activity. The case of Commonwealth v. Augustine, 467 Mass. 230 (2014), illustrates the interplay between the SCA and the Massachusetts Constitution. There, the Supreme Judicial Court of Massachusetts addressed the question of “whether, consistent with the Massachusetts Constitution, the Commonwealth may obtain from a cellular telephone service provider . . . historical cell site location information (CSLI) for a particular cellular telephone without first obtaining a search warrant supported by probable cause.” Commonwealth v. Augustine, 467 Mass. at 231. The court concluded that,

although the CSLI at issue here is a business record of the defendant’s cellular service provider, he had a reasonable expectation of privacy in it, and in the circumstances of this case—where the CSLI obtained covered a two-week period—the warrant requirement of art. 14 [of the Massachusetts Declaration of Rights] applies.

Commonwealth v. Augustine, 467 Mass. at 232.


In Augustine, the parties agreed that the SCA applied to the CSLI and that the Superior Court judge’s Section 2703(d) order was validly based on “specific and articulable facts showing there are reasonable grounds to believe” the CSLI records requested were “relevant and material to an ongoing criminal investigation.” Commonwealth v. Augustine, 467 Mass. at 236 (quoting 18 U.S.C. § 2703(d)) (internal quotation marks omitted). However, the parties disagreed on the constitutional sufficiency of this statutory standard. “Stated otherwise, the parties dispute[d] whether, under the Fourth Amendment or art. 14, the Commonwealth may obtain the CSLI from a cellular service provider solely on the basis of a § 2703(d) order, or may only do so by obtaining a search warrant based on probable cause.” Commonwealth v. Augustine, 467 Mass. at 236.

The court provided a brief explanation of cellular telephone technology:

The basic facts about how a cellular telephone works and how a cellular service provider keeps CSLI records are not in dispute. A cellular telephone communicates with the telephone network via radio waves. A cellular service provider has a network of base stations, also referred to as cell sites or cell towers, that essentially divides the provider’s service area into “sectors.” Cell site antennae send and receive signals from subscribers’ cellular telephones that are operating within a particular sector. Additionally, if a subscriber begins a call connected to a particular cell site and then moves closer to a different one, the call is automatically “handed off” to that closer cell site. When a subscriber makes or receives a call, the cellular service provider records the identity of the cell site utilized. Through such “network based location techniques,” a cellular service provider can approximate the location of any active cellular telephone handset within its network based on the handset’s communication with a particular cell site. As cellular telephone use has grown, cellular service providers have responded by adding new cell sites to accommodate additional customers. The number of cell sites in the United States has risen from 139,338 in 2002 to 301,779 in 2012, a more than twofold increase. When new cell sites are created, existing sectors become smaller, which, in turn, makes network based location tracking increasingly accurate.

Commonwealth v. Augustine, 467 Mass. at 237–39 (footnotes and citations omitted).

According to the court, the CSLI that the Commonwealth obtained from the defendant’s cellular service provider pursuant to the Section 2703(d) order

include[d], for a two week period (or somewhat longer . . . ) . . . the telephone numbers, the date and time, and the numbers of the cell sites used for all the calls made and received by the defendant’s cellular telephone handset—including, we infer from the § 2703(d) order, unanswered calls—as well as the latitude and longitude of the cell sites to which those calls connected in order to conduct those calls. SMS or short message service messages (text messages), Internet use, or any type of “registration” or “triangulated” data [were] not included.

Commonwealth v. Augustine, 467 Mass. at 239–40 (emphasis added).

The court next addressed the applicability of the third-party doctrine with respect to CSLI under Article 14. The doctrine has its roots in a pair of United States Supreme Court cases that predate cellular telephones. In United States v. Miller, 425 U.S. 435, 438–440 (1976), the Court considered whether the defendant had a Fourth Amendment privacy interest in his bank records, including his checks, deposit slips, and monthly statements. Reasoning that the documents were “business records of the banks,” the Court “perceive[d] no legitimate ‘expectation of privacy’ in their contents.” Specifically, the records contained information “voluntarily conveyed to the banks and exposed to their employees in the ordinary course of business,” and therefore “[t]he depositor takes the risk, in revealing his affairs to another, that the information will be conveyed by that person to the Government.” The Court concluded: “[T]he Fourth Amendment does not prohibit the obtaining of information revealed to a third party and conveyed by him to Government authorities, even if the information is revealed on the assumption that it will be used only for a limited purpose and the confidence placed in the third party will not be betrayed.”

Smith v. Maryland, 442 U.S. 735, 737, 742 (1979), presented the question whether the defendant had a legitimate expectation of privacy in the telephone numbers that he dialed on his home telephone. The telephone company, at police request, had installed a pen register—a mechanical device that records the telephone numbers dialed on a particular telephone—in order to capture information about the defendant Smith’s call history. Reasoning that “[t]elephone users . . . typically know that they must convey numerical information to the [telephone] company; that the [telephone] company has facilities for recording this information; and that the [telephone] company does in fact record this information for a variety of legitimate business purposes,” the Court rejected the notion that telephone subscribers “harbor any general expectation that the
numbers they dial will remain secret.” Applying the reasoning of Miller, the Court stated that, “[w]hen he used his [telephone], [the defendant] voluntarily conveyed numerical information to the telephone company and ‘exposed’ that information to its equipment in the ordinary course of business. In so doing, [the defendant] assumed the risk that the company would reveal to police the numbers he dialed.” Commonwealth v. Augustine, 467 Mass. at 242–43 (citations omitted).

Recognizing that the cellular telephone has become “an indispensable part of modern [American] life,” and that “cellular telephones physically accompany their users everywhere—almost permanent attachments to their bodies,” the court found that “the nature of cellular telephone technology and CSLI and the character of cellular telephone use in our current society render the third party doctrine of Miller and Smith inapposite; the digital age has altered dramatically the societal landscape from the 1970s, when Miller and Smith were written.” Commonwealth v. Augustine, 467 Mass. at 245–46 (citation omitted).

Turning to the nature of CSLI, the court found that CSLI, which tracks the location of a cellular telephone user, implicates the same privacy concerns as a GPS tracking device. In support of its finding, the court relied on its decision in Commonwealth v. Rousseau, 465 Mass. 372 (2013), a case involving the Commonwealth’s use of GPS to track a defendant’s vehicle. There the court stated:

[T]he government’s contemporaneous electronic monitoring of one’s comings and goings in public places invades one’s reasonable expectation of privacy. We conclude that under art. 14, a person may reasonably expect not to be subjected to extended GPS electronic surveillance by the government, targeted at his movements, without judicial oversight and a showing of probable cause.

Commonwealth v. Augustine, 467 Mass. at 247–48 (quoting Commonwealth v. Rousseau, 465 Mass. at 382).

The Augustine court also relied on the following passage from the New Jersey Supreme Court’s decision in State v. Earls, 214 N.J. 564, 586 (2013):

Using a [cellular telephone] to determine the location of its owner can be far more revealing than acquiring toll billing, bank, or Internet subscriber records. It is akin to using a tracking device and can function as a substitute for 24/7 surveillance without police having to confront the limits of their resources. It also involves a degree of intrusion that a reasonable person would not anticipate. . . . Location information gleaned from a [cellular telephone] provider can reveal not just where people go—which doctors, religious services, and stores they
visit—but also the people and groups they choose to affiliate with and when they actually do so. That information cuts across a broad range of personal ties with family, friends, political groups, health care providers, and others. . . . In other words, details about the location of a [cellular telephone] can provide an intimate picture of one’s daily life.

Commonwealth v. Augustine, 467 Mass. at 248 (citations omitted).

Finding that CSLI is substantively different from the types of information and records contemplated by Smith and Miller, the court concluded that it would be inappropriate to apply the third-party doctrine to CSLI:

In Smith, the information and related record sought by the government, namely, the record of telephone numbers dialed, was exactly the same information that the telephone subscriber had knowingly provided to the telephone company when he took the affirmative step of dialing the calls. The information conveyed also was central to the subscriber’s primary purpose for owning and using the cellular telephone: to communicate with others. No cellular telephone user, however, voluntarily conveys CSLI to his or her cellular service provider in the sense that he or she first identifies a discrete item of information or data point like a telephone number (or a check or deposit slip, as in Miller) and then transmits it to the provider. CSLI is purely a function and product of cellular telephone technology, created by the provider’s system network at the time that a cellular telephone call connects to a cell site. And at least with respect to calls received but not answered, this information would be unknown and unknowable to the telephone user in advance—or probably at any time until he or she receives a copy of the CSLI record itself. Moreover, it is of course the case that CSLI has no connection at all to the reason people use cellular telephones. See Earls, 214 N.J. at 587, 70 A.3d 630 (“People buy [cellular telephones] to communicate with others, to use the Internet, and for a growing number of other reasons. But no one buys a [cellular telephone] to share detailed information about their whereabouts with the police”). Moreover, the government here is not seeking to obtain information provided to the cellular service provider by the defendant. Rather, it is looking only for the location-identifying by-product of the cellular telephone technology—a serendipitous (but welcome) gift to law enforcement investigations. Finally, in terms of the privacy interest at stake here—the individual’s justifiable interest in not having “his comings and goings . . . continuously and contemporaneously monitored” by the government—the enormous difference between the cellular telephone in this case and the “land line”
telephone in Smith seems very relevant. In terms of location, a call log relating to a land line may indicate whether the subscriber is at home, but no more. But for a cellular telephone user carrying a telephone handset (as the defendant was), even CSLI limited to the cell site locations of telephone calls made and received may yield a treasure trove of very detailed and extensive information about the individual’s “comings and goings” in both public and private places; in this case, as mentioned, the defendant’s CSLI obtained by the Commonwealth covered at least sixty-four pages.

Commonwealth v. Augustine, 467 Mass. at 249–51 (footnote and citations omitted).

Having so concluded, the court next turned to the question of whether, given its capacity to track the movements of the cellular telephone user, CSLI implicates the defendant’s privacy interests to the extent that under Article 14 the government must obtain a search warrant in order to obtain it. Answering this question in the affirmative, the court noted as follows:

This distinction between privacy interests in public and private spaces makes CSLI especially problematic, because cellular telephones give off signals from within both spaces, and when the government seeks to obtain CSLI from a cellular service provider, it has no way of knowing in advance whether the CSLI will have originated from a private or public location. Given that art. 14 protects against warrantless intrusion into private places, we cannot ignore the probability that, as CSLI becomes more precise, cellular telephone users will be tracked in constitutionally protected areas.

Considering GPS vehicle location tracking, a number of courts—including this court—have determined that it is only when such tracking takes place over extended periods of time that the cumulative nature of the information collected implicates a privacy interest on the part of the individual who is the target of the tracking. This rationale has been extended to the context of CSLI. . . .

. . . .

GPS data and historical CSLI are linked at a fundamental level: they both implicate the same constitutionally protected interest—a person’s reasonable expectation of privacy—in the same manner—by tracking the person’s movements. Given this intrinsic link, it is likely that the duration of the period for which historical CSLI is sought will be a relevant consideration in the reasonable expectation of privacy calculus—that
there is some period of time for which the Commonwealth may obtain a person’s historical CSLI by meeting the standard for a § 2703(d) order alone, because the duration is too brief to implicate the person’s reasonable privacy interest. But there is no need to consider at this juncture what the boundaries of such a time period might be in this case because, for all the reasons previously rehearsed concerning the extent and character of cellular telephone use, the two weeks covered by the § 2703(d) order at issue exceeds it: even though restricted to telephone calls sent and received (answered or unanswered), the tracking of the defendant’s movements in the urban Boston area for two weeks was more than sufficient to intrude upon the defendant’s expectation of privacy safeguarded by art. 14.

Commonwealth v. Augustine, 467 Mass. at 252–53, 254–55 (footnotes and citations omitted).

Accordingly, in light of the particular circumstances of this case, the court vacated the Superior Court’s order allowing the defendant’s motion to suppress and remanded the case to the Superior Court for a determination of whether the Commonwealth’s application for the Section 2703(d) order met the requisite probable cause standard of Article 14. Commonwealth v. Augustine, 467 Mass. at 258.

Noteworthy in Augustine is the dissenting opinion of Justice Gants (with whom Justice Cordy joined), in which Justice Gants took issue with the court’s failure to distinguish, for constitutional purposes, “[t]elephone call CSLI (the type sought by the Commonwealth and ordered by the court in this case)” and “[r]egistration CSLI (the type not sought by the Commonwealth or ordered by the court, and therefore the type not at issue in this case).” Commonwealth v. Augustine, 467 Mass. at 258–59 (Gants, J., dissenting) (emphasis added). Explaining the difference between the two types of CSLI, Justice Gants stated as follows:

Telephone call CSLI . . . provides the approximate physical location (location points) of a cellular telephone only when a telephone call is made or received by that telephone. Registration CSLI . . . provides the approximate physical location of a cellular telephone every seven seconds unless the telephone is “powered off,” regardless of whether any telephone call is made to or from the telephone. Telephone call CSLI is episodic; the frequency of the location points depends on the frequency and duration of the telephone calls to and from the telephone. Registration CSLI, for all practical purposes, is continuous, and therefore is comparable to monitoring the past whereabouts of the telephone user through a global positioning system (GPS) tracking device on the telephone, although it provides
less precision than a GPS device regarding the telephone’s location.

Commonwealth v. Augustine, 467 Mass. at 258–59.

In Justice Gants’ view, telephone call CSLI is very similar to “traditional telephone records,” which long have been held to be, and are still under the holding of Augustine, governed by the third-party doctrine. Commonwealth v. Augustine, 467 Mass. at 260–61, 262. He stated as follows:

Telephone CSLI, like telephone toll records, also fits within the traditional justification for the third-party doctrine. Every person who uses a cellular telephone recognizes that the location of the telephone matters in determining whether there is cellular service and, where there is such service, in determining the quality of the telephone connection, which is why at least one cellular telephone company advertises “more bars in more places.” Therefore, every person who uses a cellular telephone recognizes, at least implicitly, that a cellular telephone company must identify the location of a cellular telephone, as well as the telephone number called, before a call can be successfully made from a cellular telephone. Accordingly, although a cellular telephone user may not know that the telephone company records and keeps this information, or want it kept, the user should know that location information, as well as the telephone number, must be provided to the telephone company whenever he makes or receives a telephone call.

Commonwealth v. Augustine, 467 Mass. at 263–64.

Finding “no principled reason why the third-party doctrine should apply to telephone toll records but not to telephone call CSLI,” Commonwealth v. Augustine, 467 Mass. at 265, Justice Gants concluded that

[b]ecause the court order in this case allowed only for production of telephone CSLI over a two-week period, not registration CSLI, I would conclude under the third-party doctrine that the defendant had no reasonable expectation of privacy in his location points when he was making or receiving telephone calls, and reverse the judge’s allowance of the motion to suppress.

Commonwealth v. Augustine, 467 Mass. at 268.

On remand, the Superior Court in Commonwealth v. Augustine, No. SUCR2011-10748, 2014 WL 4656604 (Mass. Super. Ct. Aug. 28, 2014), held that the Commonwealth’s application for the Section 2703(d) order did not meet Article 14’s probable cause standard and again suppressed the CSLI data. Specifically, this standard required the application to support the belief
that a particularly described offense has been, is being, or is about to be committed, and that the [CSLI being sought] will produce evidence of such offense or will aid in the apprehension of a person who the applicant has probable cause to believe has committed, is committing, or is about to commit such offense.

Commonwealth v. Augustine, 2014 WL 4656604 at *1 (emphasis, citations, and internal quotation marks omitted).

In Massachusetts, there must be “unique, specific and reliable facts creating a nexus between the alleged crime and the evidence sought or the place to be searched to demonstrate probable cause for a search warrant”—in other words, “a substantial basis to support a reasonable belief that [the suspect] committed the crimes.” Commonwealth v. Augustine, 2014 WL 4656604 at *9, *11. The court held that the application did demonstrate probable cause that the murder of which Augustine was accused occurred and that the victim’s car was set on fire, but that the application did not demonstrate probable cause that two weeks’ worth of Augustine’s CSLI “will produce evidence” of either crime. Commonwealth v. Augustine, 2014 WL 4656604 at *10. Essentially, the court believed the facts as documented in the application demonstrated only why Augustine was a suspect worthy of investigation, not that he had committed the crimes of which he was accused. Commonwealth v. Augustine, 2014 WL 4656604 at *10–11. Because the application did not relate facts creating the nexus between the crimes and the evidence sought that is necessary for a search warrant, the CSLI data was suppressed. Commonwealth v. Augustine, 2014 WL 4656604 at *11–12.

The Commonwealth sought interlocutory review of the Superior Court’s order. Commonwealth v. Augustine, 472 Mass. 448, 450 (2015). The Supreme Judicial Court agreed with the lower court that the Section 2703(d) application did demonstrate probable cause that the murder and arson occurred, but disagreed that the application failed to demonstrate probable cause that the CSLI “will produce evidence” of either offense. Commonwealth v. Augustine, 472 Mass. at 454. Unlike the lower court, the Supreme Judicial Court believed the facts as laid out in the application provided a substantial basis for concluding that Augustine committed the crimes, and it therefore reversed the order allowing Augustine’s motion to suppress. Commonwealth v. Augustine, 472 Mass. at 458–60.

§ 2.5.8 Interplay with Fourth Amendment

The SCA also operates independently of the Fourth Amendment. United States v. Warshak, 631 F.3d 266 (6th Cir. 2010), illustrates the interplay of the Fourth Amendment and the SCA. There, the Sixth Circuit held that subscribers have reasonable privacy expectations in their e-mail content “stored with, or sent or received through, a commercial ISP,” and consequently, the government must have a warrant founded on probable cause to compel a commercial ISP to turn over the subscribers’ e-mail content. United States v. Warshak, 631 F.3d at 288 (citations and internal quotation marks omitted). The court stated: “Therefore, because they did not obtain a warrant, the government agents violated the Fourth Amendment when they obtained the
contents of Warshak’s emails. Moreover, to the extent that the SCA purports to permit the government to obtain such emails warrantlessly, the SCA is unconstitutional.” United States v. Warshak, 631 F.3d at 288.

More recently, in United States v. Carpenter, 819 F.3d 880 (6th Cir. 2016), cert. granted, 85 U.S.L.W. 3567 (U.S. June 5, 2017) (No. 16-402), the court dealt with the intersection of the SCA with the Fourth Amendment. Carpenter and his accomplices were arrested for committing several armed robberies around Detroit, Michigan. United States v. Carpenter, 819 F.3d at 884. The FBI applied for up to 127 days’ worth of cell phone records, including CSLI, under Section 2703(d). United States v. Carpenter, 819 F.3d at 884. In the urban Detroit area, the CSLI could approximate a phone’s location to within a half mile to two miles. United States v. Carpenter, 819 F.3d at 885. Before trial, the defendants moved to suppress the CSLI, arguing that under the Fourth Amendment the records could be seized only with a warrant based on probable cause, but their motion was denied. United States v. Carpenter, 819 F.3d at 884. The FBI then used the CSLI to create maps showing the defendants’ phones were located near the robberies when each happened; the defendants were subsequently convicted. United States v. Carpenter, 819 F.3d at 884. On appeal, the defendants challenged the denial of their motion to suppress. United States v. Carpenter, 819 F.3d at 884.

In affirming the lower court’s decision, the Sixth Circuit highlighted the distinction between content and noncontent information, specifically that a communication’s contents are private, but “the information necessary to get those communications from point A to point B is not.” United States v. Carpenter, 819 F.3d at 886. Unlike the Massachusetts Supreme Judicial Court in Augustine, the Carpenter court affirmatively analogized the CSLI to the phone numbers captured by the pen register in Smith v. Maryland. United States v. Carpenter, 819 F.3d at 887; see also the discussion above at § 2.5.7. Like Smith, the Sixth Circuit viewed CSLI as simply routing information that phone companies collect as part of their ordinary business. United States v. Carpenter, 819 F.3d at 887. Subscribers realize that they must convey this location information to phone companies so that their calls can be connected, and therefore subscribers can have no expectation of privacy in it. United States v. Carpenter, 819 F.3d at 888. Because CSLI is a business record that does not reveal anything about the contents of a call, it is subject to the third-party doctrine and is therefore unprotected by the Fourth Amendment. United States v. Carpenter, 819 F.3d at 888.

The defendants tried to use the Supreme Court’s recent Fourth Amendment case dealing with GPS data, United States v. Jones, 565 U.S. 400 (2012), for support. In Jones, the Court unanimously held that the government’s installation of a GPS unit on Jones’s vehicle to track his movements constituted a search under the Fourth Amendment for which a warrant was required; however, the Court split on the reasons underlying that overarching conclusion. United States v. Jones, 565 U.S. at 404, 413–14, 419, 430–31. The defendants latched onto the concurrences of Justices Sotomayor and Alito, together representing the opinions of five justices, that “longer term GPS monitoring in government investigations of most offenses impinges on expectations of privacy.” United States v. Carpenter, 819 F.3d at 888 (citations and internal quotation marks omitted). However, the Sixth Circuit was unmoved, noting
first that the government action at issue in Carpenter was quite different from that in Jones, and for Fourth Amendment purposes “[w]hether a defendant ha[s] a legitimate expectation of privacy in certain information depends in part on what the government did to get it.” United States v. Carpenter, 819 F.3d at 888. In Jones, a covertly placed GPS device monitored an individual’s movements continuously for weeks, whereas in Carpenter, the government simply collected business records. United States v. Carpenter, 819 F.3d at 889. Business records, the court reiterated, are covered by the Smith standard. United States v. Carpenter, 819 F.3d at 889. Second, the defendants could not rely on Jones because the accuracy of GPS and CSLI are vastly different. United States v. Carpenter, 819 F.3d at 889. In other words, the case at bar was not a GPS tracking case for which Jones was binding precedent. United States v. Carpenter, 819 F.3d at 889. GPS can be accurate to within fifty feet, whereas the CSLI at issue in this case could pinpoint a cell phone’s location only to between a half mile and two miles. United States v. Carpenter, 819 F.3d at 889. CSLI cannot track an individual’s movements with the same specificity as GPS can, and thus the Fourth Amendment implications are not the same. United States v. Carpenter, 819 F.3d at 889.

Lastly, the court took time to comment on another reason why courts should exercise restraint when faced with “constitutional judgments” such as that in Carpenter:

Constitutional judgments typically rest in part on a set of empirical assumptions. When those assumptions concern subjects that judges know well—say, traffic stops—courts are well-equipped to make judgments that strike a reasonable balance among the competing interests at stake. But sometimes new technologies—say, the latest iterations of smartphones or social media—evolve at rates more common to superbugs than to large mammals. In those situations judges are less good at evaluating the empirical assumptions that underlie their constitutional judgments. Indeed the answers to those empirical questions might change as quickly as the technology itself does . . . . Congress is usually better equipped than courts are to answer the empirical questions that such technologies present. Thus, “[w]hen technologies are new and their impact remains uncertain, statutory rules governing law enforcement powers will tend to be more sophisticated, comprehensive, forward-thinking, and flexible than rules created by the judicial branch.” These concerns favor leaving undisturbed the Congressional judgment here.

United States v. Carpenter, 819 F.3d at 890 (citations omitted).

In her concurrence, Judge Stranch wrote to express her concerns “that the sheer quantity of sensitive information procured without a warrant in this case raises Fourth Amendment concerns of the type the Supreme Court and our circuit acknowledged in United States v. Jones and in United States v. Skinner, 690 F.3d 772, 780 (6th Cir. 2012).” United States v. Carpenter, 819 F.3d at 893 (Stranch, J., concurring) (citations omitted). Despite her concerns, she concurred with the majority that
denial of the defendants’ motion to suppress was appropriate under the good-faith exception to the exclusionary rule. United States v. Carpenter, 819 F.3d at 894, 896. Judge Stranch began her opinion by noting that she is inclined to agree with Justice Sotomayor that

it may be necessary to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties. This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks.

United States v. Carpenter, 819 F.3d at 894, 896 (quoting Jones, 132 S. Ct. at 957 (Sotomayor, J., concurring)) (citations and internal quotation marks omitted). Because Carpenter “resides at the intersection of the law governing tracking of personal location and the law governing privacy interests in business records,” the case exemplified to Judge Stranch why the existing Fourth Amendment tests are problematic. United States v. Carpenter, 819 F.3d at 895.

In framing her analysis, she referenced the Sixth Circuit’s decision in Skinner. There, the court acknowledged Justice Alito’s concern in Jones that long-term government GPS monitoring during investigations impinges on privacy expectations, but held that the government’s use of three days’ worth of cell phone GPS data to track the defendant Skinner was “relatively short-term monitoring of [his] movements on public streets [which] accords with expectations of privacy that our society has recognized as reasonable.” United States v. Carpenter, 819 F.3d at 895 (citations and internal quotation marks omitted). Judge Stranch noted that the court’s holding had a caveat: “There may be situations where police, using otherwise legal methods, so comprehensively track a person’s activities that the very comprehensiveness of the tracking is unreasonable for Fourth Amendment purposes.” United States v. Carpenter, 819 F.3d at 895 (citations and internal quotation marks omitted).

To Judge Stranch, in light of Skinner and Jones, the business records test was not the appropriate lens through which to view records that “reflect personal location,” even if less precise than GPS data, when, as here, the government obtained up to 127 days (i.e., four months) of CSLI records for defendant Carpenter. United States v. Carpenter, 819 F.3d at 895. In sum, Judge Stranch wrote:

At issue here is neither relatively innocuous routing information nor precise GPS locator information: it is personal location information that partakes of both. I am also concerned about the applicability of a test that appears to admit to no limitation on the quantity of records or the length of time for which such records may be compelled. I conclude that our precedent suggests the need to develop a new test to determine when a warrant may be necessary under these or comparable circumstances.
United States v. Carpenter, 819 F.3d at 895–96. Even so, Judge Stranch concluded that the CSLI in this case was admissible under the good-faith exception to the exclusionary rule. United States v. Carpenter, 819 F.3d at 894, 896.

Carpenter’s petition for a writ of certiorari was granted in June 2017. Carpenter v. United States, 85 U.S.L.W. 3567 (U.S. June 5, 2017) (No. 16-402). The question presented is: “Whether the warrantless seizure and search of historical cell phone records revealing the location and movements of a cell phone user over the course of 127 days is permitted by the Fourth Amendment.” Carpenter v. United States, No. 16-402 (U.S.), https://www.supremecourt.gov/qp/16-00402qp.pdf.
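Readers sometimes find it helpful to see the accuracy gap the Carpenter panel relied on expressed numerically. The short Python sketch below does only that arithmetic: it compares the area of a circular region of uncertainty implied by a fifty-foot GPS fix against the half-mile and two-mile CSLI figures quoted by the court. The circular-region model is our simplifying assumption for illustration; actual CSLI precision varies with tower density and antenna sector geometry.

    import math

    FEET_PER_MILE = 5280.0

    def uncertainty_area_sq_miles(radius_feet: float) -> float:
        """Area, in square miles, of a circular region of location uncertainty."""
        radius_miles = radius_feet / FEET_PER_MILE
        return math.pi * radius_miles ** 2

    gps = uncertainty_area_sq_miles(50)                          # GPS: ~50 feet
    csli_best = uncertainty_area_sq_miles(0.5 * FEET_PER_MILE)   # CSLI: half-mile
    csli_worst = uncertainty_area_sq_miles(2.0 * FEET_PER_MILE)  # CSLI: two miles

    print(f"GPS:          {gps:.6f} sq. mi.")
    print(f"CSLI (best):  {csli_best:.2f} sq. mi. (~{csli_best / gps:,.0f}x GPS)")
    print(f"CSLI (worst): {csli_worst:.2f} sq. mi. (~{csli_worst / gps:,.0f}x GPS)")

On these assumptions, the best-case CSLI region is roughly 2,800 times the size of the GPS region, and the worst-case region roughly 45,000 times, which is the practical sense in which the panel concluded that Carpenter was not a GPS tracking case governed by Jones.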

§ 2.5.9 Consent and Discovery Under the SCA

The SCA includes no exception authorizing disclosure of electronic communications based on a civil discovery subpoena. Mintz v. Mark Bartelstein & Assocs. Inc., 885 F. Supp. 2d 987, 991–94 (C.D. Cal. 2012); see also Bower v. Bower, 808 F. Supp. 2d 348, 350 (D. Mass. 2011) (no exception for civil discovery subpoenas; “courts have repeatedly held that providers such as Yahoo! and Google may not produce emails in response to civil discovery subpoenas”); Crispin v. Christian Audigier, Inc., 717 F. Supp. 2d 965, 975–76 (C.D. Cal. 2010) (rejecting argument that SCA permits disclosure of contents of communications pursuant to civil discovery subpoena); Flagg v. City of Detroit, 252 F.R.D. 346, 350 (E.D. Mich. 2008) (“[A]s noted by the courts and commentators alike, § 2702 lacks any language that explicitly authorizes a service provider to divulge the contents of a communication pursuant to a subpoena or court order.”); Viacom Int’l Inc. v. YouTube Inc., 253 F.R.D. 256, 264 (S.D.N.Y. 2008) (holding that SCA “contains no exception for disclosure of such communications pursuant to civil discovery requests”); In re Subpoena Duces Tecum to AOL, LLC, 550 F. Supp. 2d 606, 611 (E.D. Va. 2008) (“Applying the clear and unambiguous language of § 2702 to this case, AOL, a corporation that provides electronic communication services to the public, may not divulge the contents of [a person’s] electronic communications to [an insurance company] because the statutory language of the [SCA] does not include an exception for the disclosure of electronic communications pursuant to civil discovery subpoenas.”); O’Grady v. Superior Court, 139 Cal. App. 4th 1423, 1447, 44 Cal. Rptr. 3d 72 (2006) (“Since the [SCA] makes no exception for civil discovery and no repugnancy has been shown between a denial of such discovery and congressional intent or purpose, the Act must be applied, in accordance with its plain terms, to render unenforceable the subpoenas seeking to compel [electronic communication services] to disclose the contents of e-mails stored on their facilities.”).

So, if a subpoena served on the ISP will not work, how does a party obtain the contents of stored electronic communications in civil litigation? The short answer is this: Direct the discovery to a sender, recipient, addressee, or subscriber who exercises control over the communications.


One of the principles governing discovery is that responsive documents may include those that are under the “control” of the target of discovery and are not limited to those in the party’s possession or custody. And, as noted previously, the SCA does authorize an ECS or RCS provider to divulge the contents of communications when “lawful consent” has been given to do so. Who may give consent depends on whether the service is an ECS or an RCS. In both instances, the originator, addressee, and intended recipient of a communication may give consent; the subscriber to the RCS holding the communication—that is, the account holder—may also give consent. O’Grady v. Superior Court, 139 Cal. App. 4th at 1447.

The focus here on the contents of communications should not obscure the availability under the SCA of noncontent information, at least to nongovernmental litigants. 18 U.S.C. § 2702(c)(6), (a)(3). Even though the content may be important, it may not be the only valuable information pertaining to an electronic communication. For instance, in Viacom International v. YouTube Inc., 253 F.R.D. at 264–65, the plaintiffs successfully forced YouTube to produce information such as the number of times that certain “private” videos had been viewed or made accessible, even though YouTube was able to protect the videos themselves—that is, their content—from production.

The discovery dispute in Flagg v. City of Detroit, 252 F.R.D. 346 (E.D. Mich. 2008), is instructive. There, the plaintiff issued subpoenas to SkyTel, a nonparty service provider, seeking discovery of text messages exchanged among certain officials of the city that were still retained by SkyTel. Flagg v. City of Detroit, 252 F.R.D. at 347–48. The defendants moved to prevent discovery of such communications, arguing that the SCA prohibited SkyTel from producing the text messages under the subpoenas. Flagg v. City of Detroit, 252 F.R.D. at 348–50. Pointing to a more effective way, in most instances, to obtain electronically stored communications and their contents, the court held that the SCA did not preclude civil discovery of such text messages, since they remained within the city’s control. Flagg v. City of Detroit, 252 F.R.D. at 352–64.

More specifically, in addressing the defendants’ SCA-based challenge, the court found it instructive to consider whether the plaintiff, rather than choosing third-party subpoenas as the vehicle for seeking production of the text messages, could have achieved the same objective through an ordinary Fed. R. Civ. P. 34 request for production directed at the city. Answering this question in the affirmative, the court noted:

As the language of the Rule [34(a)] makes clear, and as the courts have confirmed, a request for production need not be confined to documents or other items in a party’s possession, but instead may properly extend to items that are in that party’s “control.”


The case law illustrates the variety of circumstances under which a party may be deemed to have “control” over materials not in its possession. First, the requisite “legal right to obtain” documents has been found in contractual provisions that confer a right of access to the requested materials. The courts also have held that documents in the possession of a party’s agent—for example, an attorney—are considered to be within the party’s control. Next, the courts have found that a corporate party may be deemed to have control over documents in the possession of one of its officers or employees.

Flagg v. City of Detroit, 252 F.R.D. at 353 (citations omitted). In addition, the court noted:

Indeed, this principle [of “control”] extends not just to documents in the actual possession of a non-party officer or employee of a corporate party, but also to materials that the officer or employee has a legal right to obtain. In Herbst v. Able, 63 F.R.D. 135, 136 (S.D.N.Y. 1972), for instance, the plaintiffs sought the production of transcripts of testimony given by non-party employees of the defendant corporation, Douglas Aircraft Company, at a private hearing before the Securities and Exchange Commission (“SEC”). Douglas Aircraft objected to this request, stating that it did not have copies of these transcripts in its possession, and citing an SEC policy not to make such transcripts available to private litigants. Under another SEC rule, however, each witness was entitled to a transcript of his or her own testimony. In light of this rule, the court held that the plaintiffs were entitled to the requested transcripts, which Douglas Aircraft could obtain through its employees: “Rule 34(a) plainly provides that a party may request another party to produce any designated document which is within the possession, custody or control of the party of whom the request is made. Plaintiffs, consequently, may request Douglas to have its non-defendant employees procure copies of their private testimony before the SEC so that Douglas may give same to plaintiffs. Plainly Douglas’ employees are persons within its control. The testimony of these employees relates to Douglas’ affairs.”

Flagg v. City of Detroit, 252 F.R.D. at 354 (citation omitted). Applying Rule 34, the court readily concluded that the city had “control” over the text messages preserved by third-party SkyTel pursuant to its contractual relationship with the city. Flagg v. City of Detroit, 252 F.R.D. at 354–58.


The court next turned to the SCA, finding that it does not override the defendants’ obligation to produce relevant nonprivileged electronic communications within their possession, custody, or control. Specifically, the court concluded that the archive maintained by SkyTel constitutes “computer storage,” and that the company’s maintenance of this archive on behalf of the city is a “remote computing service” as defined under the SCA. The court continued:

It is only a short step from this finding to the conclusion that the Defendant City is both able and obligated to give its consent, as subscriber, to SkyTel’s retrieval of text messages so that the City may comply with a Rule 34 request for their production. As previously discussed, a party has an obligation under Rule 34 to produce materials within its control, and this obligation carries with it the attendant duty to take the steps necessary to exercise this control and retrieve the requested documents. Moreover, the Court already has explained that a party’s disinclination to exercise this control is immaterial, just as it is immaterial whether a party might prefer not to produce documents in its possession or custody. Because the SkyTel archive includes communications that are potentially relevant and otherwise discoverable under the standards of Rule 26(b)(1), and because the City has “control” over this archive within the meaning of Rule 34(a)(1) and the case law construing this term, the City must give any consent that might be required under the SCA in order to permit SkyTel to retrieve communications from this archive and forward them to the Magistrate Judges in accordance with the protocol established in this Court’s March 20, 2008 order.

Flagg v. City of Detroit, 252 F.R.D. at 363. To hedge its bets, the court also stated as follows:

Alternatively, even if the Court is mistaken in its conclusion that the service provided by SkyTel is an RCS, there is ample basis to conclude that the City nonetheless has an obligation to secure the requisite consent from its employees that would permit SkyTel to proceed with its retrieval of communications. This, after all, is precisely what the courts have held in the Rule 34 case law discussed earlier, including Riddell Sports, 158 F.R.D. at 559, Herbst, 63 F.R.D. at 138, and In re Domestic Air Transportation Antitrust Litigation, 142 F.R.D. at 356. In particular, Riddell Sports, 158 F.R.D. at 559, holds that a corporate party has control over, and thus may be compelled to
produce, documents in the possession of one of its officers or employees, and that the officer or employee has a fiduciary duty to turn such materials over to the corporation on demand. Next, Herbst, 63 F.R.D. at 138, and In re Domestic Air Transportation Antitrust Litigation, 142 F.R.D. at 356, illustrate the principle that the Rule 34(a) concept of “control” extends to a company’s control over its employees, such that a corporate party may be compelled to secure an employee’s consent as necessary to gain access to materials that the employee has the right to obtain. In accordance with these authorities, the Court finds that the City of Detroit is both able and obligated to obtain any consent from its employees that would be necessary to permit SkyTel to retrieve the communications of City employees from its archive and forward them to the Magistrate Judges for review.

Flagg v. City of Detroit, 252 F.R.D. at 363–64. Finally, the court declined to address the question of whether its analysis and conclusions continue to hold true where production is sought directly from a nonparty, rather than from a party that retains control over materials in the nonparty’s possession. The court stated:

The Court finds it best to avoid this question, and to instead insist that Plaintiff reformulate his third-party subpoena as a Rule 34 request for production directed at the Defendant City. If Plaintiff were to continue to proceed via a third-party subpoena, it seems apparent that SkyTel’s compliance would qualify as “divulg[ing]” the contents of communications within the meaning of § 2702(a), and that, as Defendants have argued, this disclosure could only be made with the “lawful consent” referred to in § 2702(b)(3). Moreover, while Rule 34 and its attendant case law provide clear authority for insisting that a party consent to the disclosure of materials within its control, there is very little case law that confirms the power of a court to compel a party’s consent to the disclosure of materials pursuant to a third-party subpoena. In an effort to avoid such potentially difficult questions where a more straightforward path is readily available, the Court instructs Plaintiff to prepare and serve a Rule 34 request for production of the relevant text messages maintained by SkyTel on behalf of the Defendant City. The City shall then forward this discovery request to SkyTel, and SkyTel, in turn, shall proceed in accordance with the protocol set forth in the Court’s March 20, 2008 order. By directing the parties to proceed in this manner, the Court obviates the need to determine what
powers it might possess to compel a service provider such as SkyTel to comply with a third-party subpoena, and the Court leaves this question for another day. Rather, because production will be sought under Rule 34, the Court may resort to the usual mechanisms for ensuring the parties’ compliance.

Flagg v. City of Detroit, 252 F.R.D. at 366. Accordingly, the court directed the plaintiff to prepare and serve an appropriate Rule 34 request for production directed at the city. Flagg v. City of Detroit, 252 F.R.D. at 367.

Practice Note
The key takeaway is this: If seeking the contents of electronic communications that fall under the SCA from a network service provider (such as an ISP), a party should analyze who can give consent to the disclosure and who has control (even indirect control) over the communications. If that person resists discovery, the party should point out the fact of control and demand that the person either give consent to the provider’s disclosure or obtain the contents from the provider and deliver them to the requesting party.

§ 2.6 CONCLUSION

The basic statutory structure of protections for the Internet and e-mail that we have today is based on the ECPA, which has remained largely unchanged since 1986. Back then, of course, most of us did not even know about e-mail or the Internet. Obviously, with the development of the Internet, e-mail, and the vast array of other new technologies throughout the past three decades, a lot has changed since then. It is widely acknowledged that the ECPA has failed to keep pace with new technologies and has significant gaps in need of legislative attention.

Sadly, the ECPA also remains poorly understood, despite its obvious importance. As noted by Orin Kerr, a leading scholar in electronic surveillance law, the SCA is “famously complex, if not entirely impenetrable.” Orin S. Kerr, “Lifting the ‘Fog’ of Internet Surveillance: How a Suppression Remedy Would Change Computer Crime Law,” 54 Hastings L.J. 805, 820 (2003). Echoing Kerr, the Ninth Circuit has described the federal electronic surveillance statutes as “convoluted,” and “confusing and uncertain.” Konop v. Hawaiian Airlines, Inc., 302 F.3d 868, 874 (9th Cir. 2002). When asked, “What kind of protection do the federal statutes provide when the government wants to read my email?,” Daniel Solove, another leading scholar in electronic surveillance law, responded:

The answer is immensely complicated. There are at least three statutes that regulate email, and all are part of ECPA—the Wiretap Act, the Stored Communications Act, and the Pen Register Act. Each provides very different levels of protection.


.... Depending upon the type of email I use, how it is stored, and how the government tries to access it, it will be covered by the Wiretap Act, various parts of the Stored Communications Act, the Pen Register Act, or none of the above.

Daniel J. Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security 167–68 (Yale University Press 2011). To demonstrate his point, Solove presents the following hypothetical:

Now suppose I send an email to you, and the government reads it before you receive it. Does the Wiretap Act apply? Maybe. Email travels differently than do phone calls. When I send you an email, it goes to an ISP, where it sits until you download it. If the government gets it while it is traveling from my computer to the ISP, or from the ISP to your computer, then the Wiretap Act’s protections will probably apply. But what if the government gets the email while it is sitting on the ISP’s server, waiting for you to download it? Now it’s stored, and it isn’t covered by the Wiretap Act but instead is protected by the Stored Communications Act, which provides lesser protections.

Daniel J. Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security at 168.
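Solove’s distinction between interception in transit and acquisition from storage maps onto two different technical operations. The Python standard-library sketch below is purely illustrative: the host names, addresses, and credentials are placeholders, and nothing here purports to state when a given statute in fact applies.

    import imaplib
    import smtplib
    from email.message import EmailMessage

    # Stage 1: transmission. While the message moves from the sender's
    # machine to the provider's server (and on to the recipient's
    # provider), it is a communication "in transit" -- the stage Solove
    # associates with the Wiretap Act. (Hosts and credentials below are
    # placeholders.)
    msg = EmailMessage()
    msg["From"] = "sender@example.com"
    msg["To"] = "recipient@example.com"
    msg["Subject"] = "Hello"
    msg.set_content("A message is 'in transit' only while it is being sent.")

    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login("sender@example.com", "app-password")
        smtp.send_message(msg)  # transit ends when the server accepts the message

    # Stage 2: storage. The same message now sits on the provider's server
    # until the recipient retrieves it -- the stored-communication stage
    # that Solove associates with the SCA's lesser protections.
    imap = imaplib.IMAP4_SSL("imap.example.com")
    imap.login("recipient@example.com", "app-password")
    imap.select("INBOX")
    _, ids = imap.search(None, "UNSEEN")       # messages still awaiting download
    for num in ids[0].split():
        _, data = imap.fetch(num, "(RFC822)")  # retrieval from server storage
    imap.logout()

The point of separating the two stages is simply that the same message changes statutory character depending on when the government acquires it, which is the anomaly Solove goes on to develop.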

He then continues the hypothetical:

What if the government wants to obtain my webmail? I access my work email messages from my Web browser rather than download them to my computer. I also have a Gmail account and Yahoo [sic] email account. I keep a lot of archived messages in these accounts long after I’ve read them. The Stored Communications Act protects communications in “electronic storage,” so it seemingly applies. But the answer isn’t that easy. Webmail doesn’t readily fit into the statutory framework designed long before most other forms of webmail existed. Common sense would suggest that webmail is an ECS because it is an email service and email is stored electronically. But “electronic storage” is defined as “any temporary, intermediate storage” that is “incidental” to the communication and “any storage of such communication by an electronic communications service for purpose of backup protection of such communication.” The language is clunky and confusing. It is clear
that email sitting on the ISP’s server waiting to be downloaded is in “electronic storage,” which is what the drafters of the statute had in mind. The law was written in the days when people accessed their email by dialing in with a modem and downloading it to their computers. No matter how prescient, the members of Congress could not predict that a company like Google would come along and offer people free email accounts with many gigabytes of storage space. Because messages are stored indefinitely in a person’s webmail, according to the Department of Justice’s interpretation, the email is no longer in temporary storage and is “simply a remotely stored file.” And email messages might not be stored for “backup protection” because that was meant as backup protection for ISPs, not by a person for her own personal use. Therefore, under this view, my use of webmail to archive messages doesn’t fall within the definition of an ECS. Maybe it’s an RCS then, subject to weaker protections, though even that isn’t clear. Based on the practices back in 1986, the law says that if a provider of computer storage accesses people’s content for anything except “storage or computer processing,” then it is no longer an RCS. Gmail and other webmail services access people’s content to deliver advertisements, so they might not be an RCS.

Daniel J. Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security at 168–70. Finally, Solove states:

And there’s more. Email headers—the to/from lines of my email—are regulated by a different statute—the Pen Register Act. I could go on, but I’ll spare you further details. My purpose has been to demonstrate that trying to fit ever-changing technologies into antiquated rules can become confusing and counterproductive.

Daniel J. Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security at 170 (emphasis added).
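Solove’s walk-through amounts to a rough sorting rule keyed to where an email sits in its life cycle. The toy Python function below merely restates his description in code; it is our paraphrase for illustration, not a statement of the law, and, as he stresses, several of the categories (webmail above all) are unsettled.

    def likely_regime(stage: str) -> str:
        """One reading of Solove's taxonomy of ECPA coverage for email.
        Illustrative only; courts and the Department of Justice disagree
        about several of these categories."""
        if stage == "in transit":                  # moving between sender, ISP, recipient
            return "Wiretap Act"
        if stage == "on server, awaiting download":
            return "Stored Communications Act ('electronic storage' at an ECS)"
        if stage == "archived webmail":            # read long ago, left on the server
            return "unclear: perhaps an RCS under the SCA, perhaps neither ECS nor RCS"
        if stage == "headers only":                # the to/from routing information
            return "Pen Register Act"
        return "unresolved under the 1986 framework"

    for s in ("in transit", "on server, awaiting download",
              "archived webmail", "headers only"):
        print(f"{s:30s} -> {likely_regime(s)}")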
So, what are we to make of this? It is telling (and somewhat ironic) that the SCA, enacted in 1986 to address the lack of Fourth Amendment privacy protections for stored electronic communications (due to the Fourth Amendment third-party and private search doctrines at the time), is now being eclipsed by cases such as Warshak and Augustine. In those two cases, discussed in § 2.5.7, above, the courts found that individuals have a “reasonable expectation of privacy” in their electronic communications. In Warshak, this was e-mail contents, while in Augustine, it was telephone call CSLI. Furthermore, the courts in these cases also held that, despite compliance with the judicial process set forth in the SCA, the government is prohibited from obtaining warrantless access to such information under applicable constitutional law (i.e., the Fourth Amendment in Warshak and Article 14 of the Massachusetts Declaration of Rights in Augustine).

Warshak and Augustine remind us of the critical role the courts must play to “keep pace with the inexorable march of technological progress,” creating rules that can evolve as technology develops. United States v. Warshak, 631 F.3d at 285. Until the ECPA is amended, it is anticipated that there will be more court decisions like Warshak and Augustine that break new ground and attempt to resolve the privacy imbalances found in the statute. Already, we have seen some. See, e.g., In re United States for an Order Authorizing the Release of Historical Cell-Site Info., 809 F. Supp. 2d 113, 127 (E.D.N.Y. 2011) (requiring search warrant to obtain historical CSLI records, stating: “While the government’s monitoring of our thoughts may be the archetypical Orwellian intrusion, the government’s surveillance of our movements over a considerable time period through new technologies, such as the collection of cell-site-location records, without the protections of the Fourth Amendment, puts our country far closer to Oceania than our Constitution permits. It is time that the courts begin to address whether revolutionary changes in technology require changes to existing Fourth Amendment doctrine. Here, the court concludes only that existing Fourth Amendment doctrine must be interpreted so as to afford constitutional protection to the cumulative cell-site-location records requested here” (emphasis added)).

MCLE and the author acknowledge Joy S. Naifeh, a third-year student at the University of Maine School of Law, for her assistance in updating this chapter.


EXHIBIT 2A—Wiretap Report 2016

WIRETAP REPORT 2016
Last updated on December 31, 2016

This report covers intercepts concluded between January 1, 2016, and December 31, 2016, as reported to the AO, and provides supplementary information reported to the AO on arrests and convictions resulting from intercepts concluded in prior years. Forty-eight jurisdictions (the federal government, the District of Columbia, the Virgin Islands, Puerto Rico, and 44 states) currently have laws that authorize courts to issue orders permitting wire, oral, or electronic surveillance. Table 1 shows that a total of 27 jurisdictions reported using at least one of these types of surveillance as an investigative tool during 2016.

Summary and Analysis of Reports by Judges

The number of federal and state wiretaps reported in 2016 decreased 24 percent from 2015. A total of 3,168 wiretaps were reported as authorized in 2016, with 1,551 authorized by federal judges and 1,617 authorized by state judges. Compared to the applications approved during 2015, the number approved by federal judges increased 11 percent in 2016, and the number approved by state judges decreased 41 percent. The largest reduction in reported state wiretap applications occurred in California, where 50 percent fewer applications were reported. Two wiretap applications were reported as denied in 2016.

In 26 states, a total of 107 separate local jurisdictions (including counties, cities, and judicial districts) reported wiretap applications for 2016. Applications concentrated in six states (California, New York, Colorado, Nevada, Florida, and New Jersey) accounted for 82 percent of all state wiretap applications. Applications in California alone constituted 35 percent of all applications approved by state judges.


Seventy-seven federal jurisdictions submitted reports of wiretap applications for 2016. For the third year in a row, the District of Arizona authorized the most federal wiretaps, approximately 9 percent of the applications approved by federal judges. Federal judges and state judges reported the authorization of 600 wiretaps and 177 wiretaps, respectively, for which the AO received no corresponding data from prosecuting officials. Wiretap Tables A-1 and B-1 (which will become available online after July 1, 2017, at http://www.uscourts.gov/statistics-reports/analysis-reports/wiretap-reports) contain information from judge and prosecutor reports submitted for 2016. The entry “NP” (no prosecutor’s report) appears in these tables whenever a prosecutor’s report was not submitted. Some prosecutors may have delayed filing reports to avoid jeopardizing ongoing investigations, and some reports require additional information to comply with reporting requirements or were received too late to include in this document. Information about these wiretaps should appear in future reports.

Intercept Orders, Extensions, and Locations

Table 2 presents the number of intercept orders issued in each jurisdiction that provided reports, the number of extensions granted, the average lengths of the original periods authorized and any extensions, the total number of days in operation, and the locations of the communications intercepted. Federal and state laws limit the period of surveillance under an original order to 30 days. This period, however, can be lengthened by one or more extensions if the authorizing judge determines that additional time is justified. During 2016, the average reported length of an original authorization was 30 days, the same as in 2015. The average reported length of an extension was also 30 days. In total, 2,096 extensions were reported as requested and authorized in 2016, a decrease of 36 percent from the prior year.

The District of Arizona and the Middle District of Florida conducted the longest federal intercepts that were terminated in 2016. An original order in the District of Arizona was extended 10 times to complete a 306-day wiretap used in a narcotics investigation. In the Middle District of Florida, an order was extended nine times to complete a 290-day wiretap in a narcotics investigation. For state intercepts terminated in 2016, the longest intercepts occurred in Queens County, New York, where two original orders were each extended 30 times to complete two 457-day wiretaps used in a narcotics investigation.

The most frequently noted location in reported wiretap applications was “portable device.” This category includes cell phone communications, text messages, and application software (apps). In 2016, a total of 93 percent of all authorized wiretaps (2,947 wiretaps) were reported to have used portable devices. Prosecutors, under certain conditions, including a showing of probable cause to believe that actions taken by a party being investigated could have the effect of thwarting interception from a specified facility, may use “roving” wiretaps to target specific persons by using electronic devices at multiple locations rather than at a specific
telephone or location (see 18 U.S.C. § 2518(11)). In 2016, a total of 64 reported federal and state wiretaps were designated as roving.

Criminal Offenses

Drug offenses were the most prevalent type of criminal offense investigated using reported wiretaps. Table 3 indicates that 61 percent of all applications for intercepts (1,949 wiretap applications) in 2016 cited narcotics as the most serious offense under investigation. Applications citing narcotics plus those citing “other” offenses (a category that includes additional drug-related offenses) accounted for 82 percent of all reported wiretap applications in 2016, compared to 84 percent in 2015. Conspiracy, the second most frequently cited crime, was specified in 8 percent of applications. Homicide, the third-largest category, was specified as the most serious offense in approximately 5 percent of applications. Many applications for court orders revealed that multiple criminal offenses were under investigation, but Table 3 includes only the most serious criminal offense listed on an application.

Lengths and Numbers of Intercepts

In 2016, for reported intercepts, installed wiretaps were in operation for an average of 44 days, one day longer than the average in 2015. The federal wiretap with the most intercepts occurred during a narcotics investigation in the Middle District of Pennsylvania and resulted in the interception of 3,292,385 cell phone conversations or messages over 60 days. The state wiretap with the most intercepts was a 118-day wiretap for a narcotics investigation in Los Angeles County, California, which resulted in the interception of 559,003 cell phone conversations, of which 113,528 were incriminating.

Encryption

The number of state wiretaps reported in which encryption was encountered increased from 7 in 2015 to 57 in 2016. In 48 of these wiretaps, officials were unable to decipher the plain text of the messages. A total of 68 federal wiretaps were reported as being encrypted in 2016, of which 53 could not be decrypted. Encryption was also reported for 20 federal and 19 state wiretaps that were conducted during a previous year but reported to the AO for the first time in 2016. Officials were not able to decipher the plain text of the communications in any of the state intercepts or in 13 of the federal intercepts.

Cost of Intercepts

Table 5 provides a summary of expenses related to wiretaps in 2016. The expenditures noted reflect the cost of installing intercept devices and monitoring communications for the 2,332 authorizations for which reports included cost data. The average cost of an intercept in 2016 was $74,949, up 78 percent from the average cost in 2015. The most expensive state wiretap was in the Appellate Division of the Supreme Court, New York, where costs for a 434-day narcotics wiretap that resulted in 15 arrests and no convictions totaled $2,989,930. For federal wiretaps for which
expenses were reported in 2016, the average cost was $83,356, a 70 percent increase from 2015. The most expensive federal wiretap completed during 2016 occurred in the Southern District of California, where costs for a narcotics investigation totaled $5,266,558.

Methods of Surveillance

The three major categories of surveillance are wire, oral, and electronic communications. Table 6 presents the type of surveillance method used for each intercept installed. The most common method reported was wire surveillance that used a telephone (land line, cellular, cordless, or mobile). Telephone wiretaps accounted for 84 percent (1,955 cases) of the intercepts installed in 2016, the majority of them involving cellular telephones.

Arrests and Convictions

Data on individuals arrested and convicted as a result of interceptions reported as terminated are presented in Table 6. As of December 31, 2016, a total of 12,412 persons had been arrested (up 179 percent from 2015), and 1,248 persons had been convicted (up 112 percent from 2015). Federal wiretaps were responsible for 15 percent of the arrests and 7 percent of the convictions arising from wiretaps for this period. The Southern District of New York reported the most arrests for a federal district in 2016, with wiretaps there resulting in the arrest of 488 individuals. At the state level, Oklahoma Criminal Appeals reported the largest number of total arrests (5,057), followed by Queens County, New York (736). Queens County, New York, also had the highest number of total convictions (380) for any state jurisdiction in 2016.

Summary of Reports for Years Ending December 31, 2006, through December 31, 2016

Table 7 presents data on intercepts reported each year from 2006 to 2016. Authorized intercept applications reported by year increased 72 percent from 1,839 in 2006 to 3,168 in 2016 (the total for 2006 was revised after initial publication). The majority of wiretaps have consistently been used for narcotics investigations, which accounted for 80 percent of intercepts initially reported in 2006 (1,473 applications) and 61 percent in 2016 (1,949 applications). Table 9 presents the total number of arrests and convictions resulting from intercepts terminated in calendar years 2006 through 2016.


Supplementary Reports

Under 18 U.S.C. § 2519(2), prosecuting officials must file supplementary reports on additional court or police activity occurring as a result of intercepts reported in prior years. Because many wiretap orders are related to large-scale criminal investigations that cross county and state boundaries, supplemental reports are necessary to fulfill reporting requirements. Arrests, trials, and convictions resulting from these interceptions often do not occur within the same year in which the intercepts were first reported. Table 8 shows that a total of 12,440 arrests, 6,694 convictions, and additional costs of $201,427,611 arose from and were reported for wiretaps completed in previous years. Sixty percent of the supplemental reports of additional activity in 2016 involved wiretaps terminated in 2015. Interceptions concluded in 2015 led to 53 percent of arrests, 30 percent of convictions, and 40 percent of expenditures noted in the supplementary reports.

APPENDIX TABLES

Title | Publication Table Number | Reporting Period | Report Name | Download
Intercepts of Wire, Oral, or Electronic Communications Authorized by U.S. District Courts and Terminated | Wire A1 | December 31, 2016 | Wiretap | Download (XLSX, 496.07 KB)
Intercepts of Wire, Oral, or Electronic Communications Authorized by State Courts and Terminated | Wire B1 | December 31, 2016 | Wiretap | Download (XLSX, 321.12 KB)
Jurisdictions with Statutes Authorizing the Interception of Wire, Oral, or Electronic Communications Effective | Wire 1 | December 31, 2016 | Wiretap | Download Table (XLSX, 12.19 KB)
Intercept Orders Issued by Judges | Wire 2 | December 31, 2016 | Wiretap | Download (XLSX, 37.24 KB)
Major Offenses for Which Court-Authorized Intercepts Were Granted | Wire 3 | December 31, 2016 | Wiretap | Download (XLSX, 24.85 KB)
Interceptions of Wire, Oral, or Electronic Communications | Wire 4 | December 31, 2016 | Wiretap | Download (XLSX, 22.75 KB)
Average Cost per Order | Wire 5 | December 31, 2016 | Wiretap | Download (XLSX, 18.28 KB)
Types of Surveillance Used, Arrests, and Convictions for Intercepts Installed | Wire 6 | December 31, 2016 | Wiretap | Download (XLSX, 22.6 KB)
Authorized Intercepts Granted | Wire 7 | December 31, 2016 | Wiretap | Download (XLSX, 16.03 KB)
Supplementary Data for Intercepts Terminated in Prior Years as Reported | Wire 8 | December 31, 2016 | Wiretap | Download (XLSX, 16.97 KB)
Arrests and Convictions Resulting from Intercepts Installed | Wire 9 | December 31, 2016 | Wiretap | Download (XLSX, 27.37 KB)


CHAPTER 3

Computer Fraud and Abuse Act

Alexandra Capachietti, Esq.
Boston Trial Court, Boston

Brooke A. Penrose, Esq.
Burns & Levinson LLP, Boston

Merton E. Thompson IV, Esq., CIPT
Burns & Levinson LLP, Boston

§ 3.1 Introduction ... 3–2
§ 3.2 Key Terminology ... 3–3
  § 3.2.1 Protected Computer ... 3–3
  § 3.2.2 Computer ... 3–4
  § 3.2.3 Access ... 3–4
  § 3.2.4 Unauthorized Access and Exceeding Authorized Access ... 3–5
  § 3.2.5 Damage or Loss ... 3–8
§ 3.3 Criminal Prohibitions ... 3–9
  § 3.3.1 18 U.S.C. § 1030(a)(1)—Espionage ... 3–9
  § 3.3.2 18 U.S.C. § 1030(a)(2)—Confidentiality ... 3–9
  § 3.3.3 18 U.S.C. § 1030(a)(3)—Trespass to Government Computers ... 3–10
  § 3.3.4 18 U.S.C. § 1030(a)(4)—Computer Fraud ... 3–10
  § 3.3.5 18 U.S.C. § 1030(a)(5)—Malware, Viruses, and Damage to Data ... 3–11
  § 3.3.6 18 U.S.C. § 1030(a)(6)—Password Trafficking ... 3–11
  § 3.3.7 18 U.S.C. § 1030(a)(7)—Extortion ... 3–12
§ 3.4 Private Right of Action ... 3–12
§ 3.5 Criticism and Calls for Reform ... 3–12
  § 3.5.1 Internet Advocacy and Hacktivism ... 3–13
  § 3.5.2 Cyber Bullying ... 3–14
  § 3.5.3 Use of Employer Resources for Personal Use ... 3–15
  § 3.5.4 Proposed Reforms ... 3–16
§ 3.6 Legislative History ... 3–16
  § 3.6.1 H.R. 5616 ... 3–16
  § 3.6.2 The 1984 Act: The Counterfeit Access Device and Computer Fraud and Abuse Act ... 3–17
  § 3.6.3 The 1986 Amendments ... 3–18
  § 3.6.4 The 1994 Amendments: The Computer Abuse Amendments Act of 1994 ... 3–19
  § 3.6.5 The 1996 Amendments: The National Information Infrastructure Protection Act of 1996 ... 3–20
  § 3.6.6 The 2001 Amendments: The USA PATRIOT Act ... 3–21
  § 3.6.7 The 2002 Amendments: The Cyber Security Enhancement Act of 2002 ... 3–22
  § 3.6.8 The 2008 Amendments: The Identity Theft Enforcement and Restitution Act of 2008 ... 3–23
  § 3.6.9 Scope and Application of the CFAA Today ... 3–23
EXHIBIT 3A—18 U.S.C. § 1030, Fraud and Related Activity in Connection with Computers ... 3–24

Scope Note

This chapter provides an introduction to the federal data security statute known as the Computer Fraud and Abuse Act. The chapter begins with an overview and survey of the statute and its civil and criminal application, with an emphasis on Massachusetts. The chapter also reviews the legislative history and the current conversation regarding potential reforms.

§ 3.1 INTRODUCTION

Since its origin in 1986 as an amendment to an existing statute (the Counterfeit Access Device and Computer Fraud and Abuse Act of 1984), the Computer Fraud and Abuse Act (“CFAA”), 18 U.S.C. § 1030, has evolved into a powerful and flexible tool for both the government and private litigants to address unauthorized access to protected computers. The CFAA has been used with varying success against malicious hackers seeking financial rewards, computer virus authors, spies, spammers, current and former employees, data scrapers, and hacktivists. Indeed, it is the potential breadth and flexible scope of application of the CFAA that has caused some commentators to find the CFAA overbroad, leading to calls for reform and/or clarification. In this chapter, we will discuss the history and elements of the CFAA, highlight recent case law, and consider calls for reform.

While originally cast as a criminal statute designed to protect government and financial industry computers, an amendment to the CFAA in 1994 allowed civil actions to be brought under the statute as well. The CFAA enumerates seven prohibitions:

• unauthorized access to and disclosure of national security interest information from a computer;
• obtaining information by making unauthorized access to a protected computer involved in interstate commerce;
• unauthorized access to federal government computers;
• furthering a fraud by making unauthorized access to a computer involved in interstate commerce;
• causing damage to a computer involved in interstate commerce;
• trafficking in federal government computer passwords or passwords of computers involved in interstate commerce; and
• threatening to damage a computer involved in interstate commerce to extract money or value.

It is also a violation of the CFAA to attempt or to conspire to commit a prohibited act and, as such, “attempt” crimes are also criminally punishable.

§ 3.2 KEY TERMINOLOGY

Understanding the scope and applicability of the CFAA’s prohibitions requires familiarity with the statute’s definition of the key term “protected computer,” along with several other key terms and concepts, and with the scope and meaning that courts have given these terms. In the following sections, key terms and definitions are discussed in the context of their present-day treatment.

§ 3.2.1 Protected Computer

In defining “protected computer,” 18 U.S.C. § 1030(e)(2) states as follows:

As used in this section—
(2) the term “protected computer” means a computer—
(A) exclusively for the use of a financial institution or the United States Government, or, in the case of a computer not exclusively for such use, used by or for a financial institution or the United States Government and the conduct constituting the offense affects that use by or for the financial institution or the Government; or
(B) which is used in or affecting interstate or foreign commerce or communication, including a computer located outside the United States that is used in a manner that affects interstate or foreign commerce or communication of the United States. . . .

In the 1996 amendments to the CFAA, the term “protected computer” replaced the original term, “federal interest computer.” Given that virtually any computer that is connected to a network and/or the Internet is “used in or affecting interstate or foreign commerce or communication,” the potential scope of protected computers is vast. Indeed, the definition is so broad that it also encompasses a computer “located outside the United States that is used in a manner that affects interstate or foreign commerce or communication of the United States.” 18 U.S.C. § 1030(e)(2)(B). Treatment of the term “protected computer” by the courts confirms that it is likely that any computer used in interstate commerce or communication is covered by the
CFAA. See Shurgard Storage Ctrs., Inc. v. Safeguard Self Storage, Inc., 119 F. Supp. 2d 1121 (W.D. Wash. 2000) (finding definition of protected computer is unambiguous and includes any computer used in interstate commerce and/or communication).

§ 3.2.2 Computer

The term “computer,” which is used both independently and as a subsidiary component of “protected computer” in the CFAA, is defined in the statute as follows:

As used in this section—
(1) the term “computer” means an electronic, magnetic, optical, electrochemical, or other high speed data processing device performing logical, arithmetic, or storage functions, and includes any data storage facility or communications facility directly related to or operating in conjunction with such device, but such term does not include an automated typewriter or typesetter, a portable hand held calculator, or other similar device . . . .

18 U.S.C. § 1030(e)(1). The above statutory definition, while already quite broad on its face, may become increasingly important with the soon-to-be-realized Internet of Things (“IoT”), wherein millions of devices will become networked and accessible over the Internet. Examples include the HVAC system and appliances in an ordinary home. Depending on how courts construe the IoT-connected world and the conformance of the devices’ technology to the statutory definition, potentially billions of devices may be added to the purview of the CFAA in coming decades. See Mat Honan, “The Nightmare on Connected Home Street,” Wired (June 13, 2014), available at http://www.wired.com/2014/06/the-nightmare-on-connected-home-street.

§ 3.2.3 Access

To violate the CFAA, there is a threshold requirement that a protected computer be accessed. This requirement was addressed in the Massachusetts case of MBTA v. Anderson, Case No. 08-11364 (D. Mass. Aug. 9, 2008), which also highlights the often uneasy interplay between valid research into computer security vulnerabilities and the CFAA. The defendants were students at MIT who had discovered a security flaw in the MBTA’s electronic ticketing system that could be exploited to enable counterfeit electronic tickets to be created. The defendants had announced their intent to present a paper on their security flaw findings at a technical conference. The plaintiff obtained a temporary restraining order (TRO), primarily under its CFAA claims, barring the disclosure. In proceedings to convert the TRO into a preliminary injunction, the court found the plaintiff unlikely to ultimately succeed on its CFAA claims, given that a violation of the relevant part of the CFAA occurs only if the person knowingly causes the transmission of commands to a protected computer. As the defendants were presenting a technical paper at a conference and not intentionally
seeking to cause “transmission of a program, information, code, or command, and as a result of such conduct, intentionally causes damage without authorization, to a protected computer,” the court found that the plaintiff was not likely to establish a violation of the CFAA because the announced conduct did not involve a transmission to, or access of, a protected computer.

§ 3.2.4 Unauthorized Access and Exceeding Authorized Access

In addition to the terms discussed above, the concept and definition of what constitutes “unauthorized access” or “exceeds authorized access” is critical to evaluating a claim or exposure under the CFAA. The easier case seems to be what constitutes “unauthorized access” in the context of hacking. An individual who circumvents technical access barriers to a protected computer while lacking permission to do so, or who exploits the credentials of another to gain access to a computer, seems to fall squarely under this term, and courts have had little trouble so finding. The CFAA does not provide a definition of “unauthorized access,” perhaps because the meaning is self-evident and unambiguous. EF Cultural Travel BV v. Explorica, Inc., 274 F.3d 577, n.10 (1st Cir. 2001) (“Congress did not define the phrase ‘without authorization,’ perhaps assuming that the words speak for themselves”).

The more subtle case is where authorized access is exceeded. This situation occurs most commonly in the area of claims against current or former employees, a seemingly growing area of CFAA criminal and civil actions. Notwithstanding the fact that the term “exceeds authorized access” is expressly defined in the statute, courts have struggled with the nuances of its application in the employee/employer context. The statutory definition is as follows:

As used in this section—
(6) the term “exceeds authorized access” means to access a computer with authorization and to use such access to obtain or alter information in the computer that the accesser is not entitled so to obtain or alter . . . .

18 U.S.C. § 1030(e)(6). Currently a split exists among the circuit courts on the question of whether an employee’s existing authorization to access a computer can be exceeded when the employee misuses information obtained within the scope of permitted access.

At one end of the spectrum is the Ninth Circuit’s en banc opinion in United States v. Nosal, 676 F.3d 854 (9th Cir. 2012). Nosal was a former employee of well-known executive search firm Korn/Ferry. He convinced some current Korn/Ferry employees to assist him in setting up a new and competitive business by having them access their computers and send him information about Korn/Ferry’s clients and contacts. On a motion to dismiss, Nosal argued that the CFAA was limited to hacking offenses, not situations where an employee was authorized to access a computer but misused information he or she obtained. Initially, the lower court denied the motion to dismiss, finding that when an authorized user accesses a computer “knowingly and with intent
to defraud,” that access then becomes unauthorized or in excess of authorization. Nosal I, 2009 WL 981336, at *4. On reconsideration of that holding, the lower court reversed itself, finding that new CFAA guidance from LVRC Holdings LLC v. Brekka, 581 F.3d 1127 (9th Cir. 2009), which more narrowly construed the meaning of “authorized access” and “exceeds authorized access,” precluded a finding that an employee who violates a corporate policy governing use of information thereby exceeds authorized access as that term is used in the CFAA. The court then dismissed five of eight CFAA counts against Nosal for failure to state an offense. On appeal, the Ninth Circuit, sitting en banc, supported the narrow interpretation below (and in doing so superseded its earlier panel decision in Nosal), holding that the meaning of “exceeds authorized access” under the CFAA does not extend to contractual violations of use restrictions. In reaching this decision, the Ninth Circuit noted that mere violations of corporate computer-use policies or websites’ terms of use would arguably be criminalized if the government’s construction of “exceeds authorized access” were adopted, given the arguable breadth of the CFAA.

Subsequent to the above-discussed appeal, the government, having gone ahead on the remaining indictments, obtained a conviction of Nosal under the CFAA and other counts. Nosal appealed to the Ninth Circuit (Nosal II), which affirmed the conviction, holding that “without authorization” is an unambiguous term with a plain meaning—in this case meaning that only the system owner, and not a legitimate user of the system, could grant authorization. Nosal II, 828 F.3d 865, 875 (9th Cir. 2016).

The Fourth Circuit also tackled the construction of “authorized access” and “exceeds authorized access” in an appeal from a lower court’s dismissal of CFAA claims that were based on a former employee’s subsequent use of information he had accessed while employed. WEC Carolina Energy Solutions, LLC v. Miller, 687 F.3d 199 (4th Cir. 2012). The Fourth Circuit, while noting the split of authority, generally followed the Ninth Circuit’s reasoning, holding that “we adopt a narrow reading of the terms ‘without authorization’ and ‘exceeds authorized access’ and hold that they apply only when an individual accesses a computer without permission or obtains or alters information on a computer beyond that which he is authorized to access.” WEC Carolina Energy Solutions, LLC v. Miller, 687 F.3d at 206; see also Cloudpath Networks, Inc. v. SecureW2 B.V., 157 F. Supp. 3d 961, 983–84 (D. Colo. 2016) (explaining that “exceeds authorized access” does not impose criminal liability on individuals who are authorized to access company data but do so for disloyal purposes; it applies only to individuals who are allowed to access a company computer and use that access to obtain data they are not allowed to see for any purpose).

In contrast to Nosal is the Seventh Circuit’s holding in International Airport Centers, LLC v. Citrin, 440 F.3d 418, 420 (7th Cir. 2006), finding that Citrin’s authorization to access his employer’s computer terminated when he breached “the duty of loyalty that agency law imposes on an employee” by deciding to quit in violation of an employment contract and destroying files that may have been incriminating. This application of agency principles in transitioning an otherwise authorized act into the zone of the unlawful is the major intellectual counterpoint to the reasoning in Nosal.
Thus, under Citrin an employee’s authorization to access a computer was self-terminated upon his willing violation of the agency relationship imputed to employees:

“‘Violating the duty of loyalty, or failing to disclose adverse interests, voids the agency relationship.’ State v. DiGiulio, 172 Ariz. 156, 835 P.2d 488, 492 (App. 1992). ‘Unless otherwise agreed, the authority of the agent terminates if, without knowledge of the principal, he acquires adverse interests or if he is otherwise guilty of a serious breach of loyalty to the principal.’ Id.; Restatement [(Second) of Agency] § 112.”

International Airport Ctrs., LLC v. Citrin, 440 F.3d at 421.

On different facts, the First Circuit held that a former employee’s deceptive actions and violation of a confidentiality agreement rendered access to an otherwise publicly available website unauthorized. In EF Cultural Travel BV v. Explorica, Inc., 274 F.3d 577 (1st Cir. 2001), the former employee of a travel agent, in violation of his confidentiality agreement with his former employer, used confidential information he had obtained in the course of his employment with the plaintiff to create a data-scraping program that enabled his new, competing travel company to obtain information in bulk from his former employer’s website, information he could not have obtained as efficiently without the use of that confidential information. The confidential information took the form of codes that enabled automated bulk data acquisition, as opposed to the single manual-access process available to uninformed users. The website was open to the public, so he was facially authorized to use it, but the First Circuit confirmed that he exceeded his authorization by using confidential information, in breach of his contractual obligations, to obtain far better access than other members of the public could achieve by manually entering and receiving information.
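To make the contrast concrete, a minimal, hypothetical Python sketch follows; the URL, parameter name, and “gate codes” are invented for illustration and do not reflect the actual EF or Explorica systems. The legal significance in EF Cultural lay not in the mechanics of the loop but in the confidential codes that drove it.

    # Hypothetical sketch: manual, single-query access versus the automated
    # bulk acquisition at issue in EF Cultural. All names are invented.
    import requests

    BASE_URL = "https://tours.example.com/prices"  # fictional public site

    def manual_lookup(gate_code: str) -> str:
        """One request at a time, as an ordinary site visitor would proceed."""
        resp = requests.get(BASE_URL, params={"gate": gate_code}, timeout=10)
        resp.raise_for_status()
        return resp.text

    def bulk_scrape(gate_codes: list[str]) -> dict[str, str]:
        """The same request repeated programmatically over every known code.
        In EF Cultural, the list of codes was the confidential information;
        knowing it enabled exhaustive collection that no manual user could
        practically match."""
        return {code: manual_lookup(code) for code in gate_codes}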

Massachusetts U.S. District Court case law highlights that determining the issue of access authorization is a factually intense inquiry. In Advanced Micro Devices, Inc. v. Feldstein, 951 F. Supp. 2d 212 (D. Mass. 2013), former employees of Advanced Micro Devices were accused of copying confidential technical data before departing to join a competitor. In considering a motion to dismiss the CFAA claims, the court was hesitant to allow the claims to advance based solely on allegations that the defendants violated the internal policies of Advanced Micro Devices and their duty of loyalty. Noting that courts considering whether an employee’s access to an employer’s computer violates the CFAA have taken both a narrow interpretation (Nosal) and a more expansive view (United States v. Rodriguez, 628 F.3d 1258 (11th Cir. 2010)), Judge Hillman determined that the narrower construction espoused in Nosal was the correct approach:

As between a broad definition that pulls trivial contractual violations into the realm of federal criminal penalties, and a narrow one that forces the victims of misappropriation and/or breach of contract to seek justice under state, rather than federal, law, the prudent choice is clearly the narrower definition.

Advanced Micro Devices, Inc. v. Feldstein, 951 F. Supp. 2d at 218.

In reaching this conclusion, Judge Hillman determined that EF Cultural Travel BV v. Explorica, Inc., did not compel a different result, even though some have read that decision as favoring a broader view, see Guest-Tek Interactive Entm’t Inc. v. Pullen, 665 F. Supp. 2d 42, 45 (D. Mass. 2009), as the facts of that case were substantially different, and to read EF Cultural as controlling on this issue would be giving life to dicta. Having thus resolved the construction issue, the court noted the unsettled pleading standard, observed that the evidentiary record was largely incomplete, and declined to dismiss the CFAA claims at that stage. The court noted as follows:

If AMD can plead specific details indicating that some or all of the defendants used fraudulent or deceptive means to obtain confidential AMD information, and/or that they intentionally defeated or circumvented technologically implemented restrictions to obtain confidential AMD information, then the CFAA claims will surpass the Twombly/Iqbal standard. Otherwise these claims will be dismissed once the factual record is complete.

Advanced Micro Devices, Inc. v. Feldstein, 951 F. Supp. 2d at 219.

So, in Massachusetts, something more than a violation of corporate policies or an employee’s duty of loyalty must be pled to successfully advance a claim under the CFAA.

Another Massachusetts case followed similar reasoning and cited to Advanced Micro Devices. In Enargy Power Co. Ltd. v. Wang, No. CIV.A. 13-11348-DJC, 2013 WL 6234625 (D. Mass. Dec. 3, 2013), the plaintiff alleged that the defendant, before leaving his employer, had instructed an assistant to encrypt certain project files and to transmit the encrypted files to him. It was also alleged that the defendant then instructed his assistant to delete the original files from his employer’s server. In granting a motion for preliminary injunction, Judge Casper approved of the analytical construct taken in Advanced Micro Devices and found that the plaintiff was likely to prevail on its CFAA claims, as the record showed that the defendant was not authorized to access the files at issue and had “employed an element of deception in that he acted without his employer’s consent or knowledge and using his assistant as a conduit, who had every reason to trust that Wang was acting within the scope of his authorization when in fact, Wang incorrectly misled Jiang by telling him that encryption was ‘necessary[,]’” and accordingly those actions likely “exceeded the scope of his authorization.” Enargy Power Co. Ltd. v. Wang, 2013 WL 6234625, at *5. Accordingly, this ruling is in line with the requirement that a Massachusetts plaintiff plead something more than a violation of corporate policies or a duty of loyalty (here, “an element of deception”) to advance a CFAA claim.

§ 3.2.5	Damage or Loss

The terms “damage” and “loss” are defined in 18 U.S.C. § 1030(e)(8) and (11), respectively. “Damage” means “any impairment to the integrity or availability of data, a program, a system, or information” and “loss” means “any reasonable cost to any victim, including the cost of responding to an offense, conducting a damage assessment, and restoring the data, program, system or information to its condition prior to the offense, and any revenue lost, cost incurred, or other consequential damages incurred because of interruption of service.” The former requirement that damages exceed $5,000 was removed via the 2008 amendments to the CFAA when that figure was deleted from the statutory definition of damage.


Examples of compensable loss include costs of responding to the unauthorized intrusion, including employee wages; remediation of damaged data or functionality; lost revenue from interruption of system or website availability; and injury to goodwill or business reputation. Case law examples of losses found not to meet the CFAA definition include loss of control of data and invasion of privacy, e.g., In re Pharmatrak Litig., 220 F. Supp. 2d 4 (D. Mass. 2002), and the cost of investigation, e.g., Jarosch v. American Family Mut. Ins. Co., 837 F. Supp. 2d 980 (E.D. Wis. 2011). Also, the value of the misappropriated data itself, even where obtaining something of value is an element under Section 1030(a), is not counted as a loss under the CFAA. See Andritz, Inc. v. S. Maint. Contractor, 626 F. Supp. 2d 1264 (M.D. Ga. 2009). Remedies for misappropriation are, of course, available under other statutes and at common law.
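The interplay of these inclusions and exclusions can be illustrated with a short, hypothetical Python sketch that tallies qualifying response costs toward the $5,000 threshold of Subsection (c)(4)(A)(i)(I), discussed in § 3.4 below. The figures and category labels are invented and track the case-law examples above; they are not a statutory checklist.

    # Hypothetical figures only; whether an item is a cognizable CFAA "loss"
    # is a legal question, not an accounting one.
    qualifying = {
        "incident_response_wages": 2_500,  # employee time responding to the offense
        "damage_assessment": 1_200,        # conducting a damage assessment
        "data_restoration": 900,           # restoring data, programs, systems
        "lost_revenue_outage": 1_800,      # revenue lost to interruption of service
    }
    non_qualifying = {
        "value_of_copied_data": 50_000,    # misappropriation value is not "loss"
        "invasion_of_privacy": 10_000,     # see In re Pharmatrak Litig.
    }

    total_loss = sum(qualifying.values())
    print(f"Aggregate CFAA loss: ${total_loss:,}")         # Aggregate CFAA loss: $6,400
    print("Meets $5,000 threshold:", total_loss >= 5_000)  # True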

§ 3.3	CRIMINAL PROHIBITIONS

§ 3.3.1	18 U.S.C. § 1030(a)(1)—Espionage

Generally understood to be crafted to counter computer espionage, this subsection makes it a crime to knowingly, and with intent to injure, access government computers without authorization to obtain classified information and then transmit the information so obtained to a person who is not authorized to receive it, or to willfully retain that classified information. The covered species of classified information include national defense and security information, information protected under the Atomic Energy Act, and virtually any information the government designates as classified. An example of a prosecution under this subsection was the military trial of Chelsea Manning (formerly known as PFC Bradley Manning), who was found to have exceeded authorized access and thereby obtained classified Department of State cables that were then divulged to WikiLeaks.

Practice Note
The full text of 18 U.S.C. § 1030(a) is set forth below in Exhibit 3A.

Two elements must be established to prove a violation of this subsection. First, it must be proven that classified information was intentionally accessed with intent to cause injury to the United States or to give advantage to a foreign government. Second, the classified information so obtained must either be provided to a person not entitled to receive it or be willfully retained by a person who refuses to deliver it to the government.

§ 3.3.2	18 U.S.C. § 1030(a)(2)—Confidentiality

This subsection is intended to address the issue of hackers obtaining confidential data on banking, government, or protected computers. Likely the broadest provision of the CFAA, Subsection (a)(2) bars intentionally accessing a protected computer without authorization, or in excess of authorization, and thereby obtaining information. As discussed above, the term “protected computer” covers any computer involved in or affecting interstate commerce or communication. As such, virtually any computer connected to the Internet would qualify. Any government computer, and any record of a financial institution or a credit reporting agency, is also specifically protected.

To establish a violation of Subsection (a)(2), it must be proven that information in a covered computer was intentionally accessed, such access being either without authorization or exceeding authorization, and that the information was thereby obtained. Mere accidental exposure to information on a covered computer is not actionable, as the intent component is absent. To appreciate the breadth of Subsection (a)(2), see, for example, America Online, Inc. v. National Health Care Discount, Inc., 121 F. Supp. 2d 1255, 1275 (N.D. Iowa 2000). Information stored electronically can be obtained not only by actual physical theft, but also by “mere observation of the data.” America Online, Inc. v. Nat’l Health Care Discount, Inc., 121 F. Supp. 2d at 1275. The “crux of the offense under subsection 1030(a)(2)(C) . . . is the abuse of a computer to obtain the information.” America Online, Inc. v. Nat’l Health Care Discount, Inc., 121 F. Supp. 2d at 1275.

§ 3.3.3	18 U.S.C. § 1030(a)(3)—Trespass to Government Computers

Subsection (a)(3) broadly prohibits the intentional unauthorized access of a nonpublic computer exclusively used by a government agency or of a computer shared by a government agency with others. Any willful unauthorized intrusion into a nonpublic computer used exclusively by the government is criminalized by this subsection. In the case of intentional unauthorized access to a shared government computer, the additional element of the trespass having an adverse effect must also be established.

§ 3.3.4	18 U.S.C. § 1030(a)(4)—Computer Fraud

Subsection (a)(4) criminalizes the knowing, unauthorized access of a protected computer, or knowingly exceeding authorized access, with intent to defraud and thereby obtaining anything of value. As discussed above in connection with Subsection (a)(2), the term “protected computer” has a very broad scope and includes government computers, financial industry computers, and any other computer affecting interstate commerce or communications. This subsection differs from other CFAA counts in that information need not necessarily be obtained and in that the thing of value can be the use of the computer itself, if that use has an aggregate value of more than $5,000 in any one-year period.

In United States v. Czubinski, 106 F.3d 1069 (1st Cir. 1997), the First Circuit discussed Subsection (a)(4) at length in overturning a CFAA conviction. Czubinski was an IRS employee who was accused of accessing and viewing the confidential taxpayer files of “friends, acquaintances and political rivals.” The government’s case under Subsection (a)(4) faltered on appeal on the question of whether Czubinski obtained anything of value by browsing the taxpayer files. A review of the legislative history indicated that Subsection (a)(4) was directed to punish the fraudulent theft of valuable data, a felony, as opposed to mere access to data, which is a misdemeanor. Finding this commentary consistent with the statutory language, the First Circuit reversed, holding that browsing, without more, was not equivalent to obtaining something of value. Contrast this decision with America Online, Inc. v. National Health Care Discount, Inc., cited above in reference to Subsection (a)(2), where mere “observation of data” was enough to establish liability.

Thus, the required elements to establish a criminal violation of Subsection (a)(4) are that a protected computer was accessed by an unauthorized person, or by a person who exceeded his or her authorization, that person having the specific intent to defraud and thereby obtaining something of value.

§ 3.3.5	18 U.S.C. § 1030(a)(5)—Malware, Viruses, and Damage to Data

Subsection (a)(5) has three subparts. Subsections (a)(5)(A) and (a)(5)(B) are felony violations for intentionally transmitting malicious code, commands, or viruses to a protected computer and thereby intentionally causing damage (Subpart (A)) and for the unauthorized intentional access of a protected computer with appurtenant reckless causation of damage to the computer (Subpart (B)). The remaining Subpart (a)(5)(C) is a misdemeanor count for intentional unauthorized access to a protected computer that results in damage or loss, regardless of whether the damage or loss was intentional or negligent.

As set forth in the definitions section of the CFAA, the term “damage” means any impairment to the integrity or availability of data, a program, a system, or information. With specific regard to Subsection (a)(5), the damage must either
• exceed $5,000 in value,
• alter or impair medical treatment or records,
• physically injure a person, or
• damage a government computer.

In EF Cultural Travel BV v. Explorica, Inc., 274 F.3d 577, 583–84 (1st Cir. 2001), in a civil context, the First Circuit affirmed the lower court’s finding that the plaintiff “suffered a loss, as required by the statute, consisting of reduced business, harm to its goodwill, and the cost of diagnostic measures it incurred to evaluate possible harm to EF’s systems.” EF Cultural Travel BV v. Explorica, Inc., 274 F.3d at 580. Accordingly, damage may include a wide spectrum of costs associated with analysis, repair, and remediation of the conduct.

§ 3.3.6	18 U.S.C. § 1030(a)(6)—Password Trafficking

Subsection (a)(6) prohibits knowingly, and with intent to defraud, trafficking in passwords for government computers, as well as trafficking in passwords through which a computer may be accessed without authorization where the trafficking affects interstate or foreign commerce. This subsection could be applied to limit the onward transfer of credentials that affect the security and data integrity of government or other computers. Note that “password” as used in Subsection (a)(6) also includes “similar information,” and that additional breadth has been used to apply Subsection (a)(6) to trafficking in credit card credentials as well. See United States v. Rushdan, 870 F.2d 1509, 1514 (9th Cir. 1989).

§ 3.3.7	18 U.S.C. § 1030(a)(7)—Extortion

Subsection (a)(7) is a felony count that prohibits transmitting in interstate commerce, with intent to extort money or any other thing of value, a communication containing a threat to damage a protected computer, a threat to compromise information on a protected computer, or a demand for money or other value in relation to damage caused to a protected computer to facilitate the extortion. This subsection can be brought to bear in instances where “ransomware” has been installed on a protected computer and money is demanded in return for the decryption key.

§ 3.4	PRIVATE RIGHT OF ACTION

Section 1030(g) of 18 U.S.C. provides an avenue for remedy to any person who suffers “damage or loss” by a violation of any subsection of Section 1030(a). Both equitable relief and money damages are available to civil plaintiffs, and this private right of action has produced most of the case law under the CFAA. The private civil suit provision, Section 1030(g), was added by the 1994 amendments to the CFAA.

The required pleading elements of a civil claim under CFAA § 1030(g) are as follows:
• damage or loss,
• caused by
• a violation of one of the prohibitions set forth in Section 1030(a)(1)–(7), and
• conduct involving one of the factors set forth in Subclauses (I)–(V) of Subsection (c)(4)(A)(i).

Notably, standing under Section 1030(g) exists for any person who suffers damage or loss through a violation of the CFAA. Accordingly, a person does not necessarily need to be the owner of the affected computer to bring suit if he or she is able to meet the full pleading standard. Persons found to be civilly liable for a CFAA violation can be responsible for compensatory damages and injunctive or other equitable relief. The statute of limitations for a civil suit under Section 1030(g) is two years from the date of injury or two years from the date of discovery of the injury.
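As a rough illustration of the two-year period just described, consider the following Python sketch; the dates are hypothetical, and a real filing deadline would turn on the facts and on any applicable tolling doctrines.

    # Hypothetical dates; illustrates the two alternative two-year triggers.
    from datetime import date

    def add_two_years(d: date) -> date:
        try:
            return d.replace(year=d.year + 2)
        except ValueError:              # Feb. 29 in a non-leap target year
            return d.replace(year=d.year + 2, day=28)

    injury = date(2017, 3, 15)          # date of the act complained of
    discovery = date(2017, 11, 2)       # date the damage was discovered

    print("Two years from injury:   ", add_two_years(injury))     # 2019-03-15
    print("Two years from discovery:", add_two_years(discovery))  # 2019-11-02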

§ 3.5	CRITICISM AND CALLS FOR REFORM

The CFAA has recently come under significant scrutiny by some communities for being vague and overbroad, resulting in the potential criminalization of common activities, such as computer vulnerability research. The CFAA was originally enacted in 1984, at a time when few computers were networked or even capable of the types of activities that fall within the scope of the contemporary CFAA. With the advent and now pervasiveness of the Internet, virtually every connected computer could qualify for protection under the CFAA. Indeed, the coming Internet of Things may exponentially expand the number of connected devices on the Internet.

The current calls for reform are embodied, at least in part, by legislation such as “Aaron’s Law.” Named for Internet activist Aaron Swartz (Note: Mr. Swartz is a former client of the authors’ law firm) and first proposed in the wake of his death by suicide while being tried for violations of the CFAA, Aaron’s Law seeks to bring clarity to the scope and interpretation of the CFAA and to allow proportionality to be considered in the sentencing of criminal violations. In advancing the bill in 2013 and again in 2015, Representative Lofgren and Senator Wyden emphasized the importance of reforming and narrowing the scope of the CFAA, in large part because a recent rash of cases had revealed that the law had become “a sweeping anti-hacking law that criminalizes many forms of common Internet and computer use.” 161 Cong. Rec. 2304 (2015). Unfortunately, the bill has languished in Congress and no action has been taken on it.

Indeed, without some tweaking of the CFAA, it can be broadly used as the legal basis to prosecute a variety of behavior that, while sometimes of questionable scruples, is commonplace and clearly beyond the scope of criminalization intended by Congress. The following case studies highlight the concerns that motivated Aaron’s Law.

§ 3.5.1	Internet Advocacy and Hacktivism

Of particular note, the CFAA has been used to prosecute leaders in online activism. First and foremost, the namesake of Senator Wyden’s bill is famed Internet activist, writer, and entrepreneur Aaron Swartz. Swartz was indicted for violating the CFAA by downloading large quantities of electronic academic journals from JSTOR using MIT’s open network. United States v. Swartz, No. 11-CR-10260 (D. Mass. July 14, 2011). Facing trial and potential penalties of up to thirty-five years in prison and $1 million in fines, Swartz tragically took his own life before the commencement of the trial. His death brought attention to the disproportionate penalties available under the CFAA for activities that posed little harm and, in some cases, offered some benefit.

Consider also the case against Andrew Auernheimer, aka “weev,” a hacker, activist, and member of the group Goatse Security, an association that specializes in uncovering computer security flaws. In 2010, Auernheimer and a cohort created a computer program that exposed a security flaw in AT&T’s servers. The security flaw enabled anyone to access personal information, including e-mail addresses, of AT&T’s iPad-using customers simply by incrementing the user ID. The defendants openly disclosed their activities, going so far as to alert AT&T to the security flaw. Auernheimer was subsequently prosecuted under the CFAA, found guilty, sentenced to forty-one months’ imprisonment, and ordered to pay AT&T $73,000 in restitution. United States v. Auernheimer, No. 11-CR-470 SDW, 2012 WL 5389142 (D.N.J. Oct. 26, 2012). The security research community denounced Auernheimer’s prosecution and conviction, concerned about the chilling effect such prosecutions may have on the community’s willingness to come forward with discovered security flaws in the future. Andy Greenberg, “Security Researchers Cry Foul Over Conviction of AT&T iPad Hacker,” Forbes (Nov. 21, 2012).

After Auernheimer had served over a year in prison, the Third Circuit vacated his conviction. United States v. Auernheimer, 748 F.3d 525 (3d Cir. 2014). While Auernheimer’s conviction was ultimately vacated on jurisdictional rather than substantive grounds, the court did express skepticism of the original conviction, taking note that no password gate was breached. Auernheimer had exposed AT&T customers’ personal information using only AT&T’s publicly facing login screen along with information AT&T had unintentionally published. United States v. Auernheimer, 748 F.3d at n.5.
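A schematic sketch may help to illustrate the flaw class the Third Circuit described: a publicly facing endpoint that keys its response to a guessable identifier with no authentication step. The following Python/Flask fragment is hypothetical; the route, data, and identifiers are invented and do not reflect AT&T’s actual systems.

    # Hypothetical illustration of an unauthenticated, enumerable endpoint.
    from flask import Flask, abort, jsonify

    app = Flask(__name__)
    EMAIL_BY_ICCID = {"89014100000000000001": "user@example.com"}  # toy data

    @app.route("/prefill/<iccid>")
    def prefill(iccid: str):
        # Flaw: the identifier alone selects the record; nothing verifies
        # that the requester controls the device. There is no login and no
        # "password gate," so sequential identifiers can be harvested.
        # The conceptual fix is an authentication/authorization check here,
        # before any personal data is returned.
        record = EMAIL_BY_ICCID.get(iccid)
        if record is None:
            abort(404)
        return jsonify({"email": record})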

§ 3.5.2	Cyber Bullying

The CFAA has also been used to prosecute a relatively new type of harassment: cyber bullying. Notably, Lori Drew was prosecuted under the CFAA for her participation in creating a fake profile for a fictional person on the social media site MySpace. She used the profile to contact, and eventually harass, a classmate of her daughter. Drew suspected that her daughter’s classmate, thirteen-year-old Megan Meier, was spreading rumors about Drew’s daughter. In collaboration with her daughter and an employee, Drew created a MySpace profile for a made-up sixteen-year-old boy, “Josh,” through which she and her coconspirators would contact Meier. While the initial purpose of contacting Meier through “Josh” was to corroborate whether Meier was indeed spreading rumors about Drew’s daughter, the interaction between “Josh” and Meier ended with “Josh” telling Meier that he no longer liked her and that “the world would be a better place without her in it.” Subsequent to this final interaction, Meier hanged herself. Drew deleted “Josh’s” profile that same day.

For her alleged role in Meier’s death, Drew was indicted, prosecuted, and found guilty under the CFAA on the basis of conspiracy to intentionally access a computer used in interstate commerce without authorization in furtherance of a tortious act. The District Court found that the “unauthorized access” element of the CFAA was satisfied through Drew’s intentional violation of one of MySpace’s terms of use prohibiting the creation of fake profiles. United States v. Drew, 259 F.R.D. 449 (C.D. Cal. 2009). In granting Drew’s subsequent motion for acquittal, the District Court for the Central District of California emphasized that “[t]reating a violation of a website’s terms of service, without more, to be sufficient to constitute ‘intentionally access[ing] a computer without authorization or exceed[ing] authorized access’ would result in transforming [the CFAA] into an overwhelmingly overbroad enactment that would convert a multitude of otherwise innocent Internet users into misdemeanant criminals.” United States v. Drew, 259 F.R.D. at 466.

The decision in Drew was subsequently followed in Oregon when a middle-school assistant principal attempted to press charges against several of his students and their parents for violating the CFAA by creating social media accounts falsely attributed to him. Matot v. CH, 975 F. Supp. 2d 1191 (D. Or. 2013). Here, the court noted that, while “unauthorized access” is not defined in the statute itself, the purpose of the CFAA is to combat hacking, not to create a “sweeping Internet-policing mandate.” Matot v. CH, 975 F. Supp. 2d at 1195. Moreover, the court recognized that creating fake social media accounts is commonplace, noting that even police agencies have begun to use them as part of investigations, and declined to extend the reach of the CFAA to the creation of fake social media profiles, as such a broad interpretation would impose penalties far beyond the scope intended by Congress. Matot v. CH, 975 F. Supp. 2d at 1196.

§ 3.5.3	Use of Employer Resources for Personal Use

The scope of the CFAA extends beyond regulating “unauthorized use” to include regulating computer use that exceeds authorized use. Recent case law demonstrates that, under the “exceeds authorization” element, employees may be prosecuted under the CFAA for misusing their work computers for personal or non-work-related purposes. David Nosal, for example, was prosecuted under the CFAA when he convinced his former coworkers to share their login credentials with him so he could download material to help him start a competing business. United States v. Nosal, 676 F.3d 854 (9th Cir. 2012). In upholding the District Court’s dismissal of the CFAA counts, the Ninth Circuit emphasized that “the CFAA does not extend to violations of use restrictions.” United States v. Nosal, 676 F.3d at 863. That is, a violation of a private user policy alone is generally insufficient to satisfy the CFAA’s requirement that access be unauthorized or in excess of authorization.

Courts seem more willing to find violations of the CFAA, though, when the accused employee is a civil servant. For example, New York police officer Gilberto Valle was convicted under the CFAA for personal use of a restricted police database that he was authorized to use for his official duties. United States v. Valle, 301 F.R.D. 53 (S.D.N.Y. 2014). Valle participated in online chats where he would discuss his fantasies about committing disturbing crimes against women. While there is no evidence that any crime actually took place or that Valle or his cohorts took any real-world steps in furtherance of any crime, Valle had accessed a restricted police database in order to peruse sensitive details concerning at least one of his “fantasy” victims.

In appealing his conviction for violating the CFAA, Valle claimed that the CFAA only criminalizes accessing a computer that one has no authorization to access. Instead, Valle argued, he was authorized to access the computer but had merely exceeded the scope of his authorization under his employment agreement. He argued that, consistent with United States v. Nosal, such a violation of a private agreement did not bring his actions within the scope of the CFAA. The court disagreed. While recognizing that there is conflicting case law regarding whether a disloyal employee’s misappropriation of confidential information violates the CFAA, the court held that Valle had clearly violated the CFAA because his “access to the . . . databases was limited to circumstances in which he had a valid law enforcement purpose.” United States v. Valle, 301 F.R.D. at 115. Valle appealed his conviction on the CFAA count to the Second Circuit Court of Appeals, which declined to adopt a broad construction of the CFAA and held that Valle did not violate the CFAA by putting his authorized computer access to personal use. United States v. Valle, 807 F.3d 508, 523–27 (2d Cir. 2015).


§ 3.5.4	Proposed Reforms

Aaron’s Law attempts to address the perceived vagueness in the CFAA, which Senator Wyden and others believe can enable abusive prosecutorial discretion, as demonstrated in several of the cases detailed above. In addition to reducing redundancy in the CFAA, the bill as drafted would amend the CFAA in two critical ways.

First, the bill narrows the scope of “unauthorized use” by eliminating “exceeds authorized access” from the CFAA’s definition of “unauthorized,” as well as by qualifying that such unauthorized use must be “by knowingly circumventing one or more technological or physical measures that are designed to exclude or prevent unauthorized individuals from obtaining that information.” Aaron’s Law Act of 2015, S. 1030, 114th Cong. § 2 (2015).

In addition, the bill attempts to require that penalties be proportional to the crime (for example, when the unauthorized use is for commercial advantage or private financial gain) and to modify the enhanced penalties currently available under 18 U.S.C. § 1030(a)(2)(B) by qualifying that such penalties are available only when the offense was committed for financial gain and the fair market value of the information obtained exceeds $5,000. The Aaron’s Law Act of 2015, however, was not enacted.

§ 3.6	LEGISLATIVE HISTORY

§ 3.6.1	H.R. 5616

In late 1983 and early 1984, the House Subcommittee on Crime held hearings on general issues of credit card fraud and computer fraud and abuse. At the time, there was no federal legislation specifically geared toward addressing computer crimes. Instead, computer-related offenses were prosecuted under general federal statutes, such as the mail fraud and wire fraud statutes, which did not address issues unique to computer crime. The hearings on bill H.R. 5616 highlighted the inadequacy of the then-current law to prohibit conduct related to an increasing number of computer-related offenses. For example, during the hearings on H.R. 5616, the Department of Justice presented examples of two wire fraud prosecutions in which the defendants hacked into their respective former employers’ computers in order to access confidential information. See H.R. Rep. No. 98-894 at 5 (1984); United States v. Seidlitz, 589 F.2d 152 (4th Cir. 1978); Langevin (unreported). The only conduct that brought the prosecutions within the purview of the federal wire fraud statute was that the access telephone calls were made across state lines; otherwise, the offenses would not have been prosecuted under federal law.

It became clear that new legislation targeted specifically toward computer crimes was necessary in the changing times. In the House Report accompanying the original CFAA,

[t]he Committee concluded that the law enforcement community, those who own and operate computers, as well as those who may be tempted to commit crimes by unauthorized access to them, require a clearer statement of proscribed activity.

H.R. Rep. No. 98-894 at 6.

Of further significance in considering H.R. 5616 was the increasing reliance on computers in society, both within the federal government and the private sector, as well as the growing number of personal computer software sales. See H.R. Rep. No. 98-894 at 8. While computers offered numerous benefits to the country, uncertainties existed as to the extent that computers could evolve into an epicenter of criminal activity. As noted by the House in its report,

[d]etermining the exact nature and extent of computer crime has been the most difficult aspect of this problem. However, there is every indication that presently it is a substantial problem and the potential in the future is immense. … As these computer technologies and the means for abusing them have rapidly emerged, they have been confronted by a criminal justice system which is largely uninformed concerning the technical aspects of computerization, and bound by traditional legal machinery which in many cases may be ineffective against unconventional criminal operations. Difficulties in coping with computer abuse arise because much of the property involved does not fit well into categories of property subject to abuse or theft; a program, for example, may exist only in the form of magnetic impulses. Also, when a program of substantial commercial value is misappropriated, the person from whom it is stolen almost always remains in possession of the original. Indeed, the original program may not have been moved so much as a single inch while being illicitly copied. It is obvious that traditional theft/larceny statutes are not the proper vehicle to control the spate of computer abuse and computer assisted crimes.

H.R. Rep. No. 98-894 at 9.

§ 3.6.2	The 1984 Act: The Counterfeit Access Device and Computer Fraud and Abuse Act

Following an examination of the emerging issues related to computer crimes, the CFAA, codified at 18 U.S.C. § 1030, was enacted in 1984 as the Counterfeit Access Device and Computer Fraud and Abuse Act of 1984. At the time of its original enactment, the CFAA prohibited three categories of conduct.
• Subsection (a)(1) prohibited persons from knowingly accessing without authorization, or, if authorized, accessing beyond the scope of such authorization, a computer to obtain national defense and foreign relations information or other restricted data with the intent to injure the United States or to use the information to the advantage of a foreign nation. See Pub. L. No. 98-473, tit. 22, ch. XXI, § 2102(a), 98 Stat. 1837, 2190 (1984).
• Subsection (a)(2) prohibited persons from knowingly accessing without authorization, or, if authorized, accessing beyond the scope of such authorization, a computer to obtain records of a financial institution or a consumer reporting agency for the purpose of knowingly obtaining financial information. See Pub. L. No. 98-473, tit. 22, ch. XXI, § 2102(a), 98 Stat. 1837 at 2190–91.
• Subsection (a)(3) prohibited persons from knowingly accessing without authorization, or, if authorized, accessing beyond the scope of such authorization, a computer operated on behalf of the federal government in order to knowingly use, modify, destroy, or disclose information thereon, or to prevent authorized use thereof. See Pub. L. No. 98-473, tit. 22, ch. XXI, § 2102(a), 98 Stat. 1837 at 2191.

As originally enacted, the CFAA contemplated criminal conduct related only to the access of government computers and certain computers containing financial information as set forth in Subsection (a)(2). Further, the CFAA specifically exempted the mere use of a computer from criminal liability as it related to Subsections (a)(2) and (a)(3), even if such use was unauthorized or exceeded the scope of authority provided. See Pub. L. No. 98-473, tit. 22, ch. XXI, § 2102(a), 98 Stat. 1837 at 2191. Since its enactment in 1984, the CFAA has undergone several substantive amendments to keep pace with the ever-changing landscape of computer crimes.

§ 3.6.3	The 1986 Amendments

In 1986, amendments to the CFAA significantly expanded its scope. First, the 1986 amendments added three new offenses to the CFAA.
• Subsection (a)(4) prohibited persons from knowingly, and with the intent to defraud, accessing without authorization, or, if authorized, accessing beyond the scope of such authorization, a federal-interest computer and obtaining anything of value. See Pub. L. No. 99-474, tit. XXIX, § 290001, 100 Stat. 1213–14 (1986).
• Subsection (a)(5) prohibited intentional access of a federal-interest computer without authorization and altering, damaging, or destroying information, or conduct preventing authorized use of such a computer, and either
– causing an aggregate loss of $1,000 during a one-year period or
– modifying, impairing, or potentially modifying or impairing the medical examination, diagnosis, treatment, or care of one or more individuals.
See Pub. L. No. 99-474, tit. XXIX, § 290001, 100 Stat. at 1214.
• Subsection (a)(6) prohibited the knowing trafficking of computer password information with the intent to defraud if
– the trafficking affected interstate or foreign commerce or
– the computer was a federal government computer.
See Pub. L. No. 99-474, tit. XXIX, § 290001, 100 Stat. at 1213.

Moreover, the 1986 amendments changed the scienter requirement of Subsections (a)(2) and (a)(3) from “knowingly” to “intentionally.” See Pub. L. No. 99-474, tit. XXIX, § 290001, 100 Stat. at 1213. Another revision to Subsection (a)(3) was the elimination of the “exceeding unauthorized access” language, so as to alleviate concerns that the original statutory language could encompass the legitimate conduct of government employees. See S. Rep. No. 99-132 at 6, 1986 U.S.C.C.A.N. at 2484–85. The 1986 amendments also eliminated the specific conspiracy offense originally set forth in Subsection (b). See 100 Stat. at 1214. Several of the penalty provisions were also modified. The definition of a “federal interest computer” was modified to encompass either a computer exclusively used by a financial institution or the federal government or two or more computers used in committing the same offense and located in different states. See 100 Stat. at 1214. Finally, the 1986 amendments to the CFAA added an exception for law enforcement and intelligence activities related to the offenses enumerated in the statute. See 100 Stat. at 1214.

§ 3.6.4	The 1994 Amendments: The Computer Abuse Amendments Act of 1994

Congress passed the Computer Abuse Amendments Act in 1994, amending several provisions of the CFAA. See Pub. L. No. 103-322, tit. XXIX, § 290001, 108 Stat. 2097 (1994). First, amendments to Subsection (a)(5) added two new categories of offenses under that subsection. Subsection (a)(5) newly prohibited persons from
• knowingly, by way of a computer used in interstate commerce, causing the transmission of a program, information, code, or command to a computer if
– the transmission was intended to (1) damage a computer, computer system, network, information, data, or program or (2) withhold or deny the use of a computer, computer system, network, information, data, or program, and
– the transmission of the harmful program, information, code, or command occurred without the authorization of the person(s) responsible for the computer and (1) caused an aggregate loss or damages of $1,000 or more in a one-year period or (2) modified or impaired the medical examination, diagnosis, treatment, or care of one or more individuals, or
• knowingly, by way of a computer used in interstate commerce, causing the transmission of a program, information, code, or command to a computer
– with reckless disregard of a substantial and unjustifiable risk that such transmission would (1) damage or cause damage to a computer, a computer system, a network, information, data, or a program or (2) withhold or deny the use of a computer, computer services, system, network, information, data, or program, and
– if the transmission of the harmful program, information, code, or command occurred without authorization, and (1) caused an aggregate loss or damages of $1,000 or more in a one-year period or (2) modified or impaired the medical examination, diagnosis, treatment, or care of one or more individuals.
See Pub. L. No. 103-322, tit. XXIX, § 290001, 108 Stat. at 2097–98.


An amendment to Subsection (a)(3) added a condition that the use of a federal government computer to knowingly use, modify, destroy, or disclose information thereon or to prevent authorized use thereof had to “adversely” affect the use of such a computer. See Pub. L. No. 103-322, tit. XXIX, § 290001, 108 Stat. at 2099. Additionally, the 1994 amendments added a civil cause of action under the CFAA, enabling a plaintiff to seek injunctive relief or damages for a loss resulting from a violation of the CFAA. See Pub. L. No. 103-322, tit. XXIX, § 290001, 108 Stat. at 2098. Subsection (g) provided a civil cause of action for compensatory damages and injunctive or other equitable relief to any person who suffered damage or loss as a result of a violation of the CFAA. See Pub. L. No. 103-322, tit. XXIX, § 290001, 108 Stat. at 2098.

§ 3.6.5	The 1996 Amendments: The National Information Infrastructure Protection Act of 1996

The 1996 amendments to the CFAA were implemented through the National Information Infrastructure Protection Act (NIIPA). The amendments served as an acknowledgement that the ever-evolving face of technology would necessitate continual amendments to the CFAA:

As intended when the law was originally enacted, the Computer Fraud and Abuse statute facilitates addressing in a single statute the problem of computer crime, rather than identifying and amending every potentially applicable statute affected by advances in computer technology. As computers continue to proliferate in businesses and homes, and new forms of computer crimes emerge, Congress must remain vigilant to ensure that the Computer Fraud and Abuse statute is up-to-date and provides law enforcement with the necessary legal framework to fight computer crime. The NII Protection Act will likely not represent the last amendment to this statute, but is necessary and constructive legislation to deal with the current increase in computer crime.

S. Rep. No. 104-357 at 5.

Another significant amendment was the addition of the phrase “protected computers” to Subsections (a)(2), (a)(4), and (a)(5), replacing the phrase “federal interest computer” in Subsections (a)(4) and (a)(5), in order to broaden the scope of computers covered by the CFAA. (A “protected computer” is currently defined as any computer used in or affecting interstate or foreign commerce or a computer exclusively for the use of a financial institution or the federal government. 18 U.S.C. § 1030(e)(2).) The expansion of the CFAA beyond the realm of the federal government and financial institutions to every computer connected to the Internet was an acknowledgement of the breadth necessary for the CFAA to combat computer crime.


The 1996 amendments also added a seventh offense under the statute in Subsection (a)(7). The new subsection prohibited transmissions in interstate or foreign commerce containing threats to cause damage to a protected computer. See Pub. L. No. 104-294, tit. II, 110 Stat. 3488, 3492 (1996). The new provision was intended to address a rise in computer-related extortion by hackers:

The [NIIPA] would add a new subsection (a)(7) to section 1030 to address a new and emerging problem of computer-age blackmail. This is a high-tech variation on old fashioned extortion. According to the Department of Justice, threats have been made against computer systems in several instances. One can imagine situations in which hackers penetrate a system, encrypt a database and then demand money for the decoding key. This new provision would ensure law enforcement’s ability to prosecute modern-day blackmailers, who threaten harm or shut down computer networks unless their extortion demands are met.

S. Rep. No. 104-357 at 12.

§ 3.6.6	The 2001 Amendments: The USA PATRIOT Act

The USA PATRIOT Act, enacted following the attacks of September 11, 2001, further amended the CFAA by expanding its scope to combat acts of terrorism related to computers. Violations of Subsections (a)(1), (a)(5)(A)(i), and (a)(5)(B)(ii)–(v) became components of the definition of the federal crime of terrorism. See Pub. L. No. 107-56, § 808(2)(i), 115 Stat. 384 (2001). The U.S. Secret Service was also granted the authority to investigate offenses under the CFAA, along with all other agencies that previously had such authority. See Pub. L. No. 107-56, 115 Stat. at § 506(a).

First, the phrase “protected computer,” which had been an important cornerstone of the 1996 amendments and a tacit recognition of the importance of protecting not only government and financial institution computers, but all computers, was given an even broader definition. The USA PATRIOT Act added to the definition of a “protected computer” “a computer located outside the United States that is used in a manner that affects interstate or foreign commerce or communication of the United States.” See Pub. L. No. 107-56, 115 Stat. at § 814(d)(1). The USA PATRIOT Act also defined additional specific activities prohibited under Subsection (a)(5), including physical injury to a person, threats to public health or safety, and “damage affecting a computer system used by or for a government entity in furtherance of the administration of justice, national defense, or national security.” Pub. L. No. 107-56, 115 Stat. at § 814(a)(4)(B).

The 2001 amendments to the CFAA also refined civil causes of action under the statute. Civil actions were limited to certain conduct set forth in Subsection (a)(5)(B), and civil actions pursuant to Subsection (a)(5)(B)(i) were limited to economic damages. See Pub. L. No. 107-56, 115 Stat. at § 814(e)(1). Further, the “negligent design or manufacture of computer hardware, computer software, or firmware” was explicitly excluded from the conduct that could give rise to civil liability under the CFAA. Pub. L. No. 107-56, 115 Stat. at § 814(e)(2). Finally, the USA PATRIOT Act provided that the U.S. Sentencing Commission was to “amend the Federal sentencing guidelines to ensure that any individual convicted of a violation of [the CFAA] can be subjected to appropriate penalties, without regard to any mandatory minimum term of imprisonment.” Pub. L. No. 107-56, 115 Stat. at § 814(f).

§ 3.6.7	The 2002 Amendments: The Cyber Security Enhancement Act of 2002

Under the Cyber Security Enhancement Act of 2002, the U.S. Sentencing Commission was instructed to review the sentencing guidelines governing convictions under 18 U.S.C. § 1030. The purpose of the review was twofold: to ensure that the sentencing guidelines and policy statements in place at the time properly recognized the serious nature of computer crimes and to consider whether the sentencing guidelines in place at the time properly considered a number of significant factors. See Pub. L. No. 107-296, § 225(b)(2), 116 Stat. 2156 (2002). The factors to be considered were
(i) the potential and actual loss resulting from the offense;
(ii) the level of sophistication and planning involved in the offense;
(iii) whether the offense was committed for purposes of commercial advantage or private financial benefit;
(iv) whether the defendant acted with malicious intent to cause harm in committing the offense;
(v) the extent to which the offense violated the privacy rights of individuals harmed;
(vi) whether the offense involved a computer used by the government in furtherance of national defense, national security, or the administration of justice;
(vii) whether the violation was intended to or had the effect of significantly interfering with or disrupting a critical infrastructure; and
(viii) whether the violation was intended to or had the effect of creating a threat to public health or safety, or injury to any person.
Pub. L. No. 107-296, § 225(b)(2), 116 Stat. 2156.


In addition to the request that the United States Sentencing Commission review and report on the then-current CFAA sentencing guidelines, the 2002 amendments provided for enhanced penalties under Subsection (c).

§ 3.6.8	The 2008 Amendments: The Identity Theft Enforcement and Restitution Act of 2008

In 2008, through the Identity Theft Enforcement and Restitution Act, several penalty provisions of the CFAA were amended to address the use of malicious spyware, hacking, and the activities of key loggers. See Pub. L. No. 110-326, § 204, 122 Stat. 3561 (2008). The 2008 amendments also provided for forfeiture of property used in or derived from a violation of the CFAA. See Pub. L. No. 110-326 at § 208. Additionally, the criminal offense for threats to a computer in Subsection (a)(7), added by the 1996 amendments, was expanded to include extortion related to stealing data and/or publicly disclosing that data, as well as failing to repair damage caused to a computer in such circumstances. See Pub. L. No. 110-326 at § 205. Subsection (b) was amended to include language to support a conspiracy offense. See Pub. L. No. 110-326 at § 206. The definition of a “protected computer” was again amended to broaden the reach of the CFAA: as a result of the 2008 amendments, a protected computer includes not only computers used in interstate or foreign commerce, but also computers used in a manner that affects interstate or foreign commerce. See Pub. L. No. 110-326 at § 207. Finally, the U.S. Sentencing Commission was again asked to review its guidelines and policy statements to ensure that the intent of Congress in increasing penalties for computer crimes had been achieved. See Pub. L. No. 110-326 at § 209.

§ 3.6.9	Scope and Application of the CFAA Today

The consistent amendments to the CFAA since its enactment reflect the intent of Congress to continuously broaden the CFAA’s scope and application. See Guest-Tek Interactive Entm’t Inc. v. Pullen, 665 F. Supp. 2d 42, 45 (D. Mass. 2009).


EXHIBIT 3A—18 U.S.C. § 1030, Fraud and Related Activity in Connection with Computers

(a) Whoever—

(1) having knowingly accessed a computer without authorization or exceeding authorized access, and by means of such conduct having obtained information that has been determined by the United States Government pursuant to an Executive order or statute to require protection against unauthorized disclosure for reasons of national defense or foreign relations, or any restricted data, as defined in paragraph y. of section 11 of the Atomic Energy Act of 1954, with reason to believe that such information so obtained could be used to the injury of the United States, or to the advantage of any foreign nation willfully communicates, delivers, transmits, or causes to be communicated, delivered, or transmitted, or attempts to communicate, deliver, transmit or cause to be communicated, delivered, or transmitted the same to any person not entitled to receive it, or willfully retains the same and fails to deliver it to the officer or employee of the United States entitled to receive it;

(2) intentionally accesses a computer without authorization or exceeds authorized access, and thereby obtains—

(A) information contained in a financial record of a financial institution, or of a card issuer as defined in section 1602(n)[1] of title 15, or contained in a file of a consumer reporting agency on a consumer, as such terms are defined in the Fair Credit Reporting Act (15 U.S.C. 1681 et seq.);

(B) information from any department or agency of the United States; or

(C) information from any protected computer;

[1] See References in Text note below.

(3) intentionally, without authorization to access any nonpublic computer of a department or agency of the United States, accesses such a computer of that department or agency that is exclusively for the use of the Government of the United States or, in the case of a computer not exclusively for such use, is used by or for the Government of the United States and such conduct affects that use by or for the Government of the United States;

(4) knowingly and with intent to defraud, accesses a protected computer without authorization, or exceeds authorized access, and by means of such conduct furthers the intended fraud and obtains anything of value, unless the object of the fraud and the thing obtained consists only of the use of the computer and the value of such use is not more than $5,000 in any 1-year period;


(5)(A) knowingly causes the transmission of a program, information, code, or command, and as a result of such conduct, intentionally causes damage without authorization, to a protected computer;

(B) intentionally accesses a protected computer without authorization, and as a result of such conduct, recklessly causes damage; or

(C) intentionally accesses a protected computer without authorization, and as a result of such conduct, causes damage and loss.[2]

[2] So in original. The period probably should be a semicolon.

(6) knowingly and with intent to defraud traffics (as defined in section 1029) in any password or similar information through which a computer may be accessed without authorization, if—

(A) such trafficking affects interstate or foreign commerce; or

(B) such computer is used by or for the Government of the United States;[3]

[3] So in original. Probably should be followed by “or”.

(7) with intent to extort from any person any money or other thing of value, transmits in interstate or foreign commerce any communication containing any— (A) threat to cause damage to a protected computer; (B) threat to obtain information from a protected computer without authorization or in excess of authorization or to impair the confidentiality of information obtained from a protected computer without authorization or by exceeding authorized access; or (C) demand or request for money or other thing of value in relation to damage to a protected computer, where such damage was caused to facilitate the extortion; shall be punished as provided in subsection (c) of this section. (b) Whoever conspires to commit or attempts to commit an offense under subsection (a) of this section shall be punished as provided in subsection (c) of this section. (c) The punishment for an offense under subsection (a) or (b) of this section is— (1)(A) a fine under this title or imprisonment for not more than ten years, or both, in the case of an offense under subsection (a)(1) of this section which does not occur after a conviction for another offense under this section, or an attempt to commit an offense punishable under this subparagraph; and (B) a fine under this title or imprisonment for not more than twenty years, or both, in the case of an offense under subsection (a)(1) of this section which

MCLE, Inc. | 2nd Edition 2018

3–25

Data Security and Privacy in Massachusetts

occurs after a conviction for another offense under this section, or an attempt to commit an offense punishable under this subparagraph; (2)(A) except as provided in subparagraph (B), a fine under this title or imprisonment for not more than one year, or both, in the case of an offense under subsection (a)(2), (a)(3), or (a)(6) of this section which does not occur after a conviction for another offense under this section, or an attempt to commit an offense punishable under this subparagraph; (B) a fine under this title or imprisonment for not more than 5 years, or both, in the case of an offense under subsection (a)(2), or an attempt to commit an offense punishable under this subparagraph, if— (i) the offense was committed for purposes of commercial advantage or private financial gain; (ii) the offense was committed in furtherance of any criminal or tortious act in violation of the Constitution or laws of the United States or of any State; or (iii) the value of the information obtained exceeds $5,000; and (C) a fine under this title or imprisonment for not more than ten years, or both, in the case of an offense under subsection (a)(2), (a)(3) or (a)(6) of this section which occurs after a conviction for another offense under this section, or an attempt to commit an offense punishable under this subparagraph; (3)(A) a fine under this title or imprisonment for not more than five years, or both, in the case of an offense under subsection (a)(4) or (a)(7) of this section which does not occur after a conviction for another offense under this section, or an attempt to commit an offense punishable under this subparagraph; and (B) a fine under this title or imprisonment for not more than ten years, or both, in the case of an offense under subsection (a)(4),4 or (a)(7) of this section which occurs after a conviction for another offense under this section, or an attempt to commit an offense punishable under this subparagraph; 4

4 So in original. The comma probably should not appear.

(4)(A) except as provided in subparagraphs (E) and (F), a fine under this title, imprisonment for not more than 5 years, or both, in the case of—
(i) an offense under subsection (a)(5)(B), which does not occur after a conviction for another offense under this section, if the offense caused (or, in the case of an attempted offense, would, if completed, have caused)—
(I) loss to 1 or more persons during any 1-year period (and, for purposes of an investigation, prosecution, or other proceeding brought by the United States only, loss resulting from a related course of conduct affecting 1 or more other protected computers) aggregating at least $5,000 in value;
(II) the modification or impairment, or potential modification or impairment, of the medical examination, diagnosis, treatment, or care of 1 or more individuals;
(III) physical injury to any person;
(IV) a threat to public health or safety;
(V) damage affecting a computer used by or for an entity of the United States Government in furtherance of the administration of justice, national defense, or national security; or
(VI) damage affecting 10 or more protected computers during any 1-year period; or
(ii) an attempt to commit an offense punishable under this subparagraph;
(B) except as provided in subparagraphs (E) and (F), a fine under this title, imprisonment for not more than 10 years, or both, in the case of—
(i) an offense under subsection (a)(5)(A), which does not occur after a conviction for another offense under this section, if the offense caused (or, in the case of an attempted offense, would, if completed, have caused) a harm provided in subclauses (I) through (VI) of subparagraph (A)(i); or
(ii) an attempt to commit an offense punishable under this subparagraph;
(C) except as provided in subparagraphs (E) and (F), a fine under this title, imprisonment for not more than 20 years, or both, in the case of—
(i) an offense or an attempt to commit an offense under subparagraphs (A) or (B) of subsection (a)(5) that occurs after a conviction for another offense under this section; or
(ii) an attempt to commit an offense punishable under this subparagraph;
(D) a fine under this title, imprisonment for not more than 10 years, or both, in the case of—
(i) an offense or an attempt to commit an offense under subsection (a)(5)(C) that occurs after a conviction for another offense under this section; or
(ii) an attempt to commit an offense punishable under this subparagraph;


(E) if the offender attempts to cause or knowingly or recklessly causes serious bodily injury from conduct in violation of subsection (a)(5)(A), a fine under this title, imprisonment for not more than 20 years, or both;
(F) if the offender attempts to cause or knowingly or recklessly causes death from conduct in violation of subsection (a)(5)(A), a fine under this title, imprisonment for any term of years or for life, or both; or
(G) a fine under this title, imprisonment for not more than 1 year, or both, for—
(i) any other offense under subsection (a)(5); or
(ii) an attempt to commit an offense punishable under this subparagraph.
(d)(1) The United States Secret Service shall, in addition to any other agency having such authority, have the authority to investigate offenses under this section.
(2) The Federal Bureau of Investigation shall have primary authority to investigate offenses under subsection (a)(1) for any cases involving espionage, foreign counterintelligence, information protected against unauthorized disclosure for reasons of national defense or foreign relations, or Restricted Data (as that term is defined in section 11y of the Atomic Energy Act of 1954 (42 U.S.C. 2014(y)), except for offenses affecting the duties of the United States Secret Service pursuant to section 3056(a) of this title.
(3) Such authority shall be exercised in accordance with an agreement which shall be entered into by the Secretary of the Treasury and the Attorney General.
(e) As used in this section—
(1) the term “computer” means an electronic, magnetic, optical, electrochemical, or other high speed data processing device performing logical, arithmetic, or storage functions, and includes any data storage facility or communications facility directly related to or operating in conjunction with such device, but such term does not include an automated typewriter or typesetter, a portable hand held calculator, or other similar device;
(2) the term “protected computer” means a computer—
(A) exclusively for the use of a financial institution or the United States Government, or, in the case of a computer not exclusively for such use, used by or for a financial institution or the United States Government and the conduct constituting the offense affects that use by or for the financial institution or the Government; or
(B) which is used in or affecting interstate or foreign commerce or communication, including a computer located outside the United States that is used in a manner that affects interstate or foreign commerce or communication of the United States;
(3) the term “State” includes the District of Columbia, the Commonwealth of Puerto Rico, and any other commonwealth, possession or territory of the United States;
(4) the term “financial institution” means—
(A) an institution, with deposits insured by the Federal Deposit Insurance Corporation;
(B) the Federal Reserve or a member of the Federal Reserve including any Federal Reserve Bank;
(C) a credit union with accounts insured by the National Credit Union Administration;
(D) a member of the Federal home loan bank system and any home loan bank;
(E) any institution of the Farm Credit System under the Farm Credit Act of 1971;
(F) a broker-dealer registered with the Securities and Exchange Commission pursuant to section 15 of the Securities Exchange Act of 1934;
(G) the Securities Investor Protection Corporation;
(H) a branch or agency of a foreign bank (as such terms are defined in paragraphs (1) and (3) of section 1(b) of the International Banking Act of 1978); and
(I) an organization operating under section 25 or section 25(a) of the Federal Reserve Act;
(5) the term “financial record” means information derived from any record held by a financial institution pertaining to a customer’s relationship with the financial institution;
(6) the term “exceeds authorized access” means to access a computer with authorization and to use such access to obtain or alter information in the computer that the accesser is not entitled so to obtain or alter;
(7) the term “department of the United States” means the legislative or judicial branch of the Government or one of the executive departments enumerated in section 101 of title 5;
(8) the term “damage” means any impairment to the integrity or availability of data, a program, a system, or information;


(9) the term “government entity” includes the Government of the United States, any State or political subdivision of the United States, any foreign country, and any state, province, municipality, or other political subdivision of a foreign country;
(10) the term “conviction” shall include a conviction under the law of any State for a crime punishable by imprisonment for more than 1 year, an element of which is unauthorized access, or exceeding authorized access, to a computer;
(11) the term “loss” means any reasonable cost to any victim, including the cost of responding to an offense, conducting a damage assessment, and restoring the data, program, system, or information to its condition prior to the offense, and any revenue lost, cost incurred, or other consequential damages incurred because of interruption of service; and
(12) the term “person” means any individual, firm, corporation, educational institution, financial institution, governmental entity, or legal or other entity.
(f) This section does not prohibit any lawfully authorized investigative, protective, or intelligence activity of a law enforcement agency of the United States, a State, or a political subdivision of a State, or of an intelligence agency of the United States.
(g) Any person who suffers damage or loss by reason of a violation of this section may maintain a civil action against the violator to obtain compensatory damages and injunctive relief or other equitable relief. A civil action for a violation of this section may be brought only if the conduct involves 1 of the factors set forth in subclauses5 (I), (II), (III), (IV), or (V) of subsection (c)(4)(A)(i). Damages for a violation involving only conduct described in subsection (c)(4)(A)(i)(I) are limited to economic damages. No action may be brought under this subsection unless such action is begun within 2 years of the date of the act complained of or the date of the discovery of the damage. No action may be brought under this subsection for the negligent design or manufacture of computer hardware, computer software, or firmware.

5 So in original. Probably should be “subclause”.

(h) The Attorney General and the Secretary of the Treasury shall report to the Congress annually, during the first 3 years following the date of the enactment of this subsection, concerning investigations and prosecutions under subsection (a)(5).
(i)(1) The court, in imposing sentence on any person convicted of a violation of this section, or convicted of conspiracy to violate this section, shall order, in addition to any other sentence imposed and irrespective of any provision of State law, that such person forfeit to the United States—
(A) such person’s interest in any personal property that was used or intended to be used to commit or to facilitate the commission of such violation; and
(B) any property, real or personal, constituting or derived from, any proceeds that such person obtained, directly or indirectly, as a result of such violation.
(2) The criminal forfeiture of property under this subsection, any seizure and disposition thereof, and any judicial proceeding in relation thereto, shall be governed by the provisions of section 413 of the Comprehensive Drug Abuse Prevention and Control Act of 1970 (21 U.S.C. 853), except subsection (d) of that section.
(j) For purposes of subsection (i), the following shall be subject to forfeiture to the United States and no property right shall exist in them:
(1) Any personal property used or intended to be used to commit or to facilitate the commission of any violation of this section, or a conspiracy to violate this section.
(2) Any property, real or personal, which constitutes or is derived from proceeds traceable to any violation of this section, or a conspiracy to violate this section6

6 So in original. Probably should be followed by a period.


CHAPTER 4

Fair Credit Reporting Act; Fair and Accurate Credit Transactions Act

Gary Klein, Esq.
Boston

Corinne A. Reed, Esq.
Burns & Farrey, PC, Boston

Jared Rinehimer, Esq.
Boston

§ 4.1 Overview: FCRA and FACTA
§ 4.1.1 History of the FCRA and FACTA
§ 4.1.2 Addressing Incorrect Information on Consumer Reports and CRA Liability
§ 4.2 What the FCRA and FACTA Protect and Render Private
§ 4.2.1 Truncation of Credit and Debit Card Numbers and Expiration Dates on Printed Receipts
§ 4.2.2 Consumers Can Request a CRA Truncate the First Five Digits of Their Social Security Numbers
§ 4.2.3 Restrictions on Medical Information Appearing in Consumer Reports
§ 4.2.4 Restrictions on Employers’ Use of a Consumer Report
§ 4.2.5 Red Flag Guidelines
§ 4.2.6 Restrictions on Issuance of an Additional or New Card when Change of Address Notification Filed
§ 4.2.7 Address Discrepancies
§ 4.2.8 Reports Must Be Used for a Permissible Purpose
§ 4.2.9 CRAs Must Use Reasonable Procedures to Assure Maximum Possible Accuracy
§ 4.3 What Happens If the Private Data Is Used and How to Deal with Consumer Reporting Problems
§ 4.3.1 What Identity Theft Is and Why It Is a Problem
§ 4.3.2 Information Available to Identity Theft Victims
§ 4.3.3 Fraud Alerts
(a) Initial Fraud Alert or One-Call Fraud Alert
(b) Extended Fraud Alert
(c) Active Duty Alert
(d) Effects of the Three Alerts on Users
(e) Liability of CRAs and Users and Preemption
(f) Security Freezes
§ 4.3.4 Blocking Fraudulent Information
(a) CRAs
(b) Furnishers

Scope Note
This chapter provides insight into two federal laws pertaining to credit, namely the Fair Credit Reporting Act and the Fair and Accurate Credit Transactions Act. It begins with overviews of the laws, and then provides practical information on what they protect and render private. The chapter features a section on what to do when private data is used, and how to deal with credit reporting problems.

§ 4.1 OVERVIEW: FCRA AND FACTA

§ 4.1.1 History of the FCRA and FACTA

The Fair Credit Reporting Act (FCRA), 15 U.S.C. §§ 1681–1681x, was enacted by Congress in 1970. The purpose of the FCRA is to protect consumers by regulating the consumer reporting practices and liabilities of both the private companies that control consumer reporting (known as “CRAs”) and the entities (known as “furnishers”) that are the source of the information that appears on consumer reports. Another goal of the law is to protect consumer privacy. See Safeco Ins. Co. of Am. v. Burr, 551 U.S. 47, 52 (2007) (Congress enacted FCRA “to ensure fair and accurate credit reporting, promote efficiency in the banking system, and protect consumer privacy”).

Under the FCRA, CRAs collect consumer credit data from furnishers. A “credit report” is a form of “consumer report,” and the terms are often used interchangeably. CRAs must use reasonable procedures to ensure that the data is accurate and that they have fairly compiled the contents of a consumer’s credit report. Consumers who disagree with any information in their individual consumer report may submit a dispute to the CRA issuing the credit report in question. The FCRA requires CRAs to conduct a reasonable investigation upon receiving the consumer’s notice of dispute and to report their findings to the consumer. The FCRA is an important tool to protect a consumer’s financial interest from incorrect credit information and to assist victims of identity theft.

The FCRA has been amended several times since it became effective on April 25, 1971. The first major amendment to the FCRA was the Consumer Credit Reporting Reform Act of 1996 (hereinafter the Reform Act of 1996), passed on September 30, 1996. The Reform Act of 1996 made several improvements to the FCRA but also included setbacks for consumers. Improvements to the FCRA included a thirty-day deadline for CRAs to reinvestigate a consumer dispute and protections against reinsertion of inaccurate information. A major improvement of the 1996 amendment established an obligation on furnishers to reinvestigate disputed information and to fix mistakes after the consumer has notified the CRA of any inaccuracies. Prior to the 1996 amendment, the FCRA applied only to CRAs. Drawbacks of the Reform Act of 1996 consisted of provisions that allowed affiliates to share consumer reports, limited the opt-out time from prescreening lists, and limited preemption of stronger state laws on credit reporting.

The next major amendment to the FCRA occurred in 2003, when Congress passed the Fair and Accurate Credit Transactions Act (FACTA). The purposes of this amendment were to further address issues of preemption of state laws, to further assist consumers with credit reporting problems, and to add provisions to help consumers manage the growing problem of identity theft. To help consumers with consumer reporting problems, the FACTA amendments provided consumers with one free report annually from each of the three big nationwide CRAs (Experian, TransUnion, and Equifax) and required furnishers to have reasonable procedures in place to prevent refurnishing inaccurate information. Other FACTA amendments included an extension of the opt-out period for prescreened lists from two years to five years and an amendment to the statute of limitations so that a consumer may bring a claim for up to two years from the date of discovery of the violation, instead of from the date of the violation. The FACTA amendments also required businesses to truncate credit card numbers on receipts and CRAs to truncate Social Security numbers on consumer reports if requested by the consumer.

In response to the increased occurrence of identity theft, the FACTA amendments to the FCRA added provisions that allow consumers to place fraud alerts on their files and to block information that is the result of identity theft. The FACTA amendments also protect against identity theft by creating a rule that requires financial institutions to properly dispose of consumer information. In further response to identity theft, the FACTA amendments created the red flag rule, which calls for federal banking agencies and the Federal Trade Commission (FTC) to jointly create regulations that provide guidelines to financial institutions and creditors regarding identity theft. One aspect of these guidelines is how to handle change-of-address notice discrepancies. (These protections against identity theft are discussed in further detail later in this chapter.) Unfortunately, the FACTA amendments also established various permanent preemptions of stronger state consumer reporting laws.

Another important amendment to the FCRA occurred in 2010, when Congress passed the Dodd-Frank Wall Street Reform and Consumer Protection Act (the Dodd-Frank Act). One significant change to the FCRA under the Dodd-Frank Act transferred much of the enforcement and federal rule-making authority from the FTC to the Consumer Financial Protection Bureau (CFPB). This transfer of rule-making authority does not include the red flag rule or the disposal of consumer information rule, both of which stay with the FTC. 15 U.S.C. §§ 1681m(e)(1)(A), 1681w. An additional change the Dodd-Frank Act made to the FCRA is a requirement that users of consumer reports disclose the consumer’s credit score and related information when the report results in an adverse action, such as denial of a loan, or when the report is used for risk-based pricing. 15 U.S.C. § 1681m(a)(2), (h)(5)(E).

§ 4.1.2 Addressing Incorrect Information on Consumer Reports and CRA Liability

Though this chapter focuses on identity theft under the FCRA, below is a brief summary of what happens when a consumer disputes information on his or her consumer report that is not necessarily the result of identity theft, and what liability attaches for noncompliance with the FCRA. For more information on these issues, it is advisable to consult the CFPB website for guides, manuals, and bulletins; the FTC website for staff summaries and staff opinion letters; the Federal Deposit Insurance Corporation website for institution letters; and Congressional hearing records. Another important resource is the National Consumer Law Center’s manual, Fair Credit Reporting (8th ed. 2013).

Consumers must take certain steps if there is incorrect information on their consumer reports. Those steps are, in most instances, a prerequisite to imposing liability on CRAs and furnishers. When a consumer obtains his or her consumer report and discovers that it contains incorrect information, the consumer’s first step is to notify the CRA of the inaccurate information listed on the report. Under 15 U.S.C. § 1681i, the CRA is required to conduct a reasonable investigation upon receiving a consumer’s notice of a disputed claim. While there is no clear-cut definition of what constitutes a reasonable investigation, courts have held that, while a “‘reasonable investigation will often depend on the circumstances of a particular dispute, it is clear that a reasonable investigation must mean more than simply including public documents in a consumer report or making only a cursory investigation into the reliability of information that is reported to potential creditors.’” Schweitzer v. Equifax Info. Solutions LLC, 441 Fed. Appx. 896, 904 (3d Cir. 2011) (quoting Cortez v. Trans Union, 617 F.3d 688, 713 (3d Cir. 2010)). Courts have understood the duties under 15 U.S.C. § 1681i(a) to require a CRA to do more than merely repeat the information received from furnishers or other sources; the additional work should include investigating and verifying the accuracy of its initial source of information. Cortez v. Trans Union, 617 F.3d at 713 (quoting Cushman v. Trans Union, 115 F.3d 220, 225 (3d Cir. 1997)). Even though there is no concrete definition of a “reasonable investigation,” one of the basic elements of a CRA’s duty to investigate is to contact the furnisher of the disputed information and report back to the consumer the results of that investigation. 15 U.S.C. § 1681i(a)(2), (6).

The FCRA imposes a timeline within which the CRA must conduct its investigation. Upon receiving the consumer’s dispute, the CRA has thirty days to conduct its investigation. 15 U.S.C. § 1681i(a)(1)(A). The CRA must contact the furnisher of the disputed information within five business days. 15 U.S.C. § 1681i(a)(2). The furnisher must then conduct an investigation of the disputed information and report back the results of its investigation to the CRA prior to the expiration of the CRA’s own time period to investigate the dispute. 15 U.S.C. § 1681s-2(b)(2) (furnisher must complete all investigations, review, and reports “before the expiration of the period under section 1681i(a)(1) of this title within which the [CRA] is required to complete actions required by that section regarding that information”). At day thirty, the CRA must have completed its investigation of the dispute. 15 U.S.C. § 1681i(a)(1)(A). After the CRA completes its investigation, it has five business days to notify the consumer of the results of the investigation by mail, unless the consumer has authorized the CRA to provide the results by another method. 15 U.S.C. § 1681i(a)(6)(A). The FCRA imposes specific requirements for the CRA’s written notice to the consumer, including

• a written statement that the investigation is complete;

• a copy of the corrected report;

• if requested, a description of the investigation procedure and the name and address of the furnisher;

• notice of the right to add a written statement; and

• if information is deleted, a notice that the consumer has the right to request that the CRA notify certain users who previously received a copy of the consumer’s report.

15 U.S.C. § 1681i(a)(6)(B). If the dispute is not resolved, the consumer has a right to file a statement of dispute with the CRA. 15 U.S.C. § 1681i(b). Subsequent consumer reports must include a notation that the information is in dispute as well as a copy of the consumer’s statement (or an accurate summary). 15 U.S.C. § 1681i(c).

There are a few instances where the timeline for the CRA to complete its investigation is longer than the typical thirty-day period. For example, if the consumer sends his or her dispute after receiving a free annual report from a nationwide CRA via the centralized source, the CRA has forty-five days to complete its investigation. 15 U.S.C. § 1681j(a)(3). Additionally, the CRA has forty-five days to complete its investigation if the consumer submits new information to the CRA within thirty days of the CRA receiving the dispute, but this is a one-time-only, fifteen-day extension. 15 U.S.C. § 1681i(a)(1)(B). The FCRA also carves out exceptions to the typical thirty-day timeline that shorten the investigation period. The CRA can terminate its investigation prior to the end of the thirty-day period if it finds the consumer’s disputed information is inaccurate or cannot be verified. 15 U.S.C. § 1681i(a)(5). The CRA may also resolve the dispute through an expedited resolution process by deleting the information within three business days of receiving the dispute. 15 U.S.C. § 1681i(a)(8). If the CRA determines that the dispute is frivolous or irrelevant, the CRA must notify the consumer of this determination within five business days. 15 U.S.C. § 1681i(a)(3).
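Because these deadlines interlock, it can help to see them laid out procedurally. The following Python sketch is illustrative only; the function and parameter names are invented, and the five-day steps, which the statute counts in business days, are approximated here as calendar days.

    from datetime import date, timedelta

    def dispute_deadlines(dispute_received: date,
                          via_free_annual_report: bool = False,
                          new_info_from_consumer: bool = False) -> dict:
        # Base reinvestigation period is 30 days, 15 U.S.C. § 1681i(a)(1)(A);
        # 45 days for a dispute following a free annual report, § 1681j(a)(3),
        # or after the one-time 15-day extension for new information supplied
        # by the consumer during the period, § 1681i(a)(1)(B).
        days = 45 if (via_free_annual_report or new_info_from_consumer) else 30
        complete_by = dispute_received + timedelta(days=days)
        return {
            "forward_dispute_to_furnisher_by": dispute_received + timedelta(days=5),
            "complete_investigation_by": complete_by,
            "notify_consumer_of_results_by": complete_by + timedelta(days=5),
        }

    print(dispute_deadlines(date(2018, 3, 1)))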


Where a CRA fails to respond to a consumer’s dispute, it violates federal law. This violation gives rise to civil liability for noncompliance with the FCRA if the consumer alleges and proves a concrete and particularized harm to him or her flowing from the violation. Spokeo v. Robins, 136 S. Ct. 1540, 1549 (2016). Under Spokeo, “a bare procedural violation” is insufficient to warrant standing. Spokeo v. Robins, 136 S. Ct. at 1549. “If a consumer reporting agency negligently violates a duty imposed by the FCRA, actual damages, costs and fees are available. 15 U.S.C. § 1681o. In the case of a willful violation of the statute, punitive damages are also available. 15 U.S.C. § 1681n.” Barrepski v. Capital One Bank (U.S.A.) N.A., No. 11-30160, 2014 WL 935983, at *4 (D. Mass. Mar. 7, 2014). “A ‘willful’ violation is one that was done knowingly or recklessly.” Haley v. TalentWise, Inc., 9 F. Supp. 3d 1188, 1194 (W.D. Wash. 2014) (citing Safeco Ins. Co. of Am. v. Burr, 551 U.S. at 57). In 2007, the Supreme Court defined what constitutes recklessness for willful violations of the FCRA:

A company subject to FCRA does not act in reckless disregard of it unless the action is not only a violation under a reasonable reading of the statute’s terms, but shows that the company ran a risk of violating the law substantially greater than the risk associated with a reading that was merely careless.

Safeco Ins. Co. of Am. v. Burr, 551 U.S. at 70.

There are three elements a consumer plaintiff must establish in order to pursue damages under Section 1681i: “(1) the [CRA’s] reinvestigation was unreasonable, (2) the plaintiffs ‘suffered damages as a result of the inaccurate information,’ and (3) a causal relationship exists between the unreasonable reinvestigation and the loss of credit or some other harm.” Barrepski v. Capital One Bank (U.S.A.) N.A., 2014 WL 935983, at *4 (citing Ruffin-Thompkins v. Experian Info. Solutions, Inc., 422 F.3d 603, 608 (7th Cir. 2005)).

While liability attaches for noncompliance with the FCRA, courts have differed on the determination of CRA liability if the disputed information on the consumer’s report is accurate. Compare Agu v. Rhea, No. 09-4732, 2010 WL 5186839, at *6–7 (E.D.N.Y. Dec. 15, 2010) (even though information was accurate, plaintiff could state a claim under Section 1681i(a)(6) for failure to notify consumer of result, but dismissing claim due to lack of damages and failure to plead willfulness), with Carvalho v. Equifax Info. Serv., 588 F. Supp. 2d 1089 (N.D. Cal. Dec. 2, 2008) (no liability for failure to provide notice when information was accurate), aff’d on other grounds, 629 F.3d 876 (9th Cir. 2010), and DeAndrade v. Trans Union LLC, 523 F.3d 61, 67 (1st Cir. 2008) (“it is difficult to see how a plaintiff could prevail on a claim for damages under § 1681i without a showing that the disputed information disclosed by the credit agency was, in fact, inaccurate”).

Liability may also attach for a CRA’s predispute inaccuracies in reporting a consumer’s information. Under the FCRA, a CRA is required to “follow reasonable procedures to assure maximum possible accuracy of the information” about the consumer when creating the consumer’s credit report. 15 U.S.C. § 1681e(b); see also TRW Inc. v. Andrews, 534 U.S. 19, 23 (2001) (FCRA “requir[es] [CRAs] to maintain reasonable procedures designed to assure maximum possible accuracy of the information contained in credit reports”). The focus of this claim is whether the CRA employs reasonable procedures when creating the credit report. In order to successfully pursue this claim, the consumer must establish both that the CRA failed to use reasonable procedures and that the report was in fact inaccurate; otherwise the claim fails. Eller v. Trans Union, LLC, 739 F.3d 467, 473 (10th Cir. 2013) (citing Cassara v. DAC Servs., Inc., 276 F.3d 1210, 1217 (10th Cir. 2002)); Soutter v. Equifax Info. Servs., LLC, 498 Fed. Appx. 260, 264 (4th Cir. 2012) (citing Dalton v. Capital Associated Indus., 257 F.3d 409, 415 (4th Cir. 2001)). States are preempted from regulating on this issue. 15 U.S.C. § 1681t(b)(1)(B); see also Spokeo v. Robins, 136 S. Ct. 1540, 1549 (2016).

Furnishers of disputed information also have duties that are triggered when they are notified either by the consumer directly or by the CRA. If the consumer notifies the furnisher directly, the furnisher has a duty to conduct an investigation, review all relevant information the consumer provides, and complete the investigation within the same time a CRA has to complete an investigation under 15 U.S.C. § 1681i(a)(1); if the result of the investigation is that the information was inaccurate, the furnisher must notify all CRAs to which it reported the information. 15 U.S.C. § 1681s-2(a)(8)(E). Although the furnisher has this duty to investigate a direct dispute, there is no private right of action for the consumer to pursue if the furnisher fails to perform its duty under the FCRA. 15 U.S.C. § 1681s-2(c). States are also preempted from regulating this duty of the furnisher to address direct disputes by the consumer. 15 U.S.C. §§ 1681s-2(d), 1681t(b)(1)(F).

When a consumer submits a valid dispute to the CRA about the information contained in the consumer report, this also triggers a duty for the furnisher of the allegedly inaccurate information. Once notified by the CRA, the furnisher is required to conduct its own reasonable investigation, review all relevant information the CRA provides, report to the CRA the results of its investigation, and report to the CRAs to which it furnished information if the investigation shows the information is incomplete or inaccurate. 15 U.S.C. § 1681s-2(b)(1)(A)–(b)(1)(D). If the furnisher finds that the information is inaccurate, incomplete, or unverified, it must either report to the CRA a modification of the information, delete the information, or permanently block the information from further reporting. 15 U.S.C. § 1681s-2(b)(1)(E); see Chiang v. Verizon New England Inc., 595 F.3d 26, 37 (1st Cir. 2010) (citing Johnson v. MBNA Am. Bank, NA, 357 F.3d 426, 430–31 (4th Cir. 2004); Gorman v. Wolpoff & Abramson, LLP, 584 F.3d 1147, 1156–57 (9th Cir. 2009)) (finding that a furnisher’s investigation must be reasonable). The furnisher must complete its investigation prior to the expiration of the CRA’s own time period to investigate the dispute, which is discussed in detail above. 15 U.S.C. § 1681s-2(b)(2). The furnisher is subject to civil liability for negligent or willful noncompliance with the FCRA if it fails to adequately investigate the disputed information, to accurately report the results of its investigation to the initiating CRA, or to otherwise meet any of its five investigation responsibilities in this process. 15 U.S.C. §§ 1681n, 1681o; Boggio v. USAA Fed. Sav. Bank, 696 F.3d 611, 618 (6th Cir. 2012) (finding that “FCRA expressly creates a private right of action against a furnisher who fails to satisfy one of five duties identified in § 1681s-2(b)”); Barrepski v. Capital One Bank, 439 Fed. Appx. 11, 12 (1st Cir. 2011) (citing Chiang v. Verizon New England Inc., 595 F.3d 26, 35 n.8 (1st Cir. 2010) (“The furnisher’s obligation to conduct an investigation, and its period of liability, begins only upon the furnisher’s receipt of notice from the CRA; notice directly from the consumer is not enough.”)). Except for Massachusetts and California, states are preempted from regulating the issues covered by 15 U.S.C. § 1681s-2(b). 15 U.S.C. § 1681t(b)(1)(F) (states are preempted from regulating the subject matter of 15 U.S.C. § 1681s-2 except for G.L. c. 93, § 54A and Cal. Civ. Code § 1785.25(a)).

§ 4.2 WHAT THE FCRA AND FACTA PROTECT AND RENDER PRIVATE

The FCRA, as amended by FACTA, includes several provisions that protect and render consumer information private. Protecting this private information is important in order to help prevent identity theft. The provisions are discussed below.

§ 4.2.1 Truncation of Credit and Debit Card Numbers and Expiration Dates on Printed Receipts

FACTA amended the FCRA to provide that a merchant cannot print more than the last five digits of a credit or debit card number, or the expiration date of the card, on any printed receipt provided to the customer at the time of sale. 15 U.S.C. § 1681c(g)(1). This limitation applies only to electronically printed receipts, not to handwritten receipts, card imprints, or card copies. 15 U.S.C. § 1681c(g)(2). With respect to e-mailed receipts and transactions on the internet, two courts of appeals have held that receipts sent to consumers by e-mail are not subject to this requirement, as they are not “print[ed] . . . at the point of sale or transaction.” 15 U.S.C. § 1681c(g)(1). See Bormes v. United States, 759 F.3d 793, 797–98 (7th Cir. 2014); Shlahtichman v. 1-800 Contacts, Inc., 615 F.3d 794, 802 (7th Cir. 2010); Simonoff v. Expedia, Inc., 643 F.3d 1202, 1204, 1208–10 (9th Cir. 2011) (affirming dismissal and finding that FACTA does not apply to e-mailed receipts).

Consumers may pursue an action against the merchant for violations and may recover actual damages, costs, and attorney fees. 15 U.S.C. §§ 1681n(a), 1681o(a). For willful violations, consumers may also recover between $100 and $1,000 in damages (or actual damages) and “punitive damages as the court may allow.” 15 U.S.C. § 1681n(a)(1)(A), (a)(3). The FCRA preempts states from imposing any new requirements or prohibitions regulating the truncation of credit and debit card numbers and expiration dates from printed receipts. 15 U.S.C. § 1681t(b)(1)(E), (b)(5)(A).
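The mechanics of the truncation rule are simple enough to express in a few lines of code. The Python sketch below is illustrative only; the function name and masking format are invented, not drawn from the statute. It keeps at most the last five digits of the card number, and a compliant receipt template would also omit the expiration date entirely.

    def mask_receipt_card_number(card_number: str) -> str:
        # Keep at most the last five digits; mask everything else.
        digits = [c for c in card_number if c.isdigit()]
        visible = "".join(digits[-5:])
        return "*" * (len(digits) - len(visible)) + visible

    # A sixteen-digit number prints as "***********54321"; the
    # expiration date simply never appears on the printed receipt.
    print(mask_receipt_card_number("4111 1111 1115 4321"))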

§ 4.2.2 Consumers Can Request a CRA Truncate the First Five Digits of Their Social Security Numbers

Under the FCRA, consumers have the right to request that a CRA truncate the first five digits of the consumer’s Social Security number when issuing consumer reports. 15 U.S.C. § 1681g(a)(1)(A). CRAs are liable for negligent or willful noncompliance with the FCRA if they fail to truncate the first five digits of the consumer’s Social Security number when requested. Daniels v. Experian Info. Solutions, Inc., No. CV 109-017, 2009 WL 1811548, at *1 (S.D. Ga. June 24, 2009) (citing Lenox v. Equifax Info. Servs., No. 05-1501-AA, 2007 WL 1406914, at *7–8 (D. Or. May 7, 2007)). The FCRA preempts states from regulating the truncation of Social Security numbers on consumer reports. 15 U.S.C. § 1681t(b)(5)(D).
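A minimal sketch of the requested truncation, again with invented names, might look as follows; only the last four digits remain visible once the first five are suppressed.

    def truncate_ssn_for_report(ssn: str) -> str:
        # Suppress the first five digits; only the last four remain.
        digits = "".join(c for c in ssn if c.isdigit())
        return "XXX-XX-" + digits[-4:]

    print(truncate_ssn_for_report("123-45-6789"))  # XXX-XX-6789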

§ 4.2.3 Restrictions on Medical Information Appearing in Consumer Reports

The FCRA prohibits consumer medical information from appearing in consumer reports furnished for employment purposes or in connection with credit or insurance transactions, with some exceptions. 15 U.S.C. § 1681b(g)(1). If the medical information is provided in connection with an insurance transaction, it may appear in the consumer report, but only with the consumer’s affirmative consent. 15 U.S.C. § 1681b(g)(1)(A). If the consumer report is provided for employment purposes or in connection with a credit transaction, medical information may appear on the consumer report if it is “relevant to process or effect the employment or credit transaction” and if the consumer provides direct written consent to release the information for a specific use. 15 U.S.C. § 1681b(g)(1)(B). Medical information that relates solely to financial transactions arising from medical debts may appear on the consumer’s report, so long as there is no indication of the medical services, products, or devices received. 15 U.S.C. § 1681b(g)(1)(C). Redisclosure of the medical information is also prohibited unless “necessary to carry out the purpose for which the information was initially disclosed, or as otherwise permitted by statute, regulation, or order.” 15 U.S.C. § 1681b(g)(4).
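The three statutory paths can be summarized as a small decision routine. The Python sketch below is a simplification with invented labels and parameter names, not a statement of the regulation’s full conditions.

    def medical_info_may_appear(purpose: str, consent: bool,
                                relevant: bool, services_masked: bool) -> bool:
        # Simplified reading of 15 U.S.C. § 1681b(g)(1); labels invented.
        if purpose == "insurance":
            return consent                        # § 1681b(g)(1)(A)
        if purpose in ("employment", "credit"):
            # (B) also requires the consent to be written and specific.
            return relevant and consent           # § 1681b(g)(1)(B)
        if purpose == "medical_debt_reporting":
            return services_masked                # § 1681b(g)(1)(C)
        return False

    print(medical_info_may_appear("credit", True, True, False))  # True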

§ 4.2.4 Restrictions on Employers’ Use of a Consumer Report

An employer can access an employee’s (or potential employee’s) consumer report for employment purposes when it meets the “permissible purposes” limitations discussed below. 15 U.S.C. § 1681b(a)(3)(B). The FCRA defines “employment purposes” with respect to a consumer report to mean “a report used for the purpose of evaluating a consumer for employment, promotion, reassignment or retention as an employee.” 15 U.S.C. § 1681a(h). Communications that relate to employee misconduct or compliance investigations are not a “consumer report,” 15 U.S.C. §§ 1681a(d)(2)(D), 1681a(y), nor are communications made by third parties to an employer for the purposes of procuring an employee or worker, provided the subject has consented to such communications, see 15 U.S.C. §§ 1681a(d)(2)(D), 1681a(o).

The employer must follow specific notice and procedural requirements before obtaining a consumer report for current employees, prospective employees, or applicants. First, the employer must disclose in a separate document that a consumer report may be obtained for employment purposes. 15 U.S.C. § 1681b(b)(2)(A)(i). Then the employee must consent in writing to the employer obtaining his or her consumer report. 15 U.S.C. § 1681b(b)(2)(A)(ii). An employer’s failure to follow these requirements, whether willful or negligent, subjects the employer to consumer damage claims. 15 U.S.C. §§ 1681n (willful noncompliance), 1681o (negligent noncompliance).

The FCRA also requires that an employer that intends to take an adverse action on the basis of the procured report provide a preadverse action notice to the consumer that includes a copy of the report the employer received and a description of the consumer’s rights under the FCRA. 15 U.S.C. § 1681b(b)(3)(A). The purpose of the preadverse action notice is to give the employee or the applicant time to dispute or clarify the allegedly inaccurate information contained in his or her consumer report before the employer takes its adverse action. Goode v. LexisNexis Risk & Info. Analytics Grp., Inc., 848 F. Supp. 2d 532, 537 (E.D. Pa. 2012) (quoting Lynne B. Barr & Barbara J. Ellis, “The New FCRA: An Assessment of the First Year,” 54 Bus. Law. 1343, 1348 (1999)). An employer violates the FCRA if it takes adverse action against the employee or the applicant before sending the preadverse action notice. See Miller v. Johnson & Johnson, Janssen Pharm., Inc., 80 F. Supp. 3d 1284, 1290–92 (M.D. Fla. 2015) (holding that prospective employer calling applicant to inform him that it was rescinding a job offer prior to mailing the preadverse action notice packet constituted adverse action and violated the FCRA). An employer that violates this requirement of the FCRA is subject to damages for willful or negligent noncompliance. 15 U.S.C. §§ 1681n, 1681o.

Should an employer take an adverse action based on the contents of a consumer report, it must give its employee or applicant an adverse action notice. 15 U.S.C. § 1681m(a). The adverse action notice can be an oral, written, or electronic communication, but must include

• notice of the adverse action;

• the name, address, and telephone number of the CRA that issued the report;

• a statement that the CRA did not make the adverse action decision;

• notice of the consumer’s right to a free consumer report within sixty days; and

• notice of the consumer’s right to dispute the information with the CRA.

15 U.S.C. § 1681m(a). Due to the FACTA amendments to the FCRA, consumers cannot seek private enforcement against violators of this requirement. 15 U.S.C. § 1681m(h)(8); see Perry v. First Nat. Bank, 459 F.3d 816, 819–21 (7th Cir. 2006). States are preempted from regulating “duties of a person who takes any adverse action with respect to a consumer” under this provision of the FCRA. 15 U.S.C. § 1681t(b)(1)(C).
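Viewed as a workflow, the statute fixes the order of these steps: no report before disclosure and consent, and no adverse action before the preadverse notice. The following Python sketch, with invented names and a simplified view of the statutory requirements, enforces that sequence.

    from enum import Enum, auto
    from typing import Optional

    class Step(Enum):
        DISCLOSURE = auto()             # stand-alone disclosure, § 1681b(b)(2)(A)(i)
        WRITTEN_CONSENT = auto()        # § 1681b(b)(2)(A)(ii)
        OBTAIN_REPORT = auto()
        PREADVERSE_NOTICE = auto()      # report copy + rights summary, § 1681b(b)(3)(A)
        ADVERSE_ACTION_NOTICE = auto()  # § 1681m(a)

    def next_required_step(completed: set) -> Optional[Step]:
        # Enum iteration preserves definition order, so the statutory
        # sequence is enforced step by step.
        for step in Step:
            if step not in completed:
                return step
        return None

    print(next_required_step({Step.DISCLOSURE}))  # Step.WRITTEN_CONSENT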

§ 4.2.5 Red Flag Guidelines

To combat and detect identity theft, the FTC, federal banking agencies, the Commodity Futures Trading Commission (CFTC), and the Securities and Exchange Commission (SEC) must establish and maintain “red flag” guidelines. 15 U.S.C. § 1681m(e)(1)(A). A “red flag” is “a pattern, practice, or specific activity that indicates the possible existence of identity theft.” 16 C.F.R. § 681.1(b)(9) (FTC); see also 17 C.F.R. §§ 162.30(b)(10) (CFTC), 248.201(b)(10) (SEC). Examples of red flags include

• “alerts, notifications, or warnings from a” CRA (e.g., a fraud or active duty alert, a notice of a credit freeze, a notice of an address discrepancy, or an indication of a pattern of activity that is inconsistent with the history and usual pattern of activity);


• “suspicious documents” (e.g., documents that appear forged, inconsistent information on the documents);

• “suspicious personal identifying information” (e.g., inconsistent information provided);

• “unusual use of, or suspicious activity related to, the covered account”; and

• notice from customers, identity theft victims, law enforcement, or others regarding possible identity theft.

16 C.F.R. pt. 681, app. A, supp. A; see also 17 C.F.R. pt. 162, app. B, supp. A, pt. 248, subpt. C, app. A, supp. A.

The red flag guidelines assist financial institutions and creditors by providing risk factors to consider, sources of red flags, and categories of red flags. 16 C.F.R. pt. 681, app. A; see also 17 C.F.R. pt. 162, app. B, pt. 248, subpt. C, app. A. The red flag guidelines are to be used by each financial institution and creditor for its customers or account holders. 15 U.S.C. § 1681m(e)(1)(A). The guidelines require financial institutions and creditors to “develop and maintain a written Identity Theft Prevention Program . . . that is designed to detect, prevent, and mitigate identity theft in connection with the opening of a covered account or any existing covered account.” 16 C.F.R. § 681.1(d)(1); see also 17 C.F.R. §§ 162.30(d)(1), 248.201(d)(1). A “covered account” is an account that is “primarily for personal, family, or household purposes, that involves or is designed to permit multiple payments or transactions, such as a credit card account, mortgage loan, automobile loan, margin account, cell phone account, utility account, checking account, or savings account” or “[a]ny other account . . . for which there is a reasonably foreseeable risk to customers or to the safety and soundness of the financial institution or creditor from identity theft.” 16 C.F.R. § 681.1(b)(3); see also 17 C.F.R. §§ 162.30(b)(3), 248.201(b)(3). The written program “must include reasonable policies and procedures to” identify relevant red flags, detect red flags incorporated into the program, respond appropriately to the detected red flags, and update the program periodically. 16 C.F.R. § 681.1(d)(2)(i)–(iv); see also 17 C.F.R. §§ 162.30(d)(2)(i)–(iv), 248.201(d)(2)(i)–(iv).

The term “creditor” does not include one “that advances funds on behalf of a person for expenses incidental to a service provided by the creditor . . . .” 15 U.S.C. § 1681m(e)(4)(B). For example, an attorney who makes copies and pays filing, court, or other case-related fees is not a creditor for purposes of the red flag guidelines. See also American Bar Ass’n v. FTC, 636 F.3d 641, 644 (D.C. Cir. 2011) (holding that the ABA’s challenge to the FTC policy that lawyers who bill after services are rendered are “creditors” under the red flag rule was rendered moot by congressional amendments). There is no private right of action for consumers under the red flag guidelines provision of the FCRA. 15 U.S.C. § 1681m(h)(8); Perry v. First Nat. Bank, 459 F.3d 816, 819–21 (7th Cir. 2006). States are also preempted from regulating red flag guidelines for covered businesses. 15 U.S.C. § 1681t(b)(5)(F).
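An Identity Theft Prevention Program’s detection step is, at bottom, a rule table mapped over account activity. The Python sketch below is a toy illustration; the event field names are invented, and a real program would define its red flags from the regulatory categories and the institution’s own risk assessment.

    def detect_red_flags(event: dict) -> list:
        # Toy rule table mirroring the categories in 16 C.F.R. pt. 681,
        # app. A, supp. A; all field names are invented for illustration.
        rules = {
            "alert or notification from a CRA": event.get("cra_alert", False),
            "suspicious documents": event.get("documents_suspicious", False),
            "suspicious personal identifying information":
                event.get("identifying_info_inconsistent", False),
            "unusual use of the covered account": event.get("unusual_activity", False),
            "notice from customer, victim, or law enforcement":
                event.get("external_notice", False),
        }
        return [category for category, triggered in rules.items() if triggered]

    print(detect_red_flags({"cra_alert": True, "unusual_activity": True}))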

§ 4.2.6 Restrictions on Issuance of an Additional or New Card when Change of Address Notification Filed

The FTC, federal banking agencies, the CFTC, and the SEC are required to issue regulations for card issuers to prevent identity theft. Under the FCRA, these entities are to create guidelines for card issuers to follow when a replacement or an additional card is requested within a short time after the issuer receives a change of address notification. If a consumer requests an additional or replacement card during at least the thirty-day period after the card issuer receives a notification of the consumer’s change of address, the card issuer must follow reasonable policies and procedures before it issues the card. 15 U.S.C. § 1681m(e)(1)(C). These policies and procedures require the card issuer to do one of the following:

• notify the cardholder of the request at a previous address of the cardholder and provide a means to promptly report an incorrect address change;

• notify the cardholder of the change of address notification by an agreed-upon means of communication; or

• use other reasonable means of validating the change of address that follow the established red flag rules.

15 U.S.C. § 1681m(e)(1)(C). Unfortunately, no private right of action exists for consumers to pursue a violation of 15 U.S.C. § 1681m. 15 U.S.C. § 1681m(h)(8)(A); see Perry v. First Nat. Bank, 459 F.3d 816, 819–21 (7th Cir. 2006). States are preempted from regulating this provision of the FCRA. 15 U.S.C. § 1681t(b)(5)(F).
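The statutory trigger is a simple date-window test. A minimal Python sketch, with invented names and the thirty-day minimum window hard-coded:

    from datetime import date, timedelta

    def validation_required(card_request: date, address_change: date) -> bool:
        # Trigger sketched from 15 U.S.C. § 1681m(e)(1)(C): a request for
        # an additional or replacement card during the 30-day window after
        # an address-change notification (the statute requires at least 30
        # days) must go through the issuer's validation procedures first.
        return address_change <= card_request <= address_change + timedelta(days=30)

    print(validation_required(date(2018, 4, 10), date(2018, 4, 1)))  # True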

§ 4.2.7 Address Discrepancies

If a CRA receives a request for a consumer report that lists an address substantially different from the address the CRA has on file for the consumer, the FCRA requires the CRA to notify the report requester of the address discrepancy if it issues the consumer report. 15 U.S.C. § 1681c(h)(1). Furthermore, the CFPB, the federal banking agencies, the National Credit Union Administration, and the FTC must establish regulations that provide guidance regarding reasonable policies and procedures that a user of the consumer report should employ when notified of an address discrepancy. 15 U.S.C. § 1681c(h)(2)(A). These regulations must include reasonable policies and procedures for the user to form a reasonable belief that it knows the identity of the consumer to whom the report pertains and, if the user establishes a continuing relationship with the consumer and regularly furnishes information to the CRA that issued the address discrepancy notice, for the user to furnish to that CRA information that reconciles the address discrepancy. 15 U.S.C. § 1681c(h)(2)(B). Under the FCRA, a CRA is liable for willful or negligent noncompliance if it fails in its duty to issue the address discrepancy notice. 15 U.S.C. §§ 1681n, 1681o. States are preempted from regulating this provision of the FCRA. 15 U.S.C. § 1681t(b)(1)(E).
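Neither the FCRA nor the implementing regulations define “substantially different” with numeric precision, so any automated check is necessarily heuristic. The Python sketch below, with invented names and an arbitrary threshold, is offered only to illustrate the shape of such a check.

    import difflib

    def substantially_different(requested: str, on_file: str,
                                threshold: float = 0.6) -> bool:
        # Crude similarity heuristic; "substantially different" has no
        # precise regulatory definition, and the 0.6 cutoff is arbitrary.
        def norm(s: str) -> str:
            return " ".join(s.lower().replace(",", " ").split())
        ratio = difflib.SequenceMatcher(None, norm(requested), norm(on_file)).ratio()
        return ratio < threshold

    print(substantially_different("12 Main St, Boston MA",
                                  "987 Elm Ave, Worcester MA"))  # True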

§ 4.2.8 Reports Must Be Used for a Permissible Purpose

A consumer report is information collected by a CRA that is

used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing a consumer’s eligibility for (A) credit or insurance to be used primarily for personal, family or household purposes; (B) employment purposes; or (C) any other purpose authorized under section 1681b. . . .

15 U.S.C. § 1681a(d)(1). The purposes for which a CRA may issue a consumer report under Section 1681b are

• a court order;

• the written instruction of the consumer;

• to collect on, or involving the extension of, credit accounts;

• for employment purposes;

• to determine eligibility for a government license or other benefits;

• to determine credit or prepayment risks on an existing credit obligation;

• to respond to a legitimate business need for information in connection with a business transaction initiated by the consumer or to review an account;

• for use by government officials concerning the issuance of a government-sponsored, individually billed travel charge card;

• for use by government officials to determine child support payments; or

• to the FDIC or NCUA in connection with those agencies’ conservatorship, receivership, or liquidation of another entity.

15 U.S.C. § 1681b(a). A CRA may not issue a consumer report for any reason other than these listed permissible uses. A CRA must maintain reasonable procedures to ensure that a consumer report is furnished only for permissible purposes. 15 U.S.C. § 1681e(a). For a CRA to release a consumer report to a prospective user, the person or entity requesting the report must “identify themselves, certify [to the CRA] the purposes for which the information is sought, and certify that the information will be used for no other purpose.” 15 U.S.C. § 1681e(a). The CRA must “make a reasonable effort to verify” the user’s identity and the uses certified by the prospective user prior to issuing the consumer report. 15 U.S.C. § 1681e(a). If the CRA has “reasonable grounds for believing” that the consumer report is not for one of the uses enumerated in 15 U.S.C. § 1681b, the CRA must not furnish the consumer report. 15 U.S.C. § 1681e(a). The CRA faces civil liability if it does not independently verify the user’s identity and certification of use. See Pintos v. Pac. Creditors Ass’n, 605 F.3d 665, 677 (9th Cir. 2010) (“Under the plain terms of § 1681e(a), a [user’s] certification cannot absolve the [CRA] of its independent obligation to verify the certification and determine that no reasonable grounds exist for suspecting impermissible use.”). Liability is not automatic, though. The CRA must have willfully or negligently failed in its duty to maintain reasonable procedures to furnish a consumer report only for permissible purposes or failed to verify


the user’s certifications. 15 U.S.C. §§ 1681n, 1681o; Perez v. Portfolio Recovery Assocs., LLC, No. 12-1603, 2012 WL 5373448, at *1 n.2 (D.P.R. Oct. 30, 2012) (citing Dobson v. Grendahl, 828 F. Supp. 975, 977 (M.D. Ga. 1993)). Additionally, the FCRA imposes civil liability upon a person or an entity that willfully or negligently obtains the consumer’s report for a purpose that is not permissible under the statute. 15 U.S.C. §§ 1681b(f), 1681n, 1681o. For there to be a viable claim against the person or entity that used or obtained the consumer report for an impermissible purpose, the conduct must be either willful or negligent. Perez v. Portfolio Recovery Assocs., LLC, 2012 WL 5373448, at *2 (holding that “[t]o survive [a motion to dismiss], the complaint must aver sufficient facts to establish to a plausible degree that [d]efendant obtained the credit reports for an impermissible purpose, and that their conduct was either willful or negligent”).
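The gatekeeping obligations of Section 1681e(a) reduce to three checks before a report is furnished. A minimal Python sketch, using simplified purpose labels of our own invention rather than the statutory text:

    PERMISSIBLE_PURPOSES = {
        "court order", "written instruction of the consumer",
        "credit transaction", "employment purposes",
        "government license or benefit", "credit risk review",
        "legitimate business need", "child support determination",
    }

    def may_furnish_report(certified_purpose: str, identity_verified: bool,
                           grounds_to_suspect_misuse: bool) -> bool:
        # Sketch of the § 1681e(a) gate: a verified identity, a certification
        # matching the § 1681b list (simplified labels), and no reasonable
        # grounds for believing the use is impermissible.
        return (identity_verified
                and certified_purpose in PERMISSIBLE_PURPOSES
                and not grounds_to_suspect_misuse)

    print(may_furnish_report("employment purposes", True, False))  # True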

§ 4.2.9 CRAs Must Use Reasonable Procedures to Assure Maximum Possible Accuracy

As discussed in § 4.1.2, above, a CRA must employ “reasonable procedures to assure maximum possible accuracy of the information” within a consumer report when produced. 15 U.S.C. § 1681e(b). A CRA is not liable for merely issuing a report with inaccurate information. The CRA faces civil liability for inaccurate information only if it fails to use reasonable procedures to ensure maximum possible accuracy. Dalton v. Capital Associated Indus., 257 F.3d 409, 415 (4th Cir. 2001) (a CRA “violates § 1681e(b) if (1) the consumer report contains inaccurate information and (2) the [CRA] did not follow reasonable procedures to assure maximum possible accuracy”); Ridenour v. Multi-Color Corp., 147 F. Supp. 3d 452, 458 (2015). Furthermore, to have a viable claim against the CRA for failure to employ reasonable procedures under 15 U.S.C. § 1681e(b), a consumer must show that the CRA actually issued a consumer report. Brown v. Wal-Mart Stores, Inc., 507 Fed. Appx. 543, 546 (6th Cir. 2012) (plaintiff failed to present evidence that CRA issued a consumer report on him and “[u]nder the plain language of section 1681e(b), [plaintiff] cannot maintain a claim for allegedly improper procedures unless the [CRA] actually issued a consumer report about him”). Moreover, a consumer who seeks a private right of action for a CRA’s failure to employ reasonable procedures must allege a concrete and particularized harm to him or her flowing from the violation, as the allegation of “a bare procedural violation” is insufficient to warrant standing. Spokeo v. Robins, 136 S. Ct. 1540, 1549 (2016).

§ 4.3 WHAT HAPPENS IF THE PRIVATE DATA IS USED AND HOW TO DEAL WITH CONSUMER REPORTING PROBLEMS

§ 4.3.1 What Identity Theft Is and Why It Is a Problem

Identity theft occurs when a criminal uses or attempts to use the personal identifying information of another person to obtain goods, services, loans, or credit in the victim’s name. The FCRA defines “identity theft” as “a fraud committed using the identifying information of another person,” subject to any further definition by the CFPB. 15 U.S.C. § 1681a(q)(3). The CFPB expands the definition of identity theft to “a fraud committed or attempted using the identifying information of another person without authority.” 12 C.F.R. § 1022.3(h). Identifying information includes any name or number that could be used, alone or in conjunction with other information, to identify someone. 12 C.F.R. § 1022.3(g). Identifying information includes, among other things, name, date of birth, Social Security number, passport number or alien registration number, driver’s license or other government identification numbers, financial account numbers, credit or debit card numbers, biometric information (for instance, iris scans, fingerprints, and voice prints), a “[u]nique electronic identification number, address, or routing code,” and “[t]elecommunication identifying information or access device (as defined in 18 U.S.C. [§] 1029(e))”. 12 C.F.R. § 1022.3(g)(1)–(4). Criminals can gain access to this information by obtaining physical copies of documents or, increasingly, by theft or misuse of information accessible through the internet.

Identity theft is a substantial problem for consumers. The FTC reported that in 2016, 13 percent of its complaints were about identity theft, generating 399,225 complaints. Federal Trade Comm’n, “FTC Releases Annual Summary of Consumer Complaints” (Mar. 3, 2017), available at https://www.ftc.gov/news-events/press-releases/2017/03/ftc-releases-annual-summary-consumer-complaints. The Bureau of Justice Statistics, a component of the Office of Justice Programs, puts the number of identity theft victims in the United States for 2014 at 17.6 million. Bureau of Justice Statistics, “17.6 Million U.S. Residents Experienced Identity Theft in 2014” (Sept. 27, 2015), https://www.bjs.gov/content/pub/press/vit14pr.cfm. In 2016, fraud affected 6.15 percent of U.S. consumers, the highest percentage ever recorded. Javelin Strategy & Research, “2017 Identity Fraud: Securing the Connected Life” (Feb. 1, 2017), available at https://www.javelinstrategy.com/coverage-area/2017-identity-fraud.

Identity theft also has a significant financial impact on consumers. According to the Bureau of Justice Statistics, the financial loss associated with identity theft in 2014 amounted to $15.4 billion. Bureau of Justice Statistics, “Victims of Identity Theft, 2014” at 7. The cost of identity theft is not just economic. Victims of identity theft must spend time resolving the damage done: the majority of victims were able to resolve their problems in a day or less, but 10 percent spent more than a month. Bureau of Justice Statistics, “Victims of Identity Theft, 2014” at 10. There is also the emotional distress and the toll on relationships caused by identity theft: of those who spent one to three months resolving their identity theft problems, 20 percent agreed that the incident was “severely distressing.” Bureau of Justice Statistics, “Victims of Identity Theft, 2014” at 10. The FCRA provides assistance to victims of identity theft through various provisions, as discussed below.

§ 4.3.2 Information Available to Identity Theft Victims

Identity theft victims often do not know the thieves who stole their identity. If the victim wants to press charges against the perpetrator, the victim needs information about the business transaction in order to find the thief. If the consumer and law enforcement do not have access to the relevant information documenting the fraudulent business transaction, it is very difficult to locate and prosecute the thief. The FCRA helps victims of identity theft with this problem. Under the FCRA, a business that has engaged in a commercial transaction with the alleged identity thief must turn over business and transaction records in its control to the victim or to law enforcement personnel authorized by the victim. 15 U.S.C. § 1681g(e)(1). Upon receiving the victim’s request for the records, the business has thirty days to submit the records to the victim or authorized law enforcement personnel. 15 U.S.C. § 1681g(e)(1). The business must provide the requested information to the victim free of charge. 15 U.S.C. § 1681g(e)(4). This provision of the FCRA is essential to finding the persons responsible for the identity theft because, without proof that a fraudulent commercial transaction occurred, it is difficult to prosecute the criminals.

The FCRA sets out a specific process for the victim to request information about the fraudulent transaction. To receive the requested information, the victim must issue the request in writing to the business at a designated address, if any exists. 15 U.S.C. § 1681g(e)(3)(A), (B). The business can also require that the victim include, if known or easily obtainable, relevant information about the fraudulent transaction, such as the date of the transaction or the account or transaction number. 15 U.S.C. § 1681g(e)(3)(C). In addition, before a business releases transaction records to an alleged victim, the business may require the victim to provide information verifying his or her identity and claim. 15 U.S.C. § 1681g(e)(2). As proof of the victim’s identity, the business may request a government-issued identification card, personal identifying information of the same type as the thief used, or personal identifying information regularly requested by that business at the time of the transaction. 15 U.S.C. § 1681g(e)(2)(A). As proof of the claim of identity theft, the business may require the victim to provide a copy of a police report evidencing the claim and a properly completed identity theft affidavit, whether the standardized affidavit made available by the CFPB or an affidavit of fact that is acceptable to the business. 15 U.S.C. § 1681g(e)(2)(B); see Novak v. Experian Info. Solutions, Inc., 782 F. Supp. 2d 617, 624 n.6 (N.D. Ill. 2011) (dismissing 15 U.S.C. § 1681g(e) claim against business for not turning over business records to plaintiff when plaintiff failed to show that he was a true identity theft victim but permitting plaintiff to amend his complaint to use or reference police report).

Even if the victim follows these procedures, the FCRA permits a business to decline to provide the requested information. A business may do so if, “in the exercise of good faith,” it determines that it does not have a “high degree of confidence” that it knows the true identity of the requestor, that the request is based on a misrepresentation of fact by the requestor, or that the requested information is “Internet navigational data or similar information” concerning a person’s online service. 15 U.S.C. § 1681g(e)(5).

As to liability, there is no private right of action the victim can pursue against the business to obtain information relating to the transaction resulting from identity theft. 15 U.S.C. § 1681g(e)(6). Furthermore, states are preempted from regulating on this issue. 15 U.S.C. § 1681t(b)(1)(G).
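
To make the request mechanics above concrete, the following Python sketch models the prerequisites of 15 U.S.C. § 1681g(e) as a simple checklist. It is a hypothetical illustration only; the class and field names are the author’s, and the statute’s nuances (such as the business’s election among forms of proof) are collapsed into booleans.

from dataclasses import dataclass

@dataclass
class RecordsRequest:
    in_writing: bool             # request made in writing
    to_designated_address: bool  # sent to the business's designated address, if any
    identity_verified: bool      # acceptable proof of identity, § 1681g(e)(2)(A)
    claim_verified: bool         # acceptable proof of the theft claim, § 1681g(e)(2)(B)

def business_must_disclose(req: RecordsRequest) -> bool:
    """Under this simplified model, a complete request obligates the business
    to provide the transaction records free of charge within thirty days."""
    return (req.in_writing and req.to_designated_address
            and req.identity_verified and req.claim_verified)

print(business_must_disclose(RecordsRequest(True, True, True, False)))  # False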


An identity theft victim is also entitled to information if the fraudulent transaction leads to an attempt to collect a debt from the victim. The FCRA requires that a debt collector, upon notification that a debt is the result of identity theft, provide the victim with all information that the consumer would be entitled to if the consumer were not a victim of identity theft. 15 U.S.C. § 1681m(g)(2). The FCRA also requires the debt collector to notify the creditor of the allegation of identity theft. 15 U.S.C. § 1681m(g)(1). No private right of action exists, however, for victims to enforce these requirements. 15 U.S.C. § 1681m(h)(8)(A). Some of these issues may be litigable under the Fair Debt Collection Practices Act (FDCPA), 15 U.S.C. § 1692. States are preempted from regulating on this issue. 15 U.S.C. § 1681t(b)(5)(F).

§ 4.3.3 Fraud Alerts

If a consumer is the victim of identity theft or has a good faith belief that he or she is a victim, the FCRA provides protections against further harm, such as the thief opening additional new accounts under the victim’s identity. The FCRA allows victims to add fraud alerts to their files with “nationwide” CRAs, which are currently the big three CRAs—Experian, Equifax, and TransUnion. See 15 U.S.C. §§ 1681a(p), 1681c-1. A fraud alert is a clear and conspicuous notice in a consumer report that the consumer may be a victim of fraud or identity theft. 15 U.S.C. § 1681a(q)(2)(A), (B). There are three different types of alerts with distinct requirements and effects, which are discussed below.

(a) Initial Fraud Alert or One-Call Fraud Alert

An initial alert, also called a one-call fraud alert, is a short-term alert that lasts at least ninety days. To obtain an initial fraud alert, a consumer, or an individual acting on behalf of or as a personal representative of the consumer, must assert to a CRA “in good faith a suspicion” that the consumer has been or suspects he or she will be the victim of identity theft or fraud. 15 U.S.C. § 1681c-1(a)(1). Before the CRA puts the initial fraud alert on the consumer’s file, the consumer or the representative must provide appropriate proof of identity. 15 U.S.C. § 1681c-1(a)(1). Once the consumer contacts the CRA to request the fraud alert and provides appropriate proof of identity, the contacted CRA must notify the other nationwide CRAs of the fraud alert, and they, in turn, must also include the fraud alert on the consumer’s file. 15 U.S.C. § 1681c-1(a)(1)(B).

While the FCRA states that a consumer must directly request that the fraud alert be placed on his or her file, one court has held that a CRA does not violate the FCRA if it places an initial fraud alert on the consumer’s file without the consumer’s request. Baker v. TransUnion LLC, No. 07-8032, 2009 WL 4042909, at *4 (D. Ariz. Nov. 19, 2009) (granting summary judgment in favor of CRA because 15 U.S.C. § 1681c-1 “[b]y its terms . . . does not regulate a CRA’s placement of a fraud alert on a consumer’s account without the consumer’s permission”). Another court has held that a company cannot, on behalf of a consumer, request that a CRA place a fraud alert on the consumer’s file. Experian Info. Solutions, Inc. v. Lifelock, Inc., 633 F. Supp. 2d 1104, 1107–08 (C.D. Cal. 2009) (finding that, “taking the proper interpretation of the plain meaning of” 15 U.S.C. § 1681c-1 “and its legislative history together, the Court finds that the FCRA embodies an established public policy against companies like [defendant] placing fraud alerts on behalf of consumers”).

The initial alert must remain in the file for a period of not less than ninety days beginning on the day of the request. 15 U.S.C. § 1681c-1(a)(1)(A). The alert can last less than ninety days if the consumer, or his or her representative, requests that the CRA remove it and provides appropriate proof of identity, and it can last longer than ninety days if the CRA permits it. See 15 U.S.C. § 1681c-1(a)(1)(A).

During the initial alert period, the CRA must not only maintain the alert in the file but must also include the alert whenever it generates the consumer’s credit score. 15 U.S.C. § 1681c-1(a)(1)(A). Another effect of the initial fraud alert is that the CRA must disclose to the consumer that he or she has a right to request a free copy of his or her consumer report. 15 U.S.C. § 1681c-1(a)(2)(A). The CRA must provide the requested report, without charge to the consumer, within three business days of the consumer’s request. 15 U.S.C. § 1681c-1(a)(2)(B).

(b) Extended Fraud Alert

The second type of fraud alert is the “extended fraud alert,” which lasts seven years. To obtain this alert, the consumer, or someone acting on behalf of or as a personal representative of the consumer, must provide the CRA with an identity theft report and appropriate proof of identity. 15 U.S.C. § 1681c-1(b)(1). A valid identity theft report sufficient to trigger the extended fraud alert must allege that an identity theft occurred, must be a copy of an official, valid report filed by the consumer with a law enforcement agency, and must subject its filer to criminal penalties for the filing of false information. 15 U.S.C. § 1681a(q)(4). Just as with the initial fraud alert, a CRA contacted by the consumer to request the extended fraud alert must inform the other nationwide CRAs of the request. 15 U.S.C. § 1681c-1(b)(1)(C). Additionally, the extended fraud alert is similar to the initial fraud alert in that the CRA must include the alert anytime the CRA generates the consumer’s credit score. 15 U.S.C. § 1681c-1(b)(1)(A).

The extended fraud alert stays in the consumer’s file and consumer report for seven years (unless the consumer requests its removal), beginning the day the initial request is made. 15 U.S.C. § 1681c-1(b)(1)(A). Instead of the one free report provided with an initial alert, the CRA must disclose to the consumer that he or she has the right to request two free consumer reports from each of the three nationwide CRAs during the twelve-month period beginning on the date the request for the extended fraud alert is made. 15 U.S.C. § 1681c-1(b)(2)(A). As with the initial fraud alert, the CRA must provide a requested consumer report within three business days. 15 U.S.C. § 1681c-1(b)(2)(B). An added benefit over the initial fraud alert is that, during the first five years of an extended fraud alert, the CRAs must also exclude the consumer from any lists, known as “prescreen” lists, prepared by the CRAs and sold to third parties who offer the consumer credit or insurance as part of a transaction not initiated by the consumer. 15 U.S.C. § 1681c-1(b)(1)(B). This exclusion from the prescreen lists can be lifted before the end of the five years only upon the request of the consumer or his or her representative. 15 U.S.C. § 1681c-1(b)(1)(B).

(c) Active Duty Alert

The active duty alert differs from the other fraud alerts in that the notation in the credit file does not warn that the consumer is a victim of identity theft but, rather, that the consumer is on active duty. 15 U.S.C. § 1681a(q)(2)(A). The active duty alert remains on a consumer’s file for at least twelve months, unless the consumer requests its removal. 15 U.S.C. § 1681c-1(c)(1). An active duty alert is naturally available only to consumers who are on active military duty. The FCRA defines an active duty military consumer as “a consumer in military service who is on active duty” or who is a reservist called or ordered to active duty and “is assigned to service away from the usual duty station of the consumer.” 15 U.S.C. § 1681a(q)(1). To obtain this type of alert, the consumer, or a personal representative or other individual acting on the consumer’s behalf, must request the alert from the CRA and provide appropriate proof of identification, and the consumer must meet the definition of active duty military consumer. 15 U.S.C. § 1681c-1(c).

Similar to the other two fraud alerts, when an active duty military consumer requests an active duty alert from a CRA, that CRA must notify the other two nationwide CRAs, which in turn must include the alert when generating the consumer’s report. 15 U.S.C. § 1681c-1(c)(3). Likewise, when an active duty alert is included in the consumer’s file, the CRA must include the alert when the CRA generates a credit score. 15 U.S.C. § 1681c-1(c)(1). Similar to the extended fraud alert, the CRA must exclude the consumer from the prescreen lists, here for a two-year period starting from the time the active duty alert is requested. 15 U.S.C. § 1681c-1(c)(2). Most importantly, the active duty alert differs from the other two alerts in that the consumer is not entitled to a free consumer report from any CRA unless the free report is required under another provision of law.
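
The differences among the three alerts can be summarized schematically. The following Python fragment is a minimal sketch of the durations and entitlements described above; the class and field names are the author’s own illustrative shorthand, and the durations shown are statutory minimums only.

from dataclasses import dataclass

@dataclass(frozen=True)
class AlertType:
    name: str
    minimum_duration_days: int      # statutory minimum duration of the alert
    free_reports: int               # free consumer reports the CRA must offer
    prescreen_exclusion_years: int  # years excluded from "prescreen" lists

ALERTS = [
    AlertType("initial (one-call)", 90, 1, 0),
    AlertType("extended", 7 * 365, 2, 5),  # two free reports from each nationwide CRA
    AlertType("active duty", 365, 0, 2),   # no free-report entitlement
]

for alert in ALERTS:
    print(alert)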

(d) Effects of the Three Alerts on Users

Fraud alerts also affect users of consumer reports. All three fraud alerts notify users that the consumer “does not authorize the establishment of any new credit plan or extension of credit . . . in the name of the consumer, or issuance of an additional card on an existing credit account . . . or any increase in credit limit on an existing credit account.” 15 U.S.C. § 1681c-1(h)(1)(A), (h)(2)(A). This restriction does not apply to an extension of credit under an existing open-end credit account—for example, making purchases on a currently existing credit card. 15 U.S.C. § 1681c-1(h)(1)(A), (h)(2)(A). One court has held that this prohibition does not give rise to a cause of action for a victim who maintained an extended fraud alert and whose Social Security number was used to open an account in the name of another person, because the statute contemplates only accounts opened in the victim’s name. Greene v. DirecTV, Inc., No. 10-C-117, 2010 WL 4628734, at *3–5 (N.D. Ill. Nov. 8, 2010).

If the alert is an initial or active duty alert, the user of a consumer report may not complete a credit transaction unless the user “utilizes reasonable policies and procedures to form a reasonable belief that the user knows the identity of the person making the request.” 15 U.S.C. § 1681c-1(h)(1)(B)(i). If the alert is an extended fraud alert, this limitation also applies to users of credit scores, not just users of consumer reports. 15 U.S.C. § 1681c-1(h)(2)(B). With initial and active duty alerts, consumers may specify a telephone number for users to call for identity verification purposes. 15 U.S.C. § 1681c-1(h)(1)(B)(ii). Users must either call the number the consumer provides in the alert or “take reasonable steps to verify the consumer’s identity” and confirm that the request is not the result of identity theft. 15 U.S.C. § 1681c-1(h)(1)(B)(ii). An extended fraud alert treats contact information differently: the alert must include the consumer’s telephone number or another “reasonable contact method designated by the consumer.” 15 U.S.C. § 1681c-1(h)(2)(A)(ii). Furthermore, unlike with an initial or active duty fraud alert, the user must contact the consumer in person or by the contact method designated by the consumer to ensure that the requested transaction is not the result of identity theft. 15 U.S.C. § 1681c-1(h)(2)(B).

(e) Liability of CRAs and Users and Preemption

Liability of CRAs and users is the same for all three types of fraud alerts. Consumers have a private right of action for both willful and negligent noncompliance with the FCRA’s fraud alert provisions. 15 U.S.C. §§ 1681n, 1681o. States are preempted from imposing any requirement or prohibition “with respect to the conduct required” by the fraud alert provisions of the FCRA. 15 U.S.C. § 1681t(b)(5)(B).

(f) Security Freezes

Although not a provision of the FCRA, statutes in many states, including Massachusetts, allow consumers to request a security freeze, also called a credit freeze, on the consumer’s file with a CRA. G.L. c. 93, § 62A. Information in a frozen consumer report may not be released to a third party without express prior authorization from the consumer. G.L. c. 93, § 62A. The CRA must provide a unique personal identification number or password for the consumer to use to lift or remove the freeze. G.L. c. 93, § 62A. A CRA may charge a consumer five dollars each time the consumer places, temporarily lifts, or removes a security freeze. G.L. c. 93, § 62A. If the consumer is a victim of identity theft and presents the CRA with a valid police report concerning the identity theft, the CRA must place the security freeze for free. G.L. c. 93, § 62A. A consumer must place a security freeze with each CRA separately; the CRAs have no obligation to notify each other of a security freeze.
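
For illustration only, the Massachusetts fee structure just described can be reduced to a few lines of Python. This is a sketch of the rule as summarized above, not an implementation of the statute; the function name and parameters are hypothetical.

def freeze_fee(action: str, has_valid_police_report: bool) -> int:
    """Dollar fee for a security freeze action under the simplified model
    above: $5 per placement, temporary lift, or removal, with the fee waived
    for placement when a valid police report documents the identity theft."""
    if action not in {"place", "lift", "remove"}:
        raise ValueError("unknown action")
    if action == "place" and has_valid_police_report:
        return 0
    return 5

print(freeze_fee("place", has_valid_police_report=True))    # 0
print(freeze_fee("remove", has_valid_police_report=False))  # 5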

§ 4.3.4 Blocking Fraudulent Information

If a consumer is the victim of identity theft, the consumer has the right under the FCRA to request a block of the theft-related information from his or her file. When this request is made, it leads to different duties for CRAs and furnishers of the fraudulent information.

(a) CRAs

When a consumer requests that a CRA block fraudulent information resulting from an alleged identity theft, the CRA has four business days to comply with the request. 15 U.S.C. § 1681c-2(a). In order for the CRA to block the information, the consumer must submit the following to the CRA:
• proof of the identity of the consumer,
• a copy of the identity theft report,
• identification of the fraudulent information, and
• the consumer’s statement that the information is not related to any transaction by the consumer.
15 U.S.C. § 1681c-2(a)(1)–(4).

If the consumer fails to submit all four prerequisites to the CRA, the CRA is not required to block the information. See Avetisyan v. Experian Info. Solutions, Inc., No. 14-05276-AB (ASx), 2016 WL 7638189, at *8, 12 (C.D. Cal. June 3, 2016) (finding that Experian had a duty to block plaintiff’s information only after she filed an appropriate identity theft report); Austin v. Certegy Payment Recovery Servs., Inc., No. 11-00401, 2014 WL 546819, at *8 (N.D. Ala. Feb. 11, 2014) (granting summary judgment for CRA defendant on FCRA blocking claim due to plaintiff’s failure to show that she satisfied any one of the requirements of 15 U.S.C. § 1681c-2(a)); Thomas v. Early Warning Servs., LLC, No. L10-0825, 2012 WL 37396, at *2 (D. Md. Jan. 5, 2012) (granting CRA defendant summary judgment because the plaintiff admitted he never filed a police report, instead submitting to the CRA a report that the police wrote on the incident, thereby conceding that he did not satisfy the requirement of submitting a valid identity theft report under 15 U.S.C. § 1681c-2(a)(2)).

Once the block is in place, the CRA has a duty to notify the furnisher of the identified fraudulent information that the information may be the result of identity theft, that an identity theft report was filed, that a block of the information was requested, and of the effective date of the block. 15 U.S.C. § 1681c-2(b). Just as with the fraud alerts, the CRA receiving the request must refer the alleged identity theft complaint and the request for a block to the other nationwide CRAs. 15 U.S.C. § 1681s(f)(1). In addition to these blocking responsibilities, the CRA must still conduct a reasonable investigation of the disputed information to determine whether the information reported is indeed accurate. 15 U.S.C. § 1681i(a)(1)(A).
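
As a schematic illustration of the four prerequisites just listed, consider the following Python sketch. It is a hypothetical model of the checklist in 15 U.S.C. § 1681c-2(a) as described above, not a substitute for the statutory text; all names are the author’s own.

from dataclasses import dataclass

@dataclass
class BlockRequest:
    proof_of_identity: bool
    identity_theft_report: bool
    fraudulent_info_identified: bool
    statement_not_consumers_transaction: bool

def cra_must_block(req: BlockRequest) -> bool:
    """If all four prerequisites are met, the CRA must block the information
    within four business days and notify the furnisher of the block."""
    return all([req.proof_of_identity,
                req.identity_theft_report,
                req.fraudulent_info_identified,
                req.statement_not_consumers_transaction])

print(cra_must_block(BlockRequest(True, True, True, False)))  # False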


Even if a consumer requests that the alleged fraudulent information be blocked, the CRA can decline to block the information or rescind an existing block. A CRA can do so only if it “reasonably determines” one of the following: the information was blocked in error or the block was requested in error by the consumer; the block was requested on the basis of a material misrepresentation of fact by the consumer; or the consumer received goods, services, or money as a result of the blocked transaction or transactions. 15 U.S.C. § 1681c-2(c)(1). If the CRA declines to block or rescinds a block, the FCRA requires the CRA to notify the consumer of its decision in writing within five business days. 15 U.S.C. § 1681c-2(c)(2) (citing 15 U.S.C. § 1681i(a)(5)(B)).

There is an exception to this provision of the FCRA for a CRA that is a reseller, that is neither furnishing nor reselling a consumer report concerning the alleged fraudulent information when the request to block the information is made, and that informs the consumer that he or she may report the identity theft to the CFPB to obtain information on identity theft. 15 U.S.C. § 1681c-2(d)(1). However, if the reseller has a file on the consumer that contains the alleged fraudulent information and is reporting the disputed information when the block request is made, the reseller has the responsibility to block the requested information. 15 U.S.C. § 1681c-2(d)(1)–(2). For the reseller to have that responsibility, the consumer must notify the reseller that the consumer report it is furnishing contains the alleged fraudulent information that the consumer seeks to block. 15 U.S.C. § 1681c-2(d)(2). When the reseller blocks the requested information, it must notify the consumer of the block and include the name, address, and telephone number of each CRA from which the consumer’s file was obtained for resale. 15 U.S.C. § 1681c-2(d)(3).

Check verification companies are exempt from the blocking requirements unless the consumer notifies the verification company and provides it with the information that must be provided to the CRAs, as discussed above. 15 U.S.C. § 1681c-2(e). Upon notification, the check verification company must, within four business days, cease reporting to the CRA the information that is subject to the block. 15 U.S.C. § 1681c-2(e). The FCRA’s blocking provision does not prevent a CRA from reporting the information to any law enforcement agency. 15 U.S.C. § 1681c-2(f).

Consumers have a private right of action against a CRA for failure to block properly requested information. 15 U.S.C. §§ 1681n, 1681o; see, e.g., Anthony v. Experian Info. Solutions, Inc., No. 2:14-cv-01230, 2017 WL 1198499, at *5 (E.D. Cal. Mar. 31, 2017) (holding that a plaintiff’s failure to properly supply a CRA with a law enforcement report was fatal to his claim that the CRA failed to block); Collins v. Experian Credit Reporting Servs., 494 F. Supp. 2d 127, 133 (D. Conn. 2007) (granting summary judgment to CRA defendant on 15 U.S.C. § 1681c-2 claim and finding that there was “no evidence to suggest that either [plaintiff] ever contacted any of the Defendants to communicate to them their concerns about identity theft”). States are preempted from regulating this subject matter. 15 U.S.C. § 1681t(b)(5)(C).

(b) Furnishers

Furnishers are notified of a request to block information by either the CRA or the consumer. When the notification comes from the CRA, the CRA informs the furnisher that the information it is furnishing may be the result of identity theft, that an identity theft report has been filed, that a block of the furnished information has been requested by the consumer, and of the effective date of the block. 15 U.S.C. § 1681c-2(b). Upon notice by the CRA of the block, furnishers are required to have reasonable procedures in place to respond to the notification and to prevent refurnishing the blocked information. 15 U.S.C. § 1681s-2(a)(6)(A). This duty is in addition to the furnisher’s duty to provide accurate information to a CRA. 15 U.S.C. § 1681s-2(a). Notification by the CRA triggers another duty as well: if the furnisher receives notice that the information it provided to the CRA is blocked because it is a result of identity theft, the furnisher cannot sell, transfer, or place that debt into collection. 15 U.S.C. § 1681m(f)(1). This duty arises only if the notice comes from the CRA and adheres to the FCRA’s blocking notification requirements. 15 U.S.C. § 1681m(f)(1).

A consumer may also notify the furnisher directly by submitting an identity theft report to the furnisher at an address specified by the furnisher. 15 U.S.C. § 1681s-2(a)(6)(B). The consumer must also include a statement that the information the furnisher is providing to the CRA is a result of identity theft. 15 U.S.C. § 1681s-2(a)(6)(B). Upon such notice, the furnisher has a duty not to furnish to any CRA the information concerning the consumer that is alleged to be the result of identity theft. 15 U.S.C. § 1681s-2(a)(6)(B). This restriction does not apply if the furnisher knows, or is informed by the consumer, that the information is correct. 15 U.S.C. § 1681s-2(a)(6)(B).

Consumers do not have a private right of action against furnishers to enforce these blocking provisions of the FCRA. 15 U.S.C. § 1681s-2(c). However, as discussed above, consumers have a private right of action against a furnisher to enforce its duties to respond to a dispute filed by the consumer with the CRA. 15 U.S.C. § 1681s-2(b). Therefore, the consumer can pursue a dispute arising from the identity theft and then enforce the FCRA if the furnisher does not take appropriate action in response. See Fregoso v. Wells Fargo Dealer Servs., Inc., No. 11-10089, 2012 WL 4903291, at *4 (C.D. Cal. Oct. 16, 2012) (“While the furnisher of information [defendant] may have additional legal duties if an Identity Theft Report is proffered, see 15 U.S.C. § 1681s-2(a)(6)(B), this does not mean that the furnisher has no legal duties in the absence of an Identity Theft Report. The FCRA does not provide that filing an Identity Theft Report is a prerequisite to bringing suit against a furnisher of information for violation of the FCRA.”) (emphasis in original). States are preempted from regulating the blocking provisions of the FCRA. 15 U.S.C. § 1681t(b)(5)(H).
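
The two notification paths and the duties each triggers can be sketched as follows. This is the author’s hypothetical simplification of the discussion above (the function and string labels are illustrative), useful only as a mnemonic for the statutory structure.

def furnisher_duties(notice_source: str, info_known_correct: bool = False) -> list:
    """Duties triggered by a blocking notice, per the simplified model above."""
    duties = []
    if notice_source == "cra":
        # notice under 15 U.S.C. § 1681c-2(b)
        duties.append("maintain procedures to prevent refurnishing the blocked information")
        duties.append("do not sell, transfer, or place the debt into collection")
    elif notice_source == "consumer":
        # identity theft report submitted directly to the furnisher
        if not info_known_correct:
            duties.append("stop furnishing the disputed information to any CRA")
    return duties

print(furnisher_duties("cra"))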


CHAPTER 5

The Gramm-Leach-Bliley Act (GLBA)

Ellen Marie Giblin, Esq.
Philips Healthcare, Andover

§ 5.1 Overview of the Gramm-Leach-Bliley Act
§ 5.2 Rule Making
§ 5.2.1 The Safeguards Rule
§ 5.2.2 The Financial Privacy Rule
§ 5.2.3 The Pretexting Rule
§ 5.3 Scope of the GLBA
§ 5.3.1 Entities Subject to the GLBA
§ 5.3.2 Information Protected Under the GLBA—Nonpublic Personal Information (NPPI)
§ 5.3.3 Individuals Protected Under the GLBA—Consumers and Customers
§ 5.4 Notice, Disclosure, and Safeguarding Requirements Under the GLBA
§ 5.4.1 Notice, Disclosure, and Safeguarding Requirements
§ 5.4.2 Consent Requirements
§ 5.4.3 Limits of Reuse of NPPI
§ 5.4.4 Restrictions on Disclosing NPPI to Third Parties
§ 5.4.5 Data Security Requirements
§ 5.4.6 Data Breach Notification Requirements
(a) FFIEC Guidance on Response Programs for Unauthorized Access to Customer Information and Customer Notice
(b) FTC Guidance on Breach Notification
§ 5.5 Pretexting
§ 5.5.1 Overview
§ 5.5.2 Prohibition on Obtaining or Soliciting Customer Information Using False Pretenses
§ 5.5.3 Exceptions to Pretexting Rule
§ 5.5.4 Enforcement of Pretexting Rule
§ 5.5.5 Criminal Penalty for Violation of Pretexting Rule
§ 5.6 GLBA Enforcement
§ 5.6.1 Enforcement
§ 5.6.2 Sanctions and Other Liability
(a) Judicial and CFPB Enforcement
(b) FTC Enforcement
EXHIBIT 5A—Regulatory Enforcement Authority for the Privacy, Safeguards, and Pretexting Rules
EXHIBIT 5B—Model Privacy Notice Form: Opt Out
EXHIBIT 5C—Model Privacy Notice Form: No Opt Out

Scope Note
This chapter provides insight into the provisions of the federal law known as the Gramm-Leach-Bliley Act, which obliges financial institutions to respect customer privacy and to protect nonpublic personal information. It presents the Act’s rule-making authority and scope, while also setting forth requirements relative to notice, disclosure, and safeguarding. The chapter covers the pretexting rule and aspects of enforcement.

§ 5.1 OVERVIEW OF THE GRAMM-LEACH-BLILEY ACT

The Gramm-Leach-Bliley Act (GLBA), also known as the Financial Modernization Act of 1999, Pub. L. No. 106-102, § 101, is a U.S. federal law that creates an affirmative and continuing obligation for financial institutions to respect the privacy of their customers and to protect the security and confidentiality of those customers’ nonpublic personal information (NPPI).

The history of the GLBA begins with its repeal of part of the Banking Act of 1933, known as the Glass-Steagall Act. The GLBA was meant to enhance competition in the financial services industry by creating a framework for the affiliation of banks, securities firms, insurance companies, and other financial service providers. With the passage of the GLBA, banking companies, securities companies, and insurance companies were permitted to consolidate.

The GLBA sets forth privacy rules for financial institutions and orders the Securities and Exchange Commission (SEC) and the Office of the Comptroller of the Currency to make privacy rules for their regulated institutions. The GLBA requires each agency or authority (except the Consumer Financial Protection Bureau) to establish appropriate standards for the financial institutions subject to its jurisdiction. Such standards require financial institutions to implement appropriate administrative, technical, and physical safeguards. 15 U.S.C. § 6801(b).

Subtitle A of Title V of the GLBA, and the Consumer Financial Protection Bureau’s (CFPB) and the Federal Trade Commission’s (FTC) implementing regulations, regulate the collection, use, protection, and disclosure of the NPPI of individuals who obtain financial services from financial institutions for personal, family, or household use. The GLBA comprises three parts: the safeguards rule, the financial privacy rule, and the pretexting rule.


The safeguards rule requires financial institutions to
• insure the security and confidentiality of customer records and information;
• protect against any anticipated threats or hazards to the security or integrity of such records; and
• protect against unauthorized access to or use of such records or information which could result in substantial harm or inconvenience to any customer.
15 U.S.C. § 6801(b).

The financial privacy rule requires financial institutions to
• provide notice of their privacy policies;
• allow consumers to opt out of the disclosure of their NPPI to nonaffiliated third parties;
• limit reuse and disclosure of any NPPI with nonaffiliated third parties; and
• limit sharing of account number information for marketing purposes.
15 U.S.C. § 6802.

Pretexting is the practice of obtaining financial information under false pretenses, and the GLBA’s pretexting rule specifically prohibits a person from obtaining customer information by false pretenses. 15 U.S.C. § 6821(a)(1)–(3). See § 5.3.3, below, for a definition of what constitutes customer information.

§ 5.2 RULE MAKING

The GLBA originally vested regulatory enforcement and rule-making authority in eight federal agencies that oversee various financial and banking activities, as well as in state insurance authorities. In 2010, Congress passed the Dodd-Frank Act, which created the CFPB and further split GLBA rule-making authority. The current GLBA framework directs the nine federal regulatory agencies to consult and coordinate with one another and with the National Association of Insurance Commissioners (NAIC) to ensure that regulations are as consistent as possible. See the table included as Exhibit 5A, Regulatory Enforcement Authority for the Privacy, Safeguards, and Pretexting Rules.

§ 5.2.1 The Safeguards Rule

Rule-making authority for the safeguards rule is vested in
• the five functional banking regulators;
• the Commodity Futures Trading Commission (CFTC);
• the SEC;
• state insurance regulators; and
• the FTC (for all financial institutions not regulated by the other agencies).

Each of the regulators has issued regulations implementing the safeguards rule.
• The banking regulators, while they have independent authority, have issued interagency guidance.
• The CFTC issued a staff advisory in 2014 to its regulated intermediaries with recommended best practices for compliance.
• The SEC issued regulations implementing the safeguards rule (17 C.F.R. § 248.30).
• The FTC issued regulations implementing the safeguards rule (16 C.F.R. § 314.1–.5).

§ 5.2.2 The Financial Privacy Rule

The CFPB has rule-making authority for the financial privacy rule for all banks and financial institutions except for
• SEC-regulated entities;
• CFTC-regulated entities;
• state-regulated insurance businesses; and
• certain FTC-regulated auto dealers.

In 2014, the CFPB issued Regulation P to implement the financial privacy rule. Regulation P largely restates the banking regulators’ previously existing privacy regulations. See 12 C.F.R. § 1016.1–.17. While the regulations issued by the various regulators are broadly similar, important variations exist among them; practitioners should be aware of the various regulations and ensure that they consult the appropriate rule when advising clients.

§ 5.2.3 The Pretexting Rule

Each federal banking agency (as defined in 12 U.S.C. § 1813(z)), the National Credit Union Administration, and the SEC or self-regulatory organizations, as appropriate, have rule-making authority under the pretexting rule. These organizations are required to review the regulations and guidelines applicable to financial institutions under their respective jurisdictions and to prescribe any revisions necessary to ensure that such financial institutions have policies, procedures, and controls in place to prevent the unauthorized disclosure of customer financial information and to deter and detect activities proscribed under the pretexting rule of the GLBA. 15 U.S.C. § 6825.

§ 5.3 SCOPE OF THE GLBA

§ 5.3.1 Entities Subject to the GLBA

What is a financial institution? Determining whether an organization is subject to the GLBA begins with that question, and the working definition can be broken down as follows. The GLBA defines “financial institution” generally as any institution engaged in activities that are financial in nature or incidental to financial activities, as described in Section 4(k) of the Bank Holding Company Act of 1956, 12 U.S.C. § 1843(k). Financial activities for purposes of the GLBA include
• lending, exchanging, transferring, investing for others, or safeguarding money or securities;
• insuring, guaranteeing, or indemnifying against loss, harm, damage, illness, disability, or death, or providing and issuing annuities, and acting as principal, agent, or broker for purposes of the foregoing, in any state;
• providing financial, investment, or economic advisory services, including advising an investment company;
• lending, brokering, and servicing loans;
• issuing or selling instruments representing interests in pools of assets permissible for a bank to hold directly;
• underwriting, dealing in, or making a market in securities;
• engaging in any activity that is closely related to banking or managing or controlling banks;
• engaging in the United States in any activity that a bank holding company may engage in outside of the United States in the usual course of transacting banking business;
• debt collection or settlement services; and
• other related activities more specifically set forth in 12 U.S.C. § 1843(k).

For entities subject to the FTC’s enforcement jurisdiction, the definition of a financial institution encompasses not only banks and the familiar insurance or securities businesses but also individuals or organizations that are significantly engaged in providing financial products or services to consumers, including
• check-cashing businesses,
• data processors,
• mortgage brokers,
• nonbank lenders,
• personal property or real estate appraisers, and
• retailers that issue credit cards to consumers.
12 C.F.R. § 1016.3(l). According to the FTC, an institution must be “significantly engaged” in financial activities to be considered a financial institution. Whether an institution is significantly engaged in financial activities is a flexible standard that takes into account all of the facts and circumstances. 12 C.F.R. § 1016.3(l)(3)(i).

§ 5.3.2 Information Protected Under the GLBA—Nonpublic Personal Information (NPPI)

The GLBA defines NPPI as personally identifiable financial information that is provided by a consumer to a financial institution, results from any transaction with the consumer or any service performed for the consumer, or is otherwise obtained by the financial institution. Personally identifiable information about a consumer or a customer of the financial institution that the institution has collected and that is not publicly available is NPPI. Note that the very fact that a person is a consumer or a customer of the financial institution is itself NPPI. The term NPPI does not, however, include publicly available information. In summary, NPPI under the GLBA is any “personally identifiable financial information” that is not publicly available and that is capable of personally identifying a consumer or customer.

§ 5.3.3 Individuals Protected Under the GLBA—Consumers and Customers

Requirements under the GLBA apply when individuals are “consumers” or “customers.” The term “consumer” means an individual who obtains, from a financial institution, financial products or services which are to be used primarily for personal, family, or household purposes, and it also means the legal representative of such an individual. 15 U.S.C. § 6809(9). For example, an individual who cashes a check with a check-cashing company, makes a wire transfer, or applies for a loan is a consumer. An individual is not a consumer of a financial institution simply because the individual acts as an agent for or provides services to another financial institution that has a consumer relationship with the individual or because the financial institution serves as a fiduciary for certain trusts or benefit plans related to the individual. 12 C.F.R. § 1016.3(j).

A customer is a consumer who has a continuing relationship with the financial institution. For example, an individual who has a deposit account or a credit card with a financial institution is a customer under the GLBA. Isolated transactions do not result in a customer relationship, even if the consumer engages in multiple transactions with the institution over time. Note that while all customers are consumers, not all consumers are customers. The defining moment is the point when the customer relationship is established. The term “time of establishing a customer relationship,” in the case of a financial institution engaged in extending credit directly to consumers to finance purchases of goods or services, means the time of establishing the credit relationship with the consumer. 15 U.S.C. § 6809(11).
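
The consumer-versus-customer distinction reduces to a simple decision rule, sketched below in Python. The parameter names are the author’s hypothetical shorthand; the real analysis is, of course, fact specific.

def classify(obtains_financial_service_for_personal_use: bool,
             has_continuing_relationship: bool) -> str:
    """Classify an individual under the simplified model described above:
    every customer is a consumer, but a consumer becomes a customer only
    upon a continuing relationship with the financial institution."""
    if not obtains_financial_service_for_personal_use:
        return "neither"
    return "customer" if has_continuing_relationship else "consumer"

print(classify(True, False))  # consumer (e.g., one-off check cashing)
print(classify(True, True))   # customer (e.g., deposit account holder)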

§ 5.4 NOTICE, DISCLOSURE, AND SAFEGUARDING REQUIREMENTS UNDER THE GLBA

The GLBA regulates the collection, use, sharing, transfer, protection, and disclosure of NPPI. The GLBA requires financial institutions to protect customer privacy and to implement appropriate administrative, technical, and physical safeguards to
• insure the security and confidentiality of customer records and information;
• protect against any anticipated threats or hazards to the security or integrity of such records; and
• protect against unauthorized access to or use of such records or information which could result in substantial harm or inconvenience to any customer.
15 U.S.C. § 6801(b).

The GLBA also requires financial institutions to provide consumers and customers with notice in a number of scenarios. Financial institutions must notify their customers about their information-sharing practices and provide customers with a right to opt out if they do not want their information shared with certain nonaffiliated third parties (the financial privacy rule), and they must implement a written security program to protect NPPI from unauthorized disclosure (the safeguards rule). In addition, any entity that receives consumer financial information from a financial institution may be restricted in its reuse and redisclosure of that information.

§ 5.4.1 Notice, Disclosure, and Safeguarding Requirements

A financial institution may not, directly or through any affiliate, disclose to a nonaffiliated third party any NPPI unless the financial institution provides or has provided to the consumer a notice that complies with 15 U.S.C. § 6803. Specifically, a financial institution may not disclose NPPI to a nonaffiliated third party unless
• the financial institution clearly and conspicuously discloses to the consumer, in writing or in electronic form or other form permitted by the GLBA and its supporting regulations, that such information may be disclosed to such third party;
• the consumer is provided the opportunity, before the time that such information is initially disclosed, to direct that such information not be disclosed to such third party; and
• the consumer is given an explanation of how the consumer can exercise that nondisclosure option.
15 U.S.C. § 6802(b)(1)(A)–(C).

This requirement does not prevent a financial institution from providing NPPI to a nonaffiliated third party to perform services for or functions on behalf of the financial institution, including marketing of the financial institution’s own products or services, or financial products or services offered pursuant to joint agreements between two or more financial institutions that comply with the requirements imposed by the regulations prescribed under the GLBA, if the financial institution fully discloses the providing of such information and enters into a contractual agreement with the third party that requires the third party to maintain the confidentiality of such information. 15 U.S.C. § 6802(b)(2). For these purposes, a joint agreement means a formal written contract pursuant to which two or more financial institutions jointly offer, endorse, or sponsor a financial product or service, as may be further defined in the regulations.

Except as otherwise provided under the GLBA, a nonaffiliated third party that receives NPPI from a financial institution shall not, directly or through an affiliate of such receiving third party, disclose such information to any other person that is a nonaffiliated third party of both the financial institution and such receiving third party, unless such disclosure would be lawful if made directly to such other person by the financial institution. 15 U.S.C. § 6802(c).

A financial institution is not permitted to disclose, other than to a consumer reporting agency, an account number or similar form of access number or access code for a credit card account, a deposit account, or a transaction account of a consumer to any nonaffiliated third party for use in telemarketing, direct mail marketing, or other marketing through electronic mail to the consumer. 15 U.S.C. § 6802(d).

The GLBA does not prohibit the disclosure of NPPI as necessary to effect, administer, or enforce a transaction requested or authorized by the consumer, or in connection with servicing or processing a financial product or service requested or authorized by the consumer; maintaining or servicing the consumer’s account with the financial institution or with another entity as part of a private label credit card program or other extension of credit on behalf of such entity; or a proposed or actual securitization, secondary market sale (including sales of servicing rights), or similar transaction related to a transaction of the consumer. It also does not prohibit disclosure of NPPI with the consent or at the direction of the consumer. 15 U.S.C. § 6802(e)(1)–(2).

The GLBA also does not prohibit disclosures
• to protect the confidentiality or the security of the financial institution’s records pertaining to the consumer, the service or product, or the transaction therein;


• to protect against or prevent actual or potential fraud, unauthorized transactions, claims, or other liability;
• for required institutional risk control or for resolving customer disputes or inquiries;
• to persons holding a legal or beneficial interest relating to the consumer;
• to persons acting in a fiduciary or representative capacity on behalf of the consumer; or
• to provide information to insurance rate advisory organizations, guaranty funds or agencies, applicable rating agencies of the financial institution, persons assessing the institution’s compliance with industry standards, and the institution’s attorneys, accountants, and auditors.
15 U.S.C. § 6802(e)(3)–(4).

Nor does the GLBA prohibit disclosures made to the extent specifically permitted or required under other provisions of law and in accordance with the Right to Financial Privacy Act of 1978 (12 U.S.C. § 3401 et seq.), to law enforcement agencies (including the CFPB, a federal functional regulator, the secretary of the Treasury, a state insurance authority, or the FTC), to self-regulatory organizations, or for an investigation on a matter related to public safety. The same is true of disclosures to a consumer reporting agency in accordance with the Fair Credit Reporting Act (15 U.S.C. § 1681 et seq.), or from a consumer report reported by a consumer reporting agency; disclosures in connection with a proposed or actual sale, merger, transfer, or exchange of all or a portion of a business or operating unit, if the disclosure of NPPI concerns solely consumers of such business or unit; and disclosures made to comply with federal, state, or local laws, rules, and other applicable legal requirements, to comply with a properly authorized civil, criminal, or regulatory investigation or subpoena or summons by federal, state, or local authorities, or to respond to judicial process or government regulatory authorities having jurisdiction over the financial institution for examination, compliance, or other purposes as authorized by law. 15 U.S.C. § 6802(e)(5)–(8).

At the time of establishing a customer relationship with a consumer, and not less than annually during the continuation of such relationship, a financial institution is required to provide a clear, conspicuous, and accurate disclosure to each consumer, in writing or in electronic form or other form permitted by the regulations, of the financial institution’s policies and practices with respect to
• disclosing NPPI to affiliates and nonaffiliated third parties, including the categories of information that may be disclosed;
• disclosing NPPI of persons who have ceased to be customers of the financial institution; and
• protecting the NPPI of consumers.
15 U.S.C. § 6803(a).


The GLBA requires a financial institution to provide notice of its privacy practices, but the timing and content of this notice depend on whether the data subject is a consumer or a customer. A consumer is entitled to receive the financial institution’s privacy notice if the financial institution intends to share the consumer’s NPPI. A customer is entitled to receive the financial institution’s privacy notice both when the relationship is created and annually thereafter. The privacy notice must describe the following (see the illustrative checklist at the end of this subsection):
• the categories of persons to whom the information is or may be disclosed, other than the persons to whom the information may be provided pursuant to the GLBA;
• the policies and practices of the institution with respect to disclosing the NPPI of persons who have ceased to be customers of the financial institution;
• the categories of NPPI that are collected by the financial institution;
• the policies that the institution maintains to protect the confidentiality and security of NPPI in accordance with the GLBA;
• the disclosures required, if any, under 15 U.S.C. § 1681a(d)(2)(A)(iii);
• that the consumer or the customer has the right to opt out of some disclosures; and
• how the consumer or customer can opt out (if an opt-out right is available).
15 U.S.C. § 6803(c).

The agencies referred to in 15 U.S.C. § 6804(a)(1) are empowered to develop a model form that may be used, at the option of the financial institution, for the provision of disclosures under the GLBA. The model form must
• be comprehensible to consumers, with a clear format and design;
• provide for clear and conspicuous disclosures;
• enable consumers to easily identify the sharing practices of a financial institution and to compare privacy practices among financial institutions; and
• be succinct and use an easily readable type font.
15 U.S.C. § 6803(e)(2).

Although financial institutions are not required to use the model privacy notice form, a financial institution that elects to use the model form developed by the agencies will be provided a safe harbor and will be deemed to be in compliance with the disclosures required under the GLBA. 15 U.S.C. § 6803(e)(4). Model privacy notice forms are included as Exhibits 5B and 5C.
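
By way of illustration only, the content items of 15 U.S.C. § 6803(c) listed above can be collected into a checklist. The labels below are the author’s shorthand, not statutory terms, and a notice that passes this mechanical check is not necessarily compliant.

REQUIRED_NOTICE_ITEMS = {
    "categories_of_recipients",
    "policies_for_former_customers",
    "categories_of_npi_collected",
    "confidentiality_and_security_policies",
    "fcra_disclosures_if_any",
    "opt_out_right",
    "how_to_opt_out",
}

def notice_is_facially_complete(items_covered):
    """True only if every required content item is addressed in the notice."""
    return REQUIRED_NOTICE_ITEMS <= set(items_covered)

print(notice_is_facially_complete({"opt_out_right"}))  # False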

§ 5.4.2 Consent Requirements

Although the GLBA does not require any affirmative consent from a customer or consumer, the GLBA does require a financial institution, at the time of setting up a customer relationship and at least annually thereafter, to
• notify customers and consumers of the institution’s privacy policy and practices;
• provide the individual with “reasonable means” to opt out of certain uses and disclosures, before the initial disclosure of the individual’s NPPI, using a clear and conspicuous notice delivered in written, oral, or electronic format; and
• provide an explanation of how the consumer can exercise that nondisclosure option.
15 U.S.C. § 6802(b)(1).

The GLBA does not prevent a financial institution from providing NPPI to a nonaffiliated third party to perform services for or functions on behalf of the financial institution, including marketing of the financial institution’s own products or services, or financial products or services offered pursuant to joint agreements between two or more financial institutions that comply with the requirements imposed by the regulations prescribed under 15 U.S.C. § 6804, if the financial institution fully discloses the providing of such information and enters into a contractual agreement with the third party that requires the third party to maintain the confidentiality of such information. 15 U.S.C. § 6802(b)(2).

§ 5.4.3 Limits of Reuse of NPPI

Except as otherwise noted above, a nonaffiliated third party that receives NPPI from a financial institution shall not, directly or through an affiliate of such receiving third party, disclose such information to any other person that is a nonaffiliated third party of both the financial institution and such receiving third party, unless such disclosure would be lawful if made directly to such other person by the financial institution. 15 U.S.C. § 6802(c).

§ 5.4.4 Restrictions on Disclosing NPPI to Third Parties

Restrictions under the GLBA on disclosing personal information to third parties depend on whether the third party is an affiliate or a nonaffiliated third party.

• Disclosures to affiliates. A financial institution may disclose a consumer’s NPPI to an affiliated entity if the institution provides notice of this practice. The financial institution does not need to obtain affirmative consent or provide an opt-out right for this disclosure. An affiliated entity is defined as “any company that controls, or is controlled by, or is under common control with another company” and includes both financial and nonfinancial institutions.

• Disclosures to nonaffiliated third parties. A financial institution must provide notice and a right to opt out of disclosures of personal information to nonaffiliated parties. However, a financial institution can disclose an individual’s NPPI to a nonaffiliated entity, without allowing the individual to opt out, if it meets all of the following conditions:
– The disclosure is to a third party that uses the information to perform services for the financial institution.
– The financial institution provides notice of this practice to the individual before sharing the information.
– The financial institution and the third party enter into a contract that requires the third party to maintain the confidentiality of the information and to use the information only for the prescribed purposes.
15 U.S.C. § 6802(b)(2).

• Limitations on sharing account numbers. A financial institution must not disclose, other than to a consumer reporting agency, an account number or similar form of access number or access code for a credit card account, a deposit account, or a transaction account of a consumer to any nonaffiliated third party for use in telemarketing, direct mail marketing, or other marketing through electronic mail to the consumer. 15 U.S.C. § 6802(d). However, a financial institution may disclose an account number
– to an agent of the financial institution for the institution’s own marketing purposes, as long as the agent is not authorized to directly initiate charges to the account; or
– to a participant in a private label credit card program or an affinity or similar program where the participants in the program are identified to the customer when the customer enters that program.
12 C.F.R. § 1016.12(b).

Financial institutions may also disclose personal information to nonaffiliated third parties without providing an opt-out right under certain other circumstances, including where the disclosure is
• “necessary to effect, administer or enforce a transaction” or made with the customer’s consent;
• for compliance purposes (for example, to an insurance rating organization or a credit reporting agency); or
• for law enforcement purposes.
15 U.S.C. § 6802(e).
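
A rough decision-tree reading of these restrictions follows. This Python sketch is the author’s illustrative simplification of the rules above (the parameter names are hypothetical, and the many statutory exceptions are collapsed into a single flag); it is not a compliance tool.

def may_disclose_npi(to_affiliate, notice_given, opt_out_offered,
                     consumer_opted_out, service_provider_contract,
                     statutory_exception):
    """Rough decision rule for disclosing NPPI, per the simplified model above."""
    if to_affiliate:
        return notice_given               # notice, but no opt out, is required
    if statutory_exception:
        return True                       # e.g., 15 U.S.C. § 6802(e) exceptions
    if service_provider_contract and notice_given:
        return True                       # service provider path, § 6802(b)(2)
    return notice_given and opt_out_offered and not consumer_opted_out

print(may_disclose_npi(False, True, True, True, False, False))  # False: opted out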

§ 5.4.5 Data Security Requirements

The GLBA’s safeguards rule requires financial institutions to develop and document a written information security plan (WISP) that describes their internal program to protect customer information. The WISP must be written in one or more readily accessible parts and contain administrative, technical, and physical safeguards that are appropriate to the size and complexity of the financial institution, the nature and scope of its activities, and the sensitivity of any customer information at issue. 16 C.F.R. § 314.3(a). The WISP may be a single document or a collection of documents, such as policies, procedures, and training materials, that together fulfill the requirements of the WISP. Because the WISP may be distributed across multiple documents, maintaining an inventory of its components is helpful if a regulator requests production of the WISP.

The objectives of the WISP are to

• insure the security and confidentiality of customer information;

• protect against any anticipated threats or hazards to the security or integrity of such information; and

• protect against unauthorized access to or use of such information that could result in substantial harm or inconvenience to any customer.

16 C.F.R. § 314.3(b).

As part of its plan, each financial institution must

• designate one or more employees to coordinate its information security program;

• identify reasonably foreseeable internal and external risks to the security, confidentiality, and integrity of customer information that could result in the unauthorized disclosure, misuse, alteration, destruction, or other compromise of such information, and assess the sufficiency of any safeguards in place to control the identified risks; at a minimum, such a risk assessment should include consideration of risks in each relevant area of the financial institution’s operations, including

– employee training and management;

– information systems, including network and software design, as well as information processing, storage, transmission, and disposal; and

– detecting, preventing, and responding to attacks, intrusions, or other systems failures;

• design and implement information safeguards to control the risks identified through risk assessment;

• regularly test or otherwise monitor the effectiveness of the safeguards’ key controls, systems, and procedures;

• oversee service providers (meaning any person or entity that receives, maintains, processes, or otherwise is permitted access to customer information through its provision of services directly to a financial institution that is subject to the GLBA) by

– taking reasonable steps to select and retain service providers that are capable of maintaining appropriate safeguards for the customer information at issue, and

– requiring service providers by contract to implement and maintain such safeguards; and

• evaluate and adjust the financial institution’s information security program in light of

– the results of the testing and monitoring noted above;

– any material changes to the financial institution’s operations or business arrangements; or

– any other circumstances that the financial institution knows or has reason to know may have a material impact on its information security program.

16 C.F.R. § 314.4.

The requirements are designed to be flexible. According to the FTC, organizations should implement safeguards appropriate to their own circumstances, which may include the size and complexity of the organization, the nature and scope of its activities, and the sensitivity of the NPPI that the organization handles.
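Because the safeguards rule elements lend themselves to a checklist, counsel auditing a WISP sometimes track them programmatically. The sketch below is a hypothetical illustration: the element names paraphrase 16 C.F.R. § 314.4 as summarized above, but the data structure and function are invented, not any regulatory format.

```python
# Hypothetical WISP gap-check sketch; element names paraphrase the
# 16 C.F.R. § 314.4 requirements listed above.
from typing import Dict, List

WISP_ELEMENTS = [
    "coordinator_designated",      # employee(s) coordinate the program
    "risk_assessment",             # internal/external risks identified
    "safeguards_implemented",      # controls designed for identified risks
    "testing_and_monitoring",      # key controls regularly tested
    "service_provider_oversight",  # providers selected and bound by contract
    "program_adjustment",          # program updated after material changes
]

def wisp_gaps(status: Dict[str, bool]) -> List[str]:
    """Return the Section 314.4 elements not yet evidenced in the WISP."""
    return [e for e in WISP_ELEMENTS if not status.get(e, False)]

print(wisp_gaps({"coordinator_designated": True, "risk_assessment": True}))
# ['safeguards_implemented', 'testing_and_monitoring',
#  'service_provider_oversight', 'program_adjustment']
```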

§ 5.4.6 Data Breach Notification Requirements

While the GLBA does not include an explicit data breach notification requirement, several of the federal bank regulatory agencies (such as the Office of the Comptroller of the Currency and the Federal Reserve Board) have implemented regulations requiring financial institutions subject to their authority to notify the regulator (and in some cases the customer) when there has been an unauthorized access to “sensitive customer information.” Sensitive customer information generally includes a customer’s name, address, or telephone number, combined with one or more of the following data elements about the customer:

• Social Security number,

• driver’s license number,

• account number,

• credit or debit card number, or

• personal identification number or password that would permit access to the customer’s account.

Federal Financial Institutions Examination Council, Financial Institution Letters (FIL-27-2005).
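The interagency definition is essentially a two-part test: identifying information plus at least one listed data element. A minimal sketch, assuming hypothetical field names for a customer record, might look like this:

```python
# Minimal sketch of the "sensitive customer information" test paraphrased
# above. The field names are invented; real records are rarely this tidy.
IDENTIFYING = {"name", "address", "telephone"}
DATA_ELEMENTS = {"ssn", "drivers_license", "account_number",
                 "card_number", "pin_or_password"}

def is_sensitive(fields: set) -> bool:
    """True when identifying data is combined with a listed data element."""
    return bool(fields & IDENTIFYING) and bool(fields & DATA_ELEMENTS)

print(is_sensitive({"name", "ssn"}))    # True
print(is_sensitive({"name", "email"}))  # False: no listed data element
```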

(a) FFIEC Guidance on Response Programs for Unauthorized Access to Customer Information and Customer Notice

The Federal Financial Institutions Examination Council (FFIEC) agencies issued interpretive guidance stating that every financial institution should develop and implement a response program designed to address incidents of unauthorized access to sensitive customer information maintained by the financial institution or its service provider. FFIEC, Financial Institution Letters (FIL-27-2005).

Components of a Response Program

At a minimum, an institution’s response program should contain procedures for

• assessing the nature and scope of an incident and identifying what customer information systems and types of customer information have been accessed or misused;

• notifying its primary federal regulator as soon as possible when the institution becomes aware of an incident involving unauthorized access to or use of sensitive customer information;

• consistent with the agencies’ suspicious activity report (SAR) regulations, filing a timely SAR and, in situations involving federal criminal violations requiring immediate attention, such as when a reportable violation is ongoing, promptly notifying appropriate law enforcement authorities;

• taking appropriate steps to contain and control the incident to prevent further unauthorized access to or use of customer information; and

• notifying customers when warranted in a manner designed to ensure that a customer can reasonably be expected to receive the notice.

FFIEC, Financial Institution Letters (FIL-27-2005).

When an incident of unauthorized access to sensitive customer information involves customer information systems maintained by an institution’s service provider, it is the financial institution’s responsibility to notify its customers and regulator. However, an institution may authorize or contract with its service provider to notify the institution’s customers or regulator on its behalf. FFIEC, Financial Institution Letters (FIL-27-2005).

When Customer Notice Should Be Provided

The interpretive guidance states that a financial institution should provide a notice to its customers whenever it becomes aware of an incident of unauthorized access to customer information and, at the conclusion of a reasonable investigation, determines that misuse of the information has occurred or it is reasonably possible that misuse will occur. FFIEC, Financial Institution Letters (FIL-27-2005).

Customer Notice

A financial institution should notify customers in a clear and conspicuous manner. The notice should include the following items:

• a description of the incident;

• the type of information subject to unauthorized access;

• the measures taken by the financial institution to protect customers from further unauthorized access;

• a telephone number customers can call for information and assistance; and

• a reminder for customers to remain vigilant over the next twelve to twenty-four months and to report suspected identity theft incidents to the institution.

FFIEC, Financial Institution Letters (FIL-27-2005).

The guidance encourages financial institutions to notify the nationwide consumer reporting agencies prior to sending notices to a large number of customers that include contact information for the reporting agencies. FFIEC, Financial Institution Letters (FIL-27-2005).
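Counsel reviewing a draft notice may find it useful to check the draft against the items listed above. The following sketch is hypothetical; the field names are invented, and the FFIEC guidance itself prescribes no machine-readable format.

```python
# Hypothetical completeness check for a draft customer notice, keyed to
# the FFIEC items listed above. Field names are invented for illustration.
REQUIRED_ITEMS = ("incident_description", "information_types",
                  "protective_measures", "assistance_phone_number",
                  "vigilance_reminder")

def missing_notice_items(notice: dict) -> list:
    """Return the FFIEC notice items absent from the draft."""
    return [item for item in REQUIRED_ITEMS if not notice.get(item)]

draft = {"incident_description": "Unauthorized access to card data ...",
         "assistance_phone_number": "1-800-555-0100"}
print(missing_notice_items(draft))
# ['information_types', 'protective_measures', 'vigilance_reminder']
```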

Delivery of Customer Notice

Customer notice should be delivered in a manner designed to ensure that a customer can reasonably be expected to receive it. For example, the institution may choose to contact all affected customers by telephone or by mail, or by electronic mail for those customers for whom it has a valid e-mail address and who have agreed to receive communications electronically. FFIEC, Financial Institution Letters (FIL-27-2005).

(b) FTC Guidance on Breach Notification

As part of its implementation of the GLBA, the FTC issued the safeguards rule, which requires financial institutions under FTC jurisdiction to have measures in place to keep customer information secure. The FTC’s guidance states that institutions should

• notify consumers if their personal information is subject to a breach that poses a significant risk of identity theft or related harm;

• notify law enforcement if the breach may involve criminal activity or there is evidence that the breach has resulted in identity theft or related harm;

• notify the credit bureaus and other businesses that may be affected by the breach; and

• check to see if breach notification is required under applicable state law.

Additional guidance from the FTC is available at http://www.ftc.gov/privacy/glbact.

§ 5.5 PRETEXTING

§ 5.5.1 Overview

Pretexting is the practice of obtaining financial information under false pretenses, and the GLBA’s pretexting rule specifically prohibits a person from obtaining customer information by false pretenses. 15 U.S.C. § 6821(a).

§ 5.5.2 Prohibition on Obtaining or Soliciting Customer Information Using False Pretenses

It is a violation of the GLBA for any person to obtain or attempt to obtain, or cause to be disclosed or attempt to cause to be disclosed to any person, customer information of a financial institution relating to another person

• by making a false, fictitious, or fraudulent statement or representation to an officer, employee, or agent of a financial institution;

• by making a false, fictitious, or fraudulent statement or representation to a customer of a financial institution; or

• by providing any document to an officer, employee, or agent of a financial institution, knowing that the document is forged, counterfeit, lost, or stolen, was fraudulently obtained, or contains a false, fictitious, or fraudulent statement or representation.

15 U.S.C. § 6821(a)(1)–(3).

The GLBA also prohibits the solicitation of a person to obtain customer information from a financial institution under false pretenses. Specifically, it is a violation of the GLBA for a person to request another person to obtain customer information of a financial institution, knowing that the person will obtain, or attempt to obtain, the information from the institution in any manner described in 15 U.S.C. § 6821(a).

§ 5.5.3 Exceptions to Pretexting Rule

The GLBA provides a number of exceptions to the prohibition against obtaining or soliciting customer information.

• Law enforcement agencies. The GLBA does not prevent any action by a law enforcement agency, or any officer, employee, or agent of such agency, to obtain customer information of a financial institution in connection with the performance of the official duties of the agency. 15 U.S.C. § 6821(c).

• Financial institutions. The GLBA does not prevent any financial institution, or any officer, employee, or agent of a financial institution, from obtaining customer information of such financial institution in the course of

– testing the security procedures or systems of such institution for maintaining the confidentiality of customer information;

– investigating allegations of misconduct or negligence on the part of any officer, employee, or agent of the financial institution; or

– recovering customer information of the financial institution that was obtained or received by another person under false pretenses.

15 U.S.C. § 6821(d).

• Insurance investigations. The GLBA does not prevent any insurance institution, or any officer, employee, or agency of an insurance institution, from obtaining information as part of an insurance investigation into criminal activity, fraud, material misrepresentation, or material nondisclosure that is authorized for such institution under state law, regulation, interpretation, or order. 15 U.S.C. § 6821(e).

• Certain types of customer information. The GLBA does not prevent any person from obtaining customer information of a financial institution that is otherwise available as a public record filed pursuant to the securities laws (the Securities Act of 1933, 15 U.S.C. § 77a et seq.; the Securities Exchange Act of 1934, 15 U.S.C. § 78a et seq.; the Sarbanes-Oxley Act of 2002, 15 U.S.C. § 7201 et seq.; the Trust Indenture Act of 1939, 15 U.S.C. § 77aaa et seq.; the Investment Company Act of 1940, 15 U.S.C. § 80a–1 et seq.; the Investment Advisers Act of 1940, 15 U.S.C. § 80b–1 et seq.; and the Securities Investor Protection Act of 1970, 15 U.S.C. § 78aaa et seq.). 15 U.S.C. § 6821(f).

• Child support judgments. The GLBA does not prevent any state-licensed private investigator, or any officer, employee, or agent of such private investigator, from obtaining customer information of a financial institution, to the extent reasonably necessary to collect child support from a person adjudged to have been delinquent in his or her obligations by a federal or state court, and to the extent that such action by a state-licensed private investigator is not unlawful under any other federal or state law or regulation and has been authorized by an order or judgment of a court of competent jurisdiction. 15 U.S.C. § 6821(g).

§ 5.5.4 Enforcement of Pretexting Rule

Compliance with the pretexting rule is enforced by the FTC, except in the cases noted below. 15 U.S.C. § 6822(a). The following agencies enforce the pretexting rule under

• Section 8 of the Federal Deposit Insurance Act, 12 U.S.C. § 1818, in the case of

– national banks, and federal branches and federal agencies of foreign banks, by the Office of the Comptroller of the Currency;

– member banks of the Federal Reserve System (other than national banks), branches and agencies of foreign banks (other than federal branches, federal agencies, and insured state branches of foreign banks), commercial lending companies owned or controlled by foreign banks, and organizations operating under Section 25 or 25A of the Federal Reserve Act, 12 U.S.C. §§ 601 et seq. and 611 et seq., by the Federal Reserve Board;

– banks insured by the Federal Deposit Insurance Corporation (other than members of the Federal Reserve System and national nonmember banks) and insured state branches of foreign banks, by the Board of Directors of the Federal Deposit Insurance Corporation; and

– savings associations the deposits of which are insured by the Federal Deposit Insurance Corporation, by the Director of the Office of Thrift Supervision; and

• the Federal Credit Union Act, 12 U.S.C. § 1751 et seq., by the Administrator of the National Credit Union Administration with respect to any federal credit union.

15 U.S.C. § 6822(b)(1).

For purposes of an agency’s exercise of these powers, a violation of the pretexting rule is deemed to be a violation of a requirement imposed under the law that the agency enforces. In addition to its powers noted above, each of the agencies may exercise any other authority conferred on it by law. 15 U.S.C. § 6822(b)(2).

§ 5.5.5 Criminal Penalty for Violation of Pretexting Rule

In general, a person who knowingly and intentionally violates, or knowingly and intentionally attempts to violate, the pretexting rule of the GLBA shall be fined in accordance with Title 18 of the United States Code or imprisoned for not more than five years, or both. 15 U.S.C. § 6823(a). In an aggravated case, a person who violates or attempts to violate 15 U.S.C. § 6821 while violating another law of the United States, or as part of a pattern of any illegal activity involving more than $100,000 in a twelve-month period, shall be fined twice the amount provided in Subsection (b)(3) or (c)(3) (as the case may be) of 18 U.S.C. § 3571, imprisoned for not more than ten years, or both. 15 U.S.C. § 6823(b).

The GLBA does not supersede state laws, except to the extent that state laws, regulations, orders, or interpretations are inconsistent with the GLBA, and then only to the extent of the inconsistency. A state law, regulation, or order is not inconsistent with the GLBA if it provides greater protection than the GLBA. 15 U.S.C. § 6824.
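The doubled fine caps for aggravated cases follow directly from 18 U.S.C. § 3571, which sets felony fine caps of $250,000 for individuals (Subsection (b)(3)) and $500,000 for organizations (Subsection (c)(3)). A short arithmetic sketch, with invented labels, illustrates the math:

```python
# Arithmetic sketch of the aggravated-case fine caps described above:
# twice the 18 U.S.C. § 3571(b)(3)/(c)(3) base amounts. Labels invented.
BASE_FINE_CAP = {"individual": 250_000, "organization": 500_000}

def aggravated_fine_cap(defendant: str) -> int:
    return 2 * BASE_FINE_CAP[defendant]

print(aggravated_fine_cap("individual"))    # 500000  ($500,000)
print(aggravated_fine_cap("organization"))  # 1000000 ($1,000,000)
```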

§ 5.6 GLBA ENFORCEMENT

§ 5.6.1 Enforcement

Multiple federal regulatory agencies enforce the GLBA, including the CFPB, the FTC, and the federal banking agencies; state insurance agencies also enforce the GLBA. For example, the federal banking agencies, the CFPB, the SEC, and the CFTC have jurisdiction over banks, thrifts, credit unions, brokerage firms, and commodity traders. The enforcing agency depends not only on the target of enforcement but also on whether it is the privacy rule or the safeguards rule that is being enforced. See § 5.5.4, above, for more information about enforcement under the pretexting rule. The GLBA does not provide individuals a private right of action.

§ 5.6.2 Sanctions and Other Liability

(a) Judicial and CFPB Enforcement

The court or the CFPB, as the case may be, in an action or an adjudication proceeding brought under federal consumer financial law, has jurisdiction to grant any appropriate legal or equitable relief with respect to a violation of federal consumer financial law, including a violation of a rule or an order prescribed under a federal consumer financial law. The CFPB, the state attorney general, or the state regulator may recover its costs in connection with prosecuting such action if the CFPB, the state attorney general, or the state regulator is the prevailing party in the action. 12 U.S.C. § 5565(a)–(b). Relief may include, without limitation,

• rescission or reformation of contracts;

• refund of moneys or return of real property;

• restitution;

• disgorgement of or compensation for unjust enrichment;

• payment of damages or other monetary relief;

• public notification regarding the violation, including the costs of notification;

• limits on the activities or functions of the person; and

• civil money penalties.

12 U.S.C. § 5565(a)(2).

Any person who violates, through any act or omission, any provision of federal consumer financial law shall forfeit and pay a civil penalty pursuant to a three-tier scheme, subject to mitigating factors:

• First tier. For any violation of a law, a rule, or a final order or condition imposed in writing by the CFPB, a civil penalty may not exceed $5,000 for each day during which such violation or failure to pay continues.

• Second tier. Notwithstanding the first tier, for any person who recklessly engages in a violation of a federal consumer financial law, a civil penalty may not exceed $25,000 for each day during which such violation continues.

• Third tier. Notwithstanding the two prior tiers, for any person who knowingly violates a federal consumer financial law, a civil penalty may not exceed $1 million for each day during which such violation continues.

12 U.S.C. § 5565(c).

In determining the amount of any penalty assessed, the CFPB or the court shall take into account the appropriateness of the penalty with respect to the following mitigating factors:

• the size of financial resources and the good faith of the person charged;

• the gravity of the violation or the failure to pay;

• the severity of the risks to or losses of the consumer, which may take into account the number of products or services sold or provided;

• the history of previous violations; and

• such other matters as justice may require.

12 U.S.C. § 5565(c)(3).
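The per-day structure of the three tiers makes potential exposure easy to approximate. The sketch below uses the statutory caps as written; the actual figures are periodically adjusted for inflation by regulation and remain subject to the mitigating factors above, so this is an upper-bound illustration only.

```python
# Rough upper-bound sketch of the 12 U.S.C. § 5565(c) per-day penalty
# tiers described above (statutory amounts, before inflation adjustment
# and before any mitigating factors are applied).
TIER_DAILY_CAP = {1: 5_000, 2: 25_000, 3: 1_000_000}

def max_civil_penalty(tier: int, days: int) -> int:
    """Ceiling on the penalty for a violation continuing for `days` days."""
    return TIER_DAILY_CAP[tier] * days

# A knowing (third-tier) violation continuing for 30 days:
print(f"${max_civil_penalty(3, 30):,}")  # $30,000,000
```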

(b) FTC Enforcement

Penalties for violations of the GLBA are determined by the authorizing statute of the agency that brings the enforcement action. For example, for enforcement actions brought by the FTC, sanctions and other civil liability include penalties of up to $16,000 per offense. Persons and entities who obtain, attempt to obtain, cause to be disclosed, or attempt to cause to be disclosed customer information of a financial institution (relating to another person) through false, fictitious, or fraudulent means can be fined and imprisoned for up to five years. Criminal penalties increase to up to ten years’ imprisonment and fines of up to $500,000 (for an individual) or $1 million (for a company) if the acts are committed or attempted while violating another U.S. law or as part of a pattern of illegal activity involving more than $100,000 in a year.

In August 2017, the FTC proposed a settlement with TaxSlayer.com, as a financial institution, and noted that financial institutions must comply with the GLBA’s privacy rule and safeguards rule. The privacy rule (known as Regulation P in the CFPB version) requires covered companies to provide notices to consumers that explain their privacy practices. In addition, the FTC noted that the safeguards rule protects the security, confidentiality, and integrity of customer information by requiring covered companies to implement and maintain a comprehensive written information security plan. The FTC emphasized that companies must assess how customers’ financial information could be put at risk and then implement safeguards tailored to those risks; the FTC specifically noted that cutting and pasting WISP controls from elsewhere is not acceptable under its review, and organizations must take an entity-specific approach to determining and implementing safeguards. The FTC requires TaxSlayer to comply with the privacy and safeguards rules, and TaxSlayer will also be subject to biennial independent assessments for the next ten years.


EXHIBIT 5A—Regulatory Enforcement Authority for the Privacy, Safeguards, and Pretexting Rules

Consumer Financial Protection Bureau

• Financial Privacy Rule: Authority over depository institutions and credit unions with more than $10 billion in deposits, and over non-banking financial institutions other than auto dealers or SEC- or CFTC-regulated entities. May recommend action against smaller banks.

• Safeguards Rule: None.

• Pretexting Rule: None.

Federal Trade Commission

• Financial Privacy Rule: Enforcement authority over certain auto dealers.

• Safeguards Rule: Catchall authority for entities not regulated specifically by other regulators.

• Pretexting Rule: The FTC is the compliance authority, unless another regulator is noted below.

Banking Regulators

• Financial Privacy Rule: Depository institutions with $10 billion or less under deposit.

• Safeguards Rule: All depository institutions.

• Pretexting Rule: National banks, and federal branches and federal agencies of foreign banks, by the Office of the Comptroller of the Currency; member banks of the Federal Reserve System (other than national banks), branches and agencies of foreign banks (other than federal branches, federal agencies, and insured state branches of foreign banks), commercial lending companies owned or controlled by foreign banks, and organizations operating under Section 25 or 25A of the Federal Reserve Act, by the Federal Reserve Board; banks insured by the Federal Deposit Insurance Corporation (other than members of the Federal Reserve System and national nonmember banks) and insured state branches of foreign banks, by the Board of Directors of the Federal Deposit Insurance Corporation; and savings associations the deposits of which are insured by the Federal Deposit Insurance Corporation, by the Director of the Office of Thrift Supervision.

National Credit Union Administration

• Financial Privacy Rule: Credit unions with $10 billion or less under deposit.

• Safeguards Rule: Credit unions.

• Pretexting Rule: Credit unions.

Commodity Futures Trading Commission

• Financial Privacy Rule: Commodities and futures traders.

• Safeguards Rule: Commodities and futures traders.

• Pretexting Rule: None.

Securities and Exchange Commission

• Financial Privacy Rule: Securities brokerages and other SEC-regulated entities.

• Safeguards Rule: Securities brokerages and other SEC-regulated entities.

• Pretexting Rule: None.

State Insurance Authorities

• Financial Privacy Rule: State-licensed insurance businesses.

• Safeguards Rule: State-licensed insurance businesses.

• Pretexting Rule: None.


EXHIBIT 5B—Model Privacy Notice Form: Opt Out

This model privacy notice form is located at https://www.ftc.gov/sites/default/files/attachments/press-releases/federal-regulators-issue-final-model-privacy-notice-form/privacymodelform.pdf.


EXHIBIT 5C—Model Privacy Notice Form: No Opt Out

This model privacy notice form is located at https://www.ftc.gov/sites/default/files/attachments/press-releases/federal-regulators-issue-final-model-privacy-notice-form/privacymodelform_nooptout.pdf.


CHAPTER 6

Information Security in Health Care: HIPAA and HITECH

David S. Szabo, Esq.
Locke Lord LLP, Boston

§ 6.1 Introduction ... 6–2
§ 6.2 Statutory Authority for the Original Privacy and Information Security Rules and the HITECH Amendments ... 6–3
§ 6.3 Brief Overview of the HIPAA Privacy Rule and How It Relates to Information Security ... 6–4
§ 6.4 The HIPAA Information Security Rule ... 6–5
  § 6.4.1 Key Concepts ... 6–5
    (a) Protected Health Information ... 6–5
    (b) Covered Entities and Business Associates ... 6–6
    (c) The Security Rule Applies Only to Electronic PHI ... 6–7
    (d) Scalable ... 6–8
    (e) Technology Neutral ... 6–9
    (f) Relationship to Industry Standards ... 6–10
    (g) Required and Addressable Specifications ... 6–11
  § 6.4.2 The Core Requirement ... 6–12
  § 6.4.3 The Centrality of Risk Assessment ... 6–13
  § 6.4.4 Administrative Safeguards ... 6–14
  § 6.4.5 Physical Safeguards ... 6–16
  § 6.4.6 Technical Safeguards ... 6–16
  § 6.4.7 Information Security Guidance ... 6–18
§ 6.5 Enforcement and Penalties: The Impact of HITECH ... 6–18
  § 6.5.1 Overview of the Enforcement Process ... 6–18
  § 6.5.2 Penalties Under the Original HIPAA Statute ... 6–18
  § 6.5.3 Penalties After HITECH ... 6–19
  § 6.5.4 Vicarious Liability for the Acts of Business Associates ... 6–20
  § 6.5.5 Enforcement of the HIPAA Regulations by the Office of Civil Rights ... 6–23
  § 6.5.6 Open Issues and Questions Under HITECH and the Enforcement Rule ... 6–23
§ 6.6 Breach Notification and Information Security ... 6–27
  § 6.6.1 Brief Overview of the Breach Notification Rule ... 6–27
  § 6.6.2 Impact on Information Security and Enforcement ... 6–28
    (a) Are Breaches Violations of the Security Rule? ... 6–28
    (b) Publicity, Enforcement, and Private Litigation ... 6–28

EXHIBIT 6A—HIPAA Security Standards Matrix ... 6–30

Scope Note

This chapter provides an overview of the provisions of the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Pub. L. No. 104-191, and the Health Information Technology for Economic and Clinical Health Act (HITECH), part of the American Recovery and Reinvestment Act of 2009, as such laws relate to protected health information.

§ 6.1 INTRODUCTION

This chapter will discuss the information security requirements applicable to the health-care industry as authorized by the Health Insurance Portability and Accountability Act of 1996 (HIPAA). These requirements are set forth in regulations at 45 C.F.R. § 164.300, which will be referred to as the “security rule.” The security rule is one of five integrated rules that deal with

• standards and formats for electronic data interchange transactions (the “transaction rule”);

• the privacy of protected health-care information (the “privacy rule”);

• information security and breaches of unsecured protected health information (the “data breach rule”); and

• a rule for administrative enforcement of the other four rules (the “enforcement rule”).

While the security rule has been in force, largely in its current form, since 2003, it has taken on greatly increased importance since 2009, for several reasons. First, with the passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009, the penalties for violations of the security rule increased by more than an order of magnitude. Second, compliance with the security rule is an element in qualifying for federal and state subsidies for the implementation and use of electronic health records under a program commonly called “Meaningful Use.” Third, the data breach rule, also introduced by HITECH, has highlighted the risks associated with breaches of electronic protected health information (PHI), as have widely publicized data breach events affecting health plans, health-care providers, and large national companies outside of the health-care industry. Even a cursory review of enforcement actions related to data breaches reveals that most settlements include an allegation of a violation of the security rule.

In 2012, the Office of Civil Rights (OCR) commissioned a small sample audit of twenty entities subject to HIPAA regulations. The audit found widespread noncompliance with the security rule, especially among smaller entities. As a result, the OCR redoubled its efforts to stress the importance of information security in its reviews of data breaches and its enforcement activities.

This chapter will review the statutory authority for the security rule and its relationship to the privacy rule, and then provide a more in-depth review of the contents of the rule, its relationship to industry standards, enforcement of the rule after HITECH, and how the rule relates to breach notification.

§ 6.2 STATUTORY AUTHORITY FOR THE ORIGINAL PRIVACY AND INFORMATION SECURITY RULES AND THE HITECH AMENDMENTS

The transactions rule, the privacy rule, the security rule, and the enforcement rule were originally authorized by the administrative simplification provisions of HIPAA, found at Pub. L. No. 104-191 (1996). Section 1173(d) of the Act requires the secretary of Health and Human Services (the “secretary”) to adopt standards for the protection of individually identifiable health information, as follows:

(d) SECURITY STANDARDS FOR HEALTH INFORMATION.—

(1) SECURITY STANDARDS.—The Secretary shall adopt security standards that—

(A) take into account—

(i) the technical capabilities of record systems used to maintain health information;

(ii) the costs of security measures;

(iii) the need for training persons who have access to health information;

(iv) the value of audit trails in computerized record systems; and

(v) the needs and capabilities of small health care providers and rural health care providers (as such providers are defined by the Secretary); and

(B) ensure that a health care clearinghouse, if it is part of a larger organization, has policies and security procedures which isolate the activities of the health care clearinghouse with respect to processing information in a manner that prevents unauthorized access to such information by such larger organization.


(2) SAFEGUARDS.—Each person described in section 1172(a) who maintains or transmits health information shall maintain reasonable and appropriate administrative, technical, and physical safeguards—

(A) to ensure the integrity and confidentiality of the information;

(B) to protect against any reasonably anticipated—

(i) threats or hazards to the security or integrity of the information; and

(ii) unauthorized uses or disclosures of the information; and

(C) otherwise to ensure compliance with this part by the officers and employees of such person.

Pub. L. No. 104-191, § 1173(d) (1996).

The secretary promulgated the final security rule as authorized by HIPAA in 2003, and the rule survives in substantially the same form. HITECH and its implementing regulations made some minor changes to the security rule in 2009 and 2013, respectively, but those changes were not extensive. Notably, the HIPAA statute does not spell out any of the requirements of the privacy rule, but simply authorizes the secretary to make recommendations for the protection of privacy and to promulgate regulations if Congress does not act on her recommendations within thirty-six months after they were made. Congress did not act, and the secretary promulgated the privacy rule in final form on December 28, 2000.

§ 6.3 BRIEF OVERVIEW OF THE HIPAA PRIVACY RULE AND HOW IT RELATES TO INFORMATION SECURITY

To fully understand the security rule, you need a basic understanding of the privacy rule, as the two rules are meant to work together. The security rule provides comprehensive guidance on how to protect certain information, while the privacy rule sets forth, in great detail, the permitted uses and disclosures of that same information. More significantly, the privacy rule states, in essence, that a use or a disclosure of PHI that is not permitted by the privacy rule is prohibited. The security rule, in turn, sets forth a framework for protecting the confidentiality of information, the use and disclosure of which are governed by the privacy rule. All five administrative simplification regulations apply to only a limited class of persons, defined as “covered entities” and “business associates.” These terms are defined in the “General Administrative Requirements” section of the administrative simplification regulations at 45 C.F.R. § 160.103.

§ 6.4 THE HIPAA INFORMATION SECURITY RULE

The security rule is founded on several key concepts that inform an understanding of how the rule applies and should be interpreted. These key concepts are

• “protected health information”;

• identification of who is regulated;

• the information that is subject to regulation;

• scalability;

• technological neutrality;

• the rule’s relationship to industry standards; and

• addressable and required safeguards.

After reviewing these key concepts, this section will review the security rule’s core requirement that covered entities and business associates adopt “reasonable and appropriate safeguards to protect the confidentiality, integrity, and availability of electronic protected health information.” This will be followed by a discussion of the required and addressable implementation specifications in each of the three domains of security: administrative, physical, and technical safeguards.

§ 6.4.1 Key Concepts

(a) Protected Health Information

A key concept for all of the HIPAA regulations is “protected health information” (PHI). The definition of PHI, in turn, incorporates the terms “individually identifiable health information” and “health information,” both of which are defined terms in the regulation. Combining all three definitions, PHI is defined as information, including demographic information collected from an individual and genetic information, that is created or received by a health-care provider, a health plan, an employer, or a health-care clearinghouse, and that relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual; and that identifies the individual, or with respect to which there is a reasonable basis to believe the information can be used to identify the individual.

Protected health information excludes information in education records covered by the Family Educational Rights and Privacy Act (FERPA), as amended, 20 U.S.C. § 1232g; in records described at 20 U.S.C. § 1232g(a)(4)(B)(iv); in employment records held by a covered entity in its role as employer; and regarding a person who has been deceased for more than fifty years.

Practice Note
“Records described at 20 U.S.C. § 1232g(a)(4)(B)(iv)” are records on a student who is eighteen years of age or older, or who is attending an institution of postsecondary education, that are made or maintained by a physician, psychiatrist, psychologist, or other recognized professional or paraprofessional acting in his or her professional or paraprofessional capacity, or assisting in that capacity, and that are made, maintained, or used only in connection with the provision of treatment to the student, and are not available to anyone other than persons providing such treatment, except that such records can be personally reviewed by a physician or other appropriate professional of the student’s choice. It is unclear why these records, not protected by FERPA, also are not PHI.

(b) Covered Entities and Business Associates

Like the other four HIPAA regulations, the security rule applies only to covered entities and to business associates. Covered entities are

• health-care providers that engage in the standard electronic data interchange transactions with health plans;

• health plans;

• clearinghouses that convert nonstandard electronic data interchange transactions to standard formats to facilitate the exchange of transactions between health-care providers and health plans; and

• certain providers of Medicare discount cards.

Notably, health-care providers that do not use any of the standard electronic data interchange transactions are not regulated by the security rule. Similarly, insurance companies that provide only “excepted health benefits,” such as automobile liability insurers that cover medical expenses, workers’ compensation insurers, life insurers, or disability insurers, are not “health plans” and are not regulated by the security rule.

Practice Note
For a list of these types of insurers, see Section 2791(c)(1) of the Public Health Service Act, 42 U.S.C. § 300gg-91(c)(1).

Determining whether a person or an entity is a “covered entity” is relatively straightforward, and the OCR provides a decision tree on its website at http://www.cms.gov/Regulations-and-Guidance/HIPAA-Administrative-Simplification/HIPAAGenInfo/AreYouaCoveredEntity.html to assist in making this determination.

On the other hand, the identification of business associates is more complex and has led to significant controversy. Generally (and imprecisely) speaking, a business associate is a person or an entity that is engaged by a covered entity to perform a service for the covered entity and who must use, disclose, or maintain PHI received from or collected on behalf of the covered entity in order to perform that service. Clear examples of business associates include companies that process and collect bills for physicians and hospitals, and medical transcription services that transcribe notes for physicians and hospitals. Less obvious examples include lawyers that provide legal advice concerning medical peer review matters, cloud computing companies that maintain PHI, and a wide variety of other persons and entities.

It should be noted that not all service providers working for covered entities are business associates. For example, a company providing janitorial services probably is not, as its employees do not need access to PHI to do their jobs. Entities that provide routine financial services to covered entities are not business associates but can become business associates if they provide specialized services that require the use of PHI. Liability insurers are not business associates simply by providing liability insurance, but become business associates if they provide specialized risk management services. 78 Fed. Reg. 5575 (Jan. 25, 2013).

Practice Note
In regard to entities that provide routine financial services to covered entities, Section 1179 of the Health Insurance Portability and Accountability Act, Pub. L. No. 104-191 (1996), states that HIPAA shall not apply to the entity with respect to such activities that include, for example, “[t]he use or disclosure of information by the entity for authorizing, processing, clearing, settling, billing, transferring, reconciling or collecting a payment for, or related to, health plan premiums or health care, where such payment is made by any means, including a credit, debit, or other payment card, an account, check, or electronic funds transfer.”

Determining who is, and who is not, a business associate is a key aspect of providing advice on compliance with the security rule. As will be noted later, given the expanded penalties under HITECH, this can turn out to be a crucial determination.
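Although the determination is ultimately fact-specific, the general shape of the analysis can be sketched as a short decision helper. Everything in the following Python sketch is a hypothetical simplification of the discussion above, not a test drawn from the regulations.

```python
# Simplified, hypothetical sketch of the business-associate analysis
# discussed above; real determinations turn on the specific facts.
def likely_business_associate(performs_service_for_ce: bool,
                              needs_phi_for_service: bool,
                              routine_financial_services_only: bool) -> bool:
    if not performs_service_for_ce:
        return False
    if routine_financial_services_only:
        return False  # the Section 1179 payment-processing carve-out
    return needs_phi_for_service

print(likely_business_associate(True, True, False))   # billing company: True
print(likely_business_associate(True, False, False))  # janitorial firm: False
```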

(c) The Security Rule Applies Only to Electronic PHI

The security rule applies only to electronic protected health information (EPHI). While the privacy rule governs the use and disclosure of PHI in oral, paper, and electronic forms, the security rule is more limited. Electronic PHI is defined as “individually identifiable health information . . . that is: (i) Transmitted by electronic media; [or] (ii) Maintained in electronic media.” 45 C.F.R. § 160.103. Electronic media, in turn, is defined to mean:

(1) Electronic storage material . . . including . . . devices in computers (hard drives) and any removable/transportable digital memory medium, such as magnetic tape or disk, optical disk, or digital memory card; or

(2) Transmission media used to exchange information already in electronic storage media. Transmission media include, for example, the Internet, extranet or intranet, leased lines, dial-up lines, private networks, and the physical movement of removable/transportable electronic storage media. Certain transmissions, including of paper, via facsimile, and of voice, via telephone, are not considered to be transmissions via electronic media if the information being exchanged did not exist in electronic form before the transmission.

45 C.F.R. § 160.103.

While the definition of EPHI is fairly intuitive, it should be noted that the definition captures “the physical movement of removable/transportable electronic storage media.” This includes the transfer of tapes, disk drives, and other media by truck, or even the transport of media by an individual on foot (sometimes called “sneaker-net”).

(d) Scalable

The security rule applies to all covered entities and business associates, regardless of size, complexity, or financial and technical resources. There is no “small entity” or “small provider” exception to the security rule. On its face, it seems quite unfair to subject a small business or a sole practitioner to the same information security requirements as a large hospital system or a national health plan (alternatively, it might seem insufficient to subject a large organization to a rule that could easily be satisfied by a small organization or an individual person). The security rule’s answer to this dilemma is the concept of “scalability,” which means that compliance with the rule should be scaled to fit the size, complexity, and resources of the organization. The commentary associated with the final security rule explains it this way:

The proposed [information security] rule proposed to define the security standard as a set of scalable, technology-neutral requirements with implementation features that providers, plans, and clearinghouses would have to include in their operations to ensure that health information pertaining to an individual that is electronically maintained or electronically transmitted remains safeguarded. The proposed rule would have required that each affected entity assess its own security needs and risks and devise, implement, and maintain appropriate security to address its own unique security needs. How individual security requirements would be satisfied and which technology to use would be business decisions that each entity would have to make. In the final rule we adopt this basic framework.

68 Fed. Reg. 8341 (Feb. 20, 2003). The secretary also states:

With respect to the comments regarding the difference between providers and plans/clearinghouses, we have structured the Security Rule to be scalable and flexible enough to allow different entities to implement the standards in a manner that is appropriate for their circumstances.

68 Fed. Reg. 8338 (Feb. 20, 2003).

The concept of scalability is embodied in the following provisions of the security rule:

(b) Flexibility of approach.

(1) Covered entities and business associates may use any security measures that allow the covered entity or business associate to reasonably and appropriately implement the standards and implementation specifications as specified in this subpart.

(2) In deciding which security measures to use, a covered entity or business associate must take into account the following factors:

(i) The size, complexity, and capabilities of the covered entity or business associate.

(ii) The covered entity’s or the business associate’s technical infrastructure, hardware, and software security capabilities.

(iii) The costs of security measures.

(iv) The probability and criticality of potential risks to electronic protected health information.

45 C.F.R. § 164.306(b). Scalability is particularly important in the performance of risk analysis, as will be discussed later in this chapter.

(e) Technology Neutral

Scalability is also associated with another important concept underlying the security rule: technological neutrality. The secretary made it clear that the government does not mandate the choice of technologies to be employed, much less specific vendors or specific implementations within a type of technology. As the secretary noted in the preamble:

We do not believe it is appropriate to make the standards technology-specific because technology is simply moving too fast, for example, the increased use and sophistication of internet-enabled hand held devices. We believe that the implementation of these rules will promote the security of electronic protected health information by (1) providing integrity and confidentiality; (2) allowing only authorized individuals access to that information; and (3) ensuring its availability to those authorized to access the information. The standards do not allow organizations to make their own rules, only their own technology choices.

68 Fed. Reg. 8343 (Feb. 20, 2003).

The secretary’s discussion of encryption technology is a good example of this concept applied to a specific requirement:

c. Comment: A number of comments were received asking that this final rule establish a specific (or at least a minimum) cryptographic algorithm strength. Others recommended that the rule not specify an encryption strength since technology is changing so rapidly. Several commenters requested guidelines and minimum encryption standards for the Internet. Another stated that, since an example was included (small or rural providers for example), the government should feel free to name a specific encryption package. One commenter stated that the requirement for encryption on the Internet should reference the “CMS Internet Security Policy.”

Response: We remain committed to the principle of technology neutrality and agree with the comment that rapidly changing technology makes it impractical and inappropriate to name a specific technology. Consistent with this principle, specification of an algorithm strength or specific products would be inappropriate. Moreover, rapid advances in the success of “brute force” cryptanalysis techniques suggest that any minimum specification would soon be outmoded. We maintain that it is much more appropriate for this final rule to state a general requirement for encryption protection when necessary and depend on covered entities to specify technical details, such as algorithm types and strength. Because “CMS Internet Security Policy” is the policy of a single organization and applies only to information sent to CMS, and not between all covered entities, we have not referred to it here.

68 Fed. Reg. 8357 (Feb. 20, 2003).
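Because the rule leaves algorithm choices to the covered entity, the entity’s own documentation, not the regulation, records details such as algorithm type and key strength. As a purely illustrative example of one such documented choice (and not anything the rule mandates), an entity might encrypt EPHI at the application layer with AES-256-GCM using the widely used third-party Python "cryptography" package:

```python
# Purely illustrative technology choice under the technology-neutral rule:
# AES-256-GCM via the third-party "cryptography" package
# (pip install cryptography). The rule mandates no particular algorithm,
# key length, or library; the sample record content here is invented.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # the documented strength choice
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # must be unique per message

ciphertext = aesgcm.encrypt(nonce, b"EPHI record 12345", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"EPHI record 12345"
```

In practice, key management, media encryption, and transmission security raise their own implementation specifications; the point here is only that the specific algorithm and strength are the entity’s documented decision.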

(f) Relationship to Industry Standards

Numerous industry standards exist in the field of information security. These include standards for the operation of every aspect of an information security program, such as the ISO 27000 series of standards, as well as specific standards for the implementation of specific safeguards, such as encryption. Additionally, the National Institute of Standards and Technology (NIST), a government agency, publishes standards and guidance documents relating to information security. The proposed information security rule contained an appendix mapping its requirements to commonly used industry standards. However, no industry standards were officially adopted as part of the final security rule. Instead, the secretary determined that no one industry standard, or group of standards, was sufficient to meet all of the requirements of the HIPAA statute. The secretary stated:


The currently existing security standards developed by ANSI recognized SDOs (Standards Development Organizations) are targeted to specific technologies and/or activities. No existing security standard, or group of standards, is technology-neutral, scalable to the extent required by HIPAA, and broad enough to be adopted in this final rule. Therefore, this final rule adopts standards under section 1172(c)(2)(B) of the Act, which permits us to develop standards when no industry standards exist.

68 Fed. Reg. 8345 (Feb. 20, 2003).

In other words, since no industry standards meet the requirements of the statute, the secretary has developed more-general standards as set forth in the rule. Covered entities and business associates remain free to use safeguards described in specific industry standards if they deem those safeguards to be suitable.

Practice Note
For example, the Secretary’s Guidance to Render Unsecured Protected Health Information Unusable, Unreadable, or Indecipherable to Unauthorized Individuals refers to several federal standards and guidelines for encryption and the destruction of media containing PHI.

(g) Required and Addressable Specifications

The final security rule contains standards, which are broad and general, and implementation specifications that specify safeguards that can be used to meet each standard. Some of the implementation specifications are required, while others are addressable. “Addressable” does not mean “optional.” The secretary explained the distinction as follows:

Paragraph (d) of § 164.306 establishes two types of implementation specifications, required and addressable. It provides that required implementation specifications must be met. However, with respect to implementation specifications that are addressable, § 164.306(d)(3) specifies that covered entities must assess whether an implementation specification is a reasonable and appropriate safeguard in its environment, which may include consideration of factors such as the size and capability of the organization as well as the risk. If the organization determines it is a reasonable and appropriate safeguard, it must implement the specification. If an addressable implementation specification is determined not to be a reasonable and appropriate answer to a covered entity’s security needs, the covered entity must do one of two things: implement another equivalent measure if reasonable and appropriate; or if the standard can otherwise be met, the covered entity may choose to not implement the implementation specification or any equivalent alternative measure at all. The covered entity must document the rationale behind not implementing the implementation specification.

68 Fed. Reg. 8341 (Feb. 20, 2003).

In the event of a dispute or an enforcement action, documentation of the decision-making process over the treatment of addressable safeguards could be crucial. As noted by the secretary, standards always must be met. If an implementation specification is addressable, it may be omitted if the standard is otherwise met or the specification is unnecessary; however, the choice to omit the implementation specification must be documented, and the choice must be reasonable. The cost of an implementation specification is a factor that a covered entity or a business associate may consider, but cost cannot be used as a justification for entirely disregarding a standard or the objectives of the rule. As the secretary noted:

We disagree that covered entities are given complete discretion to determine their security policies under this rule, resulting in effect, in no standards. While cost is one factor a covered entity may consider in determining whether to implement a particular implementation specification, there is nonetheless a clear requirement that adequate security measures be implemented, see 45 C.F.R. 164.306(b). Cost is not meant to free covered entities from this responsibility.

68 Fed. Reg. 8343 (Feb. 20, 2003).

The use of addressable specifications is an aspect of the scalability and technology neutrality of the security rule. As noted, however, it does not create absolute discretion. Appendix A to the security rule contains a matrix listing each standard, the section of the security rule where the standard can be found, the implementation specifications associated with each standard, and a notation of whether each specification is required (R) or addressable (A). The matrix is included as Exhibit 6A.
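The required/addressable logic can be restated compactly. The following sketch is a hypothetical illustration of the decision path described in the commentary above; the example specification and field names are invented, and Appendix A (Exhibit 6A) remains the authoritative matrix.

```python
# Hypothetical sketch of the required ("R") versus addressable ("A")
# decision path described above. The example spec is invented.
from dataclasses import dataclass

@dataclass
class Spec:
    name: str
    required: bool              # True = "R"; False = "A" (addressable)
    reasonable_here: bool       # entity's documented assessment
    alternative_in_place: bool  # equivalent measure implemented instead
    rationale_documented: bool  # written rationale for any omission

def must_implement(spec: Spec) -> bool:
    """Required specs must be met; addressable specs must be implemented
    when the entity's assessment finds them reasonable and appropriate."""
    return spec.required or spec.reasonable_here

def omission_defensible(spec: Spec) -> bool:
    """An addressable spec may be omitted only with an equivalent measure
    or a documented rationale that the standard is otherwise met."""
    if must_implement(spec):
        return False
    return spec.alternative_in_place or spec.rationale_documented

spec = Spec("encryption at rest", False, False, True, True)
print(must_implement(spec), omission_defensible(spec))  # False True
```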

§ 6.4.2 The Core Requirement

The core requirement of the security rule is set forth in 45 C.F.R, § 164.306. Subsection (a) states as follows: (a) General requirements. Covered entities and business associates must do the following: (1) Ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity creates, receives, maintains, or transmits. (2) Protect against any reasonably anticipated threats or hazards to the security or integrity of such information. (3) Protect against any reasonably anticipated uses or disclosures of such information that are not permitted or required under subpart E of this part [i.e., the privacy rule]. 6–12


(4) Ensure compliance with this subpart by its workforce. The flexibility available to covered entities and business associates is set forth in Subsection (b), as follows: (b) Flexibility of approach. (1) Covered entities and business associates may use any security measures that allow the covered entity or business associate to reasonably and appropriately implement the standards and implementation specifications as specified in this subpart. (2) In deciding which security measures to use, a covered entity or business associate must take into account the following factors: (i) The size, complexity, and capabilities of the covered entity or business associate. (ii) The covered entity’s or the business associate’s technical infrastructure, hardware, and software security capabilities. (iii) The costs of security measures. (iv) The probability and criticality of potential risks to electronic protected health information. 45 C.F.R. § 164.306(a), (b).

§ 6.4.3

The Centrality of Risk Assessment

Subsection (b)(2)(iv) states that the "probability and criticality of potential risks" is part of the decision process in determining which implementation specifications to use. It is important to realize that risk analysis, while only one of the many administrative safeguards that comprise the security rule, is central to the implementation of the security rule as a whole, as has been emphasized by the OCR as a key point of compliance in several resolution agreements. (Examples of resolution agreements between the OCR and covered entities can be found at http://www.hhs.gov/ocr/privacy/hipaa/enforcement/examples/index.html.) The implementation specifications are grouped into three categories: administrative safeguards, physical safeguards, and technical safeguards. Administrative safeguards consist of rules and policies, processes, agreements, and records. Physical safeguards include locks, guards, cameras, and other means to exclude unauthorized persons or prevent the theft of information assets. Technical safeguards include firewalls, encryption, access controls, and other technical means of protecting information assets and information. Sometimes the implementation of a particular means of protecting information requires both an administrative safeguard, such as the process of authorizing a particular person to have access to a category of information, and a technical control, such as implementing username and password controls such that only authorized persons can obtain access to certain information contained in an information system.
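Risk analysis methodologies of the kind referenced in the security rule commentary, such as NIST SP 800-30, commonly rate each identified threat by likelihood and impact. The following Python sketch illustrates that scoring approach; the threat names, numeric scales, and thresholds are hypothetical choices made for the example, not values prescribed by the rule or by NIST.

    # Illustrative likelihood-x-impact risk scoring (cf. NIST SP 800-30).
    # Scales (1-5) and thresholds are hypothetical.

    THREATS = {
        "lost unencrypted laptop": (4, 5),  # (likelihood, impact)
        "phishing of workforce members": (5, 4),
        "flood in the data center": (1, 5),
    }

    def risk_level(likelihood: int, impact: int) -> str:
        score = likelihood * impact
        if score >= 15:
            return "high"
        if score >= 8:
            return "moderate"
        return "low"

    # Rank threats so that remediation can be prioritized.
    for name, (likelihood, impact) in sorted(
            THREATS.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
        print(f"{name}: {risk_level(likelihood, impact)}")

A ranking of this kind feeds directly into the risk management specification, which calls for measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level.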


The following sections of this chapter will discuss each category of implementation specification, noting those that are required and those that are addressable.

§ 6.4.4

Administrative Safeguards

There are nine standards in the category of administrative safeguards, with twenty-two distinct implementation specifications. Of those, ten are required and twelve are addressable. The first listed standard, the security management process, consists of four implementation specifications, all of which are required. These are risk analysis, risk management, sanction policy, and information system activity review. In response to a request to prioritize the standards, the secretary commented that "the security management process is listed first under the Administrative safeguards section, as we believe this forms the foundation on which all of the other standards depend." 68 Fed. Reg. 8344 (Feb. 20, 2003). In response to comments that the security management process standard was too burdensome, the secretary noted: This standard and its component implementation specifications form the foundation upon which an entity's necessary security activities are built. See NIST SP 800–30, ''Risk Management Guide for Information Technology systems,'' chapters 3 and 4, January 2002. An entity must identify the risks to and vulnerabilities of the information in its care before it can take effective steps to eliminate or minimize those risks and vulnerabilities. Some form of sanction or punishment activity must be instituted for noncompliance. Indeed, we question how the statutory requirement for safeguards ''to ensure compliance . . . by a [covered entity's] officers and employees'' could be met without a requirement for a sanction policy. See section 1176(d)(2)(C) of the Act. 68 Fed. Reg. 8346 (Feb. 20, 2003). Another key administrative safeguard is "assigned security responsibility." In essence, this is the requirement to appoint a security officer, even if that person also has other duties. In the absence of a security officer (or a person with the responsibility to oversee the security management process), an organization cannot credibly demonstrate that it is implementing the security rule effectively. The other administrative safeguards are more specific. The workforce security standard requires a process for authorizing access to EPHI or providing appropriate supervision of personnel; processes and rules for determining the persons who should have access to information; and clear processes for terminating access when personnel are terminated or their job function changes such that access to information is no longer appropriate. All of these implementation specifications are addressable. The information access management standard requires covered entities and business associates to implement policies and procedures that, based upon the entity's access


authorization policies, establish, document, review, and modify a user's right of access to a workstation, a transaction, a program, or a process. In essence, the entity is required to determine who needs access to EPHI and to create processes for granting, reviewing, modifying, and terminating access rights. While the secretary considers this an important component of security, she also recognizes that, in small organizations, every employee may have access to all information. As the secretary noted, "[i]t is possible that a small practice, with one or more individuals equally responsible for establishing and maintaining all automated patient records, will not need to establish policies and procedures for granting access to that electronic protected health information because the access rights are equal for all of the individuals." 68 Fed. Reg. 8336 (Feb. 20, 2003). Accordingly, this specification is addressable. The security awareness and training standard contains four addressable implementation specifications. It should be noted that, while the secretary made these specifications addressable to give entities flexibility in designing their training programs, she did not make training optional. As she commented, "security training remains a requirement because of its criticality; however, we have revised the implementation specifications to indicate that the amount and type of training needed will be dependent upon an entity's configuration and security risks." 68 Fed. Reg. 8350 (Feb. 20, 2003). The four specific specifications are security reminders for members of the workforce; procedures for guarding against malicious software; procedures for login monitoring; and procedures for creating, changing, and protecting passwords. The standard for security incident procedures requires the creation of a process for responding to and reporting security incidents. The rule requires accurate and current security incident procedures: formal, documented reporting and response procedures so that security violations are reported and handled promptly. The rule also includes an implementation specification for response and reporting, since the secretary considers documenting and reporting incidents, as well as responding to incidents, to be an integral part of a security program. However, the commentary also makes it clear that the security rule, in and of itself, does not require public or external reporting of security incidents (external reporting might be required under the data breach rule, however). The contingency plan standard requires the creation and implementation of policies and procedures for responding to an emergency or other occurrence (for example, fire, vandalism, system failure, or natural disaster) that damages systems containing EPHI. The implementation specifications for having a data backup plan, a disaster recovery plan, and an emergency mode operation plan are required. The specifications calling for a testing and revision procedure and an applications and data criticality analysis are addressable, which reflects the fact that these two specifications might not be needed in smaller organizations with relatively simple information systems. The evaluation standard is required. Under this standard, entities must periodically conduct an evaluation of their security safeguards to demonstrate and document their compliance with their security policies and the requirements of the security rule.
Further, organizations should assess the need for a new evaluation based on


changes to their security environment since their last evaluation, for example, new technology adopted or responses to newly recognized risks to the security of their information. The security rule does not require the use of an external evaluation consultant and permits self-evaluation when appropriate. The business associate contracts and other arrangements standard requires covered entities to enter into a contract or other arrangement with persons that meet the definition of "business associate" set forth in 45 C.F.R. § 160.103. The covered entity must obtain satisfactory assurances from the business associate that it will appropriately safeguard the information in accordance with the security rule. Of course, after HITECH, business associates are directly regulated by the security rule, which arguably makes this standard redundant. Nevertheless, it remains a required implementation specification under the security rule.

§ 6.4.5

Physical Safeguards

There are four standards comprising the category of physical safeguards: facility access controls, workstation use, workstation security, and device and media controls. Hard experience since the enactment of the security rule, and especially since health-care data breaches became publicly reportable, has shown that the loss and theft of computers, devices, and media pose a significant risk to the security of health-care information and the privacy of individuals. Accordingly, physical safeguards should not be neglected. The facility access control standard has four implementation specifications, all of which are addressable. Contingency operations refer to policies and procedures governing access to a facility during emergency conditions, such as after a fire or flood. A facility security plan should address protection from unauthorized intruders as well as the protection of information systems and records from natural threats. Access control and validation procedures control and validate a person's access to facilities based on his or her role or function, including visitor control procedures. This specification also addresses control of access to software programs for testing and revision (such as issuance of credentials to vendors for software testing and updating). The maintenance records implementation specification requires policies and procedures to document repairs and modifications to the physical components of a facility that are related to security, for example, hardware, walls, doors, and locks.

§ 6.4.6

Technical Safeguards

Technical safeguards are often viewed as the essence of information security. While undoubtedly important and, indeed, crucial, technical safeguards should be adopted and implemented as part of the security management process after a risk analysis. Technical safeguards required by the security rule comprise five standards: access controls, audit controls, integrity controls, person or entity authentication, and transmission security. There are seven implementation specifications within the domain of technical safeguards. Three are required and four are addressable.


Access control comprises four specifications: unique user identification, which is required; emergency access procedures, which also are required; automatic logoff, which is addressable; and encryption and decryption, which also are addressable. It should be noted that encryption as an access control appears to refer to the use of encryption to control access to data at rest. Encryption for this purpose is addressable rather than required, and the decision whether to use it should be based upon the entity's risk analysis. Audit controls, defined as the capability to record and examine system activity, are required. The lack of audit controls for monitoring user activity in information systems has been highlighted by the OCR as a common failure of security programs, especially for smaller entities. The integrity standard requires entities to implement, as appropriate, technical means of validating the integrity of data, i.e., confirming that data has not been inappropriately altered due to technical failures or unauthorized acts. In the government's view, error-correcting memory and magnetic disc storage are examples of the built-in data authentication mechanisms that are ubiquitous in hardware and operating systems today. According to the secretary, the risk analysis process should address what data must be authenticated based upon the particulars of the situations faced by the entity when implementing the security rule. The commentary mentions digital signatures and checksum technology as specific controls that can validate data to assure integrity. The person or entity authentication standard requires that covered entities and business associates implement safeguards in order to corroborate that an entity is who it claims to be. While this is a required standard, no implementation specifications are set forth, as the secretary acknowledged that a wide variety of technical mechanisms, including but not limited to biometrics, passwords, personal identification numbers, telephone callback systems, or hardware tokens, could be used to fulfill this requirement. 68 Fed. Reg. 8356 (Feb. 20, 2003). Practice Note A famous New Yorker cartoon depicts two dogs next to a computer. One dog says, "On the Internet, no one knows you're a dog." This cartoon illustrates the problem of authentication.
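To make the integrity discussion concrete, the following Python sketch shows one simple checksum technique of the kind the commentary mentions: recording a cryptographic hash of a record when it is stored and recomputing it later to detect improper alteration. The file name is hypothetical, and the choice of SHA-256 is an assumption made for the example, not a requirement of the rule.

    # Illustrative checksum-based integrity validation; the file name is
    # hypothetical. A mismatch between hashes signals alteration.

    import hashlib

    def sha256_of(path: str) -> str:
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    baseline = sha256_of("patient_record.dat")   # recorded at write time
    # ... later, on read or on a schedule ...
    if sha256_of("patient_record.dat") != baseline:
        print("integrity check failed: record was altered")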

The transmission security standard requires the use of safeguards to protect "data in motion" and has two addressable implementation specifications: integrity controls and encryption. The essence of the standard is that "when electronic protected health information is transmitted from one point to another, it must be protected in a manner commensurate with the associated risk." 68 Fed. Reg. 8356 (Feb. 20, 2003). Thus, encryption is not required, but the need for encryption must be addressed based on the mode of transmission, technical feasibility, and other factors. It seems reasonable to assume that the government would deem encryption to be considerably more feasible and reasonable today than it was in 2003. "Integrity controls" refers to technical means to ensure that electronically transmitted information is not improperly modified without detection. Again, this implementation specification is addressable, based on technical feasibility and risk.
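In practice, transmission encryption is most often achieved with TLS. The following is a minimal Python sketch of opening a certificate-validating TLS connection using only the standard library; the host name is hypothetical.

    # Minimal TLS client sketch using Python's standard library.
    # "ehr.example.org" is a hypothetical host.

    import socket
    import ssl

    context = ssl.create_default_context()  # validates server certificates
    with socket.create_connection(("ehr.example.org", 443)) as sock:
        with context.wrap_socket(sock, server_hostname="ehr.example.org") as tls:
            print(tls.version())  # e.g., "TLSv1.3"

Whatever mechanism is chosen, the entity's risk analysis should document why the protection is commensurate with the risk of the transmission.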


§ 6.4.7


Information Security Guidance

Many covered entities and business associates, especially smaller organizations with limited technical resources, find understanding and complying with the security rule daunting. Information security is now a professional discipline with its own language and professional standards. Both OCR and NIST have published extensive guidance on how to comply with the information security regulation. In general, the OCR guidance, which can be found at https://www.hhs.gov/hipaa/for-professionals/security/guidance/index.html, is written for nontechnical readers but nevertheless provides highly useful background information on all aspects of the regulations. The NIST guidance documents, by contrast, are highly detailed. While many of the NIST guidance documents focus on the implementation of technical safeguards, such as encryption and firewalls, the NIST documents on risk analysis have been singled out by OCR as setting an accepted standard for the performance of that key function. These and other publications can be found at the NIST Computer Security Resource Center and its list of "special publications" at http://csrc.nist.gov/publications/PubsSPs.html.

§ 6.5

ENFORCEMENT AND PENALTIES: THE IMPACT OF HITECH

§ 6.5.1

Overview of the Enforcement Process

Enforcement of the security rule rests with the government; there is no private right of action. Acara v. Banks, 470 F.3d 569 (5th Cir. 2006). However, the nature of government enforcement changed definitively with the enactment of HITECH in 2009. Most notably, the penalties applicable to violations of the privacy rule, the security rule, and the data breach notification rule increased dramatically. Additionally, state attorneys general, alongside the Office of Civil Rights, were given jurisdiction to enforce the HIPAA regulations, albeit with reduced penalties. HITECH § 13410(e), codified at 42 U.S.C. § 1320d–5(d). This change, combined with mandatory breach notification (essentially a turn-yourself-in-to-the-government rule), greatly increased the likelihood and the cost of enforcement actions.

§ 6.5.2

Penalties Under the Original HIPAA Statute

When the HIPAA administrative simplification provisions were enacted, the penalties for noncompliance were relatively modest. The statute authorized the secretary to impose a penalty of not more than $100 for each violation of a requirement, except that the total amount imposed on the person for all violations of an identical requirement or prohibition during a calendar year could not exceed $25,000. The statute also provided that a penalty could not be imposed if the person who would otherwise be liable for the penalty did not know, and by exercising reasonable diligence would not have known, that such person had violated the provision. Likewise, if the failure to comply was due to reasonable cause and not to willful neglect, and the failure to comply was corrected during the thirty-day period beginning on the first date the person liable for


the penalty knew, or by exercising reasonable diligence would have known, that the failure to comply had occurred, no penalty was to be imposed. These modest penalties, combined with an enforcement approach that tended to emphasize education rather than strict enforcement, led some to criticize the enforcement process as inadequate. This situation changed with the enactment of HITECH.

§ 6.5.3

Penalties After HITECH

The HITECH Act changed the enforcement process drastically by increasing the penalties for each violation, increasing the annual cap on penalties, and rewriting the "intent" standard needed to trigger penalties. Subsequently, the OCR released an amended enforcement rule that authorized the imposition of penalties on a "per-day" basis, greatly increasing the potential penalties that could be imposed for a violation of the HIPAA regulations. After HITECH, penalties were grouped into tiers based on intent as determined by conduct, as follows:

Intent | Range of Penalties Per Violation | Annual Cap for Violation of One Standard
Did not know and could not have known exercising reasonable diligence | $100 to $50,000 | $1.5 million
Reasonable cause and not willful neglect | $1,000 to $50,000 | $1.5 million
Willful neglect corrected within thirty days | $10,000 to $50,000 | $1.5 million
Willful neglect not corrected within thirty days | $50,000 | $1.5 million
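The arithmetic consequences of this tier structure, particularly once penalties are counted per day, can be illustrated with a short Python sketch. The tier labels and dollar figures come from the table above; the function itself and the 365-day example are hypothetical.

    # Illustrative sketch of post-HITECH penalty exposure. Figures are
    # taken from the tier table; the function is hypothetical.

    TIERS = {
        "did not know": (100, 50_000),
        "reasonable cause": (1_000, 50_000),
        "willful neglect, corrected": (10_000, 50_000),
        "willful neglect, not corrected": (50_000, 50_000),
    }
    ANNUAL_CAP = 1_500_000  # cap for violations of one standard per year

    def exposure(tier: str, violations: int) -> tuple[int, int]:
        """Return (minimum, maximum) annual exposure for identical violations."""
        low, high = TIERS[tier]
        return (min(low * violations, ANNUAL_CAP),
                min(high * violations, ANNUAL_CAP))

    # A single "reasonable cause" violation counted per day for a year:
    print(exposure("reasonable cause", 365))  # (365000, 1500000)

As the example shows, per-day counting quickly drives exposure toward the annual cap, a point taken up in § 6.5.6 below.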

In order to fully understand the array of penalties and the discretion afforded to the government, it is helpful to understand the legal meaning of "willful neglect." The OCR has expressly adopted the meaning of this term used in the enforcement of the Internal Revenue Code. Under the regulations, it is defined as "conscious, intentional failure or reckless indifference to the obligation to comply with the administrative simplification provision violated." On its face, this definition appears to require recklessness or an intention to violate the rule. The case law suggests otherwise. In the U.S. Supreme Court case of United States v. Boyle, 469 U.S. 241 (1985), the Court ruled that a taxpayer who missed a deadline for filing a return because he relied on his attorney to file the return for him had not exercised reasonable prudence and could be assessed a penalty for "willful neglect." The Court stated: [O]ne does not have to be a tax expert to know that tax returns have fixed filing dates and that taxes must be paid when they are due. In short, tax returns imply deadlines. Reliance by a lay person on a lawyer is of course common; but that reliance


cannot function as a substitute for compliance with an unambiguous statute. Among the first duties of the representative of a decedent's estate is to identify and assemble the assets of the decedent and to ascertain tax obligations. Although it is common practice for an executor to engage a professional to prepare and file an estate tax return, a person experienced in business matters can perform that task personally. It is not unknown for an executor to prepare tax returns, take inventories, and carry out other significant steps in the probate of an estate. It is even not uncommon for an executor to conduct probate proceedings without counsel. It requires no special training or effort to ascertain a deadline and make sure that it is met. The failure to make a timely filing of a tax return is not excused by the taxpayer's reliance on an agent, and such reliance is not "reasonable cause" for a late filing. United States v. Boyle, 469 U.S. at 252. In Boyle, there was no allegation that the taxpayer intended not to pay his taxes, that he acted recklessly in engaging counsel to file his return for him, or that he failed to properly supervise his lawyer. Simply put, the obligation to file the tax return on time was absolute; the failure of the lawyer to meet the deadline could not be excused and amounted to "willful neglect" by the taxpayer. If the OCR is correct in this interpretation of the statute, the agency will have significant discretion in the imposition of penalties. In this regard, it may become important to distinguish between obligations created under the security rule that are clear-cut (e.g., appointing a security officer) and those that require professional expertise and judgment (e.g., selecting among alternative technical safeguards). It may be easier to demonstrate "willful neglect" in the former category than in the latter, and easier to show "reasonable cause" in the latter.

§ 6.5.4

Vicarious Liability for the Acts of Business Associates

A significant area of concern for many covered entities is their potential vicarious liability for the acts and omissions of their business associates. Under the original HIPAA enforcement rule, a covered entity generally was not liable for the acts of a business associate unless the covered entity knew of a violation by the business associate and failed to act. The commentary to the privacy rule stated: The [notice of proposed rulemaking for the privacy rule] proposed that covered entities be held accountable for the uses and disclosures of individually identifiable health information by their business associates. An entity would have been in violation of the rule if it knew of a breach in the contract by a business associate and failed to cure the breach or terminate


the contract. The final rule reduces the extent to which an entity must monitor the actions of its business associates. The entity no longer has to ''ensure'' that each business associate complies with the rule's requirements. Entities will be required to cure a breach or terminate a contract for business associate actions only if they knew about a contract violation. The final rule is consistent with the oversight a business would provide for any contract, and therefore, the changes in the final rule will impose no new significant cost for small businesses in monitoring their business associates' behavior. 65 Fed. Reg. 82785 (Dec. 28, 2000). The standards under the original security rule were identical. After HITECH, the enforcement rule was amended to state that "a covered entity is liable, in accordance with the federal common law of agency, for a civil money penalty for a violation based on the act or omission of any agent of the covered entity, including a workforce member or business associate, acting within the scope of the agency"; it also states that "a business associate is liable, in accordance with the federal common law of agency, for a civil money penalty for a violation based on the act or omission of any agent of the business associate, including a workforce member or subcontractor, acting within the scope of the agency." 45 C.F.R. § 160.402(c). The changes in the enforcement rule have important implications under HITECH. Covered entities must take care to understand whether their business associates are agents, as they face significant risk of vicarious liability for the acts of business associates that are agents and very limited exposure for the acts of business associates that are not agents. This also has implications for the appropriate terms of a business associate agreement, the need to monitor the activities of business associates, and the need for cyber-risk insurance. The question of whether a business associate or a subcontractor is an agent is not necessarily clear. As noted in the commentary accompanying the final post-HITECH enforcement rule: An analysis of whether a business associate is an agent will be fact specific, taking into account the terms of a business associate agreement as well as the totality of the circumstances involved in the ongoing relationship between the parties. The essential factor in determining whether an agency relationship exists between a covered entity and its business associate (or business associate and its subcontractor) is the right or authority of a covered entity to control the business associate's conduct in the course of performing a service on behalf of the covered entity. The right or authority to control the business associate's conduct also is the essential factor in determining whether an


agency relationship exists between a business associate and its business associate subcontractor. Accordingly, this guidance applies in the same manner to both covered entities (with regard to their business associates) and business associates (with regard to their subcontractors). The authority of a covered entity to give interim instructions or directions is the type of control that distinguishes covered entities in agency relationships from those in non-agency relationships. A business associate generally would not be an agent if it enters into a business associate agreement with a covered entity that sets terms and conditions that create contractual obligations between the two parties. Specifically, if the only avenue of control is for a covered entity to amend the terms of the agreement or sue for breach of contract, this generally indicates that a business associate is not acting as an agent. In contrast, a business associate generally would be an agent if it enters into a business associate agreement with a covered entity that granted the covered entity the authority to direct the performance of the service provided by its business associate after the relationship was established. For example, if the terms of a business associate agreement between a covered entity and its business associate stated that ''a business associate must make available protected health information in accordance with § 164.524 based on the instructions to be provided by or under the direction of a covered entity,'' then this would create an agency relationship between the covered entity and business associate for this activity because the covered entity has a right to give interim instructions and direction during the course of the relationship. An agency relationship also could exist between a covered entity and its business associate if a covered entity contracts out or delegates a particular obligation under the HIPAA Rules to its business associate. As discussed above, whether or not an agency relationship exists in this circumstance again would depend on the right or authority to control the business associate's conduct in the performance of the delegated service based on the right of a covered entity to give interim instructions. 78 Fed. Reg. 5581 (Jan. 25, 2013) (emphasis added). Significantly, the commentary expressly states that a business associate might be an agent for some activities and not for others and that the right of control should be examined for each activity carried out by the business associate. This could become very significant in the event of an alleged violation of the security rule by a business associate or subcontractor. The secretary has also noted other factors that bear upon the question of agency, including the type of service to be performed and the level of expertise of the business associate as compared to the expertise of the covered entity. The example used by the


secretary is a business associate hired to perform de-identification of data: such a vendor probably cannot be controlled or directed by a small provider and so probably would not be an agent of the covered entity. Similarly, an accreditation agency might be a business associate, but it is not subject to the control of the entity undergoing accreditation review and would not be an agent of the covered entity. Following the secretary's analysis, if the covered entity does not retain the right to control the information security safeguards used by the business associate, the entity arguably is not liable for security rule violations by the business associate unless it had actual knowledge of the violations and failed to act. There does not appear to be any duty on the part of covered entities to monitor business associates that are not acting as agents.

§ 6.5.5

Enforcement of the HIPAA Regulations by the Office of Civil Rights

The Office of Civil Rights regularly posts updates on HIPAA enforcement on its website: https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/data/enforcement-highlights/index.html. According to OCR, as of July 30, 2017, and since the compliance date of the privacy rule in April 2003, OCR has received over 158,834 HIPAA complaints and has initiated over 825 compliance reviews. OCR claims to have resolved 99 percent of these cases (156,467). OCR has enforced the HIPAA rules by applying corrective measures in all cases where an investigation indicated noncompliance by the covered entity or its business associate. Where a penalty has been imposed, this has in almost all cases resulted from a settlement in lieu of a civil money penalty. As of July 30, 2017, OCR had settled fifty-two such cases, resulting in resolution payments totaling $72,929,182.

§ 6.5.6

Open Issues and Questions Under HITECH and the Enforcement Rule

There are some interesting open issues around the imposition of civil penalties by the OCR in connection with HITECH and the amended enforcement rule. First, the OCR claims the right, under the enforcement rule, to impose daily penalties, yet the statute does not at any point mention daily penalties. Second, there appears to be a conflict between two important penalty provisions of HITECH with respect to the maximum penalty that can be imposed for violations arising without knowledge, with reasonable cause, and with willful neglect. The distinction between a single penalty for a violation and a daily penalty is, of course, highly significant. Even with the HITECH Act's annual cap on violations of a single requirement, the ability to accumulate daily penalties over a period of a year or more greatly changes the stakes and negotiating leverage between the government and a covered entity or a business associate.


Analogies from other areas of law are illuminating. Notably, in the case of United States v. ITT Continental Baking Co., 420 U.S. 223 (1975), the question before the Supreme Court was whether the Federal Trade Commission (FTC) could assess a daily penalty for an acquisition carried out in violation of a consent order, which hinged upon whether the prohibited acquisition was a one-time violation or a continuing violation. The Court noted: [Continuing] violations share two discernible characteristics: the detrimental effect to the public and the advantage to the violator continue and increase over a period of time, and the violator could eliminate the effects of the violation if it were motivated to do so, after it had begun. Without these characteristics, daily penalties for such violations would probably have no greater deterrent effect than a single penalty and accumulating daily penalties would therefore be unfair. The legislative history also makes clear that Congress was concerned with avoiding a situation in which the statutory penalty would be regarded by potential violators of FTC orders as nothing more than an acceptable cost of violation, rather than as a deterrence to violation. . . . Thus, the “continuing failure or neglect to obey” provisions of 15 U. S. C. §§ 21 (l) and 45 (l) were intended to assure that the penalty provisions would provide a meaningful deterrence against violations whose effect is continuing and whose detrimental effect could be terminated or minimized by the violator at some time after initiating the violation. It seems apparent that acquisition in violation of an FTC order banning “acquiring” certain assets could be such a violation. Any anticompetitive effect of an acquisition continues as long as the assets obtained are retained, and the violator could undo or minimize any such effect by disposing of the assets at any time after the initial transaction. On the other hand, if violation of an order prohibiting “acquiring” assets were treated as a single violation, any deterrent effect of the penalty provisions would be entirely undermined, and the penalty would be converted into a minor tax upon a violation which could reap large financial benefits to the perpetrator. As we have seen, Congress added the continuing-penalty provisions precisely to avoid such a result. United States v. ITT Cont’l Baking Co., 420 U.S. 223, 231–33 (1975). The Court’s opinion is notable in two respects: its clear articulation of the rationale for a daily fine or penalty for “continuing violations” of the FTC Act, and the fact that the decision is clearly grounded in express statutory language and Congressional intent to impose daily penalties for continuing violations.


The question, then, is whether the OCR can impose daily penalties in the absence of statutory authorization. The enforcement rule preamble states: [W]ith respect to continuing violations, such as lack of appropriate safeguards for a period of time, it is anticipated that the number of identical violations of the safeguard standard would be counted on a per day basis (i.e., the number of days the entity did not have appropriate safeguards in place to protect the protected health information). 78 Fed. Reg. 5584 (Jan. 25, 2013). In this regard, it is clear that, when Congress wishes to authorize daily penalties for ongoing violations, it knows how to write the language that does so. See Atlantic States Legal Found., Inc. v. Tyson Foods, Inc., 897 F.2d 1128 (11th Cir. 1990); United States v. ITT Cont'l Baking Co., 420 U.S. 223 (1975). No court or administrative law judge has addressed this issue. Perhaps it will be resolved if an appropriate case arises. With respect to the ranges of penalties within each tier of violations, it is interesting to note a clear inconsistency in the Act. The HITECH Act, Pub. L. No. 111-5 (2009), calls for tiers of penalties based on the knowledge and intent of the violator, with a minimum penalty per violation and a maximum penalty. The tiers based on intent are described in Subsection 13410(d)(1), while the tiered penalties per violation and the annual cap for all violations of the same requirement are set forth in Subsection 13410(d)(2). The section describing the intent tiers reads as follows: . . . who violates a provision of this part— (A) in the case of a violation of such provision in which it is established that the person did not know (and by exercising reasonable diligence would not have known) that such person violated such provision, a penalty for each such violation of an amount that is at least the amount described in paragraph (3)(A) but not to exceed the amount described in paragraph (3)(D); (B) in the case of a violation of such provision in which it is established that the violation was due to reasonable cause and not to willful neglect, a penalty for each such violation of an amount that is at least the amount described in paragraph (3)(B) but not to exceed the amount described in paragraph (3)(D); and (C) in the case of a violation of such provision in which it is established that the violation was due to willful neglect— (i) if the violation is corrected as described in subsection (b)(3)(A), a penalty in an amount that is at least the amount


described in paragraph (3)(C) but not to exceed the amount described in paragraph (3)(D); and (ii) if the violation is not corrected as described in such subsection, a penalty in an amount that is at least the amount described in paragraph (3)(D). However, the tiers of penalties read slightly differently: (2) TIERS OF PENALTIES DESCRIBED.—Section 1176(a) of such Act (42 U.S.C. 1320d–5(a)) is further amended by adding at the end the following new paragraph: (3) TIERS OF PENALTIES DESCRIBED.—For purposes of paragraph (1), with respect to a violation by a person of a provision of this part— (A) the amount described in this subparagraph is $100 for each such violation, except that the total amount imposed on the person for all such violations of an identical requirement or prohibition during a calendar year may not exceed $25,000; (B) the amount described in this subparagraph is $1,000 for each such violation, except that the total amount imposed on the person for all such violations of an identical requirement or prohibition during a calendar year may not exceed $100,000; (C) the amount described in this subparagraph is $10,000 for each such violation, except that the total amount imposed on the person for all such violations of an identical requirement or prohibition during a calendar year may not exceed $250,000; and (D) the amount described in this subparagraph is $50,000 for each such violation, except that the total amount imposed on the person for all such violations of an identical requirement or prohibition during a calendar year may not exceed $1,500,000. Thus, Subsection (d)(2) sets a per-violation penalty for each tier and an annual maximum for all violations of one requirement in that tier, while Subsection (d)(1) defines the level of intent and cross-references a different subsection with a maximum penalty per violation that substantially exceeds the annual cap for all violations of the same type in a year. The most logical explanation is that the cross-references in Subsection (d)(1) are incorrect. The secretary resolved the clear inconsistency as follows:


In adopting the HITECH Act’s penalty scheme, the Department recognized that section 13410(d) contained apparently inconsistent language (i.e., its reference to two penalty tiers ‘‘for each violation,’’ each of which provided a penalty amount ‘‘for all such violations’’ of an identical requirement or prohibition in a calendar year). To resolve this inconsistency, with the exception of violations due to willful neglect that are not timely corrected, the Interim Final Rule adopted a range of penalty amounts between the minimum given in one tier and the maximum given in the second tier for each violation and adopted the amount of $1.5 million as the limit for all violations of an identical provision of the HIPAA rules in a calendar year. For violations due to willful neglect that are not timely corrected, the IFR adopted the penalty amount of $50,000 as the minimum for each violation and $1.5 million for all such violations of an identical requirement or prohibition in a calendar year. 78 Fed. Reg. 5582 (emphasis added). Of course, the “resolution” is no resolution at all, as the secretary simply ignored the annual caps in each tier in favor of the higher per-violation cap. For example, under the statute, violations resulting from “reasonable cause and not willful neglect” carry a per-violation penalty of $1,000, with an annual cap for all identical violations of $100,000. However, in its regulations, the OCR has stated that the violation resulting from reasonable cause and not willful neglect can carry a penalty of between $1,000 and $1.5 million per violation, making the annual cap set forth in the statute illusory. Arguably, the regulation exceeds the authority in the statute. Again, someday a court may make the answer clear.
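A short worked example makes the practical effect of the secretary's reading concrete. The violation count below is hypothetical; the dollar figures come from the statutory text and the regulation as described above.

    # Hypothetical example of the cap dispute for "reasonable cause"
    # violations: 200 identical violations at $1,000 each.

    violations = 200
    per_violation = 1_000
    raw_total = violations * per_violation  # $200,000

    statutory_cap = 100_000      # annual cap in subsection (d)(2)(B)
    regulatory_cap = 1_500_000   # cap adopted in the OCR regulations

    print(min(raw_total, statutory_cap))   # 100000 under the statutory reading
    print(min(raw_total, regulatory_cap))  # 200000 under the OCR reading

Under the statutory reading, exposure stops at $100,000; under the OCR reading, the same conduct yields double that amount, and exposure continues to grow with the violation count until the $1.5 million cap is reached.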

§ 6.6

BREACH NOTIFICATION AND INFORMATION SECURITY

The HITECH Act also introduced breach notification requirements into HIPAA. The OCR responded by promulgating an interim final breach notification rule and a final rule some years later. A full discussion of breach notification is well beyond the scope of this chapter, but the existence of breach notification requirements has had a profound impact on HIPAA enforcement and compliance and should be understood in this context.

§ 6.6.1

Brief Overview of the Breach Notification Rule

HITECH and the data breach rule impose notification requirements on covered entities and business associates in the event of a breach of unsecured PHI. A "breach of unsecured protected health information" can include a negligent or wrongful disclosure, an intrusion into information systems that may expose PHI, the loss of a laptop or other portable device, or an improper use of PHI by an employee or contractor. Significantly, the loss or exposure of information that is protected by encryption is not a breach, so long as the encryption meets certain federal guidelines and the key


associated with the encryption algorithm is not lost or exposed. Under the data breach rule, a breach occurs if the privacy or security of PHI is compromised, even in the absence of demonstrated misuse of PHI or harm to an individual or even absent a clear risk of harm. Once a covered entity becomes aware of a breach of unsecured PHI, it is obligated to notify all individuals whose information was breached and the OCR. Notification to individuals must occur without unreasonable delay and no later than sixty days after the breach is discovered, while notification to the OCR can occur annually, unless the breach impacts more than 500 individuals, in which event notice to the OCR must be provided promptly. The data breach rule sets out standards for substitute notice, including legal notice in a newspaper and on a website, if the covered entity does not have contact information for ten or more individuals. If the breach impacts more than 500 individuals in a state, notice must also be given to an appropriate media outlet, and notice to the OCR must be given within the sixty-day time frame that applies to notice to individuals. Notice to the media of a “large” breach is not to be confused with substitute notice (which can include a legal notice in a newspaper). In the event of a breach of unsecured PHI maintained or transmitted by a business associate, the business associate is not required to give notice to individuals, the OCR, or the public, but it must give notice to the appropriate covered entity or entities. The business associate may give notice to individuals and the OCR on behalf of a covered entity, however. Failure to give the timely notice required by HITECH and the regulation can itself lead to penalties under the enforcement rule, and all covered entities and business associates should adopt an appropriate breach notification policy.
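The timing and recipients of the notices described above can be summarized in a short Python sketch. The thresholds (sixty days, 500 individuals) come from the rule as described in the text; the function, its parameter names, and the simplified treatment of the media-notice trigger are hypothetical and are no substitute for the regulation.

    # Illustrative sketch of the notice obligations described above.
    # Simplified: the media-notice trigger actually turns on more than
    # 500 residents of a single state or jurisdiction being affected.

    def required_notices(individuals_affected: int,
                         over_500_in_one_state: bool) -> list[str]:
        notices = ["individuals: without unreasonable delay, within 60 days"]
        if individuals_affected > 500:
            notices.append("OCR: within the same 60-day window")
        else:
            notices.append("OCR: annual report")
        if over_500_in_one_state:
            notices.append("appropriate media outlet")
        return notices

    print(required_notices(1200, over_500_in_one_state=True))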

§ 6.6.2

Impact on Information Security and Enforcement

(a)

Are Breaches Violations of the Security Rule?

Every breach, by definition, implies a use or disclosure of unsecured PHI that is not permitted by the privacy rule. However, not every violation of the security rule is a reportable breach, and it would be a mistake to assume that every data breach implicates a violation of the security rule. See discussion at 74 Fed. Reg. 42743–44 (Aug. 24, 2009). The security rule requires the use of "reasonable and appropriate" safeguards, not perfect safeguards. While a breach certainly should prompt a review of safeguards and appropriate mitigation of risks to the privacy and security of PHI, some breaches may be unavoidable even in the presence of strong safeguards. Twenty-twenty hindsight can make any event look foreseeable, or even likely, but that appearance is not necessarily accurate. Nevertheless, each covered entity should anticipate that, in the event of a breach, particularly a breach affecting more than 500 individuals, its information security program will come under scrutiny.

(b)

Publicity, Enforcement, and Private Litigation

Once a covered entity notifies individuals and the government that it has experienced a breach of unsecured PHI, it can expect scrutiny from the OCR, from state attorneys


general, and from others. Individuals who receive notice letters may file complaints with regulators, complain to the media, or engage counsel. The FTC could also assert jurisdiction. (A failure to implement a safeguard that does not result in a use or disclosure of PHI inconsistent with the privacy rule is not a "breach.") While court decisions have declined to recognize a private right of action under HIPAA, Acara v. Banks, 470 F.3d 569 (5th Cir. 2006), individuals could assert claims under state privacy or breach notification statutes, consumer protection statutes, or common law theories of invasion of privacy. Breach notification puts a new premium on information security and may well justify increased investments in technology and personnel to safeguard PHI. See R.K. v. St. Mary's Medical Center, 229 W. Va. 712 (2012); Byrne v. Avery Center for Obstetrics & Gynecology, P.C., 102 A.3d 32 (Conn. 2014); Acosta v. Byrum, 638 S.E.2d 246 (N.C. Ct. App. 2006); Yath v. Fairview Clinics, N.P., 767 N.W.2d 34 (Minn. Ct. App. 2009).


EXHIBIT 6A—HIPAA Security Standards Matrix

HIPAA Administrative Simplification Regulation Text, March 2013

Appendix A to Subpart C of Part 164—Security Standards: Matrix

Administrative Safeguards

Standards | Sections | Implementation Specifications ((R) = Required, (A) = Addressable)
Security Management Process | 164.308(a)(1) | Risk Analysis (R); Risk Management (R); Sanction Policy (R); Information System Activity Review (R)
Assigned Security Responsibility | 164.308(a)(2) | (R)
Workforce Security | 164.308(a)(3) | Authorization and/or Supervision (A); Workforce Clearance Procedure (A); Termination Procedures (A)
Information Access Management | 164.308(a)(4) | Isolating Health Care Clearinghouse Function (R); Access Authorization (A); Access Establishment and Modification (A)
Security Awareness and Training | 164.308(a)(5) | Security Reminders (A); Protection from Malicious Software (A); Log-in Monitoring (A); Password Management (A)
Security Incident Procedures | 164.308(a)(6) | Response and Reporting (R)
Contingency Plan | 164.308(a)(7) | Data Backup Plan (R); Disaster Recovery Plan (R); Emergency Mode Operation Plan (R); Testing and Revision Procedure (A); Applications and Data Criticality Analysis (A)
Evaluation | 164.308(a)(8) | (R)
Business Associate Contracts and Other Arrangements | 164.308(b)(1) | Written Contract or Other Arrangement (R)

Physical Safeguards

Standards | Sections | Implementation Specifications ((R) = Required, (A) = Addressable)
Facility Access Controls | 164.310(a)(1) | Contingency Operations (A); Facility Security Plan (A); Access Control and Validation Procedures (A); Maintenance Records (A)
Workstation Use | 164.310(b) | (R)
Workstation Security | 164.310(c) | (R)
Device and Media Controls | 164.310(d)(1) | Disposal (R); Media Re-use (R); Accountability (A); Data Backup and Storage (A)

Technical Safeguards (see § 164.312)

Standards | Sections | Implementation Specifications ((R) = Required, (A) = Addressable)
Access Control | 164.312(a)(1) | Unique User Identification (R); Emergency Access Procedure (R); Automatic Logoff (A); Encryption and Decryption (A)
Audit Controls | 164.312(b) | (R)
Integrity | 164.312(c)(1) | Mechanism to Authenticate Electronic Protected Health Information (A)
Person or Entity Authentication | 164.312(d) | (R)
Transmission Security | 164.312(e)(1) | Integrity Controls (A); Encryption (A)


CHAPTER 7

Congressional Response to the Internet

Patrick J. Concannon, Esq.
Nutter McClennen & Fish LLP, Boston

§ 7.1 Introduction

§ 7.2 Communications Decency Act
§ 7.2.1 Anti-indecency and Antiobscenity Provisions
§ 7.2.2 Section 230—Online Service Providers Sheltered from Liability
§ 7.2.3 Defamation (as Opposed to Intellectual Property) Liability Shield
§ 7.2.4 Who Qualifies for Protection Under the Statute?
§ 7.2.5 "Source of the Content at Issue"—Room for Argument
§ 7.2.6 Other Illegal Content

§ 7.3 Digital Millennium Copyright Act
§ 7.3.1 DMCA Safe Harbor
§ 7.3.2 Who Qualifies?
§ 7.3.3 Designation of Agent
§ 7.3.4 Requirements for Valid Takedown Requests
§ 7.3.5 Responding to Takedown Requests
§ 7.3.6 Clarification About Secondary Liability Safe Harbor
§ 7.3.7 Anticopyright Management Circumvention
§ 7.3.8 DMCA Anticircumvention Exemptions
§ 7.3.9 Notable Anticircumvention Cases

§ 7.4 Children's Online Privacy Protection Act
§ 7.4.1 Privacy Policy
§ 7.4.2 Not Just Notice, but Choice and Access
§ 7.4.3 Initial Notice to Parent Required
§ 7.4.4 What Are the Consequences for Failure to Comply?
§ 7.4.5 Security and Data Retention
§ 7.4.6 Student Privacy

§ 7.5 CAN-SPAM Act
§ 7.5.1 E-mails Covered by CAN-SPAM Act
§ 7.5.2 Compliance Requirements
§ 7.5.3 Consequences for Noncompliance


Scope Note This chapter provides information on legal issues that have arisen in the areas of Internet and web-based services that support electronic communications. The focus is on the range of activities that have not been foreseen by traditional legal frameworks, and Congress’s response.

§ 7.1

INTRODUCTION

The Internet and the web-based services that support electronic communications, such as mobile device applications, have enabled a wide range of activities not contemplated by traditional legal frameworks. The areas of intellectual property, publishing law, advertising law, and privacy law are among those most strongly impacted by the robust development of a digital economy and social media. The goal of this chapter is to acquaint the reader with portions of the landscape of legal issues encountered in these areas by discussing four specific laws that Congress enacted. Through these laws Congress tried, with mixed success, to impose sensible legal controls in what often seems like a lawless frontier.

§ 7.2

COMMUNICATIONS DECENCY ACT

The Communications Decency Act (CDA), enacted in February 1996, was Congress's first attempt to regulate Internet pornography. Ironically, the CDA's lasting legacy has not been its efforts to control "indecency," provisions that were ruled unconstitutional, but instead a separate provision that has protected free speech and allowed communications to flourish over the Internet.

§ 7.2.1

Anti-indecency and Antiobscenity Provisions

The primary purpose of the CDA as enacted, and the legal challenges that frustrated that purpose, are worth noting. The CDA initially imposed criminal sanctions on one who knowingly sent or displayed words or images that, in the context of usage, depicted or described "sexual or excretory activities or organs" in terms "patently offensive by contemporary community standards." The breadth of this legal prohibition triggered a strong reaction from free speech activists. In 1997, the U.S. Supreme Court ruled that the anti-indecency and antiobscenity provisions in the CDA were unconstitutional impingements on the First Amendment. Reno v. American Civil Liberties Union, 521 U.S. 844 (1997). Congress eventually responded to the striking down of the CDA's child-protection provisions by enacting the Children's Internet Protection Act (CIPA) in 2000. Pub. L. No. 106-554 (2000) (codified at 20 U.S.C. §§ 6801, 6777, 9134 (2003); 47 U.S.C. § 254 (2003)). The CIPA requires that grade schools and libraries in the United States, as a condition for federal subsidies, use Internet filters and employ other measures to protect children from certain harmful online content. The CIPA was also challenged, principally by the librarian community, given that public libraries must bear the administrative burdens


of implementing the filters and of disabling them upon requests from adults, but the statute was upheld as constitutional by the U.S. Supreme Court and remains good law. United States v. American Library Ass'n, 539 U.S. 194 (2003).

§ 7.2.2

Section 230—Online Service Providers Sheltered from Liability

Although the anti-indecency and antiobscenity provisions of the CDA were struck down, an important portion of the CDA that was intended as a free speech counterbalance to those prohibitions remains. Section 230 of the CDA provides: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." 47 U.S.C. § 230(c)(1). This has been interpreted to mean that operators of Internet services are not to be construed as publishers and thus are not legally liable for the content of third parties who use their services. This liability immunity provides protection not only for Internet service providers (ISPs) but also more broadly for "interactive computer service providers," more or less including any online service that publishes third-party content. "By its plain language, § 230 creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service." Zeran v. America Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997), cert. denied, 524 U.S. 937 (1998).

§ 7.2.3

Defamation (as Opposed to Intellectual Property) Liability Shield

Section 230(e)(2) of the CDA provides that “[n]othing in [Section 230] shall be construed to limit or expand any law pertaining to intellectual property.” Generally speaking, the value of the CDA for online content services companies is that it shields them from defamation liability for content posted by their end users. Most courts have focused on the plain wording of Section 230(e)(2) and have refused to recognize CDA immunity in defense of intellectual property claims of all kinds, including state law claims alleging right of publicity, unfair competition, and copyright violations. See, e.g., Universal Communication Sys., Inc. v. Lycos, Inc., 478 F.3d 413 (1st Cir. 2007); Doe v. Friendfinder Network, Inc., 540 F. Supp. 2d 288 (D.N.H. 2008). In Perfect 10, Inc. v. CCBill, LLC, 488 F.3d 1102, 1119 (9th Cir. 2007), however, the Court of Appeals for the Ninth Circuit took a different path. It reversed a District Court ruling that California’s right-of-publicity law was the type of intellectual property law carved out of CDA immunity by Section 230(e)(2), holding (notwithstanding the plain language of the statute) that the carve-out reaches only federal intellectual property law, so that the CDA Section 230 defense remained available against the state-law claim: “[b]ecause material on a website may be viewed across the Internet, and thus in more than one state at a time, permitting the reach of any particular state’s definition of intellectual property to dictate the contours of this federal immunity would be contrary to Congress’s expressed goal of insulating the development of the Internet from the various state-law regimes.” Perfect 10, Inc. v. CCBill, LLC, 488 F.3d at 1118.


The Ninth Circuit decision, though a noteworthy one, remains a dubious outlier. In Atlantic Recording Corp. v. Project Playlist, Inc., 603 F. Supp. 2d 690 (S.D.N.Y. 2009), for example, the District Court for the Southern District of New York refused to follow the Ninth Circuit’s approach and did not extend CDA immunity to defendant Project Playlist with respect to state-law copyright claims relating to pre-1972 musical sound recordings that could be accessed through its service.

§ 7.2.4

Who Qualifies for Protection Under the Statute?

A defendant must be a “provider or user” of an “interactive computer service” to qualify for the safe harbor. In addition, the claim must seek to treat the defendant as the publisher of the complained-of information, and the defendant must not in fact have been the source of the content at issue. So, for example, Facebook most certainly is an interactive computer service. If Facebook is sued and accused of defaming an individual, and the allegedly defamatory content was authored by Facebook in its editorial capacity rather than entered by one of Facebook’s users, then Facebook could not successfully assert the CDA Section 230 defense. In contrast to the Digital Millennium Copyright Act’s copyright liability safe harbor for online services operators, discussed in § 7.3 below, an operator need not take any formal steps as a precondition to availing itself of the CDA Section 230 defense.
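Distilled to its elements, this qualification analysis reads like a checklist. The following Python sketch is purely illustrative; the field and function names are hypothetical, and the test is a simplification of the case law discussed in this chapter, not a statement of it.

from dataclasses import dataclass

@dataclass
class Claim:
    """Hypothetical facts relevant to a CDA Section 230 defense."""
    defendant_is_ics_provider_or_user: bool   # "interactive computer service"
    content_authored_by_third_party: bool     # defendant not the content source
    claim_treats_defendant_as_publisher: bool # claim targets publishing conduct
    claim_is_intellectual_property: bool      # Section 230(e)(2) carve-out

def section_230_defense_available(c: Claim) -> bool:
    """Simplified sketch of the Section 230 qualification questions."""
    return (
        c.defendant_is_ics_provider_or_user
        and c.content_authored_by_third_party
        and c.claim_treats_defendant_as_publisher
        and not c.claim_is_intellectual_property
    )

# Example: a user-authored defamation claim against a social network.
print(section_230_defense_available(Claim(True, True, True, False)))  # True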

§ 7.2.5

“Source of the Content at Issue”—Room for Argument

There are decisions worth mentioning that have dealt with factually close calls as to whether a defendant was responsible for the complained-of content such that the CDA Section 230 defense was unavailable. One such case was Jones v. Dirty World Entertainment Recordings LLC, 755 F.3d 398 (6th Cir. 2014). Dirty World (now known by a different name) is a tabloid website that publishes gossip-type content submitted by users. In the case in question, Dirty World published allegedly defamatory content submitted by users about Sarah Jones, a teacher and Cincinnati Bengals cheerleader. Dirty World owner and editor Nik Richie entered commentary of his own about the postings but did not alter the user-generated postings. The trial judge refused to recognize the CDA Section 230 safe harbor defense asserted by Dirty World and Richie, ruling that “a website owner who intentionally encourages illegal or actionable third-party postings to which he adds his own comments ratifying and adopting the user posts becomes a ‘creator’ or ‘developer’ of that content and is not entitled to immunity.” Jones v. Dirty World Entm’t Recordings LLC, 965 F. Supp. 2d 818, 821 (E.D. Ky. 2013). On appeal, the Court of Appeals for the Sixth Circuit looked to a Ninth Circuit case, Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008), which had developed a “material contribution” test for determining whether an online publisher has “crossed the line” and acted as the source of (as opposed to merely the host for) the content at issue. The Sixth Circuit adopted the Roommates material contribution test and went further to hold that making a material contribution meant “being responsible for what makes the displayed content allegedly unlawful.” Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d at 410. The Court of Appeals disagreed with the trial judge’s reasoning that the online publisher’s comments about the user-generated content somehow gave rise to responsibility for that content and concluded that Dirty World was entitled to CDA Section 230 immunity.

In a 2016 case, Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016), however, CDA Section 230 treatment was refused. Internet Brands operates the “Model Mayhem” website, which solicits postings from would-be models. Two men, allegedly known to Model Mayhem, used the online service to arrange a meeting with a woman who had expressed interest in finding modeling work, on the pretense that she would be auditioning for a modeling job. The meeting was a sham, and the men drugged and sexually assaulted her. Jane Doe alleged that Model Mayhem knew that these individuals used the service to connect with and prey on vulnerable women and had used the service to engage in such tactics before. Accordingly, Doe claimed that Model Mayhem had a duty to warn her, which it breached. The Court of Appeals for the Ninth Circuit held that Model Mayhem did not qualify for the CDA Section 230 liability shield in relation to its alleged omissions, reversing the trial court’s dismissal of the claim. The court stressed in its opinion that Jane Doe’s “failure to warn” claim did not seek to hold Model Mayhem liable as a source of content in a way that would trigger the CDA Section 230 immunity shield. The court ruled that CDA Section 230 is a narrower safe harbor that does not provide a liability shield against just any claim about third-party activities in an online context, broadly speaking, but only against claims stemming from challenged content.

A 2016 case that resulted in a defense-favorable ruling involved a locksmith named Kimzey, who sued Yelp, Inc. alleging business defamation based upon a negative consumer review. Kimzey v. Yelp! Inc., 836 F.3d 1263 (9th Cir. 2016). Kimzey argued that Yelp itself was the source of the negative review. The Court of Appeals for the Ninth Circuit held that there was no factual basis for the allegation that Yelp had fabricated content under the name of an anonymous third party. The appeals court also rejected Kimzey’s alternative theory that Yelp had used the third-party review as its own “advertisement” or “promotion,” thereby somehow converting the content into its own speech not protected under CDA Section 230.

§ 7.2.6

Other Illegal Content

The limits of the protections for online services operators have been tested by those who post “revenge porn,” meaning the retributive posting of images or videos of naked people, or people engaged in sex acts, by jilted lovers. Is the operation of a website where users submit, and the website publishes, “revenge porn” images and recordings protected by the CDA? To the extent that the website operator truly acts as a passive operator, does not act as an “information content provider,” and does not edit any of the content that gives rise to the alleged illegality, the operator should be immune under the CDA. If a website operator is responsible for creating the allegedly illegal content, the operator is deemed an “information content provider” as to that content and is not immune from claims based on the content. Where a website operator enhances the content in a manner that makes it more embarrassing to the subject of the images, or enters editorial commentary that does so, those actions are those of an information content provider and are not protected by the CDA.

In February 2015, a California state court jury convicted Kevin Christopher Bollaert on twenty-one felony counts of identity theft and six felony counts of extortion arising out of his operating a website and publishing names, addresses, and other information alongside the images of revenge porn victims. Bollaert also made it possible for the victims to visit a related website and pay to have the images removed as part of a cynical money-making scheme. Victims separately obtained a default judgment in a civil suit against Bollaert in the amount of $450,000. People v. Bollaert (Cal. Super. Ct., San Diego County, Feb. 2015). Would the CDA have immunized Bollaert from a state invasion of privacy civil claim? The degree of Bollaert’s manipulation of the content, including the adding of names and other identifying information to shame and extort money from the victims, probably would have characterized him as an “information content provider,” particularly given that the website required users to post the victims’ personal information before uploading a photo.

A more recent decision by the Court of Appeals for the First Circuit, Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016), demonstrates the power of the CDA Section 230 safe harbor for qualifying website operators even in response to claims by the most sympathetic of plaintiffs. Backpage.com provides online classified advertising services through which users can post advertisements for a wide range of services. The advertising categories include “escort services” as a subcategory of “adult entertainment.” Three women sued Backpage.com in the U.S. District Court for the District of Massachusetts alleging that its practices encouraged their illegal exploitation by sex traffickers. During portions of the time periods when the women were being marketed online, it was alleged, they were only fifteen years old. The suit claimed that, as fifteen-year-olds, the women were raped hundreds of times. The women alleged that Backpage.com went out of its way to facilitate the use of the service for illegal sex trafficking. For example, it allowed the use of coded terms that identified underage girls, allowed users to pay posting fees anonymously through prepaid credit cards or digital currency, and had a profit incentive to encourage the activity because it collected posting fees for escort ads (unlike other Backpage.com advertising categories, for which postings were free). The principal claim brought by the three women was under G.L. c. 265, § 50, which provides for both criminal and civil penalties where a defendant is shown to have knowingly recruited for, subjected to, or benefited from sex trafficking. The District Court, however, granted Backpage.com’s motion to dismiss for failure to state a claim upon which relief could be granted, citing CDA Section 230. On appeal to the Court of Appeals for the First Circuit, the women attempted to distinguish Backpage.com’s actions from those of publishers that qualify for safe harbor treatment. The women argued that Backpage.com followed an “affirmative course of conduct” by implementing advertisement posting infrastructure and policies that encouraged illegal sex trafficking, and that Backpage.com therefore was not acting in the passive capacity required for it to benefit from the CDA Section 230 safe harbor. The First Circuit, however, upheld the District Court ruling that Backpage.com was acting as a publisher within the meaning of CDA Section 230, specifically holding that “a website operator’s decisions in structuring its website and posting requirements are publisher functions entitled to section 230(c)(1) protection.” Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d at 31.

§ 7.3

DIGITAL MILLENNIUM COPYRIGHT ACT

It is easy to reproduce and distribute electronic copies of copyrighted works on the Internet. Accessing information and content through online service platforms, and generally through digital media, allows the vast and instantaneous dissemination not only of expressive works subject to copyright but also of information not subject to copyright or that should be available for certain purposes notwithstanding copyright protection. Digital content does not always fall squarely within one of these categories; it often amounts to a hybrid of expressive content and mere factual content. Access to the various types of content available online is a good thing for a number of policy reasons, ranging from public health and safety to commercial growth. For content owners, however, the opportunities to fruitfully exploit works can be outweighed by the challenges of confronting and managing copyright abuses. Traditional copyright law could not keep up with the explosion of online sharing and transmission of copyrighted works. The Copyright Act was amended in 1998 to add the provisions known as the Digital Millennium Copyright Act (DMCA). 17 U.S.C. §§ 512, 1201–1205, 1301–1322; 28 U.S.C. § 4001. The DMCA supplemented copyright law by addressing new digital phenomena that the existing copyright laws were ill suited to address. The DMCA introduced two important, and controversial, bundles of legal protections.

§ 7.3.1

DMCA Safe Harbor

The widespread availability of digital content is both a boon and a bane for copyright holders. On the one hand, never has it been easier to reach new audiences for one’s music, visual art, or writing. On the other hand, never has it been easier for others to make and share infringing copies of such works. Not only is making electronic copies easy, but one’s use of a computer fundamentally requires making copies of works protected by copyright. Rendering a service that enables the publishing, retrieval, reproduction, and transmission of digital content requires making copies, even though such copying is not directly “caused” by online service providers but instead by the users of their services. This gives rise to difficult questions, such as to what degree, if at all, such online service providers should be held legally responsible when their users use the services in violation of third-party copyrights. The potential copyright liability exposure for such online services would be massive if the online service providers were held primarily or secondarily responsible for each and every infringing reproduction of a song, image, news article, or other expressive content that is stored, posted, or transmitted by users over the Internet. The legislative history of the DMCA explains that Congress felt compelled to address the scope of online service providers’ liability for the actions of those using their services because

[w]ithout clarification of their liability, service providers may hesitate to make the necessary investment in the expansion of the speed and capacity of the Internet . . . in short, by limiting the liability of service providers, the DMCA ensures that the efficiency of the Internet will continue to improve and that the variety and quality of services on the Internet will continue to expand.

S. Rep. No. 105-190, 105th Cong., 2d Sess. at 8 (1998).

The DMCA’s “safe harbor” provisions (codified in Section 512 of the Copyright Act) were intended to reduce the risk that copyright infringement suits would impede investment in online infrastructure and associated services. These safe harbors protect “online service providers” that meet specific requirements from damages for the direct or indirect infringing use of their digital communications services. The DMCA safe harbor shields those who act as content conduits (and who otherwise qualify for protection as detailed below, including registering with the U.S. Copyright Office) only from monetary damages, which is especially significant given the availability of high statutory damages under copyright law. The safe harbor does not, however, shield a qualifying service from being found to have directly or indirectly infringed copyrights, or from resulting injunctive relief. Moreover, a finding that an online service provider is not entitled to safe harbor protection does not automatically mean that the service provider is liable for copyright infringement. Such a service provider can still raise defenses to infringement liability, including fair use.

§ 7.3.2

Who Qualifies?

The DMCA safe harbor provisions identify several different categories of service providers that might qualify for immunity treatment:

• content transmitters (typically telecommunications companies);

• those who engage in caching (the “intermediate and temporary” storage of material transmitted over a service provider’s system or network at a location other than the website from which the material originated);

• those who engage in linking (which does not entail actual copying of copyrighted content, but facilitates access to such content); and

• those who engage in hosting (the provision of server space for a user’s website).

The DMCA Section 512 safe harbor applies to telecommunications service providers who act merely as content conduits, that is, who are engaged in “transmitting, routing, or providing connections for, material through a system or network controlled or operated by or for the service provider,” and where the storage of that material is “intermediate and transient.”

To qualify for safe harbor treatment as a “cacher,” the service provider must

• not modify the material being cached;

• comply with reasonable conditions specified by the original poster concerning the refreshing, reloading, and updating of and access to the cached material; and

• not interfere with hit counters or other technology associated with the cached material that provides the original poster with information he or she would have received had the original rather than the cached copy been sent.

To qualify for DMCA safe harbor protection as either a hosting services provider or a provider of links to allegedly infringing content, the provider must

• not have actual knowledge that the material being hosted is infringing or, in the absence of actual knowledge, not be aware of facts or circumstances from which the infringing activity is apparent;

• act quickly to remove or disable access to infringing material upon obtaining knowledge or awareness of infringing activity; and

• not receive a financial benefit directly attributable to the infringing activity in cases where the service provider has the right and ability to control such activity.

In addition to these eligibility conditions, the DMCA requires that all service providers adopt and follow through on a policy of terminating repeat offenders.
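The statute does not define “repeat offender” or fix a strike threshold. The following minimal Python sketch, with an assumed three-strike cutoff and hypothetical names, illustrates the kind of tracking a reasonably implemented termination policy requires:

from collections import defaultdict

class RepeatInfringerPolicy:
    """Minimal sketch of a repeat-infringer termination policy.

    The DMCA requires adopting and reasonably implementing such a
    policy but does not fix a strike threshold; three strikes here
    is an illustrative assumption, not a statutory rule.
    """

    def __init__(self, max_strikes: int = 3):
        self.max_strikes = max_strikes
        self.strikes = defaultdict(int)
        self.terminated = set()

    def record_valid_takedown(self, user_id: str) -> None:
        """Count a substantially compliant takedown notice against a user."""
        if user_id in self.terminated:
            return
        self.strikes[user_id] += 1
        if self.strikes[user_id] >= self.max_strikes:
            self.terminated.add(user_id)

policy = RepeatInfringerPolicy()
for _ in range(3):
    policy.record_valid_takedown("user-123")
print("user-123" in policy.terminated)  # True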

§ 7.3.3

Designation of Agent

Furthermore, the protection granted by the caching, hosting, and linking safe harbors is contingent on the service provider designating a copyright agent, which requires the initial submission of a form and payment of a fee (this allows copyright owners to know to whom to send “takedown” requests), as well as compliance with a written notice requirement concerning the procedure mandated by the DMCA and its implementing regulations for responding to “takedown” demands from copyright owners (discussed immediately below). Failure to take the relatively simple and low-cost step of designating an agent, failure to post a notification about the complaint procedure, or failure to act in accordance with such procedure can result in the would-be safe harbor status being disregarded and increased exposure to monetary copyright damages.

The regulations have been amended to replace an “interim” system with a permanent online system that became available in December 2016. Registrations not filed using the new system became invalid as of December 31, 2017, and needed to be refiled before that date using the new system.

The agent designation must include the following elements:

(i) The first name, last name, position or title, organization, physical mail address (street address or post office box), telephone number, and email address of two representatives of the service provider who will serve as primary and secondary points of contact for communications with the Office.

(ii) A telephone number and email address for the service provider for communications with the Office.

37 C.F.R. § 201.38(c)(1). The regulations also require an attestation for each designation and any subsequent amendment or resubmission: the person making the filing must attest that (i) the information provided to the Office is true, accurate, and complete to the best of his or her knowledge and (ii) he or she has been given authority to make the designation, amendment, or resubmission on behalf of the service provider. 37 C.F.R. § 201.38(c)(2).

The filer is prompted for this information by the online system. Use of the system requires registration, which is free. Designation of an agent costs $6 initially and then $6 every three years. The fees are low, but remaining in compliance requires taking the affirmative step of making a maintenance filing every three years. Failure to properly and timely designate an agent and maintain that status can have disappointing consequences. In Oppenheimer v. Allvoices, Inc., No. 3:14-cv-00499-LB (N.D. Cal. June 10, 2014), for example, an online service provider named Allvoices did not receive the DMCA safe harbor treatment it sought because photographer Oppenheimer’s photographs were posted two months before Allvoices designated an agent with the Copyright Office.
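Because the $6 maintenance filing recurs every three years, a provider may want to calendar the deadlines. The following is a minimal sketch assuming, hypothetically, that each three-year period runs from the prior filing date; check the Copyright Office’s current rules for the exact trigger.

from datetime import date

RENEWAL_YEARS = 3  # maintenance filing due every three years
FEE_USD = 6        # initial designation fee and each renewal fee

def renewal_dates(filed: date, horizon_years: int = 12) -> list:
    """Upcoming maintenance-filing deadlines within the horizon.

    Assumes each period runs from the prior filing date; ignores
    the Feb. 29 edge case for simplicity."""
    deadlines = []
    next_year = filed.year + RENEWAL_YEARS
    while next_year <= filed.year + horizon_years:
        deadlines.append(filed.replace(year=next_year))
        next_year += RENEWAL_YEARS
    return deadlines

print(renewal_dates(date(2017, 1, 15)))
# [datetime.date(2020, 1, 15), datetime.date(2023, 1, 15),
#  datetime.date(2026, 1, 15), datetime.date(2029, 1, 15)]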


§ 7.3.4


Requirements for Valid Takedown Requests

The required elements of a valid takedown request are:

(i) A physical or electronic signature of a person authorized to act on behalf of the owner of an exclusive right that is allegedly infringed.

(ii) Identification of the copyrighted work claimed to have been infringed, or, if multiple copyrighted works at a single online site are covered by a single notification, a representative list of such works at that site.

(iii) Identification of the material that is claimed to be infringing or to be the subject of infringing activity and that is to be removed or access to which is to be disabled, and information reasonably sufficient to permit the service provider to locate the material.

(iv) Information reasonably sufficient to permit the service provider to contact the complaining party, such as an address, telephone number, and, if available, an electronic mail address at which the complaining party may be contacted.

(v) A statement that the complaining party has a good faith belief that use of the material in the manner complained of is not authorized by the copyright owner, its agent, or the law.

(vi) A statement that the information in the notification is accurate, and under penalty of perjury, that the complaining party is authorized to act on behalf of the owner of an exclusive right that is allegedly infringed.

17 U.S.C. § 512(c)(3)(A).

Content owners, and particularly entertainment conglomerates, have been criticized for wantonly lodging takedown requests without analyzing their merits. The Court of Appeals for the Ninth Circuit, in Lenz v. Universal Music Corp., 801 F.3d 1126 (9th Cir. 2015), acknowledged those concerns in ruling that copyright owners must first assess whether a use of their content is in fact lawful “fair use” before sending a takedown notification under the DMCA. Considering fair use involves a balancing of subjective factors, so this newly clarified requirement may make it logistically more difficult and time consuming for content owners to evaluate whether a use of their content discovered online qualifies for a takedown notice.

The background of the Lenz case was as follows. Universal Music Group, copyright owner of Prince’s song “Let’s Go Crazy,” issued a copyright takedown notice to Stephanie Lenz after she posted a video of her son, then aged three, dancing to the song. Universal sent the takedown notice to YouTube based upon the conclusion of a Universal legal department employee that the use of the song was infringing. The notice to YouTube listed some 200 videos, including Lenz’s “Let’s Go Crazy” video, and contained the required “good faith belief” statement as to unlawfulness discussed above. Lenz in turn requested that the video be restored and, with the assistance of the Electronic Frontier Foundation as pro bono counsel, filed suit in the U.S. District Court for the Northern District of California alleging misrepresentation under Section 512(f). The parties filed cross-motions for summary judgment, which were each denied, resulting in the appeal.

Universal argued on appeal that, while its procedures for reviewing others’ use of its content online in making takedown notice determinations did not explicitly call for a fair use analysis, the review was tantamount to a consideration of fair use. The appeals court took the position, however, that whether the review sufficiently took fair use into account was a question for a jury. Universal also argued that otherwise infringing uses that are ultimately deemed lawful as “fair use” are not “authorized by the law” but, instead, constitute impermissible conduct that is excused only after the fact by fair use as an affirmative defense. The court disagreed, citing Supreme Court precedent in Sony Corp. of Am. v. Universal City Studios, Inc., 464 U.S. 417 (1984), for the proposition that fair use establishes that the conduct was noninfringing in the first place and is not an affirmative defense that merely excuses infringing conduct. As a consequence, under Lenz, a copyright owner like Universal needs to consider not just whether an online audiovisual posting that includes its copyrighted content is unauthorized but also whether the use qualifies as noninfringing fair use.

The next question the appeals court considered was whether Universal really had a subjective good faith belief that the dancing toddler video was infringing. The court observed that the statute includes no “reasonableness” standard; it merely requires inquiry into the copyright owner’s subjective belief. The court also pointed out that there need not be a full investigation into whether the given use is a fair use; the copyright owner merely needs to form, in the course of its review, a subjective good faith belief that the accused use is not a fair use. The court observed that content owners have adopted the practice of using algorithms to search for infringing content and did not speak unfavorably about the practice, provided that the algorithmic analysis is used to identify the more obvious infringements (e.g., where the video and audio tracks of the unauthorized online posting match those of the original work) and that human review is employed for the “minimal remaining content a computer program does not cull.”

Note, however, that the Ninth Circuit’s ruling stands in sharp contrast to the 2013 Massachusetts decision in Tuteur v. Crosley-Corcoran, 961 F. Supp. 2d 333 (D. Mass. 2013). That case involved a dispute between bloggers about the merits of home birthing that culminated in one blogger sending the other an image of herself with her middle finger extended. The recipient posted the image on her blog, which resulted in the sender filing a DMCA takedown notice asserting copyright infringement. The blogger who posted the “middle finger” image brought suit against the other, alleging misrepresentation of her basis for sending the takedown notice under Section 512(f). The image sender moved to dismiss on jurisdictional grounds. The District Court for the District of Massachusetts nevertheless proceeded to evaluate the merits of the Section 512(f) claim and, in doing so, ruled that Congress did not require that a notice-giver verify that he or she had explored an alleged infringer’s possible affirmative defenses prior to acting, only that she affirm a good faith belief that the copyrighted material is being used without her or her agent’s permission. Tuteur v. Crosley-Corcoran, 961 F. Supp. 2d at 343–44. While the tide seems to be turning in favor of more scrutiny of DMCA takedown notices in view of the Ninth Circuit’s Lenz decision, we await further case law developments to clarify pre-takedown-notice diligence requirements in the First Circuit.
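For an operator or complainant triaging notices, the six statutory elements listed at the start of this section lend themselves to a simple completeness check. The following is a minimal, purely illustrative sketch; the field names are hypothetical, and “substantial compliance” itself remains a legal judgment.

from dataclasses import dataclass, fields

@dataclass
class TakedownNotice:
    """Hypothetical representation of the 17 U.S.C. § 512(c)(3)(A) elements."""
    signature: str               # (i) physical or electronic signature
    work_identified: str         # (ii) the copyrighted work(s)
    material_identified: str     # (iii) the allegedly infringing material
    contact_info: str            # (iv) complainant contact information
    good_faith_statement: bool   # (v) good faith belief statement
    accuracy_and_authority: bool # (vi) accuracy/authority under penalty of perjury

def facially_complete(n: TakedownNotice) -> bool:
    """Flag notices that are facially missing one of the six elements."""
    return all(getattr(n, f.name) for f in fields(n))

notice = TakedownNotice("/s/ J. Smith", "Photo No. 17",
                        "https://example.com/post/42",
                        "j.smith@example.com", True, True)
print(facially_complete(notice))  # True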

§ 7.3.5

Responding to Takedown Requests

The DMCA provides detailed notice requirements that qualifying online services providers must follow to take advantage of the safe harbor liability protections, and takedown mechanism provisions that third-party content owners can use in an effort to manage online infringements.

Upon receipt of a DMCA takedown notice from someone who claims that his or her copyright is being violated, the service provider should review the request in accordance with the DMCA to determine whether it “substantially” includes the elements required for a takedown notice described above. The online services provider must act “expeditiously” to remove or disable access to the materials claimed to be infringing. The materials should be made unavailable upon receipt of the request but should be retained in a form that permits the services provider to easily restore them if required, as described below. The online services provider also must promptly notify the user who provided the allegedly infringing content that the services provider has removed or disabled access to the content. Moreover, the online services provider must provide the alleged infringer with information outlining his or her right to respond with a counternotice and setting forth the statutory requirements for the counternotice.

If the online services provider receives a counternotice from the alleged infringer indicating that the alleged infringer has the legal right to post the materials, the online services provider should review the counternotice to determine whether it “substantially” includes the elements specifically required by the statute for a counternotice. Promptly upon receipt of the counternotice, the online services provider must provide the complaining content owner with a copy of the counternotification and explain that the services provider will repost the removed materials in ten business days. The online services provider must repost the material within ten to fourteen business days after receiving the counternotification unless the provider first receives notice from the content owner stating that he or she has filed a lawsuit to stop the alleged infringement.
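The counternotice timetable is mechanical enough to compute. The following is a minimal sketch assuming that “business days” means weekdays (the statute does not define the term, and holidays are ignored here):

from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance a date by a number of weekdays (holidays ignored)."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            days -= 1
    return current

def repost_window(counternotice_received: date) -> tuple:
    """Earliest and latest repost dates after a counternotice: not less
    than 10 and not more than 14 business days, unless the content owner
    first gives notice that it has filed suit."""
    return (add_business_days(counternotice_received, 10),
            add_business_days(counternotice_received, 14))

print(repost_window(date(2018, 3, 1)))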

§ 7.3.6

Clarification About Secondary Liability Safe Harbor

The DMCA provisions do not explicitly indicate whether the safe harbor is available to qualifying online service providers against claims of contributory or secondary copyright infringement. Online service providers took comfort in the decision in IO Group, Inc. v. Veoh Networks, Inc., 586 F. Supp. 2d 1132 (N.D. Cal. 2008). IO Group is an adult video provider, and Veoh Networks is an online video service provider that allows the posting of user content. IO Group discovered that content from ten separate videos that it had produced had been posted without authorization via the Veoh service by a Veoh user. IO Group bypassed the DMCA takedown procedure outlined at Veoh’s website and instead initiated federal court litigation alleging copyright infringement by Veoh based upon its user’s postings. Both parties moved for summary judgment, with Veoh arguing in its defense that it was protected from monetary damages by the DMCA Section 512 safe harbor and IO Group arguing that Veoh had failed to monitor and confront repeat copyright infringers using its service. Veoh had adopted a policy under which it would no longer allow adult content postings and had removed all such content, so IO Group’s request for injunctive relief (from which online service providers are not shielded) was moot.

The District Court for the Northern District of California granted Veoh’s summary judgment motion, observing that Veoh had fully complied with the DMCA safe harbor qualifications. The court recognized Veoh’s DMCA defense notwithstanding that its system for discouraging and weeding out copyright violators was imperfect and that some repeat offenders had slipped through the cracks. After determining that the safe harbor applied to Veoh, the trial court did not engage in a discussion of whether Veoh had induced infringement by its practices such that it was secondarily liable; because the safe harbor protected Veoh in any event, that analysis was not necessary. The court also found that Veoh’s automatic processing of user-submitted video content to create low-resolution screen captures for previewing purposes did not fall outside the “at the direction of the user” requirement under DMCA Section 512(c)(1). The screen captures and other automated processing functions applied to Veoh’s video submissions, although automated, were set in motion by the user’s submission and thus occurred “at the direction of the user,” the court reasoned. More recently, Viacom and YouTube settled a seven-year string of copyright litigation in which Viacom had claimed $1 billion in damages.


§ 7.3.7


Anticopyright Management Circumvention

The DMCA added to the Copyright Act provisions aimed at preventing circumvention of technological measures used by copyright owners to protect their works from infringement. The DMCA remains controversial in these respects, both because the exemptions from the stringent provisions are limited and because the prohibitions arguably prevent reverse engineering that allows third parties to develop compatible, complementary, and noninfringing works. In other words, many feel that the DMCA’s anticircumvention prohibitions vest too much control in copyright owners and inhibit innovation and fair competition. The many critics are divided, however, as to whether the situation should be remedied by continually amending the DMCA to expand and enhance the anticircumvention exceptions or whether the DMCA’s anticircumvention provisions should instead be legally challenged or repealed.

The DMCA’s related protections for copyright management information provide in relevant part:

(a) False Copyright Management Information: No person shall knowingly and with the intent to induce, enable, facilitate, or conceal infringement:

(1) provide copyright management information that is false, or

(2) distribute or import for distribution copyright management information that is false.

(b) Removal or Alteration of Copyright Management Information: No person shall, without the authority of the copyright owner or the law:

(1) intentionally remove or alter any copyright management information;

(2) distribute or import for distribution copyright management information knowing that the copyright management information has been removed or altered without authority of the copyright owner or the law; or

(3) distribute, import for distribution, or publicly perform works, copies of works, or phonorecords, knowing that copyright management information has been removed or altered without authority of the copyright owner or the law, knowing, or, with respect to civil remedies under section 1203, having reasonable grounds to know, that it will induce, enable, facilitate, or conceal an infringement of any right under this title.

17 U.S.C. § 1202.


§ 7.3.8


DMCA Anticircumvention Exemptions

Pursuant to 17 U.S.C. § 1201(a)(1)(C), the Copyright Office convenes a rulemaking session every three years to consider proposed exemptions to the DMCA’s prohibition on circumventing copyright protection technology, for the purpose of debating and limiting possible harm the law has caused to legitimate noninfringing uses of protected copyrighted materials. Through these sessions and the rulemaking process, the Copyright Office, based upon marketplace developments, technological developments, and public feedback, considers newly proposed exemptions and whether to extend exemptions already in place. For example, as a result of the most recent rulemaking session in 2015, one can legally remove or override software protections on mobile phones and, as a result of doing so, download onto the phone software not available through the phone manufacturer’s application store. There are also new exemptions for a patient’s circumventing access controls for purposes of accessing his or her patient data, residing on computer networks, that relates to an implanted medical device, and for computer security testing. 37 C.F.R. § 201.40. Other 2015 exemptions in effect as of this writing cover

• motion pictures (including television shows and videos);

• literary works, distributed electronically, that are protected by technological protection measures (TPMs) interfering with assistive technologies;

• computer programs that enable devices to connect to a wireless network (“unlocking”);

• computer programs on smart TVs (“jailbreaking”);

• vehicle software, to enable diagnosis, repair, or modification;

• video games requiring server communication; and

• software limiting the feedstock of 3D printers.

In June 2017, the Copyright Office released a report detailing the findings of a public study of the effectiveness of Section 1201, including its anticircumvention provisions. The office recommends some minor fine-tuning of the Section 1201 provisions, including expanding the provisions that allow circumvention of technological protection measures for security and encryption research and adding new provisions to allow circumvention for other purposes, e.g., for assistive reading technologies for the visually impaired and for the repair of devices.

It might come as some surprise that the DMCA’s anticircumvention provisions can be invoked in circumstances not involving technological circumvention. Notwithstanding commentary in the legislative record suggesting that Congress intended the provisions to apply only to technological circumvention, the definition of “copyright management information” in the statute is broad enough to include “the name of, and other identifying information about, the author of a work.” 17 U.S.C. § 1202(c)(2). Merely removing or altering the name of an author as it appears on a work in hard copy, or adding information in connection with such a work that identifies a false author, violates the statute under Section 1202(a) and (b). See, e.g., Roof & Rack Prod., Inc. v. GYB Investors, LLC, 2014 WL 3183278 (S.D. Fla. July 8, 2014). This is an example of the broad, and probably unintended, prohibitions that DMCA critics decry.
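A publisher that programmatically processes works can at least audit for inadvertent stripping of copyright management information. The following is a minimal sketch using a hypothetical metadata dictionary; in practice, CMI may live in EXIF fields, watermarks, or accompanying text, and the field names here are illustrative only.

# Hypothetical metadata fields treated as copyright management information.
CMI_FIELDS = ("author", "copyright_notice", "title")

def cmi_removed(original: dict, republished: dict) -> list:
    """Report CMI fields dropped or altered between an original work's
    metadata and a republished copy's metadata (a simplified audit
    inspired by 17 U.S.C. § 1202; not a legal test)."""
    problems = []
    for field in CMI_FIELDS:
        if field in original and republished.get(field) != original[field]:
            problems.append(field)
    return problems

original = {"author": "J. Smith", "copyright_notice": "(c) 2017 J. Smith"}
republished = {"copyright_notice": "(c) 2017 J. Smith"}  # author stripped
print(cmi_removed(original, republished))  # ['author']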

§ 7.3.9

Notable Anticircumvention Cases

Two decisions relating to a lengthy legal battle between laser and inkjet printer manufacturer Lexmark International, Inc. (Lexmark), and microchip manufacturer Static Control Components, Inc. (SCC), have provided important guidance as to the interpretation of the DMCA’s copyright control anticircumvention prohibitions.

Lexmark established a program under which customers could purchase discounted ink cartridges pursuant to their agreement that they would “use the cartridge only once and return it only to Lexmark for remanufacturing or recycling.” Lexmark also embedded in its printer cartridges software that would send a signal (based upon its calculation as to the amount of ink used during printing) to the printer indicating that the cartridge was spent. The practical effect of this measure was that the software would not allow the printer to work where the cartridge was refilled by a company other than Lexmark. SCC developed microchips that reproduced verbatim the software built into the Lexmark ink cartridges and thereby allowed third-party printer cartridge manufacturers and refurbishers to overcome the software controls that Lexmark had imposed. Lexmark sued SCC in the U.S. District Court for the Eastern District of Kentucky under the Copyright Act, alleging infringing copying of Lexmark’s software program and unlawful circumvention of Lexmark’s software rights management controls. The trial court found that SCC had indeed unlawfully copied Lexmark’s software and that it also had violated the DMCA’s anticircumvention provisions. While there is a reverse engineering exemption available under the DMCA that allows technology companies to bypass controls to achieve interoperability and compatibility of products they develop, the trial court determined that SCC did not qualify for that exemption because it did not develop an independent product.

On appeal, the Court of Appeals for the Sixth Circuit in Lexmark International, Inc. v. Static Control Components, Inc., 387 F.3d 522 (6th Cir. 2004), reversed and remanded. The Sixth Circuit ruled that Lexmark’s printing control software amounted to a mere “lock-out code,” and thus a nonexpressive embodiment of an idea not entitled to copyright protection under the circumstances. What Lexmark advanced as copyright protection technology did not really control access to copyright-protected content but instead controlled the use of the printer and the printer cartridge, according to the Court of Appeals. The Court of Appeals also disagreed with the trial court’s conclusion that SCC’s technology did not qualify for the reverse engineering exemption. The decision can be summed up at a high level as follows: the DMCA anticircumvention provisions create new copyright liabilities but not an entirely new right distinct from traditional copyright law.


A subsequent Federal Circuit decision, Chamberlain Group, Inc. v. Skylink Technologies, Inc., 381 F.3d 1178, 1204 (Fed. Cir. 2004), reached conclusions similar to those of the Lexmark decision discussed above as to the purpose and reach of the DMCA’s anticircumvention provisions. Defendant Skylink Technologies sold replacement transmitters compatible with plaintiff Chamberlain Group’s garage door openers. Chamberlain Group employed “rolling code” software primarily intended to make it difficult for would-be burglars to record and retransmit an electronic signal that would allow access to people’s homes, but the software also had the effect of impeding the ability of third parties to manufacture compatible garage door openers. Skylink Technologies manufactured transmitters that incorporated a signal-emitting sequence that bypassed Chamberlain Group’s rolling code protections, enabling Skylink Technologies to sell competing transmitters. On appeal, the Federal Circuit ruled that the District Court for the Northern District of Illinois properly granted summary judgment to Skylink Technologies and correctly ruled that Chamberlain Group had the (unmet) burden of demonstrating that Skylink Technologies lacked authorization to circumvent Chamberlain Group’s technology. “A copyright owner seeking to impose liability on an accused circumventor must demonstrate a reasonable relationship between the circumvention at issue and a use relating to a property right for which the Copyright Act permits the copyright owner to withhold authorization—as well as notice that authorization was withheld.” Chamberlain Group, Inc. v. Skylink Techs., Inc., 381 F.3d at 1204.

§ 7.4

CHILDREN’S ONLINE PRIVACY PROTECTION ACT

Protecting children’s privacy online is a challenge. Computers, video games, and mobile devices are becoming a bigger part of kids’ lives as they study, play, and socially interact with their peers. It is difficult for parents to closely monitor their children’s use of these devices, and kids will gladly share information about themselves in exchange for a reward, e.g., the ability to download a game. In recognition of the increasing need to protect children online, Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998. 15 U.S.C. §§ 6501–06. COPPA applies to the online collection of personal information from children under thirteen years of age and grants rulemaking and enforcement authority to the Federal Trade Commission (FTC). The FTC in turn promulgated COPPA regulations that became effective in April 2000. The FTC revisited the regulations in 2012, and updated regulations became effective in 2013.

COPPA applies to companies that operate websites or online services (including mobile applications) that collect personal information from children under age thirteen. Even where a company’s website is not directed toward children or does not deal in subject matter typically considered of interest to children, if the website or online service operator has actual knowledge that it collects personal information from children under thirteen, it must comply. “Online services” is broad enough to encompass online advertising networks, Internet-enabled location-based services, mobile applications that send or receive information online, and mobile applications that deliver behaviorally targeted ads. COPPA applies to commercial websites and online services, and the FTC’s enforcement jurisdiction generally is limited to for-profit organizations. Nonprofits that collect information on behalf of for-profit members or affiliates, however, can be deemed covered by COPPA.

COPPA and its regulations define the type of information that qualifies as “personally identifiable information” broadly to include

• one’s full name, home, or other physical address (including street name and city or town);

• an e-mail address or an IM identifier, screen name, or username where it functions as online contact information;

• a telephone number;

• a Social Security number;

• a persistent identifier that can be used to recognize a user over time and across different websites, including a cookie number, an IP address, a processor or device serial number, or a unique device identifier;

• a photo, video, or audio file containing a child’s image or voice;

• geolocation information sufficient to identify a street name and city or town; or

• other information about the child or the parent that is collected from the child and is combined with one or more of the above-listed identifiers.

16 C.F.R. pt. 312.

The inclusion of children’s screen names within the definition of “personal information,” even where the names are not part of the child’s e-mail address, resulted from the updates to the FTC’s COPPA regulations that took effect on July 1, 2013. The inclusion of information collected using persistent identifier cookies (subject to certain exceptions permitting the use of such cookies without parental consent where they are used merely to help the website run more smoothly and to understand user behavior for purposes of improving the website’s functionality) is also a newer development that broadens COPPA’s compliance “net” and merits the attention of businesses that had not focused on the 2013 regulatory developments. Pieces of information about a child that might be viewed as benign, and that seemingly could not readily be used to identify a child, are considered personal information under COPPA and its regulations when combined with another piece of information that arguably can be used to identify the child. COPPA therefore extends beyond governing the collection and use of more-obvious identifiers and restricts behavioral profiling of children for purposes of future targeted advertisements.

The type of conduct that qualifies as “collecting” personally identifiable information about a child is defined broadly, too. It includes not just actual collection but also encouraging or prompting children to submit their personal information (even where submission is optional) and passively tracking a child online.

The FTC publishes guidance at its website from time to time (including as recently as June 2017, when it published an updated version of Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business). The guidance includes FAQs about common issues surrounding children’s privacy protection. Although not binding, the FTC website documents provide useful guidance as to compliance steps and considerations.
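In code terms, a first-pass compliance review of a data-collection form reduces to checking each collected field against the regulatory categories. The following is a minimal illustrative sketch; the category names are informal shorthand for, not the exact text of, 16 C.F.R. pt. 312.

# Informal shorthand for the "personal information" categories listed above.
PERSONAL_INFO_TYPES = {
    "full_name", "physical_address", "online_contact_identifier",
    "telephone_number", "ssn", "persistent_identifier",
    "photo_video_audio", "precise_geolocation",
}

def collects_personal_info(fields_collected: set) -> bool:
    """True if any collected field is itself COPPA 'personal information.'
    Note: under the rule, otherwise benign data combined with one of
    these identifiers is also treated as personal information."""
    return bool(fields_collected & PERSONAL_INFO_TYPES)

print(collects_personal_info({"favorite_color", "persistent_identifier"}))  # True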

§ 7.4.1

Privacy Policy

A COPPA-covered business must publish a privacy policy via a conspicuous link at the home page of its website and at the user interfaces where information is collected. The link to the privacy policy can appear at the bottom of the “home page” or “landing page” for the website or service, but it may not appear in small print or otherwise blend in with links to other legal notices in terms of hyperlink text size, color, and style. The notice must clearly indicate the name, address, telephone number, and e-mail address of the website or service operator. Where multiple businesses collect children’s information via the website or service, the policy should either list the details for all collecting businesses or list the details for one business that will field inquiries and requests from parents about all of the businesses’ activities. To be clear, if a third party apart from the primary website or service operator collects personal information about children through the website or service, such as in the case of an advertising network (an increasingly frequent phenomenon), the privacy policy needs to explain that and detail who that third party is and what the third party does with the collected information.

The policy also must notify parents about the types of personal information collected from children (for example, name, address, e-mail address, interests, etc.); how such personal information is collected (that is, directly from the child or passively); how the personal information will be used; and whether the business discloses personal information collected from kids to third parties (including whether the information is posted publicly and whether the child can post the listed categories of information or other personal information through user-generated content postings). Where a business does disclose the information, its privacy policy must list the types of businesses it discloses information to and how those third parties use the information.

The privacy policy must clearly indicate that the business will not require a child to disclose more information than is reasonably necessary to participate in an activity; that parents can review their child’s personal information, direct the business to delete it, and refuse to allow any further collection or use of the child’s information; and that parents can agree to the collection and use of their child’s information but still disallow disclosure to third parties unless such disclosure is part of the service (e.g., in the case of social networking). The policy also must detail the procedure for parents to follow to exercise these rights.


§ 7.4.2


Not Just Notice, but Choice and Access

Additionally, the COPPA standards require that online service providers give parents the opportunity to consent to the collection of their children’s information while opting out of having that information disclosed to third parties (unless such disclosure is fundamentally essential to the rendering of the service in question), and this choice must be made clear to the parents. Online service providers collecting personal information about children under thirteen must also allow parents to review the data collected about their children, must delete some or all of that information upon a parent’s request, and must allow parents to opt out of future collection or use of their children’s personal information.
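These parental rights map naturally onto a small request-handling interface. The following is a minimal sketch with hypothetical names, showing the kinds of responses an operator must be prepared to support:

from enum import Enum

class ParentRequest(Enum):
    REVIEW = "review collected data"
    DELETE = "delete collected data"
    STOP = "refuse further collection or use"
    NO_THIRD_PARTY = "consent to collection but not third-party disclosure"

def handle_parent_request(req: ParentRequest, child_record: dict) -> dict:
    """Sketch of the responses COPPA requires an operator to support."""
    if req is ParentRequest.REVIEW:
        return {"action": "disclose", "data": dict(child_record)}
    if req is ParentRequest.DELETE:
        child_record.clear()
        return {"action": "deleted"}
    if req is ParentRequest.STOP:
        return {"action": "collection disabled for this child"}
    return {"action": "third-party sharing disabled"}

record = {"screen_name": "kid123"}
print(handle_parent_request(ParentRequest.DELETE, record))  # {'action': 'deleted'}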

§ 7.4.3

Initial Notice to Parent Required

Before one can collect information from a child online or through a computer service, the service must obtain verifiable parental consent. COPPA does not mandate a specific means of obtaining the consent. The FTC FAQs suggest that the consent process should entail “a method reasonably designed in light of available technology to ensure that the person giving the consent is the child’s parent.” The same guidance indicates that the following steps should be considered reasonable for purposes of securing verifiable consent:

• the signing of a consent form that is sent back to the operator via fax, mail, or electronic scan;

• the use of a credit card, debit card, or other online payment system that provides notification of each separate transaction to the account holder;

• a call to a toll-free number staffed by trained personnel;

• a video conference with trained personnel; or

• the provision of a copy of a form of government-issued ID that is checked against a database (provided the copy is deleted when the verification process is finished).

Under the July 2013 updates to the COPPA regulations, the FTC made available a process under which a website operator can petition the FTC for confirmation that a method not on the above list of approved methods is acceptable under the COPPA standards. One such mechanism that the FTC has “blessed” was submitted by a company named Imperium. Its “knowledge-based authentication” approach was deemed by the FTC to be reliable in authenticating individuals; the approach is used widely in the banking industry and is approved for use in the financial sector by banking regulators. Knowledge-based authentication consists of asking a set of personalized questions, such as past addresses, phone numbers, vehicles owned, and the like, to verify the parent’s identity. The FTC found that Imperium’s approved method involves asking a sufficiently large number of dynamically generated multiple-choice questions, with an adequate number of possible answers for each, that the probability of successfully guessing the answers to all the questions is low. The questions also must be difficult enough that a child, or some other individual, would be unable or unlikely to discover or guess the answers.

In November 2015, the FTC approved a means of verifying parental consent that involves a parent submitting an image of his or her government-issued photo identification, which the data collector verifies as genuine via algorithmic technology. The parent then submits a fresh image of himself or herself, which is reviewed by the collecting party both technologically and by live agents to ensure that the parent is alive and is the same person pictured on the government-issued photo identification card. After verification, the images submitted by the parent are deleted.
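The FTC’s reasoning about guessing probability is simple arithmetic. The following is a minimal sketch; the question and answer counts used in the example are illustrative assumptions, not Imperium’s actual parameters.

def guess_probability(num_questions: int, choices_per_question: int) -> float:
    """Chance of answering every dynamically generated multiple-choice
    question correctly by random guessing: (1/choices) ** questions."""
    return (1 / choices_per_question) ** num_questions

# E.g., five questions with five answer choices each:
print(guess_probability(5, 5))  # 0.00032, i.e., 1 in 3,125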

§ 7.4.4

What Are the Consequences for Failure to Comply?

COPPA gives state attorneys general and certain federal government agencies, particularly the FTC, enforcement authority with respect to entities within their jurisdiction. There is no private right of action under COPPA. A court can assess damages against those found liable for COPPA violations of up to $16,000 per violation, depending upon

• the severity of the violations,
• whether the defendant has a history of COPPA violations,
• the number of affected children,
• whether and the degree to which collected information was shared with third parties,
• the nature and amount of personal information about children that was collected, and
• the size of the company that committed the violation.

In mid-2014, the FTC began enforcing the rules more aggressively against application developers. In September 2014, the FTC published two consent decrees, against online review website Yelp, Inc., and mobile application developer TinyCo, Inc. After an FTC investigation, Yelp stipulated to facts that included its having collected information from children for a four-year period, from 2009 to 2013, without securing parental consent. Yelp's application did not treat children under thirteen any differently during or after the Yelp service registration process. As part of the settlement, Yelp was assessed a $450,000 fine, was required to delete all information collected from children under its inadequate practices, and was required to submit a written report detailing its compliance measures a year after the settlement. This outcome highlights the need to incorporate an "age screen" (a minimal illustration appears at the end of this section). In the case of TinyCo, the FTC took the position that TinyCo had directed its application services toward children, given that the applications featured bright colors, animated characters, and children's themes. The games awarded playing tokens to kids in exchange for providing an e-mail address. The FTC assessed a $300,000 civil penalty, required that TinyCo delete all information unlawfully collected, and required that TinyCo engage in a self-assessment and issue a report to the FTC in a year, detailing its compliance progress.

The 2014 COPPA enforcement activity in the application space was not limited to FTC actions. There was also a noteworthy suit brought by Maryland's attorney general against Snapchat, a chat service that had promoted the claim that content transmitted via its application is permanently deleted within one to ten seconds after being viewed by the recipient, eliminating the concern that any controversial comment, image, or video would come back to haunt the sender. The Maryland attorney general alleged that Snapchat knew that children under thirteen were using the service but failed to obtain verifiable parental consent before allowing kids to set up accounts and use the service. Snapchat settled the charges in June 2014, agreeing to pay a $100,000 fine and take compliance measures, including a promise to ensure that children under thirteen are not able to create Snapchat accounts for a ten-year period. Snapchat had settled a separate FTC enforcement action (based on alleged FTC Act deceptive trade practices associated with the misleading nature of Snapchat's promotional claims, as opposed to COPPA violations) just a few months earlier.

There has been a lack of published FTC settlements in 2015 (as of this writing) detailing penalties and rehabilitative measures imposed on businesses as a result of COPPA violations. This is because much of the FTC's COPPA enforcement activity relates to less-egregious violations and, accordingly, entails a more "educational" approach; those efforts do not result in published settlements. The educational approach is also being implemented more broadly through updates to educational resources posted at the FTC's website, including the March 2015 updating of the COPPA FAQs.
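As noted above, the Yelp matter turned in part on the absence of an age screen. Below is a minimal sketch (the function and route names are hypothetical) of a neutral age gate: the registrant is asked for a date of birth rather than a leading yes/no question, and users under thirteen are routed into a verifiable-parental-consent flow before any personal information is collected.

    from datetime import date
    from typing import Optional

    COPPA_AGE_THRESHOLD = 13  # COPPA covers children under thirteen

    def age_on(birth_date: date, today: date) -> int:
        """Whole years elapsed between birth_date and today."""
        had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
        return today.year - birth_date.year - (0 if had_birthday else 1)

    def registration_route(birth_date: date, today: Optional[date] = None) -> str:
        """Route a new registrant based on a neutral date-of-birth prompt."""
        today = today or date.today()
        if age_on(birth_date, today) < COPPA_AGE_THRESHOLD:
            # Collect no personal information yet; begin the
            # verifiable-parental-consent workflow instead.
            return "parental_consent_flow"
        return "standard_registration"

    print(registration_route(date(2010, 6, 1), today=date(2018, 1, 1)))
    # -> parental_consent_flow

A date-of-birth prompt is considered "neutral" because it does not telegraph the cutoff; pairing it with a session flag that prevents a rejected child from immediately reentering an older birth date is a common refinement.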

§ 7.4.5 Security and Data Retention

COPPA also requires that those who collect data about children online and through applications take reasonable steps to maintain the confidentiality, security, and integrity of the data, and also take reasonable steps to ensure that any third party to whom the online service provider discloses children’s data will maintain the confidentiality and security of the data. Online service providers covered by COPPA may retain children’s personal information only as long as necessary to fulfill the purpose for which the data was initially collected and must delete the data after it is no longer useful for that intended purpose.
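A hedged sketch of how the retention limit might be operationalized follows; the record fields and retention periods are hypothetical, since COPPA states the standard ("only as long as necessary") rather than fixed numbers.

    from datetime import datetime, timedelta

    # Illustrative retention windows keyed to the purpose of collection.
    # The durations are placeholders, not figures drawn from the rule.
    RETENTION = {
        "account_service": timedelta(days=365),
        "support_ticket": timedelta(days=90),
    }

    def purge_expired(records, now: datetime):
        """Keep child records still within the window for their collection
        purpose; everything else is slated for secure deletion."""
        kept, to_delete = [], []
        for rec in records:
            window = RETENTION.get(rec["purpose"])
            if window is not None and now - rec["collected_at"] < window:
                kept.append(rec)
            else:
                to_delete.append(rec)  # securely delete (omitted in this sketch)
        return kept, to_delete

The point of keying the schedule to purpose is that COPPA's trigger is the purpose of collection, not a calendar period; once the purpose is fulfilled, the data must go.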

§ 7.4.6 Student Privacy

Where the information is used solely for the benefit of the school and not used or disclosed outside that context, school administrators can consent to the collection of a child's information on behalf of a parent. If a given online service wants to collect information about a child for purposes outside the online environment specific to the child's school, such as for behavioral advertising, the child's school cannot consent to such collection on behalf of parents. The FTC recommends, but does not require, that schools let parents know when they have provided such consent on behalf of their child. Of course, compliance with COPPA is just one of the issues that schools must consider when collecting, using, securing, and disclosing student information. The Family Educational Rights and Privacy Act and the Protection of Pupil Rights Amendment are two other laws that must be taken into account.

§ 7.5 CAN-SPAM ACT

Everyone can relate to the frustration that mounts as a result of receiving dozens of unwanted e-mails peddling hair loss treatment products, dating websites, credit counseling services, sexual enhancement products, health supplements, and the like. You have no relationship with or interest in the companies sending the e-mails or selling the products or services, and the messages are clogging your inbox. The Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act of 2003 established compliance requirements for those who send unsolicited commercial e-mails (spam), set forth penalties for "spammers" and those whose products are advertised in spam where they violate the Act, and provided a mechanism for consumers to ask e-mailers to stop spamming them. 15 U.S.C. §§ 7701–7713. The law preempted most state-level antispam laws and extended antispam protections to cover nearly all commercial e-mail marketing.

§ 7.5.1 E-mails Covered by CAN-SPAM Act

The majority of the CAN-SPAM requirements apply to e-mail practices where the primary purpose of the communication is considered commercial, as opposed to transactional. Determining whether an e-mail is commercial or transactional is a fact-based inquiry that requires a review of the content of the message. Commercial messages are those that advertise or promote a commercial product or service, including content on a website operated for a commercial purpose. In contrast, "transactional or relationship" messages are defined to fall within one of five categories, namely those that

• facilitate, complete, or confirm a commercial transaction that the recipient has already agreed to;
• provide warranty, recall, safety, or security information about a product or service already purchased or used by the recipient;
• provide information concerning a change in the terms or features of, or the account balance with regard to, a membership, subscription, account, loan, or other ongoing commercial relationship;
• provide information about an existing employment relationship or related benefits; or
• deliver goods or services as part of a previously agreed upon transaction.

15 U.S.C. § 7702(17)(A). Requirements applicable to commercial e-mails, such as providing an opt-out mechanism (discussed below), do not apply to transactional e-mails, but CAN-SPAM and its regulations do provide that transactional e-mails may not include deceptive content. Special challenges arise where e-mails include both commercial and transactional-type content. Under what circumstances are such messages considered commercial messages that must comply with CAN-SPAM? Messages containing both advertising and transactional or relationship content have a commercial primary purpose if the recipient would interpret the subject line to mean that the message contains commercial advertising, or if the message opens with commercial content rather than a substantial amount of "transactional or relationship content." Factors relevant to that determination include the location of the commercial content (for example, a message is more likely to be deemed primarily commercial when such content appears at the beginning); how much of the message is dedicated to commercial content; and how color, graphics, type size, style, and the like are used to highlight the commercial content.
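The determination is ultimately a legal judgment rather than a mechanical test, but the FTC's factors can at least be captured as a structured checklist for reviewers. A minimal sketch follows; the field names and the threshold in the heuristic are hypothetical, not drawn from the regulations.

    from dataclasses import dataclass

    @dataclass
    class DualContentEmail:
        """FTC primary-purpose factors for a message that mixes commercial
        and transactional content. Recording the factors is mechanical;
        weighing them remains a judgment call for counsel."""
        subject_reads_as_ad: bool       # would the recipient read the subject line as advertising?
        commercial_content_leads: bool  # does ad content appear before the transactional content?
        commercial_fraction: float      # rough share of the message devoted to advertising (0.0-1.0)
        ad_visually_emphasized: bool    # color, graphics, or type size highlighting the ad

        def flag_as_likely_commercial(self) -> bool:
            # Conservative illustrative heuristic: any strong factor flags
            # the message for treatment as a commercial e-mail.
            return (self.subject_reads_as_ad
                    or self.commercial_content_leads
                    or self.commercial_fraction > 0.5
                    or self.ad_visually_emphasized)

    receipt_with_banner = DualContentEmail(
        subject_reads_as_ad=False, commercial_content_leads=False,
        commercial_fraction=0.2, ad_visually_emphasized=False)
    print(receipt_with_banner.flag_as_likely_commercial())  # False

Flagging conservatively errs toward applying the commercial-e-mail requirements, which is usually the cheaper compliance mistake.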

§ 7.5.2 Compliance Requirements

The CAN-SPAM Act imposes seven main requirements applicable to commercial e-mails:

• Do not use false or misleading header information.
• Do not use deceptive subject lines.
• Identify the message as an ad.
• Tell recipients where you are located.
• Tell recipients how to opt out of receiving future e-mail from you.
• Honor opt-out requests promptly.
• Monitor what others are doing on your behalf.

An important aspect of CAN-SPAM compliance is making available, and appropriately acting upon, "opt-out" requests. There are two opt-out mechanism options:

• Include a link to a website for opting out. Under this approach, the sender includes in every e-mail an active link to a website through which the recipient can indicate his or her choice to receive no further commercial e-mails from the sender. Near the link in the e-mail, a statement along the lines of "If you no longer wish to receive commercial e-mails from us, please click on this link and submit your e-mail address" must appear, and the link must remain functional for at least thirty days after any e-mail containing it has been distributed.

• Include an automatic opt-out link. This approach involves including in every e-mail a link that, when clicked, automatically opts the recipient out of future e-mails. A statement should appear near the link saying something similar to "If you no longer wish to receive commercial e-mails from us, please click on this link." Again, the link must remain functional for at least thirty days after any e-mail containing it has been distributed. After the user clicks on the link, he or she should be taken to a confirmation page indicating that the opt-out request is being processed.

All opt-out requests must be honored within ten business days of the sender's receiving the request.
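The bookkeeping these rules imply is a suppression list with timestamps. A minimal sketch follows (the function names are hypothetical); note that the ten-business-day period is an outer processing limit, so the safest reading is to suppress an address the moment its opt-out is received.

    from datetime import date, timedelta

    def add_business_days(start: date, days: int) -> date:
        """Advance a date by `days` business days, skipping weekends."""
        current = start
        while days > 0:
            current += timedelta(days=1)
            if current.weekday() < 5:  # Monday through Friday
                days -= 1
        return current

    opt_outs = {}  # address -> date the opt-out request was received

    def record_opt_out(address: str, received: date) -> None:
        opt_outs[address] = received

    def may_send(address: str) -> bool:
        """Suppress immediately rather than relying on the ten-day window."""
        return address not in opt_outs

    def processing_deadline(address: str) -> date:
        """Last date by which the opt-out must be fully honored."""
        return add_business_days(opt_outs[address], 10)

    record_opt_out("reader@example.com", date(2018, 3, 1))  # a Thursday
    print(may_send("reader@example.com"))                   # False
    print(processing_deadline("reader@example.com"))        # 2018-03-15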

§ 7.5.3 Consequences for Noncompliance

Federal and state authorities are authorized to enforce the CAN-SPAM Act. Federally, the FTC may fine violators up to $11,000 per violation of the Act, issue injunctions, and obtain future audit rights over the violators' operations to ensure ongoing compliance. Furthermore, state attorneys general may fine violators up to $250 per violation of the Act, not exceeding $2 million, or, if greater in amount, may obtain damages equal to the total amount suffered by their state's residents. State damages awards may be tripled if the CAN-SPAM Act was violated willfully and knowingly. In lieu of monetary damages, state officials may opt to enjoin activities in violation of the Act. There is no private right of action under CAN-SPAM. As is the case with COPPA, consumer protection interest groups bemoan the lack of enforcement activity under CAN-SPAM. There are some actions worth noting, though. In August 2014, the FTC announced that Kobeni, Inc., and its principal, Yair Shalev, would pay $350,000 to settle FTC charges that they sent deceptive e-mails in connection with the Affordable Care Act rollout. The e-mails had falsely claimed that consumers would be violating the law if they did not immediately click a link to enroll in health insurance. The e-mails also failed to provide consumers the opportunity to decline to receive future e-mails and failed to provide a valid physical postal address. The CAN-SPAM Act establishes that violations are deceptive practices for purposes of the FTC Act, enabling the FTC to seek civil penalties under that Act. The CAN-SPAM Act also makes enforcement provisions available to state attorneys general, including statutory damages of $250 per e-mail up to a cap of $2 million. In circumstances where the violations are willful or there are other aggravating factors, the monetary damages are subject to trebling, that is, up to $6 million. Aggravated cases can also result in prison terms of up to five years. In April 2015, an individual named Milos Vujanic was sentenced to forty-eight months in federal prison and ordered to pay $17 million in restitution by the U.S. District Court for the Northern District of Texas as a result of wire fraud, mail fraud, and CAN-SPAM violations. Vujanic had acted as the IT support person for a criminal enterprise that perpetrated a range of frauds. His actions included sending deceptive e-mails containing false address information. Granted, this case was not limited to spamming activity but also involved a range of other wrongdoings.


In Massachusetts, the most noteworthy case in which spammers were penalized was a default judgment in Commonwealth v. Kuvayev, Case No. 05-1856-H (Mass. Super. Sept. 26, 2005). Kuvayev and a number of individuals and affiliated entities engaged in nefarious business under thousands of fictitious business names and Internet domain names. Their activities included selling counterfeit drugs through online pharmacies, selling bogus mortgages without a license, violating state advertising regulations, and selling copies of third-party software without authorization. These activities were found to have violated Massachusetts's unfair competition statute (G.L. c. 93A, § 2). Additionally, the defendants were found to have violated the CAN-SPAM Act by, and were enjoined from, sending thousands of e-mails with misleading subject line and header information (including nonfunctioning originating e-mail addresses), no means for the recipients to opt out of receiving future e-mails, no warning of sexual content in certain of the e-mails, and no clear and conspicuous indication where the e-mails were advertisements. The defendants never answered the complaint or made an appearance, but the penalties assessed for their violations of both Chapter 93A and the CAN-SPAM Act are noteworthy: the defendants were assessed $37,461,500 in penalties (149,846 e-mails multiplied by $250 per violation). There have also been cases in which creative antispam crusaders, given that there is no private right of action, have tried to characterize themselves as Internet service providers in order to establish standing to sue under the CAN-SPAM Act. In one such case, plaintiff James S. Gordon, Jr., and his company, Omni Innovations LLC, attempted to establish that they were ISPs by virtue of Gordon's leasing server space through GoDaddy and establishing a range of e-mail accounts for himself, his business, and his friends and family. Gordon argued that in doing so he was acting as an ISP for purposes of CAN-SPAM Act standing. He further alleged that e-mail marketer Virtumundo, Inc., and other named defendants had violated the CAN-SPAM Act by sending e-mails with misleading headers to him and his company, and he sought civil damages under the CAN-SPAM Act and the Washington state unfair competition and consumer protection laws. The trial court granted Virtumundo's motion for summary judgment, finding that neither Gordon nor his company was truly a "provider of Internet access service (IAS) adversely affected by a violation." The Court of Appeals for the Ninth Circuit affirmed, citing Congress's intent to provide a right of action to a narrow class of bona fide ISPs. Gordon v. Virtumundo, Inc., 575 F.3d 1040 (9th Cir. 2009).


CHAPTER 8

The Federal Trade Commission*
Stephen Y. Chow, Esq.
Burns & Levinson LLP, Boston

§ 8.1 The Federal Trade Commission
§ 8.2 Operation of the FTC (Consumer Protection)
§ 8.3 FTC Implementation of Data Security Statutes
§ 8.3.1 Fair Credit Reporting Act (FCRA)
(a) Original Regulation of Dissemination of Credit Reports
(b) Fair and Accurate Credit Transactions Act
§ 8.3.2 Gramm-Leach-Bliley Act
(a) Subtitle A: Safeguarding of Nonpublic Personal Information
(b) Subtitle A: Disclosure of Nonpublic Personal Information
(c) Subtitle B: Pretexting
§ 8.3.3 Health Insurance Portability and Accountability Act (HIPAA)
§ 8.4 Other Privacy Statutes Administered or Enforced by the FTC
§ 8.4.1 Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act, 15 U.S.C. §§ 7701–7713
§ 8.4.2 Telemarketing and Consumer Fraud and Abuse Prevention Act
§ 8.4.3 Do-Not-Call Legislation
§ 8.4.4 The Children's Online Privacy Protection Act (COPPA)
§ 8.5 FTC Actions Against "Unfair or Deceptive Acts or Practices"

EXHIBIT 8A—Red Flags Rule Guidelines
EXHIBIT 8B—GLBA Safeguards Rule Information Protection Program Elements
EXHIBIT 8C—FTC Health Information Breach Notification Rule

* Assisted in update by Quincy Kayton, Director of Services, Volunteer Lawyers for the Arts of Massachusetts.


Scope Note
This chapter provides insight into how the Federal Trade Commission implements federal laws pertaining to data security: the Fair Credit Reporting Act, the Gramm-Leach-Bliley Act, and HIPAA. It also discusses the FTC's administration and enforcement of other statutes that provide protections related to privacy.

§ 8.1 THE FEDERAL TRADE COMMISSION

The Federal Trade Commission (FTC) was established a century ago with the enactment of the Federal Trade Commission Act of Sept. 26, 1914 (FTCA), ch. 311, 38 Stat. 717 (codified at 15 U.S.C. §§ 41–58). Its original purpose under Section 5 of the Act was to prevent "persons, partnerships and corporations" (other than banks and certain common carriers) from "using unfair methods of competition in commerce" through investigation, administrative hearing, reporting, issuing orders, and enforcing these orders in the circuit courts of appeal. Act of Sept. 26, 1914, ch. 311, § 5, 38 Stat. 719–21 (codified at 15 U.S.C. § 45). The FTC was also given additional powers under Section 6, to investigate compliance with decrees under the "Antitrust Acts," Act of Sept. 26, 1914, ch. 311, § 6(c), 38 Stat. 721 (codified at 15 U.S.C. § 46(c)), and violations of those laws, Act of Sept. 26, 1914, ch. 311, § 6(d), 38 Stat. 721 (codified at 15 U.S.C. § 46(d)), and to recommend "the readjustment of the business of any corporation alleged to be violating the antitrust Acts," Act of Sept. 26, 1914, ch. 311, § 6(e), 38 Stat. 721 (codified at 15 U.S.C. § 46(e)). (The "antitrust Acts" referenced above are 15 U.S.C. §§ 1–11 (including the Sherman Act of 1890); other antitrust laws specifically refer to the FTC.) As the FTC is an agency expected to investigate and be knowledgeable about industry, Section 7 authorized courts in equity enforcement of the Antitrust Acts to refer cases to the FTC "to ascertain and report an appropriate form of decree therein." Act of Sept. 26, 1914, ch. 311, § 7, 38 Stat. 722 (codified at 15 U.S.C. § 47).

In 1938, the jurisdiction of the FTC was extended to "unfair or deceptive acts or practices," Act of Mar. 21, 1938, ch. 49, § 3, 52 Stat. 111, which protected consumers against evils broader than "unfair competition." After a further amendment in 1975, Magnuson-Moss Warranty—Federal Trade Commission Improvement Act, Pub. L. No. 93-637, Title II, § 201(a), 88 Stat. 2193 (Jan. 4, 1975) (changing "in commerce" to "in or affecting commerce"), the target evils to be addressed by the FTC are now stated as follows: "[u]nfair methods of competition in or affecting commerce, and unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful." 15 U.S.C. § 45(a)(1).

While its antitrust jurisdiction has expanded, the FTC has also been charged by Congress, in various consumer protection laws, with enforcement of the protections those laws create and, typically, with the promulgation of regulations further defining them.

Legislative History
Wool Products Labeling Act of 1939, 15 U.S.C. §§ 68–68j; Fur Products Labeling Act, 15 U.S.C. §§ 69–69j; Federal Cigarette Labeling and Advertising Act, 15 U.S.C. §§ 1333–1340; Fair Packaging and Labeling Act, 15 U.S.C. §§ 1451–1461; Truth in Lending Act, 15 U.S.C. §§ 1601–1693r; Fair Credit Billing Act, 15 U.S.C. §§ 1666–1666j; Consumer Leasing Act of 1976, 15 U.S.C. §§ 1667–1667e; Fair Credit Reporting Act, 15 U.S.C. §§ 1681–1681t; Equal Credit Opportunity Act, 15 U.S.C. §§ 1691–1691f; Fair Debt Collection Practices Act, 15 U.S.C. §§ 1692–1692o; Electronic Fund Transfer Act, 15 U.S.C. §§ 1693–1693r; Hobby Protection Act, 15 U.S.C. §§ 2101–2106; Magnuson-Moss Warranty—Federal Trade Commission Improvement Act, 15 U.S.C. §§ 2301–2312; Petroleum Marketing Practices Act, 15 U.S.C. §§ 2821–2824; Comprehensive Smokeless Tobacco Health Education Act, 15 U.S.C. §§ 4401–4408; Telephone Disclosure and Dispute Resolution Act of 1992, 15 U.S.C. §§ 5701–5724; Telemarketing and Consumer Fraud and Abuse Prevention Act, 15 U.S.C. §§ 6101–6108; Children's Online Privacy Protection Act, 15 U.S.C. §§ 6501–6506; Gramm-Leach-Bliley Act, Title V (Privacy), 15 U.S.C. §§ 6801–6827; Restore Online Shoppers' Confidence Act, 15 U.S.C. §§ 8401–8405; Postal Reorganization Act, 39 U.S.C. § 3009 (unordered merchandise); Health Information Technology for Economic and Clinical Health (HITECH) Act, 42 U.S.C. § 17937.

This chapter addresses the consumer privacy protection work of the FTC.

Practice Note
The FTC, through its Bureau of Competition, enforces the original "antitrust Acts," as well as the Clayton Act, as amended by the Robinson-Patman Act (codified as amended at 15 U.S.C. §§ 12–27), 15 U.S.C. § 21, added by Act of Oct. 15, 1914, ch. 323, § 11, 38 Stat. 734, and the premerger notification program of the Hart-Scott-Rodino Antitrust Improvements Act of 1976, Pub. L. No. 94-435, Title II, § 201, 90 Stat. 1390 (Sept. 30, 1976) (codified as amended at 15 U.S.C. § 18a). On August 13, 2015, the FTC issued for the first time a "Statement of Enforcement Principles Regarding 'Unfair Methods of Competition' Under Section 5 of the FTC Act," focusing on (1) "the promotion of consumer welfare," (2) application of "a framework similar to the rule of reason," and (3) refraining from challenging an act or practice as an "unfair method of competition" on a standalone basis "if enforcement of the Sherman or Clayton Act is sufficient."

§ 8.2 OPERATION OF THE FTC (CONSUMER PROTECTION)

The FTC is composed of five commissioners appointed by the president of the United States for terms of seven years, with no more than three from the same political party. 15 U.S.C. § 41. The Bureau of Consumer Protection was established by rule making. See 16 C.F.R. § 0.17. The FTC's investigative authority is provided in the FTCA: it is empowered to "prosecute any inquiry necessary to its duties in any part of the United States," 15 U.S.C. § 43, and "may gather and compile information concerning, and . . . investigate from time to time the organization, business, conduct, practices, and management of any person, partnership, or corporation engaged in or whose business affects commerce," 15 U.S.C. § 46(a). See "A Brief Overview of the Federal Trade Commission's Investigative and Law Enforcement Authority," Fed. Trade Comm'n (July 2008), available at https://www.ftc.gov/about-ftc/what-we-do/enforcement-authority (hereinafter Brief Overview). The Bureau of Consumer Protection's tool for investigating possible Section 5 violations and Section 6 industry conditions is the civil investigative demand (CID), used to require production of documentary materials and tangible things, reports, answers to the FTC, and oral testimony. 15 U.S.C. § 57b-1, added by Federal Trade Commission Improvements Act of 1980, Pub. L. No. 96-252, § 13, 94 Stat. 380–85 (May 1980); see 16 C.F.R. § 2.7 (compulsory process in investigations). The investigative procedures (which may originate from the public, 16 C.F.R. § 2.1) are set forth at 16 C.F.R. pt. 2 (Non-Adjudicative Procedures), subpt. A.

Whenever the Commission shall have reason to believe that any such person, partnership, or corporation has been or is using any . . . unfair or deceptive act or practice in or affecting commerce, and if it shall appear to the Commission that a proceeding by it in respect thereof would be to the interest of the public, it shall issue and serve upon such person, partnership, or corporation a complaint stating its charges in that respect . . . [to require the respondent to] show cause why an order should not be entered by the Commission requiring such person, partnership, or corporation to cease and desist from the violation of the law so charged in said complaint. . . .

15 U.S.C. § 45(b) (emphasis added).

"If the respondent elects to settle the charges, it may sign a consent agreement (without admitting liability), consent to entry of a final order, and waive all right to judicial review." Brief Overview § II(A)(1)(a) (consumer protection administrative enforcement). The consent order procedure is set forth at 16 C.F.R. pt. 2 (Non-Adjudicative Procedures), subpt. C.

If the respondent elects to contest the charges, the complaint is adjudicated before an administrative law judge ("ALJ") in a trial-type proceeding conducted under the Commission's Rules of Practice [16 C.F.R. Part 3 (Rules of Practice for Adjudicative Procedures)]. The prosecution of a consumer protection matter is conducted by FTC "complaint counsel," who are staff from the Bureau of Consumer Protection or a regional office. Upon conclusion of the hearing, the ALJ issues an "initial decision" setting forth his findings of fact and conclusions of law and recommending either entry of an order to cease and desist or dismissal of the complaint. Either complaint counsel or respondent, or both, may appeal the initial decision to the full Commission.


Brief Overview § II(A)(1)(a). These procedures are also applicable to formal procedures conducted under statutes administered by the FTC, 16 C.F.R. § 3.2, such as the data security and privacy statutes discussed below. Cease-and-desist orders issued by the FTC may be appealed to the Circuit Court of Appeals for "any circuit where the method of competition or the act or practice in question was used or where such person, partnership, or corporation resides or carries on business." 15 U.S.C. § 45(c). Enforcement of cease-and-desist orders is effectuated through federal District Court actions for civil penalties of up to $10,000 per violation (in a continuing failure to comply, each day counting as a violation) and injunctions. 15 U.S.C. § 45(l). The FTC may also seek in federal District Court (or competent state court) "redress" for consumers, including "rescission or reformation of contracts, the refund of money or return of property, [and] the payment of damages," for violation of "any rule under this Act respecting unfair or deceptive acts or practices" or where a cease-and-desist order has been entered and "the act or practice to which the cease and desist order relates is one which a reasonable man would have known under the circumstances was dishonest or fraudulent." 15 U.S.C. § 57b, added by Magnuson-Moss Warranty—Federal Trade Commission Improvement Act, Pub. L. No. 93-637, Title II, § 206(a), 88 Stat. 2201 (Jan. 4, 1975) (emphasis added). Where the FTC has determined in an administrative proceeding that an act or practice is unfair or deceptive and issues a final cease-and-desist order, it may also seek penalties against a nonrespondent who "engages in such act or practice after the order is final and with actual knowledge that such act or practice is unfair or deceptive and is unlawful under subsection (a)(1) of this section [5 of the FTCA]." 15 U.S.C. § 45(m). Instead of using the administrative procedure, the FTC may seek a preliminary or permanent injunction in federal District Court when it has "reason to believe" that a party is violating or about to violate a provision of law enforced by the FTC under Subsection 13(b) of the FTCA. 15 U.S.C. § 53(b). Notwithstanding that the subsection was originally limited to preliminary injunctions pending an FTC administrative proceeding, 15 U.S.C. § 53(b)(2) (requirement that temporary injunction pending administrative adjudication "would be in the interest of the public"), an amendment to allow permanent injunctions "in proper cases" and "after proper proof," Trans-Alaska Pipeline Act of 1973, Pub. L. No. 93-153, Title IV, § 408(f), 87 Stat. 592 (Nov. 16, 1973), has been applied to consumer protection actions.

The courts have uniformly accepted the Commission's construction of Section 13(b), with the result that most consumer protection enforcement is now conducted directly in court under Section 13(b) rather than by means of administrative adjudication. See, e.g., FTC v. World Travel Vacation Brokers, Inc., 861 F.2d 1020, 1024–28 (7th Cir. 1988); FTC v. U.S. Oil & Gas Corp., 748 F.2d 1431, 1432–35 (11th Cir. 1984) (per curiam); FTC v. H.N. Singer, Inc., 668 F.2d 1107, 1110–13 (9th Cir. 1982). A suit under Section 13(b) is preferable to the adjudicatory process because, in such a suit, the court may award both prohibitory and monetary equitable relief in one step. Moreover, a judicial injunction becomes effective immediately, while a Commission cease and desist order takes effect only 60 days after service. . . . Of course, administrative adjudication offers certain advantages over direct judicial enforcement. In particular, in an adjudicatory proceeding, the Commission has the first opportunity to make factual findings and articulate the relevant legal standard. On review, the court is obliged to affirm the Commission's findings of fact if supported by substantial evidence. A reviewing court must also accord substantial deference to Commission interpretation of the FTC Act and other applicable federal laws. In a 13(b) suit, by contrast, the Commission receives no greater deference than would any government plaintiff. Thus, where a case involves novel legal issues or fact patterns, the Commission has tended to prefer administrative adjudication.

Brief Overview § II(A)(2).

The FTC has been charged with implementation and enforcement of most of the federal statutes addressed to data security and privacy. Following is a survey of its activities in

• data security statutes (largely addressed to identity theft);
• other federal data privacy statutes; and
• administrative and court actions relative to deceptive acts and practices implicating privacy.

§ 8.3 FTC IMPLEMENTATION OF DATA SECURITY STATUTES

The Identity Theft and Assumption Deterrence Act of 1998, Pub. L. No. 105-318, 112 Stat. 3007 (Oct. 30, 1998), in addition to adding a federal crime of identity theft, 18 U.S.C. § 1028(a)(7) ("knowingly transfers or uses, without lawful authority, a means of identification of another person with the intent to commit, or to aid or abet, any unlawful activity that constitutes a violation of Federal law, or that constitutes a felony under any applicable State or local law"), added by Pub. L. No. 105-318, § 3, 112 Stat. 3007 (Oct. 30, 1998), made the FTC a clearinghouse for consumer complaints of identity theft. 18 U.S.C. § 1028 note ("Centralized Complaint and Consumer Education Service for Victims of Identity Theft"—FTC to log and acknowledge complaints, provide information to victims, and refer complaints to appropriate entities such as the national consumer credit reporting agencies and law enforcement agencies), added by Pub. L. No. 105-318, § 5, 112 Stat. 3010 (Oct. 30, 1998). The FTC has been at the center of identity theft prevention and mitigation through its rule making and enforcement responsibilities under three statutory schemes:

• the Fair Credit Reporting Act (FCRA), 15 U.S.C. §§ 1681–1681v, as amended by the Fair and Accurate Credit Transactions Act (FACTA), 15 U.S.C. §§ 1681–1681x;
• the Gramm-Leach-Bliley Act, 15 U.S.C. §§ 6801–6809, 6821–6827; and
• the Health Insurance Portability and Accountability Act (HIPAA), as amended by the Health Information Technology for Economic and Clinical Health (HITECH) Act, 42 U.S.C. §§ 17937, 17954.

§ 8.3.1 Fair Credit Reporting Act (FCRA)

(a) Original Regulation of Dissemination of Credit Reports

The Fair Credit Reporting Act (FCRA) of 1970, as amended, protects the privacy of the information collected by a "consumer reporting agency."

Legislative History
15 U.S.C. §§ 1681–1681v, added by Act of Oct. 26, 1970, Pub. L. No. 91-508, Title VI (Fair Credit Reporting Act), 84 Stat. 1127, adding to the Truth in Lending Act, Pub. L. No. 90-321, 82 Stat. 146 (May 29, 1968); amended by Act of Oct. 27, 1992, Pub. L. No. 102-537, § 2(b), 106 Stat. 3531; amended by Consumer Credit Reform Act of 1996, Pub. L. No. 104-208, Div. A, Title II, Subtitle D, Ch. 1, 110 Stat. 3009-426 (Sept. 30, 1996); Consumer Reporting Employment Clarification Act of 1998, Pub. L. No. 105-347, 112 Stat. 3208 (Nov. 2, 1998); amended by Fair and Accurate Credit Transactions Act of 2003 (FACTA), Pub. L. No. 108-159, 117 Stat. 1952 (Dec. 4, 2003); amended by Act of Dec. 13, 2003, Pub. L. No. 108-177, Title III, Subtitle D, § 361(j), 117 Stat. 2625; Intelligence Reform and Terrorism Prevention Act of 2004, Pub. L. No. 108-458, Title VI, Subtitle C, § 6203(l), 118 Stat. 3747 (Dec. 17, 2004); USA PATRIOT Improvement and Reauthorization Act of 2005, Pub. L. No. 109-177, Title I, §§ 116(c), 118(b), 120 Stat. 214, 217 (Mar. 9, 2006); USA PATRIOT Improvement and Reauthorizing Amendments Act of 2005, Pub. L. No. 109-178, § 4(c)(2), 120 Stat. 280 (Mar. 9, 2006); amended by Financial Services Regulatory Relief Act of 2006, Pub. L. No. 109-351, Title VII, § 719, 120 Stat. 1998 (Oct. 13, 2006); amended by Act of Dec. 26, 2007, Pub. L. No. 110-161, Div. D, Title VII, § 743, 121 Stat. 2033; amended by Credit and Debit Card Receipt Clarification Act of 2007, Pub. L. No. 110-241, § 3, 122 Stat. 1566 (June 3, 2008) (adding FCRA § 616(d), 15 U.S.C. § 1681n(d), to eliminate liability for failure to truncate the credit card expiration date); amended by Credit Card Accountability Responsibility and Disclosure Act of 2009, Pub. L. No. 111-24, Title II, § 205, Title III, § 302, 123 Stat. 1747, 1748 (May 22, 2009); amended by Dodd-Frank Wall Street Reform and Consumer Protection Act, Pub. L. No. 111-203, Title X, Subtitle H, § 1088(a), 124 Stat. 2086, 2087 (July 21, 2010); amended by Red Flag Program Clarification Act of 2010, Pub. L. No. 111-319, 124 Stat. 3457 (Dec. 18, 2010).

Practice Note
As defined in 15 U.S.C. § 1681a(f),


[t]he term "consumer reporting agency" means any person which, for monetary fees, dues, or on a cooperative nonprofit basis, regularly engages in whole or in part in the practice of assembling or evaluating consumer credit information or other information on consumers for the purpose of furnishing consumer reports to third parties, and which uses any means or facility of interstate commerce for the purpose of preparing or furnishing consumer reports.

The central focus of the FCRA is information about an individual in a "consumer report," broadly defined as

any written, oral, or other communication of any information by a consumer reporting agency bearing on a consumer's credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer's eligibility for (A) credit or insurance to be used primarily for personal, family, or household purposes; [or] (B) employment purposes. . . .

15 U.S.C. § 1681a(d)(1) (emphasis added). The permissible dissemination of such reports is set forth under FCRA § 604 (15 U.S.C. § 1681b as amended) and, subject to certain exceptions, a consumer reporting agency may furnish a consumer report only "[t]o a person which it has reason to believe . . . intends to use the information" for

• a credit transaction involving the consumer;
• employment purposes;
• underwriting of insurance involving the consumer;
• "the consumer's eligibility for a license or other benefit granted by a governmental instrumentality required by law to consider an applicant's financial responsibility or status";
• "valuation of, or an assessment of the credit or prepayment risks associated with, an existing credit obligation"; or
• "a legitimate business need for the information" for
– "a business transaction that is initiated by the consumer" or
– "to review an account to determine whether the consumer continues to meet the terms of the account."

15 U.S.C. § 1681b(a)(3).

The Consumer Credit Reform Act of 1996, Pub. L. No. 104-208, Div. A, Title II, Subtitle D, Ch. 1 (Consumer Credit Reporting Reform Act of 1996), 110 Stat. 3009-426 (Sept. 30, 1996) (hereinafter CCRRA), added Subsection (b) to FCRA § 604, providing extensive additional conditions for employment use under Subsection (a)(3)(B), including notice to the consumer. 15 U.S.C. § 1681b(b), added by CCRRA § 2403(b), 110 Stat. 3009-430. It also allowed the furnishing of "prescreening" reports "in connection with credit or insurance transactions that are not initiated by the consumer" if the consumer authorizes the agency to do so or "the transaction consists of a firm offer of credit or insurance" and the consumer has not elected to be excluded—"opted out"—from lists of consumers provided by the agency to offerors of credit or insurance. 15 U.S.C. § 1681b(c), added by CCRRA §§ 2404(a), 2405, 110 Stat. 3009-430–433, also adding 15 U.S.C. § 1681b(e) (two-year consumer election and agency provision for availability of election); see 15 U.S.C. § 1681b note, added by CCRRA § 2404(c), 110 Stat. 3009-434 (FTC to provide guidelines for prescreening); 16 C.F.R. pt. 642 (Prescreen Opt-Out Notice), added by 70 Fed. Reg. 5,032 (Jan. 31, 2005). Information on such lists is limited as to identification of other creditors. 15 U.S.C. § 1681b(c)(2). The CCRRA also limited communication of reports containing medical information. 15 U.S.C. § 1681b(g), added by CCRRA § 2405, 110 Stat. 3009-434.

Section 609 of the Fair Credit Reporting Act (15 U.S.C. § 1681g as amended) requires disclosure to consumers of the information on them in agency files, 15 U.S.C. § 1681g; see 15 U.S.C. § 1681h (form of disclosure), except for credit scores, 15 U.S.C. § 1681g(a)(1)(B), including the sources of the information, 15 U.S.C. § 1681g(a)(2) (except "investigative consumer reports"); see 15 U.S.C. §§ 1681a(e) (definition) and 1681d (conditions, including disclosure to consumer, for generating investigative consumer reports), and identification of each person that procured a report during the preceding two-year period for employment purposes and during the preceding one-year period for any other purpose. 15 U.S.C. § 1681g(a)(3) (excepting governmental end users on certain conditions). A summary of consumer rights is to be included with the disclosure. 15 U.S.C. § 1681g(c). Section 1681i provides for reinvestigation of information disputed by consumers. 15 U.S.C. § 1681i.

Section 615 (15 U.S.C. § 1681m as amended) imposes certain duties on the users of consumer reports. If an adverse action is taken with respect to a consumer based on the report, the user must provide "oral, written, or electronic notice" of the adverse action, contact information for the agency, and a summary of the rights to obtain a copy of the report and to dispute with the agency the accuracy of the report. 15 U.S.C. § 1681m(a). If an adverse action is based on information from third parties other than agencies, the user must "clearly and accurately disclose" the right to request the reasons for such adverse action and provide that information upon request. 15 U.S.C. § 1681m(b).

From the outset in 1970, the FCRA made the FTC the primary enforcement agency:

[FCRA] § 621. Administrative enforcement
(a) Compliance with the requirements imposed under this title shall be enforced under the Federal Trade Commission Act by the Federal Trade Commission with respect to consumer reporting agencies and all other persons subject thereto, except to the extent that enforcement of the requirements imposed under this title is specifically committed to some other government agency under subsection (b) hereof. For the purpose of the exercise by the Federal Trade Commission of its functions and powers under the Federal Trade Commission Act, a violation of any requirement or prohibition imposed . . . .

Pub. L. No. 91-508, Title VI, § 601, 84 Stat. 1134–35 (Oct. 26, 1970), currently codified as amended at 15 U.S.C. § 1681s. Thus, the original charge to the FTC was under its "unfair or deceptive acts or practices" jurisdiction, which was broad and relatively undefined until restricted by Congress in its Federal Trade Commission Act Amendments of 1994, Pub. L. No. 103-312, §§ 5, 9, 108 Stat. 1691, 1692, 1695 (Aug. 26, 1994), adding respectively 15 U.S.C. §§ 57a(b)(3) ("prevalence" required) and 45(n) (limitations on "unfair" for declarations of unlawfulness and rule making). See § 8.5, below. Over the forty-five years of enforcement of the FCRA, the FTC has aggressively exercised its jurisdiction. See 40 Years of Experience with the Fair Credit Reporting Act (FTC Staff Report July 2011), available at https://www.ftc.gov/sites/default/files/documents/reports/40-years-experience-fair-credit-reporting-act-ftc-staff-report-summary-interpretations/110720fcrareport.pdf.

Practice Note
Original Subsection (b) named the Comptroller of the Currency, the Federal Reserve Board, the Federal Deposit Insurance Corp., the Federal Home Loan Bank Board (through the Federal Savings and Loan Insurance Corp.), the National Credit Union Administration, the Interstate Commerce Commission, the Civil Aeronautics Board, and the secretary of Agriculture. As currently stated at 15 U.S.C. § 1681s(b), the other agencies are "the appropriate Federal banking agency," the National Credit Union Administration, the secretary of Transportation, the secretary of Agriculture, the Commodities Futures Trading Commission, the Securities and Exchange Commission, and, most importantly, the new Consumer Financial Protection Bureau created by the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010, Pub. L. No. 111-203, Title X (Consumer Financial Protection Act of 2010), Subtitles A–F, §§ 1011–1067, 124 Stat. 1964–2056 (July 21, 2010), with Section 1681s amended by Subtitle H, § 1088(a)(2)(A)–(C), (10), 124 Stat. 2087, 2088–90.

In 2010, rule-making responsibility for the FCRA was transferred to the new Consumer Financial Protection Bureau (CFPB), except for rule making under the "red flags" rule (15 U.S.C. § 1681m(e)) and the records-disposal provision (15 U.S.C. § 1681w), both addressed to prevention of data breaches and the resultant identity theft. Consumer Financial Protection Act of 2010, Subtitle H, § 1088(a)(2)(A)–(C), 124 Stat. 2087. See Exhibit 8A for red flags rule guidelines.
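Stepping back from the statutory history, the permissible-purpose rule of 15 U.S.C. § 1681b(a)(3), described above, functions much like an access-control list applied at furnish time. A minimal sketch follows (the enum labels are hypothetical shorthand; the statutory text controls, and several purposes carry conditions, such as the employment-use notice requirements, that are not modeled here):

    from enum import Enum, auto

    class Purpose(Enum):
        # Hypothetical shorthand for the purposes of 15 U.S.C. § 1681b(a)(3).
        CREDIT_TRANSACTION = auto()
        EMPLOYMENT = auto()              # subject to the § 1681b(b) notice conditions
        INSURANCE_UNDERWRITING = auto()
        GOVERNMENT_LICENSE_OR_BENEFIT = auto()
        CREDIT_RISK_VALUATION = auto()
        LEGITIMATE_BUSINESS_NEED = auto()
        MARKETING_LIST = auto()          # not a permissible purpose under (a)(3)

    PERMISSIBLE = {
        Purpose.CREDIT_TRANSACTION,
        Purpose.EMPLOYMENT,
        Purpose.INSURANCE_UNDERWRITING,
        Purpose.GOVERNMENT_LICENSE_OR_BENEFIT,
        Purpose.CREDIT_RISK_VALUATION,
        Purpose.LEGITIMATE_BUSINESS_NEED,
    }

    def may_furnish_report(claimed_purpose: Purpose) -> bool:
        """An agency may furnish a report only if it has reason to believe
        the requester intends one of the enumerated permissible uses."""
        return claimed_purpose in PERMISSIBLE

    print(may_furnish_report(Purpose.MARKETING_LIST))  # False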

(b) Fair and Accurate Credit Transactions Act

The Fair and Accurate Credit Transactions Act of 2003 (FACT Act, or FACTA), Pub. L. No. 108-159, 117 Stat. 1952 (Dec. 4, 2003), added sections to the FCRA for the prevention of and remediation of identity theft. Titles I and II are most relevant to data security and privacy.

Title I of FACTA is entitled "Identity Theft Prevention and Credit History Restoration." Subtitle A is entitled "Identity Theft Prevention," and most notably includes new Section 605A to the FCRA, providing, among other things, for

• initial fraud alerts in a file upon the request of a consumer and reference to other agencies,
• access to free reports,
• an extended alert upon the request of a consumer,
• a five-year freeze or exclusion from agency lists, and
• reference to other agencies in order to avert identity theft.

15 U.S.C. § 1681c-1, added by FACTA, Title I, Subtitle A, § 112(a), 117 Stat. 1955–59. (Section 112(b) required the FTC to promulgate regulations on appropriate proof of identity for the procedures in new FCRA §§ 605A, 605B, and 609(a)(1), which were set forth at 16 C.F.R. pt. 614 (Appropriate Proof of Identity), moved to 12 C.F.R. § 1022.123.) Subtitle A also included a proscription on the printing of credit card numbers on electronically printed receipts except as truncated to, at most, the last five digits, FCRA § 605(g), 15 U.S.C. § 1681c(g), added by FACTA, Title I, Subtitle A, § 113, 117 Stat. 1959–60, and the option of consumers to request that the first five digits of the Social Security number not be included in the disclosure, FCRA § 609(a), 15 U.S.C. § 1681g(a), amended by FACTA, Title I, Subtitle A, § 115, 117 Stat. 1961. It also provided for the "establishment of procedures for the identification of possible instances of identity theft" through amendment of FCRA § 615 (15 U.S.C. § 1681m—Requirements on users of consumer reports) to include the promulgation of regulations:

(e) Red flag guidelines and regulations required.
(1) Guidelines. The Federal banking agencies, the National Credit Union Administration, and the Commission shall jointly, with respect to the entities that are subject to their respective enforcement authority under section 621 [15 U.S.C. § 1681s]—
(A) establish and maintain guidelines for use by each financial institution and each creditor regarding identity theft with respect to account holders at, or customers of, such entities, and update such guidelines as often as necessary;
(B) prescribe regulations requiring each financial institution and each creditor to establish reasonable policies and procedures for implementing the guidelines established pursuant to subparagraph (A), to identify possible risks to account holders or customers or to the safety and soundness of the institution or customers; and
(C) [change of address for credit cards]
(2) Criteria.
(A) In general. In developing the guidelines required by paragraph (1)(A), the agencies described in paragraph (1) shall identify patterns, practices, and specific forms of activity that indicate the possible existence of identity theft.
(B) Inactive accounts. In developing the guidelines required by paragraph (1)(A), the agencies described in paragraph (1) shall consider including reasonable guidelines providing that when a transaction occurs with respect to a credit or deposit account that has been inactive for more than 2 years, the creditor or financial institution shall follow reasonable policies and procedures that provide for notice to be given to a consumer in a manner reasonably designed to reduce the likelihood of identity theft with respect to such account.

FCRA § 615(e), 15 U.S.C. § 1681m(e) (emphasis added), added by FACTA, Title I, Subtitle A, § 114, 117 Stat. 1960–61, amended by Red Flag Program Clarification Act of 2010, Pub. L. No. 111-319, § 2(a), 124 Stat. 3457 (Dec. 18, 2010) (adding Subsection (e)(4) defining a "creditor" as one who "obtains or uses consumer reports . . . ; furnishes information to consumer reporting agencies . . . ; or advances funds to or on behalf of a person, based on an obligation of the person to repay the funds or repayable from specific property pledged by or on behalf of the person [but not a creditor who] advances funds on behalf of a person for expenses incidental to a service provided by the creditor to that person," thus excluding accountants, doctors, and lawyers).

In 2007, pursuant to this directive, the Department of the Treasury, Office of the Comptroller of the Currency, the Federal Reserve System, the Federal Deposit Insurance Corporation, the Department of the Treasury Office of Thrift Supervision, the National Credit Union Administration, and the FTC promulgated the "Red Flags Rules" on identity theft, 12 C.F.R. pts. 41, 222, 334, 364, 571, and 717, and 16 C.F.R. pt. 681 (Identity Theft Rule), added by 72 Fed. Reg. 63,718, 63,771 (Nov. 9, 2007), amended by 75 Fed. Reg. 22,639, 22,644–46 (May 14, 2009), amended by 77 Fed. Reg. 72,712, 72,715 (Dec. 6, 2012), conforming the definition of "creditor" at 15 U.S.C. § 1681m(e)(4). The FTC rule, which is parallel to the others but of more general applicability, is as follows:

Duties regarding the detection, prevention, and mitigation of identity theft. [Red Flags Rule]


(a) Scope. This section applies to financial institutions and creditors that are subject to administrative enforcement of the FCRA by the Federal Trade Commission pursuant to 15 U.S.C. 1681s(a)(1).
(b) Definitions. For purposes of this section, and Appendix A, the following definitions apply:
(1) Account means a continuing relationship established by a person with a financial institution or creditor to obtain a product or service for personal, family, household or business purposes. Account includes:
(i) An extension of credit, such as the purchase of property or services involving a deferred payment; and
(ii) A deposit account.
(2) The term board of directors includes:
(i) In the case of a branch or agency of a foreign bank, the managing official in charge of the branch or agency; and
(ii) In the case of any other creditor that does not have a board of directors, a designated employee at the level of senior management.
(3) Covered account means:
(i) An account that a financial institution or creditor offers or maintains, primarily for personal, family, or household purposes, that involves or is designed to permit multiple payments or transactions, such as a credit card account, mortgage loan, automobile loan, margin account, cell phone account, utility account, checking account, or savings account; and
(ii) Any other account that the financial institution or creditor offers or maintains for which there is a reasonably foreseeable risk to customers or to the safety and soundness of the financial institution or creditor from identity theft, including financial, operational, compliance, reputation, or litigation risks.
(4) Credit has the same meaning as in 15 U.S.C. 1681a(r)(5).
(5) Creditor has the same meaning as in 15 U.S.C. 1681m(e)(4).
(6) Customer means a person that has a covered account with a financial institution or creditor.
(7) Financial institution has the same meaning as in 15 U.S.C. 1681a(t).
(8) Identity theft has the same meaning as in 16 CFR 603.2(a).


(9) Red Flag means a pattern, practice, or specific activity that indicates the possible existence of identity theft.
(10) Service provider means a person that provides a service directly to the financial institution or creditor.
(c) Periodic Identification of Covered Accounts. Each financial institution or creditor must periodically determine whether it offers or maintains covered accounts. As a part of this determination, a financial institution or creditor must conduct a risk assessment to determine whether it offers or maintains covered accounts described in paragraph (b)(3)(ii) of this section, taking into consideration:
(1) The methods it provides to open its accounts;
(2) The methods it provides to access its accounts; and
(3) Its previous experiences with identity theft.
(d) Establishment of an Identity Theft Prevention Program –
(1) Program requirement. Each financial institution or creditor that offers or maintains one or more covered accounts must develop and implement a written Identity Theft Prevention Program (Program) that is designed to detect, prevent, and mitigate identity theft in connection with the opening of a covered account or any existing covered account. The Program must be appropriate to the size and complexity of the financial institution or creditor and the nature and scope of its activities.
(2) Elements of the Program. The Program must include reasonable policies and procedures to:
(i) Identify relevant Red Flags for the covered accounts that the financial institution or creditor offers or maintains, and incorporate those Red Flags into its Program;
(ii) Detect Red Flags that have been incorporated into the Program of the financial institution or creditor;
(iii) Respond appropriately to any Red Flags that are detected pursuant to paragraph (d)(2)(ii) of this section to prevent and mitigate identity theft; and
(iv) Ensure the Program (including the Red Flags determined to be relevant) is updated periodically, to reflect changes in risks to customers and to the safety and soundness of the financial institution or creditor from identity theft.
(e) Administration of the Program. Each financial institution or creditor that is required to implement a Program must provide for the continued administration of the Program and must:


(1) Obtain approval of the initial written Program from either its board of directors or an appropriate committee of the board of directors;
(2) Involve the board of directors, an appropriate committee thereof, or a designated employee at the level of senior management in the oversight, development, implementation and administration of the Program;
(3) Train staff, as necessary, to effectively implement the Program; and
(4) Exercise appropriate and effective oversight of service provider arrangements.
(f) Guidelines. Each financial institution or creditor that is required to implement a Program must consider the guidelines in appendix A [attached as Appendix A hereto] of this part and include in its Program those guidelines that are appropriate.

16 C.F.R. § 681.1 (emphasis added). There is no private right of action as to the red flags rule, 15 U.S.C. § 1681m(h)(8)(A), and both rule-making and administrative enforcement authority remained with the FTC, 15 U.S.C. §§ 1681m(e) and (h)(8)(B), and 1681s(a).

Subtitle B of FACTA Title I is entitled "Protection and Restoration of Identity Theft Victim Credit History." It provides for the regulatory development of a summary of rights of identity theft victims and for the availability of information. FCRA § 609(d)–(e), 15 U.S.C. § 1681g(d)–(e), added by FACTA, Title I, Subtitle B, § 151(a), 117 Stat. 1961–64. (The model "Summary of Consumer Identity Theft Rights" is set forth at Appendix E to 16 C.F.R. pt. 698, added by "Summary of Rights and Notices Duties Under the Fair Credit Reporting Act," 69 Fed. Reg. 69,776, 69,784–87 (Nov. 30, 2004).) It added a new FCRA § 605B, 15 U.S.C. § 1681c-2, added by FACTA, Title I, Subtitle B, § 152(a), 117 Stat. 1964–66; see 12 C.F.R. § 1022.123 (Appropriate Proof of Identity), on blocking of information resulting from identity theft, with the following main provisions:

(a) Block. Except as otherwise provided in this section, a consumer reporting agency shall block the reporting of any information in the file of a consumer that the consumer identifies as information that resulted from an alleged identity theft, not later than 4 business days after the date of receipt by such agency of—
(1) appropriate proof of the identity of the consumer;
(2) a copy of an identity theft report;
(3) the identification of such information by the consumer; and


(4) a statement by the consumer that the information is not information relating to any transaction by the consumer. (b) Notification. A consumer reporting agency shall promptly notify the furnisher of information identified by the consumer under subsection (a) of this section— (1) that the information may be a result of identity theft; (2) that an identity theft report has been filed; (3) that a block has been requested under this section; and (4) of the effective dates of the block. 15 U.S.C. § 1681c-2(a), (b). Subtitle B also calls for the coordination of consumer complaint investigations among the consumer reporting agencies. FCRA § 621(f), 15 U.S.C. § 1681s(f), added by FACTA, Title I, Subtitle B, § 153(a), 117 Stat. 1966. There are provisions to prevent "repollution of consumer reports" by requiring procedures to prevent reinsertion of erroneous information, FCRA § 623(a), 15 U.S.C. § 1681s-2(a), added by FACTA, Title I, Subtitle B, § 154(a), 117 Stat. 1966–67, and prohibiting the sale or transfer of debt caused by identity theft, FCRA § 615(f), 15 U.S.C. § 1681m(f), added by FACTA, Title I, Subtitle B, § 154(b), 117 Stat. 1967. There is a further requirement for debt collectors to notify third-party creditors or users that information may be fraudulent. FCRA § 615(g), 15 U.S.C. § 1681m(g), added by FACTA, Title I, Subtitle B, § 155(a), 117 Stat. 1967–68. Title II of FACTA is entitled "Improvements in the Use of and Consumer Access to Credit Information." These improvements include
• free annual disclosures, FCRA § 612(a), 15 U.S.C. § 1681j(a), added by FACTA, Title II, § 211(a), 117 Stat. 1968–70;
• the promulgation of regulations against corporate and technological circumvention, new FCRA § 629, 15 U.S.C. § 1681x, added by FACTA, Title II, § 211(b), 117 Stat. 1970 (these were set out as 16 C.F.R. pt. 611 (Prohibition Against Circumventing Treatment as a Nationwide Consumer Reporting Agency), added by 69 Fed. Reg. 29,061, 29,063–64 (May 20, 2004), moved to 12 C.F.R. § 1022.140);
• provision of a summary of rights to obtain and dispute information in consumer reports and to obtain credit scores, FCRA § 609(c), 15 U.S.C. § 1681g(c), amended by FACTA, Title II, § 211(c), 117 Stat. 1970–71; see 16 C.F.R. pt. 698, Appendices D (Standardized Form for Requesting Free File Disclosure), F (General Summary of Consumer Rights), added by "Summaries of Rights and Notices of Duties Under the Fair Credit Reporting Act," 69 Fed. Reg. 69,776, 69,784–87 (Nov. 30, 2004);

• disclosure of credit scores, FCRA § 609(a), (g), 15 U.S.C. § 1681g(a), (g), amended and added by FACTA, Title II, § 212(a), (b), 117 Stat. 1972–75;
• inclusion of a key factor in the credit score information in a consumer report, FCRA § 605(d), 15 U.S.C. § 1681c(d), amended by FACTA, Title II, § 212(d), 117 Stat. 1978;
• enhanced opt-out procedures, FCRA § 615(d)(2), 15 U.S.C. § 1681m(d)(2), amended by FACTA, Title II, § 213(a), 117 Stat. 1978–79; see 16 C.F.R. pt. 642 (Prescreen Opt-Out Notice), added by 70 Fed. Reg. 5,032 (Jan. 31, 2005);
• extension of elections from two to five years, FCRA § 604(e), 15 U.S.C. § 1681b(e), amended by FACTA, Title II, § 213(c), 117 Stat. 1979; and
• providing for disclosure of negative reports to a credit agency, FCRA § 623(a)(7), 15 U.S.C. § 1681s-2(a)(7), amended by FACTA, Title II, § 217(a), 117 Stat. 1986–87.
Also added by Title II was a new FCRA § 624 on "affiliate sharing." New FCRA § 624, 15 U.S.C. § 1681s-3, added by FACTA, Title II, § 214(a), 117 Stat. 1980–82. The basic requirement is that [a]ny person that receives from another person related to it by common ownership or affiliated by corporate control a communication of information that would be a consumer report . . . may not use the information to make a solicitation for marketing purposes to a consumer about its products or services, unless— (A) it is clearly and conspicuously disclosed to the consumer that the information may be communicated among such persons for purposes of making such solicitations to the consumer; and (B) the consumer is provided an opportunity and a simple method to prohibit the making of such solicitations to the consumer by such person. 15 U.S.C. § 1681s-3(a)(1). The FTC was required to promulgate regulations, which it did at 16 C.F.R. pt. 680 (Affiliate Marketing). 16 C.F.R. pt. 680, added by 72 Fed. Reg. 61,455 (Oct. 30, 2007). A significant contribution of Title II to promoting data security was new FCRA § 628, requiring the FTC, Securities and Exchange Commission, Commodity Futures Trading Commission, federal banking agencies, and National Credit Union Administration to "issue final regulations requiring any person that maintains or otherwise possesses consumer information, or any compilation of consumer information, derived from consumer reports for a business purpose to properly dispose of any such information or compilation." New FCRA § 628 (Disposal of Records), 15 U.S.C. § 1681w, added by FACTA, Title II, § 216(a), 117 Stat. 1985–86. Following is the regulation promulgated by the FTC at 16 C.F.R. pt. 682 (Disposal Rule):

Proper disposal of consumer information. (a) Standard. Any person who maintains or otherwise possesses consumer information for a business purpose must properly dispose of such information by taking reasonable measures to protect against unauthorized access to or use of the information in connection with its disposal. (b) Examples. Reasonable measures to protect against unauthorized access to or use of consumer information in connection with its disposal include the following examples. These examples are illustrative only and are not exclusive or exhaustive methods for complying with the rule in this part. (1) Implementing and monitoring compliance with policies and procedures that require the burning, pulverizing, or shredding of papers containing consumer information so that the information cannot practicably be read or reconstructed. (2) Implementing and monitoring compliance with policies and procedures that require the destruction or erasure of electronic media containing consumer information so that the information cannot practicably be read or reconstructed. (3) After due diligence, entering into and monitoring compliance with a contract with another party engaged in the business of record destruction to dispose of material, specifically identified as consumer information, in a manner consistent with this rule. In this context, due diligence could include reviewing an independent audit of the disposal company’s operations and/or its compliance with this rule, obtaining information about the disposal company from several references or other reliable sources, requiring that the disposal company be certified by a recognized trade association or similar third party, reviewing and evaluating the disposal company’s information security policies or procedures, or taking other appropriate measures to determine the competency and integrity of the potential disposal company. (4) For persons or entities who maintain or otherwise possess consumer information through their provision of services directly to a person subject to this part, implementing and monitoring compliance with policies and procedures that protect against unauthorized or unintentional disposal of consumer information, and disposing of such information in accordance with examples (b)(1) and (2) of this section. (5) For persons subject to the Gramm-Leach-Bliley Act, 15 U.S.C. 6801 et seq., and the Federal Trade Commission’s Standards for Safeguarding Customer Information, 16 CFR part 314 (“Safeguards Rule”), incorporating the proper disposal

of consumer information as required by this rule into the information security program required by the Safeguards Rule. See Exhibit 8B for the text of the GLBA Safeguards Rule Information Protection Program Elements. 16 C.F.R. § 682.3, added by 69 Fed. Reg. 68,690, 68,697 (Nov. 24, 2004). As part of a periodic review of its regulations, the FTC requested comments on the Disposal of Consumer Report Information and Records, 81 Fed. Reg. 63,435 (Sept. 15, 2016). Rule-making and administrative authority remain with the FTC. 15 U.S.C. §§ 1681w(a), 1681s(a).
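For offices that script portions of their disposal procedures, the Disposal Rule's example of erasing electronic media "so that the information cannot practicably be read or reconstructed," 16 C.F.R. § 682.3(b)(2), can be illustrated in code. The following Python sketch is illustrative only (the file name is hypothetical); note that overwriting in place is generally unreliable on solid-state drives and journaling file systems, where full-device sanitization or physical destruction may be required:

```python
# Illustrative sketch only: overwrite a file containing consumer
# information with random bytes before deleting it, in the spirit of
# 16 C.F.R. § 682.3(b)(2). Not a compliance standard; on SSDs and
# journaling or copy-on-write file systems, overwriting in place does
# not reliably destroy data.
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite the file with random bytes several times, then remove it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # force each pass onto stable storage
    os.remove(path)

if __name__ == "__main__":
    # Hypothetical file holding consumer information derived from a report
    with open("consumer_report_extract.tmp", "wb") as f:
        f.write(b"example consumer information")
    overwrite_and_delete("consumer_report_extract.tmp")
```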

§ 8.3.2 Gramm-Leach-Bliley Act

The Gramm-Leach-Bliley Act of 1999 (GLBA), which eliminated the Depression-era separation of banks and securities firms, included a Title V addressed to “privacy.” Act of Nov. 12, 1999, Pub. L. No. 106-102, Title V, §§ 501–527, 113 Stat. 1436. Title V included two subtitles, with Subtitle A addressed to “Disclosure of Nonpublic Personal Information” and set out as 15 U.S.C. §§ 6801–6809, as added by GLBA, Title V, Subtitle A, §§ 501–509, 113 Stat. 1436–1443 (Nov. 12, 1999), amended by Consumer Financial Protection Act of 2010, Subtitle H, § 1093, 124 Stat. 2095–97 (July 21, 2010) (adding and coordinating responsibilities of FTC, Bureau of Consumer Financial Protection, and Commodity Futures Trading Commission), and Subtitle B addressed to “Fraudulent Access to Financial Information” and set forth as 15 U.S.C. §§ 6821–6827, as added by GLBA, Title V, Subtitle B, §§ 521–527, 113 Stat. 1446–1450. Subtitle A provided the statutory bases for FTC and agency development of the safeguards rule and the financial privacy rule, while Subtitle B itself provided the pretexting rule.

(a) Subtitle A: Safeguarding of Nonpublic Personal Information

Subtitle A of Title V of the GLBA addresses the protection from unwarranted disclosure of “nonpublic personal information,” also known as NPPI. (A) The term “nonpublic personal information” means personally identifiable financial information— (i) provided by a consumer to a financial institution; (ii) resulting from any transaction with the consumer or any service performed for the consumer; or (iii) otherwise obtained by the financial institution. 15 U.S.C. § 6809(4) (emphasis added). Practice Note The FTC has taken the position that “any information should be considered financial information if it is requested by a financial institution for the purpose of providing a financial product or service.” 65 Fed. Reg. 33,645, 33,658, 33,680 (May 24, 2000); 16 C.F.R. § 313.3(o)(1) (definition of “personally identifiable financial information”).

So defined, NPPI is protected under Section 6801(b): (b) Financial institutions safeguards. In furtherance of the policy in subsection (a), each agency or authority described in section 6805(a), other than the Bureau of Consumer Financial Protection, shall establish appropriate standards for the financial institutions subject to their jurisdiction relating to administrative, technical, and physical safeguards— (1) to insure the security and confidentiality of customer records and information; (2) to protect against any anticipated threats or hazards to the security or integrity of such records; and (3) to protect against unauthorized access to or use of such records or information which could result in substantial harm or inconvenience to any customer. 15 U.S.C. § 6801(b). In 2002, the FTC promulgated standards for safeguarding customer information at 16 C.F.R. pt. 314 (safeguards rule), added by 67 Fed. Reg. 36,484, 36,493–94 (May 23, 2002). Compare 17 C.F.R. § 248.30 (part of the Securities and Exchange Commission’s Regulation S-P, “Privacy of Consumer Financial Information,” 17 C.F.R. pt. 248, added by 65 Fed. Reg. 40,334, 40,362–72 (June 20, 2000)). Standards for safeguarding customer information. (a) Information security program. You shall develop, implement, and maintain a comprehensive information security program that is written in one or more readily accessible parts and contains administrative, technical, and physical safeguards that are appropriate to your size and complexity, the nature and scope of your activities, and the sensitivity of any customer information at issue. Such safeguards shall include the elements set forth in § 314.4 [Appendix B hereto] and shall be reasonably designed to achieve the objectives of this part, as set forth in paragraph (b) of this section. (b) Objectives. The objectives of section 501(b) of the Act, and of this part, are to: (1) Insure the security and confidentiality of customer information;

(2) Protect against any anticipated threats or hazards to the security or integrity of such information; and (3) Protect against unauthorized access to or use of such information that could result in substantial harm or inconvenience to any customer. 16 C.F.R. § 314.3. Enforcement of the safeguards rule remains within the jurisdiction of the FTC or specific regulators of financial institutions. 15 U.S.C. § 6805(b)(1), added by GLBA, Title V, Subtitle A, § 505, 113 Stat. 1440, as amended by Consumer Financial Protection Act of 2010, Subtitle H, § 1093(4), (5), 124 Stat. 2096–97.
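Because 16 C.F.R. § 314.3(a) requires a written program whose administrative, technical, and physical safeguards are documented, some compliance teams track the program as structured data. The following Python sketch is offered only as an illustration (all field names are hypothetical, not regulatory terms of art); it simply flags safeguard categories for which nothing has been documented:

```python
# Hypothetical sketch: track the written information security program
# required by 16 C.F.R. § 314.3(a) as structured data and flag safeguard
# categories with no documented measures. Field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class SafeguardsProgram:
    written_program_locations: list[str]  # the "readily accessible parts"
    administrative_safeguards: list[str] = field(default_factory=list)
    technical_safeguards: list[str] = field(default_factory=list)
    physical_safeguards: list[str] = field(default_factory=list)

    def gaps(self) -> list[str]:
        """Return program elements with no documented content."""
        names = ("written_program_locations", "administrative_safeguards",
                 "technical_safeguards", "physical_safeguards")
        return [n for n in names if not getattr(self, n)]

program = SafeguardsProgram(
    written_program_locations=["Information Security Policy v3"],
    administrative_safeguards=["annual risk assessment", "employee training"],
    technical_safeguards=["encryption of customer data at rest"],
)
print(program.gaps())  # -> ['physical_safeguards']
```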

(b) Subtitle A: Disclosure of Nonpublic Personal Information

The remaining sections of Subtitle A relate primarily to financial institution disclosures of customer NPPI to unaffiliated third parties. Section 6802 provides two general conditions for such disclosure: notice to the customer, 15 U.S.C. § 6802(a), and the availability of an “opt-out,” 15 U.S.C. § 6802(b). The notice must comply with Section 6803: (a) Disclosure required. At the time of establishing a customer relationship with a consumer, and not less than annually during the continuation of such relationship, a financial institution shall provide a clear and conspicuous disclosure to such consumer, in writing or in electronic form or other form permitted by the regulations prescribed under section 6804 of this title, of such financial institution’s policies and practices with respect to— (1) disclosing nonpublic personal information to affiliates and nonaffiliated third parties, consistent with section 6802 of this title, including the categories of information that may be disclosed; (2) disclosing nonpublic personal information of persons who have ceased to be customers of the financial institution; and (3) protecting the nonpublic personal information of consumers. (b) Regulations. Disclosures required under subsection (a) shall be made in accordance with the regulations prescribed under section 6804 of this title. (c) Information to be included. The disclosure required by subsection (a) shall include— (1) the policies and practices of the institution with respect to disclosing nonpublic personal information to nonaffiliated

third parties, other than agents of the institution, consistent with section 6802 of this title, and including— (A) the categories of persons to whom the information is or may be disclosed, other than the persons to whom the information may be provided pursuant to section 6802(e) of this title; and (B) the policies and practices of the institution with respect to disclosing of nonpublic personal information of persons who have ceased to be customers of the financial institution; (2) the categories of nonpublic personal information that are collected by the financial institution; (3) the policies that the institution maintains to protect the confidentiality and security of nonpublic personal information in accordance with section 6801 of this title; and (4) the disclosures required, if any, under section 1681a(d)(2)(A)(iii) of this title.

15 U.S.C. § 6803, amended by Financial Services Regulatory Relief Act of 2006, Pub. L. No. 109-351, Title VI, § 609, Title VII, § 728, 120 Stat. 1983, 2003 (Oct. 13, 2006) (rearranging Subsections (a)–(c), and adding Subsections (d) and (e)). Section 6804 empowers federal bank supervisory agencies or functional regulators, the secretary of the Treasury, the Securities and Exchange Commission, the CFPB (after its formation in 2010, for large depository institutions and most nonbank financial institutions), and the FTC (thereafter limited to certain automobile dealers) to prescribe implementing regulations, 15 U.S.C. § 6804, amended by Consumer Financial Protection Act of 2010, Subtitle H, § 1093(3), 124 Stat. 2095 (adding CFPB), and similarly assigns enforcement power, except that the FTC would enforce the safeguards rule for CFPB institutions, 15 U.S.C. § 6805, amended by Consumer Financial Protection Act of 2010, Subtitle H, § 1093(4), (5), 124 Stat. 2096 (July 21, 2010) (adding CFPB but limiting its enforcement authority on Section 501, 15 U.S.C. § 6801, safeguards rule, left primarily to FTC). In 2000, the FTC promulgated its financial privacy notice regulations at 16 C.F.R. pt. 313 (financial privacy rule), providing for initial, annual, and revised notices, with a model privacy form added in 2009, 16 C.F.R. §§ 313.1–313.18 and Appendix A (model privacy form), as added by 65 Fed. Reg. 33,677 (May 24, 2000), amended by 74 Fed. Reg. 62,890 (Dec. 1, 2009) (model privacy form), as a safe harbor under the Financial Services Regulatory Relief Act of 2006, Pub. L. No. 109-351, Title VI, § 609, Title VII, § 728, 120 Stat. 1983, 2003 (Oct. 13, 2006), adding Section 503(e), 15 U.S.C. § 6803(e). The Federal Reserve Board promulgated its regulations at 12 C.F.R. pt. 216 (Regulation P). 65 Fed. Reg. 35,206 (June 1, 2000). The Office of Thrift Supervision promulgated its regulations at 12 C.F.R. pt. 573. 65 Fed. Reg. 35,226 (June 1, 2000). The Office of the Comptroller of the Currency promulgated its regulations at 12 C.F.R. pt. 40. 65 Fed. Reg. 35,196 (June 1, 2000). The Federal

Deposit Insurance Corporation promulgated its regulations at 12 C.F.R. pt. 332. 65 Fed. Reg. 35,216 (June 1, 2000). The National Credit Union Administration promulgated its regulations at 12 C.F.R. pt. 716. 65 Fed. Reg. 31,740 (May 18, 2000). The Securities and Exchange Commission promulgated its regulations at 17 C.F.R. pt. 248 (Regulation S-P). 65 Fed. Reg. 40,362 (June 29, 2000). The Commodity Futures Trading Commission promulgated its regulations at 17 C.F.R. pt. 160. 66 Fed. Reg. 21,236 (Apr. 27, 2001). The CFPB promulgated its “Regulation P” at 12 C.F.R. pt. 1016, 76 Fed. Reg. 79,025 (Dec. 21, 2011), largely identical to the FTC financial privacy rule. Compare 12 C.F.R. §§ 1016.1–1016.17 and Appendix A to 16 C.F.R. §§ 313.1–313.18.

(c) Subtitle B: Pretexting

Subtitle B of Title V of the GLBA is addressed to “Fraudulent Access to Financial Information” (also known as “pretexting”) and included as 15 U.S.C. §§ 6821–6827, added by Act of Nov. 12, 1999, Pub. L. No. 106-102, Title V, Subtitle B, §§ 521–527, 113 Stat. 1446–1449. The substantive provision is in Section 6821, which is also known as the GLBA Pretexting Rule: (a) Prohibition on obtaining customer information by false pretenses. It shall be a violation of this subtitle for any person to obtain or attempt to obtain, or cause to be disclosed or attempt to cause to be disclosed to any person, customer information of a financial institution relating to another person— (1) by making a false, fictitious, or fraudulent statement or representation to an officer, employee, or agent of a financial institution; (2) by making a false, fictitious, or fraudulent statement or representation to a customer of a financial institution; or (3) by providing any document to an officer, employee, or agent of a financial institution, knowing that the document is forged, counterfeit, lost, or stolen, was fraudulently obtained, or contains a false, fictitious, or fraudulent statement or representation. (b) Prohibition on solicitation of a person to obtain customer information from financial institution under false pretenses. It shall be a violation of this subtitle to request a person to obtain customer information of a financial institution, knowing that the person will obtain, or attempt to obtain, the information from the institution in any manner described in subsection (a). 15 U.S.C. § 6821(a), (b) (emphasis added). Here “customer information of a financial institution” means “any information maintained by or for a financial institution

which is derived from the relationship between the financial institution and a customer of the financial institution and is identified with the customer.” 15 U.S.C. § 6827(2). The prohibition does not apply to
• law enforcement agencies, 15 U.S.C. § 6821(c);
• financial institutions in the course of testing security procedures or investigating conduct of their agents, 15 U.S.C. § 6821(d);
• insurance institutions investigating fraud, 15 U.S.C. § 6821(e);
• information otherwise “available as a public record filed pursuant to the securities laws,” 15 U.S.C. § 6821(f), referring to Section 3(a)(47) of the Securities Exchange Act of 1934, 15 U.S.C. § 78c(a)(47); or
• state-licensed collectors of child support judgments, 15 U.S.C. § 6821(g).
The FTC is charged with enforcement of the “pretexting” subtitle, except in the cases of banks and credit unions, where their supervisory agencies are charged with enforcement, 15 U.S.C. § 6822, and with establishing and updating guidelines for bank compliance, 15 U.S.C. § 6825. In August 2017, the FTC filed, and simultaneously settled, an administrative complaint against TaxSlayer, an online tax preparation service that had 9,000 accounts “hacked,” alleging violations of the Gramm-Leach-Bliley Privacy Rule (and, after the October 28, 2014, transfer to the CFPB, Regulation P) by failing to provide privacy notices, and of the Safeguards Rule by failing to have a written comprehensive information security program and failing to conduct risk assessments. The stipulated order prohibits continued violations and mandates development of a comprehensive security program, with biennial assessments for ten years and recordkeeping for twenty years. In re TaxSlayer, LLC, File No. 1623063, Complaint & Decision and Order (FTC Aug. 29, 2017), available at http://www.ftc.gov/system/files/documents/cases/1623063_taxslayer_complaint.pdf and http://www.ftc.gov/system/files/documents/cases/1623063_taxslayer_decision_and_order.pdf.

§ 8.3.3 Health Insurance Portability and Accountability Act (HIPAA)

The FTC has been given a role in protecting data security in the health-care field. As part of the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Pub. L. No. 104-191, 110 Stat. 1936–2103 (Aug. 21, 1996), Congress provided, in Subtitle F, for “Administrative Simplification,” Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104-191, Title II, Subtitle F (Administrative Simplification), §§ 261–264, 110 Stat. 1936, 2021–34 (Aug. 21, 1996), amended by Genetic Information Nondiscrimination Act of 2008, Pub. L. No. 110-233, Title I, § 105(a), 122 Stat. 881, 903 (May 21, 2008) (adding 42 U.S.C. § 1320d-9), amended by American Recovery and Reinvestment Act of 2009, Pub. L. No. 111-5, Div. A, Title XIII (Health Information Technology for Economic and Clinical Health Act, or HITECH Act), 123 Stat. 115, 242–79 (Feb. 17, 2009), amended by Patient Protection and

Affordable Care Act, Pub. L. No. 111-148, Title I, Subtitle B, § 1104(b), Title X, Subtitle A, § 10109, 124 Stat. 119, 147, 915 (Mar. 23, 2010) (streamlining, standards, operating rules), among other things, to improve “the efficiency and effectiveness of the health care system, by encouraging the development of a health information system through the establishment of standards and requirements for the electronic transmission of certain health information.” HIPAA, Title II, Subtitle F, § 261, 110 Stat. 1936, 2021, codified at 42 U.S.C. § 1320d note (emphasis added). (This expansion from a focus on electronic transmissions survived a constitutional challenge in South Carolina Medical Association v. Thompson, 327 F.3d 346 (4th Cir. 2003).) The Department of Health and Human Services (HHS) issued its final rule on “Standards for Privacy of Individually Identifiable Health Information,” 45 C.F.R. pts. 160 (General Administrative Requirements) and 164 (Security and Privacy), on December 28, 2000. 65 Fed. Reg. 82,462–829 (Dec. 28, 2000). In an embrace of the importance of privacy to an efficient health-care system, 65 Fed. Reg. at 82,464–69 (“Need for a National Health Privacy Framework”), the HHS expanded the reach of the privacy regulation: “In this final rule we expand the definition of protected health information to encompass all individually identifiable health information transmitted or maintained by a covered entity, regardless of form. Specifically, we delete the conditions for individually identifiable health information to be ‘electronically maintained’ or ‘electronically transmitted’ and the corresponding definitions of those terms.” 65 Fed. Reg. at 82,496. The Part 160 administrative rules apply to “covered entities,” defined to include health plans, health-care clearinghouses (processors of health information), and health-care providers who transmit health information in electronic form in connection with a covered transaction, 45 C.F.R. §§ 160.102, 160.103, and the HHS is empowered to receive complaints from individuals and to investigate and sanction (with civil penalties) covered entities (originally the only directly regulated parties; business associates were added in 2013, as discussed below), 45 C.F.R. §§ 160.306–160.312. Part 164 contains the “Security and Privacy” part of the administrative simplification rules, also known as the “privacy rule.” 45 C.F.R. pt. 164, subpt. E, §§ 164.500–164.534, as added by 65 Fed. Reg. 82,462, 82,805 (Dec. 28, 2000), amended by 66 Fed. Reg. 12,434 (Feb. 26, 2001) (compliance dates), 67 Fed. Reg. 53,182, 53,267 (Aug. 14, 2002) (among others, refinements on prohibited uses, such as genetic information and certain sales), 68 Fed. Reg. 8,334, 8,381 (Feb. 20, 2003), 78 Fed. Reg. 5,566, 5,696 (Jan. 25, 2013) (application to business associates), 78 Fed. Reg. 34,264, 34,266 (June 7, 2013) (technical corrections). The original substantive rules were set forth in Subpart E, “Privacy of Individually Identifiable Health Information,” prescribing standards and implementation specifications for uses and disclosures of protected health information, 45 C.F.R. §§ 164.502 (general rules), 164.504 (organizational requirements), 164.506 (treatment), 164.508 (authorization required, e.g., psychotherapy notes), 164.510 (opportunity to object, e.g., facility directory), 164.512

(no authorization or opportunity to object, e.g., public health), and 164.514 (deidentification of information), and the “classic” elements of privacy:
• notice, 45 C.F.R. § 164.520;
• consent, 45 C.F.R. § 164.522 (right to request privacy protection);
• access, 45 C.F.R. § 164.524;
• amendment, 45 C.F.R. § 164.526; and
• accounting, 45 C.F.R. § 164.528.
The original rule made the covered entities responsible for disclosures to “business associates,” defined as persons, other than members of the covered entity’s workforce, who “perform[ ], or assist[ ] in the performance of [a] function or activity involving the use or disclosure of individually identifiable health information, including claims processing or administration, data analysis, processing or administration, utilization review, quality assurance, billing, benefit management, practice management, and repricing” or who provide “legal, actuarial, accounting, consulting, data aggregation . . . , management, administrative, accreditation, or financial services . . . where the provision of the service involves the disclosure of individually identifiable health information.” 45 C.F.R. § 160.103. “A covered entity may disclose protected health information to a business associate and may allow a business associate to create or receive protected health information on its behalf, if the covered entity obtains satisfactory assurance that the business associate will appropriately safeguard the information.” 45 C.F.R. § 164.502(e)(1)(i). This is the general provision allowing covered entities to use contractors. Organizational requirements, including business associate contract requirements, are detailed in 45 C.F.R. § 164.504. Subpart C was added to Part 164 in 2003, providing “Security Standards for the Protection of Electronic Protected Health Information” (ePHI). 45 C.F.R. pt. 164, subpt. C, §§ 164.302–164.318 and Appendix A, added by 68 Fed. Reg. 8,334, 8,376–81 (Feb. 20, 2003), amended by 78 Fed. Reg. 5,566, 5,694 (Jan. 25, 2013) (adding business associates), 78 Fed. Reg. 34,264, 34,266 (June 7, 2013) (technical corrections). Standards for safeguards are provided with both “required” implementation specifications and “addressable” ones, which call for an assessment of reasonableness and appropriateness in the circumstances, with documentation of why any addressable specification is not implemented. 45 C.F.R. § 164.306(d). Administrative safeguards are set forth in standards and implementation specifications for risk management and workforce discipline. 45 C.F.R. § 164.308. Physical safeguards include facility, workstation, and media security. 45 C.F.R. § 164.310. Technical safeguards include access control and encryption (addressable). 45 C.F.R. § 164.312. Organizational requirements include contracts with business associates (and business associates, in turn, with their subcontractors) that comply with the security standards of Subpart C and provide for reporting of security incidents. 45 C.F.R. § 164.314. The policies and procedures must be documented and the documentation retained for six years. 45 C.F.R. § 164.316. Appendix A to Subpart C tabulates the required and addressable implementation specifications.
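The “required” versus “addressable” distinction lends itself to a simple decision rule: an addressable specification may be left unimplemented only with a documented assessment (and, where reasonable, an equivalent alternative measure). The following Python sketch is a hypothetical illustration of that logic, not a statement of HHS methodology, and its names are invented for the example:

```python
# Hypothetical sketch of the required/addressable logic of 45 C.F.R.
# § 164.306(d): required specifications must be implemented; addressable
# ones may be skipped only with a documented assessment.
from dataclasses import dataclass

@dataclass
class ImplementationSpec:
    name: str
    required: bool        # True = "required"; False = "addressable"
    implemented: bool
    assessment: str = ""  # documented reason implementation is not
                          # reasonable and appropriate, if skipped
    alternative: str = ""  # equivalent alternative measure, if any

def satisfies_rule(spec: ImplementationSpec) -> bool:
    if spec.implemented:
        return True
    if spec.required:
        return False  # a required specification cannot be skipped
    return bool(spec.assessment)  # addressable: documentation needed

encryption = ImplementationSpec(
    name="encryption of ePHI at rest (addressable)",
    required=False,
    implemented=False,
    assessment="legacy device cannot encrypt; compensating controls used",
    alternative="locked server room with access logging",
)
print(satisfies_rule(encryption))  # True: the assessment is documented
```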


On February 17, 2009, the American Recovery and Reinvestment Act of 2009, Pub. L. No. 111-5, 123 Stat. 115 (Feb. 17, 2009), was signed into law. Division A, Title XIII (sometimes referred to as the HITECH Act) provides for the development of health information technology, with Subtitle D addressing privacy. 42 U.S.C. §§ 17921–17954, added by HITECH Act §§ 13400–13424, 123 Stat. 258–79. Part 1 on “Improved Privacy Provisions and Security Provisions” extended direct HHS regulation to “business associates” of covered entities. Sections 164.308, 164.310, 164.312, and 164.316 [security standards under Subpart C] of title 45, Code of Federal Regulations, shall apply to a business associate of a covered entity in the same manner that such sections apply to the covered entity. The additional requirements of this title that relate to security and that are made applicable with respect to covered entities shall also be applicable to such a business associate and shall be incorporated into the business associate agreement between the business associate and the covered entity. 42 U.S.C. § 17931(a) (emphasis added), added by HITECH Act, Subtitle D, Part 1, § 13401, 123 Stat. 260. (Previously, business associates were monitored only by the covered entity under 45 C.F.R. § 164.502(e).) Each organization, with respect to a covered entity, that provides data transmission of protected health information to such entity (or its business associate) and that requires access on a routine basis to such protected health information, such as a Health Information Exchange Organization, Regional Health Information Organization, E-prescribing Gateway, or each vendor that contracts with a covered entity to allow that covered entity to offer a personal health record to patients as part of its electronic health record, is required to enter into a written contract (or other written arrangement) described in section 164.502(e)(2) of title 45, Code of Federal Regulations and a written contract (or other arrangement) described in section 164.308(b) of such title, with such entity and shall be treated as a business associate of the covered entity for purposes of the provisions of this subtitle and subparts C and E of part 164 of title 45, Code of Federal Regulations, as such provisions are in effect as of [the date of enactment of this title]. 42 U.S.C. § 17938 (emphasis added), added by HITECH Act, Subtitle D, Part 1, § 13408, 123 Stat. 271. 45 C.F.R. pts. 160 and 164 were amended January 25, 2013, to include HHS direct jurisdiction over business associates. 78 Fed. Reg. 5,566, 5,694 (Jan. 25, 2013). New 42 U.S.C. § 17932 requires notification by a covered entity “that accesses, maintains, retains, modifies, records, stores, destroys, or otherwise holds, uses, or discloses” unsecured PHI, of the discovery of a breach to the affected individuals and

by a business associate to the covered entity. 42 U.S.C. § 17932(a), (b), added by HITECH Act, Subtitle D, Part 1, § 13402, 123 Stat. 260–63. It follows the broad template of state laws requiring notification of breach of security of personally identifiable information (PII), but also includes notification of the media where more than 500 residents of the jurisdiction are believed to be affected by the breach. 42 U.S.C. § 17932(e)(2). Practice Note Notification to HHS is to be provided immediately if 500 or more individuals are involved; otherwise, breaches may be reported in an annual log. 42 U.S.C. § 17932(e)(3). The notification, “to the extent possible,” should include “[a] brief description of what happened, including the date of the breach and the date of the discovery of the breach.” 42 U.S.C. § 17932(f)(1).
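The notice triggers described above reduce to a small amount of threshold logic. The following Python sketch simply encodes the thresholds of 42 U.S.C. § 17932 as summarized in this section; it is an illustration, not legal advice, and the function and variable names are hypothetical:

```python
# Hypothetical sketch of the HITECH breach-notice triggers summarized
# above (42 U.S.C. § 17932): affected individuals are always notified;
# media notice is added where more than 500 residents of a jurisdiction
# are believed affected; HHS is notified immediately at 500 or more
# individuals, and otherwise through the annual log.
def breach_notices(total_affected: int,
                   affected_by_jurisdiction: dict[str, int]) -> list[str]:
    notices = ["affected individuals"]
    for jurisdiction, count in affected_by_jurisdiction.items():
        if count > 500:
            notices.append(f"media in {jurisdiction}")
    if total_affected >= 500:
        notices.append("HHS (immediately)")
    else:
        notices.append("HHS (annual log)")
    return notices

print(breach_notices(1200, {"MA": 900, "NH": 300}))
# -> ['affected individuals', 'media in MA', 'HHS (immediately)']
```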

Business associates are required to notify the covered entity of a breach, 42 U.S.C. § 17932(b), and are subject to civil and criminal penalties for violations, 42 U.S.C. § 17934(c). In August 2009, the HHS promulgated a new Subpart D, “Notification in the Case of Breach of Unsecured Protected Health Information,” to Part 164 of Title 45 of the Code of Federal Regulations. 45 C.F.R. pt. 164, subpt. D, §§ 164.400–164.414, added by 74 Fed. Reg. 42,740, 42,767–69 (Aug. 24, 2009) (interim rule), amended by 78 Fed. Reg. 5,566, 5,695 (Jan. 25, 2013). The HITECH Act further extended regulation, in the development of health-care information infrastructure, to a “vendor of personal health records” (PHRs), meaning “an entity, other than a covered entity, . . . that offers or maintains a personal health record,” 42 U.S.C. § 17921(18) (“Personal health record” or “PHR” means “an electronic record of PHR identifiable health information [individually identifiable health information] on an individual that can be drawn from multiple sources and that is managed, shared, and controlled by or primarily for the individual,” 42 U.S.C. § 17921(11)). For these vendors and the third-party service providers to them, new 42 U.S.C. § 17937 imposes a “temporary breach notification requirement” (pending general breach notification legislation, which has not occurred), requiring vendors to notify affected individuals and the FTC of “the discovery of a breach of security of unsecured PHR identifiable health information,” and requiring third-party service providers to notify the vendors they serve. 42 U.S.C. § 17937(a), (b), added by HITECH Act, Subtitle D, Part 1, § 13407, 123 Stat. 269–71 (Feb. 17, 2009). Violations are treated as an unfair and deceptive act or practice under the FTCA, enforced by the FTC. 42 U.S.C. § 17937(e). The FTC was directed to promulgate interim regulations. 42 U.S.C. § 17937(g). On August 17, 2009, the FTC issued these regulations at 16 C.F.R. pt. 318 (Health Breach Notification Rule), 74 Fed. Reg. 42,962, 42,980–81 (Aug. 25, 2009). See Exhibit 8C.


§ 8.4 OTHER PRIVACY STATUTES ADMINISTERED OR ENFORCED BY THE FTC

§ 8.4.1 Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act, 15 U.S.C. §§ 7701–7713

Responding to the turn-of-the-millennium outcry over unsolicited commercial e-mail, or “spam,” but also to commercial interests in forestalling the more stringent restrictions being enacted by several states, Congress enacted the “Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003” (CAN-SPAM), which is primarily codified at 15 U.S.C. §§ 7701–7713. Pub. L. No. 108-187, 117 Stat. 2699–719 (Dec. 16, 2003). (The e-mail fraud provision, Section 4 of the CAN-SPAM Act, is codified at 18 U.S.C. § 1037.) Practice Note The e-mail fraud crimes at 18 U.S.C. § 1037 are
• taking over a “protected computer” (defined under the Computer Fraud and Abuse Act) to send spam;
• using a protected computer to relay or retransmit spam with the intent of deceiving or misleading as to the origin;
• materially falsifying header information in spam;
• registering, using materially false information about the actual registrant, five or more e-mail accounts or two or more domain names and sending spam from them; and
• falsely representing oneself to be the registrant of five or more Internet Protocol addresses and sending spam from them.

The main substantive provisions are as follows: (1) Prohibition of false or misleading transmission information. It is unlawful for any person to initiate the transmission, to a protected computer [18 U.S.C. § 1030(e)(2)(B) (Computer Fraud and Abuse Act)], of a commercial electronic mail message, or a transactional or relationship message, that contains, or is accompanied by, header information that is materially false or materially misleading. For purposes of this paragraph— (A) header information that is technically accurate but includes an originating electronic mail address, domain name, or Internet Protocol address the access to which for purposes of initiating the message was obtained by means of false or fraudulent pretenses or representations shall be considered materially misleading;


(B) a “from” line . . . that accurately identifies any person who initiated the message shall not be considered materially false or materially misleading; and (C) header information shall be considered materially misleading if it fails to identify accurately a protected computer used to initiate the message because the person initiating the message knowingly uses another protected computer to relay or retransmit the message for purposes of disguising its origin. (2) Prohibition of deceptive subject headings. It is unlawful for any person to initiate the transmission to a protected computer of a commercial electronic mail message if such person has actual knowledge, or knowledge fairly implied on the basis of objective circumstances, that a subject heading of the message would be likely to mislead a recipient, acting reasonably under the circumstances, about a material fact regarding the contents or subject matter of the message (consistent with the criteria used in enforcement of section 45 of this title). (3) Inclusion of return address or comparable mechanism in commercial electronic mail. (A) In general. It is unlawful for any person to initiate the transmission to a protected computer of a commercial electronic mail message that does not contain a functioning return electronic mail address or other Internet-based mechanism, clearly and conspicuously displayed, that— (i) a recipient may use to submit, in a manner specified in the message, a reply electronic mail message or other form of Internet-based communication requesting not to receive future commercial electronic mail messages from that sender at the electronic mail address where the message was received; and (ii) remains capable of receiving such messages or communications for no less than 30 days after the transmission of the original message. (B) More detailed options possible . . . (C) Temporary inability to receive messages or process requests . . . (4) Prohibition of transmission of commercial electronic mail after objection. (A) In general.


If a recipient makes a request using a mechanism provided pursuant to paragraph (3) not to receive some or any commercial electronic mail messages from such sender, then it is unlawful [to continue to send or to transfer e-mail address of the recipient]. (5) Inclusion of identifier, opt-out, and physical address in commercial electronic mail. (A) It is unlawful for any person to initiate the transmission of any commercial electronic mail message to a protected computer unless the message provides— (i) clear and conspicuous identification that the message is an advertisement or solicitation; (ii) clear and conspicuous notice of the opportunity under paragraph (3) to decline to receive further commercial electronic mail messages from the sender; and (iii) a valid physical postal address of the sender. (b) Aggravated violations relating to commercial electronic mail. (1) Address harvesting and dictionary attacks. (A) In general. It is unlawful for any person to initiate the transmission, to a protected computer, of a commercial electronic mail message that is unlawful under subsection (a), or to assist in the origination of such message through the provision or selection of addresses to which the message will be transmitted, if such person had actual knowledge, or knowledge fairly implied on the basis of objective circumstances, that— (i) the electronic mail address of the recipient was obtained using an automated means from an Internet website or proprietary online service operated by another person, and such website or online service included, at the time the address was obtained, a notice stating that the operator of such website or online service will not give, sell, or otherwise transfer addresses maintained by such website or online service to any other party for the purposes of initiating, or enabling others to initiate, electronic mail messages; or (ii) the electronic mail address of the recipient was obtained using an automated means that generates possible electronic mail addresses by combining names, letters, or numbers into numerous permutations. . . . (2) Automated creation of multiple electronic mail accounts.

It is unlawful for any person to use scripts or other automated means to register for multiple electronic mail accounts or online user accounts from which to transmit to a protected computer, or enable another person to transmit to a protected computer, a commercial electronic mail message that is unlawful under subsection (a). (3) Relay or retransmission through unauthorized access. It is unlawful for any person knowingly to relay or retransmit a commercial electronic mail message that is unlawful under subsection (a) from a protected computer or computer network that such person has accessed without authorization. 15 U.S.C. § 7704(a), (b) (emphasis added). The FTC is given supplemental rule-making authority to modify the 10-business-day period for honoring opt-out requests and to add to the aggravated offenses, 15 U.S.C. § 7704(c), as well as general rule-making authority, 15 U.S.C. § 7711, and enforcement authority, 15 U.S.C. § 7706. Practice Note The FTC was also to provide to Congress “a plan and timetable for establishing a nationwide marketing Do-Not-E-Mail registry.” 15 U.S.C. § 7708. It was also to work with the Federal Communications Commission “to protect consumers from unwanted mobile service commercial messages.” 15 U.S.C. § 7712.

The FTC promulgated its CAN-SPAM Rule at 16 C.F.R. pt. 316 in 2005 and amended it in 2008. 16 C.F.R. §§ 316.1–316.6, added by 70 Fed. Reg. 3,110, 3,127 (Jan. 19, 2005), amended by 73 Fed. Reg. 29,654, 29,677 (May 21, 2008). The rule did not restate the Act. Section 316.3 provides an explanation of determining the “primary purpose” of an e-mail as “commercial.” 16 C.F.R. § 316.3. Section 316.4 requires that the subject line of any e-mail containing sexually oriented material include the warning “SEXUALLY EXPLICIT.” 16 C.F.R. § 316.4. Section 316.5 prohibits charging a fee or imposing other requirements on recipients who wish to opt out. 16 C.F.R. § 316.5.
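Counsel reviewing bulk-mail programs sometimes reduce the Act's disclosure elements to a pre-send checklist. The following Python sketch encodes the three disclosures of 15 U.S.C. § 7704(a)(5) and the subject-line warning of 16 C.F.R. § 316.4 as described above; the message fields are hypothetical, and the sketch is illustrative only:

```python
# Hypothetical pre-send checklist for the CAN-SPAM elements described
# above: identification as an advertisement, an opt-out notice, and a
# valid physical postal address (15 U.S.C. § 7704(a)(5)), plus the
# subject-line warning for sexually oriented material (16 C.F.R. § 316.4).
def can_spam_issues(msg: dict) -> list[str]:
    issues = []
    if not msg.get("identified_as_ad"):
        issues.append("no clear identification as an advertisement")
    if not msg.get("opt_out_notice"):
        issues.append("no clear and conspicuous opt-out notice")
    if not msg.get("postal_address"):
        issues.append("no valid physical postal address of the sender")
    if msg.get("sexually_oriented") and \
            "SEXUALLY EXPLICIT" not in msg.get("subject", ""):
        issues.append("missing 'SEXUALLY EXPLICIT' subject-line warning")
    return issues

print(can_spam_issues({
    "subject": "Spring sale",
    "identified_as_ad": True,
    "opt_out_notice": True,
    "postal_address": "",
}))  # -> ['no valid physical postal address of the sender']
```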

§ 8.4.2 Telemarketing and Consumer Fraud and Abuse Prevention Act

In 1994, Congress enacted the Telemarketing and Consumer Fraud and Abuse Prevention Act (TCFAPA). 15 U.S.C. §§ 6101–6108, added by Telemarketing and Consumer Fraud and Abuse Prevention Act (TCFAPA), Pub. L. No. 103-297, § 2, 108 Stat. 1545 (Aug. 16, 1994), amended by Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act, Pub. L. No. 107-56, Title X (Miscellaneous), § 1101 (Crimes Against Charitable Americans Act of 2001), 115 Stat. 272, 396 (Oct. 26, 2001), Consumer Financial Protection Act of 2010, Subtitle H, § 1100C, 124 Stat. 2110–11 (CFPB to be consulted by FTC in rule-making and given enforcement power under TCFAPA § 6(d)). The TCFAPA required the FTC to do the following:

(1) The Commission shall prescribe rules prohibiting deceptive telemarketing acts or practices and other abusive telemarketing acts or practices. (2) The Commission shall include in such rules respecting deceptive telemarketing acts or practices a definition of deceptive telemarketing acts or practices which may include acts or practices of entities or individuals that assist or facilitate deceptive telemarketing, including credit card laundering. (3) The Commission shall include in such rules respecting other abusive telemarketing acts or practices— (A) a requirement that telemarketers may not undertake a pattern of unsolicited telephone calls which the reasonable consumer would consider coercive or abusive of such consumer’s right to privacy, (B) restrictions on the hours of the day and night when unsolicited telephone calls can be made to consumers, (C) a requirement that any person engaged in telemarketing for the sale of goods or services shall promptly and clearly disclose to the person receiving the call that the purpose of the call is to sell goods or services and make such other disclosures as the Commission deems appropriate, including the nature and price of the goods and services, and (D) a requirement that any person engaged in telemarketing for the solicitation of charitable contributions, donations, or gifts of money or any other thing of value, shall promptly and clearly disclose to the person receiving the call that the purpose of the call is to solicit charitable contributions, donations, or gifts, and make such other disclosures as the Commission considers appropriate, including the name and mailing address of the charitable organization on behalf of which the solicitation is made. 15 U.S.C. § 6102(a). The FTC is charged with rule-making authority, 15 U.S.C. § 6102(b), amended to require the FTC to consult with the CFPB “regarding the consistency of a proposed rule with standards, purposes, or objectives administered,” and with enforcement, 15 U.S.C. §§ 6105 (general FTCA power), 6107 (power to seek criminal contempt under false advertisement power, FTCA § 13(a), 15 U.S.C. § 53(a)). (The CFPB is given enforcement power under TCFAPA § 6(d), 15 U.S.C. § 6105(d), with respect to the offering or provision of a consumer financial product or service subject to the Consumer Financial Protection Act of 2010). Pursuant to the TCFAPA, the FTC promulgated the Telemarketing Sales Rule (TSR) at 16 C.F.R. pt. 310. 16 C.F.R. §§ 310.1–310.6, added by 60 Fed. Reg. 40,843, 43,865 (Aug. 16, 1995), amended by 68 Fed. Reg. 4,580, 4,670 (Jan. 29, 2003), 75

Fed. Reg. 48,458, 48,516 (Aug. 10, 2010), 80 Fed. Reg. 77,520, 77,558 (Dec. 14, 2015). The TSR has two groups of proscriptions: Deceptive telemarketing acts or practices. (a) Prohibited deceptive telemarketing acts or practices. It is a deceptive telemarketing act or practice and a violation of this Rule for any seller or telemarketer to engage in the following conduct: (1) [Mandatory disclosures.] Before a customer consents to pay for goods or services offered, failing to disclose truthfully, in a clear and conspicuous manner, the following material information: (i) The total costs to purchase, receive, or use, and the quantity of, any goods or services that are the subject of the sales offer; (ii) All material restrictions, limitations, or conditions to purchase, receive, or use the goods or services that are the subject of the sales offer; (iii) [details of refund policy]; (iv) In any prize promotion, the odds of being able to receive the prize, and, if the odds are not calculable in advance, the factors used in calculating the odds; that no purchase or payment is required to win a prize or to participate in a prize promotion and that any purchase or payment will not increase the person’s chances of winning; and the no-purchase/no-payment method of participating in the prize promotion with either instructions on how to participate or an address or local or tollfree telephone number to which customers may write or call for information on how to participate; (v) All material costs or conditions to receive or redeem a prize that is the subject of the prize promotion; (vi) In the sale of any goods or services represented to protect, insure, or otherwise limit a customer’s liability in the event of unauthorized use of the customer’s credit card, the limits on a cardholder’s liability for unauthorized use of a credit card pursuant to 15 U.S.C. 1643; (vii) If the offer includes a negative option feature, all material terms and conditions of the negative option feature, including, but not limited to, the fact that the customer’s account will be charged unless the customer takes an affirmative action to avoid the charge(s), the date(s) the charge(s) will be submitted for payment, and the specific steps the customer must take to avoid the charge(s); and


(viii) In the sale of any debt relief service: [details on time and requirements for settlement with creditors and consequences of failure to make timely payments]. (2) [Proscriptions against] [m]isrepresenting, directly or by implication, in the sale of goods or services any of the following material information: (i) The total costs to purchase, receive, or use, and the quantity of, any goods or services that are the subject of a sales offer; (ii) Any material restriction, limitation, or condition to purchase, receive, or use goods or services that are the subject of a sales offer; (iii) Any material aspect of the performance, efficacy, nature, or central characteristics of goods or services that are the subject of a sales offer; (iv) Any material aspect of the nature or terms of the seller’s refund, cancellation, exchange, or repurchase policies; (v) Any material aspect of a prize promotion including, but not limited to, the odds of being able to receive a prize, the nature or value of a prize, or that a purchase or payment is required to win a prize or to participate in a prize promotion; (vi) Any material aspect of an investment opportunity including, but not limited to, risk, liquidity, earnings potential, or profitability; (vii) A seller’s or telemarketer’s affiliation with, or endorsement or sponsorship by, any person or government entity; (viii) That any customer needs offered goods or services to provide protections a customer already has pursuant to 15 U.S.C. 1643 [statutory limitation on credit card liability]; (ix) Any material aspect of a negative option feature including, but not limited to, the fact that the customer’s account will be charged unless the customer takes an affirmative action to avoid the charge(s), the date(s) the charge(s) will be submitted for payment, and the specific steps the customer must take to avoid the charge(s); or (x) Any material aspect of any debt relief service, including, but not limited to, the amount of money or the percentage of the debt amount that a customer may save by using such service; the amount of time necessary to achieve the represented results; the amount of money or the percentage of each outstanding debt that the customer must accumulate before the provider of the debt relief service will initiate attempts with the customer’s creditors or debt collectors or make a bona fide

offer to negotiate, settle, or modify the terms of the customer’s debt; the effect of the service on a customer’s creditworthiness; the effect of the service on collection efforts of the customer’s creditors or debt collectors; the percentage or number of customers who attain the represented results; and whether a debt relief service is offered or provided by a non-profit entity. (3) Causing billing information to be submitted for payment, or collecting or attempting to collect payment for goods or services or a charitable contribution, directly or indirectly, without the customer’s or donor’s express verifiable authorization, except when the method of payment used is a credit card subject to protections of the Truth in Lending Act and Regulation Z, or a debit card subject to the protections of the Electronic Fund Transfer Act and Regulation E. Such authorization shall be deemed verifiable if any of the following means is employed: (i) Express written authorization . . . includes the customer’s or donor’s signature; (ii) Express oral authorization which is audio-recorded . . . which evidences clearly both the customer’s or donor’s authorization of payment for the goods or services or charitable contribution that are the subject of the telemarketing transaction and the customer’s or donor’s receipt of all of the following information: [details of charges . . . with sufficient specificity such that the customer or donor understands what account will be used to collect payment and a telephone number for customer or donor inquiry that is answered during normal business hours] or (iii) Written confirmation of the transaction, identified in a clear and conspicuous manner as such on the outside of the envelope, sent to the customer or donor via first class mail prior to the submission for payment of the customer’s or donor’s billing information, and that includes all of the information contained in §§ 310.3(a)(3)(ii)(A)-(G) and a clear and conspicuous statement of the procedures by which the customer or donor can obtain a refund from the seller or telemarketer or charitable organization in the event the confirmation is inaccurate; provided, however, that this means of authorization shall not be deemed verifiable in instances in which goods or services are offered in a transaction involving a free-to-pay conversion and preacquired account information. (4) Making a false or misleading statement to induce any person to pay for goods or services or to induce a charitable contribution.

(b) Assisting and facilitating. It is a deceptive telemarketing act or practice and a violation of this Rule for a person to provide substantial assistance or support to any seller or telemarketer when that person knows or consciously avoids knowing that the seller or telemarketer is engaged in any act or practice that violates §§ 310.3(a), (c) or (d), or § 310.4 of this Rule. (c) Credit card laundering. . . . (d) Prohibited deceptive acts or practices in the solicitation of charitable contributions. It is a fraudulent charitable solicitation, a deceptive telemarketing act or practice, and a violation of this Rule for any telemarketer soliciting charitable contributions to misrepresent, directly or by implication, any of the following material information: (1) The nature, purpose, or mission of any entity on behalf of which a charitable contribution is being requested; (2) That any charitable contribution is tax deductible in whole or in part; (3) The purpose for which any charitable contribution will be used; (4) The percentage or amount of any charitable contribution that will go to a charitable organization or to any particular charitable program; (5) Any material aspect of a prize promotion including, but not limited to: the odds of being able to receive a prize; the nature or value of a prize; or that a charitable contribution is required to win a prize or to participate in a prize promotion; or (6) A charitable organization’s or telemarketer’s affiliation with, or endorsement or sponsorship by, any person or government entity. 16 C.F.R. § 310.3 (emphasis added, notes omitted), added by 60 Fed. Reg. 40,843, 43,865 (Aug. 16, 1995), amended by 68 Fed. Reg. 4,580, 4,670 (Jan. 29, 2003), 75 Fed. Reg. 48,458, 48,516 (Aug. 10, 2010), 80 Fed. Reg. 77,520, 77,559 (Dec. 14, 2015). Abusive telemarketing acts or practices. (a) Abusive conduct generally. It is an abusive telemarketing act or practice and a violation of this Rule for any seller or telemarketer to engage in the following conduct: (1) Threats, intimidation, or the use of profane or obscene language;


(2) Requesting or receiving payment of any fee or consideration for goods or services represented to remove derogatory information from, or improve, a person’s credit history, credit record, or credit rating until: (i) The time frame in which the seller has represented all of the goods or services will be provided to that person has expired; and (ii) The seller has provided the person with documentation in the form of a consumer report from a consumer reporting agency demonstrating that the promised results have been achieved, such report having been issued more than six months after the results were achieved. . . . ;

(3) Requesting or receiving payment of any fee or consideration from a person for goods or services represented to recover or otherwise assist in the return of money or any other item of value paid for by, or promised to, that person in a previous telemarketing transaction, until seven (7) business days after such money or other item is delivered to that person. This provision shall not apply to goods or services provided to a person by a licensed attorney;

(4) Requesting or receiving payment of any fee or consideration in advance of obtaining a loan or other extension of credit when the seller or telemarketer has guaranteed or represented a high likelihood of success in obtaining or arranging a loan or other extension of credit for a person;

(5) (i) Requesting or receiving payment of any fee or consideration for any debt relief service until and unless: [at least one debt settled and customer has made a payment pursuant to the settlement and the fee for the service, if renegotiated, bears proportional relationship].

(6) Disclosing or receiving, for consideration, unencrypted consumer account numbers for use in telemarketing; provided, however, that this paragraph shall not apply to the disclosure or receipt of a customer’s or donor’s billing information to process a payment for goods or services or a charitable contribution pursuant to a transaction;

(7) Causing billing information to be submitted for payment, directly or indirectly, without the express informed consent of the customer or donor. In any telemarketing transaction, the seller or telemarketer must obtain the express informed consent of the customer or donor to be charged for the goods or services or charitable contribution and to be charged using the identified account. In any telemarketing transaction involving preacquired account information, the requirements in paragraphs


(a)(7)(i) through (ii) of this section must be met to evidence express informed consent.

(i) In any telemarketing transaction involving preacquired account information and a free-to-pay conversion feature, the seller or telemarketer must: [obtain at a minimum, the last four (4) digits of the account number to be charged, and, make and maintain an audio recording of the entire telemarketing transaction.]

(ii) In any other telemarketing transaction involving preacquired account information not described in paragraph (a)(7)(i) of this section, the seller or telemarketer must: [at a minimum, identify the account to be charged with sufficient specificity to understand what account will be charged; and express agreement to be charged];

(8) Failing to transmit or cause to be transmitted the telephone number, and, when made available by the telemarketer’s carrier, the name of the telemarketer, to any caller identification service in use by a recipient of a telemarketing call; provided that it shall not be a violation to substitute (for the name and phone number used in, or billed for, making the call) the name of the seller or charitable organization on behalf of which a telemarketing call is placed, and the seller’s or charitable organization’s customer or donor service telephone number, which is answered during regular business hours.

(9) Creating or causing to be created, directly or indirectly, a remotely created payment order as payment for goods or services offered or sold through telemarketing or as a charitable contribution solicited or sought through telemarketing; or

(10) Accepting from a customer or donor, directly or indirectly, a cash-to-cash money transfer or cash reload mechanism as payment for goods or services offered or sold through telemarketing or as a charitable contribution solicited or sought through telemarketing.

(b) Pattern of calls. (1) It is an abusive telemarketing act or practice and a violation of this Rule for a telemarketer to engage in, or for a seller to cause a telemarketer to engage in, the following conduct:

(i) Causing any telephone to ring, or engaging any person in telephone conversation, repeatedly or continuously with intent to annoy, abuse, or harass any person at the called number;

(ii) Denying or interfering in any way, directly or indirectly, with a person’s right to be placed on any registry of names and/or telephone numbers of persons who do not wish to receive


outbound telephone calls established to comply with § 310.4(b)(1)(iii)(A) of this section, including, but not limited to, harassing any person who makes such a request; hanging up on that person; failing to honor the request; requiring the person to listen to a sales pitch before accepting the request; assessing a charge or fee for honoring the request; requiring a person to call a different number to submit the request; and requiring the person to identify the seller making the call or on whose behalf the call is made;

(iii) Initiating any outbound telephone call to a person when: (A) that person previously has stated that he or she does not wish to receive an outbound telephone call made by or on behalf of the seller whose goods or services are being offered or made on behalf of the charitable organization for which a charitable contribution is being solicited; or (B) that person’s telephone number is on the “do-not-call” registry, maintained by the Commission, of persons who do not wish to receive outbound telephone calls to induce the purchase of goods or services unless the seller [has obtained the express agreement, in writing, or has an established business relationship]; or

(iv) Abandoning any outbound telephone call. An outbound telephone call is “abandoned” under this section if a person answers it and the telemarketer does not connect the call to a sales representative within two (2) seconds of the person’s completed greeting.

(v) Initiating any outbound telephone call that delivers a prerecorded message, other than a prerecorded message permitted for compliance with the call abandonment safe harbor in § 310.4(b)(4)(iii), unless: (A) in any such call to induce the purchase of any good or service, the seller has obtained from the recipient of the call an express agreement, in writing [conditions] and (B) In any such call to induce the purchase of any good or service, or to induce a charitable contribution from a member of, or previous donor to, a non-profit charitable organization on whose behalf the call is made, the seller or telemarketer:

(i) Allows the telephone to ring for at least fifteen (15) seconds or four (4) rings before disconnecting an unanswered call; and

(ii) Within two (2) seconds after the completed greeting of the person called, plays a prerecorded message that promptly provides the disclosures required by § 310.4(d) or (e), followed immediately by a disclosure of one or both of the following:


(A) In the case of a call that could be answered in person by a consumer, that the person called can use an automated interactive voice and/or keypress-activated opt-out mechanism to assert a Do Not Call request pursuant to § 310.4(b)(1)(iii)(A) at any time during the message. The mechanism must [automatically add the number called to the seller’s entity-specific Do Not Call list and disconnect] and

(B) In the case of a call that could be answered by an answering machine or voicemail service, that the person called can use a toll-free telephone number to assert a Do Not Call request pursuant to § 310.4(b)(1)(iii)(A). The number provided must connect directly to an automated interactive voice or keypress-activated opt-out mechanism that [automatically adds the number called to the seller’s entity-specific Do Not Call list and disconnects]; . . . .

(2) It is an abusive telemarketing act or practice and a violation of this Rule for any person to sell, rent, lease, purchase, or use any list established to comply with § 310.4(b)(1)(iii)(A), or maintained by the Commission pursuant to § 310.4(b)(1)(iii)(B), for any purpose except compliance with the provisions of this Rule or otherwise to prevent telephone calls to telephone numbers on such lists.

(3) A seller or telemarketer will not be liable for violating § 310.4(b)(1)(ii) and (iii) if it can demonstrate that, as part of the seller’s or telemarketer’s routine business practice: [established and implemented compliant written procedures, trained its personnel, uses a process to prevent telemarketing to any telephone number on any list established pursuant to § 310.4(b)(3)(iii) or 310.4(b)(1)(iii)(B), employing a version of the “do-not-call” registry obtained from the Commission no more than thirty-one (31) days prior to the date any call is made, and maintains records documenting this process].

(4) A seller or telemarketer will not be liable for violating § 310.4(b)(1)(iv) if: [employs technology that ensures abandonment of no more than three (3) percent of all calls answered by a person, measured over the duration of a single calling campaign, if less than 30 days, or separately over each successive 30-day period or portion thereof that the campaign continues].

(c) Calling time restrictions. Without the prior consent of a person, it is an abusive telemarketing act or practice and a violation of this Rule for a telemarketer to engage in outbound telephone calls to a person’s residence at any time other than between 8:00 a.m. and 9:00 p.m. local time at the called person’s location.


(d) Required oral disclosures in the sale of goods or services. It is an abusive telemarketing act or practice and a violation of this Rule for a telemarketer in an outbound telephone call or internal or external upsell to induce the purchase of goods or services to fail to disclose truthfully, promptly, and in a clear and conspicuous manner to the person receiving the call, the following information:

(1) The identity of the seller;

(2) That the purpose of the call is to sell goods or services;

(3) The nature of the goods or services; and

(4) That no purchase or payment is necessary to be able to win a prize or participate in a prize promotion if a prize promotion is offered and that any purchase or payment will not increase the person’s chances of winning. [Other conditions].

(e) Required oral disclosures in charitable solicitations. It is an abusive telemarketing act or practice and a violation of this Rule for a telemarketer, in an outbound telephone call to induce a charitable contribution, to fail to disclose truthfully, promptly, and in a clear and conspicuous manner to the person receiving the call, the following information:

(1) The identity of the charitable organization on behalf of which the request is being made; and

(2) That the purpose of the call is to solicit a charitable contribution.

16 C.F.R. § 310.4 (emphasis added, notes omitted), added by 60 Fed. Reg. 40,843, 43,866 (Aug. 16, 1995), amended by 68 Fed. Reg. 4,580, 4,671 (Jan. 29, 2003), 69 Fed. Reg. 16,368, 16,373 (Mar. 29, 2004), 73 Fed. Reg. 51,164, 51,204 (Aug. 29, 2008), 75 Fed. Reg. 48,458, 48,516 (Aug. 10, 2010), 76 Fed. Reg. 58,716 (Sept. 22, 2011), 80 Fed. Reg. 77,520, 77,559 (Dec. 14, 2015).

The TSR, which regulates telemarketing and “robocalls” and includes the “do-not-call” (DNC) registry maintained by the FTC, is perhaps the federal rule that reaches most often into the daily lives of Americans. In 2015, the FTC settled FTC v. A+Financial Center, LLC, Civ. No. 12-CV-14373 (S.D. Fla. June 11, 2015), brought for violations of the TSR through robocalls (from “Rachel” of “cardholder services,” among others) offering credit card interest reduction for upfront payments. See https://www.ftc.gov/news-events/press-releases/2013/07/ftc-settles-rachel-robocall-enforcement-case. In 2016, the FTC settled United States v. Lilly Mgmt. & Mktg., LLC, Stipulated Final Order (M.D. Fla. Mar. 16, 2016) ($1.2 million civil penalty for robocalling to do-not-call numbers (DNCs) pitching vacation packages); see https://www.ftc.gov/news-events/press-releases/2016/03/ftc-obtains-settlement-permanently-banning-vacation-package.


In 2017, the FTC obtained default judgments against several robocallers, imposing substantial penalties and permanent injunctions. FTC v. Jones, No. 8:17-cv-00058, Doc. 88 (C.D. Cal. May 31, 2017) ($2.6 million); FTC v. All US Marketing LLC, No. 6:15 CV 01016, Doc. 173 (M.D. Fla. May 22, 2017) ($5.2 million). See https://www.ftc.gov/news-events/press-releases/2017/06/ftc-obtains-court-judgments-against-california-based-robocallers, https://www.ftc.gov/news-events/press-releases/2017/06/ftc-florida-attorney-general-close-book-robocall-ring-pitched-us.

The court in Soundboard Association v. U.S. Federal Trade Commission, No. 17-cv-00150, 2017 U.S. Dist. LEXIS 61408 (D.D.C. Apr. 24, 2017), injunction pending appeal denied, 2017 U.S. Dist. LEXIS 71011 (D.D.C. May 10, 2017), granted summary judgment to the FTC on Administrative Procedure Act and First Amendment challenges to the FTC’s change of position that soundboard technology, which combines recorded messages with live operator intervention, was subject to the robocall rule requiring prior written consent.

In a 2017 decision, after a bench trial, the court entered a permanent injunction and ordered a defendant to pay $280,000,000 in civil penalties and statutory damages to the United States, California, Illinois, North Carolina, and Ohio, marking the largest civil penalty ever obtained for a violation of the FTC Act. U.S. v. Dish Network, LLC, No. 09-3073, 2017 U.S. Dist. LEXIS 85543 (C.D. Ill. June 6, 2017); Dkt. #822, Amended Order for Permanent Injunction (C.D. Ill. Aug. 10, 2017). The FTC brought claims for DNC, entity-specific, and abandoned-call violations, with parallel claims under the Telephone Consumer Protection Act (TCPA) and state law, and the court supported its judgment in an extensive opinion, finding, among other things, that the DNC statutes did not violate the First Amendment. See https://www.ftc.gov/news-events/press-releases/2017/06/ftc-doj-case-results-historic-decision-awarding-280-million-civil.

The Eighth Circuit in Gresham v. Swanson, 866 F.3d 853 (8th Cir. 2017), aff’g No. 16-1420, 2016 U.S. Dist. LEXIS 98656 (D. Minn. July 27, 2016) (preliminary injunction denied in suit by political consultant for declaratory judgment against Minnesota attorney general), ruled that a similar Minnesota statute against robocalls did not violate the First Amendment.

Practice Note
DNC and TCPA cases have been decided more favorably for plaintiffs than other breach of privacy and data security cases (see discussion in § 9.3). In Lanteri v. Credit Protection Association L.P., No. 1:13-cv-1501, 2017 U.S. Dist. LEXIS 134868, at *7 n.3 (S.D. Ind. Aug. 22, 2017), the court found that the plaintiff, who complained that the defendant would not honor her stop-calling request, met the standing requirement of Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 194 L. Ed. 2d 635 (2016), because she alleged that she was charged for incoming calls. The court also proposed acceptable class descriptions. (Credit Protection had settled United States v. Credit Protection Association, No. 3:16-cv-01255, Doc. No. 4-1, Stipulated Final Order for Permanent Injunction and Civil Penalty (N.D. Tex. May 9, 2016) ($72,000 for inadequate consumer dispute procedures); see https://www.ftc.gov/news-events/press-releases/2016/05/debt-collector-settles-ftc-charges-it-violated-fair-credit.)


In Golan v. Veritas Entertainment, LLC, No. 4:14CV00069, 2017 U.S. Dist. LEXIS 144501 (E.D. Mo. Sept. 7, 2017), the trial court, having granted judgment as a matter of law after a jury trial against a telemarketer who had conducted a six-day campaign dialing 3.2 million numbers, found the $500-per-call TCPA damages, resulting in $1.6 billion in total damages, “obviously unreasonable and wholly disproportionate to the offense,” and reduced the award to $10 per call (a total of $32 million), rather than the $0.10 per call requested by the defendant.
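For illustration only, the following minimal Python sketch encodes the two numeric tests quoted above: the three percent call-abandonment safe harbor of 16 C.F.R. § 310.4(b)(4) and the 8:00 a.m. to 9:00 p.m. calling window of § 310.4(c). The function names and the call-log inputs are hypothetical assumptions made for the example; they are not drawn from the rule or from any FTC guidance, and the sketch omits the rule’s record-keeping and consent conditions.

from datetime import time

def abandonment_rate_ok(answered_calls: int, abandoned_calls: int) -> bool:
    # Safe harbor (assumed reading of sec. 310.4(b)(4)): abandoned calls may
    # not exceed 3% of calls answered by a person, measured per campaign
    # (if under 30 days) or per successive 30-day period.
    if answered_calls == 0:
        return True  # nothing answered, nothing abandoned
    return abandoned_calls / answered_calls <= 0.03

def calling_time_ok(local_time: time) -> bool:
    # Sec. 310.4(c): absent prior consent, outbound calls to a residence only
    # between 8:00 a.m. and 9:00 p.m. local time at the called person's location.
    return time(8, 0) <= local_time <= time(21, 0)

# Example: 280 abandoned calls out of 10,000 answered (2.8%) stays within
# the safe harbor; a call placed at 9:30 p.m. local time does not comply.
print(abandonment_rate_ok(10_000, 280))  # True
print(calling_time_ok(time(21, 30)))     # False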

§ 8.4.3

Do-Not-Call Legislation

The Do-Not-Call Registry Act of 2003, 15 U.S.C. § 6151, added by Pub. L. No. 108-82, 117 Stat. 1006 (Sept. 29, 2003), expressly authorized the FTC under Section 3(a)(3)(A) of the TCFAPA to implement and enforce the Do-Not-Call Registry and ratified the Registry provision of the TSR, 16 C.F.R. § 310.4(b)(1)(iii) (which became effective on March 31, 2003).

The Do-Not-Call Implementation Act of 2003, 15 U.S.C. §§ 6152–6154, added by Pub. L. No. 108-10, §§ 2–4, 117 Stat. 557–58 (Mar. 11, 2003), authorized the FTC to set and collect Registry fees for fiscal years 2003 through 2007, required the FCC to issue a compatible do-not-call rule, and directed the FTC and the FCC to submit an annual report on the Registry for fiscal years 2003 through 2007 to the House Committee on Energy and Commerce and the Senate Committee on Commerce, Science and Transportation.

The Do-Not-Call Registry Fee Extension Act of 2007, 15 U.S.C. § 6154, added by Pub. L. No. 110-188, §§ 2, 3, 122 Stat. 635–38 (Feb. 15, 2008), amended the Do-Not-Call Implementation Act to specify by statute the Registry fees for telemarketers, and revised the report requirements in the TCFAPA.

The Do-Not-Call Improvement Act of 2007, 15 U.S.C. § 6155, added by Pub. L. No. 110-187, § 2, 122 Stat. 633 (Feb. 15, 2008), further amended the Do-Not-Call Implementation Act to prohibit the automatic expiration of Registry listings.

§ 8.4.4

The Children’s Online Privacy Protection Act (COPPA)

Congress enacted the Children’s Online Privacy Protection Act of 1998 (COPPA), 15 U.S.C. §§ 6501–6506, added by Act of Oct. 21, 1998, Pub. L. No. 105-277, Div. C, Title XIII, §§ 1301–1306, 112 Stat. 2681-728–735, which went into effect on April 21, 2000. See COPPA § 1308, 112 Stat. 2681-735, codified at 15 U.S.C. § 6501 note.

(1) In general. It is unlawful for an operator of a website or online service directed to children, or any operator that has actual knowledge that it is collecting personal information from a child, [“The term ‘child’ means an individual under the age of 13.” 15 U.S.C. § 6501(1).] to collect personal information from a child in a manner that violates the regulations prescribed under subsection (b) of this section[,]


15 U.S.C. § 6502(a)(1) (emphasis added), where

(8) . . . The term “personal information” means individually identifiable information about an individual collected online, including—

(A) a first and last name;

(B) a home or other physical address including street name and name of a city or town;

(C) an e-mail address;

(D) a telephone number;

(E) a Social Security number;

(F) any other identifier that the Commission determines permits the physical or online contacting of a specific individual; or

(G) information concerning the child or the parents of that child that the website collects online from the child and combines with an identifier described in this paragraph.

15 U.S.C. § 6501(8). The referenced Subsection (b) comprises regulations that the FTC was charged with promulgating and that would achieve the following Congressional objectives:

(D) require the operator of such a website or online service to establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children[,]

15 U.S.C. § 6502(b), where

(9) . . . The term “verifiable parental consent” means any reasonable effort (taking into consideration available technology), including a request for authorization for future collection, use, and disclosure described in the notice, to ensure that a parent of a child receives notice of the operator’s personal information collection, use, and disclosure practices, and authorizes the collection, use, and disclosure, as applicable, of personal information and the subsequent use of that information before that information is collected from that child.

15 U.S.C. § 6501(9). The FTC promulgated its “Children’s Online Privacy Protection Rule” at 16 C.F.R. pt. 312 in 1999, with a substantial amendment in 2013. 16 C.F.R. §§ 312.1–312.13, added by 64 Fed. Reg. 59,911, 59,912 (Nov. 3, 1999), amended by 67 Fed. Reg. 18,818, 18,821 (Apr. 17, 2002), 70 Fed. Reg. 21,104, 21,106 (Apr. 22, 2005)


(provisional extension of parental consent by e-mail), 78 Fed. Reg. 3,972, 4,008 (Jan. 17, 2013). The basic rule is as follows:

Regulation of unfair or deceptive acts or practices in connection with the collection, use, and/or disclosure of personal information from and about children on the Internet. General requirements. It shall be unlawful for any operator of a Web site or online service directed to children, or any operator that has actual knowledge that it is collecting or maintaining personal information from a child, to collect personal information from a child in a manner that violates the regulations prescribed under this part. Generally, under this part, an operator must:

(a) Provide notice on the Web site or online service of what information it collects from children, how it uses such information, and its disclosure practices for such information (§ 312.4(b));

(b) Obtain verifiable parental consent prior to any collection, use, and/or disclosure of personal information from children (§ 312.5);

(c) Provide a reasonable means for a parent to review the personal information collected from a child and to refuse to permit its further use or maintenance (§ 312.6);

(d) Not condition a child’s participation in a game, the offering of a prize, or another activity on the child disclosing more personal information than is reasonably necessary to participate in such activity (§ 312.7); and

(e) Establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children (§ 312.8).

16 C.F.R. § 312.3 (emphasis added). Among the changes recognized by the FTC over fourteen years of evolution of the Internet was the type of personal information collected through new technologies such as social media networking.

Personal information means individually identifiable information about an individual collected online, including:

(1) A first and last name;

(2) A home or other physical address including street name and name of a city or town;

(3) Online contact information as defined in this section;

(4) A screen or user name where it functions in the same manner as online contact information, as defined in this section;


(5) A telephone number;

(6) A Social Security number;

(7) A persistent identifier that can be used to recognize a user over time and across different Web sites or online services. Such persistent identifier includes, but is not limited to, a customer number held in a cookie, an Internet Protocol (IP) address, a processor or device serial number, or unique device identifier;

(8) A photograph, video, or audio file where such file contains a child’s image or voice;

(9) Geolocation information sufficient to identify street name and name of a city or town; or

(10) Information concerning the child or the parents of that child that the operator collects online from the child and combines with an identifier described in this definition.

16 C.F.R. § 312.2 (emphasis added). Also, much debate has occurred, and still continues, over how “verifiable parental consent may be obtained”—particularly the use of e-mail and credit cards. The present regulation remains a compromise:

Parental consent. (a) General requirements. (1) An operator is required to obtain verifiable parental consent before any collection, use, or disclosure of personal information from children, including consent to any material change in the collection, use, or disclosure practices to which the parent has previously consented.

(2) An operator must give the parent the option to consent to the collection and use of the child’s personal information without consenting to disclosure of his or her personal information to third parties.

(b) Methods for verifiable parental consent. (1) An operator must make reasonable efforts to obtain verifiable parental consent, taking into consideration available technology. Any method to obtain verifiable parental consent must be reasonably calculated, in light of available technology, to ensure that the person providing consent is the child’s parent.

(2) Existing methods to obtain verifiable parental consent that satisfy the requirements of this paragraph include:

(i) Providing a consent form to be signed by the parent and returned to the operator by postal mail, facsimile, or electronic scan;


(ii) Requiring a parent, in connection with a monetary transaction, to use a credit card, debit card, or other online payment system that provides notification of each discrete transaction to the primary account holder;

(iii) Having a parent call a toll-free telephone number staffed by trained personnel;

(iv) Having a parent connect to trained personnel via videoconference;

(v) Verifying a parent’s identity by checking a form of government-issued identification against databases of such information, where the parent’s identification is deleted by the operator from its records promptly after such verification is complete; or

(vi) Provided that, an operator that does not “disclose” (as defined by § 312.2) children’s personal information, may use an email coupled with additional steps to provide assurances that the person providing the consent is the parent. Such additional steps include: Sending a confirmatory email to the parent following receipt of consent, or obtaining a postal address or telephone number from the parent and confirming the parent’s consent by letter or telephone call. An operator that uses this method must provide notice that the parent can revoke any consent given in response to the earlier email.

16 C.F.R. § 312.5 (emphasis added).

Practice Note
The prior rule (in place for fourteen years) did not limit credit card verification to a financial transaction. Many websites that are frequented by children (including major social networking sites that are not specifically directed to children) do not involve direct sales to visitors but may target advertisements to visitors through behavioral tracking.
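For illustration only, the following minimal Python sketch restates the ten categories of “personal information” in the amended definition quoted above (16 C.F.R. § 312.2) as a simple screen over a hypothetical data inventory. The category labels and the screening function are assumptions made for the example, not terms used in the rule; note in particular that a screen name qualifies only where it functions as online contact information.

COPPA_PERSONAL_INFORMATION = {
    "full_name",             # (1) first and last name
    "physical_address",      # (2) home or other physical address
    "online_contact_info",   # (3) e.g., an e-mail address
    "screen_name",           # (4) only where it works as online contact info
    "phone_number",          # (5)
    "ssn",                   # (6)
    "persistent_identifier", # (7) cookie ID, IP address, device serial number
    "child_image_or_voice",  # (8) photograph, video, or audio file
    "geolocation",           # (9) sufficient to identify street and city/town
    "combined_identifier",   # (10) other data combined with an identifier
}

def fields_triggering_coppa(collected_fields: set) -> set:
    # Return the collected fields that fall within the assumed sec. 312.2
    # categories and so would require notice and verifiable parental
    # consent before collection from a child under 13.
    return collected_fields & COPPA_PERSONAL_INFORMATION

# Example: a children's app collecting a screen name, an IP address (a
# persistent identifier), and a favorite color.
print(fields_triggering_coppa({"screen_name", "persistent_identifier",
                               "favorite_color"}))
# {'screen_name', 'persistent_identifier'} (set order may vary)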

The FTC is charged with administering COPPA under its general FTCA powers, 15 U.S.C. § 6505, and with treating violations of its regulations as violations of a rule defining an unfair or deceptive act or practice under Section 18(a)(1)(B) of the FTCA, 15 U.S.C. § 57a(a)(1)(B). 15 U.S.C. § 6502(c); 16 C.F.R. § 312.9. A safe harbor is provided to service providers who abide by FTC-approved industry guidelines. 15 U.S.C. § 6503; 16 C.F.R. § 312.11.

Practice Note
Relatively few operators have sought the safe harbor. Since the 2013 revisions, two operators purporting to certify children’s privacy on third-party sites, “iKeepSafe” and “kidSafe,” have had their programs approved, while one program was denied. See https://www.ftc.gov/safe-harbor-program.


In 2014, the FTC filed a District Court complaint (supporting a settlement) through the Department of Justice in United States v. Yelp, Inc., Civ. No. 3:14-cv-04163, Complaint, Docket No. 1 (N.D. Cal. Sept. 16, 2014), available at https://www.ftc.gov/system/files/documents/cases/140917yelpcmpt.pdf. The complaint alleged that

. . . from 2009 to 2013, the company collected personal information from children through the Yelp app without first notifying parents and obtaining their consent. When consumers registered for Yelp through the app on their mobile device, according to the complaint, they were asked to provide their date of birth during the registration process.

“Yelp, TinyCo Settle FTC Charges Their Apps Improperly Collected Children’s Personal Information,” FTC Press Release Sept. 17, 2014, available at https://www.ftc.gov/news-events/press-releases/2014/09/yelp-tinyco-settle-ftc-charges-their-apps-improperly-collected. Yelp agreed to pay a civil penalty of $450,000. United States v. Yelp Inc., Civ. No. 3:14-cv-04163, Stipulated Order for Permanent Injunction and Civil Penalty, Docket No. 5 (N.D. Cal. Sept. 16, 2014), available at https://www.ftc.gov/system/files/documents/cases/140917yelpstip.pdf.

In 2015, the FTC charged two application developers with violating COPPA, ultimately settling for a combined $360,000. The complaint in United States v. LAI Systems, LLC, No. 2:15-cv-09691, Doc. No. 1 (C.D. Cal. Dec. 17, 2015), alleged that LAI developed and implemented a number of apps aimed at children. LAI then allowed third-party advertisers to collect personal information from children, in the form of persistent identifiers, without obtaining consent from the children’s parents or informing advertising networks that the apps were targeted to children. The complaint in United States v. Retro Dreamer, No. 5:15-cv-02569, Doc. No. 1 (C.D. Cal. Dec. 17, 2015), also alleged that defendant Retro Dreamer created a number of apps targeted towards children and allowed third-party advertisers to collect information about the children in violation of COPPA. See https://www.ftc.gov/news-events/press-releases/2015/12/two-app-developers-settle-ftc-charges-they-violated-childrens.

Practice Note
In a class action complaining of the use of tracking cookies to collect information on children, In re Nickelodeon Consumer Privacy Litigation, 827 F.3d 262 (3d Cir. 2016), the Third Circuit followed its prior decision in In re Google Inc., 806 F.3d 125 (3d Cir. 2015), dismissing federal Wiretap Act (18 U.S.C. § 2511), Stored Communications Act (18 U.S.C. § 2701), Computer Fraud and Abuse Act (18 U.S.C. § 1030), and similar state law claims for cookie tracking. The Nickelodeon court did allow the state common law intrusion on seclusion count to proceed against a defense of COPPA preemption.


In United States v. InMobi Pte Ltd., No. 3:16-cv-03474, Doc. No. 2-1, Stipulated Order for Permanent Injunction and Civil Penalty Judgment (N.D. Cal. June 22, 2016), a Singapore-based mobile advertising company that tracked the locations of hundreds of millions of consumers—including children—settled the FTCA and COPPA claims against it for a permanent injunction and $950,000 in civil penalties (with the balance of the $4 million liability conditionally suspended). See https://www.ftc.gov/news-events/press-releases/2016/06/mobile-advertising-network-inmobi-settles-ftc-charges-it-tracked.

In June 2017, the FTC issued updated COPPA compliance guidance for businesses. The update addresses businesses using voice-activated technology, adds Internet of Things products to COPPA’s scope of coverage, and introduces two new methods for companies to obtain verifiable parental consent: asking knowledge-based authentication questions and using facial recognition to obtain a match with a verified photo ID. See https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance.

§ 8.5

FTC ACTIONS AGAINST “UNFAIR OR DECEPTIVE ACTS OR PRACTICES”

The FTC has been operating since 1938 with a mandate to investigate and prevent “unfair or deceptive acts or practices in [or affecting] commerce.” In 1964, in a “Statement of Basis and Purpose” for unfair or deceptive advertising and labeling of cigarettes, the FTC set forth the following three factors for “unfairness” determinations:

(1) whether the practice, without necessarily having been previously considered unlawful, offends public policy as it has been established by statutes, the common law, or otherwise—whether, in other words, it is within at least the penumbra of some common-law, statutory or other established concept of unfairness;

(2) whether it is immoral, unethical, oppressive, or unscrupulous; [and]

(3) whether it causes substantial injury to consumers (or competitors or other businessmen).

29 Fed. Reg. 8,324, 8,355 (July 2, 1964). At the request of Congress after a controversial campaign in the 1970s to regulate children’s advertising, the FTC issued in 1980 a second policy statement abandoning “immoral or unscrupulous conduct” as an “independent” basis for an unfairness claim and making substantial consumer injury the “primary focus.” FTC Unfairness Policy Statement, Letter to Hon. Wendell Ford and Hon. John Danforth, Senate Comm. on Commerce, Sci., and Transp. (Dec. 17, 1980), appended to International Harvester, 104 F.T.C. 949, 1061 n.43, 1070, 1073, 1076 (1984). Congress enacted the Federal Trade Commission Act Amendments of 1994, which, among other things, codified the 1980 FTC Unfairness Policy Statement:

The Commission shall have no authority under this section [45] or section 57a of this title to declare unlawful an act or


practice on the grounds that such act or practice is unfair unless the act or practice causes or is likely to cause substantial injury to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition. In determining whether an act or practice is unfair, the Commission may consider established public policies as evidence to be considered with all other evidence. Such public policy considerations may not serve as a primary basis for such determination.

15 U.S.C. § 45(n), added by Pub. L. No. 103-312, § 9, 108 Stat. 1695 (Aug. 26, 1994) (emphasis added).

The Commission shall issue a notice of proposed rulemaking pursuant to paragraph (1)(A) [of 15 U.S.C. § 57a(b)] only where it has reason to believe that the unfair or deceptive acts or practices which are the subject of the proposed rulemaking are prevalent. The Commission shall make a determination that unfair or deceptive acts or practices are prevalent under this paragraph only if—

(A) it has issued cease and desist orders regarding such acts or practices, or

(B) any other information available to the Commission indicates a widespread pattern of unfair or deceptive acts or practices.

15 U.S.C. § 57a(b)(3), added by Pub. L. No. 103-312, § 5, 108 Stat. 1692 (Aug. 26, 1994) (emphasis added). Thus, the “unfair or deceptive acts or practices” jurisdiction of the FTC requires such acts or practices, in addition to being in or affecting “commerce,”

• to be “prevalent,” such as part of a “widespread pattern,” and

• in order to be “unfair,” cause or be likely to cause substantial injury to consumers,

– not reasonably avoidable by the consumers themselves and

– not balanced by benefits to consumers or competition.

15 U.S.C. § 44 (“commerce among the several States or with foreign nations, or in any Territory of the United States or in the District of Columbia, or between any such Territory and another, or between any such Territory and any State or foreign nation, or between the District of Columbia and any State or Territory or foreign nation”). The “deceptive” prong thus requires less analysis of the market than the “unfair” prong. See Fed. Trade Comm’n, “FTC Policy Statement on Deception,” letter from Fed. Trade Comm’n to Hon. John D. Dingell, chairman, H. Comm. on Energy and Commerce (Oct. 14, 1983), available at


http://www.ftc.gov/public-statements/1983/10/ftc-policy-statement-deception; see also Fed. Trade Comm’n, “FTC Policy on Unfairness,” letter from Fed. Trade Comm’n to Hon. Wendell H. Ford, chairman, and Hon. John C. Danforth, ranking minority member, S. Comm. on Commerce, Science, and Transportation, Consumer Subcomm. (Dec. 17, 1980), available at http://www.ftc.gov/bcp/policystmt/ad-unfair.htm.

In FTC v. Wyndham Worldwide Corp., the FTC pursued the hotel entities for both “unfair” and “deceptive” practices under Section 5 of the FTCA for breaches of its reservation system from 2007 to 2010:

[T]he three data breaches described above, the compromise of more than 619,000 consumer payment card account numbers, the exportation of many of those account numbers to a domain registered in Russia, fraudulent charges on many consumers’ accounts, and more than $10.6 million in fraud loss. Consumers and businesses suffered financial injury, including, but not limited to, unreimbursed fraudulent charges, increased costs, and lost access to funds or credit. Consumers and businesses also expended time and money resolving fraudulent charges and mitigating subsequent harm.

First Amended Complaint ¶ 40, FTC v. Wyndham Worldwide Corp., No. CV 12-1364-PHX, Doc. No. 28 (D. Ariz., filed Aug. 9, 2012). After transfer to New Jersey, the District Court denied the defendants’ motion to dismiss on both the “unfair” and “deceptive” practices claims. FTC v. Wyndham Worldwide Corp., 2014-1 Trade Cas. (CCH) ¶ 78,763 (D.N.J. Apr. 7, 2014). On the certified question of the FTC’s jurisdiction on the “unfair” count, the decision was affirmed in FTC v. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. Aug. 24, 2015). The Third Circuit reviewed the history of the “unfair practice” prong of FTC jurisdiction and applied it to Wyndham’s failure to provide reasonable security. Wyndham settled shortly thereafter, agreeing to establish a comprehensive security protocol, including use of the Payment Card Industry Data Security Standard (PCI DSS), with annual assessments for twenty years. FTC v. Wyndham Worldwide Corp., No. 2:13-CV-01887, Doc. No. 283, Stipulated Order for Injunction (D.N.J. Dec. 11, 2015); see https://www.ftc.gov/news-events/press-releases/2015/12/wyndham-settles-ftc-charges-it-unfairly-placed-consumers-payment.

Notwithstanding the Congressional limitation on its “unfair act or practice” jurisdiction, and perhaps emboldened by its success in Wyndham, the FTC has continued to file enforcement actions under the theory. Thus, in In re LabMD, Inc., Docket No. 9357 (FTC), the FTC brought an administrative enforcement action against a medical testing laboratory on that theory where “LabMD billing information for over 9,000 consumers was found on a peer-to-peer (P2P) file-sharing network and then, in 2012, LabMD documents containing sensitive personal information of at least 500 consumers were found in the hands of identity thieves.” “FTC Files Complaint Against LabMD for Failing to Protect Consumers’ Privacy,” FTC Press Release, Aug. 29, 2013, available at https://www.ftc.gov/news-events/press-releases/2013/08/ftc-files-complaint-against-labmd-failing-protect-consumers. The complaint alleged


respondent’s failure to employ reasonable and appropriate measures to prevent unauthorized access to personal information, including dates of birth, SSNs, medical test codes, and health information, caused, or is likely to cause, substantial injury to consumers that is not offset by countervailing benefits to consumers or competition and is not reasonably avoidable by consumers. This practice was, and is, an unfair act or practice.

In re LabMD, Inc., Docket No. 9357, Complaint (Public) ¶ 22 (FTC Aug. 28, 2013), available at https://www.ftc.gov/sites/default/files/documents/cases/2013/08/130829labmdpart3.pdf. After extensive proceedings, the administrative law judge dismissed for failure to show “substantial injury” as required by Section 5(n) (15 U.S.C. § 45(n)) for an “unfairness” violation, rejecting subjective, emotional harm. In re LabMD, Inc., Docket No. 9357, Initial Decision at 13 (FTC Nov. 13, 2015), available at https://www.ftc.gov/system/files/documents/cases/151113labmd_decision.pdf; see https://www.ftc.gov/news-events/press-releases/2015/11/administrative-law-judge-dismisses-ftc-data-security-complaint. The FTC reversed and found a Section 5 violation where, for over 750,000 patients, LabMD lacked “even basic security precautions to protect the sensitive consumer information maintained on its computer system.” In re LabMD, Inc., Docket No. 9357, Opinion of the Commission at 1–2 (FTC July 29, 2016) (relying on FTC v. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015)), available at https://www.ftc.gov/system/files/documents/cases/160729labmd-opinion.pdf. The FTC entered an order for LabMD to obtain periodic independent assessments regarding the implementation of the information security program, and to notify those consumers whose personal information was exposed on the P2P network about the unauthorized disclosure of their personal information and about how they can protect themselves from identity theft or related harms. In re LabMD, Inc., Docket No. 9357, Final Order (FTC July 29, 2016), available at https://www.ftc.gov/system/files/documents/cases/160729labmdorder.pdf. See https://www.ftc.gov/news-events/press-releases/2016/07/commission-finds-labmd-liable-unfair-data-security-practices. Appeal was made to the Eleventh Circuit. In re LabMD, Inc., Docket No. 9357, Order Denying LabMD, Inc.’s Application for Stay of Final Order Pending Review by a United States Court of Appeals (FTC Sept. 29, 2016), available at https://www.ftc.gov/system/files/documents/cases/160929labmdorder.pdf.

The FTC has more often availed itself of the “deceptive act or practice” prong of its jurisdiction. Thus, the FTC filed an administrative complaint in In re Jerk, LLC, where


website, John Fanning, created Jerk.com profiles for more than 73 million people, including children. “FTC Charges Operators of ‘Jerk.com’ Website With Deceiving Consumers,” FTC Press Release Apr. 7, 2014, available at https://www.ftc.gov/news-events/pressreleases/2014/04/ftc-charges-operators-jerkcom-website-deceiving-consumers. In truth and in fact, in the vast majority of instances, content on Jerk was not created by Jerk users and did not reflect those users’ views of the profiled individuals. Respondents populated or caused to be populated the content on the vast majority of Jerk profiles by taking information from Facebook in violation of Facebook’s policies, including by (1) failing to obtain users’ explicit consent to collect certain Facebook data, including photographs; (2) maintaining information obtained through Facebook even after respondents’ Facebook access was disabled; (3) failing to provide an easily accessible mechanism for consumers to request deletion of their Facebook data; and (4) failing to delete data obtained from Facebook upon a consumer’s request. Therefore, the representation set forth in Paragraph 15 was, and is, false or misleading. In re Jerk, LLC, Docket No. 9361, Complaint ¶ 16 (FTC Apr. 2, 2014), available at https://www.ftc.gov/system/files/documents/cases/140407jerkpart3cmpt.pdf. Similarly, the FTC brought a “deceptive act or practice” administrative action in In re Snapchat, Inc., where it alleged Snapchat, the developer of a popular mobile messaging app, . . . deceived consumers with promises about the disappearing nature of messages sent through the service. The FTC case also alleged that the company deceived consumers over the amount of personal data it collected and the security measures taken to protect that data from misuse and unauthorized disclosure. In fact, the case alleges, Snapchat’s failure to secure its Find Friends feature resulted in a security breach that enabled attackers to compile a database of 4.6 million Snapchat usernames and phone numbers. “Snapchat Settles FTC Charges That Promises of Disappearing Messages Were False,” FTC Press Release May 8, 2014, available at https://www.ftc.gov/newsevents/press-releases/2014/05/snapchat-settles-ftc-charges-promises-disappearingmessages-were. A decision and order was issued on December 23, 2014. In re Snapchat, Inc., Docket No. C-4501 (FTC Dec. 23, 2014), available at https://www.ftc.gov/system/files/ documents/cases/141231snapchatdo.pdf.


Also on the “deceptive act or practice” prong, the FTC brought an administrative complaint in In re True Ultimate Standards Everywhere, Inc. d/b/a TRUSTe Inc. for failing to conduct promised annual recertifications of companies participating in its widely adopted “TRUSTe” seal program more than 1,000 times between 2006 and 2013, and settled with an order, among other things, requiring annual reports on its COPPA safe-harbor operations. In re True Ultimate Standards Everywhere, Inc., Docket No. C-4512 (FTC Mar. 12, 2015), available at https://www.ftc.gov/system/files/documents/cases/150318trust-edo.pdf; see “FTC Approves Final Order In TRUSTe Privacy Case,” FTC Press Release Mar. 18, 2015, available at https://www.ftc.gov/news-events/press-releases/2015/03/ftc-approves-final-order-truste-privacy-case.

In 2016, the FTC settled a contempt proceeding in federal district court against LifeLock, Inc. for violating a 2010 order of the court (entering a stipulated settlement of allegations of Section 5(a) violations by deceptive practices) by

• “failing to establish and maintain a comprehensive information security program to protect its users’ sensitive personal data, including credit card, social security, and bank account numbers”;

• “falsely advertising that it protected consumers’ sensitive data with the same high-level safeguards as financial institutions”;

• failing to meet the 2010 order’s recordkeeping requirements; and

• “falsely claiming it protected consumers’ identity 24/7/365 by providing alerts ‘as soon as’ it received any indication there was a problem,”

FTC v. LifeLock, Inc., No. CV-10-00530, Doc. No. 20, Plaintiff Federal Trade Commission’s Notice of Lodging Proposed Documents Under Seal at 1–2 (D. Ariz. July 21, 2015), Doc. No. 67, Amended Order (D. Ariz. Jan. 6, 2016). In addition to permanent injunctions against the deceptive practices, LifeLock was ordered to and did deposit $100 million into the court registry, of which $68 million was disbursed to class action members in 2016 and the remaining $32 million to the FTC in 2017. FTC v. LifeLock, Inc., No. CV-10-00530, Docs. Nos. 70, 74.

In June 2016, completed August 2016, the FTC settled a complaint that Practice Fusion, a cloud-based electronic health record company, violated the “deceptive acts and practices” prong of Section 5 of the FTC Act by failing to inform consumers that the personal health information they provided in response to surveys would be posted on public websites. In gathering consumers’ health information, Practice Fusion sent e-mails that appeared to the consumer as being sent from their doctors and did not make clear that the information consumers shared would be made public. The settlement requires Practice Fusion to withhold any data obtained during the period covered by the complaint, to make clear to consumers who the e-mails are coming from, and to disclose that any information shared may be made public. In re Practice Fusion, Inc., No. C-4591, Complaint & Decision and Order (FTC Aug. 16, 2016), https://www.ftc.gov/system/files/documents/cases/160816practicefusioncmpt.pdf and https://www.ftc.gov/system/files/documents/cases/160608practicefusionagree.pdf.


Data Security and Privacy in Massachusetts

Also in 2016, the FTC settled a complaint that ASUSTeK Computer, a Taiwanese home network router manufacturer, failed to take reasonable steps to secure the software, despite representing that the equipment was secure, thereby violating both the “deceptive practice” prong and, in one count, the “unfair” prong. As has become the standard remedy for these administrative actions, the defendant was ordered to establish a comprehensive security program, make periodic assessments for twenty years, notify potential victims and undertake not to make further misrepresentations. In re ASUSTeK Computer, Inc., No. C-4587, Complaint & Decision and Order (FTC July 28, 2016), available at https://www.ftc.gov/system/files/—documents/cases/— 1607asustekcmpt.pdf and https://www.ftc.gov/system/files/documents/—cases/— 1607asustekdo.pdf. In December 2016, upon settlement, the FTC filed a Section 5(a) complaint against the operators of the AshleyMadison.com dating site of 36 million members (19 million in the U.S.), the security of which had been breached in July 2015. FTC v. Ruby Corp., No. 1:16-cv-02438, Doc. No. 1, Complaint (D.D.C. Dec. 14, 2016) (counts of deceptive acts and practices including misrepresentations of network security and “engager” profiles of women and one count of unfair practices in failure to secure the site). In addition to the usual prohibitions of continued misrepresentation, mandatory security programs, periodic assessments for twenty years and cooperation in notifying victims, judgment of $8.75 million was entered, of which $828,500 was to be paid to the FTC and another $828,500 to thirteen states and the District of Columbia. FTC v. Ruby Corp., No. 1:16-cv-02438, Doc. No. 9, Stipulated Order for Permanent Injunction and Other Equitable Relief (D.D.C. Dec. 14, 2016); see https://www.ftc.gov/news-events/press-releases/2016/12/operatorsashleymadisoncom-settle-ftc-state-charges-resulting. In February 2017, upon settlement, the FTC and the attorney general and director of the Division of Consumer Affairs of New Jersey filed a Section 5(a) and New Jersey Consumer Fraud Act complaint against VIZIO, a manufacturer having sold some 11 million “smart” or internet-connected televisions in the United States with default or subsequently downloaded automated content recognition (“ACR”) “Smart Interactivity” functionality that collected viewing data that VIZIO sold to third parties for audience measurement, advertisement effectiveness analysis, and advertisement targeting. FTC v. VIZIO, Inc., No. 2:17-cv-00758, Doc. No. 1, Complaint for Permanent Injunction and Other Equitable and Monetary Relief (D.N.J. Feb. 6, 2017) (first count for “unfair tracking” without consent, the other two counts for deceptive failure to disclose and deceptive representation of “Smart Interactivity”). In addition to the usual prohibitions of continued misrepresentation, mandatory security programs, and periodic assessments for twenty years, the stipulated order required affirmative consent for data collection, deletion of data collected without consent and the payment by VIZIO of $2.2 million, with $1.5 million to the FTC and $1 million to the New Jersey Division of Consumer Affairs, with $300,000 of that amount suspended. FTC v. VIZIO, Inc., No. 2:17-cv-00758, Doc. No. 9, Stipulated Order for Permanent Injunction and Monetary Relief (D.N.J. Feb. 14, 2017); see https://www.ftc.gov/ news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-new-jersey-settlecharges-it. 8–56

2nd Edition 2018 | MCLE, Inc.

The Federal Trade Commission

§ 8.5

In August 2017, the FTC settled a complaint that Uber Technologies, despite its representation to drivers and users that data was “securely stored within our databases,” failed to provide reasonable security to prevent unauthorized access to personal information in databases Uber stored with a third-party cloud provider, resulting in a breach of 100,000 names and drivers licenses thereby violating the “deceptive practice” prong of FTCA Section 5(a). The defendant was ordered to establish a comprehensive privacy program, make periodic assessments for twenty years, and undertake not to make further misrepresentations. In re Uber Techs., Inc., File No. 1523054, Complaint & Decision and Order (FTC Aug. 15, 2016), available at https://www.ftc.gov/system/filesdocuments/cases/1523054_uber_technologies _complaint.pdf and https://www.ftc.gov/system/files/documents/cases/ 1523054_uber_technologies_decision_and_order.pdf. On September 5, 2017, the FTC announced settlement of an administrative complaint against laptop manufacturer Lenovo under (1) the “deceptive” prong of Section 5(a) for failing to inform of the preinstallation in laptops of a local proxy (“manin-the-middle”) software, VisualDiscovery, that served up partner advertisements by replacing root certifications, creating security vulnerabilities, and (2) the “unfair” prong of Section 5(a) for “unfair pre-installation” and “unfair security practices.” The proposed order, in addition to prohibiting the complained-of misrepresentations and mandating the establishment of a comprehensive software security program and biennial assessments for twenty years, would require “affirmative express consent” for installation of the software, the ability to revoke consent and to opt out of the software’s operations. In re Lenovo (United States), Inc., File No. 1523134, Complaint & Decision and Order (FTC Sept. 5, 2017), available at https://www.ftc.gov/ system/files/documents/cases/1523134_lenovo_united_states_complaint.pdf and https://www.ftc.gov/system/files/documents/cases/1523134_lenovo_united_states _agreement_and_do.pdf; see https://www.ftc.gov/news-events/press-releases/ 2017/09/lenovo-settles-ftc-charges-it-harmed-consumers-preinstalled. The settlement was published for public comment at 82 Fed. Reg. 43013 (Sept. 13, 2017). Practice Note Although the FTCA does not provide a private right of action for violations of Section 5, including the consumer protective prohibition of “unfair or deceptive acts or practices,” many states have enacted “Baby FTC Acts” that do provide such a right. Thus, the Massachusetts Consumer Protection Act, G.L. c. 93A, provides a private right of action for consumers under Section 9 and for those injured in their “business” under Section 11. The First Circuit recently held in McDermott v. Marcus, Errico, Emmer & Brooks, P.C., 775 F.3d 109, 123 (1st Cir. 2015), that a violation of a statute that is an automatic (“per se”) violation of the consumer protection portion of the FTCA is a per se violation of Section 2 of Chapter 93A.


EXHIBIT 8A—Red Flags Rule Guidelines

APPENDIX A TO [16 C.F.R.] PART 681 [(“RED FLAGS RULE”)]—INTERAGENCY GUIDELINES ON IDENTITY THEFT DETECTION, PREVENTION, AND MITIGATION (ADDED BY 72 FED. REG. 63718, 63771 (NOV. 9, 2007), AMENDED 74 FED. REG. 22639, 22646 (MAY 14, 2009).)

Section 681.1 of this part requires each financial institution and creditor that offers or maintains one or more covered accounts, as defined in § 681.1(b)(3) of this part, to develop and provide for the continued administration of a written Program to detect, prevent, and mitigate identity theft in connection with the opening of a covered account or any existing covered account. These guidelines are intended to assist financial institutions and creditors in the formulation and maintenance of a Program that satisfies the requirements of § 681.1 of this part.

I. The Program

In designing its Program, a financial institution or creditor may incorporate, as appropriate, its existing policies, procedures, and other arrangements that control reasonably foreseeable risks to customers or to the safety and soundness of the financial institution or creditor from identity theft.

II. Identifying Relevant Red Flags

(a) Risk Factors. A financial institution or creditor should consider the following factors in identifying relevant Red Flags for covered accounts, as appropriate:

(1) The types of covered accounts it offers or maintains;

(2) The methods it provides to open its covered accounts;

(3) The methods it provides to access its covered accounts; and

(4) Its previous experiences with identity theft.

(b) Sources of Red Flags. Financial institutions and creditors should incorporate relevant Red Flags from sources such as:

(1) Incidents of identity theft that the financial institution or creditor has experienced;

(2) Methods of identity theft that the financial institution or creditor has identified that reflect changes in identity theft risks; and

(3) Applicable supervisory guidance.


(c) Categories of Red Flags. The Program should include relevant Red Flags from the following categories, as appropriate. Examples of Red Flags from each of these categories are appended as Supplement A to this Appendix A.

(1) Alerts, notifications, or other warnings received from consumer reporting agencies or service providers, such as fraud detection services;

(2) The presentation of suspicious documents;

(3) The presentation of suspicious personal identifying information, such as a suspicious address change;

(4) The unusual use of, or other suspicious activity related to, a covered account; and

(5) Notice from customers, victims of identity theft, law enforcement authorities, or other persons regarding possible identity theft in connection with covered accounts held by the financial institution or creditor.

III. Detecting Red Flags

The Program's policies and procedures should address the detection of Red Flags in connection with the opening of covered accounts and existing covered accounts, such as by:

(a) Obtaining identifying information about, and verifying the identity of, a person opening a covered account, for example, using the policies and procedures regarding identification and verification set forth in the Customer Identification Program rules implementing 31 U.S.C. 5318(l) (31 CFR 103.121); and

(b) Authenticating customers, monitoring transactions, and verifying the validity of change of address requests, in the case of existing covered accounts.

IV. Preventing and Mitigating Identity Theft

The Program's policies and procedures should provide for appropriate responses to the Red Flags the financial institution or creditor has detected that are commensurate with the degree of risk posed. In determining an appropriate response, a financial institution or creditor should consider aggravating factors that may heighten the risk of identity theft, such as a data security incident that results in unauthorized access to a customer's account records held by the financial institution, creditor, or third party, or notice that a customer has provided information related to a covered account held by the financial institution or creditor to someone fraudulently claiming to represent the financial institution or creditor or to a fraudulent website. Appropriate responses may include the following:

(a) Monitoring a covered account for evidence of identity theft;

(b) Contacting the customer;


(c) Changing any passwords, security codes, or other security devices that permit access to a covered account;

(d) Reopening a covered account with a new account number;

(e) Not opening a new covered account;

(f) Closing an existing covered account;

(g) Not attempting to collect on a covered account or not selling a covered account to a debt collector;

(h) Notifying law enforcement; or

(i) Determining that no response is warranted under the particular circumstances.

V. Updating the Program

Financial institutions and creditors should update the Program (including the Red Flags determined to be relevant) periodically, to reflect changes in risks to customers or to the safety and soundness of the financial institution or creditor from identity theft, based on factors such as:

(a) The experiences of the financial institution or creditor with identity theft;

(b) Changes in methods of identity theft;

(c) Changes in methods to detect, prevent, and mitigate identity theft;

(d) Changes in the types of accounts that the financial institution or creditor offers or maintains; and

(e) Changes in the business arrangements of the financial institution or creditor, including mergers, acquisitions, alliances, joint ventures, and service provider arrangements.

VI. Methods for Administering the Program

(a) Oversight of Program. Oversight by the board of directors, an appropriate committee of the board, or a designated employee at the level of senior management should include:

(1) Assigning specific responsibility for the Program's implementation;

(2) Reviewing reports prepared by staff regarding compliance by the financial institution or creditor with § 681.1 of this part; and

(3) Approving material changes to the Program as necessary to address changing identity theft risks.


(b) Reports.

(1) In general. Staff of the financial institution or creditor responsible for development, implementation, and administration of its Program should report to the board of directors, an appropriate committee of the board, or a designated employee at the level of senior management, at least annually, on compliance by the financial institution or creditor with § 681.1 of this part.

(2) Contents of report. The report should address material matters related to the Program and evaluate issues such as: the effectiveness of the policies and procedures of the financial institution or creditor in addressing the risk of identity theft in connection with the opening of covered accounts and with respect to existing covered accounts; service provider arrangements; significant incidents involving identity theft and management's response; and recommendations for material changes to the Program.

(c) Oversight of service provider arrangements. Whenever a financial institution or creditor engages a service provider to perform an activity in connection with one or more covered accounts, the financial institution or creditor should take steps to ensure that the activity of the service provider is conducted in accordance with reasonable policies and procedures designed to detect, prevent, and mitigate the risk of identity theft. For example, a financial institution or creditor could require the service provider by contract to have policies and procedures to detect relevant Red Flags that may arise in the performance of the service provider's activities, and either report the Red Flags to the financial institution or creditor, or to take appropriate steps to prevent or mitigate identity theft.

VII. Other Applicable Legal Requirements

Financial institutions and creditors should be mindful of other related legal requirements that may be applicable, such as:

(a) For financial institutions and creditors that are subject to 31 U.S.C. 5318(g), filing a Suspicious Activity Report in accordance with applicable law and regulation;

(b) Implementing any requirements under 15 U.S.C. 1681c-1(h) regarding the circumstances under which credit may be extended when the financial institution or creditor detects a fraud or active duty alert;

(c) Implementing any requirements for furnishers of information to consumer reporting agencies under 15 U.S.C. 1681s-2, for example, to correct or update inaccurate or incomplete information, and to not report information that the furnisher has reasonable cause to believe is inaccurate; and

(d) Complying with the prohibitions in 15 U.S.C. 1681m on the sale, transfer, and placement for collection of certain debts resulting from identity theft.


Supplement A to Appendix A

In addition to incorporating Red Flags from the sources recommended in section II.b. of the Guidelines in Appendix A of this part, each financial institution or creditor may consider incorporating into its Program, whether singly or in combination, Red Flags from the following illustrative examples in connection with covered accounts:

Alerts, Notifications or Warnings from a Consumer Reporting Agency

1. A fraud or active duty alert is included with a consumer report.

2. A consumer reporting agency provides a notice of credit freeze in response to a request for a consumer report.

3. A consumer reporting agency provides a notice of address discrepancy, as defined in § 641.1(b) of this part.

4. A consumer report indicates a pattern of activity that is inconsistent with the history and usual pattern of activity of an applicant or customer, such as:

a. A recent and significant increase in the volume of inquiries;

b. An unusual number of recently established credit relationships;

c. A material change in the use of credit, especially with respect to recently established credit relationships; or

d. An account that was closed for cause or identified for abuse of account privileges by a financial institution or creditor.

Suspicious Documents

5. Documents provided for identification appear to have been altered or forged.

6. The photograph or physical description on the identification is not consistent with the appearance of the applicant or customer presenting the identification.

7. Other information on the identification is not consistent with information provided by the person opening a new covered account or customer presenting the identification.

8. Other information on the identification is not consistent with readily accessible information that is on file with the financial institution or creditor, such as a signature card or a recent check.

9. An application appears to have been altered or forged, or gives the appearance of having been destroyed and reassembled.


Suspicious Personal Identifying Information

10. Personal identifying information provided is inconsistent when compared against external information sources used by the financial institution or creditor. For example:

a. The address does not match any address in the consumer report; or

b. The Social Security Number (SSN) has not been issued, or is listed on the Social Security Administration's Death Master File.

11. Personal identifying information provided by the customer is not consistent with other personal identifying information provided by the customer. For example, there is a lack of correlation between the SSN range and date of birth.

12. Personal identifying information provided is associated with known fraudulent activity as indicated by internal or third-party sources used by the financial institution or creditor. For example:

a. The address on an application is the same as the address provided on a fraudulent application; or

b. The phone number on an application is the same as the number provided on a fraudulent application.

13. Personal identifying information provided is of a type commonly associated with fraudulent activity as indicated by internal or third-party sources used by the financial institution or creditor. For example:

a. The address on an application is fictitious, a mail drop, or a prison; or

b. The phone number is invalid, or is associated with a pager or answering service.

14. The SSN provided is the same as that submitted by other persons opening an account or other customers.

15. The address or telephone number provided is the same as or similar to the address or telephone number submitted by an unusually large number of other persons opening accounts or by other customers.

16. The person opening the covered account or the customer fails to provide all required personal identifying information on an application or in response to notification that the application is incomplete.

17. Personal identifying information provided is not consistent with personal identifying information that is on file with the financial institution or creditor.

18. For financial institutions and creditors that use challenge questions, the person opening the covered account or the customer cannot provide authenticating information beyond that which generally would be available from a wallet or consumer report.


Unusual Use of, or Suspicious Activity Related to, the Covered Account

19. Shortly following the notice of a change of address for a covered account, the institution or creditor receives a request for a new, additional, or replacement card or a cell phone, or for the addition of authorized users on the account.

20. A new revolving credit account is used in a manner commonly associated with known patterns of fraud. For example:

a. The majority of available credit is used for cash advances or merchandise that is easily convertible to cash (e.g., electronics equipment or jewelry); or

b. The customer fails to make the first payment or makes an initial payment but no subsequent payments.

21. A covered account is used in a manner that is not consistent with established patterns of activity on the account. There is, for example:

a. Nonpayment when there is no history of late or missed payments;

b. A material increase in the use of available credit;

c. A material change in purchasing or spending patterns;

d. A material change in electronic fund transfer patterns in connection with a deposit account; or

e. A material change in telephone call patterns in connection with a cellular phone account.

22. A covered account that has been inactive for a reasonably lengthy period of time is used (taking into consideration the type of account, the expected pattern of usage and other relevant factors).

23. Mail sent to the customer is returned repeatedly as undeliverable although transactions continue to be conducted in connection with the customer's covered account.

24. The financial institution or creditor is notified that the customer is not receiving paper account statements.

25. The financial institution or creditor is notified of unauthorized charges or transactions in connection with a customer's covered account.

Notice from Customers, Victims of Identity Theft, Law Enforcement Authorities, or Other Persons Regarding Possible Identity Theft in Connection With Covered Accounts Held by the Financial Institution or Creditor

26. The financial institution or creditor is notified by a customer, a victim of identity theft, a law enforcement authority, or any other person that it has opened a fraudulent account for a person engaged in identity theft.
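Several of the Supplement A examples lend themselves to automated screening. The following Python sketch is a hypothetical illustration only, not part of the Rule or its guidance: it encodes three of the examples (10.a, 12.a, and 14) as checks over an assumed application record. The field names and data structures are assumptions chosen for illustration.

    # Hypothetical sketch: encoding a few Supplement A examples as checks.
    from dataclasses import dataclass

    @dataclass
    class Application:
        ssn: str
        address: str
        consumer_report_addresses: list  # addresses from the consumer report
        known_fraud_addresses: set       # addresses from prior fraud cases

    def red_flags(app: Application, ssns_on_file: set) -> list:
        flags = []
        # Example 10.a: address does not match any address in the consumer report
        if app.address not in app.consumer_report_addresses:
            flags.append("address not in consumer report (example 10.a)")
        # Example 12.a: address matches one supplied on a fraudulent application
        if app.address in app.known_fraud_addresses:
            flags.append("address linked to known fraud (example 12.a)")
        # Example 14: SSN is the same as that submitted by other persons
        if app.ssn in ssns_on_file:
            flags.append("duplicate SSN (example 14)")
        return flags

Checks like these would feed the "appropriate responses" step of section IV; the Rule itself leaves the choice of detection methods to each institution's risk assessment.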


EXHIBIT 8B—GLBA Safeguards Rule Information Protection Program Elements

[16 C.F.R.] § 314.4 ELEMENTS

(ADDED BY 67 FED. REG. 36484, 36494 (MAY 23, 2002))

In order to develop, implement, and maintain your information security program, you shall:

(a) Designate an employee or employees to coordinate your information security program.

(b) Identify reasonably foreseeable internal and external risks to the security, confidentiality, and integrity of customer information that could result in the unauthorized disclosure, misuse, alteration, destruction or other compromise of such information, and assess the sufficiency of any safeguards in place to control these risks. At a minimum, such a risk assessment should include consideration of risks in each relevant area of your operations, including:

(1) Employee training and management;

(2) Information systems, including network and software design, as well as information processing, storage, transmission and disposal; and

(3) Detecting, preventing and responding to attacks, intrusions, or other systems failures.

(c) Design and implement information safeguards to control the risks you identify through risk assessment, and regularly test or otherwise monitor the effectiveness of the safeguards' key controls, systems, and procedures.

(d) Oversee service providers, by:

(1) Taking reasonable steps to select and retain service providers that are capable of maintaining appropriate safeguards for the customer information at issue; and

(2) Requiring your service providers by contract to implement and maintain such safeguards.

(e) Evaluate and adjust your information security program in light of the results of the testing and monitoring required by paragraph (c) of this section; any material changes to your operations or business arrangements; or any other circumstances that you know or have reason to know may have a material impact on your information security program.
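Because § 314.4 enumerates five discrete program elements, a compliance team may find it convenient to track them as a simple checklist. The Python sketch below is a hypothetical illustration; the structure and names are assumptions, not anything prescribed by the Safeguards Rule.

    # Hypothetical checklist mapping the five Safeguards Rule elements,
    # 16 C.F.R. § 314.4(a)-(e), to tracked compliance tasks.
    SAFEGUARDS_ELEMENTS = {
        "designate_program_coordinator": False,   # § 314.4(a)
        "risk_assessment_completed": False,       # § 314.4(b)
        "safeguards_designed_and_tested": False,  # § 314.4(c)
        "service_providers_overseen": False,      # § 314.4(d)
        "program_evaluated_and_adjusted": False,  # § 314.4(e)
    }

    def open_items(elements: dict) -> list:
        """Return the element keys not yet marked complete."""
        return [name for name, done in elements.items() if not done]

    print(open_items(SAFEGUARDS_ELEMENTS))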


EXHIBIT 8C—FTC Health Information Breach Notification Rule

[16 C.F.R.] PART 318 – HEALTH BREACH NOTIFICATION RULE

74 FED. REG. 42962, 42980-81 (AUG. 25, 2009).

Sec.
318.1 Purpose and scope.
318.2 Definitions.
318.3 Breach notification.
318.4 Timeliness of notification.
318.5 Method of notice.
318.6 Content of notice to individuals.
318.7 Enforcement.
318.8 Effective date.
318.9 Sunset.

Authority: Pub. L. No. 111-5, 123 Stat. 115 (2009).

§ 318.1 Purpose and scope.

(a) This Part, which shall be called the "Health Breach Notification Rule," implements section 13407 of the American Recovery and Reinvestment Act of 2009. It applies to foreign and domestic vendors of personal health records, PHR related entities, and third party service providers, irrespective of any jurisdictional tests in the Federal Trade Commission (FTC) Act, that maintain information of U.S. citizens or residents. It does not apply to HIPAA-covered entities, or to any other entity to the extent that it engages in activities as a business associate of a HIPAA-covered entity.

(b) This Part preempts state law as set forth in section 13421 of the American Recovery and Reinvestment Act of 2009.

§ 318.2 Definitions.

(a) Breach of security means, with respect to unsecured PHR identifiable health information of an individual in a personal health record, acquisition of such information without the authorization of the individual. Unauthorized acquisition will be presumed to include unauthorized access to unsecured PHR identifiable health information unless the vendor of personal health records, PHR related entity, or third party service provider that experienced the breach has reliable evidence showing that there has not been, or could not reasonably have been, unauthorized acquisition of such information.

(b) Business associate means a business associate under the Health Insurance Portability and Accountability Act, Pub. L. No. 104-191, 110 Stat. 1936, as defined in 45 C.F.R. § 160.103.


(c) HIPAA-covered entity means a covered entity under the Health Insurance Portability and Accountability Act, Pub. L. No. 104-191, 110 Stat. 1936, as defined in 45 C.F.R. § 160.103.

(d) Personal health record means an electronic record of PHR identifiable health information on an individual that can be drawn from multiple sources and that is managed, shared, and controlled by or primarily for the individual.

(e) PHR identifiable health information means "individually identifiable health information," as defined in section 1171(6) of the Social Security Act (42 U.S.C. 1320d(6)), and, with respect to an individual, information:

(1) That is provided by or on behalf of the individual; and

(2) That identifies the individual or with respect to which there is a reasonable basis to believe that the information can be used to identify the individual.

(f) PHR related entity means an entity, other than a HIPAA-covered entity or an entity to the extent that it engages in activities as a business associate of a HIPAA-covered entity, that:

(1) Offers products or services through the website of a vendor of personal health records;

(2) Offers products or services through the websites of HIPAA-covered entities that offer individuals personal health records; or

(3) Accesses information in a personal health record or sends information to a personal health record.

(g) State means any of the several States, the District of Columbia, Puerto Rico, the Virgin Islands, Guam, American Samoa and the Northern Mariana Islands.

(h) Third party service provider means an entity that:

(1) Provides services to a vendor of personal health records in connection with the offering or maintenance of a personal health record or to a PHR related entity in connection with a product or service offered by that entity; and

(2) Accesses, maintains, retains, modifies, records, stores, destroys, or otherwise holds, uses, or discloses unsecured PHR identifiable health information as a result of such services.

(i) Unsecured means PHR identifiable health information that is not protected through the use of a technology or methodology specified by the Secretary of Health and Human Services in the guidance issued under section 13402(h)(2) of the American Recovery and Reinvestment Act of 2009.


(j) Vendor of personal health records means an entity, other than a HIPAA-covered entity or an entity to the extent that it engages in activities as a business associate of a HIPAA-covered entity, that offers or maintains a personal health record.

§ 318.3 Breach notification requirement.

(a) In general. In accordance with §§ 318.4, 318.5, and 318.6, each vendor of personal health records, following the discovery of a breach of security of unsecured PHR identifiable health information that is in a personal health record maintained or offered by such vendor, and each PHR related entity, following the discovery of a breach of security of such information that is obtained through a product or service provided by such entity, shall:

(1) Notify each individual who is a citizen or resident of the United States whose unsecured PHR identifiable health information was acquired by an unauthorized person as a result of such breach of security; and

(2) Notify the Federal Trade Commission.

(b) Third party service providers. A third party service provider shall, following the discovery of a breach of security, provide notice of the breach to an official designated in a written contract by the vendor of personal health records or the PHR related entity to receive such notices or, if such a designation is not made, to a senior official at the vendor of personal health records or PHR related entity to which it provides services, and obtain acknowledgment from such official that such notice was received. Such notification shall include the identification of each customer of the vendor of personal health records or PHR related entity whose unsecured PHR identifiable health information has been, or is reasonably believed to have been, acquired during such breach. For purposes of ensuring implementation of this requirement, vendors of personal health records and PHR related entities shall notify third party service providers of their status as vendors of personal health records or PHR related entities subject to this Part.

(c) Breaches treated as discovered. A breach of security shall be treated as discovered as of the first day on which such breach is known or reasonably should have been known to the vendor of personal health records, PHR related entity, or third party service provider, respectively. Such vendor, entity, or third party service provider shall be deemed to have knowledge of a breach if such breach is known, or reasonably should have been known, to any person, other than the person committing the breach, who is an employee, officer, or other agent of such vendor of personal health records, PHR related entity, or third party service provider.

§ 318.4 Timeliness of notification.

(a) In general. Except as provided in paragraph (c) of this section and § 318.5(c), all notifications required under §§ 318.3(a)(1), 318.3(b), and 318.5(b) shall be sent without unreasonable delay and in no case later than 60 calendar days after the discovery of a breach of security.


(b) Burden of proof. The vendor of personal health records, PHR related entity, and third party service provider involved shall have the burden of demonstrating that all notifications were made as required under this Part, including evidence demonstrating the necessity of any delay.

(c) Law enforcement exception. If a law enforcement official determines that a notification, notice, or posting required under this Part would impede a criminal investigation or cause damage to national security, such notification, notice, or posting shall be delayed. This paragraph shall be implemented in the same manner as provided under 45 CFR 164.528(a)(2), in the case of a disclosure covered under such section.

§ 318.5 Methods of notice.

(a) Individual notice. A vendor of personal health records or PHR related entity that discovers a breach of security shall provide notice of such breach to an individual promptly, as described in § 318.4, and in the following form:

(1) Written notice, by first-class mail to the individual at the last known address of the individual, or by email, if the individual is given a clear, conspicuous, and reasonable opportunity to receive notification by first-class mail, and the individual does not exercise that choice. If the individual is deceased, the vendor of personal health records or PHR related entity that discovered the breach must provide such notice to the next of kin of the individual if the individual had provided contact information for his or her next of kin, along with authorization to contact them. The notice may be provided in one or more mailings as information is available.

(2) If, after making reasonable efforts to contact all individuals to whom notice is required under § 318.3(a), through the means provided in paragraph (a)(1) of this section, the vendor of personal health records or PHR related entity finds that contact information for ten or more individuals is insufficient or out-of-date, the vendor of personal health records or PHR related entity shall provide substitute notice, which shall be reasonably calculated to reach the individuals affected by the breach, in the following form:

(i) Through a conspicuous posting for a period of 90 days on the home page of its website; or

(ii) In major print or broadcast media, including major media in geographic areas where the individuals affected by the breach likely reside. Such a notice in media or web posting shall include a toll-free phone number, which shall remain active for at least 90 days, where an individual can learn whether or not the individual's unsecured PHR identifiable health information may be included in the breach.

(3) In any case deemed by the vendor of personal health records or PHR related entity to require urgency because of possible imminent misuse of unsecured PHR identifiable health information, that entity may provide information to individuals by telephone or other means, as appropriate, in addition to notice provided under paragraph (a)(1) of this section.

(b) Notice to media. A vendor of personal health records or PHR related entity shall provide notice to prominent media outlets serving a State or jurisdiction, following the discovery of a breach of security, if the unsecured PHR identifiable health information of 500 or more residents of such State or jurisdiction is, or is reasonably believed to have been, acquired during such breach.

(c) Notice to FTC. Vendors of personal health records and PHR related entities shall provide notice to the Federal Trade Commission following the discovery of a breach of security. If the breach involves the unsecured PHR identifiable health information of 500 or more individuals, then such notice shall be provided as soon as possible and in no case later than ten business days following the date of discovery of the breach. If the breach involves the unsecured PHR identifiable health information of fewer than 500 individuals, the vendor of personal health records or PHR related entity may maintain a log of any such breach, and submit such a log annually to the Federal Trade Commission no later than 60 calendar days following the end of the calendar year, documenting breaches from the preceding calendar year. All notices pursuant to this paragraph shall be provided according to instructions at the Federal Trade Commission's website.

§ 318.6 Content of notice.

Regardless of the method by which notice is provided to individuals under § 318.5 of this Part, notice of a breach of security shall be in plain language and include, to the extent possible, the following:

(a) A brief description of what happened, including the date of the breach and the date of the discovery of the breach, if known;

(b) A description of the types of unsecured PHR identifiable health information that were involved in the breach (such as full name, Social Security number, date of birth, home address, account number, or disability code);

(c) Steps individuals should take to protect themselves from potential harm resulting from the breach;

(d) A brief description of what the entity that suffered the breach is doing to investigate the breach, to mitigate harm, and to protect against any further breaches; and

(e) Contact procedures for individuals to ask questions or learn additional information, which shall include a toll-free telephone number, an email address, website, or postal address.


§ 318.7 Enforcement.

A violation of this Part shall be treated as an unfair or deceptive act or practice in violation of a regulation under § 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)) regarding unfair or deceptive acts or practices.

§ 318.8 Effective date.

This Part shall apply to breaches of security that are discovered on or after September 24, 2009.

§ 318.9 Sunset.

If new legislation is enacted establishing requirements for notification in the case of a breach of security that apply to entities covered by this Part, the provisions of this Part shall not apply to breaches of security discovered on or after the effective date of regulations implementing such legislation.
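The Rule's notice deadlines reduce to a small amount of decision logic. The following Python sketch is a hypothetical summary of §§ 318.4(a) and 318.5(c), offered for orientation only; it is not an official implementation, and the regulation text controls.

    # Hypothetical summary of the Health Breach Notification Rule deadlines.
    def individual_notice_deadline() -> str:
        # Sec. 318.4(a): without unreasonable delay, and in no case later
        # than 60 calendar days after discovery of the breach.
        return "without unreasonable delay; 60 calendar days at the latest"

    def ftc_notice_deadline(individuals_affected: int) -> str:
        if individuals_affected >= 500:
            # Sec. 318.5(c): as soon as possible, and no later than ten
            # business days following the date of discovery.
            return "as soon as possible; ten business days at the latest"
        # Fewer than 500: maintain a log and submit it annually, no later
        # than 60 calendar days after the end of the calendar year.
        return "log the breach; submit annual log within 60 days of year end"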


CHAPTER 9

Survey of State Data Security and Privacy Law

Stephen Y. Chow, Esq.
Burns & Levinson LLP, Boston

§ 9.1 State Law Foundations
§ 9.1.1 Federal and State Constitutions
§ 9.1.2 State Common Law

§ 9.2 State Statutes
§ 9.2.1 State Data Security Breach Statutes
§ 9.2.2 State Personal Data Security Statutes
(a) Driver's License Information
(b) Social Security Numbers
(c) Security and Disposal of Records
(d) Internet Collection of Personal Information
§ 9.2.3 State Laws Against Internet Intrusion
(a) State Antispam Legislation
(b) State Antispyware and Antiphishing Statutes
(c) Employer and School Coercion of Access to Social Media
(d) Cyber-Bullying
(e) "Revenge Porn"

§ 9.3 Data Security and Privacy Litigation

Scope Note
This chapter provides the state law foundations for statutes pertaining to data security and privacy, and identifies such laws under the categories of state statutes on data security breach, state statutes on breaches of personal data security, and state laws prohibiting Internet intrusion.

§ 9.1 STATE LAW FOUNDATIONS

American law protecting the privacy of information is founded in federal and state constitutional protections against unreasonable searches and seizures; in certain state constitutions and general laws, the protection of "privacy" also reflects the twentieth-century development of protected values—largely commercial—in information and identity. Such protections are reflected in the common law and are described in the Restatement (Second) of Torts.


§ 9.1.1 Federal and State Constitutions

The Fourth Amendment of the U.S. Constitution provides:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

U.S. Const. amend. IV. Information in "private papers" was recognized by the Court in Boyd v. United States, 116 U.S. 616 (1886) (see NAACP v. Alabama, 357 U.S. 449 (1958) (membership list)), as protected by the Fourth and Fifth Amendments, but it took nearly a century after the invention of the telephone before a majority of the Court, in the 1960s, applied the Fourth Amendment to protect a "reasonable expectation of privacy" in telephonic communications outside a person's home. Katz v. United States, 389 U.S. 347, 361–62 (1967).

The Massachusetts Constitution, like many other state constitutions, provides protection similar to the Fourth Amendment:

Every subject has a right to be secure from all unreasonable searches, and seizures, of his person, his houses, his papers, and all his possessions. All warrants, therefore, are contrary to this right, if the cause or foundation of them be not previously supported by oath or affirmation; and if the order in the warrant to a civil officer, to make search in suspected places, or to arrest one or more suspected persons, or to seize their property, be not accompanied with a special designation of the persons or objects of search, arrest, or seizure; and no warrant ought to be issued but in cases, and with the formalities prescribed by the laws.

Mass. Const. art. XIV. However, California's constitution specifically provides protection for "privacy":

Section 1. All people are by nature free and independent and have inalienable rights. Among these are enjoying and defending life and liberty, acquiring, possessing, and protecting property, and pursuing and obtaining safety, happiness, and privacy.

Cal. Const. art. I, § 1 (emphasis added).

Massachusetts has a general statute protecting privacy:

A person shall have a right against unreasonable, substantial or serious interference with his privacy. The superior court shall have jurisdiction in equity to enforce such right and in connection therewith to award damages.

G.L. c. 214, § 1B, added by 1973 Mass. Acts c. 941, reenacted by 1974 Mass. Acts c. 193, § 1.

§ 9.1.2 State Common Law

Except in California, where privacy is reflected in the state constitution, state protection of personal space from intrusion generally has been limited to protection against governmental intrusion under federal constitutional cases. By the last third of the twentieth century, however, the right to privacy was recognized in the common law of the large majority of states in four causes of action identified by torts scholar and reporter for the Restatement (Second) of Torts, Professor William Prosser. See William Prosser, "Privacy," 48 Calif. L. Rev. 383 (1960); William Prosser, Law of Torts 831–32 (3d ed. 1964). The Restatement (Second) of Torts sets forth the following:

The right of privacy is invaded by (a) unreasonable intrusion upon the seclusion of another; . . . (b) appropriation of the other's name or likeness; . . . (c) unreasonable publicity given to the other's private life; . . . or (d) publicity that unreasonably places the other in a false light before the public. . . .

Restatement (Second) of Torts § 652A (1977) (emphasis added). That this provision was included in the restatement division entitled "Injurious Falsehoods" reflects the origins of the common law of privacy in misappropriation cases in which false associations were suggested between a person and a product through the use of that person's name or likeness. The protected interests overlap with those of trademark law and the law of publicity.

Practice Note
Roberson v. Rochester Folding-Box Co., 64 N.E. 442 (N.Y. 1902), declined to recognize a common law right of privacy protecting against the unconsented-to use of a young woman's likeness on packaging for flour. The New York legislature responded by enacting Section 51 of the New York Civil Rights Law: "Any person whose name, portrait or picture is used within this state for . . . the purposes of trade without the written consent [of that person] may maintain an equitable action."

The 1995 Restatement (Third) of Unfair Competition addressed the state common law of trademarks, Restatement (Third) of Unfair Competition ch. 3 (1995); trade secrets, Restatement (Third) of Unfair Competition ch. 4, topic 2; and, outlined here, the right of publicity, Restatement (Third) of Unfair Competition ch. 4, topic 3:

§ 46 Appropriation of the Commercial Value of a Person's Identity: the Right of Publicity

One who appropriates the commercial value of a person's identity by using without consent the person's name, likeness, or other indicia of identity for purposes of trade is subject to liability for the relief appropriate under the rules stated in §§ 48 and 49.

Restatement (Third) of Unfair Competition § 46 (emphasis added).

§ 47 Use for Purposes of Trade

The name, likeness, and other indicia of a person's identity are used "for purposes of trade" under the rule stated in § 46 if they are used in advertising the user's goods or services, or are placed on merchandise marketed by the user, or are used in connection with services rendered by the user. However, use "for purposes of trade" does not ordinarily include the use of a person's identity in news reporting, commentary, entertainment, works of fiction or nonfiction, or in advertising that is incidental to such uses.

Restatement (Third) of Unfair Competition § 47.

The right against commercial appropriation may be considered an extension of the common law of privacy. Indeed, the right against infringement of a trademark may be considered the protection of the privacy of a commercial enterprise against misappropriation of its identity, and the right against misappropriation of trade secrets the protection of the privacy of a commercial enterprise against unreasonable intrusion.

Practice Note
The concepts of "privacy," "confidentiality," and "security" have often overlapped, particularly in the employment context. Thus, the concept of "trade" secrets coexisted with other concepts of proprietary commercial information from early on:

[C]ourts of equity will restrain a party from making a disclosure of secrets communicated to him in the course of a confidential employment; and it matters not, in such cases, whether the secrets be secrets of trade or secrets of title, or any other secrets of the party important to his interests.

Peabody v. Norfolk, 98 Mass. 452, 459 (1868) (emphasis added) (quoting 2 Story, Commentaries on Equity and Jurisprudence, as Administered in England and America § 952 (Fred B. Rothman & Co. 13th ed. 1988)).

It is the economic value of personal "identity" that has spurred "identity theft" and the response of state legislatures to the "misappropriation"-type invasion of privacy, surveyed in the next section. However, there has also been concern for "intrusion"-type invasion of privacy, for example, in the antispam and antimalware laws, as well as in restrictions on employer and school coercion of disclosure of social media access information. The common law cause of action recognized for intrusion is rather limited:

One who intentionally intrudes, physically or otherwise, upon the solitude or seclusion of another or his private affairs or concerns, is subject to liability to the other for invasion of his privacy, if the intrusion would be highly offensive to a reasonable person.

Restatement (Second) of Torts § 652B (1977) (emphasis added). Surely the potential economic consequences of identity theft are "highly offensive to a reasonable person." However, the commercial collection of personal information to facilitate account servicing and marketing also enables such theft but is not clearly an unconsented intrusion into "private affairs or concerns." State legislatures have balanced the commercial interest in this collection of personal information against the interests of the individuals.

§ 9.2 STATE STATUTES

Three types of state statutes are surveyed here:

• statutes requiring notification of breaches of the security of personal information;

• statutes requiring protection of such personal information; and

• statutes that specifically protect against intrusion into personal computers to obtain personal information—by "phishing" or use of "spyware."

The breach notification statutes are the most prevalent; they relate specifically to the compromise of credit card information and involve the credit report freezes provided under the Fair Credit Reporting Act and subsequent legislation, such as the Fair and Accurate Credit Transactions Act of 2003 (FACTA). In most cases, the California legislation will be discussed first: it tends to be the earliest and most prescriptive legislation, and, because it applies to the largest population, it is the effective "floor," or baseline, for compliance.

Legislative History
Fair Credit Reporting Act, 15 U.S.C. §§ 1681–1681v, added by Act of Oct. 26, 1970, Pub. L. No. 91-508, Title VI, 84 Stat. 1127, adding to the Truth in Lending Act, Pub. L. No. 90-321, 82 Stat. 146 (May 29, 1968); amended by Act of Oct. 27, 1992, Pub. L. No. 102-537, § 2(b), 106 Stat. 3531; amended by Consumer Credit Reform Act of 1996, Pub. L. No. 104-208, Div. A, Title II, Subtitle D, Ch. 1, 110 Stat. 3009–426 (Sept. 30, 1996); Consumer Reporting Employment Clarification Act of 1998, Pub. L. No. 105-347, 112 Stat. 3208 (Nov. 2, 1998); amended by Fair and Accurate Credit Transactions Act of 2003, Pub. L. No. 108-159, 117 Stat. 1952 (Dec. 4, 2003).

§ 9.2.1 State Data Security Breach Statutes

Although states have long had statutes expressly making illegal or criminal the fraudulent use of another's personal information, "identity theft" became an increased concern with the storage of large amounts of credit card and other account information by retailers and businesses dealing with consumers, and with well-publicized breaches of the secure treatment of that information. Many states have enacted criminal statutes against identity theft; the overwhelming majority have enacted remedial statutes.

Even before FACTA provided for breach reporting and remediation, in 2002 the California legislature enacted Cal. [Civ.] Code § 1798.82 (disclosure of breach of security by a business maintaining computerized data that includes personal information):

(a) A person or business that conducts business in California, and that owns or licenses computerized data that includes personal information, shall disclose a breach of the security of the system following discovery or notification of the breach in the security of the data to a resident of California (1) whose unencrypted personal information was, or is reasonably believed to have been, acquired by an unauthorized person, or (2) whose encrypted personal information was, or is reasonably believed to have been, acquired by an unauthorized person and the encryption key or security credential was, or is reasonably believed to have been, acquired by an unauthorized person and the person or business that owns or licenses the encrypted information has a reasonable belief that the encryption key or security credential could render that personal information readable or useable. The disclosure shall be made in the most expedient time possible and without unreasonable delay, consistent with the legitimate needs of law enforcement, as provided in subdivision (c), or any measures necessary to determine the scope of the breach and restore the reasonable integrity of the data system.

(b) A person or business that maintains computerized data that includes personal information that the person or business does not own shall notify the owner or licensee of the information of the breach of the security of the data immediately following discovery, if the personal information was, or is reasonably believed to have been, acquired by an unauthorized person.

(c) The notification required by this section may be delayed if a law enforcement agency determines that the notification will impede a criminal investigation. The notification required by this section shall be made promptly after the law enforcement agency determines that it will not compromise the investigation.


(d) A person or business that is required to issue a security breach notification pursuant to this section shall meet all of the following requirements:

(1) The security breach notification shall be written in plain language, shall be titled "Notice of Data Breach," and shall present the information described in paragraph (2) under the following headings: "What Happened," "What Information Was Involved," "What We Are Doing," "What You Can Do," and "For More Information." Additional information may be provided as a supplement to the notice.

(A) The format of the notice shall be designed to call attention to the nature and significance of the information it contains.

(B) The title and headings in the notice shall be clearly and conspicuously displayed.

(C) The text of the notice and any other notice provided pursuant to this section shall be no smaller than 10-point type.

(D) For a written notice described in paragraph (1) of subdivision (j), use of the model security breach notification form prescribed below or use of the headings described in this paragraph with the information described in paragraph (2), written in plain language, shall be deemed to be in compliance with this subdivision.

[Form]

(E) For an electronic notice described in paragraph (2) of subdivision (j), use of the headings described in this paragraph with the information described in paragraph (2), written in plain language, shall be deemed to be in compliance with this subdivision.

(2) The security breach notification described in paragraph (1) shall include, at a minimum, the following information:

(A) The name and contact information of the reporting person or business subject to this section.

(B) A list of the types of personal information that were or are reasonably believed to have been the subject of a breach.

(C) If the information is possible to determine at the time the notice is provided, then any of the following: (i) the date of the breach, (ii) the estimated date of the breach, or (iii) the date range within which the breach occurred. The notification shall also include the date of the notice.


(D) Whether notification was delayed as a result of a law enforcement investigation, if that information is possible to determine at the time the notice is provided.

(E) A general description of the breach incident, if that information is possible to determine at the time the notice is provided.

(F) The toll-free telephone numbers and addresses of the major credit reporting agencies if the breach exposed a social security number or a driver's license or California identification card number.

(G) If the person or business providing the notification was the source of the breach, an offer to provide appropriate identity theft prevention and mitigation services, if any, shall be provided at no cost to the affected person for not less than 12 months, along with all information necessary to take advantage of the offer to any person whose information was or may have been breached if the breach exposed or may have exposed personal information defined in subparagraphs (A) and (B) of paragraph (1) of subdivision (h).

(3) At the discretion of the person or business, the security breach notification may also include any of the following:

(A) Information about what the person or business has done to protect individuals whose information has been breached.

(B) Advice on steps that the person whose information has been breached may take to protect himself or herself.

(e) [Health Insurance Portability and Accountability Act of 1996 (42 U.S.C. § 1320d et seq.), HIPAA exception].

(f) A person or business that is required to issue a security breach notification pursuant to this section to more than 500 California residents as a result of a single breach of the security system shall electronically submit a single sample copy of that security breach notification, excluding any personally identifiable information, to the Attorney General. A single sample copy of a security breach notification shall not be deemed to be within subdivision (f) of Section 6254 of the Government Code.

(g) For purposes of this section, "breach of the security of the system" means unauthorized acquisition of computerized data that compromises the security, confidentiality, or integrity of personal information maintained by the person or business. Good faith acquisition of personal information by an employee or agent of the person or business for the purposes of the person or business is not a breach of the security of the system, provided that the personal information is not used or subject to further unauthorized disclosure.

(h) For purposes of this section, "personal information" means either of the following:

(1) An individual's first name or first initial and last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted:

(A) Social security number.

(B) Driver's license number or California Identification Card number.

(C) Account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual's financial account.

(D) Medical information.

(E) Health insurance information.

(F) Information or data collected through the use or operation of an automated license plate recognition system, as defined in Section 1798.90.5.

(2) A user name or email address, in combination with a password or security question and answer that would permit access to an online account.

(i) (1) For purposes of this section, "personal information" does not include publicly available information that is lawfully made available to the general public from federal, state, or local government records.

(2) For purposes of this section, "medical information" means any information regarding an individual's medical history, mental or physical condition, or medical treatment or diagnosis by a health care professional.

(3) For purposes of this section, "health insurance information" means an individual's health insurance policy number or subscriber identification number, any unique identifier used by a health insurer to identify the individual, or any information in an individual's application and claims history, including any appeals records.

(j) For purposes of this section, "notice" may be provided by one of the following methods:

(1) Written notice.


(2) Electronic notice, if the notice provided is consistent with the provisions regarding electronic records and signatures set forth in Section 7001 of Title 15 of the United States Code.

(3) Substitute notice, if the person or business demonstrates that the cost of providing notice would exceed two hundred fifty thousand dollars ($250,000), or that the affected class of subject persons to be notified exceeds 500,000, or the person or business does not have sufficient contact information. Substitute notice shall consist of all of the following:

(A) E-mail notice when the person or business has an e-mail address for the subject persons.

(B) Conspicuous posting of the notice on the Web site page of the person or business, if the person or business maintains one.

(C) Notification to major statewide media.

(4) In the case of a breach of the security of the system involving personal information defined in paragraph (2) of subdivision (h) for an online account, and no other personal information defined in paragraph (1) of subdivision (h), the person or business may comply with this section by providing the security breach notification in electronic or other form that directs the person whose personal information has been breached promptly to change his or her password and security question or answer, as applicable, or to take other steps appropriate to protect the online account with the person or business and all other online accounts for which the person whose personal information has been breached uses the same user name or email address and password or security question or answer.

(5) In the case of a breach of the security of the system involving personal information defined in paragraph (2) of subdivision (h) for login credentials of an email account furnished by the person or business, the person or business shall not comply with this section by providing the security breach notification to that email address, but may, instead, comply with this section by providing notice by another method described in this subdivision or by clear and conspicuous notice delivered to the resident online when the resident is connected to the online account from an Internet Protocol address or online location from which the person or business knows the resident customarily accesses the account.

(k) For purposes of this section, "encryption key" and "security credential" mean the confidential key or process designed to render data useable, readable, and decipherable.


(l) Notwithstanding subdivision (j), a person or business that maintains its own notification procedures as part of an information security policy for the treatment of personal information and is otherwise consistent with the timing requirements of this part, shall be deemed to be in compliance with the notification requirements of this section if the person or business notifies subject persons in accordance with its policies in the event of a breach of security of the system.

Cal. [Civ.] Code § 1798.82 (2017) (emphasis added), added by 2002 Cal. Stat. 915, § 4, amended by 2007 Cal. Stat. 699, § 6, 2011 Cal. Stat. 197, § 2, 2013 Cal. Stat. 396, § 2, 2014 Cal. Stat. 875, § 2, 2015 Cal. Stat. 522, § 2, 532, § 2.3, 2016 Cal. Stat. 86, § 21, 337, § 2. Compare companion Cal. [Civ.] Code § 1798.29 (2015) (agency breach), added by 2002 Cal. Stat. 915, § 2, 2003 Cal. Stat. 1054, § 2, amended by 2007 Cal. Stat. 699, § 4, 2011 Cal. Stat. 197, § 1, 2013 Cal. Stat. 395, § 1, 396, § 1.5, 2015 Cal. Stat. 522, § 1, 532, § 1, 543, § 1.3, 2016 Cal. Stat. 86, § 20, 337, § 1.

Forty-seven other states and the District of Columbia have notification of breach laws substantially similar to that of California, generally with less detail and with some departures from uniformity. In part this is because the California statute was updated in 2016 to provide more specification for the notice and to fix the encryption "safe harbor" so that it does not excuse notification for a breach of encrypted information if the encryption key is also compromised.

Practice Note
Only two states do not have a breach notification law, but they provide some protection for identity theft. Alabama allows for the blocking of consumer credit reporting. Ala. Code § 13A-8-200 (2017), added by 2001 Ala. Acts 312, § 11. It also allows for reissue of state identification documents "used to perpetrate a crime." Ala. Code § 13A-8-201 (2017), added by 2001 Ala. Acts 312, § 12. South Dakota allows for security freezes for victims of identity theft reported to the police. S.D. Codified Laws § 54-15-3 (2016), added by 2006 S.D. Laws 246, § 3.
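For orientation, the notification trigger and the substitute notice threshold of Section 1798.82 can be restated as decision logic. The Python sketch below is a hypothetical paraphrase of subdivisions (a) and (j)(3), with parameter names chosen for illustration; the statute itself governs.

    # Hypothetical restatement of Cal. Civ. Code § 1798.82 decision points.

    def notification_required(encrypted: bool,
                              key_or_credential_acquired: bool,
                              key_could_render_readable: bool) -> bool:
        """Subdivision (a): trigger for notifying affected California residents."""
        if not encrypted:
            return True  # (a)(1): unencrypted personal information acquired
        # (a)(2): encrypted data, but the key or security credential was also
        # acquired and is reasonably believed able to render the data readable.
        return key_or_credential_acquired and key_could_render_readable

    def substitute_notice_allowed(notice_cost_usd: int,
                                  affected_class_size: int,
                                  sufficient_contact_info: bool) -> bool:
        """Subdivision (j)(3): when email/web/media substitute notice may be used."""
        return (notice_cost_usd > 250_000
                or affected_class_size > 500_000
                or not sufficient_contact_info)

The narrowed safe harbor in the first function reflects the 2016 amendment discussed above: encryption alone no longer avoids notice where the key or credential was also compromised.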

Alaska enacted in 2008 an extensive Personal Information Protection Act that included breach notification and credit freeze. Alaska Stat. §§ 45.48.010–45.48.995 (2017), added by 2008 Alaska Sess. Laws 92, § 4. Article 1 provides for notice of breach of security involving personal information. Alaska Stat. §§ 45.48.90 (covers businesses and governmental agencies and encryption where the key is compromised). The Act also includes credit report freezes (Alaska Stat. §§ 45.48.100– 45.48.290), protection of Social Security numbers (Alaska Stat. §§ 45.48.400– 45.48.480), prescriptions on disposal of records (Alaska Stat. §§ 45.48.500– 45.48.590) remediation of identity theft (Alaska Stat. §§ 45.48.600–45.45.690), and truncation of card information (Alaska Stat. § 45.48.750). Arizona enacted in 2006 a statute addressing notification for compromised personal information. Ariz. Rev. Stat. § 18-545 (2017), renumbered from Ariz. Rev. Stat. § 447501(2015), added by 2006 Ariz. Sess. Laws 232, § 1, amended by 2007 Ariz. Sess. MCLE, Inc. |2nd Edition 2018


The original enactment included a conditional repeal “one year after the effective date of the federal personal data privacy and security act,” 2006 Ariz. Sess. Laws 232, § 3, which Congress has not enacted.

Arkansas enacted in 2005 its Personal Information Protection Act. Ark. Code Ann. §§ 4-110-101–4-110-108 (2017), added by 2005 Ark. Acts 1526, § 1.

Colorado enacted its security breach law in 2006. Colo. Rev. Stat. § 6-1-716 (2017), added by 2006 Colo. Sess. Laws 536, § 1, amended by 2010 Colo. Sess. Laws 2064, § 9.

Connecticut enacted a breach of security notification statute in 2005. Conn. Gen. Stat. § 36a-701b (2017), added by 2005 Conn. Acts 148, § 3, amended by 2012 Conn. Acts 1, § 130 (clarifying application and notice requirements), 2015 Conn. Acts 142, § 6 (new Section 36a-701b(b)(2)(B) requiring an offer of “identity theft mitigation services”).

Delaware enacted a breach of security notification statute in 2005. Del. Code Ann. tit. 6, § 12B-102 (2017), added by 75 Del. Laws 61, § 1 (2005). (Delaware also has an identity theft statute. Del. Code Ann. tit. 11, § 854 (2017), added by 72 Del. Laws 297, § 1 (2000), amended by 74 Del. Laws 425, § 1 (2004), 79 Del. Laws 260, § 7 (2014) (changing “credit” and “debit” to “payment” cards). In 2006, Delaware added security freezes and other remediation in the Clean Credit and Identity Theft Prevention Act, Del. Code Ann. tit. 6, §§ 2201–2204 (2017), 75 Del. Laws 328, § 1 (2006), amended by 79 Del. Laws 109, § 1 (2013) (lowering fees).)

The District of Columbia added in 2007 the Consumer Personal Information Security Breach Notification Act. D.C. Code Ann. § 28-3852 (2017), added by D.C. Law 16-237, § 2(c) (2007).

Florida amended in 2005 its statute on criminal use of personal identification information, including medical information, Fla. Stat. § 817.568 (2017), added by 1999 Fla. Laws ch. 335, § 1, amended by 2001 Fla. Laws ch. 233, § 1, 2003 Fla. Laws ch. 71, § 1, 2005 Fla. Laws ch. 229, § 1 (adding a list of personal information), 2010 Fla. Laws ch. 209, § 41, 2014 Fla. Laws ch. 200, § 2, 2015 Fla. Laws ch. 166, § 17, 2016 Fla. Laws ch. 151, § 8, and provided for notification of security breaches, Fla. Stat. § 817.5681 (2015), added by 2005 Fla. Laws ch. 229, § 2. In 2014, the Florida Information Protection Act of 2014 replaced the latter provision. Fla. Stat. § 501.171 (2017), added by 2014 Fla. Laws ch. 189, § 3, ch. 190, § 1. As part of the act, protections were also added for data breaches in the public agency setting. Fla. Stat. §§ 282.0041, 282.381 (2017), added by 2014 Fla. Laws ch. 189, §§ 4, 5, amended by 2014 Fla. Laws ch. 221, §§ 9, 16.

Georgia enacted in 2005 its security breach notification statute. Ga. Code Ann. §§ 10-1-910–10-1-915 (2017), added by 2005 Ga. Laws 851, § 1, amended by 2007 Ga. Laws 450, §§ 2, 3 (which also added, at Section 6, Ga. Code Ann. § 16-9-124.1 for remediation of identity fraud by contacting law enforcement), 2008 Ga. Laws 568, § 1 (adding § 10-1-914), 2014 Ga. Laws 611, §§ 1, 2 (adding § 10-1-914.1, security freezes).


(Georgia has a statute criminalizing “identity fraud,” Ga. Code Ann. §§ 16-9-120–16-9-132 (2017), added by 2002 Ga. Laws 551, § 2, amended by 2007 Ga. Laws 241, §§ 4–6, 2008 Ga. Laws 570, § 1, 2010 Ga. Laws 501, § 1, 2011 Ga. Laws 252, § 1, 2013 Ga. Laws 334, § 1, with §§ 16-9-129 and 16-9-130 respectively allowing damages to businesses and consumers. Breach notification is also required for telephone records security breaches. Ga. Code Ann. § 46-5-214 (2017), added by 2006 Ga. Laws 562, § 5.)

Hawaii enacted in 2006 a law to require notice of security breaches. Haw. Rev. Stat. §§ 487N-1–487N-4 (2017), added by 2006 Haw. Sess. Laws 135, § 2, amended by 2008 Haw. Sess. Laws 19, §§ 69, 70, 2008 1st Sp. Sess. 10, § 5. (Hawaii also has Social Security number protection. Haw. Rev. Stat. §§ 487J-1–487J-3 (2017), added by 2006 Haw. Sess. Laws 137, § 2, amended by 2008 Haw. Sess. Laws 10, § 683, 2012 Haw. Sess. Laws 191, § 2, 2013 Haw. Sess. Laws 225, § 3.)

Idaho adopted its statute dealing with disclosure of breaches of the security of computerized personal information in 2006. Idaho Code §§ 28-51-104–28-51-107 (2017), added by 2006 Idaho Sess. Laws 258, § 1, amended by 2010 Idaho Sess. Laws 170, § 1 (extending to city, county, and state agencies), 2014 Idaho Sess. Laws 97, § 13, 2015 Idaho Sess. Laws 141, § 151. (Provisions for protection of personal information and credit report security freezes were added in 2008 to replace former Sections 28-51-101 and 28-51-102. Idaho Code §§ 28-52-101–28-52-109 (2017), added by 2008 Idaho Sess. Laws 177, §§ 1, 2.)

Illinois enacted its Personal Information Protection Act in 2005. 815 Ill. Comp. Stat. 530/1–530/40 (2017), added by 2005 Ill. Laws 36, §§ 1–20, amended by 2005 Ill. Laws 947, § 5 (amending 530/10 and adding 530/12, 530/25, and 530/30 for state agencies), 2011 Ill. Laws 483 (adding 530/40 on disposal of materials containing personal information), 2016 Ill. Laws 503 (adding health and biometric information, fixing the encryption safe harbor, and adding 530/45 on data security and 530/50 providing a HIPAA exemption).

Indiana enacted its notice of security breach statute in 2006. Ind. Code §§ 24-4.9-1-1–24-4.9-5-1 (2017), added by 2006 Ind. Acts 125, § 6, amended by 2008 Ind. Acts 136, § 2. (Under Ind. Code § 24-4.9-4-1, enforcement is by the attorney general only. Indiana criminalizes “identity deception” at Ind. Code § 35-43-5-3.5 (2017), added by 2001 Ind. Acts 180, § 1, amended by 2003 Ind. Acts 22, § 2, 2006 Ind. Acts 125, § 9, and 2009 Ind. Acts 137, § 14 (Section 15 of the 2009 act added Ind. Code § 35-43-5-3.8 for fraud by “synthetic identifying information”).)

Iowa enacted its security breach statute in 2008. Iowa Code §§ 715C.1–715C.2 (2017), added by 2008 Iowa Acts 1154, §§ 1, 2, amended by 2014 Iowa Acts 1062, §§ 1–4 (no encryption or redaction safe harbor where the key is compromised). (Iowa recognizes a civil cause of action for identity theft under Iowa Code § 714.16B (2017), added by 1999 Iowa Acts 47, § 1, amended by 2005 Iowa Acts 18, § 2.)

Kansas enacted its security breach statute in 2006. Kan. Stat. Ann. §§ 50-7a01–50-7a04 (2017), added by 2006 Kan. Sess. Laws 149, §§ 3, 4, 14, 15 (Section 1 created a new crime of fraudulently scanning cards; Section 2 limited personal information in publicly available documents).


Kentucky added an act relating to the security of personal information in 2014. Ky. Rev. Stat. Ann. § 365.732 (2017), added by 2014 Ky. Acts 84, § 1. (Kentucky also added protections for data breaches in the public agency setting. Ky. Rev. Stat. Ann. §§ 61.931–61.934 (2017), added by 2014 Ky. Acts 74, §§ 1–4.)

Louisiana enacted its Database Security Breach Notification Law in 2005. La. Rev. Stat. Ann. §§ 51:3071–51:3077 (2015), added by 2005 La. Acts 499, § 1. Section 51:3075 allows recovery of damages.

Maine enacted its Notice of Risk to Personal Data Act in 2005. Me. Rev. Stat. Ann. tit. 10, §§ 1346–1350-A (2017), added by 2005 Me. Laws 379, § 1, amended by 2005 Me. Laws 384, §§ 1–5, 2009 Me. Laws 161, § 1. (In 2008, a provision for a police report was enacted. Me. Rev. Stat. Ann. tit. 10, § 1350-B (2017), added by 2007 Me. Laws 634, § 1.)

Maryland enacted its Personal Information Protection Act in 2007. Md. Code Ann., [Com. Law] §§ 14-3501–14-3508 (2017), added by 2007 Md. Laws 531, 532, amended by 2013 Md. Laws 43, § 5. (Violations are deemed unfair or deceptive trade practices. Md. Code Ann., [Com. Law] § 14-3508 (2017).)

Massachusetts in 2007 enacted a data breach notification law. G.L. c. 93H, §§ 1–6 (2017), added by 2007 Mass. Acts c. 82, § 16. (The enactment also modified existing law on security freezes. G.L. c. 93, §§ 50, 56, 58, 62A, 63, added and amended by 2007 Mass. Acts c. 82, §§ 3–15. Enforcement is by the attorney general. G.L. c. 93H, § 6.)

Practice Note
Notably, the Massachusetts notification law provides:

The notice to be provided to the resident shall include, but not be limited to, the consumer's right to obtain a police report, how a consumer requests a security freeze and the necessary information to be provided when requesting the security freeze, and any fees required to be paid to any of the consumer reporting agencies, provided however, that said notification shall not include the nature of the breach or unauthorized acquisition or use or the number of residents of the commonwealth affected by said breach or unauthorized access or use.

G.L. c. 93H, § 3(b) (emphasis added). Compare Haw. Rev. Stat. § 487N-2(d)(1) (conspicuous description, among other things, of “[t]he incident in general terms”); Iowa Code § 715C.2(5)(a) (“[a] description of the breach of security”); Md. Code Ann., [Com. Law] § 14-3504(g) (description of information); Mich. Comp. Laws § 445.72(6)(c) (“[d]escribe the security breach in general terms”); Mo. Rev. Stat. § 407.1500(4)(a) (“the incident in general terms”); N.H. Rev. Stat. Ann. § 359-C:20(IV)(a) (“the incident in general terms”);


N.Y. Gen. Bus. Law § 899-aa(7) (“description of the categories of information that were, or are reasonably believed to have been, acquired by a person without valid authorization, including specification of which of the elements of personal information and private information were, or are reasonably believed to have been, so acquired”); N.C. Gen. Stat. § 75-65(d)(1) (“the incident in general terms”); Or. Rev. Stat. § 646A.604(5)(a) (“the incident in general terms”); Vt. Stat. Ann. tit. 9, § 2435(4)(A) (“[t]he incident in general terms”); Va. Code Ann. § 18.2-186.6(A) (“[t]he incident in general terms”); W. Va. Code § 46A-2A-102(d) (description of information).

Michigan enacted its Identity Theft Protection Act in 2004 (including antispam Section 445.67), to which breach notification (Section 445.72) and data protection were added in 2006. Mich. Comp. Laws §§ 445.61–445.77 (2015), added by 2004 Mich. Pub. Acts 452, §§ 1–17, amended by 2006 Mich. Pub. Acts 246, 566; 2010 Mich. Pub. Acts 315, 318.

Minnesota enacted in 2005 its law for notification of data breaches. Minn. Stat. § 325E.61 (2017), added by 2005 Minn. Laws 167, § 1, amended by 2006 Minn. Laws 212, art. 1, §§ 17, 23, 2006 Minn. Laws 233, §§ 7, 8. (Minnesota also has a law criminalizing identity theft and providing for restitution. Minn. Stat. § 609.527 (2017).)

Mississippi enacted in 2010 a data breach notification law. Miss. Code Ann. § 75-24-29 (2017), added by 2010 Miss. Laws ch. 489, § 1.

Missouri enacted in 2009 a data breach notification law. Mo. Rev. Stat. § 407.1500 (2017), added by 2009 Mo. Laws HB 62.

Montana enacted a data breach notification provision in 2005. Mont. Code Ann. §§ 30-14-1702, 30-14-1704 (2017), added by 2005 Mont. Laws 518, §§ 5, 7, amended by 2007 Mont. Laws 180, § 3, 276, § 4, 2015 Mont. Laws 74, § 3 (extending protection to medical information). (The act included data security through record destruction, Mont. Code Ann. § 30-14-1703 (2017), added by 2005 Mont. Laws 518, § 6, and identity theft impediments for credit cards, Mont. Code Ann. §§ 30-14-1721, 30-14-1722 (2017), added by 2005 Mont. Laws 518, §§ 2, 3. The security freeze provisions are at Mont. Code Ann. §§ 30-14-1726–30-14-1736 (2017), added by 2007 Mont. Laws 138, §§ 1–11, amended by 2011 Mont. Laws 42, §§ 1, 2. The attorney general may issue an identity theft “passport” for identity theft victims. Mont. Code Ann. § 46-24-220 (2017), added by 2005 Mont. Laws 55, § 1, amended by 2007 Mont. Laws 195, § 2. Protection is also extended to public agency breaches. Mont. Code Ann. § 2-6-504 (2017), added by 2009 Mont. Laws 163, § 4.)

Nebraska enacted its Financial Data Protection and Consumer Notification of Data Security Breach Act of 2006. Neb. Rev. Stat. §§ 87-801–87-807 (2017), added by 2006 Neb. Laws 876, §§ 1–7, amended by 2016 Neb. Laws 385, §§ 27–29 (no encryption safe harbor when the key is compromised).


Nevada enacted in 2005 a number of computer and personal information protection statutes, including notification of breaches of security. Nev. Rev. Stat. § 603A.220 (2017), added by 2005 Nev. Stat. 485, § 24. (Provision is also made for notification of breaches of agency information systems. Nev. Rev. Stat. Ann. § 242.183 (2017), added by 2011 Nev. Stat. 331, § 4.)

New Hampshire enacted a data security breach notification law in 2006. N.H. Rev. Stat. Ann. §§ 359-C:19–359-C:21 (2017), added by 2006 N.H. Laws 242. (“Any person injured by any violation under this subdivision may bring an action for damages.” N.H. Rev. Stat. Ann. § 359-C:21(I).)

New Jersey enacted in 2005 an Identity Theft Prevention Act. N.J. Stat. Ann. §§ 56:8-161–56:8-166 (2017), added by 2005 N.J. Laws 226, §§ 10–15, amended by 2015 A. 3146 (notification information). (Section 56:8-162 provides for destruction of customer records, Section 56:8-164 prohibits certain displays of Social Security numbers, and Section 56:8-165 called for promulgation of banking regulations.)

New Mexico enacted in 2017 a Data Breach Notification Act. N.M. Stat. Ann. §§ 57-12C-1–57-12C-12 (2017), added by 2017 N.M. Laws 36. Section 57-12C-2 defines “personal identifying information” as names plus unencrypted Social Security numbers, driver's license numbers, government-issued identification numbers, account numbers in combination with security codes, and biometric information; it also defines a “security breach.” Sections 57-12C-4 and 57-12C-5 require reasonable security procedures for the storage of personal information of New Mexico residents and contracts requiring third-party providers to use reasonable security procedures. Sections 57-12C-6–57-12C-12 require notification to New Mexico residents of compromise of their personal information.

New York enacted its Information Security Breach and Notification Act in 2005. N.Y. Gen. Bus. L. § 899-aa (2017), added by 2005 N.Y. Laws 442, § 4, amended by 2005 N.Y. Laws 491, §§ 5–7, 2011 N.Y. Laws 62, § 43 (Part A), 2013 N.Y. Laws 55, § 6 (Part N). (There is a general provision against theft of identity. N.Y. Gen. Bus. L. § 380-s (2017), added by 2002 N.Y. Laws 619, § 8. Security freezes are provided under N.Y. Gen. Bus. L. § 380-t (2017), added by 2006 N.Y. Laws 63, § 2, amended by 2008 N.Y. Laws 279, § 2, 2008 N.Y. Laws 406, § 1, 2011 N.Y. Laws 62, § 36 (Part A). Data security is provided through a disposal of records law. N.Y. Gen. Bus. L. § 399-h (2017), added by 2006 N.Y. Laws 65, § 1, amended by 2008 N.Y. Laws 516, §§ 1, 2. Protection is also provided for breaches of public agency computers. N.Y. State Tech. Law § 208 (2017), added by 2005 N.Y. Laws 442, § 3, amended by 2005 N.Y. Laws 491, §§ 2–5, 2011 N.Y. Laws 27 (Part A), 2013 N.Y. Laws 44, § 5 (Part N).)

North Carolina enacted the Identity Theft Protection Act of 2005, including breach notification. N.C. Gen. Stat. §§ 75-60–75-65 (2017), added by 2005 N.C. Sess. Laws 414, § 1, amended by 2006 N.C. Sess. Laws 158, § 1, 2009 N.C. Sess. Laws 355, §§ 1, 2, 550, § 5, 573, § 10. (The act provides for protection of Social Security numbers, § 75-62, and security freezes, § 75-63, as well as destruction of personal information records, § 75-64, and notification of breaches, § 75-65. In 2007, N.C. Gen. Stat. § 75-66 (2017) was added for protection against publication of personal information more broadly defined. 2007 N.C. Sess. Laws 534, § 2, amended by 2012 N.C. Sess. Laws 142, § 6A.7A(h).)


North Dakota enacted a breach notification statute in 2005. N.D. Cent. Code §§ 51-30-01–51-30-07 (2017), added by 2005 N.D. Laws 447, § 3, amended by 2013 N.D. Laws 106, §§ 2, 3 (adding medical information), 2015 N.D. Laws 352, §§ 1–2 (adding an employee identification number in combination with a security code to protected personal information). (Identity fraud is addressed at N.D. Cent. Code §§ 51-31-01–51-31-05 (2014), added by 2005 N.D. Laws 448, § 1. Security freezes are provided at N.D. Cent. Code §§ 51-33-01–51-33-14 (2017), added by 2007 N.D. Laws 438, § 1. Telephone records are protected at N.D. Cent. Code §§ 51-34-01–51-34-07 (2017), added by 2007 N.D. Laws 439, § 1.)

Ohio enacted a data security breach notification statute in 2005. Ohio Rev. Code Ann. §§ 1349.19–1349.21 (2017), added by 2005 Ohio Laws HB 104, § 1, amended by 2005 Ohio Laws SB 126, § 1. (There had been restrictions on recording credit card, telephone, and Social Security numbers. Ohio Rev. Code Ann. §§ 1349.17, 1349.18 (2017), added by 1991 Ohio Laws HB 20, § 1, amended by 1993 Ohio Laws HB 266, § 1. Security freezes are provided at Ohio Rev. Code Ann. § 1349.52 (2017), added by 2007 Ohio Laws HB 46, § 1. The attorney general may issue an “identity fraud passport” to victims. Ohio Rev. Code § 109.94 (2017), added by 2005 Ohio Laws HB 48, § 1.)

Oklahoma enacted the Security Breach Notification Act in 2008. Okla. Stat. tit. 24, §§ 161–166 (2017), added by 2008 Okla. Sess. Laws 86, §§ 1–6. (Security freezes are provided at Okla. Stat. tit. 24, §§ 151–159 (2017), added by 2006 Okla. Sess. Laws 183, §§ 1–11. Identity theft passports are provided at Okla. Stat. tit. 22, § 19b (2017). The state government is prohibited from using Social Security numbers in unsecure settings and from disclosing them, Okla. Stat. tit. 74, §§ 3111, 3113, and is required to disclose breaches, Okla. Stat. tit. 74, § 3113.1.)

Oregon enacted in 2007 the Oregon Consumer Identity Theft Protection Act, with notification of breaches. Or. Rev. Stat. §§ 646A.600–646A.628 (2017), added by 2007 Or. Laws 759, §§ 1–14, amended by 2013 Or. Laws 415, §§ 1–6, 2015 Or. Laws 357, §§ 1–4 (adding biometric and health information and fixing the encryption safe harbor). (Sections 646A.606–646A.619 provide for security freezes; Section 646A.620 prohibits display of Social Security numbers; and Section 646A.622 provides for reasonable safeguards.)

Pennsylvania enacted in 2005 the Breach of Personal Information Notification Act. 73 Pa. Stat. §§ 2301–2329 (2017), added by 2005 Pa. Laws 94, §§ 1–29. (Security freezes are provided at 73 Pa. Stat. §§ 2501–2510 (2017), added by 2006 Pa. Laws 163, §§ 1–10.)

Rhode Island enacted the Rhode Island Identity Theft Protection Act of 2005, with breach notification. R.I.G.L. §§ 11-49.2-3–11-49.2-7 (2014), added by 2005 R.I. Pub. Laws 225, § 1, repealed by 2015 R.I. Pub. Laws 138, § 1. It was replaced with the Identity Theft Protection Act of 2015. R.I.G.L. §§ 11-49.3-1–11-49.3-6 (2017), added by 2015 R.I. Pub. Laws 138, § 2, 148, § 2 (extending coverage from a “state agency or person” to include municipalities, fixing the encryption safe harbor, and extending protections and personal information to health data). (Security freezes are provided under the Consumer Empowerment and Identity Theft Prevention Act of 2006, R.I.G.L. §§ 6-48-1–6-48-9 (2017) (Section 6-48-8 protects Social Security numbers), added by 2006 R.I. Pub. Laws 226, § 1, amended by 2006 R.I. Pub. Laws 270, § 1, 2011 R.I. Pub. Laws 57, § 2, 2011 R.I. Pub. Laws 69, § 2, 2014 R.I. Pub. Laws 528, § 35.)


South Carolina enacted in 2008 the Financial Identity Fraud and Identity Theft Protection Act, with breach notification. S.C. Code Ann. §§ 37-20-110–37-20-200, 39-1-90 (2017), added by 2008 S.C. Acts 190, §§ 2, 7, amended by 2013 S.C. Acts 15, §§ 1–3. (Section 37-20-160 provides for security freezes; Section 37-20-180 imposes restrictions on the use of Social Security numbers.)

Tennessee enacted in 2005 a data security breach provision. Tenn. Code Ann. § 47-18-2107 (2017), added by 2005 Tenn. Pub. Acts 473, § 2, amended by 2016 Tenn. Pub. Acts 692, §§ 1–4 (removing “unencrypted” as a qualification of protection), 2017 Tenn. Pub. Acts 91, § 1 (fixing the encryption safe harbor by protecting unencrypted data and encrypted data if the encryption key is compromised). (Preexisting was the Tennessee Identity Theft Deterrence Act of 1999, Tenn. Code Ann. §§ 47-18-2101–47-18-2106, added by 1999 Tenn. Pub. Acts 201, §§ 2–7, providing a private right of action under Section 47-18-2104. In 2007, Tennessee enacted the Credit Security Act of 2007, 2007 Tenn. Pub. Acts 170, providing for security freezes at Sections 4 and 5 (Tenn. Code Ann. §§ 47-18-2108, 47-18-2109 (2017)) and protection of Social Security numbers at Section 6 (Tenn. Code Ann. § 47-18-2110 (2017), amended by Tenn. Pub. Acts 633, §§ 1–2, 1120, § 8).)

Texas enacted in 2005 its Identity Theft Enforcement and Protection Act, including notification of breaches. Tex. Bus. & Com. Code § 521.053 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01, amended by 2009 Tex. Gen. Laws 419, § 3, 2011 Tex. Gen. Laws 1126, § 14, 2013 Tex. Gen. Laws 1368, § 1.

Utah enacted in 2006 its Consumer Credit Protection Act, with breach notification. Utah Code Ann. §§ 13-44-101–13-44-301 (2017), added by 2006 Utah Laws 343, §§ 1–5, amended by 2008 Utah Laws 29, §§ 1–5, 2009 Utah Laws 61, § 1 (changing the name to “Protection of Personal Information Act”). (Another chapter, entitled “Consumer Credit Protection Act,” provides for security freezes. Utah Code Ann. §§ 13-45-101–13-45-401 (2017), added by 2006 Utah Laws 344, §§ 1–9.)

Vermont enacted in 2005 a chapter on Protection of Personal Information, with breach notification. Vt. Stat. Ann. tit. 9, §§ 2430–2445 (2017), added by 2005 Vt. Acts & Resolves 162, § 1, amended by 2011 Vt. Acts & Resolves 109, § 4. (Section 2440 provides for Social Security number protection.)

Virginia enacted a breach of personal information notification law in 2008. Va. Code Ann. § 18.2-186.6 (2017), added by 2008 Va. Acts ch. 566, § 1, amended by 2017 Va. Acts ch. 419, § 1 (special procedures for compromise of employee identification information).


(There is a separate provision for breach of medical information notification. Va. Code Ann. § 32.1-127.1:05 (2017), added by 2010 Va. Acts ch. 852, § 1.)

Washington in 2005 enacted a data security breach notification law. Wash. Rev. Code § 19.255.010 (2017), added by 2005 Wash. Laws 368, § 2, amended by 2015 Wash. Laws 64, § 2 (fixing the encryption safe harbor). (Liability of processors was added in 2010. Wash. Rev. Code § 19.255.020 (2017), added by 2010 Wash. Laws 151, § 2. Security freezes are provided at Wash. Rev. Code §§ 19.182.170–19.182.210, added by 2005 Wash. Laws 342, §§ 1–5, amended by 2007 Wash. Laws 499, § 1. Provision is made for public agency breaches. Wash. Rev. Code § 42.56.590 (2017), added by 2005 Wash. Laws 368, § 1, amended by 2007 Wash. Laws 197, § 9, 2015 Wash. Laws 64, § 3 (requiring notification if compromised information is decipherable).)

Wisconsin in 2005 provided for notification of breaches. Wis. Stat. § 134.98 (2017), added by 2005 Wis. Laws 138 (as § 895.507), amended by 2007 Wis. Laws 20, § 3778m, renumbered by 2007 Wis. Laws 97, § 238. (Protections for disposal of records including personal information are provided at Wis. Stat. § 134.97 (2017), added by 1999 Wis. Laws 9, amended by 2005 Wis. Laws 155, § 52.)

Wyoming enacted in 2007 a breach notification law. Wyo. Stat. Ann. §§ 40-12-501–40-12-509 (2017), added by 2007 Wyo. Sess. Laws 162, § 1, amended by 2015 Wyo. Sess. Laws 63, § 1 (specifying the information to be included in the notice).

Most of these statutes follow the California template. There are, however, differences in the definition of personal information. Some states include passwords (and in some cases hints, such as mother's maiden name): Alaska Stat. § 45.48.090; Ga. Code Ann. § 10-1-911(6); Me. Rev. Stat. Ann. tit. 10, § 1347(6); N.C. Gen. Stat. § 75-61(10), referring to N.C. Gen. Stat. § 14-113.20(b) (list); N.D. Cent. Code § 51-30-01 (list); Vt. Stat. Ann. tit. 9, § 2430. Some include biometric information: Iowa Code § 715C.1(11); Neb. Rev. Stat. § 87-802(5); N.C. Gen. Stat. § 75-61(10), referring to N.C. Gen. Stat. § 14-113.20(b) (list); N.D. Cent. Code § 51-30-01 (list, including photographs); Wis. Stat. § 134.98(1)(b) (including DNA). And some include health-care information: Ark. Code Ann. § 4-110-103(7); Mo. Rev. Stat. § 407.1500.1(9); Tex. Bus. & Com. Code § 521.002(a)(2). The encryption safe harbor was fixed in recent enactments in California, Illinois, Nebraska, New Mexico, Oregon, Rhode Island, and Tennessee.

While the timing and method of notification are generally specified, the content is specified by only a minority of the states. Massachusetts requires that

[t]he notice to be provided to the resident shall include, but not be limited to, the consumer's right to obtain a police report, how a consumer requests a security freeze and the necessary information to be provided when requesting the security freeze, and any fees required to be paid to any of the consumer reporting agencies, provided however, that said notification shall not include the nature of the breach or unauthorized acquisition or use or the number of residents of the commonwealth affected by said breach or unauthorized access or use.


G.L. c. 93H, § 3(b) (emphasis added). At its July 13–20, 2017, annual meeting, the executive committee of the National Conference of Commissioners on Uniform State Laws authorized the formation of a study committee on a possible uniform data breach notification statute.
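Because G.L. c. 93H, § 3(b) both mandates and forbids particular content, a draft notice can be screened mechanically. The following Python fragment is a hypothetical, simplified sketch; the topic labels paraphrase, and do not reproduce, the statutory language:

REQUIRED = {
    "right to obtain a police report",
    "how to request a security freeze and information needed to request it",
    "fees payable to consumer reporting agencies",
}
FORBIDDEN = {
    "nature of the breach or unauthorized acquisition or use",
    "number of Massachusetts residents affected",
}

def screen_notice(topics):
    # Return (missing required topics, prohibited topics present),
    # given the set of topics a draft notice covers.
    return REQUIRED - topics, FORBIDDEN & topics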

§ 9.2.2 State Personal Data Security Statutes

There are several categories of state statutes providing for the security of personal data. Among them are restrictions on disclosure of driver's license information and Social Security numbers and requirements for the secure disposal of records that include personal information.

(a) Driver's License Information

In 1994, as part of legislation that also included amendments to the Computer Fraud and Abuse Act, Congress enacted the Driver's Privacy Protection Act of 1994 (DPPA), 18 U.S.C. §§ 2721–2725 (2017), added by Act of Sept. 13, 1994, Pub. L. No. 103-322, Title XXX, §§ 300001–300005, 108 Stat. 2099–2102, which, subject to exceptions, made it unlawful for

[a] State department of motor vehicles, and any officer, employee, or contractor, thereof, [to] knowingly disclose or otherwise make available to any person or entity . . . personal information . . . about any individual obtained by the department in connection with a motor vehicle record [except for certain uses].

18 U.S.C. § 2721(a). In 2000, Congress added a requirement of consent for certain of those uses of “highly restricted personal information.” 18 U.S.C. § 2721(a)(2), amended by Act of Oct. 23, 2000, Pub. L. No. 106-346, § 101(a), 114 Stat. 1356. (“‘[H]ighly restricted personal information’ means an individual’s photograph or image, social security number, medical or disability information. . . .” 18 U.S.C. § 2725(4).)

The 1994 legislation also applied more generally to make it unlawful for any person

knowingly to obtain or disclose personal information, from a motor vehicle record, for any use not permitted under section 2721(b) of this title [including under § 2721(b)(3), “use in the normal course of business by a legitimate business”].


18 U.S.C. § 2722(a). A private right of action was also provided. 18 U.S.C. § 2724. The DPPA provided penalties for a state department of motor vehicles that has “a policy or practice of substantial noncompliance.” 18 U.S.C. § 2723(b). Several states enacted complying provisions with automatic repeal upon an expected ruling that the DPPA unconstitutionally interfered with state power, a ruling that has not occurred.

Indiana provided in 1996 in its Code a new chapter on Disclosure of Personal Information Contained in Motor Vehicle Records. Ind. Code §§ 9-14-3.5-1–9-14-3.5-15 (2015), added by 1996 Ind. Acts 89. This was repealed in 2016 by 2016 Ind. Acts 183. (Indiana also protects Social Security numbers and other personal information collected in applications. Ind. Code § 4-33-5-1.5 (2017), added by 2006 Ind. Acts 125, amended by 2008 Ind. Acts 104, § 2.)

Iowa also responded in 1996 with provisions for public access to motor vehicle records. Iowa Code § 321.11 (2017), added by 1996 Iowa Acts 1102, § 1, amended by 1996 Iowa Acts 108, § 3, 1998 Iowa Acts 1035, § 1, 1999 Iowa Acts 198, § 4, 2000 Iowa Acts 1133, §§ 2, 18, 2001 Iowa Acts 90, § 1.

Louisiana promulgated regulations to comply, including permitted disclosures and sale of personal information. La. Admin. Code §§ 55:III.551–55:III.565 (2017), promulgated by the Department of Public Safety and Corrections, Office of Motor Vehicles, LR 23:990.

Maryland in 2010 prohibited the public posting of driver's license information by courts and other public agencies. 2010 Md. Laws 452, § 1; Md. Code Ann., [Cts. & Jud. Proc.] § 1-205 (2017); Md. Code Ann., [Real Prop.] § 3-111 (2017); Md. Code Ann., [State Gov't] § 2-1804 (2017); Md. Code Ann., [State Gov't] § 8-504 (2017) (websites).

Michigan enacted legislation in 1997 to protect personal information in motor vehicle registration, 1997 Mich. Pub. Acts 100, 101, limiting the availability of records to the public, Mich. Comp. Laws §§ 257.208a–257.208d (2017), and government provision of the information, including sale of lists, Mich. Comp. Laws § 257.232 (2017).

Minnesota has a provision to address the DPPA. Minn. Stat. § 168.346 (2017).

Missouri prohibits its director of revenue from selling motor vehicle registration lists or personal information held by the Department of Revenue. Mo. Rev. Stat. § 32.055 (2017).

Montana enacted in 2001 the Montana Driver Privacy Protection Act. Mont. Code Ann. §§ 61-11-501–61-11-516 (2017), added by 2001 Mont. Laws 363, §§ 1–10.

Nebraska in 1997 implemented the DPPA by enacting the Uniform Motor Vehicle Records Disclosure Act. Neb. Rev. Stat. §§ 60-2901–60-2913 (2017), added by 1997 Neb. Laws 635, §§ 1–13. (A model act, the Uniform Motor Vehicle Records Disclosure Act, for implementation of the DPPA, was promulgated by the American Association of Motor Vehicle Administrators and followed in part by some states, including Nebraska, Tennessee, and West Virginia by statute, and by others administratively in best practices by reference.


See the model act at http://www.aamva.org/uploadedFiles/MainSite/Content/SolutionsBestPractices/BestPracticesModelLegislation(1)/ModelLaw_DisclosurePersnlInfoInMVRecords.pdf (last visited July 30, 2017).)

Nevada has promulgated regulations to address the DPPA. Nev. Admin. Code §§ 481.550–481.600 (2016).

New Hampshire amended its motor vehicle records law to address the DPPA. N.H. Rev. Stat. Ann. § 260:14 (2017), added by 1996 N.H. Laws 295; see also N.H. Admin. Code R. [Dept. Safety] 5601.01–5608.02 (2015).

New Jersey also amended its motor vehicle personal information law to address the DPPA. N.J. Stat. § 39:2-3.4 (2017), amended by 1997 N.J. Laws 166, § 2.

North Carolina provides for protections consistent with the DPPA. N.C. Gen. Stat. § 20-43.1 (2014), added by 1997 N.C. Sess. Laws 443, § 32.25(a), amended by 2016 N.C. Sess. Laws 90, § 10(b) (adding subsection 20-43.1(e), prohibiting publication of e-mail and other electronic addresses).

North Dakota has a simple compliance provision addressing both the 1994 DPPA and the 2000 amendment. N.D. Cent. Code § 39-33-02 (2017), added by 1997 N.D. Laws 349, § 2, amended by 2001 N.D. Laws 355, § 2.

Ohio enacted detailed compliance with the DPPA. Ohio Rev. Code Ann. § 4501.27 (2017), added by 1995 Ohio Laws HB 353; see also Ohio Admin. Code § 4501:1-12-02 (2017).

Oregon enacted detailed compliance with the DPPA. Or. Rev. Stat. §§ 802.175–802.191 (2017), added by 1997 Or. Laws 678, §§ 2–10.

Rhode Island has a detailed implementation of the DPPA. R.I.G.L. § 27-49-3.1 (2017), added by 1997 R.I. Pub. Laws 26, § 1, amended by 2002 R.I. Pub. Laws 292, § 89, 2008 R.I. Pub. Laws 475, § 95, 2009 R.I. Pub. Laws 310, § 8.

South Carolina prohibits sale or disclosure by its Department of Public Safety of “a person’s height, weight, race, social security number, photograph, or signature in any form that has been compiled for the purpose of issuing the person a driver’s license or special identification card.” S.C. Code Ann. § 30-4-165(A) (2017), added by 1999 S.C. Acts 33, § 1. Nor may the Department of Motor Vehicles sell Social Security numbers, photographs, signatures, or digitized images taken for the same purposes. S.C. Code Ann. § 30-4-160 (2017), added by 1999 S.C. Acts 100, Part II, § 53.

South Dakota provides protection for personal information “including a social security number, driver identification number, name, address (but not the five-digit zip code), telephone number, and medical or disability information” collected for motor vehicle records. S.D. Codified Laws §§ 32-5-143–32-5-150 (2017), added by 2001 S.D. Laws 165, §§ 1–8.


Tennessee enacted a Uniform Motor Vehicle Records Disclosure Act. Tenn. Code Ann. §§ 55-25-101–55-25-112 (2017), added by 2001 Tenn. Pub. Acts 745, §§ 2–16.

West Virginia also enacted a Uniform Motor Vehicle Records Disclosure Act. W. Va. Code §§ 17A-2A-1–17A-2A-14 (2017), added by 1996 W. Va. Acts 182, amended by 2001 W. Va. Acts 200, 2004 W. Va. Acts 178.
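Operationally, the DPPA acts as a gate on each disclosure from a motor vehicle record: the use must fall within 18 U.S.C. § 2721(b), and, after the 2000 amendment, certain uses of “highly restricted personal information” also require consent. The following Python fragment is a minimal, hypothetical sketch of that structure; the permitted-use list is abbreviated and illustrative, not the statutory list, and the consent rule is simplified:

PERMITTED_USES = {
    "government function",        # cf. 18 U.S.C. § 2721(b)(1)
    "normal course of business",  # cf. 18 U.S.C. § 2721(b)(3)
    "litigation",                 # cf. 18 U.S.C. § 2721(b)(4)
}
HIGHLY_RESTRICTED = {"photograph", "social security number",
                     "medical or disability information"}  # § 2725(4)

def may_disclose(field, claimed_use, has_consent):
    # Gate one field of a motor vehicle record under this simplified model.
    if claimed_use not in PERMITTED_USES:
        return False
    if field in HIGHLY_RESTRICTED and not has_consent:
        return False  # simplified model of the 2000 consent requirement
    return True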

(b) Social Security Numbers

Most states have specific protection of Social Security numbers in addition to, and often predating, their data breach statutes. Many have tailored these protections for telecommunication and online commerce. California added extensive Social Security number protection in 2001, along with provision for security freezes:

(a) Except as provided in this section, a person or entity may not do any of the following:

(1) Publicly post or publicly display in any manner an individual's social security number. “Publicly post” or “publicly display” means to intentionally communicate or otherwise make available to the general public.

(2) Print an individual's social security number on any card required for the individual to access products or services provided by the person or entity.

(3) Require an individual to transmit his or her social security number over the Internet, unless the connection is secure or the social security number is encrypted.

(4) Require an individual to use his or her social security number to access an Internet Web site, unless a password or unique personal identification number or other authentication device is also required to access the Internet Web site.

(5) Print an individual's social security number on any materials that are mailed to the individual, unless state or federal law requires the social security number to be on the document to be mailed. Notwithstanding this paragraph, social security numbers may be included in applications and forms sent by mail, including documents sent as part of an application or enrollment process, or to establish, amend or terminate an account, contract or policy, or to confirm the accuracy of the social security number. A social security number that is permitted to be mailed under this section may not be printed, in whole or in part, on a postcard or other mailer not requiring an envelope, or visible on the envelope or without the envelope having been opened.


(6) Sell, advertise for sale, or offer to sell an individual's social security number. For purposes of this paragraph, the following apply:

(A) “Sell” shall not include the release of an individual's social security number if the release of the social security number is incidental to a larger transaction and is necessary to identify the individual in order to accomplish a legitimate business purpose. Release of an individual's social security number for marketing purposes is not permitted.

(B) “Sell” shall not include the release of an individual's social security number for a purpose specifically authorized or specifically allowed by federal or state law. . . .

(g) A person or entity may not encode or embed a social security number in or on a card or document, including, but not limited to, using a barcode, chip, magnetic strip, or other technology, in place of removing the social security number, as required by this section.

Cal. [Civ.] Code § 1798.85 (2017), added by 2001 Cal. Stat. 720, § 7, amended by 2002 Cal. Stat. 664, § 42, 2003 Cal. Stat. 532, § 1, 2004 Cal. Stat. 183, § 35, 282, § 1, 2013 Cal. Stat. 103, § 1, 2014 Cal. Stat. 855, § 3 (exclusions from the ban on sales). (California also provides for government truncation of Social Security numbers. Cal. [Civ.] Code § 1798.89 (2017) (registrations), added by 2007 Cal. Stat. 627, § 1, repealed and replaced by 2009 Cal. Stat. 552, §§ 1–2; Cal. [Com.] Code § 9526.5 (2017) (UCC filings), added by 2007 Cal. Stat. 627, § 2, amended by 2008 Cal. Stat. 179, § 36; Cal. [Gov't] Code §§ 27300–27307 (2017), added by 2007 Cal. Stat. 627, § 8.)

Alabama prohibits public agency disclosure except for governmental use. Ala. Code § 41-13-6 (2015), added by 2006 Ala. Acts 611.

Alaska, as part of 2008 legislation addressing data breaches, provided protection for Social Security numbers. Alaska Stat. §§ 45.48.400–45.48.480 (2017), added by 2008 Alaska Sess. Laws 92, art. 3. (Article 1 addressed data breaches; Article 2, credit report freezes; and Article 4, disposal of records. The protections are similar to the California protections, compare Alaska Stat. § 45.48.400(a) with Cal. [Civ.] Code § 1798.85(a), (g), and, subject to exceptions, “[a] person doing business, including the business of government, may not disclose an individual’s social security number to a third party,” Alaska Stat. § 45.48.430(a). A private right of action is also provided. Alaska Stat. § 45.48.480(b).)

Arizona also provides protection similar to California's. Ariz. Rev. Stat. § 44-1373 (2017), added by 2006 Ariz. Sess. Laws 183, § 2. (No private right of action is provided. Ariz. Rev. Stat. § 44-1373(H).)
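Several of the prohibitions above, in particular Cal. [Civ.] Code § 1798.85(a)(3) and (a)(4), echoed in the Alaska and Arizona provisions, function as configuration requirements for any system that transmits or accepts Social Security numbers online. The following Python fragment is a minimal, hypothetical sketch of how an audit checklist might encode them:

def transmission_ok(connection_secure, ssn_encrypted):
    # Cal. Civ. Code § 1798.85(a)(3): an SSN may be required over the
    # Internet only if the connection is secure or the number itself
    # is encrypted.
    return connection_secure or ssn_encrypted

def website_login_ok(ssn_required, other_authenticator_required):
    # Cal. Civ. Code § 1798.85(a)(4): an SSN may not be the sole
    # credential; a password, PIN, or other authentication device must
    # also be required.
    return (not ssn_required) or other_authenticator_required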


Arkansas protects against display, printing, and nonsecure transmission of Social Security numbers. Ark. Code Ann. § 4-86-107(b) (2017), added by 2005 Ark. Acts 1295, § 1.

Colorado protects against display, printing, and nonsecure transmission of Social Security numbers. Colo. Rev. Stat. § 6-1-715(1) (2017), added by 2006 Colo. Sess. Laws 274, § 1, amended by 2010 Colo. Sess. Laws 419, § 8 (exempting HIPAA).

Connecticut, in connection with 2003 legislation on identity theft, provides protection for Social Security numbers similar to California's. Conn. Gen. Stat. §§ 42-470–42-472d (2017), added by 2003 Conn. Acts 156, § 13, 2009 Conn. Acts 239, §§ 13–19.

Delaware protects against disclosure of nonpublic personal information under the Gramm-Leach-Bliley Act. Del. Code Ann. tit. 18, § 535 (2017), added by 73 Del. Laws 53, § 1 (2001).

Georgia protects against public posting or display of Social Security numbers and against requiring an individual to transmit or supply an SSN in online commerce. Ga. Code Ann. § 10-1-393.8(a) (2017), added by 2006 Ga. Laws 603.

Hawaii also has Social Security number protection. Haw. Rev. Stat. §§ 487J-1–487J-3 (2017), added by 2006 Haw. Sess. Laws 137, § 2, amended by 2008 Haw. Sess. Laws 10, § 683, 2012 Haw. Sess. Laws 191, § 2, 2013 Haw. Sess. Laws 225, § 3.

Idaho, as part of 2008 legislation providing for credit report freezes, provides that, subject to exceptions, “a person shall not intentionally communicate an individual’s social security number to the general public.” Idaho Code Ann. § 28-52-108(1) (2017), added by 2008 Idaho Sess. Laws 177, § 2. (Idaho also protects Gramm-Leach-Bliley nonpublic personal information. Idaho Code Ann. § 14-1334 (2017), added by 2002 Idaho Sess. Laws 5, § 1.)

Illinois, as part of its Identity Protection Act, 5 Ill. Comp. Stat. 179/1–179/55 (2017), added by 2009 Ill. Laws 874, provides that no person or state or local government agency may publicly post or display, print, or require in online commerce, 5 Ill. Comp. Stat. 179/10(a), (b), or embed Social Security numbers, 5 Ill. Comp. Stat. 179/30. (Illinois also protects biometric identifiers and biometric information in the Biometric Information Privacy Act of 2008, 740 Ill. Comp. Stat. 14/1–14/30 (2017), added by 2008 Ill. Laws 994.)

Indiana, in a new Code chapter on Persons Holding a Customer's Personal Information, Ind. Code §§ 24-4-14-1–24-4-14-8 (2016), added by 2006 Ind. Acts 125, provides for truncation (redaction) or encryption of Social Security numbers. Ind. Code § 24-4-14-7.

Iowa provided in 2008 an express limitation on the use of Social Security numbers from motor vehicle records. Iowa Code § 321.11A(2) (2017), added by 2008 Iowa Acts 1172, § 17.


Kansas in 2006, as part of legislation addressing data breaches and identity theft, 2006 Kan. Sess. Laws 149, §§ 1 (fraudulent scanning of payment cards), 2 (restriction of public display of Social Security numbers), 4 (breach notification), 6 (identity theft), 11–13 (credit report freezes), 14 (destruction of personal information), added protection for Social Security numbers, including the following (subject to exceptions):

Unless required by federal law, no document available for public inspection or copying shall contain an individual's social security number if such document contains such individual's personal information. “Personal information” shall include, but not be limited to, name, address, phone number or e-mail address.

Kan. Stat. Ann. § 75-3520(a)(1) (2017), added by 2006 Kan. Sess. Laws 149, § 2. (Kansas had earlier prohibited taking personal information upon use of a credit card. Kan. Stat. Ann. § 50-669a (2017), added by 1999 Kan. Sess. Laws 144, § 1. Kansas also provided for Gramm-Leach-Bliley compliance. Kan. Stat. Ann. § 40-2404(15) (2017), added by 2000 Kan. Sess. Laws 147, § 46, renumbered.)

Kentucky provides for the collection of a Social Security number for marriage licenses but limits its display. Ky. Rev. Stat. Ann. § 402.100(4) (2017), added by 2000 Ky. Acts 428, § 1, amended by 2006 Ky. Acts 101, § 1, 2016 Ky. Acts 132, § 1.

Louisiana also provides for the collection of a Social Security number for marriage licenses and for limitation on its display. La. Rev. Stat. § 9:224(A)(6) (2017). It also protects Social Security numbers in the context of driver's licenses, La. Rev. Stat. § 32:409.1(A)(2)(d)(vi) (2017), and professional licensure, La. Rev. Stat. § 37:23 (2017). Louisiana also protects against collection of personal information for cash sales: “No retail business shall require a consumer’s name, address, telephone number, or other personal information when completing a consumer transaction for cash sale.” La. Rev. Stat. § 51:1421(A) (2017), added by 2003 La. Acts 534, § 1.

Maine provides: “A business operating in this State may not display a social security number on a credit card, customer service card or debit card issued or distributed by that business on or after January 1, 1994.” Me. Rev. Stat. Ann. tit. 10, § 1272 (2017). Maine also provides that “a person, corporation or other entity may not deny goods or services to an individual because the individual refuses to provide a social security number.” Me. Rev. Stat. Ann. tit. 10, § 1272-B(1) (2017).

Maryland protects against display, printing, and nonsecure transmission of Social Security numbers. Md. Code Ann., [Com. Law] § 14-3402 (2017), added by 2005 Md. Laws 521, § 1. Maryland in 2010 prohibited the public posting of Social Security numbers by courts and other public agencies. 2010 Md. Laws 452, § 1; Md. Code Ann., [Cts. & Jud. Proc.] § 1-205 (2017); Md. Code Ann., [Real Prop.] § 3-111 (2014); Md. Code Ann., [State Gov't] § 2-1804 (2017); Md. Code Ann., [State Gov't] § 8-504 (2017) (websites).


Michigan in 2004 enacted a Social Security Number Privacy Act, 2004 Mich. Pub. Acts 454, §§ 2–6, that prohibits display, printing, and nonsecure transmission of nontruncated (more than four sequential digits of) Social Security numbers. Mich. Comp. Laws § 445.83 (2017). (Section 4 provides for truncation of Social Security numbers in compliance with Freedom of Information Act requests. Mich. Comp. Laws § 380.1135 (2015).) A private right of action is provided. Mich. Comp. Laws § 445.86(2) (2017). (Michigan has made the surreptitious capture of personal information in a financial transaction a felony. Mich. Comp. Laws § 750.539k (2017).)

Minnesota protects against display, printing, and nonsecure transmission of Social Security numbers. Minn. Stat. § 325E.59 (2017), added by 2005 Minn. Laws 163, § 85, amended by 2006 Minn. Laws 253, § 19, 2007 Minn. Laws 129, § 55, 2008 Minn. Laws 333, § 1. (The 2005 legislation also provided for agency data breach notification (Section 21) and motor vehicle information privacy (Section 58, Minn. Stat. § 168.346 (2017)).) In 2006, additional prohibitions were enacted against

(6) assign[ing] or us[ing] a number as the primary account identifier that is identical to or incorporates an individual's complete Social Security number, except in conjunction with an employee or member retirement or benefit plan or human resource or payroll administration; or

(7) sell[ing] Social Security numbers obtained from individuals in the course of business.

Minn. Stat. § 325E.59(a)(6), (7) (2015), added by 2006 Minn. Laws 253, § 19. (The 2006 legislation also provided for genetic information privacy (Section 4, Minn. Stat. § 13.386 (2017)) and fraud related to telephone records (Section 20, Minn. Stat. § 325F.675 (2017)).)

Missouri protects against display, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. Mo. Rev. Stat. § 407.1355 (2017), added by 2003 Mo. Laws 61, amended by 2005 Mo. Laws 353, 2012 Mo. Laws 1318. Subsequently, the legislature added prohibitions against

(4) Requir[ing] an individual to use his or her Social Security number as an employee number for any type of employment-related activity;

Mo. Rev. Stat. § 407.1355(a)(4), added by 2005 Mo. Laws 353, and against

(6) Requir[ing] an individual to use the last four digits of his or her Social Security number as an employee number for any type of employment-related activity.

Mo. Rev. Stat. § 407.1355(a)(6), added by 2012 Mo. Laws 1318.

Montana provides for state agency protection of Social Security numbers. Mont. Code Ann. § 2-6-502 (2017), added by 2009 Mont. Laws 163, § 2.
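The truncation convention referenced by Michigan, under which more than four sequential digits is “nontruncated,” matches the common practice of displaying only the last four digits; other states draw the line elsewhere (South Carolina, below, treats six or more digits as untruncated). The following Python fragment is a minimal, hypothetical masking helper illustrating the last-four convention:

import re

def truncate_ssn(ssn, keep=4):
    # Mask all but the last `keep` digits, restoring the usual grouping.
    digits = re.sub(r"\D", "", ssn)
    if len(digits) != 9:
        raise ValueError("expected a nine-digit Social Security number")
    masked = "X" * (9 - keep) + digits[-keep:]
    return f"{masked[0:3]}-{masked[3:5]}-{masked[5:9]}"

# truncate_ssn("123-45-6789") returns "XXX-XX-6789"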


New Jersey in 2005, as part of its Identity Theft Prevention Act, provided protection against display, printing, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. N.J. Stat. Ann. § 56:8-164 (2017), added by 2005 N.J. Laws 226, § 13; see also N.J. Admin. Code § 13:45F-4.1 (2017). There is also a prohibition against publicly recording documents with Social Security numbers. N.J. Stat. Ann. § 47:1-16 (2017), added by 2005 N.J. Laws 99, § 1.

New Mexico enacted in 2003 a Privacy Protection Act, N.M. Stat. §§ 57-12B-1–57-12B-3 (2017), added by 2003 N.M. Laws 169, §§ 1–3 (Section 4 of the legislation provided for truncation of credit card numbers), that prohibits businesses from collecting Social Security numbers except for certain credit transactions, and then only with security safeguards. N.M. Stat. § 57-12B-3. In 2005, it added protections, similar to those of other states, against printing, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. N.M. Stat. § 57-12B-4, added by 2005 N.M. Laws 127, § 2.

New York protects against display of Social Security numbers in the education and employment contexts. N.Y. Educ. Law § 2-b (2017), added by 2000 N.Y. Laws 214, § 1; N.Y. Labor Law § 203-d (2017), added by 2008 N.Y. Laws 279, § 6.

North Carolina in 2005, as part of its Identity Theft Protection Act, N.C. Gen. Stat. §§ 75-60–75-65 (2017), added by 2005 N.C. Sess. Laws 414, § 1, amended by 2006 N.C. Sess. Laws 158, § 1, 2009 N.C. Sess. Laws 355, §§ 1, 2, 550, § 5, 573, § 10, provided protection against display, printing, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. N.C. Gen. Stat. § 75-62. It also provided policy and similar protections for Social Security numbers collected by state agencies. N.C. Gen. Stat. § 132-1.10 (2017), added by 2005 N.C. Sess. Laws 414, § 1, amended by 2006 N.C. Sess. Laws 173, § 2.

North Dakota protects against recording documents with Social Security numbers, N.D. Cent. Code § 11-18-23 (2017), added by 2003 N.D. Laws 382, § 1, and against agency release except under certain circumstances, N.D. Cent. Code § 44-04-28 (2017), added by 2003 N.D. Laws 382, § 9, amended by 2007 N.D. Laws 388, § 10. North Dakota also made it a crime to use without authorization personal information, including Social Security numbers, e-mail addresses, and login credentials, to obtain goods, services, or employment, or to initiate a contract or gain access to personal information. N.D. Cent. Code § 12.1-23-11 (2017), added by 1999 N.D. Laws 128, § 1, amended by 2005 N.D. Laws 116, § 1, 2005 N.D. Laws, § 1, 2013 N.D. Laws 105, § 1, 2013 N.D. Laws 106, § 1, 2013 N.D. Laws 107, § 1.

Ohio requires Social Security numbers on publicly available public records to be redacted, encrypted, or truncated, Ohio Rev. Code Ann. § 149.45 (2017), added by 2007 Ohio Laws HB 46, § 1, and to be omitted from recorded documents, Ohio Rev. Code Ann. § 317.082 (2017), added by 2005 Ohio Laws HB 279, § 1. There are general restrictions on recording credit card, telephone, and Social Security numbers. Ohio Rev. Code Ann. §§ 1349.17, 1349.18 (2017), added by 1991 Ohio Laws HB 20, § 1, amended by 1993 Ohio Laws HB 266, § 1. Ohio makes it the crime of “identity fraud” to use without authorization “personal identifying information,” including Social Security numbers, passwords, or credit card numbers, to “[h]old the person out to be the other person” or to “[r]epresent the other person’s personal identifying information as the person’s own personal identifying information.” Ohio Rev. Code Ann. § 2913.49 (2017), added by 1999 Ohio Laws SB 7, § 1.


Oklahoma makes it the crime of “identity theft” to fraudulently obtain, use, or offer the use of, or willfully create, modify, alter, or change personally identifying information, including Social Security numbers, to obtain or attempt to obtain anything of value. Okla. Stat. tit. 21, § 1533.1 (2017), added by 2007 Okla. Sess. Laws 167, § 1.

Oregon prohibits printing, publicly displaying, or publicly posting Social Security numbers. Or. Rev. Stat. § 646A.620 (2017), added by 2007 Or. Laws 759, § 11. It specifically regulates disclosure by the Department of Transportation. Or. Rev. Stat. § 802.195 (2017), added by 2003 Or. Laws 610, § 3.

Pennsylvania protects against printing, display, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. 74 Pa. Cons. Stat. § 201 (2017), added by 2006 Pa. Laws 60, § 1. In 2006, Pennsylvania also enacted the Social Security Number Privacy Act, 71 Pa. Cons. Stat. §§ 2601–2608 (2017), added by 2006 Pa. Laws 160, §§ 1–8, providing for administrative use of Social Security number alternatives, 71 Pa. Cons. Stat. § 2603, and prohibiting their use on health insurance cards, 71 Pa. Cons. Stat. § 2605.

Rhode Island protects against printing, display, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. R.I.G.L. § 6-48-1 (2017), added by 2006 R.I. Pub. Laws 226, § 1, amended by 2006 R.I. Pub. Laws 270, § 1, 2011 R.I. Pub. Laws 57, § 2, 2011 R.I. Pub. Laws 69, § 2. (Rhode Island also prohibits recording personal information in credit card transactions. R.I.G.L. § 6-13-16 (2017), added by 1993 R.I. Pub. Laws 351, § 1, amended by 2014 R.I. Pub. Laws 528, § 9.)

South Carolina protects against printing, display, nonsecure transmission, trafficking, or (sole) use for login to a website of untruncated (six or more digits) Social Security numbers. S.C. Code Ann. § 37-20-180 (2017), added by 2008 S.C. Acts 190, § 2. Public agencies are regulated in their treatment of personal information and are restricted from disclosing Social Security numbers. S.C. Code Ann. §§ 30-2-310–30-2-340 (2017), added by 2008 S.C. Acts 190, § 3.B; see also S.C. Code Ann. §§ 30-2-10–30-2-50 (2017), added by 2002 S.C. Acts 225, § 1 (Family Privacy Protection Act).

Tennessee provides protection against display, printing, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. Tenn. Code Ann. § 47-18-2110 (2017), added by 2007 Tenn. Pub. Acts 170, § 6 (part of the Credit Security Act of 2007), amended by 2009 Tenn. Pub. Acts 269, §§ 1, 2 (adding protection against printing on identification cards or badges used to obtain benefits). Tennessee also provides for redaction of Social Security numbers on public documents. Tenn. Code Ann. § 10-7-515 (2017), added by 2003 Tenn. Pub. Acts 293, § 1.

Texas provides protection against display, printing, nonsecure transmission, or (sole) use for login to a website of Social Security numbers. Tex. Bus. & Com. Code §§ 501.001, 501.002 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01.


Texas has a mix of protections for Social Security numbers held by government agencies. Tex. Gov't Code § 552.147 (2017), added by 2005 Tex. Gen. Laws 397, § 1, amended by 2007 Tex. Gen. Laws 3, § 1 (disclosure by agency personnel not misconduct; redaction allowed), 2013 Tex. Gen. Laws 183, § 2.

Utah protects against display of a Social Security number “in a manner or location that is likely to be open to public view.” Utah Code Ann. § 13-45-301(a) (2017), added by 2006 Utah Laws 344, § 8 (part of the Consumer Credit Protection Act). (Utah also enacted a Notice of Intent to Sell Nonpublic Personal Information Act, Utah Code Ann. §§ 13-37-201–13-37-203 (2017), added by 2003 Utah Laws 97, §§ 1–5, in effect requiring a privacy notice.)

Vermont, in its Social Security Number Protection Act, provides protection against display, printing, nonsecure transmission, or (sole) use for login to a website of Social Security numbers, as well as against sales of nontruncated numbers without consent. Vt. Stat. Ann. tit. 9, § 2440 (2017), added by 2005 Vt. Acts & Resolves 162, § 1. Administrative agencies are limited in their use of Social Security numbers. Vt. Stat. Ann. tit. 9, § 2480m (2017), added by 2003 Vt. Acts & Resolves 155, § 3.

Virginia provides protection against display (including on envelopes and packages), printing, or (sole) use for login to a website of Social Security numbers. Va. Code Ann. § 59.1-443.2 (2017), added by 2005 Va. Acts 640. (Virginia also limits merchant sale of purchase information without notice and the scanning and use of information from machine-readable identification cards or licenses. Va. Code Ann. §§ 59.1-442, 59.1-443.3 (2017), added by 2014 Va. Acts 789, 795. Agency use is limited by the Protection of Social Security Numbers Act. Va. Code Ann. §§ 2.2-3815, 2.2-3816 (2017), added by 2009 Va. Acts 213.)

West Virginia protects against agency release of Social Security numbers and credit or debit card numbers to nongovernmental entities. W. Va. Code § 5A-8-22 (2017), added by 2005 W. Va. Acts 122.

(c) Security and Disposal of Records

Notwithstanding the substantial uniformity in data security breach notification, protection of data security itself is less uniform, with states dealing differently with the disposal of data. California has the following general provision:

Destruction of records

A business shall take all reasonable steps to dispose, or arrange for the disposal, of customer records within its custody or control containing personal information when the records are no longer to be retained by the business by (A) shredding, (B) erasing, or (C) otherwise modifying the personal information in those records to make it unreadable or undecipherable through any means.

Cal. [Civ.] Code § 1798.81 (2017), added by 2000 Cal. Stat. 1039, § 1, amended by 2009 Cal. Stat. 134, § 2 (changing “destroy” to “dispose”) (emphasis added). The following other states have general record security and disposal requirements.

Alaska provides that “[w]hen disposing of records that contain personal information, a business and a governmental agency shall take all reasonable measures necessary to protect against unauthorized access to or use of the records.” Alaska Stat. § 45.48.500 (2016), added by 2008 Alaska Stat. ch. 92, § 4. Such measures include

(1) implementing and monitoring compliance with policies and procedures that require the burning, pulverizing, or shredding of paper documents containing personal information so that the personal information cannot practicably be read or reconstructed;

(2) implementing and monitoring compliance with policies and procedures that require the destruction or erasure of electronic media and other non-paper media containing personal information so that the personal information cannot practicably be read or reconstructed[.]

Alaska Stat. § 45.48.510 (2016), added by 2008 Alaska Stat. ch. 92, § 4.

Arizona provides that

[a]n entity shall not knowingly discard or dispose of records or documents without redacting the information or destroying the records or documents if the records or documents contain an individual’s first and last name or first initial and last name in combination with a corresponding complete [social security number, credit card, charge card or debit card number, retirement account number, savings, checking or securities entitlement account number or driver license number or nonoperating identification license number].

Ariz. Rev. Stat. § 44-7601(A) (2017), added by 2006 Ariz. Sess. Laws 208, § 1, amended by 2016 Ariz. Sess. Laws 102, § 2 (exempting business associates under HIPAA).

Arkansas provides:

(a) A person or business shall take all reasonable steps to destroy or arrange for the destruction of a customer’s records within its custody or control containing personal information that is no longer to be retained by the person or business by shredding, erasing, or otherwise modifying the personal information in the records to make it unreadable or undecipherable through any means.
(b) A person or business that acquires, owns, or licenses personal information about an Arkansas resident shall implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information from unauthorized access, destruction, use, modification, or disclosure.

Ark. Code Ann. § 4-110-104 (2017) (emphasis added), added by 2005 Ark. Acts 1526, § 1.

Colorado provides:

(1) Each public and private entity in the state that uses documents during the course of business that contain personal identifying information shall develop a policy for the destruction or proper disposal of paper documents containing personal identifying information. . . .

Colo. Rev. Stat. § 6-1-713 (2017), added by 2004 Colo. Sess. Laws 1959, § 2.

Connecticut provides that

[a]ny person in possession of personal information [list in subsection (c)] of another person shall safeguard the data, computer files and documents containing the information from misuse by third parties, and shall destroy, erase or make unreadable such data, computer files and documents prior to disposal.

Conn. Gen. Stat. § 42-471(a) (2017), added by 2008 Conn. Acts 167, § 1, amended by 2009 Conn. Acts 71, § 1, 239, § 13 (civil penalties to identity fraud fund), 2017 Conn. Acts 137, § 1 (adding “military identification information” to the subsection (c) list).

Georgia requires:

A business may not discard a record containing personal information unless it:

(1) Shreds the customer’s record before discarding the record;

(2) Erases the personal information contained in the customer’s record before discarding the record;

(3) Modifies the customer’s record to make the personal information unreadable before discarding the record; or

(4) Takes actions that it reasonably believes will ensure that no unauthorized person will have access to the personal information contained in the customer’s record for the period between the record’s disposal and the record’s destruction.

Ga. Code Ann. § 10-15-2 (2017), added by 2002 Ga. Laws 551, § 8, amended by 2015 Ga. Laws 187, § 11.
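Practice Note
The disposal standards quoted in this subsection converge on a functional test: after disposal, the personal information must not “practicably be read or reconstructed.” For electronic records, the following is a minimal sketch of one common disposal approach (the function name and the overwrite-then-delete strategy are illustrative only, not anything prescribed by statute); on journaling file systems and solid-state drives, in-place overwriting may leave recoverable copies, so encryption of the media or physical destruction may be needed to satisfy the statutory standard.

```python
import os
import secrets

def dispose_electronic_record(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random bytes, then delete it.

    Illustrative only: whether the data can still "practicably be
    read or reconstructed" depends on the storage medium.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace the plaintext
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to the device
    os.remove(path)  # then remove the directory entry
```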

Hawaii has a records disposal requirement similar to that of Alaska. Haw. Rev. Stat. § 487R-2 (2017), added by 2006 Haw. Acts ch. 136, § 2 (Destruction of Personal Information Records), amended by 2008 Haw. Acts ch. 19, § 72. (A private right of action is provided under Haw. Rev. Stat. § 487R-3(b) (2017), added by 2006 Haw. Acts ch. 136, § 2.)

Illinois requires:

(b) A person must dispose of the materials containing personal information in a manner that renders the personal information unreadable, unusable, and undecipherable. Proper disposal methods include, but are not limited to, the following:

(1) Paper documents containing personal information may be either redacted, burned, pulverized, or shredded so that personal information cannot practicably be read or reconstructed.

(2) Electronic media and other non-paper media containing personal information may be destroyed or erased so that personal information cannot practicably be read or reconstructed.

815 Ill. Comp. Stat. 530/40 (2017), added by 2011 Ill. Laws 483, § 5.

Indiana provides:

(b) A data base owner shall implement and maintain reasonable procedures, including taking any appropriate corrective action, to protect and safeguard from unlawful use or disclosure any personal information of Indiana residents collected or maintained by the data base owner.

(c) A data base owner shall not dispose of records or documents containing unencrypted and unredacted personal information of Indiana residents without shredding, incinerating, mutilating, erasing, or otherwise rendering the personal information illegible or unusable.

Ind. Code § 24-4.9-3-3.5 (2016), added by 2009 Ind. Acts 137, § 5, amended by 2017 Ind. Acts 76, § 4 (regulating health care providers with databases not covered under HIPAA).

Kansas provided, until repeal:

Unless otherwise required by federal law or regulation, a person or business shall take reasonable steps to destroy or arrange for the destruction of a customer’s records within its custody or control containing personal information which is no longer to be retained by the person or business by shredding, erasing or otherwise modifying the personal information
in the records to make it unreadable or undecipherable through any means.

Kan. Stat. Ann. § 50-7a03 (2013), added by 2006 Kan. Laws ch. 149, § 14, repealed by 2016 Kan. Laws ch. 7, § 7.

Kentucky provides:

When a business disposes of, other than by storage, any customer’s records that are not required to be retained, the business shall take reasonable steps to destroy, or arrange for the destruction of, that portion of the records containing personally identifiable information by shredding, erasing, or otherwise modifying the personal information in those records to make it unreadable or indecipherable through any means.

Ky. Rev. Stat. Ann. § 365.725 (2017), added by 2006 Ky. Acts 42, § 5.

Maryland provides:

(b) Destruction of records.—When a business is destroying a customer’s records that contain personal information of the customer, the business shall take reasonable steps to protect against unauthorized access to or use of the personal information, taking into account [the sensitivity of the records, the nature and size of the business and its operations, the costs and benefits of different destruction methods, and available technology].

Md. Code Ann., [Com. Law] § 14-3502 (2017), added by 2007 Md. Laws 531, 532. Maryland also requires implementation of reasonable security measures, including third-party service provider contracts. Md. Code Ann., [Com. Law] § 14-3503 (2017), added by 2007 Md. Laws 531, 532.

Massachusetts requires:

When disposing of records, each agency or person shall meet the following minimum standards for proper disposal of records containing personal information:

(a) paper documents containing personal information shall be either redacted, burned, pulverized or shredded so that personal data cannot practicably be read or reconstructed;

(b) electronic media and other non-paper media containing personal information shall be destroyed or erased so that personal information cannot practicably be read or reconstructed.

Any agency or person disposing of personal information may contract with a third party to dispose of personal information in accordance with this chapter. Any third party hired to dispose
of material containing personal information shall implement and monitor compliance with policies and procedures that prohibit unauthorized access to or acquisition of or use of personal information during the collection, transportation and disposal of personal information.

G.L. c. 93I, § 2 (2017), added by 2007 Mass. Acts c. 82, § 17. Pursuant to the companion act, G.L. c. 93H, § 2(a), added by 2007 Mass. Acts c. 82, § 16, the Massachusetts Office of Consumer Affairs and Business Regulation promulgated the following:

Duty to Protect and Standards for Protecting Personal Information

(1) Every person that owns or licenses personal information about a resident of the Commonwealth shall develop, implement, and maintain a comprehensive information security program that is written in one or more readily accessible parts and contains administrative, technical, and physical safeguards that are appropriate to: (a) the size, scope and type of business of the person obligated to safeguard the personal information under such comprehensive information security program; (b) the amount of resources available to such person; (c) the amount of stored data; and (d) the need for security and confidentiality of both consumer and employee information. The safeguards contained in such program must be consistent with the safeguards for protection of personal information and information of a similar character set forth in any state or federal regulations by which the person who owns or licenses such information may be regulated.

(2) Without limiting the generality of the foregoing, every comprehensive information security program shall include, but shall not be limited to:

(a) Designating one or more employees to maintain the comprehensive information security program;

(b) Identifying and assessing reasonably foreseeable internal and external risks to the security, confidentiality, and/or integrity of any electronic, paper or other records containing personal information, and evaluating and improving, where necessary, the effectiveness of the current safeguards for limiting such risks, including but not limited to:
1. ongoing employee (including temporary and contract employee) training;

2. employee compliance with policies and procedures; and

3. means for detecting and preventing security system failures.

(c) Developing security policies for employees relating to the storage, access and transportation of records containing personal information outside of business premises.

(d) Imposing disciplinary measures for violations of the comprehensive information security program rules.

(e) Preventing terminated employees from accessing records containing personal information.

(f) Oversee service providers, by:

1. Taking reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect such personal information consistent with 201 CMR 17.00 and any applicable federal regulations; and

2. Requiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information. . . .

(g) Reasonable restrictions upon physical access to records containing personal information, and storage of such records and data in locked facilities, storage areas or containers.

(h) Regular monitoring to ensure that the comprehensive information security program is operating in a manner reasonably calculated to prevent unauthorized access to or unauthorized use of personal information; and upgrading information safeguards as necessary to limit risks.

(i) Reviewing the scope of the security measures at least annually or whenever there is a material change in business practices that may reasonably implicate the security or integrity of records containing personal information.

(j) Documenting responsive actions taken in connection with any incident involving a breach of security, and mandatory post-incident review of events and actions taken, if any, to make changes in business practices relating to protection of personal information.

201 C.M.R. § 17.03.
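Practice Note
The elements of 201 C.M.R. § 17.03(2) lend themselves to a compliance checklist. The sketch below is a hypothetical tracking aid only; the regulation requires a written program, not any particular software, and the element labels paraphrase subsections (a) through (j).

```python
# Hypothetical checklist of the 201 CMR 17.03(2) program elements;
# the labels paraphrase subsections (a) through (j).
WISP_ELEMENTS = {
    "a": "Employee(s) designated to maintain the program",
    "b": "Risks identified and current safeguards evaluated",
    "c": "Policies for records taken off business premises",
    "d": "Disciplinary measures for program violations",
    "e": "Terminated employees' access prevented",
    "f": "Service providers vetted and bound by contract",
    "g": "Physical access restricted; records locked",
    "h": "Program operation regularly monitored",
    "i": "Scope of measures reviewed at least annually",
    "j": "Breach responses documented and reviewed",
}

def open_items(status):
    """Return the 17.03(2) elements not yet marked complete."""
    return [f"({k}) {desc}" for k, desc in WISP_ELEMENTS.items()
            if not status.get(k)]

# Example: only elements (a) and (e) are done so far.
print(open_items({"a": True, "e": True}))
```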

Computer System Security Requirements

Every person that owns or licenses personal information about a resident of the Commonwealth and electronically stores or transmits such information shall include in its written, comprehensive information security program the establishment and maintenance of a security system covering its computers, including any wireless system, that, at a minimum, and to the extent technically feasible, shall have the following elements:

(1) Secure user authentication protocols including:

(a) control of user IDs and other identifiers;

(b) a reasonably secure method of assigning and selecting passwords, or use of unique identifier technologies, such as biometrics or token devices;

(c) control of data security passwords to ensure that such passwords are kept in a location and/or format that does not compromise the security of the data they protect;

(d) restricting access to active users and active user accounts only; and

(e) blocking access to user identification after multiple unsuccessful attempts to gain access or the limitation placed on access for the particular system;

(2) Secure access control measures that:

(a) restrict access to records and files containing personal information to those who need such information to perform their job duties; and

(b) assign unique identifications plus passwords, which are not vendor supplied default passwords, to each person with computer access, that are reasonably designed to maintain the integrity of the security of the access controls;

(3) Encryption of all transmitted records and files containing personal information that will travel across public networks, and encryption of all data containing personal information to be transmitted wirelessly.

(4) Reasonable monitoring of systems, for unauthorized use of or access to personal information;

(5) Encryption of all personal information stored on laptops or other portable devices;

(6) For files containing personal information on a system that is connected to the Internet, there must be reasonably up-to-date firewall protection and operating system security patches,
reasonably designed to maintain the integrity of the personal information.

(7) Reasonably up-to-date versions of system security agent software which must include malware protection and reasonably up-to-date patches and virus definitions, or a version of such software that can still be supported with up-to-date patches and virus definitions, and is set to receive the most current security updates on a regular basis.

(8) Education and training of employees on the proper use of the computer security system and the importance of personal information security.

201 C.M.R. § 17.04 (2017).

Michigan provides that a person or agency that maintains a database that includes personal information regarding multiple individuals shall destroy any data that contain personal information concerning an individual when that data is removed from the database and the person or agency is not retaining the data elsewhere for another purpose not prohibited by state or federal law. Mich. Comp. Laws § 445.72a (2017), added by 2006 Mich. Pub. Acts 566.

Montana provides that

[a] business shall take all reasonable steps to destroy or arrange for the destruction of a customer’s records within its custody or control containing personal information that is no longer necessary to be retained by the business by shredding, erasing, or otherwise modifying the personal information in those records to make it unreadable or undecipherable.

Mont. Code Ann. § 30-14-1703 (2017), added by 2005 Mont. Laws 518, § 6. (As part of the same law, Montana also added credit card procedures to impede identity theft, including a form of “red flag” rule. Mont. Code Ann. §§ 30-14-1721, 30-14-1722 (2017), added by 2005 Mont. Laws 518, §§ 2, 3.)

Nevada provides that

[a] business that maintains records which contain personal information concerning the customers of the business shall take reasonable measures to ensure the destruction of those records when the business decides that it will no longer maintain the records. . . . “Reasonable measures to ensure the destruction” means any method that modifies the records containing the personal
information in such a way as to render the personal information contained in the records unreadable or undecipherable, including, without limitation: (1) Shredding of the record containing the personal information; or (2) Erasing of the personal information from the records.

Nev. Rev. Stat. § 603A.200 (2017), added by 2005 Nev. Stat. 485, § 22. Data collectors must also “implement and maintain reasonable security measures to protect those records from unauthorized access, acquisition, destruction, use, modification or disclosure.” Nev. Rev. Stat. § 603A.210(1) (2017), added by 2005 Nev. Stat. 485, § 23. In addition, data collectors accepting payment cards for sales of goods or services must comply with the Payment Card Industry (PCI) data security standard, and other data collectors may not transmit, or move in storage devices beyond their logical or physical control, unencrypted personal information. Nev. Rev. Stat. § 603A.215 (2017) (emphasis added), added by 2009 Nev. Laws ch. 355, § 1, amended by 2011 Nev. Laws ch. 354, § 6 (adding data collector responsibility for “multi-functional devices”).

New Jersey requires that

[a] business or public entity shall destroy, or arrange for the destruction of, a customer’s records within its custody or control containing personal information, which is no longer to be retained by the business or public entity, by shredding, erasing, or otherwise modifying the personal information in those records to make it unreadable, undecipherable or nonreconstructable through generally available means.

N.J. Stat. Ann. § 56:8-162 (2017), added by 2005 N.J. Laws 226, § 11. (New Jersey also provides for secure handling of Social Security numbers and for promulgation of regulations. N.J. Stat. Ann. §§ 56:8-164, 56:8-165 (2017).)

New York prohibits a nonstate entity from “dispos[ing] of a record containing personal identifying information” unless it

a. shreds the record before the disposal of the record; or

b. destroys the personal identifying information contained in the record; or

c. modifies the record to make the personal identifying information unreadable; or

d. takes actions consistent with commonly accepted industry practices that it reasonably believes will ensure that no unauthorized person will have access to the personal identifying information contained in the record.

N.Y. Gen. Bus. Law § 399-h (2017), added by 2006 N.Y. Laws 65, § 1, amended by 2008 N.Y. Laws 516, §§ 1, 2. (New York also has a law regulating “document destruction contractors.” N.Y. Gen. Bus. Law §§ 899-aaa, 899-bbb (2017), added by 2007 N.Y. Laws 679, § 1.)

North Carolina has a personal information record disposal requirement similar to those of Alaska and Hawaii. N.C. Gen. Stat. § 75-64 (2017), added by 2005 N.C. Sess. Laws 414, § 1. (Social Security number protection was also enacted at the same time. N.C. Gen. Stat. § 75-62 (2017), added by 2005 N.C. Sess. Laws 414, § 1.)

Oregon requires that “[a]ny person that owns, maintains or otherwise possesses data that includes a consumer’s personal information . . . develop, implement and maintain reasonable safeguards to protect the security, confidentiality and integrity of the personal information, including disposal of the data.” That disposal includes “burning, pulverizing, shredding or modifying a physical record and by destroying or erasing electronic media so that the information cannot be read or reconstructed.” Or. Rev. Stat. § 646A.622 (2017), added by 2007 Or. Laws 759, § 12, amended by 2015 Or. Laws 357, § 3. Subsection (2)(d) sets forth “administrative,” “technical,” and “physical” safeguards similar to those set forth under HIPAA and its regulations. (Oregon also has a provision to specifically protect Social Security numbers. Or. Rev. Stat. § 646A.620 (2017), added by 2007 Or. Laws 759, § 11.)

Rhode Island provides that

[a] business shall take reasonable steps to destroy or arrange for the destruction of a customer’s personal information within its custody and control that is no longer to be retained by the business by shredding, erasing, or otherwise destroying and/or modifying the personal information in those records to make it unreadable or indecipherable through any means for the purpose of:

(1) Ensuring the security and confidentiality of customer personal information;

(2) Protecting against any reasonably foreseeable threats or hazards to the security or integrity of customer personal information; and

(3) Protecting against unauthorized access to, or use of, customer personal information that could result in substantial harm or inconvenience to any customer.

R.I.G.L. § 6-52-2 (2017), added by 2009 R.I. Pub. Laws 247, § 1, 2009 R.I. Pub. Laws 285, § 1, amended by 2014 R.I. Pub. Laws 528, § 39 (punctuation).

South Carolina requires that “[w]hen a business disposes of a business record that contains personal identifying information of a customer of a business, the business shall modify, by shredding, erasing, or other means, the personal identifying information to make it unreadable or undecipherable.” S.C. Code Ann. § 37-20-190
(2017), added by 2008 S.C. Acts 190, § 2. (South Carolina also has a statute governing use of Social Security numbers by public agencies. S.C. Code Ann. § 30-2-310 (2017), added by 2008 S.C. Acts 190, § 3.B.)

Texas has a requirement similar to South Carolina’s, Tex. Bus. & Com. Code Ann. § 72.004 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01, and also requires that a business implement and maintain reasonable procedures to protect sensitive personal information and destroy customer records containing such information that are not to be retained, by shredding, erasing, or “otherwise modifying the sensitive personal information in the records to make the information unreadable or indecipherable through any means.” Tex. Bus. & Com. Code Ann. § 521.052 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01, amended by 2009 Tex. Gen. Laws 419, § 2.

Utah has procedural and document destruction requirements similar to those of Texas. Utah Code Ann. § 13-44-201 (2017), added by 2006 Utah Laws 343, § 3.

Vermont provides:

(b) A business shall take all reasonable steps to destroy or arrange for the destruction of a customer’s records within its custody or control containing personal information which is no longer to be retained by the business by shredding, erasing, or otherwise modifying the personal information in those records to make it unreadable or indecipherable through any means . . . .

(c) An entity that is in the business of disposing of personal financial information that conducts business in Vermont or disposes of personal information of residents of Vermont must take all reasonable measures to dispose of records containing personal information by implementing and monitoring compliance with policies and procedures that protect against unauthorized access to or use of personal information during or after the collection and transportation and disposing of such information.

Vt. Stat. Ann. tit. 9, § 2445 (2017), added by 2005 Vt. Acts & Resolves 162, § 1, amended by 2011 Vt. Acts & Resolves 78, § 2. (Vermont also specifically protects Social Security numbers. Vt. Stat. Ann. tit. 9, § 2440 (2017), added by 2005 Vt. Acts & Resolves 162, § 1, amended by 2011 Vt. Acts & Resolves 78, § 2.)

Washington provides:

An entity must take all reasonable steps to destroy, or arrange for the destruction of, personal financial and health information and personal identification numbers issued by government entities in an individual’s records within its custody or control when the entity is disposing of records that it will no longer retain.

Wash. Rev. Code § 19.215.020 (2017), added by 2002 Wash. Laws 90, § 3.

Wisconsin provides, for three subclasses of data collectors:

A financial institution, medical business or tax preparation business may not dispose of a record containing personal information unless the financial institution, medical business, tax preparation business or other person under contract with the financial institution, medical business or tax preparation business does any of the following:

(a) Shreds the record before the disposal of the record.

(b) Erases the personal information contained in the record before the disposal of the record.

(c) Modifies the record to make the personal information unreadable before the disposal of the record.

(d) Takes actions that it reasonably believes will ensure that no unauthorized person will have access to the personal information contained in the record for the period between the record’s disposal and the record’s destruction.

Wis. Stat. § 134.97 (2017), added by 1999 Wis. Laws 9, § 3113n (895.505), renumbered by 2005 Wis. Laws 155, § 52.

Thus, most of the states that require security procedures state their requirements in the general terms of “reasonable steps.” Massachusetts, Nevada, North Carolina, and Oregon are more specific. Nevada adopts the PCI industry standard and provides a limited “safe harbor” against liability for compliance with that standard.
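Practice Note
Among the more specific regimes, 201 C.M.R. § 17.04(3) and (5) and Nev. Rev. Stat. § 603A.215 effectively require encryption of personal information sent across public networks or stored on portable devices. The rules are technology-neutral; the minimal sketch below, which happens to use the third-party Python “cryptography” package, shows one way a record might be encrypted before it is stored or transmitted (the file name and sample record are illustrative, and no statute or regulation mandates this particular library or algorithm).

```python
# pip install cryptography  (one illustrative library among many)
from cryptography.fernet import Fernet

# Generate a key and keep it separate from the data it protects
# (cf. the password-handling idea in 201 CMR 17.04(1)(c)).
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"Doe, Jane | acct 0000"    # fake personal information
ciphertext = fernet.encrypt(record)  # encrypt before storage or transit

# Only ciphertext is written to the portable device or network.
with open("record.enc", "wb") as out:
    out.write(ciphertext)

# An authorized holder of the key can later recover the record.
with open("record.enc", "rb") as f:
    assert fernet.decrypt(f.read()) == record
```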

(d) Internet Collection of Personal Information

Several states have enacted statutes specifically directed to the collection of personal information by Internet service providers and website operators, including (or only for) the government. Such collection may be considered “intrusive” in the sense that the collection is automatic unless the user of the services “opts out” of the collection, which many users fail to do without much thought.

California in 2003 enacted the Online Privacy Protection Act of 2003, Cal. [Bus. & Prof.] Code §§ 22575–22579 (2015), added by 2003 Cal. Stat. ch. 829, § 3, amended by 2013 Cal. Stat. ch. 390, § 1 (adding Section 22575(5)–(7), effective 2014), which requires websites collecting personal information from California residents to post conspicuous privacy policies and, as amended, to disclose the effect of “do not track” signals.

(a) An operator of a commercial Web site or online service that collects personally identifiable information through the Internet about individual consumers residing in California who use or visit its commercial Web site or online service shall conspicuously post its privacy policy on its Web site, or in the case of an operator of an online service, make that policy
available in accordance with paragraph (5) of subdivision (b) of Section 22577. An operator shall be in violation of this subdivision only if the operator fails to post its policy within 30 days after being notified of noncompliance.

(b) The privacy policy required by subdivision (a) shall do all of the following:

(1) Identify the categories of personally identifiable information that the operator collects through the Web site or online service about individual consumers who use or visit its commercial Web site or online service and the categories of third-party persons or entities with whom the operator may share that personally identifiable information.

(2) If the operator maintains a process for an individual consumer who uses or visits its commercial Web site or online service to review and request changes to any of his or her personally identifiable information that is collected through the Web site or online service, provide a description of that process.

(3) Describe the process by which the operator notifies consumers who use or visit its commercial Web site or online service of material changes to the operator’s privacy policy for that Web site or online service.

(4) Identify its effective date.

(5) Disclose how the operator responds to Web browser “do not track” signals or other mechanisms that provide consumers the ability to exercise choice regarding the collection of personally identifiable information about an individual consumer’s online activities over time and across third-party Web sites or online services, if the operator engages in that collection.

(6) Disclose whether other parties may collect personally identifiable information about an individual consumer’s online activities over time and across different Web sites when a consumer uses the operator’s Web site or service.

(7) An operator may satisfy the requirement of paragraph (5) by providing a clear and conspicuous hyperlink in the operator’s privacy policy to an online location containing a description, including the effects, of any program or protocol the operator follows that offers the consumer that choice.

Cal. [Bus. & Prof.] Code § 22575.

(a) The term “personally identifiable information” means individually identifiable information about an individual consumer
collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following:

(1) A first and last name.

(2) A home or other physical address, including street name and name of a city or town.

(3) An e-mail address.

(4) A telephone number.

(5) A social security number.

(6) Any other identifier that permits the physical or online contacting of a specific individual.

(7) Information concerning a user that the Web site or online service collects online from the user and maintains in personally identifiable form in combination with an identifier described in this subdivision.

(b) The term “conspicuously post” with respect to a privacy policy shall include posting the privacy policy through any of the following:

(1) A Web page on which the actual privacy policy is posted if the Web page is the homepage or first significant page after entering the Web site.

(2) An icon that hyperlinks to a Web page on which the actual privacy policy is posted, if the icon is located on the homepage or the first significant page after entering the Web site, and if the icon contains the word “privacy.” The icon shall also use a color that contrasts with the background color of the Web page or is otherwise distinguishable.

(3) A text link that hyperlinks to a Web page on which the actual privacy policy is posted, if the text link is located on the homepage or first significant page after entering the Web site, and if the text link does one of the following:

(A) Includes the word “privacy.”

(B) Is written in capital letters equal to or greater in size than the surrounding text.

(C) Is written in larger type than the surrounding text, or in contrasting type, font, or color to the surrounding text of the same size, or set off from the surrounding text of the same size by symbols or other marks that call attention to the language.

Cal. [Bus. & Prof.] Code § 22577(a), (b).
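Practice Note
The “conspicuously post” definition in Section 22577(b)(3) reduces to mechanical criteria that a site audit can test. A rough sketch follows; the function and its parameters are hypothetical (not anything prescribed by the statute), and whether a link sits on the “first significant page” still requires judgment.

```python
def conspicuous_text_link(text, link_size, surrounding_size,
                          contrasts, set_off_by_marks):
    """Rough test of Cal. Bus. & Prof. Code § 22577(b)(3)(A)-(C) for a
    text link already on the homepage or first significant page."""
    includes_privacy = "privacy" in text.lower()                  # (A)
    capitals = text.isupper() and link_size >= surrounding_size   # (B)
    set_off = (link_size > surrounding_size or contrasts          # (C)
               or set_off_by_marks)
    return includes_privacy or capitals or set_off

# "Privacy Policy" qualifies under (A) regardless of size or color.
assert conspicuous_text_link("Privacy Policy", 10, 12, False, False)
```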

Delaware provides for agency posting of Internet privacy policies. 29 Del. Code Ann. §§ 9014C–9019C (2015), added by 74 Del. Laws 127, § 1 (2003), amended by 75 Del. Laws 88, § 21(13) (2005 reference). (Sections 9033C–9020C apply to general state treatment of personal information.)

Illinois also enacted a State Agency Web Site Act. 5 Ill. Comp. Stat. Ann. 177/1–177/15 (2017), added by 2003 Ill. Laws §§ 1–15.

Minnesota in 2002 enacted an Internet privacy article, Minn. Stat. §§ 325M.01–325M.09 (2017), added by 2002 Minn. Laws 395, §§ 1-1–1-9. (The legislation also added an article on False or Misleading Commercial Electronic Mail Messages, Minn. Stat. § 325F.694 (2017).) The Internet privacy article provides: “Except as provided in sections 325M.03 [court process] and 325M.04 [‘ordinary course of business’ and prevention of unlawful diversion], an Internet service provider may not knowingly disclose personally identifiable information concerning a consumer of the Internet service provider.” Minn. Stat. § 325M.02. There is also a requirement of “reasonable steps to maintain the security and privacy of a consumer’s personally identifiable information.” Minn. Stat. § 325M.05. A private right of action is provided. Minn. Stat. § 325M.07.

New York in 2001 enacted, for its state government, an Internet privacy policy act. N.Y. State Tech. Law §§ 201–208 (2017), added by 2001 N.Y. Laws 578, § 1, amended by 2002 N.Y. Laws 17, § 1 (renamed Internet Security and Privacy Act).

Virginia provides that “[e]very public body . . . that has an Internet website associated with that public body shall develop an Internet privacy policy and an Internet privacy policy statement that explains the policy to the public.” Va. Code Ann. § 2.2-3803 (2017), added by 2001 Va. Acts 844.

§ 9.2.3 State Laws Against Internet Intrusion

Most states have enacted criminal provisions against computer trespass, many similar to the federal Computer Fraud and Abuse Act’s prohibition of unauthorized access (see chapter 3 of this book), and some directed to particular types of intrusion that gained public and legislative attention with the expansion of Internet use and abuse. Three classes of state laws against breach of data security through intrusion are discussed here: antispam laws; antimalware (antispyware, antiphishing, and antipharming) laws; and laws limiting employer or school coercion of disclosure of social media access information. Related are laws against “cyberbullying” and “revenge porn.”

(a) State Antispam Legislation

Thirty-eight states have passed laws that restrict the transmission of unsolicited commercial e-mail (spam). Several of those states have laws requiring spam to be labeled with alternative subject lines such as “ADV.” See the spam law summary at http://www.spamlaws.com/state/summary.shtml; see also http://www.ncsl.org/research/telecommunications-and-information-technology/state-spam-laws.aspx. Such requirements have not been applied generally, as non–“falsity or deception” aspects of state antispam legislation are preempted by the federal Control of the Assault
of Non-Solicited Pornography and Marketing (CAN-SPAM) Act of 2003. 15 U.S.C. §§ 7701–7713 (2017), added by Pub. L. No. 108-187, §§ 2–15, 117 Stat. 2699 (Dec. 16, 2003).

Practice Note
The Federal Trade Commission was granted authority to enforce (15 U.S.C. § 7706) and to promulgate regulations (15 U.S.C. § 7711), including a “do not e-mail” directory (15 U.S.C. § 7708). Preemption of state commercial e-mail law is set forth at 15 U.S.C. § 7707(b)(1) (“except to the extent that any such statute, regulation, or rule prohibits falsity or deception in any portion of a commercial electronic mail message or information attached thereto”) (emphasis added).

California antispam legislation began with a 1992 statute addressed to an earlier form of advertising intrusion on consumers: unsolicited facsimile (fax) advertisements. 1992 Cal. Stat. 564, § 1. In 1998, the section was modified to include e-mail, including subject-line labeling requirements. 1998 Cal. Stat. 865, § 1 (emphasis added).

Practice Note
At roughly the same time, a prohibition was enacted against initiating unsolicited e-mail advertisement in violation of the policy of an e-mail service provider, without a requirement that the service provider establish such a policy. Cal. [Bus. & Prof.] Code § 17538.45(b)–(d) (2017), added by 1998 Cal. Stat. 863, § 2, amended by 2003 Cal. Stat. 487, § 3 (no action under both this section and the new general unsolicited e-mail Section 17529), 2004 Cal. Stat. 183, § 15 (technical amendment). California also protects against unsolicited text messaging. Cal. [Bus. & Prof.] Code § 17538.41 (2017), added by 2002 Cal. Stat. 699, § 1, amended by 2005 Cal. Stat. 711, § 1.

However, the enactment of CAN-SPAM in 2003 and its preemption led to the establishment of a different regime under a new Article 1.8 of the Business and Professions Code, entitled “Restrictions On Unsolicited Commercial E-mail Advertisers.” Cal. [Bus. & Prof.] Code §§ 17529–17529.9 (2017), added by 2003 Cal. Stat. 487, § 1, amended by 2004 Cal. Stat. 183, § 14 (technical amendment).

Practice Note
The broadest prohibition, on its face, is against spamming generally. Cal. [Bus. & Prof.] Code § 17529.2. It may be taken as a requirement for “opting in” but has not been tested in any reported decision. Service providers continue to be protected against violations of their antispam policies. Cal. [Bus. & Prof.] Code § 17538.45.

The article, among other things, prohibits, for use in spamming, the collecting or “scraping” of e-mail addresses posted on the Internet, Cal. [Bus. & Prof.] Code § 17529.4(a); using an e-mail address “obtained by using automated means based on a combination of names, letters, or numbers,” Cal. [Bus. & Prof.] Code § 17529.4(b);
and using “scripts or other automated means to register for multiple electronic mail accounts,” Cal. [Bus. & Prof.] Code § 17529.4(c). The article further provides:

It is unlawful for any person or entity to advertise in a commercial e-mail advertisement either sent from California or sent to a California electronic mail address under any of the following circumstances:

(1) The e-mail advertisement contains or is accompanied by a third-party’s domain name without the permission of the third party.

(2) The e-mail advertisement contains or is accompanied by falsified, misrepresented, or forged header information. This paragraph does not apply to truthful information used by a third party who has been lawfully authorized by the advertiser to use that information.

(3) The e-mail advertisement has a subject line that a person knows would be likely to mislead a recipient, acting reasonably under the circumstances, about a material fact regarding the contents or subject matter of the message.

Cal. [Bus. & Prof.] Code § 17529.5(a). (A private right of action is provided to recipients of offending e-mail. Cal. [Bus. & Prof.] Code § 17529.5(b)(1)(A)(iii).)

A California appellate court reversed a summary judgment of preemption (for failure to plead common law fraud) in Hypertouch, Inc. v. ValueClick, Inc., 192 Cal. App. 4th 805 (Cal. Ct. App. 2011), a case brought by a service provider under the section against both a spammer and the advertising entity for flooding its customers with spam, ruling that “falsity and deception” are broader in scope than common law fraud.

Practice Note
The court in Rosolowski v. Guthy-Renker LLC, 230 Cal. App. 4th 1403 (Cal. Ct. App. 2014), affirmed dismissal for failure to state a claim under Section 17529.5 where the allegation was “merely” that an incorrect and untraceable name was used in the subject line while the body of the e-mail gave truthful information about an offered deal.

Alaska enacted in 2003 an antispam statute essentially limited to adult materials:

A person may not send unsolicited commercial electronic mail to another person from a computer located in this state or to an electronic mail address that the sender knows is held by a resident of this state if the commercial electronic mail contains information that consists of explicit sexual material that another law provides may only be viewed, purchased, rented, leased, or held by an individual who is 18 years of age or older, unless the subject line of the advertisement contains “ADV:ADLT” as the first eight characters.

Alaska Stat. § 45.50.479(a) (2016), added by 2003 Alaska Sess. Laws 14, § 2.
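Practice Note
Alaska’s rule, like the “ADV:” labeling rules in many of the statutes that follow, turns on the literal leading characters of the subject line, so compliance is mechanically checkable. A trivial sketch follows (reading the statute literally and case-sensitively; the function name is hypothetical):

```python
def alaska_adult_label_ok(subject: str) -> bool:
    """Alaska Stat. § 45.50.479(a): explicit adult material may be sent
    only if the subject line's first eight characters are "ADV:ADLT"."""
    return subject.startswith("ADV:ADLT")

assert alaska_adult_label_ok("ADV:ADLT subscription offer")
assert not alaska_adult_label_ok("Subscription offer ADV:ADLT")
```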

Arizona enacted in 2003 a provision on Commercial Electronic Mail, Ariz. Rev. Stat. §§ 44-1372–44-1372.05 (2017), added by 2003 Ariz. Sess. Laws 229, § 2, on the fraud model of prohibiting false transmission or routing information, false or misleading subject line information, or using a third party’s Internet address (spoofing) without authorization. Unsolicited commercial e-mailers, or maintainers of databases for them, must use “ADV” in the subject line of e-mails and provide a no-cost opt-out procedure. Ariz. Rev. Stat. § 44-1372.01(b). Arizona allows a private right of action for damages. Ariz. Rev. Stat. § 44-1372.02.

Arkansas enacted in 2003 its Unsolicited Commercial and Sexually Explicit Electronic Mail Prevention Act, Ark. Code Ann. §§ 4-88-601–4-88-607 (2017), added by 2003 Ark. Acts 1019, § 1, on the fraud model of prohibiting false transmission or routing information, false or misleading subject line information, or using a third party’s Internet address (spoofing) without authorization. Ark. Code Ann. § 4-88-603(c). Unsolicited commercial or sexually explicit e-mailers using Arkansas intermediaries must conspicuously provide a legal name, a correct street address, a valid Internet domain name, and a no-cost opt-out procedure, with sexually explicit mailers having more onerous requirements, including the use of “adv:adult” in the subject line of e-mails. Ark. Code Ann. § 4-88-603(a). Arkansas allows a private right of action for damages or statutory damages of $10 for individual mailings, plus attorney fees. Ark. Code Ann. § 4-88-606.

Colorado passed the Spam Reduction Act of 2008, Colo. Rev. Stat. § 6-1-702.5 (2017), added by 2008 Colo. Sess. Laws 593, § 1, which by its own terms seeks “to exercise state authority in a manner consistent with, and to the maximum extent permissible under, the federal preemption provisions of [CAN-SPAM],” Colo. Rev. Stat. § 6-1-702.5(6)(c). It prohibits violating CAN-SPAM, knowingly sending messages to those who have opted out under CAN-SPAM, failing to disclose the actual point-of-origin e-mail address, falsifying transmission or routing information, and using a third-party Internet address without consent. Colo. Rev. Stat. § 6-1-702.5(2)(a)–(d). (The point-of-origin fraud is different from the subject-line fraud.)

Connecticut enacted in 1999 An Act Prohibiting Unauthorized Use of a Computer and Other Computer Offenses, Conn. Gen. Stat. § 53-451 (2017), added by 1999 Conn. Acts 160, § 1, that made it a crime, among other acts, to “[f]alsify or forge electronic mail transmission information or other routing information in any manner in connection with the transmission of unsolicited bulk electronic mail through or into the computer network of an electronic mail service provider or its subscribers,” Conn. Gen. Stat. § 53-451(b)(7), or to sell or distribute software designed to facilitate such falsification, Conn. Gen. Stat. § 53-451(c). In 2003, it enacted An Act Concerning Consumer Computer Equipment Leases and Unsolicited Electronic Mail Advertising Material, 2003 Conn. Acts 138, that, among other things, prohibited the sending of “unsolicited advertising material” by e-mail unless it identified a toll-free number or a return e-mail address for opting out and included a subject line starting with “ADV.” Conn. Gen. Stat. § 52-570c(b) (2014), added by 2003 Conn. Acts 138, § 2. A private right of action was provided for any aggrieved person, with statutory damages of $500 per violating e-mail and costs and attorney fees. Conn. Gen. Stat. § 52-570c(d).

Delaware in 1999 created the “computer crime of unrequested or unauthorized electronic mail,” Del. Code Ann. tit. 11, §§ 937, 938 (2017), added by 72 Del. Laws 135, § 1 (1999), for

• intentionally or recklessly distributing (automatically—not “between human beings”) unsolicited bulk commercial e-mail (opt-in), Del. Code Ann. tit. 11, § 937(1);

• using a computer or a computer network with the intent to falsify routing information, Del. Code Ann. tit. 11, § 937(2); or

• distributing software for facilitating such falsification, Del. Code Ann. tit. 11, § 937(3).

Florida enacted in 2004 its Electronic Mail Communications Act, Fla. Stat. Ann. §§ 668.60–668.610 (2017), added by 2004 Fla. Laws ch. 233, § 1, amended by 2005 Fla. Laws ch. 2, § 132, 2006 Fla. Laws ch. 232, §§ 2, 3, prohibiting unsolicited commercial e-mail that

• uses a third-party Internet domain name without authorization, Fla. Stat. Ann. § 668.603(1)(a);

• contains “falsified or missing routing information or otherwise misrepresents, falsifies or obscures . . . the point of origin,” Fla. Stat. Ann. § 668.603(1)(b);

• contains false or misleading information in the subject line, Fla. Stat. Ann. § 668.603(1)(c); or

• contains “false or deceptive information in the body of the message which is designed and intended to cause damage to the receiving device” (malware), Fla. Stat. Ann. § 668.603(1)(d).

The Act also prohibits distribution of software to facilitate falsification of routing information. Fla. Stat. Ann. § 668.603(2).

Georgia enacted in 2005 its Georgia Slam Spam E-mail Act, Ga. Code Ann. §§ 16-9-100–16-9-107 (2017), added by 2005 Ga. Laws 46, § 4, which criminalizes the “initiation of deceptive commercial e-mail” based on false or misleading header information (which overlaps with “phishing”), Ga. Code Ann. § 16-9-101; see also Ga. Code Ann. § 16-9-93.1 (2017), added by 1996 Ga. Laws 1505, § 1 (misleading transmittal of name over computer network).

Idaho enacted in 2000 antispam legislation, Idaho Code § 48-603E (2017), added by 2000 Idaho Sess. Laws 423, § 1, that requires bulk e-mail advertisement to provide an e-mail address for opting out, Idaho Code § 48-603E(2), and makes it unlawful to use an unauthorized third-party address, to misrepresent point-of-origin information, to fail to identify the point of origin, or to continue to send after opt-out, Idaho Code § 48-603E(3). A private right of action is provided for actual damages or statutory damages of the greater of $100 per e-mail or $1,000. Idaho Code § 48-603E(4).

Illinois enacted in 1999 its Electronic Mail Act, 815 Ill. Comp. Stat. Ann. 511/1–511/15 (2017), added by 1999 Ill. Laws 233, §§ 1–15, amended by 2003 Ill. Laws 199, § 5, and amended its Computer Tampering provisions, 720 Ill. Comp. Stat. Ann. 5/17-51 (2017), added by 1989 Ill. Laws 762, § 1, amended by 1999 Ill. Laws 233, § 900, 2007 Ill. Laws 326, § 5 (authorized access), 2009 Ill. Laws 1000, § 600, 1551, § 17-51, 2015 Ill. Laws 775, § 20 (authorization under Revised Uniform Fiduciary Access to Digital Assets Act) (penalties for offenses arising “from the transmission of unsolicited bulk electronic mail” in the computer tampering section, as part of legislation adding provisions on child sex solicitation). The Electronic Mail Act is generally directed to the basic three frauds of unauthorized use of a third-party address, misrepresentation of the point of origin, and false or misleading information in the subject line. 815 Ill. Comp. Stat. Ann. 511/10(a). A private right of action is allowed for actual damages or statutory damages of the lesser of $10 per e-mail or $25,000 per day, with costs and attorney fees. 815 Ill. Comp. Stat. Ann. 511/10(c). In 2003, additional legislation, 2003 Ill. Laws 199, § 5,

• required that an opt-out toll-free number or an e-mail address be provided, 815 Ill. Comp. Stat. Ann. 511/10(a-5);

• prohibited transfer of e-mail addresses that opted out, 815 Ill. Comp. Stat. Ann. 511/10(a-10); and

• required the subject line to include “ADV:” as its first four characters and, for goods or services requiring the purchaser to be 18 years of age, “ADV:ADLT” as the first eight characters, 815 Ill. Comp. Stat. Ann. 511/10(a-15).

Indiana added in 2003 a chapter on Deceptive Commercial Mail, Ind. Code §§ 24-5-22-1–24-5-22-10 (2016), added by 2003 Ind. Acts 36, § 1, amended by 2004 Ind. Acts 97, § 91, 2007 Ind. Acts 2, § 317, 2012 Ind. Acts 132, § 6, that prohibits the three basic frauds of using a third-party address without authorization, misrepresenting or obscuring point of origin, or containing false or misleading information in the subject line. Ind. Code § 24-5-22-7. It also

• requires the subject line to include “ADV:” as its first four characters, Ind. Code § 24-5-22-8(1), and, for goods or services not available or deemed harmful to minors under Indiana law (including extension of credit), “ADV:ADLT” as the first eight characters, Ind. Code § 24-5-22-8(2);

• requires a no-cost opt-out, Ind. Code § 24-5-22-8(3); and

• prohibits solicitation of those who opted out, Ind. Code § 24-5-22-8(4), or transfer of e-mail addresses that opted out, Ind. Code § 24-5-22-8(5).

A private right of action is provided for actual damages or statutory damages of $500 per e-mail and costs and attorney fees. Ind. Code § 24-5-22-10.

Iowa enacted in 2005 antispam provisions, replacing a prior set at Chapter 714E, Iowa Code §§ 716A.1–716A.7 (2017), added by 2005 Iowa Acts §§ 1–8, prohibiting falsification of transmission/routing information, Iowa Code § 716A.2(1)(a), and distribution of software facilitating such falsification, Iowa Code § 716A.2(1)(a). A private right of action is allowed for actual damages or statutory
damages of the lesser of $10 per e-mail or $25,000 per day, with costs and attorney fees. Iowa Code § 716A.6.

Kansas enacted in 2002 its Commercial Electronic Mail Act, Kan. Stat. Ann. § 50-6,107 (2017), added by 2002 Kan. Sess. Laws 140, § 1, prohibiting the use of a third-party address without authorization or other misrepresentation or obscuring of the point of origin, Kan. Stat. Ann. § 50-6,107(c)(1)(A), and the inclusion of false or misleading information in the subject line, Kan. Stat. Ann. § 50-6,107(c)(1)(B). It also requires the subject line to include “ADV:” as its first four characters, Kan. Stat. Ann. § 50-6,107(c)(1)(C), and, for goods or services not available to under-eighteen-year-olds (including sexually explicit materials), “ADV:ADLT” as the first eight characters, Kan. Stat. Ann. § 50-6,107(c)(1)(E); requires a no-cost opt-out in conspicuous characters, Kan. Stat. Ann. § 50-6,107(c)(1)(D); and prohibits solicitation of those who opted out, Kan. Stat. Ann. § 50-6,107(c)(2), transfer of e-mail addresses that opted out, Kan. Stat. Ann. § 50-6,107(c)(3), and distribution of software facilitating falsification of the point of origin, Kan. Stat. Ann. § 50-6,107(c)(5). A private right of action is allowed under the consumer protection law. Kan. Stat. Ann. § 50-6,107(i).

Louisiana enacted in 2003 antispam provisions, La. Rev. Stat. Ann. §§ 51:2001–51:2004 (2017), renumbered from La. Rev. Stat. Ann. §§ 51:1741–51:1741.3, added by 2003 La. Acts 1275, § 1, prohibiting the three frauds of unauthorized use of a third-party address, falsified or intentionally obscured header information, and misleading information in the subject line about the contents of the message. La. Rev. Stat. Ann. § 51:2003(A)(1)–(3). It also requires

• the maintenance of a functional return e-mail address and website for opting out, La. Rev. Stat. Ann. § 51:2002(1)–(2);

• clear and conspicuous disclosure of the opt-out procedure, La. Rev. Stat. Ann. § 51:2002(3); and

• the subject line to include “ADV:” as its first four characters, La. Rev. Stat. Ann. § 51:2002(4), and, for “obscene materials,” “ADV:ADLT” as the first eight characters, La. Rev. Stat. Ann. § 51:2002(5).

A private right of action is allowed for actual damages or statutory damages of the lesser of $10 per e-mail or $25,000 per day, with costs and attorney fees. La. Rev. Stat. Ann. § 51:2004.

Maine adopted in 2003 a provision restricting e-mail solicitation, Me. Rev. Stat. Ann. tit. 10, § 1497 (2017), that prohibits using third-party addresses, Me. Rev. Stat. Ann. tit. 10, § 1497(5)(A), falsifying transmission/routing information, Me. Rev. Stat. Ann. tit. 10, § 1497(5)(B), and continuing to solicit those who opt out, Me. Rev. Stat. Ann. tit. 10, § 1497(4). It also requires

• maintenance of a valid e-mail address, Me. Rev. Stat. Ann. tit. 10, § 1497(2);

• a statement of the name and e-mail address of the originator, Me. Rev. Stat. Ann. tit. 10, § 1497(3)(B)–(C);
• a statement that opting out is available through the e-mail address, Me. Rev. Stat. Ann. tit. 10, § 1497(3)(D); and

• the subject line to include “ADV:” as its first four characters, Me. Rev. Stat. Ann. tit. 10, § 1497(3)(A)(1), and, for material to be viewed only by those over eighteen years old, “ADV:ADLT” as the first eight characters, Me. Rev. Stat. Ann. tit. 10, § 1497(3)(A)(2).

A private right of action is available for actual damages or $250 per violation, with costs and attorney fees. Me. Rev. Stat. Ann. tit. 10, § 1497(7).

Maryland enacted in 2002 the Maryland Commercial Electronic Mail Act, Md. Code Ann., [Com. Law] §§ 14-3001–14-3003 (2017), renumbered from Md. Code Ann., [Com. Law] §§ 14-2901–14-2903, added by 2002 Md. Laws 323, § 1, 324, § 1, prohibiting in the use of commercial e-mail three frauds: using a third-party address without permission, containing false or misleading information about origin or transmission path, or containing false or misleading information in the subject line. Md. Code Ann., [Com. Law] § 14-3002(b)(2)(i)–(iii). A private right of action is provided, with minimum damages of $500 and costs and attorney fees. Md. Code Ann., [Com. Law] § 14-3003.

Michigan enacted in 2003 its Unsolicited Commercial E-Mail Protection Act, Mich. Comp. Laws §§ 445.2501–445.2508 (2017), added by 2003 Mich. Pub. Acts 42, §§ 1–8, prohibiting in unsolicited commercial e-mail the frauds of

• using a third-party address without consent or misrepresenting origin or transmission path information, Mich. Comp. Laws § 445.2504(1)(a)–(b);

• failing to include “information necessary to identify the point of origin,” Mich. Comp. Laws § 445.2504(1)(c); and

• providing software facilitating falsification of the point of origin, Mich. Comp. Laws §§ 445.2504(1)(d), 445.2505;

as well as prohibiting sending to those who have opted out, Mich. Comp. Laws § 445.2504(2). The act also requires the e-mail to include in its subject line “ADV:” as its first four characters, Mich. Comp. Laws § 445.2503(a), and to conspicuously state the sender’s legal name, correct street address, valid Internet domain name, and valid return e-mail address, Mich. Comp. Laws § 445.2503(b)(i)–(iv), and the availability of a no-cost opt-out, Mich. Comp. Laws § 445.2503(d), a procedure for which the sender must supply, Mich. Comp. Laws § 445.2503(c). A private right of action is allowed for actual damages or statutory damages of the lesser of $500 per e-mail or $250,000 per day, with costs and attorney fees. Mich. Comp. Laws § 445.2508.

Minnesota enacted in 2002 its law on False or Misleading Commercial Electronic Mail Messages, Minn. Stat. § 325F.694 (2017), added by 2002 Minn. Laws 395, § 1, that prohibits in commercial e-mails the frauds of using a third-party address without permission or containing false or misleading information in the subject line. Minn. Stat. § 325F.694, subd. 2. The subject line must contain “ADV” as its first characters and, if the message is sexual in nature and to be viewed only by those eighteen years or older,
“ADV-ADULT.” Minn. Stat. § 325F.694, subd. 3. The sender is required to provide an “easy-to-use” opt-out procedure and to provide notice of it in the e-mail. Minn. Stat. § 325F.694, subd. 4. A private right of action is allowed for actual damages or statutory damages of the lesser of $10 per e-mail or $25,000 per day for failure to include the “ADV” notice, and the lesser of $25 per e-mail or $35,000 per day for fraud, with costs and attorney fees. Minn. Stat. § 325F.694, subd. 7.

Missouri enacted in 2000 an antispam law, Mo. Rev. Stat. §§ 407.1120–407.1132 (2017), added by 2000 Mo. Laws 763, requiring unsolicited advertising e-mail to provide a toll-free number or a valid sender-operated return e-mail address for opting out, Mo. Rev. Stat. § 407.1123(1). Minimum damages of $500 are allowed for recipients of e-mails in violation. Mo. Rev. Stat. § 407.1129(1). In 2003, Missouri added antispam provisions, Mo. Rev. Stat. §§ 407.1135–407.1141 (2017), added by 2003 Mo. Laws 228, prohibiting in commercial e-mails the use of a false identity or false or misleading information in the subject line, Mo. Rev. Stat. § 407.1138(1); failing to include “ADV:” as the first characters of the subject line, Mo. Rev. Stat. § 407.1138(2), or “ADV:ADLT” if there is “obscene” material, Mo. Rev. Stat. § 407.1138(3); and sending messages to those who have opted out, Mo. Rev. Stat. § 407.1138(4).

Nevada in 1997 was the first state to pass an antispam law, Nev. Rev. Stat. Ann. §§ 41.705–41.735 (2017), added by 1997 Nev. Stat. 341, §§ 2–8, amended by 2003 Nev. Stat. 12, § 1 (see also Nev. Rev. Stat. Ann. § 205.492 (2017), added by 1999 Nev. Stat. 530, § 20, amended by 2001 Nev. Stat. 274, § 7), allowing persons to recover actual damages or $50 per violating e-mail and costs and attorney fees, Nev. Rev. Stat. Ann. § 41.730(2), for e-mail including any advertisement that fails to meet the requirements of being “readily identifiable as promotional” and “clearly and conspicuously” providing the legal name, complete street address, and e-mail address of the sender and a notice for opting out of further e-mails, Nev. Rev. Stat. Ann. § 41.730(1)(c)(1)–(2). In 2003, the requirement was added to include “ADV” or “advertisement” as the first word of the subject line, Nev. Rev. Stat. Ann. § 41.730(1)(c)(3), amended by 2003 Nev. Stat. 12, § 1, and increased damages of $500 per violating e-mail were provided under circumstances involving deceit.

New Mexico enacted in 2003 antispam provisions relating to unsolicited facsimiles and e-mails, N.M. Stat. Ann. §§ 57-12-23–57-12-24 (2017), added by 2003 N.M. Laws 168, §§ 2, 3, requiring the sender of unsolicited e-mail advertisements to set up a toll-free number or a return e-mail address for opting out, to provide conspicuous notice of the opt-out procedure, and to include “ADV:” as the first four characters of the subject line and, if relating to adult goods or services, “ADV:ADLT” as the first eight characters. N.M. Stat. Ann. § 57-12-23. A private right of action is provided for actual damages or statutory damages of the greater of $25 per violating e-mail or $5,000 per day plus costs and attorney fees. N.M. Stat. Ann. § 57-12-24.

North Carolina enacted in 1999 a number of provisions relating to “computer trespass” and “unsolicited bulk commercial electronic mail,” N.C. Gen. Stat. §§ 1-75.4(4)(c), 1-539.2A, 14-453, and 14-458 (2017), added by 1999 N.C. Sess. Laws 212, §§ 1–4, amended by 2000 N.C. Sess. Laws 125, §§ 3, 7, 2002 N.C. Sess. Laws
157, § 1, 2009 N.C. Sess. Laws 551, § 2, providing a private right of action, N.C. Gen. Stat. § 1-539.2A (actual damages or the lesser of $10 per violating e-mail or $25,000 per day plus costs and attorney fees), against a sender “[f]alsely identify[ing] with the intent to deceive or defraud the recipient or forg[ing] commercial electronic mail transmission information or other routing information in any manner in connection with the transmission of unsolicited bulk commercial electronic mail through or into the computer network of an electronic mail service provider or its subscribers,” provided that this was in violation of the authority granted by or policies of the service provider. N.C. Gen. Stat. § 14-458(a) (other offenses include malicious computer code).

North Dakota enacted in 2003 antispam provisions, N.D. Cent. Code §§ 51-27-01–51-27-10 (2017), added by 2003 N.D. Laws 439, § 1, amended and reenacted by 2007 N.D. Laws 436, § 2, prohibiting commercial e-mail from using a third-party address without permission or otherwise misrepresenting or obscuring origin and/or transmission path information or containing false or misleading information in the subject line, N.D. Cent. Code § 51-27-03(1). It also requires the inclusion of “ADV:” as the first characters of the subject line and, if consisting of material of a “sexual nature” to be viewed only by a person eighteen years old or older, “ADV:ADLT” as the first characters. N.D. Cent. Code § 51-27-04(1). It also requires the transmitter of commercial e-mail to set up a toll-free number or a return e-mail address “or another easy-to-use electronic method” for opting out and to provide notice of the opt-out procedure. N.D. Cent. Code § 51-27-05. Minimum damages of $500 are allowed to recipients of violating e-mails. N.D. Cent. Code § 51-27-06(1).

Ohio enacted in 2002 a provision to regulate transmission of e-mail advertisements, Ohio Rev. Code Ann. § 2307.64 (2017), added by 2001 Ohio Laws 8, § 1, amended by 2003 Ohio Laws 204, § 1, 2003 Ohio Laws 361, § 1 (deletion of a reference), 2011 Ohio Laws 360, § 1, requiring that the transmitter include the person’s name, “complete residence or business address” and e-mail address, and a notice of a no-cost opt-out procedure, Ohio Rev. Code Ann. § 2307.64(B), with a recipient’s right of action for $100 per violation up to $50,000 plus costs and attorney fees, Ohio Rev. Code Ann. § 2307.64(E). In 2003, Ohio enacted Section 2913.421 of its revised code to prohibit a person from transmitting multiple commercial electronic mail messages, falsifying routing information in those messages, falsifying registration information for multiple electronic mail accounts, or falsifying the right to use five or more Internet protocol addresses, and to prohibit unauthorized access to a computer to transmit multiple commercial electronic mail messages. Ohio Rev. Code Ann. § 2913.421 (2017), added by 2003 Ohio Laws 383, § 1, amended by 2005 Ohio Laws 241, § 1 (changing reference), 2010 Ohio Laws 86, § 1.

Oklahoma includes in its consumer protection statute a provision on fraudulent electronic mail that was updated in 2006, Okla. Stat. tit. 15, §§ 776.1–776.7 (2017), amended by 2006 Okla. Sess. Laws 56, §§ 1, 2, along with the enactment of a new anti-“phishing” act, Okla. Stat. tit. 15, §§ 776.8–776.11 (2017), added by 2006 Okla. Sess. Laws 56, §§ 4–7.
The violations are the standard ones: falsifying transmission and/or routing information; containing false or misleading information in the subject line; using a third-party address without permission; failing to use “ADV:” as the first
characters in the subject line or “ADV:ADLT” for sexually explicit material; and failing to provide for and notify of a no-cost opt-out. Okla. Stat. tit. 15, § 776.6. Recipients of violative e-mails may recover $10 per e-mail up to $25,000 plus costs and attorney fees. Okla. Stat. tit. 15, § 776.7.

Pennsylvania enacted in 2002 its Unsolicited Telecommunication Act, 73 Pa. Cons. Stat. §§ 2250.1–2250.8 (2017), added by 2002 Pa. Laws 222, §§ 1–8, prohibiting a commercial e-mailer from using a third-party address without permission or otherwise misrepresenting or obscuring origin and/or transmission path information; falsifying or forging routing information or including false or misleading information in the subject line, 73 Pa. Cons. Stat. § 2250.3(a)(1)–(3); and failing to operate a valid e-mail return address or toll-free telephone number for opting out, 73 Pa. Cons. Stat. § 2250.3(a)(4).

Practice Note
At roughly the same time, 18 Pa. Cons. Stat. § 7661 (2017), added by 2002 Pa. Laws 226, § 3, was added to criminalize transmission of unsolicited mail with falsified transmission and/or routing information or distributing software to facilitate such falsification. In 2000, the Pennsylvania legislature had dealt with sexually explicit materials by requiring “ADV:ADLT” in e-mails containing such materials. 18 Pa. Cons. Stat. § 5903(a.1) (2017), added by 2000 Pa. Laws 25, § 1.
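
The recurring “lesser of” damages cap in these statutes (Iowa, Louisiana, North Carolina, and Texas, among others) reduces to a one-line formula: statutory damages are the smaller of a per-e-mail total and a per-day total. The following Python sketch is purely illustrative; the $10 and $25,000 figures are parameters reflecting the most common pattern, not the terms of any particular statute.

# A minimal sketch of the "lesser of" statutory-damages cap. The dollar
# figures are parameters, not a statement of any one statute's terms.
def statutory_damages(num_emails, num_days, per_email=10, per_day=25_000):
    """Return the lesser of the per-e-mail total and the per-day total."""
    return min(num_emails * per_email, num_days * per_day)

# Example: 1,000,000 violating e-mails over two days. The per-e-mail total
# would be $10,000,000, but the per-day figure caps recovery at $50,000.
print(statutory_damages(num_emails=1_000_000, num_days=2))  # 50000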

Rhode Island enacted in 1999 a provision on Unsolicited Electronic Mail, R.I.G.L. § 6-47-2 (2017), added by 1999 R.I. Pub. Laws 427, § 1, requiring transmitters to maintain and notify of a toll-free telephone number or a valid sender-operated return e-mail address for opting out and honoring any opt-out, R.I.G.L. § 6-47-2(a)–(c), and prohibiting using a third-party address without permission or otherwise misrepresenting origin and/or transmission path information, R.I.G.L. § 6-47-2(d). Recipients of violating e-mails may recover $100 per violation plus costs and attorney fees. R.I.G.L. § 6-47-2(h). Rhode Island enacted in 2006 additional provisions in its Electronic Mail Fraud Regulatory Act, R.I.G.L. §§ 6-49-1–6-49-6 (2017), added by 2006 R.I. Pub. Laws 628, § 1, amended by 2014 R.I. Pub. Laws 528, § 36 (also R.I.G.L. §§ 11-52.1-1–11-52.1-5 (2017), added by 2006 R.I. Pub. Laws 558, § 1, the “Internet Misrepresentation of Business Affiliation Act”). Consumers may recover actual damages or a minimum of $500, which may be trebled, plus costs and attorney fees. R.I.G.L. § 6-49-5.

South Dakota added in 2002 to its consumer protection law, as a deceptive act or practice, sending an unsolicited commercial e-mail without “ADV:” as the first characters in its subject line or “ADV:ADLT” if it contained explicit sexual material. S.D. Codified Laws § 37-24-6(13) (2016), added by 2002 S.D. Laws 185, § 1. In 2007, the legislature enacted An Act to Restrict Unsolicited Commercial Electronic Mail Advertisements, S.D. Codified Laws §§ 37-24-41–37-24-48 (2016), added by 2007 S.D. Laws 226, §§ 1–8, that refers to the earlier provision for “ADV:” notice, S.D. Codified Laws § 37-24-42; prohibits phishing conduct, S.D. Codified Laws §§ 37-24-44, 37-24-45, and the now-standard frauds of using a third-party address without permission, falsified, misrepresented, or forged header information, or misleading
information in the subject line, S.D. Codified Laws § 37-24-47; and allows recipients actual damages or liquidated damages of $1,000 per e-mail plus costs and attorney fees, S.D. Codified Laws § 37-24-48.

Tennessee enacted in 1999 antispam legislation, Tenn. Code Ann. §§ 47-18-2501, 47-18-2502 (2017), added by 1999 Tenn. Pub. Acts 475, §§ 2, 3, amended by 2003 Tenn. Pub. Acts 15, §§ 2–7 (also Tenn. Code Ann. § 39-14-603 (2017), added by 2003 Tenn. Pub. Acts 317, §§ 4, 8 (“unsolicited bulk electronic mail” provision of the “Tennessee Personal and Commercial Computer Act of 2003”)), requiring senders of unsolicited electronic advertising to maintain and notify of a toll-free telephone number or a valid sender-operated return e-mail address for opting out and honoring any opt-out, Tenn. Code Ann. § 47-18-2501(a), (c), (d), and to include “ADV:” as the first characters of the subject line or “ADV:ADLT” for adult materials, Tenn. Code Ann. § 47-18-2501(e). Distribution of software to facilitate falsification of transmission and/or routing information is prohibited. Recipients of violative e-mails may recover $10 per e-mail up to $5,000 per day plus costs and attorney fees. Tenn. Code Ann. § 47-18-2501(i).

Texas enacted in 2003 a chapter on Electronic Mail Solicitation, Tex. Bus. & Com. Code Ann. §§ 46.001–46.011, added by 2003 Tex. Gen. Laws 1053, § 1, repealed by 2007 Tex. Gen. Laws 885, § 2.47(1), that was recodified without substantive difference in 2007, effective April 1, 2009, Tex. Bus. & Com. Code Ann. §§ 321.001–321.114 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01. The sender is required to include “ADV:” as the first characters of the subject line, or “ADV:ADULT ADVERTISEMENT” for “obscene material or material depicting sexual conduct,” and to provide a functioning return e-mail address for no-cost opt-out. Tex. Bus. & Com. Code Ann. § 321.052. Prohibited are the three frauds of falsifying transmission/routing information, including false, deceptive, or misleading information in the subject line, and using a third-party address without consent. Tex. Bus. & Com. Code Ann. § 321.051. A recipient may recover actual damages or statutory damages of the lesser of $10 per e-mail or $25,000 per day with costs and attorney fees. Tex. Bus. & Com. Code Ann. §§ 321.104, 321.105.

Virginia amended in 1999 the Virginia Computer Crimes Act, Va. Code Ann. §§ 8.01-328.1, 18.2-152.2, 18.2-152.4, 18.2-152.12 (2017), added by 1999 Va. Acts 886, § 1, including under computer trespass the transmission of unsolicited bulk e-mail with falsified transmission or routing information, former Va. Code Ann. § 18.2-152.4(7), deleted by 2003 Va. Acts ch. 987, § 1, ch. 1016, § 1, replaced with Va. Code Ann. § 18.2-152.3:1 (Transmission of Unsolicited Bulk Electronic Mail). This provision was moved, and others, notably increased penalties, were added, in the 2003 amendments setting forth a new section on Transmission of Unsolicited Bulk Electronic Mail. Va. Code Ann. § 18.2-152.3:1 (2017), added by 2003 Va. Acts ch. 987, § 1, ch. 1016, § 1, rewritten by 2010 Va. Acts ch. 489, § 1.

Washington enacted in 1998 its Chapter 19.190 on Commercial Electronic Mail, Wash. Rev. Code §§ 19.190.005–19.190.050 (2017), added by 1998 Wash. Laws 149, §§ 1–8 (sunset Dec. 31, 1998), amended by 1999 Wash. Laws 289, §§ 1–4 (repealing findings, i.e., Section
19.190.005), restricting either sending an e-mail message from Washington or sending a message to a Washington resident that uses a third party’s Internet domain name without the third party’s permission, misrepresents any information regarding the point of origin or the transmission path of the e-mail message, or contains false or misleading information in the subject line, Wash. Rev. Code § 19.190.020(1). Damages to an e-mail recipient are either $500 or actual damages, whichever is greater. Wash. Rev. Code § 19.190.040(1).

West Virginia enacted in 1999 its Electronic Mail Protection Act, W. Va. Code §§ 46A-6G-1–46A-6G-5 (2017), added by 1999 W. Va. Acts 119, which prohibits in the transmission of unauthorized e-mail the three frauds of using a third-party address without permission or otherwise misrepresenting origin and/or transmission information or including false or misleading information in the subject line, W. Va. Code § 46A-6G-2(1)–(2). The act also prohibits such transmissions not providing clearly “the date and time the message is sent, the identity of the person sending the message, and the return electronic mail address of that person,” W. Va. Code § 46A-6G-2(3), and such transmissions if the message contains “sexually explicit materials,” W. Va. Code § 46A-6G-2(4). Recipients may recover actual damages or minimum statutory damages of $1,000, plus possible punitive damages for willful failure to stop transmitting, plus costs and attorney fees. W. Va. Code § 46A-6G-5(b).

Wisconsin in 2001 made it a misdemeanor to send an unsolicited e-mail solicitation that contains obscene material or a depiction of sexually explicit conduct without including the words “ADULT ADVERTISEMENT” in the subject line. Wis. Stat. § 944.25 (2017), added by 2001 Wis. Laws 16. (Wisconsin also prohibits the use of e-mail “or other computerized communication system” to harass. Wis. Stat. § 947.0125 (2017), added by 1995 Wis. Laws 353.)

Wyoming enacted in 2003 an article on Commercial Electronic Mail, Wyo. Stat. §§ 40-12-401–40-12-404 (2017), added by 2003 Wyo. Sess. Laws 86, § 1, which prohibits in the transmission of unauthorized e-mail the three frauds of using a third-party address without permission or otherwise misrepresenting origin and/or transmission information or including false or misleading information in the subject line, Wyo. Stat. § 40-12-402(a). No private right of action is provided.

As seen, there was an early and extensive state legislative response to the perceived threat of unsolicited commercial e-mail, particularly relating to adult content. However, relatively little litigation has ensued, likely because of the limited, but chilling, effect of CAN-SPAM. Spam filters and other technology—but, more likely, more effective means of targeting advertisement and the ineffectiveness of spam—seem to have stemmed the threat of spam.
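
Because so many of the statutes above turn on the same subject-line labeling convention (“ADV:” as the first four characters, “ADV:ADLT” as the first eight for adult material), the mechanics can be captured in a few lines of code. The Python sketch below is hypothetical; actual statutes vary in their labels (Minnesota’s “ADV-ADULT” and Texas’s “ADV:ADULT ADVERTISEMENT,” for example), so the label strings are parameters rather than a statement of any one statute’s requirement.

# A hypothetical compliance check for the recurring "ADV:"/"ADV:ADLT"
# subject-line labeling convention; the label strings are assumptions.
def subject_line_complies(subject, is_adult,
                          label="ADV:", adult_label="ADV:ADLT"):
    """Check that the subject line begins with the required label."""
    return subject.startswith(adult_label if is_adult else label)

assert subject_line_complies("ADV: spring widget sale", is_adult=False)
assert not subject_line_complies("Spring widget sale", is_adult=False)
assert subject_line_complies("ADV:ADLT offer", is_adult=True)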

(b) State Antispyware and Antiphishing Statutes

Other, more intrusive and more destructive devices than spam have been deployed maliciously on the Internet. They have various degrees of technological content, such as the use of “spyware” that is surreptitiously deposited into Internet user devices to report back on the behavior or confidential information (including access information)
of the users, or other “malware” that may change settings in the user device to report information or to act as a “zombie” to conduct attacks on others. “Phishing” is a “bait-and-switch” device overlapping the “spoofing” sometimes used in spam, where, for example, from Utah’s definitions,

(a) the person makes a communication under false pretenses purporting to be by or on behalf of a legitimate business, without the authority or approval of the legitimate business; and
(b) the person uses the communication to induce, request, or solicit another person to provide identifying information or property.

Utah Code Ann. § 13-40-201(1) (2017), added by 2010 Utah Laws 200, § 4. “Pharming” is a term used to describe one of two devices, whereby the perpetrator

(a) creates or operates a webpage that represents itself as belonging to or being associated with a legitimate business, without the authority or approval of the legitimate business, if that webpage may induce any user of the Internet to provide identifying information or property; or
(b) alters a setting on a user’s computer or similar device or software program through which the user may search the Internet, causing any user of the Internet to view a communication that represents itself as belonging to or being associated with a legitimate business, if the message has been created or is operated without the authority or approval of the legitimate business and induces, requests, or solicits any user of the Internet to provide identifying information or property.

Utah Code Ann. § 13-40-201(2). Each of these devices may involve the use of malware.

California has statutes addressing each of “spyware” and “phishing.” In 2004, the California legislature enacted the Consumer Protection Against Computer Spyware Act, Cal. [Bus. & Prof.] Code §§ 22947–22947.6 (2017), added by 2004 Cal. Stat. 843, § 2, to address the proliferation of code used to return information from a user’s browser, prohibiting “a person or entity that is not an authorized user” to cause computer software to be copied onto the computer of a consumer in this state and use the software to do any of the following:

(a) Modify, through intentionally deceptive means, any of the following settings related to the computer’s access to, or use of, the Internet:
(1) The page that appears when an authorized user launches an Internet browser or similar software program used to access and navigate the Internet.
(2) The default provider or Web proxy the authorized user uses to access or search the Internet.
(3) The authorized user’s list of bookmarks used to access Web pages.
(b) Collect, through intentionally deceptive means, personally identifiable information that meets any of the following criteria:
(1) It is collected through the use of a keystroke-logging function that records all keystrokes made by an authorized user who uses the computer and transfers that information from the computer to another person.
(2) It includes all or substantially all of the Web sites visited by an authorized user, other than Web sites of the provider of the software, if the computer software was installed in a manner designed to conceal from all authorized users of the computer the fact that the software is being installed.
(3) It is a data element . . . that is extracted from the consumer’s computer hard drive for a purpose wholly unrelated to any of the purposes of the software or service described to an authorized user.
(c) Prevent, without the authorization of an authorized user, through intentionally deceptive means, an authorized user’s reasonable efforts to block the installation of, or to disable, software, by causing software that the authorized user has properly removed or disabled to automatically reinstall or reactivate on the computer without the authorization of an authorized user.
(d) Intentionally misrepresent that software will be uninstalled or disabled by an authorized user’s action, with knowledge that the software will not be so uninstalled or disabled.
(e) Through intentionally deceptive means, remove, disable, or render inoperative security, antispyware, or antivirus software installed on the computer.

Cal. [Bus. & Prof.] Code § 22947.2. The Spyware Act also prohibits acts of taking control of or modifying settings of the computer. Cal. [Bus. & Prof.] Code § 22947.3. It also prohibits inducing installation of unnecessary software. Cal. [Bus. & Prof.] Code § 22947.4.
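
The settings the California act protects (start page, default search provider or proxy, bookmarks) are concrete configuration values, so the kind of modification the statute targets is technically detectable. The following is a minimal sketch, not drawn from any statute or browser API: it fingerprints a hypothetical settings dictionary so that a later, unauthorized change can be flagged.

# A minimal sketch of detecting the kind of settings modification the
# statute targets. The setting names are hypothetical, not a browser API.
import hashlib
import json

def settings_fingerprint(settings):
    """Hash a settings dict so that any later change is detectable."""
    canonical = json.dumps(settings, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

baseline = {
    "home_page": "https://example.org/start",
    "search_provider": "https://search.example.org",
    "proxy": None,
    "bookmarks": ["https://example.org/news"],
}
baseline_hash = settings_fingerprint(baseline)

# Later: compare the current settings against the recorded baseline.
current = dict(baseline, home_page="http://hijacked.example.net")
if settings_fingerprint(current) != baseline_hash:
    print("settings changed since baseline; review for unauthorized modification")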

The California Anti-Phishing Act of 2005, Cal. [Bus. & Prof.] Code §§ 22948–22948.3 (2017), added by 2005 Cal. Stat. 437, § 1, addresses the problem of Internet scams to elicit personal information for fraudulent use:

It shall be unlawful for any person, by means of a Web page, electronic mail message, or otherwise through use of the Internet, to solicit, request, or take any action to induce another person to provide identifying information by representing itself to be a business without the authority or approval of the business.

Cal. [Bus. & Prof.] Code § 22948.2. A private right of action is provided. Cal. [Bus. & Prof.] Code § 22948.3 (2014), added by 2005 Cal. Stat. 437, § 1.

Following is a survey of other state legislation addressed specifically to spyware or phishing.

Alabama added phishing as a crime in 2013. Ala. Code § 13A-8-114 (2015), added by 2012 Ala. Acts 432, § 5.

Alaska enacted in 2005, along with prohibitions against online enticement of minors and distribution to minors of indecent materials (Alaska Stat. § 11.41.452 (2016), added by 2005 Alaska Sess. Laws 97, § 1, amended by 2011 Alaska Sess. Laws 20, § 8 (upgrading to felony if sex offender); Alaska Stat. § 11.61.128 (2016), added by 2005 Alaska Sess. Laws 97, § 2, amended by 2007 Alaska Sess. Laws 24, § 6, 2010 Alaska Sess. Laws 18, § 9), an antispyware provision, including code causing “popup” advertisements. Alaska Stat. §§ 45.45.792–45.45.798 (2016), added by 2005 Alaska Sess. Laws 97, § 3.

Arizona adopted in 2005 a statute relating to spyware. Ariz. Rev. Stat. §§ 44-7301–44-7304 (2015), added by 2005 Ariz. Sess. Laws 136, § 1, renumbered Ariz. Rev. Stat. §§ 18-501–18-504 (2015) by 2016 Ariz. Sess. Laws 80, § 3E.

A. It is unlawful for any person who is not an owner or operator of a computer to transmit computer software to a computer, with actual knowledge or with conscious avoidance of actual knowledge, and to use the software to do any of the following:
1. Modify, through intentionally deceptive means, settings that control any of the following:
(a) The page that appears when an owner or operator of a computer launches an internet browser or similar computer software used to access and navigate the internet.
(b) The default provider or web proxy that an owner or operator of a computer uses to access or search the internet.
(c) An owner or operator’s list of bookmarks used to access web pages.
2. Collect, through intentionally deceptive means, personally identifiable information:
(a) Through the use of a keystroke logging function that records all keystrokes made by an authorized user who uses the computer and transfers that information from the computer to another person.
(b) In a manner that correlates the information with data respecting all or substantially all of the web sites visited by an owner or operator of the computer, other than web sites operated by the person collecting the information.
(c) With respect only to information described in section 44-7301, paragraph 9, by extracting such information from the hard drive of an owner or operator’s computer.
3. Prevent, through intentionally deceptive means, an owner or operator’s reasonable efforts to block the installation or execution of, or to disable, computer software by causing software that an owner or operator of the computer has properly removed or disabled automatically to reinstall or reactivate on the computer.
4. Intentionally misrepresent that computer software will be uninstalled or disabled by an owner or operator’s action.
5. Through intentionally deceptive means, remove, disable or render inoperative security, antispyware or antivirus computer software installed on the computer.
6. Take control of the computer by:
(a) Accessing or using the modem or internet service for the computer for the purpose of causing damage to the computer or causing an owner or operator to incur financial charges for a service that the owner or operator of the computer has not authorized.
(b) Opening multiple, sequential, stand alone advertisements in an owner or operator’s internet browser without the authorization of an owner or operator and that a reasonable computer user cannot close without turning off the computer or closing the internet browser.
7. Modify any of the following settings related to the computer’s access to, or use of, the internet:
(a) Settings that protect information about an owner or operator of the computer for the purpose of stealing personally identifiable information of the owner or operator.
(b) Security settings for the purpose of causing damage to a computer.
8. Prevent an owner or operator’s reasonable efforts to block the installation of, or to disable, computer software, by doing either of the following:
(a) Presenting the owner or operator with an option to decline installation of computer software with knowledge that, when the option is selected, the installation nevertheless proceeds.
(b) Falsely representing that computer software has been disabled.
B. It is unlawful for any person who is not an owner or operator of a computer to do either of the following with regard to the computer:
1. Induce an owner or operator to install a computer software component on the computer by intentionally misrepresenting the extent to which installing the software is necessary for security or privacy reasons or in order to open, view or play a particular type of content.
2. Deceptively cause the execution on the computer of a computer software component with the intent of causing an owner or operator to use the component in a manner that violates any other provision of this section.

Ariz. Rev. Stat. § 18-502. Computer software providers and website and trademark owners, but not users, are granted a right of action. Ariz. Rev. Stat. § 18-504.

Arkansas enacted in 2005 its Consumer Protection Against Computer Spyware Act, Ark. Code Ann. §§ 4-111-101–4-111-105 (2017), added by 2005 Ark. Acts 2255, § 1, prohibiting similar malicious code actions as Arizona.

Connecticut enacted in 2006 An Act Concerning Email Message Phishing, codified at Conn. Gen. Stat. § 53-454 (2017), that provides a private right of action by an aggrieved person against one who misrepresented its identity as an “Internet business” to obtain personal information. 2006 Conn. Acts 50, § 1.

Florida enacted in 2006 an Antiphishing Act, codified at Fla. Stat. chs. 668.701–668.705 (2017), which provides a right of action to Internet service providers, financial institutions, and owners of webpages or marks who are adversely affected by misrepresentation as to the source of e-mails or webpages or by a misleading Internet domain name. 2006 Fla. Laws ch. 232, § 1.

Georgia enacted in 2005 the Georgia Computer Security Act of 2005, Ga. Code Ann. § 16-9-152 (2017), added by 2005 Ga. Laws 127, § 1, amended by 2007 Ga. Laws 103, § 16 (rewording), which includes spyware prohibitions similar to those of Arizona and Arkansas.

Illinois adopted in 2007 protection against certain spyware relative to encryption (Illinois prohibits “spyware” through its inclusion among criminal “unlawful use[s] of encryption” as a “computer contaminant,” 720 Ill. Comp. Stat. 5/17-52.5 (2017), added by 2007 Ill. Laws 942, § 5, amended by 2009 Ill. Laws 1551, § 5.5 (renumbering and editing)), as well as its Anti-Phishing Act, 740 Ill. Comp. Stat. 7/1–7/15 (2015), added by 2007 Ill. Laws 350, §§ 1–15:

It is unlawful for any person, by means of a Web page, electronic mail message, or otherwise through use of the Internet, to solicit, request, or take any action to induce another person to provide identifying information by representing himself, herself, or itself to be a business without the authority or approval of the business.

740 Ill. Comp. Stat. 7/10. A private right of action is provided for users. 740 Ill. Comp. Stat. 7/15.

Indiana added in 2005 a Prohibited Spyware article, Ind. Code Ann. §§ 24-4.8-1-1–24-4.8-3-2 (2016), added by 2005 Ind. Acts 115, § 1, to its Code, prohibiting use of intentionally deceptive means to modify computer settings to obtain certain information, Ind. Code Ann. § 24-4.8-2-2, and inducement of a computer owner or operator to install software by misrepresenting the necessity of the software for computer security, privacy, or viewing, Ind. Code Ann. § 24-4.8-2-3. Users do not have a right of action. Ind. Code Ann. § 24-4.8-3-1.

Iowa enacted in 2005 An Act Relating to the Transmission, Installation and Use of Computer Software Through Deceptive or Unauthorized Means and Providing for Penalties, Iowa Code § 715.4 (2017) (emphasis added), added by 2005 Iowa Acts 94, § 4, amended by 2013 Iowa Acts 90, §§ 191 and 192 (changing “web” and “web page” to “internet” and “internet site”), prohibiting similar devices as the antispyware statutes of Arizona, Arkansas, and Georgia. No private civil remedies were provided.

Kentucky made phishing a crime in 2009. Ky. Rev. Stat. Ann. § 434.697 (2014), added by 2009 Ky. Acts 100, § 8.

Louisiana enacted in 2006 spyware prohibitions, La. Rev. Stat. Ann. §§ 51:2006–51:2014 (2017), added by 2006 La. Acts 392, § 1, similar to those of Arizona, Arkansas, and Georgia. No user actions are provided. Louisiana enacted two anti-“phishing” statutes in 2006. La. Rev. Stat. Ann. §§ 51:2031–51:2034 (2017), added by 2006 La. Acts 201, § 1 (Anti-Phishing Act of 2006); La. Rev. Stat. Ann. §§ 51:2021–51:2025 (2017), added by 2006 La. Acts 549, § 1 (Louisiana Anti-Phishing Act, including prohibitions on fraudulent web pages and fraudulent electronic mail). The first prohibits unlawful requests by misrepresentation, La. Rev. Stat. Ann. § 51:2033, with limited consumer rights of action, La. Rev. Stat. Ann. § 51:2034. The second act addresses more specifically the interests of the legitimate business in webpages, La. Rev. Stat. Ann. § 51:2022 (creation of webpages for fraudulent purposes), and fraudulent e-mails, La. Rev. Stat. Ann. § 51:2023 (e-mails spoofing legitimate business), and allows actions only by the attorney general, a trademark holder, or an ISP, La. Rev. Stat. Ann. § 51:2024(A).
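
The core pattern these antiphishing statutes describe, a communication “representing itself to be a business” it is not, has a simple technical signature: the business name claimed in a message’s From display name does not match the actual sender domain. The Python sketch below is a toy heuristic under that assumption; the business-to-domain table is hypothetical, and production systems rely instead on authentication protocols such as SPF, DKIM, and DMARC together with much richer signals.

# A toy heuristic for the misrepresentation these statutes target: flag
# mail whose claimed business does not match its sender domain. The
# business-to-domain table is a hypothetical example.
KNOWN_BUSINESS_DOMAINS = {
    "Example Bank": {"examplebank.com"},
}

def looks_like_phish(display_name, sender_address):
    """Flag mail whose From display name claims a business whose
    legitimate domains do not include the actual sender domain."""
    domain = sender_address.rsplit("@", 1)[-1].lower()
    legitimate = KNOWN_BUSINESS_DOMAINS.get(display_name)
    return legitimate is not None and domain not in legitimate

print(looks_like_phish("Example Bank", "alerts@examplebank.com"))  # False
print(looks_like_phish("Example Bank", "alerts@examp1ebank.net"))  # True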

Michigan enacted as an amendment to its Identity Theft Protection Act an antiphishing provision codified at Mich. Comp. Laws § 445.67a (2017). 2010 Mich. Pub. Acts 318, § 7a.

Minnesota added to its identity theft law a new subdivision making criminal the use of false pretenses in a communication on the Internet to obtain identity, codified at Minn. Stat. § 609.527(5a) (2017). 2003 Minn. Laws 136, art. 17, § 35.

Montana in 2007 made phishing (“fraudulent electronic misrepresentation”) a crime (theft of identity) and provided a private right of action. Mont. Code Ann. § 30-14-1712 (2017), added by 2007 Mont. Laws 276, § 1.

Nevada in 2005 made introducing certain spyware a crime as part of a computer crime of introducing “computer contaminants.” Nev. Rev. Stat. § 205.4737 (2017), added by 2005 Nev. Stat. 486, § 5.

New Hampshire enacted in 2005 antispyware legislation, N.H. Rev. Stat. Ann. §§ 359-H:1–359-H:6 (2017), added by 2005 N.H. Laws 238:1, with prohibitions similar to those of Arizona, Arkansas, and Georgia.

New Mexico in 2005 added to its identity theft law a new criminal offense known as “obtaining identity by electronic fraud,” codified at N.M. Stat. Ann. § 30-16-24.1 (2017). 2005 N.M. Laws 296, § 1, amended by 2009 N.M. Laws 95, § 3 (adding certain intent and protected information).

New York adopted in 2006 its Anti-Phishing Act against fraudulent solicitation of identifying information. N.Y. Gen. Bus. Law § 390-b (2017), added by 2006 N.Y. Laws 64, § 1, amended by 2006 N.Y. Laws 414, § 1.

Oklahoma also enacted in 2006 its Anti-Phishing Act, Okla. Stat. tit. 15, §§ 776.8–776.11 (2017), added by 2006 Okla. Sess. Laws 56, §§ 3–7, the legislation expanding its existing provisions against fraudulent e-mail to fraudulent use of webpages or Internet domain names against the interests of legitimate businesses. Private rights of action are limited to trademark holders and ISPs. Okla. Stat. tit. 15, § 776.11(A).

Practice Note
The same enactment amended prior provisions against “fraudulent electronic mail” to include prohibitions against falsely representing oneself as a “legitimate online business” or fraudulently obtaining “identifying information.” Okla. Stat. tit. 15, §§ 776.1–776.6 (2017), amended by 2006 Okla. Sess. Laws 56, §§ 1, 2.

Oregon enacted in 2015 a provision against obtaining personal information by false representation via electronic media, codified at Or. Rev. Stat. § 646A.808 (2017). 2015 Or. Laws 121, § 1.

Pennsylvania enacted in 2010 its Consumer Protection Against Computer Spyware Act, 73 Pa. Cons. Stat. §§ 2330.1–2330.9 (2017), added by 2010 Pa. Laws 86, §§ 1–9, with prohibitions similar to those of Arizona, Arkansas, and Georgia, directed to
protecting the interests of legitimate businesses and thus restricting private actions to such businesses and trademark holders.

Rhode Island in 2006 enacted the antispyware legislation, R.I.G.L. §§ 11-52.2-1–11-52.2-8 (2017), added by 2006 R.I. Pub. Laws 583, § 1, that had become standard (Arizona, Arkansas, Georgia, etc.), specifically prohibiting unlawful modification of computer settings. In 2008, Rhode Island enacted further provisions against Online Property Offenses, R.I.G.L. §§ 11-52.3-1–11-52.3-5 (2017), added by 2008 R.I. Pub. Laws 467, § 1, including online sale of stolen property and theft by deception.

Practice Note
A chapter entitled “Internet Misrepresentation of Business Affiliation Act,” roughly “phishing,” was enacted at roughly the same time. R.I.G.L. §§ 11-52.1-1–11-52.1-5 (2017), added by 2006 R.I. Pub. Laws 558, § 1.

Tennessee enacted in 2006 its Anti-Phishing Act of 2006, Tenn. Code Ann. §§ 47-18-5201–47-18-5206 (2017), added by 2006 Tenn. Pub. Acts 566, §§ 2–6, which, in addition to the basic prohibition against fraudulent requests for identifying information, declared:

(b) It shall be unlawful for any person without the authorization or permission of the person who is the subject of the identifying information, with the intent to defraud, for such person’s own use or the use of a third person, or to sell or distribute the information to another, to:
(1) Fraudulently obtain, record or access identifying information that would assist in accessing financial resources, obtaining identification documents, or obtaining benefits of such other person;
(2) Obtain goods or services through the use of identifying information of such other person; or
(3) Obtain identification documents in such other person’s name.
(c) It shall be unlawful for any person with the intent to defraud and without the authorization or permission of the person who is the owner or licensee of a web page or web site to:
(1) Knowingly duplicate or mimic all or any portion of the web site or web page;
(2) Direct or redirect an electronic mail message from the IP address of a person to any other IP address;
(3) Use any trademark, logo, name, or copyright of another person on a web page; or
(4) Create an apparent but false link to a web page of a person that is directed or redirected to a web page or IP address other than that of the person represented.
(d) It shall be unlawful for any person to attempt to commit any of the offenses enumerated in this section.

Tenn. Code Ann. § 47-18-5203 (emphasis added). This reaches to more technical phishing devices. Tennessee provides a private right of action to “[a]n individual who suffers an ascertainable loss by a violation.” Tenn. Code Ann. § 47-18-5204(a)(2)(A).

Practice Note
“‘Ascertainable loss’ means an identifiable deprivation, detriment or injury arising from the identity theft or from any unfair, misleading or deceptive act or practice, even when the precise amount of the loss is not known. Whenever a violation of this part has occurred, an ascertainable loss shall be presumed to exist.” Tenn. Code Ann. § 47-18-5202(1).
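
The “apparent but false link” device in subsection (c)(4) has an equally simple technical signature: anchor text that displays one host while the underlying href points to another. A minimal sketch of that comparison, assuming links have already been parsed into visible-text/target pairs:

# A minimal check for subsection (c)(4)'s "apparent but false link":
# visible text naming one host while the href targets another.
from urllib.parse import urlparse

def deceptive_link(visible_text, href):
    """Flag a link whose URL-like visible text names a different host
    than the link's actual target."""
    text = visible_text.strip()
    if "." not in text:
        return False  # visible text is not URL-like; nothing to compare
    shown = urlparse(text if "//" in text else "http://" + text).netloc.lower()
    actual = urlparse(href).netloc.lower()
    return bool(shown and actual) and shown != actual

print(deceptive_link("www.examplebank.com", "http://phish.example.net/login"))  # True
print(deceptive_link("www.examplebank.com", "https://www.examplebank.com/"))    # False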

Texas enacted in 2005 its Consumer Protection Against Computer Spyware Act, 2005 Tex. Gen. Laws 298, adding Chapter 48 to Title 4 of the Texas Business & Commerce Code, prohibiting, among others, the standard spyware practices of unauthorized collection or culling of personally identifiable information, Tex. Bus. & Com. Code Ann. § 48.051 (2007); unauthorized access to or modifications of computer settings, Tex. Bus. & Com. Code Ann. § 48.052 (2007); and unauthorized interference with installation or disabling of software, Tex. Bus. & Com. Code Ann. § 48.053 (2007); and its Anti-Phishing Act, 2005 Tex. Gen. Laws 544, adding another Chapter 48 to Title 4 of the Texas Business & Commerce Code, prohibiting the creation of a webpage or a domain name for fraudulent purposes (gaining identifying information), Tex. Bus. & Com. Code Ann. § 48.003 (2007), or e-mail fraud, Tex. Bus. & Com. Code Ann. § 48.004 (2007). These statutes were replaced in 2009 with a new Chapter 324, the Consumer Protection Against Computer Spyware Act, Tex. Bus. & Com. Code Ann. §§ 324.001–324.102 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01, amended by 2009 Tex. Gen. Laws 718, §§ 1–5; a new Chapter 325, the Anti-Phishing Act, Tex. Bus. & Com. Code Ann. §§ 325.001–325.006 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01; as well as protections against identity theft through protecting credit and debit card information, Social Security numbers, and driver’s licenses, Tex. Bus. & Com. Code Ann. §§ 501.001–501.102 (2017), added by 2007 Tex. Gen. Laws 885, § 2.01, amended by 2009 Tex. Gen. Laws 90, § 1.

Utah enacted in 2004 its Spyware Control Act, Utah Code Ann. §§ 13-40-101–13-40-401 (2007), added by 2004 Utah Laws 363, §§ 1–5, amended by 2005 Utah Laws 168, §§ 1–5, with a focus on “pop-ups” and trademarks. In 2010, the provision was repealed and replaced with a new Utah E-Commerce Integrity Act addressed to phishing, pharming, and spyware, Utah Code Ann. §§ 13-40-101–13-40-402 (2017), added by 2010 Utah Laws 200, §§ 1–12 (with Section 14 adding Utah Code Ann. § 70-3a-309 (2017) on cybersquatting). The antiphishing and antipharming provisions prohibit the acts quoted at the start of this section. Utah Code Ann. § 13-40-201. The antispyware prohibitions are similar to the standard prohibitions (Arizona, Arkansas, and Georgia). Utah Code Ann. § 13-40-301.

Virginia in 2005 added a crime of using a computer to gather identifying information, codified at Va. Code Ann. § 18.2-152.5:1 (2017). 2005 Va. Acts chs. 747, 760, 761, 827, and 837.

Washington enacted an antispyware provision in 2005, Wash. Rev. Code §§ 19.270.010–19.270.900 (2017), added by 2005 Wash. Laws 500, §§ 1–9, amended by 2008 Wash. Laws 66, §§ 1–5, with the standard prohibitions against unauthorized modification of settings, collection of personally identifiable information, and installation or removal of software, Wash. Rev. Code § 19.270.020; taking control of a computer, Wash. Rev. Code § 19.270.030; and misrepresenting security software, Wash. Rev. Code § 19.270.040. A private right of action is allowed to interested legitimate businesses. Wash. Rev. Code § 19.270.060(1).

Wyoming in 2014 added a crime of computer trespass, including the introduction of “malware,” defined to include

viruses, worms, trojan horses, rootkits, keyloggers, backdoors, dialers, ransomware, spyware, adware, malicious browser helper objects, rogue security software and other malicious programs used or designed to disrupt a computer operation, gather sensitive information, steal sensitive information or otherwise gain unauthorized access to a computer, computer system or computer network.

Wyo. Stat. Ann. § 6-3-506 (2017), added by 2014 Wyo. Sess. Laws 73, § 1.
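
“Pharming,” as defined at the start of this section, can operate by altering a local setting so that a legitimate domain name resolves to the attacker’s server; the hosts file is the classic vector. The sketch below, which assumes a Unix-style /etc/hosts and a hypothetical watchlist of domains that should never be overridden locally, shows what checking for that alteration looks like:

# A minimal sketch of a pharming check: report hosts-file entries that
# locally override domains on a (hypothetical) watchlist.
WATCHED_DOMAINS = {"examplebank.com", "www.examplebank.com"}

def hosts_file_overrides(path="/etc/hosts"):
    """Return (ip, hostname) pairs that pin a watched domain to a fixed IP."""
    hits = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.split("#", 1)[0].strip()  # drop comments
            if not line:
                continue
            fields = line.split()
            ip, hostnames = fields[0], fields[1:]
            hits.extend((ip, h) for h in hostnames if h.lower() in WATCHED_DOMAINS)
    return hits

for ip, host in hosts_file_overrides():
    print(f"warning: {host} is locally pinned to {ip}")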

(c) Employer and School Coercion of Access to Social Media

A privacy concern that has attracted state legislative attention is coerced access by employers and schools to the login information for the private social media (and other Internet) accounts of current and prospective employees and students. Although it is not clear that this was a widespread practice, there was and is concern about two converging facts: a large portion of the population posts on and uses social media websites as a primary means of communication, with relatively little awareness of public (and permanent) access; and such postings can be used to screen or obtain prejudicial information about unsuspecting users. As of mid-2017, a total of twenty-five states had enacted laws to protect employees and prospective employees (generally including state employees), and sixteen states had enacted laws to protect students, see http://www.ncsl.org/research/telecommunications-and-information-technology/state-laws-prohibiting-access-to-social-media-usernames-and-passwords.aspx#stat, with Louisiana, Michigan, Rhode Island, and Wisconsin protecting students at every educational level. Wisconsin’s laws also regulate landlord requests for tenant information. Wis. Stat. § 995.55(4) (2017).

California enacted in 2012 protections of personal social media for current and prospective employees, Cal. [Lab.] Code § 980 (2017), added by 2012 Cal. Stat. ch. 618, § 1, amended by 2013 Cal. Stat. ch. 76, § 142 (no change), and postsecondary education students, Cal. [Educ.] Code § 99121 (2017), added by 2012 Cal. Stat. ch. 619, § 2.

Practice Note
Section 99120 defines “social media” as “an electronic service or account, or electronic content, including, but not limited to, videos or still photographs, blogs, video blogs, podcasts, instant and text messages, email, online services or accounts, or Internet Web site profiles or locations.” Section 99122 requires the school to post its privacy policy on its website.

For employees:

(a) As used in this chapter, “social media” means an electronic service or account, or electronic content, including, but not limited to, videos, still photographs, blogs, video blogs, podcasts, instant and text messages, email, online services or accounts, or Internet Web site profiles or locations.
(b) An employer shall not require or request an employee or applicant for employment to do any of the following:
(1) Disclose a username or password for the purpose of accessing personal social media.
(2) Access personal social media in the presence of the employer.
(3) Divulge any personal social media, except as provided in subdivision (c).
(c) Nothing in this section shall affect an employer’s existing rights and obligations to request an employee to divulge personal social media reasonably believed to be relevant to an investigation of allegations of employee misconduct or employee violation of applicable laws and regulations, provided that the social media is used solely for purposes of that investigation or a related proceeding.
(d) Nothing in this section precludes an employer from requiring or requesting an employee to disclose a username, password, or other method for the purpose of accessing an employer-issued electronic device.

Cal. [Lab.] Code § 980.

For students:

(a) Public and private postsecondary educational institutions, and their employees and representatives, shall not require or request a student, prospective student, or student group to do any of the following:
(1) Disclose a user name or password for accessing personal social media.
(2) Access personal social media in the presence of the institution’s employee or representative.
(3) Divulge any personal social media information.
(b) A public or private postsecondary educational institution shall not suspend, expel, discipline, threaten to take any of those actions, or otherwise penalize a student, prospective student, or student group in any way for refusing to comply with a request or demand that violates this section.
(c) This section shall not do either of the following:
(1) Affect a public or private postsecondary educational institution’s existing rights and obligations to protect against and investigate alleged student misconduct or violations of applicable laws and regulations.
(2) Prohibit a public or private postsecondary educational institution from taking any adverse action against a student, prospective student, or student group for any lawful reason.

Cal. [Educ.] Code § 99121.

Practice Note
Section 99120 defines “social media” equivalently to Cal. [Lab.] Code § 980(a). Section 99122 requires the school to post its privacy policy on its website.

Arkansas enacted in 2013 protections for current and prospective employees, Ark. Code Ann. § 11-2-124 (2017), added by 2013 Ark. Acts 1480, § 1, and higher education students, Ark. Code Ann. § 6-60-104 (2017), added by 2013 Ark. Acts 998, § 1.

Colorado enacted in 2013 protections for current and prospective employees. Colo. Rev. Stat. § 8-2-127 (2016), added by 2013 Colo. Sess. Laws 195, § 1.

Connecticut enacted in 2015 An Act Concerning Employee Online Privacy, protecting current and prospective employees. Conn. Gen. Stat. § 31-40x (2017), added by 2015 Conn. Acts 6, § 1, amended by 2016 Conn. Acts 169, § 21 (fixing a typographical error).

Delaware enacted in 2012 an Education Privacy Act, protecting current and prospective postsecondary school students. 14 Del. Code Ann. §§ 8101–8104 (2017), added by 78 Del. Laws 354, § 1 (2012). In 2015, it enacted a similar protection for employees and applicants. 19 Del. Code Ann. § 709A (2017), added by 80 Del. Laws 146, § 1 (2015).

Illinois enacted in 2011 protections for current and prospective employees, 820 Ill. Comp. Stat. 55/10 (2015), added by 2011 Ill. Laws 875, § 5, amended by 2013 Ill. Laws 501, § 5 (adding Subsection 3.5 and related definitions in Subsection 4, distinguishing personal and professional accounts), 2015 Ill. Laws 610, § 10 (adding to prohibited acts of coercion and to privileged activities and defining personal accounts). It enacted in 2013 protections for current and prospective postsecondary
school students, 105 Ill. Comp. Stat. 75/5–75/20 (2017), added by 2013 Ill. Laws 129, §§ 5–20, amended by 2015 Ill. Laws 460, §§ 10, 15 (allowing schools to require sharing of content for an investigation, but requiring notice).

Practice Note
The “Right to Privacy in the School Setting,” 2013 Ill. Laws 129, at Section 5 defined “social media network” similarly to 820 Ill. Comp. Stat. 55/10(4), and at Section 15 provided for notification of the possibility of such requests in a school disciplinary rule or policy that is to be published.

Louisiana’s Personal Online Account Privacy Act, La. Rev. Stat. Ann. §§ 51:1951–51:1955 (2017), added by 2014 La. Acts 165, protects current and prospective employees and students ranging from nursery school through university, vocational school, extension courses, educational testing services, and agencies, La. Rev. Stat. Ann. § 51:1952(1) (definition of “educational institution”).

Maine in 2015 enacted protections for current and prospective employees relative to coerced access to social media accounts. 26 M.R.S. §§ 616–618 (2017), added by 2015 Me. Laws ch. 343, § B-1.

Maryland in 2012 enacted protections for current and prospective employees. Md. Code Ann., [Lab. & Empl.] § 3-712 (2017), added by 2012 Md. Laws 233, 234, amended by 2013 Md. Laws 224, § 1 (adding Subsection (f) on agency mediation or attorney general enforcement). It added in 2015 “Personal Electronic Account Privacy Protection” to its Educational Code to protect current or prospective students of institutions of postsecondary education. Md. Code Ann., [Educ.] § 26-401, added by 2015 Md. Laws 465, 466.

Michigan in 2012 enacted an Internet Privacy Protection Act, Mich. Comp. Laws §§ 37.271–37.278 (2017), added by 2012 Mich. Pub. Acts 478, §§ 1–8, that protects present and prospective employees and students ranging from nursery school through university, vocational school, extension courses, educational testing services, and agencies, Mich. Comp. Laws § 37.272(b) (definition of “educational institution”).

Montana in 2015 enacted An Act Prohibiting an Employer from Requesting Online Passwords or User Names for an Employee’s or Job Applicant’s Personal Social Media Account. Mont. Code Ann. § 39-2-307 (2017), added by 2015 Mont. Laws 263, § 1.

Nebraska in 2016 enacted its Workplace Privacy Act, Neb. Rev. Stat. §§ 48-3501 through 48-3511 (2017), added by 2016 Neb. Laws 821, §§ 1–11, that prohibits employer coercion of disclosure of passwords or of access to “personal internet accounts” of employees, Neb. Rev. Stat. § 48-3503, but allows investigation on “specific information” (Neb. Rev. Stat. § 48-3507) and prohibits employee download or transfer of the employer’s proprietary information (Neb. Rev. Stat. § 48-3506). A private right of action is authorized. Neb. Rev. Stat. § 48-3511.

Nevada in 2013 enacted protections for current and prospective employees. Nev. Rev. Stat. Ann. § 613.135 (2017), added by 2013 Nev. Stat. 548, § 2.

New Hampshire in 2014 enacted a new subdivision, Use of Social Media and Electronic Mail, N.H. Rev. Stat. Ann. §§ 275:73–275:75 (2017), added by 2014 N.H. Stat. 305, § 1, protecting current and prospective employees’ “personal accounts.” In 2015 it added a similar protection for students and prospective students from coercion by an “educational institution,” not restricted by age. N.H. Rev. Stat. § 189:70 (2017), added by 2015 N.H. Stat. 270, § 1.

New Jersey in 2012 enacted a law to protect current and prospective higher education students, N.J. Stat. §§ 18A:3-29–18A:3-31 (2017), added by 2012 N.J. Laws 75, §§ 1–4, and in 2013 to protect current and prospective employees, N.J. Stat. §§ 34:6B-5–34:6B-10 (2017), added by 2013 N.J. Laws 155, §§ 1–6.

New Mexico in 2013 enacted legislation protecting prospective employees only. N.M. Stat. § 50-4-34 (2017), added by 2013 N.M. Laws 222, § 1. New Mexico, however, protects current and prospective postsecondary students. N.M. Stat. § 21-1-46(A)–(C) (2017), added by 2013 N.M. Laws 223, § 1.

Oklahoma enacted in 2014 protections for current and prospective employees. Okla. Stat. tit. 40, § 173.2 (2017), added by 2014 Okla. Sess. Laws 315, § 1.

Oregon in 2013 followed a similar template to other 2013 enactments for protecting current and prospective employees, Or. Rev. Stat. § 659A.330 (2017), added by 2013 Or. Laws 204, § 2, amended by 2015 Or. Laws 229, § 1 (defining protected “personal social media accounts”), and for current and prospective students of other-than-K–12 “educational institutions.” Or. Rev. Stat. §§ 350.272, 350.274 (2017), added by 2013 Or. Laws 408, §§ 1, 2 (as §§ 326.551, 326.554), renumbered 2015.

Rhode Island in 2014 enacted protections for current and prospective employees, R.I.G.L. §§ 28-56-1–28-56-6 (2017), added by 2014 R.I. Pub. Laws 188, § 3, 2014 R.I. Pub. Laws 207, § 3, and students (level of education not specified, but probably including K–12), R.I.G.L. §§ 16-103-1–16-103-6 (2017), added by 2014 R.I. Pub. Laws 188, § 1, 2014 R.I. Pub. Laws 207, § 1.

Practice Note
“‘Educational institution’ or ‘school’ means a private or public institution that offers participants, students or trainees an organized course of study or training that is academic, technical, trade-oriented or preparatory for gainful employment in a recognized occupation and shall include any person acting as an agent of the institution.” R.I.G.L. § 16-103-1(3). The legislation also included protection of K–12 students from advertising in “student data-cloud computing” services to “educational institutions.” R.I.G.L. § 16-104-1 (2017), added by 2014 R.I. Pub. Laws 188, § 2, 2014 R.I. Pub. Laws 207, § 2.

Tennessee in 2014 enacted the Employee Privacy Protection Act of 2014, Tenn. Code Ann. §§ 50-1-1001–50-1-1004 (2017), added by 2014 Tenn. Pub. Acts 826, §§ 2–5, protecting against access to a private or state employee’s “personal Internet account,” defined as “an online account that is used by an employee or applicant exclusively
for personal communications unrelated to any business purpose of the employer.” Tenn. Code Ann. § 50-1-1002(5)(A).

Utah in 2013 enacted, in the same legislation, an Internet Employment Privacy Act, Utah Code Ann. §§ 34-48-101–34-48-301 (2017), added by 2013 Utah Laws 94, §§ 1–6, and an Internet Postsecondary Institution Privacy Act, Utah Code Ann. §§ 53B-24-101–53B-24-301 (2017), added by 2013 Utah Laws 94, §§ 7–12, to protect “personal Internet accounts” of current and prospective employees and students that are “used . . . exclusively for personal communications unrelated to any . . . purpose” of the employer or the educational institution, Utah Code Ann. §§ 34-48-102(4)(a), 53B-24-102(1)(a).

Virginia in 2015 added to its laws a new section on social media accounts of current and prospective employees. Va. Code Ann. § 40.1-28.7:5 (2017), added by 2015 Va. Acts 576, § 1. In 2016, it added to its protection of the records of students in higher education institutions further protection against coerced disclosure of user names or passwords for social media accounts. Va. Code Ann. § 23.1-405 (2017), added by 2016 Va. Acts 597, § 1.

Washington in 2013 enacted protections for “personal social networking accounts” of current and prospective employees, Wash. Rev. Code §§ 49.44.200, 49.44.205 (2017), added by 2013 Wash. Sess. Laws 330, §§ 1, 2. A private right of action is provided. Wash. Rev. Code § 49.44.205.

West Virginia in 2016 enacted protections against coerced disclosure of “authentication information” for “personal accounts” of current and prospective employees. W. Va. Code § 21-5H-1 (2017), added by 2016 W. Va. Acts 143.

Wisconsin in 2013 enacted a provision protecting against coerced disclosure of login information for “personal social networking accounts” of current and prospective employees, students at every level, and tenants. Wis. Stat. § 995.55 (2017), added by 2013 Wis. Laws 208, § 5.

Other states are considering similar legislation. The National Conference of Commissioners on Uniform State Laws convened a committee to draft a proposed Social Media Privacy Act that was renamed and promulgated in 2016 as the Uniform Employee and Student Online Privacy Protection Act (UESOPPA). http://www.uniformlaws.org/shared/docs/social%20media%20privacy/ESOPPA_Final%20Act_2016.pdf. UESOPPA addresses employee and student login information for all online accounts, rather than just social media, but is limited to postsecondary school students and prospects. UESOPPA § 2 (definitions). It provides for a private right of action. UESOPPA § 5.

(d) Cyber-Bullying

Privacy as “the right to be left alone” is violated by harassment or “bullying,” which, although most often associated with schoolchildren, also has broader application. “All states have various criminal laws that might apply to bullying behaviors, depending on the nature of the act . . . . All states also have criminal harassment and/or stalking statutes, and most include explicit reference to electronic forms.” Bullying Laws Across America (Cyberbullying Research Center), available at https://cyberbullying.org/bullying-laws. Such bullying typically includes harassment at school or affecting school behavior. Twenty-three states have specifically included “cyberbullying” and forty-eight states have included “electronic harassment” in their codes. See stopbullying.gov, available at https://www.stopbullying.gov/laws/index.html; see also Bullying Laws Across America, available at https://cyberbullying.org/bullying-laws. The laws applicable to cyberbullying are also included in state education laws and policy regulations.

Massachusetts, like many other states, did not treat “stalking” as a crime until the 1990s. Previously, following a person of one’s desire had been as much a part of American romanticism as it was of abuse. The first version of G.L. c. 265, § 43, enacted by “An Act Establishing the Crime of Stalking,” 1992 Mass. Acts 31, which prescribed that “[w]hoever willfully, maliciously, and repeatedly follows or harasses another person and who makes a threat with the intent to place that person in imminent fear of death or serious bodily injury shall be guilty of the crime of stalking,” was found unconstitutionally vague in Commonwealth v. Kwiatkowski, 418 Mass. 543 (1994), and was rewritten so that “following” or “harassment” were not specified:

Whoever (1) willfully and maliciously engages in a knowing pattern of conduct or series of acts over a period of time directed at a specific person which seriously alarms or annoys that person and would cause a reasonable person to suffer substantial emotional distress, and (2) makes a threat with the intent to place the person in imminent fear of death or bodily injury, shall be guilty of the crime of stalking.

G.L. c. 265, § 43(a) (2017) (emphasis added), amended by 1996 Mass. Acts 298, § 11 (also adding provisions to Chapter 209A on domestic restraining orders). The following year, the General Court added the following examples of conduct violating the law: “Such conduct, acts or threats described in this paragraph shall include, but not be limited to, conduct, acts or threats conducted by mail or by use of a telephonic or telecommunication device including, but not limited to, electronic mail, internet communications and facsimile communications.” 1997 Mass. Acts 238.

In 2010, as part of “An Act Relative to Bullying in Schools,” 2010 Mass. Acts 92, Section 43(a) was rewritten:

The conduct, acts or threats described in this subsection shall include, but not be limited to, conduct, acts or threats conducted by mail or by use of a telephonic or telecommunication device or electronic communication device including, but not limited to, any device that transfers signs, signals, writing, images, sounds, data, or intelligence of any nature transmitted in whole or in part by a wire, radio, electromagnetic, photoelectronic or photo-optical system, including, but not limited to, electronic mail, internet communications, instant messages or facsimile communications.

G.L. c. 265, § 43(a) (2017) (emphasis added), amended by 2010 Mass. Acts 92, § 9. This included social media communications conducted via smartphone.

In 2000, “An Act Relative to the Crime of Criminal Harassment” provided that “[w]hoever willfully and maliciously engages in a knowing pattern of conduct or series of acts over a period of time directed at a specific person, which seriously alarms that person and would cause a reasonable person to suffer substantial emotional distress, shall be guilty of the crime of criminal harassment.” G.L. c. 265, § 43A (2017), as added by 2000 Mass. Acts 164, § 1. The antibullying act of 2010 added the same “conduct, acts or threats” recited above for Subsection 43(a) to the first paragraph of Section 43A. 2010 Mass. Acts 92, § 10.

These “conduct, acts or threats” were part of “cyberbullying” as defined in the antibullying act of 2010:

“Cyber-bullying”, bullying through the use of technology or any electronic communication, which shall include, but shall not be limited to, any transfer of signs, signals, writing, images, sounds, data or intelligence of any nature transmitted in whole or in part by a wire, radio, electromagnetic, photo electronic or photo optical system, including, but not limited to, electronic mail, internet communications, instant messages or facsimile communications. Cyber-bullying shall also include (i) the creation of a web page or blog in which the creator assumes the identity of another person or (ii) the knowing impersonation of another person as the author of posted content or messages, if the creation or impersonation creates any of the conditions enumerated in clauses (i) to (v), inclusive, of the definition of bullying. Cyber-bullying shall also include the distribution by electronic means of a communication to more than one person or the posting of material on an electronic medium that may be accessed by one or more persons, if the distribution or posting creates any of the conditions enumerated in clauses (i) to (v), inclusive, of the definition of bullying.

G.L. c. 71, § 37O(a) (2017) (emphasis added), added by 2010 Mass. Acts 92, § 5, where:

“Bullying”, the repeated use by one or more students or by a member of a school staff including, but not limited to, an educator, administrator, school nurse, cafeteria worker, custodian, bus driver, athletic coach, advisor to an extracurricular activity or paraprofessional of a written, verbal or electronic expression or a physical act or gesture or any combination thereof, directed at a victim that: (i) causes physical or emotional harm to the victim or damage to the victim’s property; (ii) places the victim in reasonable fear of harm to himself or of damage to his property; (iii) creates a hostile environment at school for the victim; (iv) infringes on the rights of the victim at school; or (v) materially and substantially disrupts the education process or the orderly operation of a school. For the purposes of this section, bullying shall include cyber-bullying.

G.L. c. 71, § 37O(a) (2017) (emphasis added), added by 2010 Mass. Acts 92, § 5, amended by 2013 Mass. Acts 38, § 72 (to add school staff as possible bullies).

Practice Note
North Carolina also protects school employees from cyberbullying. N.C. Gen. Stat. § 14-458.2 (2017), added by 2012 N.C. Sess. Laws 149, § 4.

Section 37O prohibits bullying:

(i) on school grounds, property immediately adjacent to school grounds, at a school-sponsored or school-related activity, function or program whether on or off school grounds, at a school bus stop, on a school bus or other vehicle owned, leased or used by a school district or school, or through the use of technology or an electronic device owned, leased or used by a school district or school and (ii) at a location, activity, function or program that is not school-related, or through the use of technology or an electronic device that is not owned, leased or used by a school district or school, if the bullying creates a hostile environment at school for the victim, infringes on the rights of the victim at school or materially and substantially disrupts the education process or the orderly operation of a school.

G.L. c. 71, § 37O(b) (2017) (emphasis added), added by 2010 Mass. Acts 92, § 5. It also requires instruction, planning, monitoring, and reporting on the prevention of bullying.

In addition to the above, 2010 Mass. Acts 92, § 12, amended G.L. c. 269, § 14A (2017), which makes it a crime to telephone or contact a person by the electronic means highlighted above, “repeatedly, for the sole purpose of harassing, annoying or molesting the person or the person’s family.”

(e) “Revenge Porn”

Thirty-eight states and Washington, D.C., have passed statutes criminalizing acts of “revenge porn” to various degrees. See “38 States + DC Have Revenge Porn Laws,” available at https://www.cybercivilrights.org/revenge-porn-laws. Although the term is based on the use by a person who posts on the Internet, for “revenge” after a break-up, an intimate image generally made with the mutual consent of all parties, the underlying perceived wrong applies to other unauthorized disclosures of intimate images, sometimes for extortion. The problem for victims of the practice is both a lack of legal remedies and, to some extent, a lack of sympathy for having allowed the images to be made in the first place and possibly posted “privately.” If one of the parties made the media, that party might have a copyright with which to shut down distribution, but more often than not, the victim who is pictured did not take the photograph. Of the four Brandeis-Prosser common law privacy torts reviewed at § 9.1.2, only “unreasonable publicity given to another’s private life” seems applicable to the practice, but First Amendment considerations often prevent restraints on publication, and Section 230 of the Communications Decency Act, 47 U.S.C. § 230, largely protects website operators from liability for content originated by a third party. The various criminal statutes that have been enacted have remained largely untested against these defenses.

Alabama in 2017 criminalized the following:

(a) A person commits the crime of distributing a private image if he or she knowingly posts, emails, texts, transmits, or otherwise distributes a private image with the intent to harass, threaten, coerce, or intimidate the person depicted when the depicted person has not consented to the transmission and the depicted person had a reasonable expectation of privacy against transmission of the private image.

(b) For purposes of this section, private image means a photograph, digital image, video, film, or other recording of a person who is identifiable from the recording itself or from the circumstances of its transmission and who is engaged in any act of sadomasochistic abuse, sexual intercourse, sexual excitement, masturbation, breast nudity, as defined in Section 13A-12-190, genital nudity, or other sexual conduct. The term includes a recording that has been edited, altered, or otherwise manipulated from its original form.

(c)(1) For purposes of this section, a reasonable expectation of privacy includes, but is not limited to, either of the following circumstances:

a. The person depicted in the private image created it or consented to its creation believing that it would remain confidential.

b. The sexual conduct depicted in the image was involuntary.

(2) There is no reasonable expectation of privacy against the transmission of a private image made voluntarily in a public or commercial setting.

Ala. Code § 13A-6-72 (2017), added by 2017 Ala. Acts 414, § 1.

Alaska addresses the practice in the following way:

(a) A person commits the crime of harassment in the second degree if, with intent to harass or annoy another person, that person

(6) except as provided in AS 11.61.116 [sending explicit image of a minor], publishes or distributes electronic or printed photographs, pictures, or films that show the genitals, anus, or female breast of the other person or show that person engaged in a sexual act;

Alaska Stat. § 11.61.120 (2017), added by 2011 Alaska Sess. Laws 20, § 12.

Arizona provides:

A. It is unlawful for a person to intentionally disclose an image of another person who is identifiable from the image itself or from information displayed in connection with the image if all of the following apply:

1. The person in the image is depicted in a state of nudity or is engaged in specific sexual activities.

2. The depicted person has a reasonable expectation of privacy. Evidence that a person has sent an image to another person using an electronic device does not, on its own, remove the person’s reasonable expectation of privacy for that image.

3. The image is disclosed with the intent to harm, harass, intimidate, threaten or coerce the depicted person.

B. This section does not apply to any of the following:

1. The reporting of unlawful conduct.

2. Lawful and common practices of law enforcement, criminal reporting, legal proceedings or medical treatment.

3. Images involving voluntary exposure in a public or commercial setting.

4. An interactive computer service, as defined in 47 United States Code section 230(f)(2), or an information service, as defined in 47 United States Code section 153, with regard to content wholly provided by another party.

5. Any disclosure that is made with the consent of the person who is depicted in the image.

Ariz. Rev. Stat. § 13-1425 (2017).

Arkansas provides:

(a) A person commits the offense of unlawful distribution of sexual images or recordings if, being eighteen (18) years of age or older, with the purpose to harass, frighten, intimidate, threaten, or abuse another person, the actor distributes an image, picture, video, or voice or audio recording of the other person to a third person by any means if the image, picture, video, or voice or audio recording:

(1) Is of a sexual nature or depicts the other person in a state of nudity; and

(2) The other person is a family or household member of the actor or another person with whom the actor is in a current or former dating relationship.

(b) The fact that an image, picture, video, or voice or audio recording was created with the knowledge or consent of the other person or that the image, picture, video, or voice or audio recording is the property of a person charged under this section is not a defense to prosecution under this section.

Ark. Code § 5-26-314 (2017), added by 2015 Ark. Acts 304, § 2.

California provides:

(A) A person who intentionally distributes the image of the intimate body part or parts of another identifiable person, or an image of the person depicted engaged in an act of sexual intercourse, sodomy, oral copulation, sexual penetration, or an image of masturbation by the person depicted or in which the person depicted participates, under circumstances in which the persons agree or understand that the image shall remain private, the person distributing the image knows or should know that distribution of the image will cause serious emotional distress, and the person depicted suffers that distress.

(B) A person intentionally distributes an image described in subparagraph (A) when he or she personally distributes the image, or arranges, specifically requests, or intentionally causes another person to distribute that image.

(C) As used in this paragraph, “intimate body part” means any portion of the genitals, the anus and in the case of a female, also includes any portion of the breasts below the top of the areola, that is either uncovered or clearly visible through clothing.

Cal. Penal Code § 647(j)(4) (2017), added by 2013 Cal. Stat. 466, § 1, amended by 2014 Cal. Stat. 71, § 125, 2016 Cal. Stat. 654, § 1.4, 2016 Cal. Stat. 724, § 1.3, 2016 Cal. Stat. 734, § 1.4.

Colorado enacted in 2014 “An Act Concerning the Prohibiting of Posting of a Private Image on Social Media Without Consent To Cause Serious Emotional Distress,” adding Colo. Rev. Stat. § 18-7-107 (2017) (posting a private image for harassment) and § 18-7-108 (2017) (posting a private image for pecuniary gain), added by 2014 Colo. Sess. Laws 283, § 1.


Connecticut added to its voyeurism laws a new section, Conn. Gen. Stat. § 53a-189c (2017), making unlawful the dissemination of an intimate image. 2014 Conn. Acts 213, § 8.

Delaware provides:

(a) A person is guilty of violation of privacy when, except as authorized by law, the person:

(9) Knowingly reproduces, distributes, exhibits, publishes, transmits, or otherwise disseminates a visual depiction of a person who is nude, or who is engaging in sexual conduct, when the person knows or should have known that the reproduction, distribution, exhibition, publication, transmission, or other dissemination was without the consent of the person depicted and that the visual depiction was created or provided to the person under circumstances in which the person depicted has a reasonable expectation of privacy.

Del. Code Ann. tit. 11, § 1335 (2017), added by 2014 Del. Laws 415, § 1, amended by 2017 Del. Laws 79, § 11.

The District of Columbia enacted the “Criminalization of Non-Consensual Pornography Act of 2014,” 2014 D.C. Stat. 20-275, codified at D.C. Code Ann. §§ 22-3051–22-3057.

Florida enacted in 2015 Fla. Laws ch. 24, § 1, Fla. Stat. ch. 784.049 (2017), making “sexual cyberharassment” a misdemeanor, where

“Sexually cyberharass” means to publish a sexually explicit image of a person that contains or conveys the personal identification information of the depicted person to an Internet website without the depicted person’s consent, for no legitimate purpose, with the intent of causing substantial emotional distress to the depicted person.

Fla. Stat. ch. 784.049(2)(c). The law also provides a private right of action for an injunction, statutory or actual damages, and attorney fees. Fla. Stat. ch. 784.049(5).

Georgia provides at Ga. Code Ann. § 16-11-90 (2017), added by 2014 Ga. Laws 519, § 1, 2015 Ga. Laws 9, § 16:

(b) A person violates this Code section if he or she, knowing the content of a transmission or post, knowingly and without the consent of the depicted person:

(1) Electronically transmits or posts, in one or more transmissions or posts, a photograph or video which depicts nudity or sexually explicit conduct of an adult when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person; or

(2) Causes the electronic transmission or posting, in one or more transmissions or posts, of a photograph or video which depicts nudity or sexually explicit conduct of an adult when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person.

Hawaii added in 2014 to its violation of privacy statute that a person commits the offense if:

The person knowingly discloses an image or video of another identifiable person either in the nude, as defined in section 712-1210, or engaging in sexual conduct, as defined in section 712-1210, without the consent of the depicted person, with intent to harm substantially the depicted person with respect to that person’s health, safety, business, calling, career, financial condition, reputation, or personal relationships . . .

Haw. Rev. Stat. § 711-1110.9(1)(b) (2017), added by 2014 Haw. Sess. Laws 116, § 1.

Idaho in 2014 extended its video voyeurism crime to an offender who:

either intentionally or with reckless disregard disseminates, publishes or sells or conspires to disseminate, publish or sell any image or images of the intimate areas of another person or persons without the consent of such other person or persons and he knows or reasonably should have known that one (1) or both parties agreed or understood that the images should remain private.

Idaho Code § 18-6609(2)(b) (2017), added by 2014 Idaho Sess. Laws 173, § 1.

Illinois criminalized in 2013 the nonconsensual dissemination of private sexual images at 720 Ill. Comp. Stat. 5/11-23.5 (2017), added by 2013 Ill. Laws 1138, § 5.

Iowa in 2017 added to its crime of harassment the case where the offender

[d]isseminates, publishes, distributes, posts, or causes to be disseminated, published, distributed, or posted a photograph or film showing another person in a state of full or partial nudity or engaged in a sex act, knowing that the other person has not consented to the dissemination, publication, distribution, or posting.

Iowa Code § 708.7(1)(a)(5) (2017), added by 2017 Iowa Acts 117, § 2.

Kansas added in 2016 to its breach of privacy statute the offense of

(8) disseminating any videotape, photograph, film or image of another identifiable person 18 years of age or older who is nude or engaged in sexual activity and under circumstances in which such identifiable person had a reasonable expectation of privacy, with the intent to harass, threaten or intimidate such identifiable person, and such identifiable person did not consent to such dissemination.

Kan. Stat. Ann. § 21-6101(a)(8) (2017), added by 2016 Kan. Sess. Laws 96, § 5.

Louisiana enacted in 2015 La. Acts 231, § 1, the offense of nonconsensual disclosure of a private image, codified at La. Rev. Stat. 14:283.2 (2017).

Maine enacted in 2015 “An Act To Prohibit Dissemination of Certain Private Images,” 2015 Me. Laws 339, § 1, amended by 2016 Me. Laws 394, § 3, 2016 Me. Laws 410, § A-1, which provides in part:

A person is guilty of unauthorized dissemination of certain private images if the person, with the intent to harass, torment or threaten the depicted person or another person, knowingly disseminates, displays or publishes a photograph, videotape, film or digital recording of another person in a state of nudity or engaged in a sexual act or engaged in sexual contact in a manner in which there is no public or newsworthy purpose when the person knows or should have known that the depicted person: . . .

B. Is identifiable from the image itself or information displayed in connection with the image; and

C. Has not consented to the dissemination, display or publication of the private image.

Me. Rev. Stat. tit. 17-A, § 511-A(1) (2017).

Maryland in 2014 enacted a specific statute addressed to “revenge porn”:

(c) A person may not intentionally cause serious emotional distress to another by intentionally placing on the Internet a photograph, film, videotape, recording, or any other reproduction of the image of the other person that reveals the identity of the other person with his or her intimate parts exposed or while engaged in an act of sexual contact:

(1) knowing that the other person did not consent to the placement of the image on the Internet; and

(2) under circumstances in which the other person had a reasonable expectation that the image would be kept private.

Md. Code Ann. [Crim. Law] § 3-809(c) (2017), added by 2014 Md. Laws 583.


Michigan enacted in 2016 Mich. Pub. Acts 89 a law against dissemination of sexually explicit visual material of another, codified at Mich. Comp. Laws § 750.145e.

Minnesota enacted in 2016 Minn. Laws 126, §§ 1–2, a private cause of action for nonconsensual dissemination of private sexual images, codified at Minn. Stat. §§ 604.30, 604.31 (2017), providing for injunctions, general and special damages for “emotional anguish,” disgorgement of profits, civil penalties of up to $10,000, and attorney fees.

Nevada enacted in 2015 Nev. Stat. 399, §§ 2–6.5, establishing the crime of unlawful dissemination of an intimate image of a person, which has been codified at Nev. Rev. Stat. Ann. § 200.604 (2017) and Nev. Rev. Stat. Ann. § 200.780 (2017).

New Hampshire enacted in 2016 N.H. Laws 126, “An Act relative to the nonconsensual dissemination of private sexual images,” adding N.H. Rev. Stat. Ann. § 644:9-a (2017).

New Jersey in 2016 N.J. Laws 2 extended its invasion of privacy statutes to dissemination of intimate images, including “another person’s undergarment-clad intimate parts,” at N.J. Stat. Ann. § 2C:14-9(c) (2017) (crime) and N.J. Stat. Ann. § 2A:58D-1 (2017) (private right of action including minimum liquidated damages of $1,000 and the possibility of punitive damages and attorney fees).

New Mexico in 2015 N.M. Laws 42, § 1, passed a law making unauthorized distribution of sensitive images a crime, codified at N.M. Stat. Ann. § 30-37A-1 (2017), where “sensitive image” is broadly defined to include

images, photographs, videos or other likenesses depicting or simulating an intimate act or depicting any portion of a person’s genitals, or of a woman’s breast below the top of the areola, that is either uncovered or visible through less-than-fully opaque clothing, which images may reasonably be considered to be private, intimate or inappropriate for distribution or publication without that person’s consent;

N.M. Stat. Ann. § 30-37A-1(A)(5).

North Carolina in 2015 N.C. Sess. Laws 250 enacted a statute criminalizing and providing a private right of action (with a short statute of limitations) for “disclosure of private images,” codified at N.C. Gen. Stat. § 14-190.5A (2017).

North Dakota in 2015 N.D. Laws 106 enacted a provision criminalizing distribution of intimate images without or against consent, N.D. Cent. Code § 12.1-17-07.2 (2017), and providing a private right of action, N.D. Cent. Code § 32-03-58 (2017).

Oklahoma in 2016 Okla. Sess. Laws 262 made criminal the nonconsensual dissemination of private sexual images, codified at Okla. Stat. tit. 21, § 1040.13b (2017).


Oregon in 2015 Or. Laws 379, § 1, made criminal the unconsented dissemination of intimate images with intent to harass, humiliate or injure, codified at Or. Rev. Stat. § 161.005 (2017).

Pennsylvania in 2014 Pa. Laws 115 enacted a provision criminalizing dissemination of “a visual depiction of the current or former sexual or intimate partner in a state of nudity or engaged in sexual conduct,” 18 Pa. Cons. Stat. § 3131(a) (2017), and providing a private right of action, 42 Pa. Cons. Stat. § 8316.1 (2017) ($500 minimum damages and attorney fees possible).

South Dakota updated a criminal statute addressing the visual recording of sexual images to prohibit dissemination “in any form.” S.D. Codified Laws § 22-21-4 (2017), as added by 2004 S.D. Laws 151, § 1, amended by 2011 S.D. Laws 116, § 1, 2016 S.D. Laws 123, § 1.

Tennessee in 2016 Tenn. Pub. Acts 872, § 1, created the offense of “unlawful exposure,” committed when a person, “with the intent to cause emotional distress, distributes an image of the intimate part or parts of another identifiable person,” codified at Tenn. Code Ann. § 39-17-318 (2017).

Texas enacted the “Relationship Privacy Act,” 2015 Tex. Gen. Laws 852, §§ 2, 3, providing for a private right of action for unconsented disclosure of “intimate visual materials,” Tex. Civ. Prac. & Rem. Code Ann. §§ 98B.001–98B.007 (2017), and criminal liability, Tex. Penal Code Ann. § 21.16 (2017) (somewhat broader coverage, including extortion).

Utah in 2014 Utah Laws 124, § 1, made unconsented distribution of an intimate image a misdemeanor, codified at Utah Code Ann. § 76-5b-203 (2017).

Vermont in 2015 Vt. Acts & Resolves 62, §§ 2 and 3, made disclosure of sexually explicit images without consent a crime and subject to a private right of action, Vt. Stat. Ann. tit. 13, § 2606 (2017), and made extortion to remove booking photographs a crime and subject to a private right of action, Vt. Stat. Ann. tit. 9, § 4191 (2017).

Virginia in 2014 Va. Acts ch. 399 updated its unlawful imaging statute, Va. Code Ann. § 18.2-386.1 (2017), and added the criminalization of unconsented dissemination or sale of images of another, Va. Code Ann. § 18.2-386.2 (2017).

Washington in 2015 made a crime of the unconsented disclosure of intimate images. Wash. Rev. Code § 9A.86.010 (2017), added by 2015 Wash. Laws 2d Sp. Sess. 7, § 1, amended by 2016 Wash. Laws 91, § 1. It also created a private right of action at Wash. Rev. Code § 4.24.795 (2017), added by 2015 Wash. Laws 2d Sp. Sess. 8, § 1.

West Virginia in 2017 W. Va. S.B. 420 made nonconsensual disclosure of private intimate images a crime, codified at W. Va. Code § 61-8-28A (2017).

Wisconsin in 2013 Wis. Laws 243, § 4, added to its statute criminalizing some representations of nudity a penalty for whoever “[p]osts, publishes, or causes to be posted or published, a private representation if the actor knows that the person depicted does not consent to the posting or publication of the private representation.” Wis. Stat. § 942.09(3m)(a)(1) (2017).

§ 9.3 DATA SECURITY AND PRIVACY LITIGATION

The relative uniformity of the data breach notification laws enacted by forty-eight states and the District of Columbia masks the great differences among the states as to remedies for individuals whose personal information is misappropriated. Several of the data breach notification laws provide for private rights of action. See Cal. [Civ.] Code § 1798.84(b); D.C. Code § 28-3853 (2014); Haw. Rev. Stat. § 487N-3(b); La. Rev. Stat. 51:3075; N.H. Rev. Stat. Ann. § 359-C:21; N.J. Stat. § 56:8-166; S.C. Code Ann. § 39-1-90(G); Va. Code Ann. § 18.2-186.6(I) (expressly not precluded); Wash. Rev. Code § 19.255.010(10). Others may allow a private right of action under the state unfair and deceptive practices or “Baby FTC” acts. See, e.g., Conn. Gen. Stat. § 42-110g; 815 Ill. Comp. Stat. 530/20; Md. Code Ann. [Com. Law] § 14-3508; Mass. G.L. c. 93H, § 6 (attorney general may bring an action, but a private action otherwise available under Chapter 93A is not precluded). (Some states provide for enforcement exclusively by the attorney general. E.g., W. Va. Code § 46A-2A-104(b).) But the wrong is the failure to report rather than the breach itself. State tort and contract law in this field is not well developed.

Practice Note
The Wisconsin breach notification law provides: “Failure to comply with this section is not negligence or a breach of any duty, but may be evidence of negligence or a breach of a legal duty.” Wis. Stat. § 134.98(4) (2015).

As the court in In re Hannaford Bros. Co. Customer Data Security Breach Litigation stated in mid-2009, “[f]or those wanting a definitive answer to this question of who should bear the risk of data theft in electronic payment systems, my ruling will be unsatisfactory. In this case, the answer depends wholly on state law, and the state law is still undeveloped.” In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 613 F. Supp. 2d 108, 114–15 (D. Me. 2009), aff’d in part, rev’d in part sub nom. Anderson v. Hannaford Bros. Co., 659 F.3d 151 (1st Cir. 2011). In that case of a 2007 massive compromise of credit and debit card data (up to 4.2 million card numbers), a motion to dismiss various counts of the class action complaint was decided under Maine law:

• Implied contract—denied.
• Implied warranty—granted.
• Breach of confidential relationship—granted.
• Breach of duty to advise—granted (no allegation under notification law; no private right).
• Strict liability—granted.
• Negligence—denied (economic loss doctrine did not apply).
• Maine Unfair Trade Practices—denied (FTC Act analogy).

On the implied contract issue, the court certified the following questions to the Maine Supreme Court:

1. In the absence of physical harm or economic loss or identity theft, do time and effort alone, spent in a reasonable effort to avoid or remediate reasonably foreseeable harm, constitute a cognizable injury for which damages may be recovered under Maine law of negligence and/or implied contract?

2. If the answer to question # 1 is yes under a negligence claim and no under an implied contract claim, can a plaintiff suing for negligence recover damages under Maine law for purely economic harm absent personal injury, physical harm to property, or misrepresentation?

In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 671 F. Supp. 2d 198, 201 (D. Me. 2009). The Maine Supreme Court answered no for both negligence and implied contract and thus did not answer question No. 2 directly. In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 4 A.3d 492 (Me. 2010). (“The tort of negligence does not compensate individuals for the typical annoyances or inconveniences that are a part of everyday life.” In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 4 A.3d at 496, citing W. Page Keeton et al., Prosser & Keeton on the Law of Torts § 30, at 165 (W. Page Keeton ed., 5th ed. 1984). Relative to the contract, “the time and effort expended by the plaintiffs here represent ‘the ordinary frustrations and inconveniences that everyone confronts in daily life.’” In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 4 A.3d at 497, quoting In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 613 F. Supp. 2d 108, 134 (D. Me. 2009).)

The First Circuit affirmed in large part but reversed the dismissal of “plaintiffs’ negligence and implied contract claims as to certain categories of alleged damages because plaintiffs’ reasonably foreseeable mitigation costs constitute a cognizable harm under Maine law.” Anderson v. Hannaford Bros. Co., 659 F.3d 151, 153 (1st Cir. 2011). (“Maine courts have weighed these considerations in the context of mitigation costs and determined that a plaintiff may ‘recover for costs and harms incurred during a reasonable effort to mitigate,’ regardless of whether the harm is nonphysical.” Anderson v. Hannaford Bros. Co., 659 F.3d at 162, quoting In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 4 A.3d 492, 496 (Me. 2010) (insurance and credit monitoring costs as mitigation costs).) The results left one plaintiff who could make out mitigation costs to seek class certification. The certification was denied. In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 293 F.R.D. 21 (D. Me. 2013) (finding for certification on the issues of numerosity, commonality, typicality, and adequacy under Rule 23(a)(4) and superiority of the class action procedure under Rule 23(b)(3), but finding that plaintiff’s proposal of an aggregate damage theory could not be accepted because there was no expert put forward, and thus, the Rule 23(b)(3) requirement that “questions of law or fact common to class members predominate over any questions affecting only individual members” was not met).

Similarly, in In re TJX Companies Retail Security Breach Litigation, 524 F. Supp. 2d 83, 86 (D. Mass. 2007), a case involving the malicious “compromise [of] the security of at least 45,700,000 customer credit and debit accounts,” the First Circuit, applying Massachusetts law, ruled as follows:

• Negligence—dismissal affirmed (economic loss doctrine applied).
• Breach of contract—dismissal affirmed (bank plaintiffs not parties to contract).
• Negligent misrepresentation—denial affirmed.
• Massachusetts unfair trade practices—denial affirmed.
• Conversion—denial of motion to amend affirmed (defendant did not appropriate).

Amerifirst Bank v. TJX Cos. (In re TJX Cos. Retail Sec. Breach Litig.), 564 F.3d 489, amended on rehearing, 2009 U.S. App. LEXIS 29672 (1st Cir. May 5, 2009) (class certification for the bank plaintiffs had been denied, In re TJX Cos. Retail Sec. Breach Litig., 246 F.R.D. 389 (D. Mass. 2007)).

The Massachusetts Supreme Judicial Court in Cumis Insurance Society, Inc. v. BJ’s Wholesale Club, Inc. similarly affirmed dismissal of the claims of the insurer and the assignee of issuing banks who lost millions of dollars to fraudulent credit card transactions tied to unauthorized access to magnetic stripe data from 9.2 million credit cards. Cumis Ins. Soc’y, Inc. v. BJ’s Wholesale Club, Inc., 455 Mass. 458 (2009) (affirming dismissal of third-party beneficiary claim, dismissal of negligence claim on economic loss doctrine, and summary judgment on fraud and negligent misrepresentation claims).

In a class action over the Circuit City compromise of 2.6 million credit card numbers (thrown out with the trash), Willey v. J.P. Morgan Chase, N.A., 2009 U.S. Dist. LEXIS 57826 (S.D.N.Y. July 7, 2009), the court ruled as follows on a motion to dismiss:

• Fair Credit Reporting Act (FCRA) claim for willful or negligent violation of 15 U.S.C. § 1681w—granted (inadequate facts alleged).
• Negligence and negligent violation of Gramm-Leach-Bliley—granted (preempted, speculative harm).
• Express and implied contract—granted (preempted, speculative harm).
• New York Deceptive Trade Practices Act—granted (preempted, speculative harm).
• Bailment—granted (preempted, speculative harm).

In a case of a stolen laptop with alleged “substantial risk” of identity theft, the court in Randolph v. ING Life Insurance and Annuity Co., 973 A.2d 702, 2009 D.C. App. LEXIS 231 (D.C. Ct. App. June 18, 2009) (see McLoughlin v. People’s United Bank, Inc., 2009 U.S. Dist. LEXIS 78065 (D. Conn. Aug. 31, 2009) (no ascertainable damage under Connecticut Unfair Trade Practices Act for stolen laptop, no negligence action for fear, no breach of fiduciary duty)), ruled as follows on a motion to dismiss:

• Negligence and gross negligence—granted (speculative harm).
• Breach of fiduciary duty—granted (speculative harm).
• Invasion of privacy—granted (not intentional).


A substantial hurdle for plaintiffs is proof of damages; speculative damages will not support a complaint. E.g., Pisciotta v. Old Nat’l Bancorp, 499 F.3d 629 (7th Cir. 2007) (also standing); Ruiz v. Gap, Inc., 622 F. Supp. 2d 908 (N.D. Cal. 2009) (lost laptop); Belle Chasse Auto. Care, Inc. v. Advanced Auto Parts, Inc., 2009 U.S. Dist. LEXIS 25084 (E.D. La. Mar. 24, 2009); Pinero v. Jackson Hewitt Tax Serv. Inc., 594 F. Supp. 2d 710 (E.D. La. 2009); Cherny v. Emigrant Bank, 604 F. Supp. 2d 605 (S.D.N.Y. 2009) (also economic loss rule); Ponder v. Pfizer, Inc., 522 F. Supp. 2d 793 (M.D. La. 2007).

A plaintiffs’ litigation strategy in class actions is to remain in state court, maneuvering around the Class Action Fairness Act of 2005 (CAFA), 28 U.S.C. § 1332(d), added by Pub. L. No. 109-2, § 4, 119 Stat. 4, 9. E.g., C.S. v. United Bank, Inc., 2009 U.S. Dist. LEXIS 23024 (S.D. W. Va. Mar. 20, 2009) (Gramm-Leach-Bliley Act did not preempt); see In re Hannaford Bros. Co. Customer Data Sec. Breach Litig., 613 F. Supp. 2d 108 (D. Me. 2009) (brought in federal court under CAFA; dismissal of all plaintiffs except the one with fraudulent charges not reimbursed, who could proceed only on implied contract, negligence, and the Maine Unfair Trade Practices Act); see also Patton v. Experian Data Corp., 2016 U.S. Dist. LEXIS 60590 (C.D. Cal. May 6, 2016) (lack of Article III standing and thus of subject matter jurisdiction over removed action, resulting in dismissal and remand to state court). However, the Supreme Court’s decision in American Express Co. v. Italian Colors Restaurant, 133 S. Ct. 2304 (2013), validating an arbitration clause with a waiver of class arbitration, has been applied to state cases, e.g., Feeney v. Dell Inc., 466 Mass. 1001 (2013), and provides a way for avoiding class actions through the use of such clauses in online terms.

In the multidistrict In re Sony Gaming Networks and Customer Data Security Breach Litigation, involving the laws of California, Florida, Massachusetts, Michigan, Missouri, New Hampshire, New York, Ohio, and Texas as applied to the 2011 breach of the gaming network (with personal identifying information including credit card numbers, expiration dates, and security codes) and alleged delay in notification and remediation, the negligence, warranty, implied warranty, and unjust enrichment claims were largely dismissed, along with damages claims based on certain states’ consumer protection (Baby FTC) acts, leaving the case to proceed mostly on requests for declaratory and injunctive relief under the Baby FTC acts and for misrepresentations (and potential restitution) under California’s Unfair Competition Law, False Advertising Law, and Consumer Legal Remedies Act. In re Sony Gaming Networks & Customer Data Sec. Breach Litig., 2014 U.S. Dist. LEXIS 7353 (S.D. Cal. Jan. 21, 2014).

Although the credit card–issuing banks (who bear the brunt of breach-based fraud) had been losing in their Hannaford breach case, as they had in the TJX and BJ’s Wholesale Club breach cases, they had a reversal of fortune in Lone Star National Bank N.A. v. Heartland Payment Systems, Inc., with the Fifth Circuit finding that the “economic loss rule” of Massachusetts and other states did not apply under New Jersey law to preclude the banks’ recovery. Lone Star Nat’l Bank N.A. v. Heartland Payment Sys., Inc., 729 F.3d 421 (5th Cir. 2013).

A consolidated class action complaint for the financial institution cases in In re The Home Depot, Inc. Customer Data Breach Litigation, Docket No. 104, M.D.L. No. 14-02583-TWT (N.D. Ga. May 27, 2015), regarding the compromise, from April through September 2014, of the personal information of 56 million Home Depot customers, demonstrates the continuing relevance of state law. Subclasses of the plaintiff financial institutions were asserted for Alaska, California, Connecticut, Florida, Illinois, Massachusetts, Minnesota, and Washington.

One area of class action litigation that remained active in 2015 was that against retailers asking for zip codes in point-of-sale (POS) credit card transactions (which do not have class action waivers at the POS). Zip codes are not required by credit card issuers or suppliers at the POS but are collected by retailers for their own marketing purposes (mailings). High court decisions applying personal information privacy protection laws for credit card transactions in California and Massachusetts have spurred class action suits against retailers who continue the practice.

Practice Note
The California Supreme Court held in Pineda v. Williams-Sonoma Stores, Inc., 246 P.3d 612 (Cal. 2011), that zip codes were “personal identification information,” defined in the Song-Beverly Credit Card Act of 1971 (Cal. [Civ.] Code § 1747 et seq.) as “information concerning the cardholder, other than information set forth on the credit card, and including, but not limited to, the cardholder’s address and telephone number” (Cal. [Civ.] Code § 1747.08(b)). Retailers may not “[r]equest or require as a condition to accepting the credit card as payment . . . the cardholder to provide personal identification information, which the person . . . accepting the credit card writes, causes to be written, or otherwise records upon the credit card transaction form or otherwise.” Cal. [Civ.] Code § 1747.08(a)(2). A private right of action is provided with triple damages and attorney fees. Cal. [Civ.] Code § 1748.70(d). The California Court of Appeal affirmed the denial of class certification where the plaintiff divulged her zip code after the credit card transaction had been completed. Harrold v. Levi Strauss & Co., 2015 Cal. App. LEXIS 427 (Cal. Ct. App. May 19, 2015).

Practice Note
In Tyler v. Michaels Stores, Inc., 464 Mass. 492 (2013), the Massachusetts Supreme Judicial Court held that G.L. c. 93, § 105(a),

[n]o person, firm, partnership, corporation or other business entity that accepts a credit card for a business transaction shall write, cause to be written or require that a credit card holder write personal identification information, not required by the credit card issuer, on the credit card transaction form. Personal identification information shall include, but shall not be limited to, a credit card holder’s address or telephone number[,]

applied to privacy more generally than the “identity fraud” of its title and that “credit card transaction form” applied to the electronic record created by the retailer.


The California Supreme Court, however, has held that the protections did not apply to credit card payments for electronic downloads. Apple Inc. v. Superior Court of Los Angeles County, 292 P.3d 883 (Cal. 2013). Online vendors may also rely on “privacy policies” and class action waivers in their online terms.

New theories of liability arose, such as “unjust enrichment” (or restitution) for the costs saved by data custodians that failed to implement reasonable security measures. Some of these theories survived early dismissals of other counts in complaints addressed to fraud or negligence (no recovery of economic damage). E.g., Enslin v. Coca-Cola Co., 136 F. Supp. 3d 164 (E.D. Pa. 2015) (former contractor sued for a putative 74,000-employee class over fifty-five laptops with personally identifiable information (PII) lost over a six-year period); In re Target Corp. Customer Data Sec. Breach Litig., 2014 U.S. Dist. LEXIS 175768 (D. Minn. Dec. 18, 2014); In re Premera Blue Cross Customer Data Sec. Breach Litig., 198 F. Supp. 3d 1183 (D. Or. 2016). These theories tended to fall after discovery, along with implied contract theories. E.g., Enslin v. Coca Cola Co., 2017 U.S. Dist. LEXIS 49920 (E.D. Pa. Mar. 31, 2017).

Although most consumer class actions for data breach involve state law (as surveyed in this chapter), with the exception of federal claims made, for example, under the FCRA (chapter 4) and the Gramm-Leach-Bliley Act (chapter 5), they tend to be brought in federal court to maximize the size of the class and to take advantage of established federal court class action and discovery powers and expertise. However, federal courts’ jurisdiction is limited by Article III of the Constitution, requiring the presence of a “case or controversy,” and much of the defense against data breach class action (or individual) suits has involved the jurisdictional issue of standing—largely based on the question of speculative future harm. In Beck v. McDonald, 848 F.3d 262, 273–74 (4th Cir. 2017), rejecting standing for claims of Veterans Administration violation of the federal Privacy Act and Administrative Procedure Act for a security breach, the Fourth Circuit identified and cited a split among the federal circuit courts of appeals on “whether a plaintiff may establish an Article III injury-in-fact based on an increased risk of future identity theft”:

The Sixth, Seventh, and Ninth Circuits have all recognized, at the pleading stage, that plaintiffs can establish an injury-in-fact based on this threatened injury. See Galaria v. Nationwide Mut. Ins. Co., No. 15-3386, 663 Fed. Appx. 384, 2016 U.S. App. LEXIS 16840, 2016 WL 4728027, at *3 (6th Cir. Sept. 12, 2016) (plaintiff-customers’ increased risk of future identity theft theory established injury-in-fact after hackers breached Nationwide Mutual Insurance Company’s computer network and stole their sensitive personal information, because “[t]here is no need for speculation where Plaintiffs allege that their data has already been stolen and is now in the hands of ill-intentioned criminals”); Remijas v. Neiman Marcus Grp., LLC, 794 F.3d 688, 692, 694–95 (7th Cir. 2015) (plaintiff-customers’ increased risk of future fraudulent charges and identity theft theory established “certainly impending” injury-in-fact and “substantial risk of harm” after hackers attacked Neiman Marcus with malware to steal credit card numbers, because “[p]resumably, the purpose of the hack is, sooner or later, to make fraudulent charges or assume those consumers’ identities”); Krottner v. Starbucks Corp., 628 F.3d 1139, 1142–43 (9th Cir. 2010) (plaintiff-employees’ increased risk of future identity theft theory a “credible threat of harm” for Article III purposes after theft of a laptop containing the unencrypted names, addresses, and social security numbers of 97,000 Starbucks employees); Pisciotta v. Old Nat’l Bancorp, 499 F.3d 629, 632–34 (7th Cir. 2007) (banking services applicants’ increased risk of harm theory satisfied Article III injury-in-fact requirement after “sophisticated, intentional and malicious” security breach of bank website compromised their information).

By contrast, the First and Third Circuits have rejected such allegations. See Katz v. Pershing, LLC, 672 F.3d 64, 80 (1st Cir. 2012) (brokerage account-holder’s increased risk of unauthorized access and identity theft theory insufficient to constitute “actual or impending injury” after defendant failed to properly maintain an electronic platform containing her account information, because plaintiff failed to “identify any incident in which her data has ever been accessed by an unauthorized person”); Reilly v. Ceridian Corp., 664 F.3d 38, 40, 44 (3d Cir. 2011) (plaintiff-employees’ increased risk of identity theft theory too hypothetical and speculative to establish “certainly impending” injury-in-fact after unknown hacker penetrated payroll system firewall, because it was “not known whether the hacker read, copied, or understood” the system’s information and no evidence suggested past or future misuse of employee data or that the “intrusion was intentional or malicious”).

Thus, despite the expectation (and even the perception) that standing requirements for claims based on data security breaches were resolved, substantial issues remained following the Supreme Court’s decision in Spokeo, Inc. v. Robins, ___ U.S. ___, 136 S. Ct. 1540, 164 L. Ed. 2d 635 (May 24, 2016), which left for case-by-case (and circuit-by-circuit) assessment of (or perhaps sympathy for) whether a particular threatened injury meets the standing requirement. At the base, in order to have Article III standing, a plaintiff must adequately establish:

(1) an injury in fact (i.e., a “concrete and particularized” invasion of a “legally protected interest”); (2) causation (i.e., a “‘fairly . . . trace[able]’” connection between the alleged injury in fact and the alleged conduct of the defendant); and (3) redressability (i.e., it is “‘likely’” and not “merely ‘speculative’” that the plaintiff’s injury will be remedied by the relief plaintiff seeks in bringing suit).

Sprint Communications Co. L.P. v. APCC Servs., Inc., 554 U.S. 269, 273–74 (2008), citing Lujan v. Defenders of Wildlife, 504 U.S. 555, 560–61 (1992).

Spokeo vacated the Ninth Circuit’s prior finding of standing in a claim under the FCRA for an inaccurate profile compiled and reported by the defendant, which provides personal information about individuals to a variety of users, including employers wanting to evaluate prospective employees, because the Ninth Circuit did not explain why the alleged injury was “concrete,” in the sense of “real,” rather than “abstract,” though not necessarily “tangible,” where a “risk of real harm” might in some cases meet the concreteness required. Spokeo, Inc. v. Robins, 136 S. Ct. at 1548–49.

In the context of this particular case, these general principles tell us two things: On the one hand, Congress plainly sought to curb the dissemination of false information by adopting procedures designed to decrease that risk. On the other hand, Robins cannot satisfy the demands of Article III by alleging a bare procedural violation. A violation of one of the FCRA’s procedural requirements may result in no harm. For example, even if a consumer reporting agency fails to provide the required notice to a user of the agency’s consumer information, that information regardless may be entirely accurate. In addition, not all inaccuracies cause harm or present any material risk of harm. An example that comes readily to mind is an incorrect zip code. It is difficult to imagine how the dissemination of an incorrect zip code, without more, could work any concrete harm.

Spokeo, Inc. v. Robins, 136 S. Ct. at 1550.

On remand, the Ninth Circuit maintained its finding of standing, answering affirmatively:

In evaluating Robins’s claim of harm, we thus ask: (1) whether the statutory provisions at issue were established to protect his concrete interests (as opposed to purely procedural rights), and if so, (2) whether the specific procedural violations alleged in this case actually harm, or present a material risk of harm to, such interests.

Robins v. Spokeo, Inc., 2017 U.S. App. LEXIS 15211, at *11 (9th Cir. Aug. 15, 2017). “Robins specifically alleged that Spokeo falsely reported that he is married with children, that he is in his 50s, that he is employed in a professional or technical field, that he has a graduate degree, and that his wealth level is higher than it is.” Robins v. Spokeo, Inc., 2017 U.S. App. LEXIS 15211, at *20. The Ninth Circuit rejected Spokeo’s argument that the “flattering” inaccuracies could only result in speculative, and not concrete, harm, stating, “the inaccuracies alleged in this case do not strike us as the sort of ‘mere technical violation[s]’ which are too insignificant to present a sincere risk of harm to the real-world interests that Congress chose to protect with FCRA.” Robins v. Spokeo, Inc., 2017 U.S. App. LEXIS 15211, at *20.

In contrast, demonstrating the continuing split between the circuits, the Fourth Circuit reversed an $11,747,510 class action judgment. The court found no standing under the FCRA where the plaintiff had his security clearance challenged because of, and had invested substantial time and effort in investigating, a posting of an account balance under the name of a dissolved creditor, because it was a “customer relationship” (undisclosed card servicer) problem that did not result in real harm, as the report was ultimately corrected and he received his security clearance. Dreher v. Experian Info. Solutions, Inc., 856 F.3d 337 (4th Cir. 2017).

The Eighth Circuit, while reversing as to one class representative personally, who alleged actual fraud on a credit card he had used at the defendants’ retail grocery stores, affirmed dismissal of the state law claims for breach of data security of the other proposed class members, where the compromised credit card information did not contain “any personally identifying information, such as social security numbers, birth dates, or driver’s license numbers” that the plaintiffs’ proffered Government Accountability Office report stated was needed to open new accounts. Alleruzzo v. SuperValu, Inc. (In re SuperValu, Inc., Customer Data Sec. Breach Litig.), 2017 U.S. App. LEXIS 16664, at *15–18 (8th Cir. May 10, 2017). The cost to mitigate risk was considered self-imposed harm not supporting standing. Alleruzzo v. SuperValu, Inc. (In re SuperValu, Inc., Customer Data Sec. Breach Litig.), 2017 U.S. App. LEXIS 16664, at *17–18.

In contrast, the District of Columbia Circuit reversed a lack-of-standing dismissal of state-law class action claims against a health insurer that allegedly failed to encrypt personal information, leaving it open to theft by a cyberattack. The court reasoned that identity theft is a clear harm and that, at the early stage of proceedings, allegations of a compromise of the database, likely leading to such real harm, were adequate to confer standing. Attias v. CareFirst, Inc., 865 F.3d 620 (D.C. Cir. 2017).

In keeping with the liberal recognition of standing by the Ninth Circuit and the extensive statutory and common law of California protecting consumers and the integrity of electronic commerce, the court in In re Yahoo! Inc. Customer Data Security Breach Litigation, 2017 U.S. Dist. LEXIS 140212, at *76 (N.D. Cal. Aug. 30, 2017), a case involving a history of breaches, including a 2013 breach exposing 1 billion accounts with logins, dates of birth, e-mail recovery codes, hashed passwords, and zip codes that was not disclosed to the public until 2016, found that

all Plaintiffs have adequately alleged an injury in fact sufficient for Article III standing because all Plaintiffs have alleged a risk of future identity theft, in addition to loss of value of their PII. Moreover, some Plaintiffs, although not all Plaintiffs, have adequately alleged additional injuries in fact in the form of (1) harm from the actual misuse of their PII; (2) out-of-pocket mitigation expenses; and (3) lost benefit of the bargain.


Ruling on the consolidated class action complaint, the court sided with plaintiffs from the different suits as to particular claims. On the claim under the California Unfair Competition Law, Cal. [Bus. & Prof.] Code § 17200 (the “UCL,” reaching “any unlawful, unfair or fraudulent business act or practice and unfair, deceptive, untrue or misleading advertising” and supporting enhanced damages), the court
– dismissed the UCL claims as to nonbusiness plaintiffs;
– denied dismissal of the unlawful and unfair prongs of the business plaintiffs’ UCL claim;
– dismissed the fraudulent prong of plaintiffs’ UCL claim to the extent it was based on fraudulent misrepresentations and fraudulent omissions in the defendants’ privacy policy, with leave to amend;
– denied dismissal of the fraudulent prong of small business plaintiff Neff’s UCL claim based on fraudulent omissions in the defendants’ small business services advertisements;
– denied dismissal of plaintiffs’ request for restitution as to a small business plaintiff but dismissed the request for restitution of the United States plaintiffs; and
– denied dismissal of plaintiffs’ request for an injunction under the UCL.
The court also
• dismissed plaintiffs’ claim for fraudulent inducement, with leave to amend;
• dismissed plaintiffs’ claim for negligent misrepresentation, with leave to amend;
• dismissed plaintiffs’ CLRA, Cal. [Civ.] Code § 1770(a), claim, with leave to amend;
• dismissed with prejudice the California Customer Records Act (CRA), Cal. [Civ.] Code § 1798.80, et seq., claim of plaintiffs who were not California residents (as to the remaining plaintiffs, the court dismissed, with leave to amend (to allege Yahoo!’s knowledge), the CRA claim to the extent that the claim is based on the 2013 breach, and denied dismissal of the CRA claim to the extent that the claim is based on later breaches);
• dismissed plaintiffs’ Stored Communications Act, 18 U.S.C. § 2702, claim, with leave to amend with allegations of “knowingly divulged”;
• dismissed with prejudice plaintiffs’ California Online Privacy Protection Act, Cal. [Bus. & Prof.] Code § 22575, et seq., claim, which has no private right of action;
• dismissed plaintiffs’ express contract claim to the extent that the claim seeks to recover out-of-pocket mitigation costs, with leave to amend (to deal with disclaimers and limitations on liability), and otherwise denied dismissal of plaintiffs’ express contract claim;


• dismissed plaintiffs’ implied contract claim to the extent that the claim seeks to recover out-of-pocket mitigation costs, with leave to amend, and otherwise denied dismissal of plaintiffs’ implied contract claim;
• denied dismissal of plaintiffs’ claim for violation of the implied covenant of good faith and fair dealing;
• dismissed with prejudice plaintiffs’ negligence claim (asserted by foreign plaintiffs); and
• dismissed plaintiffs’ claim for declaratory relief, with leave to amend (with facts on unconscionability of terms).

Thus, state privacy law, fragmented as it is, remains an important recourse for consumers, but a thicket for personal information custodians.


CHAPTER 10

Massachusetts Data Security Law and Regulations*

Andrea C. Kramer, Esq.
Kramer Frohlich LLC, Boston

C. Max Perlman, Esq.
Hirsch Roberts Weinstein LLP, Boston

§ 10.1	Overview ............................................................................... 10-2
§ 10.2	The Breach Notice Law ....................................................... 10-3
	§ 10.2.1	Personal Information ........................................... 10-3
	§ 10.2.2	Covered Entities .................................................. 10-4
	§ 10.2.3	Triggers for Notice .............................................. 10-4
	§ 10.2.4	Timing of Notice ................................................. 10-5
	§ 10.2.5	Content of the Notice .......................................... 10-6
		(a) Licensors and Owners .................................. 10-6
		(b) Maintainers/Storers ...................................... 10-7
	§ 10.2.6	Penalties and Enforcement .................................. 10-8
	§ 10.2.7	Compliance with Federal Law ............................ 10-8
	§ 10.2.8	Court Cases .......................................................... 10-8
	§ 10.2.9	Practical Aspects of Responding to a Breach ...... 10-9
		(a) Secure and Preserve ................................... 10-10
		(b) Investigate .................................................. 10-10
		(c) Prepare Required Notices........................... 10-10
		(d) Develop a Public Relations Approach ........ 10-11
		(e) Evaluate Relevant Contracts ...................... 10-11
		(f) Consider Offering Credit Monitoring......... 10-11
		(g) Conduct a Postincident Review .................. 10-11
§ 10.3	The Data Destruction/Disposal Law................................. 10-11
	§ 10.3.1	Basic Requirements ........................................... 10-11
	§ 10.3.2	Penalties and Enforcement ................................ 10-12
	§ 10.3.3	Practice Tips ...................................................... 10-12
§ 10.4	The Data Security Regulations ......................................... 10-12
	§ 10.4.1	Major Requirements .......................................... 10-13
	§ 10.4.2	The WISP .......................................................... 10-13
		(a) Data Security Coordinator.......................... 10-13
		(b) Risk Assessment and Improvement of Safeguards ..... 10-14
		(c) Employee Policies, Training, and Discipline ............. 10-16
		(d) Terminated Employees ............................... 10-16
		(e) Third-Party Service Providers .................... 10-17
		(f) Overall Restrictions .................................... 10-18
	§ 10.4.3	Computer System Security Requirements ......... 10-18
		(a) “Secure User Authentication Protocols” .... 10-18
		(b) “Secure Access Control Measures” ............ 10-18
		(c) Encryption .................................................. 10-19
		(d) Monitoring ................................................. 10-19
		(e) Security Protections ................................... 10-19
	§ 10.4.4	Practice Tips ...................................................... 10-19
		(a) The Assessment .......................................... 10-19
		(b) Create a Data Security Binder .................... 10-19

EXHIBIT 10A—Frequently Asked Questions Regarding 201 C.M.R. § 17.00 ..... 10-21
EXHIBIT 10B—A Small Business Guide: Formulating a Comprehensive Written Information Security Program ..... 10-26
EXHIBIT 10C—Standards for the Protection of Personal Information of Residents of the Commonwealth—201 C.M.R. § 17.00 ..... 10-31
EXHIBIT 10D—OCABR 201 C.M.R. § 17.00 Compliance Checklist ..... 10-36
EXHIBIT 10E—Personal Information: A Graphical Representation ..... 10-39
EXHIBIT 10F—Action Plan for Complying with Massachusetts Data Security Regulations ..... 10-40
EXHIBIT 10G—Twelve Practical Tips for Employers ..... 10-43
EXHIBIT 10H—Helpful Websites ..... 10-44

* Updated for the 2017 Edition by Andrea C. Kramer, Esq.

Scope Note
This chapter addresses state statutory, regulatory, and case law in Massachusetts concerning data security. The breach notice law and the statute governing data disposal are discussed, and the details of the Massachusetts data security regulations, in particular their WISP requirement, are covered.

§ 10.1 OVERVIEW

In the wake of the 2007 TJX data breach and the rash of identity theft that ensued, Massachusetts enacted a strong omnibus data security law, entitled “An Act relative to security freezes and notification of data breaches,” 2007 Mass. Acts c. 82, § 16, eff. Oct. 31, 2007. Among other things, this law created two important provisions: Chapter 93H, which requires notification of unauthorized acquisition or use of Massachusetts residents’ “personal information” to the attorney general, the Office of Consumer Affairs and Business Regulation (OCABR), and the residents affected;


and Chapter 93I, which prescribes the manner in which such personal information must be discarded or destroyed.

Soon thereafter, the OCABR, using the authority granted under Chapter 93H, promulgated comprehensive, and perhaps the strictest in the nation, regulations to further protect residents’ personal information. 201 C.M.R. § 17.00 et seq. These regulations, which went into effect on March 1, 2010, require that businesses develop, implement, and maintain a comprehensive written information security program (WISP) and comply with strict requirements for safeguarding and disposing of personal information.

At their essence, the law and regulations require that businesses meet the following four basic requirements:
• assessment of files and systems to identify personal information that the business has or controls, current means of protecting such personal information, and vulnerabilities;
• adoption of a comprehensive written program to protect personal information that implements physical, administrative, and technological means of protection and includes the appointment of a data security coordinator to run the program, as well as adoption of personnel policies to implement the program;
• destruction of personal information in specific manners that ensure it cannot be read or reconstructed; and
• reporting of breaches of security, unauthorized use, or unauthorized acquisition of personal information through formal notice to the attorney general, the OCABR, and the affected individuals “as soon as practicable and without unreasonable delay.”

§ 10.2 THE BREACH NOTICE LAW

The Massachusetts breach notice law, Chapter 93H, which became effective October 31, 2007, requires that any person or agency that knows of a “breach of security” involving personal information of a Massachusetts resident give certain specific notice to the attorney general, the OCABR, and the affected individuals.

§ 10.2.1 Personal Information

For purposes of this statute, personal information is defined as a combination of
• a Massachusetts resident’s first name or initial and last name plus
• any of the following numbers:
– Social Security number;
– driver’s license number;
– state-issued identification card (including state-school student identification) number; and


– financial account number or credit card number—with or without any required security code, access code, personal identification number (PIN), or password that would permit access to the account.

Practice Note
According to the OCABR’s Frequently Asked Questions regarding 201 C.M.R. § 17.00, a financial account is an account as to which, if unauthorized access is gained, an increase of financial burden or a misappropriation of monies, credit, or other assets could result. Examples include checking, savings, mutual fund, annuity, investment, credit, and debit accounts.

G.L. c. 93H, § 1(a); 201 C.M.R. § 17.02. As discussed below, for purposes of Chapter 93I (the destruction statute), biometric identifiers are also included in the definition of personal information, but they are not included in the breach notice law. The definition expressly excludes information lawfully obtained from publicly available sources or from federal, state, or local government records lawfully made available to the general public. G.L. c. 93H, § 1(a); 201 C.M.R. § 17.02.

Obviously, all businesses that process credit card receipts and checks handle protected personal information. Such businesses include nonprofit companies that process donations made by credit card or check. In addition, based on this definition, every employer of a Massachusetts resident has personal information of its employees, including on tax documentation, immigration forms, direct deposit paperwork, and other similar documentation.

§ 10.2.2 Covered Entities

The statute applies to individuals, businesses (including corporations, associations, partnerships, and other legal entities), and government agencies that license or own or store or maintain personal information (covered entities). G.L. c. 93H, §§ 1(a), 3.

Practice Note
Government agencies include all executive offices, departments, boards, commissions, bureaus, divisions, authorities, and agencies, as well as any branches and political subdivisions, of the Commonwealth of Massachusetts. G.L. c. 93H, § 1.

§ 10.2.3 Triggers for Notice

The requirement of providing notice has two distinct triggers. The first trigger occurs when a covered entity knows or has reason to know that personal information was acquired or used by an “unauthorized person” or was used for an “unauthorized purpose.” G.L. c. 93H, § 3. However, a good faith but unauthorized acquisition of personal information for lawful purposes is not a “breach of security” unless the personal information is used in an unauthorized manner or subject to


further unauthorized disclosure. This trigger is, by definition, linked to unauthorized acquisition or use of personal information. Obvious examples include theft of credit card information, checks, or employee files containing Social Security numbers. Less-obvious examples include the taking or retention of personal information by a former employee whose use of that information previously was authorized but is no longer authorized, and the acquisition of such information by an employee who is not authorized to have access to it even though the employer, in general, is authorized to have the information, such as by an e-mail message sent to the wrong e-mail address. Importantly, unlike the second trigger, discussed below, this trigger automatically constitutes a breach even without a risk analysis. As such, even if a covered entity believes there is no risk of identity theft or fraud, the unauthorized use or acquisition must be reported and is subject to the notice requirement. This trigger is a simple bright-line rule—it is based on the breach itself, not the likelihood of harm or misuse.

The second trigger is a “breach of security,” which is defined as the unauthorized acquisition or use of unencrypted data that creates a substantial risk of identity theft or fraud against a Massachusetts resident, or of encrypted data plus a key that is capable of compromising the security of the data. G.L. c. 93H, § 3. This trigger is more expansive and requires a risk analysis to determine whether a breach occurred. Importantly, this trigger does not require that personal information be directly involved; the mere taking of data that creates a risk of identity theft or fraud is what matters. “Data” includes “any material upon which written, drawn, spoken, visual, or electromagnetic information or images are recorded or preserved, regardless of physical form or characteristics.” G.L. c. 93H, § 3. An obvious example is the theft or loss of an unencrypted computer that contains financial information, birth dates, or addresses of clients or customers.

As is evident in the definition of this second trigger, the risk analysis differs depending on whether the data is encrypted or not. Data kept on paper is, obviously, always in the unencrypted category. With regard to electronically stored data, the statute provides that, to be “encrypted,” data must be transformed “through the use of a 128-bit or higher algorithmic process into a form in which there is a low probability of assigning meaning without use of a confidential process or key,” subject to any further regulations issued by the OCABR. To date, the OCABR has not issued further regulations regarding encryption.
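By way of illustration only, the following sketch shows one common way to meet the statute’s “128-bit or higher algorithmic process” threshold. It uses Python’s widely used third-party “cryptography” package, whose Fernet construction encrypts with 128-bit AES and authenticates with HMAC-SHA256; the choice of library and the sample record are assumptions for illustration, not a standard endorsed by the statute or the OCABR.

    # Illustrative only: encryption satisfying the statute's
    # "128-bit or higher algorithmic process" definition.
    # Requires the third-party package: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # the confidential "key" the statute contemplates
    cipher = Fernet(key)             # Fernet = 128-bit AES-CBC plus HMAC-SHA256

    record = b"Jane Q. Public, acct no. 4111111111111111"   # hypothetical record
    token = cipher.encrypt(record)   # ciphertext: "low probability of assigning meaning"
    assert cipher.decrypt(token) == record

    # Losing the ciphertext alone would not expose readable personal
    # information; losing the ciphertext together with the key is the
    # "encrypted data plus a key" scenario the second trigger describes.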

§ 10.2.4 Timing of Notice

The statute requires that the notice be provided “as soon as practicable and without unreasonable delay,” though delay is permitted where law enforcement determines notification would hinder a criminal investigation, provided that the law enforcement agency notifies the attorney general of that determination. G.L. c. 93H, §§ 3(a), 3(b), 4. What “as soon as practicable and without unreasonable delay” means in practical terms is unclear and requires judgment—and, as a best practice, some preliminary interaction with the attorney general’s office. As a practical and realistic matter, before reporting, the covered entity must do two critical things: cut off the effects of the breach to the extent possible and determine the extent of the breach. It usually takes at least a few days to accomplish both, and sometimes it can take longer. During this period, the covered entity needs to weigh the risk that identity theft is occurring


during the delay in reporting against the risk that the reporting will be overstated and cause unnecessary concern and alarm. For example, if a business learns that its system has been hacked and personal information has been compromised, but it cannot quickly determine whether the breach affects ten or 500 individuals, it is advisable to first cut off any possibility of further hacking by securing the network and then immediately investigate to determine the scope of the breach. While the investigation is ongoing, the covered entity or its counsel might want to reach out to the Attorney General’s Office and the OCABR to disclose the fact that the breach occurred and that an investigation is underway. Once the investigation is over, as long as it can be completed promptly, the formal breach notice should issue.

§ 10.2.5 Content of the Notice

There are different requirements for the content of the notice and the recipients of the notice, depending on whether the covered entity is one that licenses or owns personal information or one that stores or maintains personal information.

(a) Licensors and Owners

Licensors and owners of personal information, which would include most employers and businesses that accept checks or credit cards, must give notice to the attorney general and the director of the OCABR, as well as the affected Massachusetts residents, consumer reporting agencies, and state agencies identified by the director of the OCABR. G.L. c. 93H, § 3(b).

The notice to the attorney general and the director of the OCABR must include the following:
• the nature of the breach of security or unauthorized acquisition or use;
• the number of Massachusetts residents affected by the incident at the time of notification; and
• any steps the person or agency has taken or plans to take relating to the incident.

G.L. c. 93H, § 3(b).

The notice to the affected Massachusetts residents must include the following four pieces of information:
• information on the right to obtain a police report;
• information on how to request a security freeze, which is set forth in G.L. c. 93, §§ 56 and 62A;
• the information the affected individual will need to provide when requesting the security freeze; and
• information regarding any fees required in placing, lifting, or removing a security freeze.

G.L. c. 93H, § 3(b).


Practice Note
As explained by the OCABR, a security freeze prohibits a credit reporting agency from releasing any information from a consumer’s credit report without written authorization. Victims of identity theft who provide a reporting agency with a valid police report cannot be charged to place, lift, or remove a security freeze. In all other cases, a credit reporting agency may charge up to $5 each to place, lift, or remove a security freeze. To place a security freeze, a consumer must send a written request to each of the credit bureaus by regular, certified, or overnight mail, and include the name, address, date of birth, Social Security number, and credit card number and expiration date for payment, if applicable. Links to the different credit bureaus are listed by the OCABR at http://www.mass.gov/ocabr/data-privacy-and-security/identity-theft/identity-theft-law-advisory.html. The credit bureaus have three business days after receiving a request to place a security freeze.

Significantly, the notice to the residents cannot include the nature of the breach or unauthorized acquisition or use, or the number of Massachusetts residents affected by the breach or unauthorized access or use. G.L. c. 93H, § 3(b). Common mistakes in the notice include making the notice too general, confusing a security freeze with a fraud alert, referring to websites rather than providing information in the letter itself, and providing a range of fees relating to the security freeze when in fact the amount is set by statute.

The notice to the affected individuals must be individual notice unless the actual cost of providing it would exceed $250,000, the number of persons to be notified is greater than 500,000, or the covered entity does not have sufficient contact information to provide individual notice. G.L. c. 93H, § 1(a). If any of these conditions is met, “substitute notice” can be used. Substitute notice must include e-mail notice to any affected individuals for whom the covered entity has an e-mail address, clear and conspicuous posting of the notice on the website of the covered entity if it maintains one, and publication or broadcast through statewide media. G.L. c. 93H, § 1(a).

(b) Maintainers/Storers

Maintainers or storers of personal information, which would include data storage companies, payroll companies, and other vendors to whom companies send their employees’ or customers’ personal information, must give notice to the owner or the licensor of the personal information. They are not required to give notice to the affected Massachusetts residents. They must, however, cooperate with the owner or licensor of the data, including informing the owner or licensor of the breach of security or unauthorized acquisition or use, the date and nature of the incident, and any steps the maintainer or storer has taken or plans to take relating to the incident. Importantly, cooperation does not require disclosure of confidential business information or trade secrets.


§ 10.2.6 Penalties and Enforcement

Chapter 93H provides that violations are considered unfair business practices under Chapter 93A, potentially subjecting violators to injunction, restitution, civil penalties of up to $5,000 for each violation, costs of investigation, attorney fees, and multiple damages.

§ 10.2.7 Compliance with Federal Law

A covered entity that maintains procedures for responding to a security breach that comply with federal laws, rules, regulations, guidance, or guidelines will be deemed to be in compliance if it provides notice in compliance with those procedures, but it must still notify the attorney general and the director of the OCABR of the breach as soon as practicable and without unreasonable delay. In such a case, the notice to the attorney general and the OCABR must include the steps taken or planned to be taken relating to the breach pursuant to the applicable federal law, rule, regulation, guidance, or guideline.

§ 10.2.8 Court Cases

To date, Chapter 93H has not been the subject of much litigation. Only three cases have specifically addressed the statute: one concerning whether Chapter 93H creates a private right of action; one concerning that issue and what damages are required for standing to sue under the statute; and one concerning whether Chapter 93H preempts common law negligence claims and the requirements for pleading a cause of action under Chapter 93A for a violation of Chapter 93H.

In Aminpour v. Arbella Mutual Ins. Co., Middlesex Superior Court, C.A. 1181-CV03036, the Superior Court held that Chapter 93H did not create a private right of action. See Aminpour v. Arbella Mut. Ins. Co., 89 Mass. App. Ct. 1136 (2016) (unpublished decision) (noting lower court decision but not addressing it because the parties did not raise the issue on appeal).

In Katz v. Pershing, LLC, 806 F. Supp. 2d 452 (D. Mass. 2011), the plaintiff brought an action for violation of Chapter 93H, claiming that she had not been notified of extant but unidentified security breaches and that the defendant had failed to conform to various encryption protocols. The plaintiff claimed that these shortcomings required her to purchase identity theft insurance and exposed her nonpublic personal information to possible misappropriation. The District Court judge held that Chapter 93H did not permit a private right of action and found that the plaintiff therefore lacked standing to bring her claim. On appeal, the First Circuit held that whether Chapter 93H provides for a private right of action is “a matter best left to the Massachusetts courts” and declined to decide the issue of standing under Chapter 93H. Katz v. Pershing, LLC, 672 F.3d 64, 75 (1st Cir. 2012). The First Circuit nevertheless found that the plaintiff lacked standing, given that she had not alleged that her nonpublic personal information had actually been accessed by any unauthorized person, and that her cause of action rested entirely on the hypothesis that at some point an unauthorized, as-yet unidentified third party might access her data and then attempt to purloin her identity. Katz v. Pershing, LLC, 672 F.3d at 80.


In 2013, a Massachusetts Superior Court judge held that Chapter 93H does not preempt a common law negligence claim. Adams v. Congress Auto Ins. Agency, Inc., 31 Mass. L. Rptr. 473 (Super. Ct. 2013). In that case, the plaintiff brought a claim against the defendant insurance agency for negligently failing to employ safeguards designed to prevent an employee from misusing confidential personal information. The defendant argued that the negligence claim should be dismissed because Chapter 93H can be enforced only by the attorney general. The court disagreed, holding that “93H does not occupy the field” and that “[t]here is no inconsistency between the statutory scheme laid out in Chapter 93H and a common law remedy for negligence that results in the disclosure of a person’s identifying information, particularly as here where the plaintiff can allege actual harm from the disclosure.” Adams v. Congress Auto Ins. Agency, Inc., 31 Mass. L. Rptr. at 4.

In a later decision on the plaintiff’s motion to amend his Chapter 93A claim to reinstate it after it was initially dismissed, the Superior Court held that the plaintiff continued to fail to state a claim for violation of Chapter 93A. The plaintiff had alleged that the defendant “fail[ed] to meet the Commonwealth’s standards regarding the protection of confidential personal information,” but failed to allege how, if at all, Chapter 93H or any regulation applied, or that the defendant breached any specific “standard[ ] regarding the protection of confidential personal information” applicable in the Commonwealth. Adams v. Cong. Auto Ins. Agency, Inc., 32 Mass. L. Rptr. 372 (Mass. Super. Oct. 8, 2014), aff’d in part, vacated in part, 90 Mass. App. Ct. 761 (2016) (affirming Superior Court’s decision that plaintiff’s Chapter 93A claim, which was based on alleged violations of Chapter 93H, was factually insufficient, and therefore did not state a claim, because it failed to allege that anyone accessed “personal information,” as that term is specially defined in Chapter 93H, and did not identify the required safeguards and procedures that the agency failed to employ).

Note that an employee’s taking of personal information protected under this statute may have implications in trade secret cases, particularly at the preliminary injunction stage. Specifically, a former employee’s taking of personal information may be used as part of the basis for obtaining a preliminary injunction. There are not yet any reported cases that specifically address this issue, but in Unum Group v. Loftus, 220 F. Supp. 3d 143, 148 (D. Mass. 2016), the former employer raised the employee’s taking of personal information as a basis for the preliminary injunction. Though the court did not address the Chapter 93H implications of the employee’s taking of the employer’s information, it did consider, in deciding whether to issue a preliminary injunction, that information protected by the Health Insurance Portability and Accountability Act (HIPAA), the security provisions of which are analogous in many ways to Chapter 93H, was affected.

§ 10.2.9 Practical Aspects of Responding to a Breach

Chapter 93H specifically addresses the notices that must be provided in case of a security breach, but notice is not the only thing a covered entity must do to respond adequately or properly to a breach.


Though perhaps obvious, the first thing the covered entity must do is recognize that there has been a breach of security. Recognizing a breach is not difficult when a file containing credit card numbers has been taken or accessed improperly or when a point-of-sale breach has been identified. Also not difficult to recognize are the more typical breaches, such as a stolen or lost laptop, flash drive, or other portable medium; unauthorized activity on a computer network; or missing, lost, or stolen paper files. A bit harder to recognize is that it may be a breach when personal information is sent through the U.S. mail and does not arrive at its destination. Likewise, removal or retention of documents or electronic files by a departing employee may be a breach, even though the employee was authorized to have those documents when still employed. Similarly, the loss of payment records or receipts of checks may be a breach.

Once it is determined that there is a breach, the next steps are as follows:

(a) Secure and Preserve

The first thing to do after recognizing that there has been a breach is to stop and minimize the damage. Doing so involves ensuring that the breach is “closed,” to the extent possible, and preserving whatever evidence or information may be needed to address the breach. For example, if the security breach occurs in a computer network, among the first things to do is to take the network offline, make an image of the server and relevant hard drives, and change passwords and restrict access. If the breach involves paper records, identify and secure paper files containing personal information, change keys or access on doors and file cabinets, and restrict access to the extent possible. If the breach is as simple as a lost mobile device, cut off access remotely.

(b) Investigate

Investigations can take many forms, depending on the type and scope of the breach. A theft of a computer or mobile device might involve hiring a private investigator. A breach of a computer network may involve a computer forensic analysis. A loss of paper documents may involve interviews with people who used or had access to the documents. Overall, the investigation should seek to answer the following questions:
• What was lost or stolen?
• How was it lost or stolen?
• Who lost or stole it?
• Where is the information or computer now?
• What security measures were used?
• Has any information been misappropriated to date?

(c) Prepare Required Notices

The content of the notices is discussed above in § 10.2.5. In addition, if the breach involved information about residents of other states or countries, you must ascertain the notice and reporting requirements of those jurisdictions.


(d) Develop a Public Relations Approach

Regardless of the size or scope of the breach, there is also an “audience,” specifically the affected individuals, be they customers or other employees. Sometimes there is a larger audience, namely the public at large. And, always, there is the audience of the attorney general and the OCABR. As such, it is important to develop the covered entity’s message regarding the breach. In many situations, it is also worth considering whether to retain a crisis management and/or public relations professional. For large breaches, it is worth considering whether to establish a hotline for affected individuals.

(e) Evaluate Relevant Contracts

Among the important contracts to consider are the covered entity’s insurance policies. The entity may have insurance coverage for the breach, in which case immediate notice to the insurance company may be warranted. If the breach was caused by the actions of a third-party vendor, it will be important to review the contract with the vendor to assess whether it contains an indemnification provision. Conversely, if the covered entity is itself a third-party vendor, it must determine its contractual obligations to report the breach, to return information, and to indemnify for the breach.

(f) Consider Offering Credit Monitoring

Offering credit monitoring is not required by the statute, but it may be a low-cost way to limit future damages (if an affected individual’s identity is stolen and misused) and to generate good will, particularly among employees and customers. The cost is approximately $50 to $100 per individual.

(g) Conduct a Postincident Review

Identify any administrative, physical, and technological protections that could have prevented, mitigated, or signaled the breach. Adopt changes to your data security program that address the gaps identified. Consider disciplinary measures for responsible employees.

§ 10.3 THE DATA DESTRUCTION/DISPOSAL LAW

§ 10.3.1 Basic Requirements

Chapter 93I, the data destruction/disposal law, which became effective February 3, 2008, sets forth requirements and specifications for how personal information must be disposed of or destroyed. The definition of personal information is the same as for Chapter 93H, discussed above in § 10.2.1, except that it also includes biometric data, such as a fingerprint, a retina or iris image, or another unique physical representation or digital representation of biometric data. Counties, cities, and towns are required to comply with the data destruction/disposal law even though they are not subject to the breach notice law.


The minimum requirement for data destruction under this statute is that all persons, businesses, and agencies must destroy records containing personal information such that the data cannot practicably be read or reconstructed after disposal or destruction. The statute specifies that paper records must be redacted, burned, pulverized, or shredded, while electronic and other nonpaper records must be destroyed or erased.

In addition, third-party disposal service providers must implement and monitor compliance with policies and procedures that prohibit unauthorized access to, acquisition of, or use of personal information during collection, transportation, and disposal.

§ 10.3.2 Penalties and Enforcement

Violations can lead to civil fines of up to $100 per data subject affected, capped at $50,000 for each instance of improper disposal. In addition, the statute explicitly provides that violators may also be liable under Chapter 93A, potentially subjecting noncompliant persons to injunction, restitution, civil penalties, and liability for the cost of the investigation of the violation, including attorney fees.

§ 10.3.3 Practice Tips

Organizations must be careful when discarding, returning, or reappropriating (e.g., giving to another person) any electronic device. For example, before old computers are sent to charitable organizations, sold, or assigned to another user, the devices should be wiped. Just deleting the files on a device does not suffice; information can be recaptured from files that are merely deleted. It is imperative to wipe the computer, preferably with a tool that meets Department of Defense protocols.

Perhaps unexpectedly, copy machines contain hard drives, which may retain personal information when the machines are returned to the lessor at the end of a lease. The storage capabilities of copy machines should be disabled, or the organization should ensure that the hard drive is wiped before relinquishing the machine at the end of the lease.

Organizations should provide secure disposal and shredding facilities for paper records containing personal information. Personal information should not be deposited in the ordinary trash but, instead, should be immediately shredded or stored in locked trash bins until it can be shredded.
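As a rough illustration of why mere deletion does not suffice, the following Python sketch overwrites a file’s contents with random bytes before unlinking it. It is a minimal example, not a certified sanitization tool: the function name and the three-pass default are the sketch’s own assumptions, and on solid-state drives and journaling file systems overwriting in place does not guarantee destruction, so full-disk wiping or physical destruction may still be required.

    import os
    import secrets

    def wipe_file(path: str, passes: int = 3) -> None:
        # Overwrite the file in place with random bytes, then delete it.
        # Illustrative only; not a substitute for DoD-grade sanitization.
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                remaining = size
                while remaining > 0:
                    chunk = min(remaining, 1 << 20)    # write in 1 MiB chunks
                    f.write(secrets.token_bytes(chunk))
                    remaining -= chunk
                f.flush()
                os.fsync(f.fileno())                   # force the bytes to disk
        os.remove(path)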

§ 10.4 THE DATA SECURITY REGULATIONS

The stated purpose of the Data Security Regulations [hereinafter “regulations”], 201 C.M.R. § 17.00, is to implement the provisions of the data breach notice statute, G.L. c. 93H, relative to the standards to be met by covered entities that own or license personal information. (To “own or license” personal information means to receive, maintain, process, or have access to personal information in connection with the provision of goods or services or employment.) Essentially, the regulations establish minimum standards to be met in connection with the safeguarding of personal information. Importantly, though the regulations seemingly apply to all people as well as


to businesses, according to the Frequently Asked Questions regarding 201 C.M.R. § 17.00 (FAQ) issued by the OCABR (see Exhibit 10A), the regulations apply to “those engaged in commerce . . . those who collect and retain personal information in connection with the provision of goods and services or for the purposes of employment.” The FAQ specifically states that the regulations do not apply to natural persons who are not in commerce. Notably, the regulations also do not apply to municipalities.

§ 10.4.1 Major Requirements

The first major requirement of the regulations is that every entity that licenses or owns personal information (for purposes of the regulations, “personal information” is defined the same way as in Chapter 93H; it does not include biometric data, as Chapter 93I does) must develop, implement, maintain, and monitor a comprehensive written information security program (WISP) that contains administrative, technical, and physical safeguards to ensure the security and confidentiality of records containing personal information. The WISP must be in writing, though it does not have to be in one document. Note that the WISP is not the same as an employee policy—and the best practice is to have two different documents. The WISP is a comprehensive program, one that extends beyond what individual employees are obligated to do or should even be involved in doing. See Exhibit 10B for a guide to formulating a WISP provided by the OCABR.

Significantly, the regulations do not adopt a one-size-fits-all approach. To the contrary, they represent a risk-based approach to information and data security that requires businesses to consider the specifics of the data they have, the risks that exist, and the resources they have available to ensure security. Compliance will depend on the size, scope, and type of business, the amount of resources available to the business, the amount of data stored, and the need for security and confidentiality of both consumer and employee information. 201 C.M.R. § 17.03.

§ 10.4.2 The WISP

The regulations list over a dozen specific requirements that the WISP must include (see Exhibit 10C). 201 C.M.R. § 17.03(2). These requirements—grouped thematically with commentary and practice suggestions—are discussed below.

(a) Data Security Coordinator

The first thing required in putting together a WISP is designating at least one employee to maintain it: a data security coordinator (or similar title). This employee can be an office manager, a chief operating officer, or anyone who can properly administer all aspects of the WISP. Because the WISP includes more than just IT or personnel issues, it is generally not preferable to make the head of IT or human resources the sole administrator. A best practice is to create a data security team that includes the data security coordinator and someone from operations, human resources, information technology, accounting/billing, sales, and legal, to the extent applicable.


The duties of the data security coordinator should include the following:
• initial implementation of the WISP;
• training all employees about the policy and the WISP;
• regular assessment and testing of the WISP’s safeguards and compliance with the policy;
• taking reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect personal information consistent with Massachusetts and any applicable federal law, and requiring through contract that all third-party service providers implement and maintain appropriate security measures for personal information;
• reviewing the scope of the security measures in the WISP at least annually or whenever there is a material change in business practices or changes in the law that may implicate the security, confidentiality, or integrity of records containing personal information;
• conducting an annual training session on the WISP for all employees and anyone else who has access to personal information, and keeping records regarding attendance at the training and certifications by employees of their familiarity with the business’s requirements for protecting personal information;
• terminating a departing employee’s physical and electronic access to records containing personal information immediately upon the termination of employment (whether voluntary or involuntary), including deactivating all passwords and usernames that permit that employee access to records containing personal information;
• documenting actions taken in response to any incident involving unauthorized access to or use of personal information; and
• recommending corrective and/or disciplinary measures for violations of the policy or WISP, implementing such measures where appropriate, and documenting all such corrective and/or disciplinary actions.

(b) Risk Assessment and Improvement of Safeguards

The next requirement is an assessment of reasonably foreseeable internal and external risks to the security, confidentiality, and/or integrity of records (paper or electronic) that contain personal information. The first step in the assessment is to identify all records, computer systems, and storage media that contain or store personal information. Not only is identifying where personal information resides—and who has access to it—an obvious first step, but the identification process is also a specific requirement of the regulations. Among the records and places where personal information may be found are the following:
• personnel files;
• payroll records;


• employment applications;
• employee tax documents;
• garnishment records;
• immigration documents, including I-9s;
• insurance benefit records;
• workers’ compensation files;
• records relating to benefits, such as insurance, 401k programs, profit sharing, and pensions, including enrollment forms and beneficiary forms;
• client/customer credit card or bank account records;
• wire transfer information; and
• client/customer personal information obtained in the course of conducting business.

Note that in its Frequently Asked Questions regarding 201 C.M.R. § 17.00, the OCABR states that companies do not have to inventory their records, but they should identify which records contain personal information so that those records can be protected.

The next step is to consider building/facility security, including locks, surveillance and security services, and visitor access. A critical step is assessing the security of electronically stored information. Among the considerations are where servers are kept and who has access to them, whether fax and copy machines maintain electronic copies of documents that are faxed or copied, the strength of passwords and how often they are changed, whether computer terminals are locked after periods of nonuse, and where and how backups of the computer system are kept. The assessment should also cover the company’s firewalls, virus protections, encryption (not only on laptops but also on all mobile devices, including phones, CDs, DVDs, tablets, and external drives), and monitoring capabilities. Also important, though often overlooked, is an assessment of the security of any noncompany computers used by employees to access company information, such as home computers from which employees can access the company’s server.

Once the assessment is completed—or even as it is occurring—the business must evaluate and improve (as necessary) the effectiveness of current safeguards for limiting risks. The assessment process is ongoing and must be repeated at least annually or whenever there is a material change in a company’s business practices or a change in the law that may implicate the security, confidentiality, or integrity of records containing personal information. It should also be repeated whenever a breach occurs. In addition, businesses must regularly monitor the WISP and upgrade safeguards as necessary. It is a best practice to document the assessment process.

(c) Employee Policies, Training, and Discipline

Employees are a critical and mandatory component of maintaining data security. As such, an important part of the WISP is creating a detailed policy for employees that provides clear instruction on the security policies and procedures that are contained in the WISP. As noted above, the policy is not coextensive with the WISP. It should include those policies and procedures that are generally applicable to all or a significant portion of employees, but need not include every detail in the WISP. It is important that the policy be written in clear, unambiguous language. Among the policies that employers should have are the following:
• document destruction and disposal policies;
• policies regarding access to archived records and access to current records (both paper and electronic);
• disciplinary policies;
• policies concerning transmission and/or transporting of information;
• policies concerning access to records by terminated employees;
• policies regarding storage of documents, records, and e-mail; and
• policies for reporting the loss of mobile devices.

A second requirement with regard to all employees is that they be trained at least annually. It is a good practice to document the training and attendance and to maintain those records. Note that all employees need to be trained, including temporary and contract employees. The OCABR states in its Frequently Asked Questions regarding 201 C.M.R. § 17.00 that while there is no basic standard with regard to how much employee training must be done, companies need to do enough training to ensure that all employees who have access to personal information know their obligations as set forth in the regulations. The best practice is to conduct enough training to help prevent breaches, whether accidental or intentional.

The WISP must include disciplining employees for violations of data security policies and procedures. Accordingly, the employee policy should include a discussion of such discipline.

(d) Terminated Employees

The regulations make clear that cutting off ex-employees’ access to personal information and other information that can lead to identity theft or fraud (whether in paper or electronic form) is mandatory. In fact, unauthorized access to or use of personal information by ex-employees is one of the largest sources of data security breaches. With regard to employees whose employment is involuntarily terminated, a best practice is for businesses to develop procedures for cutting off access to electronic records at the same time that the termination is occurring. With regard to employees who voluntarily terminate their employment, it is a judgment call as to how long they


should be permitted to retain access, but certainly access should be denied no later than their last day at work. Given that employees often take work home, it is imperative that employers implement policies that prevent employees from leaving company documents at home or anywhere outside the office. Of course, given how many employees work from home and the existence of personal home computers, retaining control over such information is difficult. Nevertheless, businesses must take all reasonable steps to ensure that all such information is returned upon the termination of employment. A best practice is to have employees sign a statement upon the termination of their employment that states that they have purged all company information from their home computers and other electronic storage devices and have returned all company property. It is also useful to have in any employee handbook or employee contract a requirement that all company property, including property that may contain personal information, not be maintained on home computers beyond the time it is being used and/or that it be returned promptly upon termination of employment.

(e) Third-Party Service Providers

The regulations require that businesses take all reasonable steps to verify that third-party service providers have the capacity to protect personal information and are applying protective security measures at least as stringent as those in the regulations. In addition, businesses must require third-party service providers by contract to protect personal information in a manner consistent with the regulations. The universe of third-party service providers will vary from company to company, but typical providers include
• payroll service providers;
• accountants and/or tax preparers;
• offsite storage facilities;
• outside IT consultants;
• billings and collections companies;
• disposal service providers;
• credit card processing providers;
• insurance providers (health, dental, disability, auto, premises, workers’ compensation);
• insurance agents;
• 401k providers;
• subcontractors and temporary employees/placement firms;
• copier and fax machine servicers;
• maintenance providers; and
• delivery service providers.

(f) Overall Restrictions

In recognition of the fact that having less personal information around will lead to fewer breaches, the regulations explicitly require that businesses limit the amount of personal information collected and kept. Meeting this requirement involves a detailed assessment of what information the business really needs and who needs access to it—and then implementing procedures for ensuring that the amount of personal information, and access to it, is limited. Some obvious places for reducing personal information are expense reports and application forms. Beyond limiting the amount of personal information obtained in the first place, getting rid of it once it is no longer needed and limiting (isolating) access on an as-needed basis are key ways to reduce the risk of breach. Think “RID”: reduce, isolate, and destroy.

Another obvious way—and requirement—to reduce the risk of a breach is to restrict the scope of those who have access to personal information and to restrict physical access to records. With regard to the latter, the regulations require that the WISP contain a written procedure that restricts physical access to personal information and provides for storage of records containing it in locked facilities, areas, or containers.

§ 10.4.3 Computer System Security Requirements

Businesses that electronically store or transmit personal information must include in their WISP a security system for computers and computer networks, including any wireless systems, that at a minimum includes the provisions listed in the subsections below. The requirements all apply to the extent technically feasible, which means that if there is a reasonable means through technology to accomplish a required result, then that reasonable means must be used. The regulations incorporate the concept of technical feasibility to recognize that some protections do not yet exist, but also to ensure that once protections exist and are reasonably available, they are used.

(a) “Secure User Authentication Protocols”

The regulations require that these protocols include
• control of user IDs;
• reasonably secure methods of assigning and selecting passwords;
• security of passwords;
• restriction of access to active users and active user accounts; and
• blocking access after multiple unsuccessful attempts to gain access.
A minimal sketch of the last two password-related elements follows.
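To make two of these requirements concrete (secure password storage and blocking access after repeated failures), here is a minimal, illustrative Python sketch. It is not drawn from the regulations themselves: the iteration count, the five-attempt threshold, and the in-memory failure counter are assumptions for illustration, and a production system would persist state and rely on vetted authentication infrastructure.

    import hashlib
    import hmac
    import os

    MAX_ATTEMPTS = 5    # assumed lockout threshold; the regulation sets no number
    _failed: dict[str, int] = {}   # username -> consecutive failures (in-memory for this sketch)

    def hash_password(password: str) -> tuple[bytes, bytes]:
        # Store only a salted, slow hash of the password, never the password itself.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return salt, digest

    def verify(username: str, password: str, salt: bytes, stored: bytes) -> bool:
        # Block access after multiple unsuccessful attempts to gain access.
        if _failed.get(username, 0) >= MAX_ATTEMPTS:
            return False
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        ok = hmac.compare_digest(candidate, stored)   # constant-time comparison
        _failed[username] = 0 if ok else _failed.get(username, 0) + 1
        return ok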

(b) “Secure Access Control Measures”

The regulations require that these measures restrict access to records and files containing personal information to those who need it to perform their job duties, and that businesses assign unique identifications plus passwords to anyone with access to their computer systems.

(c) Encryption

The regulations require that all records and files containing personal information that are transmitted across public networks or transmitted wirelessly and all personal information on laptops and other portable devices be encrypted.
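As an illustration of the transmission half of this requirement, the sketch below opens a TLS-encrypted channel before any personal information crosses a public network. The host name is hypothetical, and TLS via Python’s standard ssl module is just one common way to achieve encryption in transit; the regulation does not prescribe a particular protocol.

    import socket
    import ssl

    HOST = "records.example.com"   # hypothetical server receiving personal information

    # Validate the server certificate and negotiate TLS before sending anything.
    context = ssl.create_default_context()
    with socket.create_connection((HOST, 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
            # Everything written here is encrypted in transit across the public network.
            tls_sock.sendall(b"...record containing personal information...")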

(d) Monitoring

Reasonable monitoring for unauthorized access to or use of the computer system is another requirement. There are programs that can perform this function and then send alerts or alarms to the system administrator or IT provider. Most IT providers can also perform this function. It is imperative that alarms and alerts be checked—often—and be addressed. A toy example of log-based monitoring follows.
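The following sketch shows the flavor of such monitoring: it scans an OpenSSH-style authentication log for repeated failed logins and flags noisy sources. The log path, the regular expression, and the alert threshold are illustrative assumptions; real deployments would rely on an intrusion detection system or SIEM rather than ad hoc parsing.

    import re
    from collections import Counter

    LOG_PATH = "/var/log/auth.log"   # assumed OpenSSH-style auth log location
    THRESHOLD = 10                   # assumed threshold; tune to the risk assessment

    def alert_on_failed_logins(log_path: str = LOG_PATH) -> None:
        # Count "Failed password ... from <ip>" lines per source address.
        pattern = re.compile(r"Failed password .* from (\d{1,3}(?:\.\d{1,3}){3})")
        counts: Counter[str] = Counter()
        with open(log_path) as log:
            for line in log:
                match = pattern.search(line)
                if match:
                    counts[match.group(1)] += 1
        for ip, n in counts.items():
            if n >= THRESHOLD:
                print(f"ALERT: {n} failed logins from {ip}")   # hook into real alerting here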

(e) Security Protections

All computer systems used by the business must have reasonably up-to-date firewall protections, operating system security patches, malware protection, and virus definitions.

§ 10.4.4 Practice Tips

(a) The Assessment

A responsible approach to improving data security starts with assessing certain facts, including what kinds of personal information the organization stores, where and how the organization stores it, who has access to it, how it is secured, how it is vulnerable, how it travels inside and outside the organization, and how it is secured and/or vulnerable as it travels. To develop these facts, an organization will want to get the perspectives of a wide range of individuals, including representatives from IT, finance, human resources, operations, custodial, security, and any other relevant departments. It is useful to have a meeting or series of meetings with representatives from all of the relevant departments at which the participants discuss and brainstorm the company’s personal information.

(b) Create a Data Security Binder

A best practice is to create a data security binder that contains all the key components of compliance in case there is a breach. This binder will help direct what a covered entity does in a breach. Also, given that either the Attorney General’s Office or the OCABR may want to review the entity’s compliance after notice is provided, the binder can provide critical proof that the entity has complied with the data security laws. The binder should contain the following:
• the WISP;
• data security policies for employees;

10–19

§ 10.4

Data Security and Privacy in Massachusetts

• training certificates for employees; • records of discipline for violation of data security rules; • a schematic of the entity’s computer network; • a basic inventory of electronic devices and security measures with respect to each; • notes and minutes from meetings of the data security team; • contracts with or a list of contacts at third-party service providers; • documents from the assessment process; and • any previous notices provided in connection with data security breaches.


EXHIBIT 10A—Frequently Asked Questions Regarding 201 C.M.R. § 17.00

Frequently Asked Questions Regarding 201 CMR 17.00

What are the differences between this version of 201 CMR 17.00 and the version issued in February of 2009?

There are some important differences in the two versions. First, the most recent regulation issued in August of 2009 makes clear that the rule adopts a risk-based approach to information security, consistent with both the enabling legislation and applicable federal law, especially the FTC's Safeguards Rule. A risk-based approach is one that directs a business to establish a written security program that takes into account the particular business' size, scope of business, amount of resources, nature and quantity of data collected or stored, and the need for security. It differs from an approach that mandates every component of a program and requires its adoption regardless of size and the nature of the business and the amount of information that requires security. This clarification of the risk-based approach is especially important to those small businesses that do not handle or store large amounts of personal information. Second, a number of specific provisions required to be included in a business's written information security program have been removed from the regulation and will be used as a form of guidance only. Third, the encryption requirement has been tailored to be technology neutral and technical feasibility has been applied to all computer security requirements. Fourth, the third-party vendor requirements have been changed to be consistent with federal law.

To whom does this regulation apply?

The regulation applies to those engaged in commerce. More specifically, the regulation applies to those who collect and retain personal information in connection with the provision of goods and services or for the purposes of employment. The regulation does not apply, however, to natural persons who are not in commerce.

Does 201 CMR 17.00 apply to municipalities?

No. 201 CMR 17.01 specifically excludes from the definition of "person" any "agency, executive office, department, board, commission, bureau, division or authority of the Commonwealth, or any of its branches, or any political subdivision thereof." Consequently, the regulation does not apply to municipalities.


Must my information security program be in writing?

Yes, your information security program must be in writing. The scope and complexity of the document will vary depending on your resources and the type of personal information you are storing or maintaining. But everyone who owns or licenses personal information must have a written plan detailing the measures adopted to safeguard such information.

What about the computer security requirements of 201 CMR 17.00?

All of the computer security provisions apply to a business if they are technically feasible. The standard of technical feasibility takes reasonableness into account. (See definition of "technically feasible" below.) The computer security provisions in 17.04 should be construed in accordance with the risk-based approach of the regulation.

Does the regulation require encryption of portable devices?

Yes. The regulation requires encryption of portable devices where it is reasonable and technically feasible. The definition of encryption has been amended to make it technology neutral so that as encryption technology evolves and new standards are developed, this regulation will not impede the adoption of such new technologies.

Do all portable devices have to be encrypted?

No. Only those portable devices that contain personal information of customers or employees, and only where technically feasible. The "technical feasibility" language of the regulation is intended to recognize that at this period in the development of encryption technology, there is little, if any, generally accepted encryption technology for most portable devices, such as cell phones, BlackBerries, netbooks, iPhones, and similar devices. While it may not be possible to encrypt such portable devices, personal information should not be placed at risk in the use of such devices. There is, however, technology available to encrypt laptops.

Must I encrypt my backup tapes?

You must encrypt backup tapes on a prospective basis. However, if you are going to transport a backup tape from current storage, and it is technically feasible to encrypt (i.e., the tape allows it), then you must do so prior to the transfer. If it is not technically feasible, then you should consider the sensitivity of the information, the amount of personal information, and the distance to be traveled, and take appropriate steps to secure and safeguard the personal information. For example, if you are transporting a large volume of sensitive personal information, you may want to consider using an armored vehicle with an appropriate number of guards.

What does "technically feasible" mean?

"Technically feasible" means that if there is a reasonable means through technology to accomplish a required result, then that reasonable means must be used.


Must I encrypt my email if it contains personal information?

If it is not technically feasible to do so, then no. However, you should implement best practices by not sending unencrypted personal information in an email. There are alternative methods to communicate personal information other than through email, such as establishing a secure website that requires safeguards such as a username and password to conduct transactions involving personal information.

Are there any steps that I am required to take in selecting a third party to store and maintain personal information that I own or license?

You are responsible for the selection and retention of a third-party service provider who is capable of properly safeguarding personal information. The third-party service provider provision in 201 CMR 17.00 is modeled after the third-party vendor provision in the FTC's Safeguards Rule.

I have a small business with ten employees. Besides my employee data, I do not store any other personal information. What are my obligations?

The regulation adopts a risk-based approach to information security. A risk-based approach is one that is designed to be flexible while directing businesses to establish a written security program that takes into account the particular business's size, scope of business, amount of resources, and the need for security. For example, if you only have employee data with a small number of employees, you should lock your files in a storage cabinet and lock the door to that room. You should permit access to only those who require it for official duties. Conversely, if you have both employee and customer data containing personal information, then your security approach would be more stringent. If you have a large volume of customer data containing personal information, then your approach would be even more stringent.

Except for swiping credit cards, I do not retain or store any of the personal information of my customers. What is my obligation with respect to 201 CMR 17.00?

If you use swipe technology only, and you do not have actual custody or control over the personal information, then you would not own or license personal information with respect to that data, as long as you batch out such data in accordance with the Payment Card Industry (PCI) standards. However, if you have employees, see the previous question.

Does 201 CMR 17.00 set a maximum period of time in which I can hold onto/retain documents containing personal information?

No. That is a business decision you must make. However, as a good business practice, you should limit the amount of personal information collected to that reasonably necessary to accomplish the legitimate purpose for which it is collected and limit the time such information is retained to that reasonably necessary to accomplish such purpose. You should also limit access to those persons who are reasonably required to know such information.


Do I have to do an inventory of all my paper and electronic records?

No, you do not have to inventory your records. However, you should perform a risk assessment and identify which of your records contain personal information so that you can handle and protect that information.

How much employee training do I need to do?

There is no basic standard here. You will need to do enough training to ensure that the employees who will have access to personal information know what their obligations are regarding the protection of that information, as set forth in the regulation.

What is a financial account?

A financial account is an account that, if access is gained by an unauthorized person to such account, an increase of financial burden, or a misappropriation of monies, credit or other assets could result. Examples of a financial account are: checking account, savings account, mutual fund account, annuity account, any kind of investment account, credit account or debit account.

Does an insurance policy number qualify as a financial account number?

An insurance policy number qualifies as a financial account number if it grants access to a person's finances, or results in an increase of financial burden, or a misappropriation of monies, credit or other assets.

I am an attorney. Do communications with clients already covered by the attorney-client privilege immunize me from complying with 201 CMR 17.00?

If you own or license personal information, you must comply with 201 CMR 17.00 regardless of privileged or confidential communications. You must take the steps outlined in 201 CMR 17.00 to protect the personal information, taking into account your size, scope, resources, and need for security.

I already comply with HIPAA. Must I comply with 201 CMR 17.00 as well?

Yes. If you own or license personal information about a resident of the Commonwealth, you must comply with 201 CMR 17.00, even if you already comply with HIPAA.

What is the extent of my "monitoring" obligation?

The level of monitoring necessary to ensure your information security program is providing protection from unauthorized access to, or use of, personal information, and effectively limiting risks will depend largely on the nature of your business, your business practices, and the amount of personal information you own or license. It will also depend on the form in which the information is kept and stored. Obviously, information stored as a paper record will demand different monitoring techniques from those applicable to electronically stored records. In the end, the monitoring that you put in place must be such that it is reasonably likely to reveal unauthorized access or use.

Is everyone's level of compliance going to be judged by the same standard?

Both the statute and the regulations specify that security programs should take into account the size and scope of your business, the resources that you have available to you, the amount of data you store, and the need for confidentiality. This will be judged on a case-by-case basis.

I password protect data when storing it on my laptop and when transmitting it wirelessly. Is that enough to satisfy the encryption requirement?

No. 201 CMR 17.00 makes clear that encryption must bring about a "transformation of data into a form in which meaning cannot be assigned." This is to say that the data must be altered into an unreadable form. Password protection does not alter the condition of the data as required, and therefore would not satisfy the encryption standard.

I am required by law to contract with a specific third-party service provider, not necessarily of my choosing. Must I still perform due diligence in the selection and retention of that specific third-party service provider?

Where state or federal law or regulation requires the use of a specific third-party service provider, then the obligation to select and retain would effectively be met.


EXHIBIT 10B—A Small Business Guide: Formulating a Comprehensive Written Information Security Program

While the contents of any comprehensive written information security program required by 201 CMR 17.00 must always satisfy the detailed provisions of those regulations, and while the development of each individual program will take into account (i) the size, scope and type of business of the person obligated to safeguard the personal information under such comprehensive information security program, (ii) the amount of resources available to such person, (iii) the amount of stored data, and (iv) the need for security and confidentiality of both consumer and employee information, the Office of Consumer Affairs and Business Regulation is issuing this guide to help small businesses in their compliance efforts.

This Guide is not a substitute for compliance with 201 CMR 17.00. It is simply a tool designed to aid in the development of a written information security program for a small business, including the self-employed, that handles "personal information." Wherever there is a conflict between this guide and the provisions of 201 CMR 17.00, it is the latter that will govern. We set out below this "guide" to devising a security program (references below to "we" and "our" are references to the small business to whom the real WISP will relate):

COMPREHENSIVE WRITTEN INFORMATION SECURITY PROGRAM

I. OBJECTIVE:

Our objective, in the development and implementation of this comprehensive written information security program ("WISP"), is to create effective administrative, technical and physical safeguards for the protection of personal information of residents of the Commonwealth of Massachusetts, and to comply with obligations under 201 CMR 17.00. The WISP sets forth our procedure for evaluating our electronic and physical methods of accessing, collecting, storing, using, transmitting, and protecting personal information of residents of the Commonwealth of Massachusetts.

For purposes of this WISP, "personal information" means a Massachusetts resident's first name and last name or first initial and last name in combination with any one or more of the following data elements that relate to such resident: (a) Social Security number; (b) driver's license number or state-issued identification card number; or (c) financial account number, or credit or debit card number, with or without any required security code, access code, personal identification number or password, that would permit access to a resident's financial account; provided, however, that "personal information" shall not include information that is lawfully obtained from publicly available information, or from federal, state or local government records lawfully made available to the general public.

II. PURPOSE:

The purpose of the WISP is to:
(a) Ensure the security and confidentiality of personal information;
(b) Protect against any anticipated threats or hazards to the security or integrity of such information; and
(c) Protect against unauthorized access to or use of such information in a manner that creates a substantial risk of identity theft or fraud.

III. SCOPE:

In formulating and implementing the WISP, (1) identify reasonably foreseeable internal and external risks to the security, confidentiality, and/or integrity of any electronic, paper or other records containing personal information; (2) assess the likelihood and potential damage of these threats, taking into consideration the sensitivity of the personal information; (3) evaluate the sufficiency of existing policies, procedures, customer information systems, and other safeguards in place to control risks; (4) design and implement a WISP that puts safeguards in place to minimize those risks, consistent with the requirements of 201 CMR 17.00; and (5) regularly monitor the effectiveness of those safeguards.

IV. DATA SECURITY COORDINATOR:

We have designated _____ to implement, supervise and maintain the WISP. That designated employee (the "Data Security Coordinator") will be responsible for:
a. Initial implementation of the WISP;
b. Training employees;
c. Regular testing of the WISP's safeguards;
d. Evaluating the ability of each of our third-party service providers to implement and maintain appropriate security measures for the personal information to which we have permitted them access, consistent with 201 CMR 17.00, and requiring such third-party service providers by contract to implement and maintain appropriate security measures;


e. Reviewing the scope of the security measures in the WISP at least annually, or whenever there is a material change in our business practices that may implicate the security or integrity of records containing personal information; and
f. Conducting an annual training session for all owners, managers, employees and independent contractors, including temporary and contract employees who have access to personal information, on the elements of the WISP. All attendees at such training sessions are required to certify their attendance at the training, and their familiarity with the firm's requirements for ensuring the protection of personal information.

V. INTERNAL RISKS:

To combat internal risks to the security, confidentiality, and/or integrity of any electronic, paper or other records containing personal information, and evaluating and improving, where necessary, the effectiveness of the current safeguards for limiting such risks, the following measures are mandatory and are effective immediately. To the extent that any of these measures require a phase-in period, such phase-in must be completed on or before March 1, 2010.

Internal Threats

• A copy of the WISP must be distributed to each employee who shall, upon receipt of the WISP, acknowledge in writing that he/she has received a copy of the WISP.
• There must be immediate retraining of employees on the detailed provisions of the WISP.
• Employment contracts must be amended immediately to require all employees to comply with the provisions of the WISP, and to prohibit any nonconforming use of personal information during or after employment, with mandatory disciplinary action to be taken for violation of security provisions of the WISP. (The nature of the disciplinary measures may depend on a number of factors, including the nature of the violation and the nature of the personal information affected by the violation.)
• The amount of personal information collected should be limited to that amount reasonably necessary to accomplish our legitimate business purposes, or necessary for us to comply with other state or federal regulations.
• Access to records containing personal information shall be limited to those persons who are reasonably required to know such information in order to accomplish our legitimate business purpose or to enable us to comply with other state or federal regulations.
• Electronic access to user identification after multiple unsuccessful attempts to gain access must be blocked.
• All security measures shall be reviewed at least annually, or whenever there is a material change in our business practices that may reasonably implicate the security or integrity of records containing personal information. The Data Security Coordinator shall be responsible for this review and shall fully apprise management of the results of that review and any recommendations for improved security arising out of that review.
• Terminated employees must return all records containing personal information, in any form, that may at the time of such termination be in the former employee's possession (including all such information stored on laptops or other portable devices or media, and in files, records, work papers, etc.).
• A terminated employee's physical and electronic access to personal information must be immediately blocked. Such terminated employee shall be required to surrender all keys, IDs or access codes or badges, business cards, and the like, that permit access to the firm's premises or information. Moreover, such terminated employee's remote electronic access to personal information must be disabled; his/her voicemail access, e-mail access, internet access, and passwords must be invalidated. The Data Security Coordinator shall maintain a highly secured master list of all lock combinations, passwords and keys.
• Current employees' user IDs and passwords must be changed periodically.
• Access to personal information shall be restricted to active users and active user accounts only.
• Employees are encouraged to report any suspicious or unauthorized use of customer information.
• Whenever there is an incident that requires notification under M.G.L. c. 93H, § 3, there shall be an immediate mandatory post-incident review of events and actions taken, if any, with a view to determining whether any changes in our security practices are required to improve the security of personal information for which we are responsible.
• Employees are prohibited from keeping open files containing personal information on their desks when they are not at their desks.
• At the end of the work day, all files and other records containing personal information must be secured in a manner that is consistent with the WISP's rules for protecting the security of personal information.
• Each department shall develop rules (bearing in mind the business needs of that department) that ensure that reasonable restrictions upon physical access to records containing personal information are in place, including a written procedure that sets forth the manner in which physical access to such records in that department is to be restricted; and each department must store such records and data in locked facilities, secure storage areas or locked containers.
• Access to electronically stored personal information shall be electronically limited to those employees having a unique log-in ID; and re-log-in shall be required when a computer has been inactive for more than a few minutes.
• Visitors' access must be restricted to one entry point for each building in which personal information is stored, and visitors shall be required to present a photo ID, sign in, and wear a plainly visible "GUEST" badge or tag. Visitors shall not be permitted to visit unescorted any area within our premises that contains personal information.
• Paper or electronic records (including records stored on hard drives or other electronic media) containing personal information shall be disposed of only in a manner that complies with M.G.L. c. 93I.

VI. EXTERNAL RISKS:

To combat external risks to the security, confidentiality, and/or integrity of any electronic, paper or other records containing personal information, and evaluating and improving, where necessary, the effectiveness of the current safeguards for limiting such risks, the following measures must be completed on or before March 1, 2010.

External Threats

• There must be reasonably up-to-date firewall protection and operating system security patches, reasonably designed to maintain the integrity of the personal information, installed on all systems processing personal information.
• There must be reasonably up-to-date versions of system security agent software, which must include malware protection and reasonably up-to-date patches and virus definitions, installed on all systems processing personal information.
• To the extent technically feasible, all personal information stored on laptops or other portable devices must be encrypted, as must all records and files transmitted across public networks or wirelessly. Encryption here means the transformation of data into a form in which meaning cannot be assigned without the use of a confidential process or key, unless further defined by regulation of the Office of Consumer Affairs and Business Regulation.
• All computer systems must be monitored for unauthorized use of or access to personal information.
• There must be secure user authentication protocols in place, including (1) protocols for control of user IDs and other identifiers; (2) a reasonably secure method of assigning and selecting passwords, or use of unique identifier technologies, such as biometrics or token devices; and (3) control of data security passwords to ensure that such passwords are kept in a location and/or format that does not compromise the security of the data they protect.


EXHIBIT 10C—Standards for the Protection of Personal Information of Residents of the Commonwealth—201 C.M.R. § 17.00

201 CMR 17.00: STANDARDS FOR THE PROTECTION OF PERSONAL INFORMATION OF RESIDENTS OF THE COMMONWEALTH

Section:
17.01: Purpose and Scope
17.02: Definitions
17.03: Duty to Protect and Standards for Protecting Personal Information
17.04: Computer System Security Requirements
17.05: Compliance Deadline

17.01: Purpose and Scope

(1) Purpose

This regulation implements the provisions of M.G.L. c. 93H relative to the standards to be met by persons who own or license personal information about a resident of the Commonwealth of Massachusetts. This regulation establishes minimum standards to be met in connection with the safeguarding of personal information contained in both paper and electronic records. The objectives of this regulation are to insure the security and confidentiality of customer information in a manner fully consistent with industry standards; protect against anticipated threats or hazards to the security or integrity of such information; and protect against unauthorized access to or use of such information that may result in substantial harm or inconvenience to any consumer.

(2) Scope

The provisions of this regulation apply to all persons that own or license personal information about a resident of the Commonwealth.

17.02: Definitions

The following words as used herein shall, unless the context requires otherwise, have the following meanings:

Breach of security, the unauthorized acquisition or unauthorized use of unencrypted data or, encrypted electronic data and the confidential process or key that is capable of compromising the security, confidentiality, or integrity of personal information, maintained by a person or agency that creates a substantial risk of identity theft or fraud against a resident of the commonwealth. A good faith but unauthorized acquisition of personal information by a person or agency, or employee or agent thereof, for the lawful purposes of such person or agency, is not a breach of security unless the personal information is used in an unauthorized manner or subject to further unauthorized disclosure.


Electronic, relating to technology having electrical, digital, magnetic, wireless, optical, electromagnetic or similar capabilities.

Encrypted, the transformation of data into a form in which meaning cannot be assigned without the use of a confidential process or key.

Owns or licenses, receives, stores, maintains, processes, or otherwise has access to personal information in connection with the provision of goods or services or in connection with employment.

Person, a natural person, corporation, association, partnership or other legal entity, other than an agency, executive office, department, board, commission, bureau, division or authority of the Commonwealth, or any of its branches, or any political subdivision thereof.

Personal information, a Massachusetts resident's first name and last name or first initial and last name in combination with any one or more of the following data elements that relate to such resident: (a) Social Security number; (b) driver's license number or state-issued identification card number; or (c) financial account number, or credit or debit card number, with or without any required security code, access code, personal identification number or password, that would permit access to a resident's financial account; provided, however, that "Personal information" shall not include information that is lawfully obtained from publicly available information, or from federal, state or local government records lawfully made available to the general public.

Record or Records, any material upon which written, drawn, spoken, visual, or electromagnetic information or images are recorded or preserved, regardless of physical form or characteristics.

Service provider, any person that receives, stores, maintains, processes, or otherwise is permitted access to personal information through its provision of services directly to a person that is subject to this regulation.

17.03: Duty to Protect and Standards for Protecting Personal Information

(1) Every person that owns or licenses personal information about a resident of the Commonwealth shall develop, implement, and maintain a comprehensive information security program that is written in one or more readily accessible parts and contains administrative, technical, and physical safeguards that are appropriate to (a) the size, scope and type of business of the person obligated to safeguard the personal information under such comprehensive information security program; (b) the amount of resources available to such person; (c) the amount of stored data; and (d) the need for security and confidentiality of both consumer and employee information. The safeguards contained in such program must be consistent with the safeguards for protection of personal information and information of a similar character set forth in any state or federal regulations by which the person who owns or licenses such information may be regulated.

(2) Without limiting the generality of the foregoing, every comprehensive information security program shall include, but shall not be limited to:

(a) Designating one or more employees to maintain the comprehensive information security program;

(b) Identifying and assessing reasonably foreseeable internal and external risks to the security, confidentiality, and/or integrity of any electronic, paper or other records containing personal information, and evaluating and improving, where necessary, the effectiveness of the current safeguards for limiting such risks, including but not limited to:
1. ongoing employee (including temporary and contract employee) training;
2. employee compliance with policies and procedures; and
3. means for detecting and preventing security system failures.

(c) Developing security policies for employees relating to the storage, access and transportation of records containing personal information outside of business premises.

(d) Imposing disciplinary measures for violations of the comprehensive information security program rules.

(e) Preventing terminated employees from accessing records containing personal information.

(f) Oversee service providers, by:
1. Taking reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect such personal information consistent with these regulations and any applicable federal regulations; and
2. Requiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information; provided, however, that until March 1, 2012, a contract a person has entered into with a third party service provider to perform services for said person or functions on said person's behalf satisfies the provisions of 17.03(2)(f)(2) even if the contract does not include a requirement that the third party service provider maintain such appropriate safeguards, as long as said person entered into the contract no later than March 1, 2010.

(g) Reasonable restrictions upon physical access to records containing personal information, and storage of such records and data in locked facilities, storage areas or containers.


(h) Regular monitoring to ensure that the comprehensive information security program is operating in a manner reasonably calculated to prevent unauthorized access to or unauthorized use of personal information; and upgrading information safeguards as necessary to limit risks.

(i) Reviewing the scope of the security measures at least annually or whenever there is a material change in business practices that may reasonably implicate the security or integrity of records containing personal information.

(j) Documenting responsive actions taken in connection with any incident involving a breach of security, and mandatory post-incident review of events and actions taken, if any, to make changes in business practices relating to protection of personal information.

17.04: Computer System Security Requirements

Every person that owns or licenses personal information about a resident of the Commonwealth and electronically stores or transmits such information shall include in its written, comprehensive information security program the establishment and maintenance of a security system covering its computers, including any wireless system, that, at a minimum, and to the extent technically feasible, shall have the following elements:

(1) Secure user authentication protocols including:
(a) control of user IDs and other identifiers;
(b) a reasonably secure method of assigning and selecting passwords, or use of unique identifier technologies, such as biometrics or token devices;
(c) control of data security passwords to ensure that such passwords are kept in a location and/or format that does not compromise the security of the data they protect;
(d) restricting access to active users and active user accounts only; and
(e) blocking access to user identification after multiple unsuccessful attempts to gain access or the limitation placed on access for the particular system;

(2) Secure access control measures that:
(a) restrict access to records and files containing personal information to those who need such information to perform their job duties; and
(b) assign unique identifications plus passwords, which are not vendor supplied default passwords, to each person with computer access, that are reasonably designed to maintain the integrity of the security of the access controls;

(3) Encryption of all transmitted records and files containing personal information that will travel across public networks, and encryption of all data containing personal information to be transmitted wirelessly;

(4) Reasonable monitoring of systems, for unauthorized use of or access to personal information;

(5) Encryption of all personal information stored on laptops or other portable devices;

(6) For files containing personal information on a system that is connected to the Internet, there must be reasonably up-to-date firewall protection and operating system security patches, reasonably designed to maintain the integrity of the personal information;

(7) Reasonably up-to-date versions of system security agent software which must include malware protection and reasonably up-to-date patches and virus definitions, or a version of such software that can still be supported with up-to-date patches and virus definitions, and is set to receive the most current security updates on a regular basis; and

(8) Education and training of employees on the proper use of the computer security system and the importance of personal information security.

17.05: Compliance Deadline

(1) Every person who owns or licenses personal information about a resident of the Commonwealth shall be in full compliance with 201 CMR 17.00 on or before March 1, 2010.

REGULATORY AUTHORITY

201 CMR 17.00: M.G.L. c. 93H


EXHIBIT 10D—OCABR 201 C.M.R. § 17.00 Compliance Checklist

201 CMR 17.00 COMPLIANCE CHECKLIST

The Office of Consumer Affairs and Business Regulation has compiled this checklist to help small businesses in their effort to comply with 201 CMR 17.00. This Checklist is not a substitute for compliance with 201 CMR 17.00. Rather, it is designed as a useful tool to aid in the development of a written information security program for a small business or individual that handles "personal information." Each item, presented in question form, highlights a feature of 201 CMR 17.00 that will require proactive attention in order for a plan to be compliant.

The Comprehensive Written Information Security Program (WISP)

☐ Do you have a comprehensive, written information security program ("WISP") applicable to all records containing personal information about a resident of the Commonwealth of Massachusetts ("PI")?
☐ Does the WISP include administrative, technical, and physical safeguards for PI protection?
☐ Have you designated one or more employees to maintain and supervise WISP implementation and performance?
☐ Have you identified the paper, electronic and other records, computing systems, and storage media, including laptops and portable devices, that contain personal information?
☐ Have you chosen, as an alternative, to treat all your records as if they all contained PI?
☐ Have you identified and evaluated reasonably foreseeable internal and external risks to paper and electronic records containing PI?
☐ Have you evaluated the effectiveness of current safeguards?
☐ Does the WISP include regular ongoing employee training, and procedures for monitoring employee compliance?


☐ Does the WISP include disciplinary measures for violators?
☐ Does the WISP include policies and procedures for when and how records containing PI should be kept, accessed or transported off your business premises?
☐ Does the WISP provide for immediately blocking terminated employees' physical and electronic access to PI records (including deactivating their passwords and user names)?
☐ Have you taken reasonable steps to select and retain a third-party service provider that is capable of maintaining appropriate security measures consistent with 201 CMR 17.00?
☐ Have you required such third-party service provider by contract to implement and maintain such appropriate security measures?
☐ Is the amount of PI that you have collected limited to the amount reasonably necessary to accomplish your legitimate business purposes, or to comply with state or federal regulations?
☐ Is the length of time that you are storing records containing PI limited to the time reasonably necessary to accomplish your legitimate business purpose or to comply with state or federal regulations?
☐ Is access to PI records limited to those persons who have a need to know in connection with your legitimate business purpose, or in order to comply with state or federal regulations?
☐ In your WISP, have you specified the manner in which physical access to PI records is to be restricted?
☐ Have you stored your records and data containing PI in locked facilities, storage areas or containers?
☐ Have you instituted a procedure for regularly monitoring to ensure that the WISP is operating in a manner reasonably calculated to prevent unauthorized access to or unauthorized use of PI, and for upgrading it as necessary?
☐ Are your security measures reviewed at least annually, or whenever there is a material change in business practices that may affect the security or integrity of PI records?
☐ Do you have in place a procedure for documenting any actions taken in connection with any breach of security; and does that procedure require post-incident review of events and actions taken to improve security?

Additional Requirements for Electronic Records

☐ Do you have in place secure authentication protocols that provide for:
  ☐ Control of user IDs and other identifiers?
  ☐ A reasonably secure method of assigning/selecting passwords, or for use of unique identifier technologies (such as biometrics or token devices)?
  ☐ Control of data security passwords such that passwords are kept in a location and/or format that does not compromise the security of the data they protect?
  ☐ Restricting access to PI to active users and active user accounts?
  ☐ Blocking access after multiple unsuccessful attempts to gain access?
☐ Do you have secure access control measures that restrict access, on a need-to-know basis, to PI records and files?
☐ Do you assign unique identifications plus passwords (which are not vendor-supplied default passwords) to each person with computer access; and are those IDs and passwords reasonably designed to maintain the security of those access controls?
☐ Do you, to the extent technically feasible, encrypt all PI records and files that are transmitted across public networks, and that are to be transmitted wirelessly?
☐ Do you, to the extent technically feasible, encrypt all PI stored on laptops or other portable devices?
☐ Do you have monitoring in place to alert you to the occurrence of unauthorized use of or access to PI?
☐ On any system that is connected to the Internet, do you have reasonably up-to-date firewall protection for files containing PI, and operating system security patches to maintain the integrity of the PI?
☐ Do you have reasonably up-to-date versions of system security agent software (including malware protection) and reasonably up-to-date security patches and virus definitions?
☐ Do you have in place training for employees on the proper use of your computer security system, and the importance of PI security?


EXHIBIT 10E—Personal Information: A Graphical Representation

[The original exhibit is a diagram combining two elements:]

Significant Portion of Name (First Name or Initial + Last Name)

AND

Number That Can Be Used in Identity Theft (Social Security #; or Driver's License # or State-Issued I.D. Card #; or Credit Card #; or Debit Card #; or Financial Account #; or, for the Destruction/Disposal Statute only, Biometric Data)


EXHIBIT 10F—Action Plan for Complying with Massachusetts Data Security Regulations

1. Establish an internal data security team and designate at least one senior employee to oversee and maintain the Program. Ideally, this team would include Operations, Human Resources, Information Technology, Accounting/Billing, Sales, and Legal.

2. Hold a meeting to educate the data security team on the requirements and details of the new rules and to conduct the required assessments.
   a. List all records (in paper or electronic form) that contain Personal Information and where the Personal Information is kept.
   b. Identify current measures for safeguarding Personal Information and evaluate employee compliance with policies and procedures and means for detecting and preventing security system failures.
   c. Assess reasonably foreseeable internal and external risks to the security, confidentiality, or integrity of any records (in paper or electronic form) that contain Personal Information.
   d. List all the ways in which someone could open an account with your business using someone else's identity or could misuse someone else's existing account to obtain goods or services for themselves. In making the list, consider any incidents of identity theft your business has experienced.

3. Create any necessary policies, procedures, and safeguards that do not currently exist, such as:
   a. Procedures to prevent terminated employees from accessing records containing Personal Information by immediately terminating their physical and electronic access to such records, including deactivating their passwords and user names.
   b. Policies and procedures limiting access to Personal Information stored and maintained:
      – Limit the amount of Personal Information collected and the time such information is retained to that reasonably necessary to accomplish the legitimate purpose for which it is collected.
      – Limit access to those persons who are reasonably required to know such information in order to accomplish such purpose or to comply with state or federal record retention requirements.
      – Impose reasonable restrictions upon physical access to records containing Personal Information, including a written procedure that sets forth the manner in which physical access to such records is restricted, and storage of such records and data in locked facilities, storage areas, or containers.
   c. Security policies for whether and how employees should be allowed to keep, access, and transport records containing Personal Information outside of business premises.

4. Impose disciplinary measures for violations of the comprehensive information security program and amend the employee handbook to advise employees of such disciplinary measures.

5. Implement electronic security requirements.
   a. Ensure use of secure user authentication protocols, including
      – control of user IDs and other identifiers,
      – a reasonably secure method of assigning and selecting passwords, or use of unique identifier technologies, such as biometrics or token devices,
      – control of data security passwords to ensure that such passwords are kept in a location and/or format that does not compromise the security of the data they protect,
      – restricting access to active users and active user accounts only, and
      – blocking access to user identification after multiple unsuccessful attempts to gain access or the limitation placed on access for the particular system.
   b. Institute secure access control measures that
      – restrict access to records and files containing personal information to those who need such information to perform their job duties, and
      – assign unique identifications plus passwords, which are not vendor-supplied default passwords, to each person with computer access, that are reasonably designed to maintain the integrity of the security of the access controls.
   c. Encrypt (to the extent technically feasible) all records and files containing Personal Information that are transmitted across public networks or wirelessly and all Personal Information stored on laptops or other portable devices (e.g., PDAs, flash drives, CDs).
   d. Reasonably monitor systems for unauthorized use of or access to Personal Information.
   e. Install and maintain reasonably up-to-date firewall protection, operating system security patches, malware protection, and virus definitions, and create a procedure for regular, involuntary updates.

6. Determine whether any service providers your business uses have access to Personal Information. If so, take all reasonable steps to verify that each such provider (i) has the capacity to protect Personal Information consistent with the Massachusetts data security regulations and (ii) in fact is applying to such Personal Information protective security measures at least as stringent as those required by the regulations.

7. Document the findings of the assessments, including where Personal Information is stored and measures for safeguarding Personal Information. Also include the information gathered on third-party providers.

8. Have the board of directors or designated senior management approve the Program.

9. Train the appropriate personnel on the Program.

10. Establish a schedule for updating the Program.
    a. Set a schedule for regularly monitoring the Program to ensure that it is operating in a manner reasonably calculated to prevent unauthorized access to or unauthorized use of Personal Information and for upgrading information safeguards as necessary to limit risks.
    b. Set a schedule for annual review of the security measures.
    c. Create a procedure for documenting responsive actions taken in connection with any incident involving a breach of security and conducting a mandatory post-incident review of events and actions taken, if any, to make changes in business practices relating to protection of Personal Information.


EXHIBIT 10G—Twelve Practical Tips for Employers

Administrative

1. Restrict access to Personal Information to as few employees as possible.
2. Train employees on a regular basis.
3. Create a procedure for immediately terminating access when an employee leaves.
4. Check in with service providers to ensure they are in compliance with the laws and regulations.

Physical

5. Rely on more than one lock for physical data.
6. Update building/office access codes periodically and make sure they are not obvious.
7. Implement data retention and destruction policies, and get rid (redact/isolate/destroy) of what is not needed and keep only what is needed.

Technological

8. Require complex passwords and change them regularly. (A simple illustration follows these tips.)
9. Configure workstations to lock after no more than 10 minutes of inactivity. (The shorter the time, the better.)
10. Encrypt any e-mail attachments that contain Personal Information and are sent over the public internet or wirelessly.
11. Implement a system back-up policy that adequately protects your data: do not allow an employee to take a back-up home unless it is encrypted.
12. Make sure employees are not able to "opt-out" of security updates.
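For tip 8, the following Python fragment is a minimal sketch of a complexity check. The specific rules (length, mixed case, digit, symbol) are common conventions offered as assumptions; 201 C.M.R. § 17.00 itself requires only a "reasonably secure method of assigning and selecting passwords."

    import string

    def is_complex(password, min_length=12):
        """Apply simple, illustrative complexity rules to a candidate password."""
        return (
            len(password) >= min_length
            and any(c.islower() for c in password)
            and any(c.isupper() for c in password)
            and any(c.isdigit() for c in password)
            and any(c in string.punctuation for c in password)
        )

    assert not is_complex("password1")      # too short; no upper case or symbol
    assert is_complex("C0mmonwealth-93H!")  # satisfies all of the illustrative rules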


EXHIBIT 10H—Helpful Websites

Massachusetts Attorney General's Webpage on Guidance for Businesses on Security Breaches: http://www.mass.gov/ago (go to Consumer Protection / Scams & Identity Theft / Guidance for Businesses on Security Breaches)

Massachusetts Attorney General's Webpage "Safeguard Your Computer from ID Theft": http://www.mass.gov/ago/consumer-resources/consumer-information/scams-and-identity-theft/identity-theft/safeguard-your-computer.html

Massachusetts Office of Consumer Affairs and Business Regulation "Identity Theft Law": http://www.mass.gov/ocabr/data-privacy-and-security/identity-theft/identity-theft-law-advisory.html

Massachusetts Office of Consumer Affairs and Business Regulation: http://www.mass.gov/ocabr (go to Identity Theft)

Federal Trade Commission, Identity Theft Site: http://www.ftc.gov/bcp/edu/microsites/idtheft

Federal Trade Commission, "Protecting Personal Information: A Guide for Business": http://www.ftc.gov/bcp/edu/pubs/business/idtheft/bus69.pdf

Privacy Rights Clearinghouse: http://www.privacyrights.org

10 Rules for Creating a Hacker-Resistant Password: http://www.privacyrights.org/ar/alertstrongpasswords.htm

Identity Theft Resource Center: http://www.idtheftcenter.org


CHAPTER 11

Consumer/Retail Issues*

Stephen Y. Chow, Esq.
Burns & Levinson LLP, Boston

§ 11.1 Introduction .......................................................... 11–1
§ 11.2 Contracting Considerations ................................. 11–2
§ 11.3 Drafting Terms ..................................................... 11–5
    § 11.3.1 Privacy Policy or Notice ........................... 11–7
    § 11.3.2 Acceptable Use Policy .............................. 11–7
    § 11.3.3 Intellectual Property Notices .................... 11–7
§ 11.4 Privacy Issues—Notice Contents ......................... 11–8
    § 11.4.1 Data Collected .......................................... 11–9
    § 11.4.2 Use of Data (Including Sharing) .............. 11–10
    § 11.4.3 User Access and Options ......................... 11–12
§ 11.5 Conclusion ........................................................... 11–13

Scope Note
This chapter discusses privacy issues in public-facing websites in the consumer/retail context, and the use of online terms and notices to manage those issues.

§ 11.1 INTRODUCTION

Early online commerce involving consumers or "retail customers" was "business-to-consumer" ("B2C") and involved the sale of goods. This form of online commerce is often governed by statutes such as Uniform Commercial Code Article 2 and the Magnuson-Moss Act (15 U.S.C. §§ 2301–2312—warranties), while banking services are governed by statutes such as the Truth-in-Lending Act (15 U.S.C. §§ 1601–1666; Regulation Z, 12 C.F.R. pt. 226—credit cards) and the Electronic Funds Transfers Act (15 U.S.C. §§ 1693–1693r; Regulation E, 12 C.F.R. pt. 205), and buttressed by the Fair Credit Reporting Act (see chapter 4 of this book) and the Gramm-Leach-Bliley Act (see chapter 5 of this book).

Once Internet usage exploded in the mid-1990s, hybrid commerce evolved to include both goods and services, such as online "B2B" and "C2C" auctions and exchanges in which the difference between consumers and merchants often blurred, challenging the separation of business practice from traditional protections for consumers. "Cloud" ("distributed, multi-tenant") sites have become successful, offering software, platform, and infrastructure (storage) as services, involving multiple relationships, often distributed anonymously. This chapter reviews some of the consumer/retail-facing contracting used to manage issues arising out of hybrid commerce in the Internet age.

* Assisted in update by Quincy Kayton, Director of Services, Volunteer Lawyers for the Arts of Massachusetts.

§ 11.2 CONTRACTING CONSIDERATIONS

The Uniform Electronic Transactions Act (UETA), developed by the National Conference of Commissioners on Uniform State Laws (NCCUSL) and available at http://www.uniformlaws.org/shared/docs/electronic%20transactions/ueta_final_99.pdf, has been adopted in forty-seven states (Illinois, New York, and Washington are holdouts). The federal Electronic Signatures in Global and National Commerce Act of 2000 ("E-Sign"), 15 U.S.C. §§ 7001–7006, defers to UETA (15 U.S.C. § 7002(a)(1)). UETA counteracts residual suspicion of (and Statute of Frauds questions about) electronic means of "signing" ("electronic signatures"), and it validates such means, including simply sending an electronic message of affirmation, such as a faxed signature or an e-mail signed with a typed name, as long as the procedure is agreed to by the parties. UETA embraced "technology neutrality" and does not require stronger "authentication" or "nonrevocation" devices such as "digital signatures," which involve "hashed" messages (digests) encrypted to guarantee the source and integrity of the message.
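For readers unfamiliar with the mechanics, the following Python sketch illustrates the stronger "digital signature" device that UETA declines to require. It assumes the third-party "cryptography" package; the message text is hypothetical. The signer's private key signs (a digest of) the message, and anyone holding the public key can confirm both the source and the integrity of what was signed.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"I accept the terms of sale."
    signature = private_key.sign(message)   # hashes the message, then signs the digest

    try:
        public_key.verify(signature, message)          # succeeds: source and integrity confirmed
        public_key.verify(signature, message + b"!")   # any alteration raises InvalidSignature
    except InvalidSignature:
        print("Message was altered or was not signed by the key holder.")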

2nd Edition 2018 | MCLE, Inc.

Consumer/Retail Issues

§ 11.2
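To make the mechanics concrete, the following is a minimal sketch of the hash-then-sign “digital signature” just described. It assumes a recent version of the third-party Python “cryptography” package (an assumption of convenience; any comparable toolkit works, and nothing here is specific to UETA or E-Sign). The message is reduced to a SHA-256 digest, the digest is signed with the sender’s private key, and verification with the matching public key fails if either the source or the content has changed.

    # Hash-then-sign sketch using the third-party "cryptography" package
    # (pip install cryptography; recent versions need no backend argument).
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    message = b"I accept the terms of the agreement dated June 1, 2018."

    # Sign: a SHA-256 digest of the message, encrypted under the private key.
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    signature = private_key.sign(message, pss, hashes.SHA256())

    # Verify: raises cryptography.exceptions.InvalidSignature if the message
    # was altered in transit or was signed by a different key.
    public_key.verify(signature, message, pss, hashes.SHA256())
    print("signature verified: source and integrity are intact")

Because verification requires only the public key, the recipient can confirm both who sent the record and that it was not altered, which is the “authentication” and integrity guarantee that distinguishes a digital signature from a typed name.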

Uniform Commercial Code (UCC) Article 2B, proposed to address licensing of “information,” was downgraded and promulgated by NCCUSL as the Uniform Computer Information Transactions Act (UCITA). See http://www.uniformlaws.org/shared/docs/computer_information_transactions/ucita_final_02.pdf. UCITA was adopted only in Virginia (Va. Code Ann. §§ 59.1-501.1 to 59.1-509.2) and Maryland (Md. Com. Law Ann. §§ 22-101 to 22-816); Iowa, North Carolina, Vermont, and West Virginia enacted constitutionally questionable rejections of choice of UCITA law. Nonetheless, seventeen years after its promulgation, UCITA remains the most complete framework for online contracting.

At the heart of UCITA is the formation of a contract upon “manifestation of assent after opportunity to review” a record of terms. UCITA §§ 112, 113. This was a codification of the “clickwrap” contract mechanism that received critical support in ProCD, Inc. v. Zeidenberg, 86 F.3d 1447 (7th Cir. 1996) (manifestation of assent by opening shrink-wrapped box with printed terms included and by using the information product that displayed a reference to the terms each time loaded). For nearly two decades after ProCD, terms presented through hyperlinks in close proximity to clicked “I accept” buttons (clickwrap) were universally enforced, and only a handful of cases had judicially rejected online terms—cases in which consumers were presented with a “browsewrap” page that did not expressly query for acceptance of terms but simply provided a passive link on the page, assuming mere browsing would be the assent. See, e.g., Specht v. Netscape Communications Corp., 306 F.3d 17 (2d Cir. 2002) (arbitration clause not enforced against consumer).

In 2014, the Ninth Circuit sparked some revisiting of the issue of adoption of browsewrap terms, specifically consumer arbitration terms, in Nguyen v. Barnes & Noble, Inc., 763 F.3d 1171, 1178–79 (9th Cir. 2014), holding that,

    where a website makes its terms of use available via a conspicuous hyperlink on every page of the website but otherwise provides no notice to users nor prompts them to take any affirmative action to demonstrate assent, even close proximity of the hyperlink to relevant buttons users must click on—without more—is insufficient to give rise to constructive notice . . . . [T]he onus must be on website owners to put users on notice of the terms to which they wish to bind consumers.

Similarly, in Nicosia v. Amazon.com, Inc., 834 F.3d 220 (2d Cir. 2016), the Second Circuit reversed a dismissal compelling arbitration of a consumer’s claim where the message presented—“by placing your order, you agree to Amazon.com’s . . . conditions of use”—was not conspicuous and the hyperlink to the terms was easily lost among a score of other hyperlinks and graphic distractions on the page.

While most courts have divided the cases into clickwraps that are generally enforceable and browsewraps that require more scrutiny to find “constructive notice” of the terms, the court in Berkson v. Gogo LLC, 97 F. Supp. 3d 359, 394–401 (E.D.N.Y. 2015), identified “hybrid” schemes, including “scrollwrap,” entailing scrolling through the terms before proceeding to use, and “sign-in-wrap,” which “is designed so that a user is notified of the existence and applicability of the site’s ‘terms of use’ when proceeding through the website’s sign-in or login process.” In denying the enforcement of arbitration terms hyperlinked next to a checkbox for “I agree to the Terms of Use” on a page requiring entry of information for registration for an in-flight Wi-Fi service, the court set forth a “four-part inquiry in analyzing sign-in-wraps, and electronic contracts of adhesion generally”: whether

• (1) “the user was aware [of being bound] to more than an offer of services or goods in exchange for money”;
• (2) “the design and content of the website [made] the ‘terms of use’ . . . readily and obviously available to the user”;
• (3) “the importance of the details of the contract [was] obscured or minimized by the physical manifestation of assent expected of a consumer”; and
• (4) the merchant draws to the user’s attention alterations to the expectation of availability of, including the forum for seeking, remedies.

Berkson v. Gogo LLC, 97 F. Supp. 3d at 402. The same court enforced the arbitration terms where the subscribers were “sophisticated businessmen” who had repeated the registration process on multiple occasions and received e-mail confirmation with hyperlinks to the terms. Salameno v. Gogo Inc., No. 16-CV-0487, 2016 U.S. Dist. LEXIS 88166 (E.D.N.Y. July 7, 2016).

Another court, in Cullinane v. Uber Technologies, Inc., No. 14-14750, 2016 U.S. Dist. LEXIS 89540 (D. Mass. July 11, 2016), while discussing the Berkson taxonomy of “wrap” contracts, dismissed a proposed class action of Uber ride-sharing service customers who signed up for the service in a series of smartphone screens that presented terms with a button linking to ten pages of terms (including an arbitration clause), concluding with a final “done” button. Finding reasonable notice, the court declined to apply the Berkson four-part inquiry, as “imposition of such transactions costs for the contract validation process [would] make otherwise legally compliant arbitration agreements for online contracts all but impossible to enforce.” Cullinane v. Uber Techs., Inc., 2016 U.S. Dist. LEXIS 89540, at *21–22. However, in Meyer v. Kalanick, 199 F. Supp. 3d 752 (S.D.N.Y. 2016), the court refused to compel arbitration based on Uber’s customer sign-up procedure, noting the small font size of the notice and that Uber’s expert declaration describing the sign-up procedure did not attest to the actual display of the terms and conditions upon the clicking of the button referring to them. Meyer v. Kalanick, 199 F. Supp. 3d at 760 n.5. In another case involving the same ride-share customer sign-up, Cordas v. Uber Technologies, Inc., No. 16-cv-04065, 2017 U.S. Dist. LEXIS 22566, at *9–10 (N.D. Cal. Jan. 5, 2017), the court followed Cullinane to compel arbitration, finding the procedure not to be an unenforceable browsewrap because the customer had to affirmatively assent to Uber’s terms and conditions by clicking on “done” to complete the sign-up process. But see Metter v. Uber Techs., Inc., No. 16-cv-06652, 2017 U.S. Dist. LEXIS 58481 (N.D. Cal. Apr. 17, 2017) (pop-up keyboard blocked access to terms of service; arbitration not compelled).

In Devries v. Experian Information Solutions, Inc., No. 16-cv-02953, 2017 U.S. Dist. LEXIS 26471 (N.D. Cal. Feb. 24, 2017), the court enforced an arbitration clause among terms and conditions hyperlinked in contrasting blue text placed just before the “order” button. Again choosing between the traditional characterizations of browsewrap and clickwrap, the court found that the placement of the link just above the order button made this presentation closer to the generally enforceable clickwrap. Devries v. Experian Info. Solutions, Inc., 2017 U.S. Dist. LEXIS 26471, at *13–18 (distinguishing Nicosia v. Amazon.com, Inc., 834 F.3d 220 (2d Cir. 2016), as involving too many links on the webpage to other matters distracting from the terms).

Practice Note
Synthesizing the treatment of standardized contracts in Section 211 of the Restatement (Second) of Contracts, the Uniform Commercial Code, and twenty-plus years of consumer contracts presented in shrinkwrap, clickwrap, and browsewrap, the reporters on a pending project of the American Law Institute (ALI) to “restate” the law of consumer contracts have identified as prominent two “techniques” to address the potential of abuse in these “asymmetric contracting environments”: [1] “the doctrine of mutual assent—the rules that determine how terms are adopted and which processes a business can use to introduce and to modify terms in the agreement” and [2] “the use of mandatory restrictions over the substance of the deal—rules that limit the discretion of the business in drafting contract terms and set boundaries to permissible contracting,” such as unconscionability. Project Feature: Restatement of the Law, Consumer Contracts, The ALI Adviser, http://www.thealiadviser.org/consumer-contracts/ (visited Aug. 3, 2017) (quoting Reporters’ Introduction, “Draft Restatement of the Law of Consumer Contracts” (Amer. L. Inst. Discussion Draft Apr. 17, 2017)). The draft itself is available to ALI members at https://www.ali.org/smedia/filer_private/14/28/14287efe-d8ae-47ec-816e-c544d950e8a6/consumer_contracts_dd_-_online.pdf. Although “[i]t is both irrational and infeasible for most consumers to keep up with the increasingly complex terms provided by businesses,” the reporters observed that “the courts have been reluctant to place special impediments to their enforcement” but have “set boundaries on permissible contracting.” “Draft Restatement of the Law of Consumer Contracts” (Amer. L. Inst. Discussion Draft Apr. 17, 2017).

Thus, online terms are generally enforceable if they are made available to a visitor to a webpage and some action is required of the visitor to demonstrate assent. Preferably, the process should be arranged to make it impossible for a consumer to continue using the site without such a manifestation of assent. Using this mechanism, Internet service providers and other operators of websites, including online vendors, have great discretion as to how they manage interactions with their visitors and customers. They are, of course, subject to federal and state statutes and regulations protective of consumers, including the ones mentioned in the introduction to this chapter and otherwise reviewed in this book, such as the Electronic Communications Privacy Act (ECPA) in chapter 2.

A requirement for contracting is the individual’s capacity to contract. The age of contracting capacity (and general capacity, with exceptions for certain obligations) is eighteen in all but three states (in Alabama and North Dakota it is nineteen; in Mississippi, twenty-one). Thus, most commercial sites require that the user be eighteen years of age or older. Of course, many social media sites appeal to younger users, and those sites accept the risk that certain obligations of their terms may not be enforced. Sites that are “directed” to children under thirteen years of age are subject to further regulation under the Children’s Online Privacy Protection Act (COPPA) and require parental notice and “verifiable consent” for collection of information. See chapter 7, § 7.4, and chapter 8, § 8.4.4, of this book.
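By way of illustration only, the following is a minimal sketch of a clickwrap-style assent gate reflecting the points above: the site refuses to proceed without an affirmative act of assent, screens for contracting age, and preserves a record of who assented, when, and to which version of the terms. It assumes the third-party Flask package; the route, field names, and terms version are hypothetical, not drawn from any actual site.

    # Clickwrap-style assent gate (illustrative sketch, not legal advice).
    # Assumes Flask (pip install flask); routes and fields are hypothetical.
    from datetime import datetime, timezone

    from flask import Flask, abort, request

    app = Flask(__name__)
    TERMS_VERSION = "2018-06-01"  # version label of the posted terms (assumed)

    @app.route("/register", methods=["POST"])
    def register():
        # Clickwrap, not browsewrap: refuse to proceed without an
        # affirmative act manifesting assent to the hyperlinked terms.
        if request.form.get("accept_terms") != "yes":
            abort(400, "You must affirmatively accept the Terms of Use.")
        # Capacity screen: most commercial sites require users 18 or older.
        age = request.form.get("age", "")
        if not age.isdigit() or int(age) < 18:
            abort(400, "You must be 18 years of age or older to register.")
        # Preserve evidence of assent: who, when, and which terms version.
        app.logger.info(
            "assent recorded: user=%s terms=%s at=%s ip=%s",
            request.form.get("email", "unknown"),
            TERMS_VERSION,
            datetime.now(timezone.utc).isoformat(),
            request.remote_addr,
        )
        return f"registered under terms version {TERMS_VERSION}"

    if __name__ == "__main__":
        app.run()

The design choice worth noting is the record-keeping: cases such as Meyer turn on whether the operator can prove what the user actually saw, so logging the terms version and a timestamp at the moment of assent can matter as much as collecting the click itself.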

§ 11.3 DRAFTING TERMS

Generally, practitioners or their clients will copy the terms of established Web enterprises in the belief that those enterprises have expended the effort and gained the experience necessary to get the terms and procedures “right”—a luxury not enjoyed by the new entrant. While that may be true, and putting aside copyright and fair-use issues, it is dangerous to simply copy terms because the new enterprise may differ critically from the established one. And in any case, use patterns may “morph” for both established and new enterprises responding to both consumer and provider preferences, strategies, and innovations.

A practice that has developed is that a general set of terms—“terms of service,” “terms of use,” “customer agreement,” “account agreement”—sets forth terms that address all the significant issues the site operator (or counselor) can imagine may be raised. This is generally “lawyered” with some precision to stand up in court or in arbitration should the issue be presented, with only secondary regard for understanding by the user/customer. Particularly with cloud service providers such as Amazon.com through its popular Amazon Web Services (AWS), there may be multiple “hubs” for general terms, e.g., the AWS Customer Agreement (see https://aws.amazon.com/agreement/), and terms specific to certain products or services, e.g., the AWS Service Terms (see https://aws.amazon.com/service-terms). See also http://www.microsoft.com/en-us/servicesagreement. Cloud services providers provide distributed network resources on a multi-tenant basis (i.e., more than one customer, for example, on a server) and thus face not only damage to their network resources but also liability that may be created by their direct customers (who may provide services to the consumer). Thus, they will typically impose specific security obligations on the direct customers as well as the responsibility to police their users. See, e.g., AWS Customer Agreement § 4 (holding the customer responsible for all activities occurring under the customer’s account and requiring the customer to take appropriate and necessary actions to secure, protect, and back up accounts).

Operators with greater resources will try to accomplish effective legal and notice purposes by layering the terms in different levels of detail, but that may bury important explanations and mislead consumers. Recently, LinkedIn has presented its user agreement in two formats—a clickable link to a summary of recent changes or updates and, on the right, a table of contents that will direct the user to the specific heading when clicked on. See https://www.linkedin.com/legal/user-agreement. These refinements are a valuable task performed by “user experience” (UX) professionals.

A separate set of notices serves the different function of notifying users and/or customers of company information required to comply with statutes or regulations, such as privacy notices or policies, or to maintain the site owner’s character. These need to be understood by the users and need to be current and correct, as these statements are the ones the Federal Trade Commission scrutinizes under its “unfair or deceptive acts or practices” jurisdiction (see chapter 8, § 8.5, of this book). See “Snapchat Settles FTC Charges That Promises of Disappearing Messages Were False,” FTC Press Release, May 8, 2014, available at https://www.ftc.gov/news-events/press-releases/2014/05/snapchat-settles-ftc-charges-promises-disappearing-messages-were (not strictly privacy policy); see also “Operators of AshleyMadison.com Settle FTC, State Charges Resulting From 2015 Data Breach that Exposed 36 Million Users’ Profile Information,” FTC Press Release, December 14, 2016, available at https://www.ftc.gov/news-events/press-releases/2016/12/operators-ashleymadisoncom-settle-ftc-state-charges-resulting. The common classes of notices are discussed below.

§ 11.3.1 Privacy Policy or Notice

These policies (discussed below in § 11.4) generally advise that personal information may be collected in some form and used or that “cookies” may be deposited in the user’s browser program to provide identifying information to the site or track the browser’s visits to other sites—typically for behaviorally targeted advertising or for solicitation by associated businesses under the rules described in this book. They may provide for “opt-out” procedures (or, for European users, “opt-in” procedures) or warn that disabling cookies may adversely affect the site “experience.” Sometimes a separate “cookie policy” is provided.
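Mechanically, “depositing a cookie” is nothing more than the site instructing the browser to store and return an identifier on later requests. The following is a minimal sketch, assuming the third-party Flask package; the cookie names and attributes are illustrative, and the sketch honors a previously deposited opt-out cookie of the kind an opt-out procedure might set.

    # Cookie-deposit sketch (Flask assumed; names are illustrative).
    import uuid

    from flask import Flask, make_response, request

    app = Flask(__name__)

    @app.route("/")
    def index():
        visitor_id = request.cookies.get("visitor_id")
        resp = make_response(f"hello, visitor {visitor_id or 'unknown'}")
        # Deposit an identifier only for new browsers that have not opted out.
        if visitor_id is None and request.cookies.get("opt_out") != "1":
            resp.set_cookie(
                "visitor_id",
                uuid.uuid4().hex,
                max_age=60 * 60 * 24 * 365,  # persist for one year
                secure=True,                 # send only over HTTPS
                httponly=True,               # unreadable by page scripts
                samesite="Lax",              # limit cross-site sending
            )
        return resp

    if __name__ == "__main__":
        app.run()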

§ 11.3.2 Acceptable Use Policy

These policies, sometimes called “rules” or “community guidelines,” generally set the standards for the use of a site. Facebook’s “community standards” include layered explanations of its monitoring and policies toward “direct threats,” “dangerous organizations,” “self-injury,” “bullying and harassment,” “criminal activity,” “regulated substances,” “sexual violence and exploitation,” “nudity,” “hate speech,” and “violence and graphic content.” See https://www.facebook.com/communitystandards.

Unlike some other sites, Facebook requires the use of an authentic identity to ensure the accountability of its users. See https://www.facebook.com/communitystandards#using-your-authentic-identity. Facebook’s authentic user business model was advocated in its negotiations with NCCUSL on the Uniform Fiduciary Access to Digital Assets Act (see chapter 16, § 16.4, of this book), resulting in adoption of the use of an “online tool” to grant or deny consent to a user’s survivors or personal representative (executor) to have limited access to the user’s account (i.e., to download memorialized postings) after the user passes away. See https://www.facebook.com/help/contact/228813257197480.

Again, cloud services providers are particularly sensitive about possible network abuse, including “monitoring or crawling” of network sites. See, e.g., AWS Acceptable Use Policy, available at http://aws.amazon.com/aup/.

§ 11.3.3 Intellectual Property Notices

The most important and prevalent of intellectual property notices is the “take-down” procedure of the Digital Millennium Copyright Act (DMCA), 17 U.S.C. § 512. This procedure enables copyright owners to require a site operator to “take down” postings of infringing materials (and provides the less-often-used procedure to repost materials that have been taken down). The importance of the procedure is to provide the site operator a “safe harbor” from a copyright infringement action and resulting liability. To qualify for a “safe harbor,” a site operator must designate a registered agent with the Copyright Office and provide clear instructions for a copyright owner to follow when reporting infringing materials. See “DMCA Notice-and-Takedown Processes: List of Good, Bad, and Situational Practices,” USPTO, available at https://www.uspto.gov/sites/default/files/documents/DMCA_Good_Bad_and_Situational_Practices_Document-FINAL.pdf.

Also included among intellectual property notices, sometimes separately, are explanations of “fair use” (17 U.S.C. § 107), a listing of patents (to meet the 35 U.S.C. § 287(a) “marking” requirement for infringement damages), and a trademark policy of how the operator’s trademarks may or may not be used.

§ 11.4 PRIVACY ISSUES—NOTICE CONTENTS

The greatest concern of most consumers is theft of their identity through misuse of their personal information by the recipient/collector or through breach of data storage security of the recipient/collector. Data security of stored personal information is the first focus of this volume. See chapters 2–6, 8, and 10 of this book. Relatively little has surfaced concerning electronic diversion in transmission of personal information by anyone other than law enforcement or government intelligence agencies—addressed by statutes such as the ECPA (see chapter 2 of this book)—and certain service providers that may use “deep sniffing” of transmitted or stored information with user consent typically obtained through online terms. Malware such as keystroke loggers implanted into consumer computers tends to be used to access online accounts that are still controlled by the site operator. In the context of a consumer-facing website or service, there is relatively little the consumer can do other than scrub the consumer’s computer for malware, understand what information is collected, and correct, delete, or limit its use, if possible.

The other major issue for consumers is the “right to be left alone” privacy intrusion of unwanted solicitation. This intrusion was originally space- and time-wasting junk mail, but it morphed into resource-wasting (money and time) junk faxes and telephone solicitations and eventually camouflage for “phishing” and insertion of malware. Attempts to limit these intrusions have been made in the credit-reporting statutes (see chapters 4 and 5 of this book), in CAN-SPAM (see chapter 7, § 7.5), in COPPA (see chapter 7, § 7.4, and chapter 8, § 8.4.4), in the TCPA, and in state laws (see chapter 9, § 9.2.3). However, the “creepiness” of precisely targeted advertisements appears to be tolerated by most consumers in exchange for the convenience of free, location-directed services funded by such advertisements.

What has evolved with the rise of social media is consumers’ (as well as enterprises’) not-infrequent desire to limit accessibility to or “erase” their own postings (perhaps unintended for a larger audience) or those of others. With abuses such as bullying and “revenge porn” justifying that desire, the existing tools of copyright and defamation law have limited effect when postings can be downloaded, distributed, and automatically backed up across the web. States have responded to some of these issues, most widely in the “social media privacy” laws aimed to prevent coerced access by educators and employers to the private accounts of those who spend (or hope to spend) substantial portions of their lives in their orbits (see chapter 9, § 9.2.3(c), of this book). In addition, thirty-eight states and Washington, D.C., have passed statutes criminalizing acts of “revenge porn” to various degrees. See “38 States + DC Have Revenge Porn Laws,” available at https://www.cybercivilrights.org/revenge-porn-laws/ (visited Aug. 12, 2017). While all states have statutes criminalizing bullying, twenty-three states have specifically included “cyberbullying” and forty-eight states have included “electronic harassment” in their codes. See StopBullying.gov, available at https://www.stopbullying.gov/laws/index.html (visited Aug. 12, 2017); see also Bullying Laws Across America, available at https://cyberbullying.org/bullying-laws (visited Aug. 12, 2017).

Following are components of privacy notices or policies drawn from the following major consumer sites:

• Amazon.com (including AWS): https://aws.amazon.com/privacy/
• Apple: http://www.apple.com/privacy/privacy-policy
• Bank of America: https://www.bankofamerica.com/privacy/online-privacy-notice.go
• Facebook: https://www.facebook.com/about/privacy
• Google: http://www.google.com/policies/privacy
• LinkedIn: https://www.linkedin.com/legal/privacy-policy
• Microsoft Services: https://privacy.microsoft.com/en-us/privacystatement

Amazon.com, Apple, and Microsoft Services include retail sales/licensing. Bank of America covers primarily banking services. Facebook, Google, and LinkedIn are supported by advertising.

§ 11.4.1 Data Collected

Amazon.com lists “information you give us” (account information, answers to questionnaires), “automatic information” (cookies), “mobile” (location information), “e-mail communications” (opening, comparison to other company customer lists), and “information from other sources.”

Apple generally describes “personal information” as information necessary for creating an Apple ID and information about people with whom the user shares content. It also collects “nonpersonal information” such as activities (including search queries and aggregated data) and uses cookies and “pixel tags” (web beacons); a minimal sketch of the pixel-tag technique appears after the vendor summaries below.


Bank of America collects information provided through forms, surveys, applications, and online fields.

Facebook collects “content and other information” provided by the user and others, including how services are used, how the user interacts within the network, device information (including location), information from other websites and apps that use the services, and information from third-party partners and other Facebook companies.

Google collects “information you give us” and “information we get from use of our services” (e.g., device information, network log information, location information, “unique application numbers,” local storage, and “cookies or similar technologies”).

LinkedIn provides a detailed, nine-subsection Section 1 on the information it collects and provides a right-column summary of each subsection. It is specific about the “data controller” (Section 1.1) and the collection of registration information (Section 1.1), profile information (Section 1.1), address book and other synched services (Section 1.1), customer service (Section 2.7), use of LinkedIn sites and applications (Section 1.6), third-party interaction linked through the LinkedIn account (Section 3.3), cookies (Section 1.4), and other advertising technologies, including web beacons (Section 1.9), log files and device information (Section 1.10), and a reservation for new technologies (Section 1.11).

Microsoft Services lists name and contact data, credentials, demographic data, interests and favorites, payment data, usage data, contacts and relationships, location data, content (the subject line and body of an e-mail, the text or other content of an instant message, the audio and video recording of a video message, and the audio recording and transcript of a received voice message), and customer support as information collected for use.
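As a concrete illustration of the “pixel tags” (web beacons) noted in the Apple summary above, the following minimal sketch shows the technique: a tiny image whose retrieval, rather than its content, reports the visit or e-mail open. It assumes the third-party Flask package; the endpoint and query parameter are illustrative, not any particular vendor’s implementation.

    # Web-beacon ("pixel tag") sketch (Flask assumed; names are illustrative).
    import base64

    from flask import Flask, Response, request

    app = Flask(__name__)

    # A canonical 1x1 transparent GIF, base64-encoded.
    PIXEL = base64.b64decode(
        "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
    )

    @app.route("/beacon.gif")
    def beacon():
        # Fetching the "image" is what reports the event; the request itself
        # carries the identifying metadata that is logged for analytics.
        app.logger.info(
            "beacon hit: ip=%s ua=%s campaign=%s",
            request.remote_addr,
            request.headers.get("User-Agent"),
            request.args.get("c"),  # e.g., <img src="/beacon.gif?c=spring18">
        )
        return Response(PIXEL, mimetype="image/gif")

    if __name__ == "__main__":
        app.run()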

§ 11.4.2 Use of Data (Including Sharing)

Amazon.com lists potential sharing of information with “affiliated businesses we do not control,” third-party service providers, use in promotional offers, business transfers, “protection of Amazon.com and others” (legal), and other information only “with your consent.”

Apple generally describes “keeping you posted” on product updates, helping to create and improve products, and identifying users and appropriate services. It disclaims marketing use with third parties but provides information to other service providers to provide the services and for legal purposes.

Bank of America provides a lengthy and specific list of uses, including marketing, and a link to a separate “consumer privacy notice” for a table of sharing scenarios, with affiliate and nonaffiliate marketing uses susceptible to “opting out.” It also explains both relationship-based advertising and online behavioral advertising for tailored advertisements via banner ads, splash ads, e-mail, postal mail, telemarketing, and on unaffiliated sites and mobile apps.


Facebook lists uses for providing, improving, and developing services, communicating with the user, showing and measuring ads and services, and promoting safety and security. It reminds the user that information posted for public visibility, including a public profile, is globally available, as is content others share with the user. Apps, websites, and third-party integrations on or using Facebook services also have access to a user’s personal information. Facebook states that it does not use information that “personally identifies you” for advertising or analytics.

Google uses collected data for developing products, to “protect Google and our users,” and to offer “tailored content.” It provides choices for limiting sharing through visibility settings and uses cookies and pixel tags, but it does not apply “sensitive categories.” Notably, Google states:

    Our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. We may combine personal information from one service with information, including personal information, from other Google services—for example to make it easier to share things with people you know. We will not combine DoubleClick cookie information with personally identifiable information unless we have your opt-in consent.

In essence, Google relies upon its privacy policy to access every e-mail within its system. While this may be consent on the part of the account holder, query whether there is consent on the part of the sender of the communication.

LinkedIn provides a detailed, nine-subsection Section 2 on use of the information it collects and provides a right-column summary. It is specific about consent and the possibility of withdrawal (Section 2.1), providing use of information for system communications (Section 2.3), user communications (Section 2.3), and service development and customization (Sections 2.4 and 2.6). LinkedIn provides for sharing of information with its affiliates (Section 2.5) and with third parties through the “public profile” feature (Sections 2.4 and 2.6). Users are advised of third parties using LinkedIn platform services (Section 2.7), of polls and surveys conducted by LinkedIn, other users, or third parties (Section 2.6), of use of information for user searches (Sections 2.1 and 2.2), of participation in groups (Section 2.10), and of the use of testimonials and advertisements (Section 2.1) for its special functionality for recruiting, marketing, and sales (Section 2.5) and pages for companies, schools, influencers, and other entities (Section 2.8).

Microsoft Services generally recites the use of information to provide and improve services, for security, safety, dispute resolution, business operations, and provider-user communications. It is more detailed about advertising, explaining its interest-based advertising under industry codes (for which it limits data retention to thirteen months), specifying certain “nonsensitive” health-related targeted advertisements, and disclaiming advertising to children younger than thirteen years of age. Microsoft Services shares with advertisers reports about data collected on their sites and cooperates with advertising companies and their use of web beacons.

§ 11.4.3 User Access and Options

Amazon.com gives examples of “information you can access,” which essentially are account and preference settings, wish lists, and other information supporting Amazon’s business. The “choices” provided are relative to “customer communication preferences” and “advertising preferences.” Cookies may be turned off, but doing so disables essential functionality of Amazon.com.

Apple allows access to update and correct Apple ID information and to request other access through a “privacy contact form.” There is an option to opt out of the mailing list for product announcements, but not for communications relating to terms and conditions.

Bank of America provides for updating of account information using the “contact us” option or other contact numbers. It allows opting out of behavioral advertising and direct marketing and provides a link to the National Advertising Initiative’s Opt-Out Tool.

Facebook allows management of content and information using its “Activity Log” tool and downloading of information with its “Download Your Information” tool. The account may be deleted, and Facebook undertakes to delete all postings (but is silent about other information). Privacy after death is handled by a default memorialization upon notice of death, by deletion by advance setting, or by allowed download by designation of a legacy contact (see § 11.3.2, above).

Google provides, in its “transparency and choice” section, for review and update of Google activity controls and for review and control of certain types of information using Google Dashboard. More generally, under the section “accessing and updating your personal information,” Google explains:

    Whenever you use our services, we aim to provide you with access to your personal information [link to use of Google Dashboard]. If that information is wrong, we strive to give you ways to update it quickly or to delete it—unless we have to keep that information for legitimate business or legal purposes. . . . We may reject requests that are unreasonably repetitive, require disproportionate technical effort (for example, developing a new system or fundamentally changing an existing practice), risk the privacy of others, or would be extremely impractical (for instance, requests concerning information residing on backup systems). Where we can provide information access and correction, we will do so for free, except where it would require a disproportionate effort. We aim to maintain our services in a manner that protects information from accidental or malicious destruction. Because of this, after you delete information from our services, we may not immediately delete residual copies from our active servers and may not remove information from our backup systems.

LinkedIn provides at Section 4 (“Your Choices & Obligations”) a statement of rights (Section 4.2) and an explanation of data retention (Section 4.1), specifically as follows:

    We retain the personal data you provide while your account is in existence or as needed to provide you Services. Even if you only use our Services when looking for a new job every few years, we will retain your information and keep your profile open until you decide to close your account. In some cases we choose to retain certain information (e.g., visits to sites carrying our “share with LinkedIn” or “apply with LinkedIn” plugins without clicking on the plugin) in a depersonalized or aggregated form.

It explains that some shared content will remain but will be depersonalized.

Microsoft Services provides a section on “How to Access & Control Your Personal Data.” In the section are links to various services to edit profile data. Provision is made for opting out of Microsoft promotional communications and interest-based advertising from Microsoft. Also explained are controls for tracking protection relative to third-party content, but Microsoft explains that it is not yet in agreement with industry on how its sites should respond to a “do not track” signal.

§ 11.5 CONCLUSION

It may be seen from the above sampling that control over one’s personal information used in online commerce is limited, even more so for information posted on sites accessible by the public or even smaller groups. To some extent, use of residual, even if anonymized, information is important to the success of the business models of service providers, which are therefore reluctant to cede control. As a practical result of these models, what has been made accessible on the Internet cannot be truly “erased” and thus may not be fully controllable by the individual consumer.


CHAPTER 12

Network Service Providers*

Sara Yevics Beccia, Esq.
Hasbro, Inc., Pawtucket, RI

§ 12.1 Introduction .......................................................................... 12–2
  § 12.1.1 What Is a Network Service Provider? ............................... 12–2
  § 12.1.2 Representing a Business Consumer of Network Services ... 12–4
  § 12.1.3 Representing a Network Service Provider ........................ 12–5
§ 12.2 Relevant Statutes and Regulations ....................................... 12–5
  § 12.2.1 Federal Law—The Telecommunications Act of 1996 and the Communications Act of 1934 ... 12–5
    (a) Common Carriers Under Title II of the Communications Act ... 12–5
    (b) The Telecommunications Act of 1996—Distinguishing Between and Among the Different Types of Service Providers ... 12–6
    (c) The FCC and Regulation of the Internet ... 12–8
    (d) Provisions of the Telecommunications Act Related to Privacy and Security ... 12–10
  § 12.2.2 Federal Law—Title II of the Electronic Communications Privacy Act, the Stored Communications Act ... 12–11
  § 12.2.3 Federal Law—Communications Decency Act ... 12–12
  § 12.2.4 Massachusetts State Law and Regulations ... 12–12
§ 12.3 Negotiating Network Service Provider Contracts ................ 12–13
  § 12.3.1 Defining Privacy and Data Security Obligations in Service Provider Relationships ... 12–13
  § 12.3.2 Representing a Business Engaging a Service Provider—Setting the Framework for the Deal ... 12–14
    (a) Conduct a Data Audit and Engage in Data Mapping ... 12–15
    (b) Analyze and Understand Relevant Laws, Regulations, and Company Policies ... 12–15
    (c) Conduct and Document a Technical Assessment and Due Diligence of Service Provider ... 12–16

EXHIBIT 12A—Service Provider Data Security Questionnaire ........ 12–17

* Updated for the 2018 Edition by Stephen Y. Chow, Esq.


Scope Note

This chapter provides an overview of how the legal issues associated with privacy and data security law affect network service providers and their interactions with those who rely upon their services.

§ 12.1 INTRODUCTION

In the world of e-commerce and ubiquitous technology, nearly every business and individual relies on a network service provider on a daily basis. Nevertheless, many individuals, business owners, and their counsel are not fully aware of what network service providers are, what they do, or how they factor into the important issues of safeguarding private information and keeping data secure.

§ 12.1.1 What Is a Network Service Provider?

Network service providers are businesses or other organizations that offer and provide facilities or services at various physical segments and information organization layers to connect end-user nodes or terminals (such as a smartphone) in a communications network. In an earlier generation, these were primarily the providers of “long lines” (long-distance) and “local [exchange] offices” (connecting to user terminals) of “plain old telephone service” (POTS), a market largely monopolized, prior to the 1984 break-up, by the regulated American Telephone & Telegraph Company (previously known as “Ma Bell,” distinguished from the current AT&T, which is a recombination of some of the “Baby Bell” regional operating companies and the remnant of the original company). A major impetus for the negotiated break-up was the new competitive environment of more profitable microwave long-line connections, remote computer services, and competing networks such as cellular networks and community antenna television (CATV or “cable”) networks, some of which offered greater “bandwidth” (rate of data transfer).

Almost immediately after the break-up, concurrent with the beginnings of the “Internet”—which established a common language and addressing scheme for data communication in the Internet Protocol/Transmission Control Protocol (IP/TCP) and higher-level protocols such as those for e-mail (e.g., Simple Mail Transfer Protocol or SMTP) and graphics (e.g., Hypertext Transfer Protocol or HTTP, along with the Uniform Resource Locator or URL enabling the “World Wide Web”)—issues arose as to control by the Baby Bells of the “last mile” of the still-dominant copper telephone wire connection to households. This led to “unbundling” of that network element (as well as other provisions, such as protecting cellular telephone tower siting against local opposition) by the procompetition Telecommunications Act of 1996, discussed in some aspects at § 12.2.1 below. Through “reciprocal payments” of a share of billings for initiating land-line calls to the terminating entity, some early providers of dial-up access to the Internet—which terminated only dial-up calls from end users and did not initiate land-line calls, connecting instead through negotiated “bulk” broadband (typically cable) contracts—were in effect subsidized by the Baby Bells. Soon enough, however, dial-up access gave way to direct consumer broadband access.
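The protocol layering described above can be made concrete in a few lines. The following minimal sketch, using only the Python standard library and assuming outbound network access (the host name is illustrative), writes an application-layer HTTP request as plain text over a TCP (transport-layer) connection; the operating system supplies the IP (network-layer) routing underneath.

    # HTTP over TCP: a GET request written by hand over a socket.
    import socket

    HOST = "example.com"  # illustrative; any server listening on port 80

    # Transport layer: TCP connection (IP routing handled by the OS below).
    with socket.create_connection((HOST, 80), timeout=10) as sock:
        # Application layer: the HTTP request is just structured text.
        request = ("GET / HTTP/1.1\r\n"
                   "Host: " + HOST + "\r\n"
                   "Connection: close\r\n"
                   "\r\n")
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            chunks.append(chunk)

    status_line = b"".join(chunks).split(b"\r\n", 1)[0]
    print(status_line.decode("ascii"))  # e.g., "HTTP/1.1 200 OK"

Higher-level services (mail via SMTP, the web via HTTP) are in this sense simply different text protocols layered over the same TCP/IP transport, which is why a single access provider can carry all of them.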


Early Internet service providers (ISPs) included the following:

• access providers, beginning in the 1980s with acceptance of user communication via a computer modem (modulator-demodulator) communicating through an acoustical coupler/cradle for an analog telephone handset;
• mailbox providers;
• bulletin board (common access) providers;
• transit services, providing inter-network connections at a “point of presence” (“PoP”) physically located at an Internet exchange point or a collocation facility, the highest level being “Tier 1,” often under “peering agreements” to exchange traffic rather than billing per message; and
• hosting service providers.

The unbundling of the telephone networks and the leap-frogging of wired (including cable) networks by different mixes of wireless and fiber communication, along with the infinite software-protocol-based possibilities of layering or stacking functionalities even within the IP/TCP framework, multiplied and blurred the types of network service providers. The addition of features combined types of services. While the dial-up service provider was the archetype, simple access to a network able to route data packets to IP- or URL-addressed nodes (connection to the Internet) early on came to include e-mail services (composition, storage, and forwarding), later adding features such as spam filtering and, in some cases, listservs. Bulletin boards morphed into blog services and social media platforms. Graphics- and advertisement-filled portals were developed; webpages visited (downloaded and presented on the user’s browser) included links, even windows to other pages at different servers. More people now text on a short message service (SMS) on mobile devices or message within social media sites with a variety of features for sharing and “liking.” Simple hosted services still exist, but more businesses are availing themselves of “platform-as-a-service” or “infrastructure-as-a-service” in the “cloud services” model, touched upon in chapter 13 of this book.

Different law applies to these different network services with regard to the information they handle. As explained in § 2.5.3 through § 2.5.6 of this book, under Title II of the Electronic Communications Privacy Act of 1986, 18 U.S.C. §§ 2701–2711 (the Stored Communications Act or SCA), different rules for voluntary disclosure of information held in service to the public and for government-compelled disclosure apply where the network service provider is providing an electronic communication service (ECS), defined as “any service which provides to users thereof the ability to send or receive wire or electronic communications.” 18 U.S.C. § 2510(15). Certain privacy rules apply to the communication at issue if the provider is providing a remote computing service (RCS), “the provision to the public of computer storage or processing services by means of an electronic communication system.” 18 U.S.C. § 2711(2).

As explained in § 7.2.2 through § 7.2.5 of this book, under Section 230 of the Communications Decency Act (CDA), 47 U.S.C. § 230(c)(1), “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” where

    [t]he term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

47 U.S.C. § 230(f)(2). This gives interactive computer service providers considerably more protection against claims with an element of disclosure or publishing (defamation and even false or hurtful statements) than print publishers and broadcasters enjoy under free speech or conduit principles. See, e.g., Doe v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016), cert. denied, 137 S. Ct. 622 (mem.) (2017) (advertising supporting sex trafficking not actionable under federal anti-sex-trafficking law). Publishing activities such as curating, removal, and editing are protected. Certain carve-outs from the immunity are provided, notably for intellectual property (47 U.S.C. § 230(e)(2)) such as copyright infringement, although the courts are divided on whether trademark infringement or the right of publicity is carved out. The federal Defend Trade Secrets Act of 2016 provided that it would not be treated as intellectual property for the purposes of other federal statutes. Pub. L. No. 114-153, § 2(g), 130 Stat. 182, codified at 18 U.S.C. § 1833 note.

Because of the important role network service providers play in Internet services, individuals and businesses must depend on network service providers to ensure that the types of information stored, processed, transmitted, or accessed on or through the Internet remain protected or accessible as the users of the services intend. While most privacy and data security legal regimes focus on the obligations of businesses that collect or store private information, there are important aspects of both federal and state law that directly affect network service providers or those engaging them.

§ 12.1.2 Representing a Business Consumer of Network Services

When serving as counsel to a business consumer of network and Internet services with respect to privacy and data security matters, the role often evolves into a hybrid of legal and technical advisor. Counsel must understand not only the statutory and regulatory schemes that govern the business’s treatment of sensitive information but also the technical needs of the business, both in meeting those legal obligations and in providing the business’s services to its clients. As counsel, we often deal with in-house attorneys or executives, but in advising clients on privacy and security issues relating to service providers, the information technology (IT) department may become your most valued ally. IT professionals are in the best position to assess technical needs, vulnerabilities, and requirements related to safeguarding data provided to or processed by service providers. Counsel should be knowledgeable regarding the legal issues relating to researching and engaging service providers and should be prepared to work closely with IT professionals in order to properly frame those issues in a way that dovetails with the business’s technical setup.

§ 12.1.3 Representing a Network Service Provider

Counsel for network service providers are likewise required to be savvy with technical details and familiar with legal compliance requirements. In addition, counsel advising these types of clients need to keep abreast of the changing regulatory landscape associated with the provision of Internet services. Specifically, there appeared to be a trend during the Obama administration toward increased regulation of Internet providers by the federal government, notably the “net neutrality” rules discussed in § 12.2.1 below. The Trump administration appears to have halted at least parts of that trend, including net neutrality.

§ 12.2 RELEVANT STATUTES AND REGULATIONS

By their very nature, network service providers engage with businesses across a wide variety of industries, some of which are subject to strict regulations relating to privacy and data security due to the nature of the services they provide (e.g., health care, financial, etc.). Generally, network service providers do not have separate compliance obligations under these regimes but may be brought under their umbrella through their contractual relationships with covered entities. In these cases, network service providers may be contractually obligated to comply with specific rules or regulations. Counsel for service providers and for businesses that engage these providers should pay close attention to any contractual provisions that require service provider compliance with these regulatory schemes. (These regulatory schemes are discussed elsewhere in this book. See chapters 5, 6, and 13.)

Other than those statutes and regulations that may be imposed upon network service providers by virtue of their relationships and agreements with regulated entities as their clients, this chapter focuses on the relevant statutes and regulations that apply directly to service providers. Importantly, federal law regulating network service providers has evolved over the past ten years as technology has evolved and as legislation and case law have tried to keep pace with innovation. Developments in this area continue, including major changes in the regulatory landscape as recently as April 2015.

§ 12.2.1 Federal Law—The Telecommunications Act of 1996 and the Communications Act of 1934

(a) Common Carriers Under Title II of the Communications Act

Long before network service providers became the keystone of connectivity and communication as we know it, Congress passed the Communications Act of 1934, 47 U.S.C. § 151 et seq., with the purpose of setting forth a comprehensive federal statutory and regulatory regime for regulating the communications industry. In addition, the Act created the Federal Communications Commission (FCC) as the agency charged with implementing and enforcing the provisions of the Act. At that time, Congress and the FCC were focused primarily on “Ma Bell” and ensuring fairness and competition in the provision of telephone services.


Specifically, 47 U.S.C. § 151 sets forth the goals of the Act as follows:

    For the purpose of regulating interstate and foreign commerce in communication by wire and radio so as to make available, so far as possible, to all the people of the United States, without discrimination on the basis of race, color, religion, national origin, or sex, a rapid, efficient, nationwide, and world-wide wire and radio communication service with adequate facilities at reasonable charges, for the purpose of national defense, for the purpose of promoting safety of life and property through the use of wire and radio communication, and for the purpose of securing a more effective execution of this policy by centralizing authority heretofore granted by law to several agencies and by granting additional authority with respect to interstate and foreign commerce in wire and radio communication, there is hereby created a commission to be known as the “Federal Communications Commission”, which shall be constituted as hereinafter provided, and which shall execute and enforce the provisions of this Act.

Consistent with the goal of ensuring a communications network free of discrimination and in furtherance of the public good, Title II of the Communications Act of 1934 (47 U.S.C. §§ 201–276) establishes “common carriers”—entities that are subject to enhanced regulation with respect to the services they provide. Simply put, if an entity is designated as a common carrier, the public has a right to access its services under highly regulated conditions and without discrimination by the provider as to the end use of those services. The “traditional” common carriers are telephone companies. However, as discussed below, the importance of free access to and use of the Internet as a matter of public policy is leading to changes in the way the Internet is regulated under federal law.

(b) The Telecommunications Act of 1996—Distinguishing Between and Among the Different Types of Service Providers

With the Communications Act’s genesis in “traditional” telephone services, the FCC struggled in the early 1980s to apply the provisions of the existing telecommunications statutes to the evolving communications network of the Internet. To address the issue of electronic communications and regulation of network service providers in the early days of the Internet, the FCC developed two categories of entities to determine which aspects of the Communications Act (and, more specifically, the common carrier provisions of Title II) applied to technology and Internet services. In what became known as the Computer II regime, the FCC divided providers into “basic services” providers and “enhanced services” providers. See In re Amendment of Section 64.702 of the Commission’s Rules and Regulations, 77 F.C.C. 2d 384, 387 (1980). The classification of a provider depended upon the extent to which that provider was involved in processing information rather than simply transmitting information. Those services that involved “pure” transmission, such as telephone services, were “basic” services and were subject to the common carrier provisions of Title II. See In re Amendment of Section 64.702 of the Commission’s Rules and Regulations, 77 F.C.C. 2d at 420. On the other hand, “enhanced” services were those that involved “computer processing applications . . . used to act on the content, code, protocol, and other aspects of the subscriber’s information.” See In re Amendment of Section 64.702 of the Commission’s Rules and Regulations, 77 F.C.C. 2d at 420. Under this regime, network service providers’ activities in connecting an end user to the Internet were “enhanced” services not subject to Title II common carrier provisions. See In re Amendment of Section 64.702 of the Commission’s Rules and Regulations, 77 F.C.C. 2d at 420. Under the Computer II regime, if a provider simply transmitted information without any sort of processing or modification, it was considered a common carrier under the Communications Act.

In 1996, Congress codified this dual categorization of entities in the Telecommunications Act of 1996. Under the Telecommunications Act, entities are classified as either telecommunications carriers or information service providers. Telecommunications carriers—the “basic” providers under the FCC’s Computer II regime—are those that provide services that consist of “the transmission, between or among points specified by the user, of information of the user’s choosing, without change in the form or content of the information as sent and received.” 47 U.S.C. § 153(50). On the other hand, information service providers are those that offer “a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications.” 47 U.S.C. § 153(24). Like the Computer II classifications, the 1996 amendments to the Communications Act focused on what a service provider actually did with the data that crossed over its networks. For pure transmission services, the provider is a common carrier, but not for information services.

As discussed above, network service providers can encompass a variety of different types of providers—telecommunications providers, cable companies, Internet service providers, etc. The questions then arise: How are these entities classified? And why does it matter? Given the trend of convergence among technology, telecommunications, and information services, cable providers, Internet service providers, and telecommunications providers are increasingly becoming one and the same. As these services converge, providers tend to offer bundles of the types of services that were previously provided by different companies, including telephone services, cable television, and Internet access services. Further, the ways the public uses and relies upon the services of these providers overlap. Twenty years ago, telephone lines were the most important infrastructure for communication; as technology has evolved, fewer and fewer households even have a hard telephone line. Instead, the Internet has taken on a primary role in connecting people, places, and things. Consistent with this trend, the regulatory landscape has made fewer distinctions between “telecommunications” providers and “information services” providers, broadening the first category and narrowing the second.

In recognition of the importance of the Internet in society—and the necessary reliance upon network service providers in gaining access to the Internet—federal regulations have recently changed to capture Internet providers as common carriers.

(c) The FCC and Regulation of the Internet

Prior to 2015, only those Internet service providers that relied upon dial-up connections and provided broadband Internet service over telephone lines were classified as telecommunications services subject to Title II common carrier regulation. Broadband Internet providers that relied upon cable transmission were considered private carriers, relatively free from federal telecommunications regulation and allowed to contract, make business decisions, and provide services as any private business might. However, as access to the Internet became a more important aspect of everyday life, regulators faced increased pressure to exert more control over network service providers and, in particular, cable broadband providers. Specifically, a movement for an “open Internet” or “net neutrality” began to gather steam. Advocates for an open Internet advanced the position that broadband providers should be prohibited from making arbitrary and potentially discriminatory decisions with respect to their provision of Internet services. For example, advocates of this position warned that, without regulations in place to prevent it from doing so, a broadband provider could drastically slow down the streaming capabilities of certain users without warning or justification, impeding that user’s access to the Internet. A regime that required net neutrality would force broadband providers to treat all Internet traffic the same, regardless of source or content.

The FCC’s Comcast Order

The FCC’s first significant attempt to exert its jurisdiction over broadband providers was its attempt to regulate Comcast’s network management practices. See In re Formal Complaint of Free Press & Pub. Knowledge Against Comcast Corp. for Secretly Degrading Peer-to-Peer Applications, 23 F.C.C.R. 13,028 (2008). Specifically, the FCC attempted to bar Comcast from interfering with its customers’ use of peer-to-peer networking applications, relying upon the FCC’s general authority under the Communications Act of 1934 to “perform any and all acts, make such rules and regulations, and issue such orders . . . as may be necessary in the execution of its functions,” and on Section 706 of the Telecommunications Act relating to “advanced telecommunications capability.” In re Formal Complaint of Free Press & Pub. Knowledge Against Comcast Corp. for Secretly Degrading Peer-to-Peer Applications, 23 F.C.C.R. 13,028 (2008). In striking down the FCC’s Comcast order, the Court of Appeals for the D.C. Circuit found that such regulation of Comcast’s Internet service was outside the scope of the FCC’s authority. Comcast Corp. v. FCC, 600 F.3d 642, 661 (D.C. Cir. 2010).

The 2010 Open Internet Order

Two years later, the FCC once again tried to bring broadband providers under its jurisdiction in its order In re Preserving the Open Internet, 25 F.C.C.R. 17,905 (2010). This 2010 Open Internet order imposed three “net neutrality” rules intended to promote and preserve an open Internet free of provider discrimination. In re Preserving the Open Internet, 25 F.C.C.R. 17,905 (2010). On appeal in Verizon v. FCC, 740 F.3d 623 (D.C. Cir. 2014), the D.C. Circuit once again struck down the FCC’s Open Internet rules. In doing so, the court focused primarily on the conflict between the FCC’s historical classification of Internet broadband services as “information services” and the FCC’s attempts to nevertheless impose common carrier–type obligations on broadband providers in the form of anti-blocking and nondiscrimination requirements. Verizon v. FCC, 740 F.3d at 651–56. In its decision, the court seemed to be sending a clear message to the FCC: if the FCC wants to impose anti-discrimination obligations on broadband Internet providers to create an open Internet, these providers must be reclassified as common carriers subject to Title II of the Communications Act.

FCC’s Solution—Reclassification of Internet Service Providers as Common Carriers

On March 12, 2015, the FCC issued its long-anticipated solution to the D.C. Circuit’s rejection of its attempts to enforce an open Internet. The FCC’s Report and Order on Remand, Declaratory Ruling, and Order includes extensive background and justification for its final rules. See In re Protecting & Promoting the Open Internet, F.C.C. 15–24 (released Mar. 12, 2015) (the 2015 Open Internet order). The rules set forth in the 2015 Open Internet order were published in the Federal Register on April 13, 2015 and went into effect on June 12, 2015. 80 Fed. Reg. 19,737 (Apr. 13, 2015). Meanwhile, the legality of the 2015 Open Internet order continues to be a matter of debate. The telecommunications industry group United States Telecom Association (USTelecom), whose members include AT&T and Verizon, challenged the new rules by filing a protective petition for review in the U.S. Court of Appeals for the District of Columbia Circuit in United States Telecom Association v. FCC, Civ. A. No. 15-1063 (D.C. Cir., filed Mar. 25, 2015). Other interested parties—as intervening parties and as amici curiae—joined the case, which was consolidated with several others involving the same issue (Civ. A. Nos. 15-1078, 15-1086, 15-1090, 15-1091, 15-1092, 15-1095, 15-1099, 15-1117, 15-1128, 15-1151, and 15-1164). The D.C. Circuit upheld the 2015 Open Internet order in a 2–1 decision. United States Telecom Assoc. v. FCC, 825 F.3d 674 (D.C. Cir. 2016), reh’g denied, 2017 U.S. App. LEXIS 7712 (D.C. Cir. May 1, 2017). Under the Trump administration, a newly appointed FCC chairman, Ajit Pai, announced on May 18, 2017, the roll-back of the Open Internet regime of “Utility-Like Regulation of the Internet” in a notice of proposed rulemaking, In the Matter of Restoring Internet Freedom, W.C. Docket No. 17-108, 82 Fed. Reg. 25,568 (June 2, 2017). On November 22, 2017, Chairman Pai announced that the agenda of the December 14, 2017, open meeting would include consideration of roll-back of the Open Internet order in In the Matter of Restoring Internet Freedom, W.C. Docket No. 17-108, Draft Declaratory Ruling, Report and Order, and Order, available at http://transition.fcc.gov/Daily_Releases/Daily_Business/2017/db1122/DOC-347927A1.pdf.

(d) Provisions of the Telecommunications Act Related to Privacy and Security

While the FCC’s Open Internet rules remain in effect, affected service providers need to comply with new provisions relating to the privacy and security of the information they transmit. Specifically, the new rules apply to providers that offer broadband Internet access services, defined as

[a] mass-market retail service by wire or radio that provides the capability to transmit data to and receive data from all or substantially all Internet endpoints, including any capabilities that are incidental to and enable the operation of the communications service, but excluding dial-up Internet access service. This term also encompasses any service that provides a functional equivalent of the service, or that is used to evade the protections created by the Commission.

2015 Open Internet order at ¶ 25; 80 Fed. Reg. 19,741.

The 2015 Open Internet order applies the customer privacy protections of Section 222 of the Communications Act, 47 U.S.C. § 222, to broadband Internet service providers. Section 222 imposes a general duty on providers to take reasonable precautions to protect the confidentiality of their customers’ information and places limits on the providers’ use of such information. 47 U.S.C. § 222(a). In addition, it specifically requires that a carrier “only use, disclose, or permit access to individually identifiable customer proprietary network information in its provision of (A) the telecommunications service from which such information is derived, or (B) services necessary to, or used in, the provision of such telecommunications service, including the publishing of directories.” 47 U.S.C. § 222(c)(1). Further, it requires that carriers disclose customer proprietary network information to any person designated by a customer in a written request. 47 U.S.C. § 222(c)(2). For the purposes of Section 222 protections, “customer proprietary network information” includes information that relates to “the quantity, technical configuration, type, destination, location, and amount of use of a telecommunications service subscribed to by any customer of a telecommunications carrier.” 47 U.S.C. § 222(h)(1)(A).

Interestingly, while the FCC has made Section 222 applicable to broadband providers, it declined to apply its existing rules implementing Section 222 to broadband providers. The FCC’s reason for this is that those rules are specifically tailored to telephone carriers. Recognizing that Internet services are very different from traditional telephone services, the FCC intends to adopt in the future implementing rules that are specific to broadband providers. In light of this gap between the effective date of the Open Internet order and the rollout of specifically tailored broadband rules, the FCC has issued an enforcement


advisory notice with respect to the open Internet privacy standard. See FCC Enforcement Advisory Open Internet Privacy Standard Enforcement, Enforcement Advisory No. 2015-03; Release No. DA 15-603 (May 20, 2015). The enforcement advisory states that, until implementing rules specific to broadband providers are issued, the FCC intends to focus on whether broadband providers are taking “reasonable, good-faith steps to comply with Section 222, rather than focusing on technical details.” See FCC Enforcement Advisory Open Internet Privacy Standard Enforcement, Enforcement Advisory No. 2015-03; Release No. DA 15-603 (May 20, 2015). In doing so, the FCC suggests that broadband providers should employ effective privacy protections in line with their existing privacy policies and general best practices and core tenets of privacy protection. FCC Enforcement Advisory Open Internet Privacy Standard Enforcement, Enforcement Advisory No. 2015-03; Release No. DA 15-603 (May 20, 2015). In addition, the advisory encourages broadband providers to request advisory opinions from the FCC to gain further insight into whether their activities comply with the obligations of Section 222 and the Open Internet order. FCC Enforcement Advisory Open Internet Privacy Standard Enforcement, Enforcement Advisory No. 2015-03; Release No. DA 15-603 (May 20, 2015). If a provider does consult with the FCC Enforcement Bureau, such a request itself will show that the provider is acting in good faith in its attempts to comply with Section 222. FCC Enforcement Advisory Open Internet Privacy Standard Enforcement, Enforcement Advisory No. 2015-03; Release No. DA 15-603 (May 20, 2015).

As network service providers navigate a constantly evolving statutory and regulatory landscape, they should follow general privacy and security best practices, including having in place a comprehensive information security policy and privacy policy, and ensuring internal compliance with those policies. In the absence of clear guidance from the FCC as to the privacy requirements in an Open Internet regime where broadband providers are considered common carriers, providers should ensure that any business decisions they make with respect to the use or disclosure of customer information are reasonable, well thought-out, and justified in light of industry standards with respect to treatment of such information.

§ 12.2.2 Federal Law—Title II of the Electronic Communications Privacy Act, the Stored Communications Act

In addition to the telecommunications statutes mentioned in § 12.2.1 above, network service providers who store electronic communications, including e-mail messages, data, etc., are also subject to Title II of the Electronic Communications Privacy Act of 1986, 18 U.S.C. §§ 2701–2711 (the Stored Communications Act or SCA), which governs the protection of the contents of data stored by service providers, and records held by service providers about a subscriber, such as subscriber name, billing records, or IP addresses. For further discussion of the SCA, see chapter 2, § 2.5.3 through § 2.5.6, of this book. In dealings with a covered network service provider, consideration should be given not only to the privacy of covered


information stored at a covered provider but also to the ability of appropriate persons or entities to access that information, such as successors or fiduciaries who may not have the opportunity to obtain or present express consent to such access (see Revised Uniform Fiduciary Access to Digital Assets Act, available at http://www.uniformlaws.org/shared/docs/Fiduciary%20Access%20to%20Digital%20Assets/2015_RUFADAA_Final%20Act_2016mar8.pdf (online tool designating authorized access prevails over later will); but see Ajemian v. Yahoo!, 478 Mass. 169 (2017) (SCA does not preempt state estate administration law)).

§ 12.2.3 Federal Law—Communications Decency Act

Also as mentioned in § 12.1.1 above and explained in chapter 7, § 7.2.2 through § 7.2.5, of this book, network service providers that provide interactive computer services (and their users) enjoy substantial immunity for disseminating or publishing content provided by others. CDA § 230, 47 U.S.C. § 230(c)(1). Although this immunity is generally positive for both the service provider and the user, there may be occasions where the service provider may avoid liability to the user for some presentation of content. If such an occasion can be anticipated, it may be helpful to provide a contractual basis for redress that is directed to the provider’s use rather than to its “speaking or publishing.”

§ 12.2.4 Massachusetts State Law and Regulations

Massachusetts has long been a leader in developing state laws that enhance and protect individuals’ rights to control the use of their personal information, and that require businesses that collect and use that information to keep it secure from unauthorized use and disclosure. Consistent with those goals, the Massachusetts Data Security Regulations include requirements regarding oversight of vendors engaged by businesses that own or license personal information of Massachusetts residents. The Massachusetts Data Security Regulations establish minimum standards to be met in connection with safeguarding the personal information of Massachusetts residents and apply to any persons who “own or license personal information about a resident of the Commonwealth.” 201 C.M.R. § 17.01(2). Under the regulations, a person or entity “owns or licenses” personal information if it “receives, stores, maintains, processes, or otherwise has access to personal information in connection with the provision of goods or services or in connection with employment.” 201 C.M.R. § 17.02. Service providers covered by the regulations include “any person that receives, stores, maintains, processes, or otherwise is permitted access to personal information through its provision of service directly to a person that is subject to this regulation.” 201 C.M.R. § 17.02. Importantly, the Massachusetts Data Security Regulations apply equally to businesses outside Massachusetts and to those located in Massachusetts. The important triggering requirement is that the information held, stored, or processed relates to a Massachusetts individual, regardless of where the business that owns or licenses the information is located. Therefore, businesses that, for example, engage in online transactions with


Massachusetts residents, store or process human resources information about a Massachusetts individual, or maintain or rent a contact database that includes contact information for those in Massachusetts are all required to comply with the regulations. In addition to the specific safeguards set forth in the rule with respect to the data owner/licensor’s obligations relating to security of information, the regulations require that the business oversee any service provider who may have access to personal information by

[t]aking reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect such personal information consistent with these regulations and any applicable federal regulations; and

[r]equiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information.

201 C.M.R. § 17.03(2)(f). Therefore, businesses must determine whether any third-party vendors that they engage will have access to personal information of Massachusetts residents. If so, not only must they ensure that the third-party service provider has appropriate safeguards in place to protect such information, but there must also be a contract in place that requires these security measures.

§ 12.3 NEGOTIATING NETWORK SERVICE PROVIDER CONTRACTS

Understanding the legal landscape for network service providers is important for the providers themselves, but also for the businesses that engage and rely upon them. In a world where more businesses are moving data and services “to the cloud” and businesses face increased scrutiny regarding the security of personal information and individual privacy rights, the relationship between business owners and service providers is a complicated one; negotiating and defining the roles and obligations of both parties at the outset of the relationship can have important implications for both business operations and liability exposure. This section discusses the different issues that should be considered from the business owner’s point of view in selecting and contracting with service providers.

§ 12.3.1 Defining Privacy and Data Security Obligations in Service Provider Relationships

Often the concepts of “privacy” and “data security” are treated together as one and the same. Indeed, many of the considerations in dealing with these issues overlap in important ways. However, there are nuanced distinctions between the two concepts, and an understanding of what falls under the rubric of “privacy” and what is more


properly categorized as “security” helps frame the issues in forming and negotiating service provider relationships. Generally, the business owner or consumer of network services will be responsible for issues relating to privacy. While security issues are a subset of these issues for the business owner, the service provider is generally in the business of and responsible for addressing security issues, but not privacy matters.

In the context of defining these obligations, “privacy” issues relate to the collection and handling of personal information. Issues in this category generally include, without limitation,

• providing proper notice to the individual regarding the planned use of his or her personal information, including any relevant disclosures to third parties, such as service providers;
• obtaining consent from individuals for specific use of their information; and
• putting in place and posting external privacy policies.

On the other hand, “security” relates to the protection mechanisms in place to safeguard personal information after its collection or during its transmission. Security issues generally relate to technical gatekeeping and safeguarding controls, including

• authentication of authorized users;
• access controls to prevent unauthorized use or disclosure;
• encryption of sensitive data, both in transit and at rest;
• monitoring and/or logging activity and access requests;
• retention, storage, and recovery of data; and
• breach detection, investigation, and notification.

While business owners must understand and develop policies for dealing with security issues, these are the types of functions and protections that are often delegated to service providers and become the subject of contractual relationships between the two. With this context in mind, businesses often assume risks relating to privacy matters, but many service providers will assume risks and liability relating to security issues associated with the specific services they provide. Put another way, service providers may not be willing to make representations and provide indemnifications for issues related to privacy matters such as proper notice and consent for use of information, but they may be willing to assume legal responsibility for various aspects of security and safeguarding such information entrusted to the provider by the business.
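To make the “encryption of sensitive data . . . at rest” control in the list above concrete, the following is a minimal, illustrative Python sketch, assuming the third-party “cryptography” package is available; the field name and sample value are hypothetical, and nothing in this sketch is required by any statute or regulation discussed in this chapter.

    # Illustrative only: encrypting a sensitive field "at rest" using the
    # "cryptography" package's Fernet recipe (symmetric, AES-based).
    # In practice, key generation, storage, and rotation would be handled by a
    # key-management service rather than a local variable.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # in production, obtained from a KMS
    fernet = Fernet(key)

    account_number = b"0000-0000-0000"             # hypothetical sensitive value
    stored_value = fernet.encrypt(account_number)  # persist this, not the plaintext

    # An authorized process later decrypts the stored value for use.
    assert fernet.decrypt(stored_value) == account_number

The sketch illustrates the allocation of responsibility discussed above: the mechanics of encryption are the kind of security function commonly delegated to (and warranted by) a service provider, while decisions about which data to collect remain privacy matters for the business.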

§ 12.3.2 Representing a Business Engaging a Service Provider—Setting the Framework for the Deal

From the business perspective, consumers of network services should begin by framing the potential issues relevant to engaging and relying upon a service provider. The steps outlined below can guide business owners in the process of selecting and negotiating with an appropriate service provider.

(a) Conduct a Data Audit and Engage in Data Mapping

Start by cataloging what types of personal information may be disclosed to or stored by a service provider. Map what categories of data are at issue, where the data comes from, under what circumstances it was acquired (i.e., what privacy policies or other agreements were in place when the data was collected), and what uses will be made of the data, both internally and by service providers. An example of a very simple data mapping table is provided below.

Type of Data: E-mail addresses and phone numbers
Source of Data: Collected via company website
Policies/Agreements Relating to Collection: Company privacy policy
Uses of Data:
• Internal: marketing communications; customer service requests; research and development
• External: processing and storage; backup and disaster recovery
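Where the business prefers to keep its data map in machine-readable form, the same row can be captured as a structured record. The following Python sketch is illustrative only; the class and field names are hypothetical and carry no regulatory significance.

    # Illustrative machine-readable version of the sample data-mapping row above.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DataMapEntry:
        data_type: str
        source: str
        collection_policies: List[str]  # policies/agreements in place at collection
        internal_uses: List[str]
        external_uses: List[str]

    entry = DataMapEntry(
        data_type="E-mail addresses and phone numbers",
        source="Collected via company website",
        collection_policies=["Company privacy policy"],
        internal_uses=["marketing communications",
                       "customer service requests",
                       "research and development"],
        external_uses=["processing and storage",
                       "backup and disaster recovery"],
    )

A structured data map of this kind can later be queried when a new service provider is engaged, so that every category of data flowing to the provider is accounted for in the contract.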

(b) Analyze and Understand Relevant Laws, Regulations, and Company Policies

The business itself always remains responsible for its own regulatory/legal compliance. While it may impose contractual obligations on a service provider to likewise comply with the same regulations applicable to the business, the business can never contract away its own compliance obligation and will also remain responsible for downstream use. Therefore, it is important that the business team is aware of its own legal obligations and company policies and ensures that any service provider it engages is able to meet its compliance needs. For example, if the business is a covered entity under HIPAA, it must seek out a service provider that is well-versed in the security and privacy rules imposed by HIPAA and is in the position to sign a business associate agreement. Even if the business is not in a regulated industry with heightened privacy and security rules, if the business has developed a written information security program (WISP) that requires certain types of encryption of data or specific data segregation requirements, it must be sure that any service provider it engages is able to accommodate those requirements. In addition to U.S. legal requirements, businesses should inquire whether the service provider has offices or data centers in foreign jurisdictions. If so, there may be additional issues related to privacy and data security under local law where the data may be transferred, processed or stored.

(c) Conduct and Document a Technical Assessment and Due Diligence of Service Provider

Finally, the business should conduct a technical assessment of, and due diligence on, the service provider to determine the provider’s ability to meet the business’s security requirements. Because a technical assessment requires in-depth technical knowledge, an in-house information security expert or outside consultant may be needed. The technical assessment should address the various needs imposed by law or by company policy, including but not limited to access controls, encryption, data center/server security, disaster recovery, and business continuity. The business should carefully document the steps it takes during its due diligence of a service provider. Checklists or questionnaires can be useful to make sure that the right questions are asked and that the responses are sufficient. A sample questionnaire form is included as Exhibit 12A, and a brief illustration of screening questionnaire responses follows below. Ensure that copies of any relevant service provider policies are reviewed and retained. In addition to general questions relating to technical and physical safeguards and service provider policies, businesses that are regulated entities can also include specific questions related to compliance with relevant regulatory schemes.
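Where questionnaire responses are collected electronically, simple tooling can flag gaps for the reviewing team before the substantive legal analysis begins. The following Python sketch is a hypothetical illustration (the selection of “mandatory” questions is an assumption, not a legal requirement); the question numbers track Exhibit 12A.

    # Illustrative screening of questionnaire responses: flag "No" answers to
    # questions the business has decided must be answered "Yes."
    MANDATORY_YES = {"18", "21", "34", "37"}  # hypothetical picks from Exhibit 12A

    def flag_gaps(responses):
        """Return question numbers answered 'No' that the business requires as 'Yes'."""
        return [q for q, answer in responses.items()
                if q in MANDATORY_YES and answer.strip().lower() == "no"]

    responses = {"18": "Yes", "21": "No", "34": "Yes", "37": "No"}
    print(flag_gaps(responses))  # ['21', '37'] -> follow up with the provider

Automated screening of this kind supplements, but does not replace, the documented human review described above.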


EXHIBIT 12A—Service Provider Data Security Questionnaire

Service Provider Data Security Questionnaire

Service Provider Information
Name of Service Provider:
Location of Provider Offices (Country, State, City):
Location of Provider Data Center (Country, State, City):
Explain (briefly) the services that will be provided to Company:

Personal Data
1. Do you collect, store, handle or otherwise process Personal Data? ☐ Yes ☐ No
(Personal Data means any information relating to an identified or identifiable individual, including, but not limited to, an individual’s name, postal address, email address, telephone number, date of birth, driver’s license number, bank account number, and credit or debit card information.)
1.1 If yes, provide examples of relevant Personal Data.
2. Does the proposed service provider arrangement include collection, storage, access, handling or other processing of Company Personal Data? ☐ Yes ☐ No
2.1 If yes, who within the Service Provider organization (role or title only) has access to this Personal Data?

Organizational Measures to Protect Personal Data
3. Does your organization have a documented data protection policy or policies? ☐ Yes (provide copy) ☐ No
4. How is your data protection policy implemented and disseminated?
5. What training do you provide to your employees who have access to Personal Data?
6. What training do you provide to your consultants, subcontractors and other third party service providers who have access to Personal Data?
7. Do you have a dedicated Data Protection & Privacy Officer or equivalent? ☐ Yes ☐ No
8. Do you have a process for updating your documented data protection policy, awareness training, and contract language with consultants, subcontractors, and other third party service providers, triggered by a change in applicable privacy law, policy or business requirement? ☐ Yes ☐ No
9. Does your service solution design provide the flexibility to change its set up (e.g., geographical location, technical aspects, internal organization) if applicable data protection and security laws and regulations with which Company must comply so require? ☐ Yes ☐ No

Use of Third Party Provider
10. Will you use any third party service providers in providing services to the Company? ☐ Yes ☐ No
11. Will any of your third party service providers (e.g., consultants, subcontractors, vendors) have access to Company Personal Data? ☐ Yes ☐ No
11.1 If yes, identify the third party service providers.
11.2 If yes, how do you control and monitor how your third party service providers handle Personal Data?
12. From which countries do your third party service providers provide services?
13. Provide details on the mechanisms you use to ensure compliance by these third party service providers (contractually and with applicable privacy laws and regulations).
14. Do you regularly monitor or audit the security controls used by your third party service providers? ☐ Yes ☐ No
14.1 If yes, explain procedure and frequency.

Use of Data
15. Do you have a policy or procedure restricting access to Personal Data? ☐ Yes (provide copy) ☐ No
15.1 If yes, how do you monitor compliance with the policy?
16. Do you have document retention and destruction policies and procedures? ☐ Yes (provide copy) ☐ No
16.1 If yes, how do you monitor compliance with the policy?
17. Can you accommodate requests for access by individuals whose data you have? ☐ Yes ☐ No
17.1 If yes, provide details of the procedures followed.

Written Information Security Policy
18. Does your organization have a written information security policy (WISP) that is communicated to employees, consultants, vendors and other third party service providers? ☐ Yes (provide copy) ☐ No
19. Is there a regular review and approval process for this policy? ☐ Yes ☐ No
19.1 If yes, explain procedure and frequency.
20. Have you designated at least one employee to supervise implementation and performance of the information security policy? ☐ Yes ☐ No
20.1 If yes, identify who (role or title) and explain his/her responsibilities.
21. When an employee is terminated, is his/her physical and electronic access to Personal Data terminated immediately, including deactivating passwords and user names? ☐ Yes ☐ No
22. Does your information security policy apply to all records, including paper, electronic, computing systems, storage media, laptops and portable devices? ☐ Yes ☐ No
22.1 If no, please explain what information is covered by this policy.

Organization Information Security
23. Do you have a policy regarding confidentiality, and are your employees, consultants, subcontractors and other third party service providers required to sign an agreement that protects the security and privacy of your and your customers’ information? ☐ Yes (provide copy) ☐ No
24. Does your organization undergo regular independent security or controls review? ☐ Yes ☐ No
25. Do you perform periodic security and vulnerability testing to assess network, system and application security? ☐ Yes ☐ No
26. Do you have a formalized information security program that is in line with an industry standard security controls framework? ☐ Yes ☐ No
26.1 If yes, has the program been assessed against a standard (e.g., National Institute of Standards and Technology (NIST), Payment Card Industry standard (PCI))? ☐ Yes ☐ No
26.2 If yes, explain which standard.
27. How do you assess the security measures used by your third party service providers and ensure compliance with your information security requirements?

Security Controls and Physical Safeguards
28. Do you utilize perimeter security systems at your facility and data centers (e.g., guard, card swipe)? ☐ Yes ☐ No
28.1 If yes, provide details.
29. Are separation of duties and access in place to monitor against unauthorized access? ☐ Yes ☐ No
30. Is there a separation between development and production environments/networks? ☐ Yes ☐ No
31. Describe physical controls to prevent unauthorized access to or processing of Personal Data.
32. Do you maintain up-to-date records concerning the movement of hardware and electronic media that may contain Personal Data? ☐ Yes ☐ No
33. Do you back up Personal Data? ☐ Yes ☐ No
33.1 If yes, where is the back-up stored?
34. Are all transmitted records and files containing Personal Data that travel across public networks encrypted? ☐ Yes ☐ No
34.1 If yes, provide details (including the type of encryption: 128-bit, 256-bit, etc.).
35. Are all Personal Data transmitted wirelessly encrypted? ☐ Yes ☐ No
35.1 If yes, provide details.
36. Do you have an encryption policy for mobile devices (e.g., laptops, smart phones, USB-drives)? ☐ Yes ☐ No
36.1 If yes, provide details.

Incident Management and Security Events
37. Does your organization have a written security/privacy incident response program? ☐ Yes (provide copy) ☐ No
37.1 If yes, is this program assessed and revised as necessary on a regular basis? ☐ Yes ☐ No
38. Do your employees, consultants, vendors and other third party service providers know what and how to report security incidents? ☐ Yes ☐ No
38.1 If yes, explain how employees, consultants, vendors and other third party service providers are made aware of the procedure.
39. Identify the triggers to notify your customers of a security event.
40. Identify the triggers to notify law enforcement of a security event.
41. Have you had a security/privacy breach in the past 12 months? ☐ Yes ☐ No
41.1 If yes, describe the circumstances and efforts to remediate and ensure non-recurrence of the incident.

CHAPTER 13

Vendors and Business Associates
Stephen Y. Chow, Esq.
Burns & Levinson LLP, Boston

§ 13.1 Introduction
§ 13.2 HIPAA Omnibus Rule
§ 13.3 “Red Flags” Rule
§ 13.4 Gramm-Leach-Bliley Safeguards Rule
§ 13.5 Federal Information Security Management Act
§ 13.6 Massachusetts Data Security
§ 13.7 Payment Card Industry Rule
EXHIBIT 13A—HIPAA Privacy and Security Compliance Agreement

Scope Note
This chapter alerts the reader to the vendor and business associate issues of dealing with data security and privacy.

§ 13.1 INTRODUCTION

Internet commerce, particularly the distributed commerce characteristic of the “cloud” involving multiple specialist providers of record creation and updating, storage, billing, collections, customer relations, and ever more specialized functions, has meant the dissemination of personal information to multiple repositories on “nodes” (computers) on the network. In many cases the “ownership” of the information is uncertain: Does it belong to the person that is the subject of the information, or to the creator of the record or collector of the information? One might consider any lawful custodian of information to be a “licensee”—someone with privilege to possess the information. However, the 2002 California data breach notification statute, Cal. Civ. Code § 1798.82, distinguished between the responsibilities of a person or business that “owns or licenses” data and one that “maintains” such data, who must report breaches to the owner or licensee rather than to the subject of the data (see chapter 9, § 9.2.1, of this book). In any case, the dissemination and distribution of personal information to and storage on multiple nodes creates multiple risks of breach of security and diversion of the personal information. Regulation of vendors of services to the primary collectors of personal information—called “business associates” in some contexts—arose most


notably in the context of the ponderous attempts to create standards for electronic health records under the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and its 2009 amendment in the Health Information Technology for Economic and Clinical Health (HITECH) Act. See chapter 6 of this book, as well as chapter 8, § 8.3.3. Following a discussion of the HIPAA regulations below are discussions of some of the other statutory schemes and private regulation that place a focus on contractual arrangements with vendors and business associates to ensure security of personal information.

§ 13.2 HIPAA OMNIBUS RULE

Prior to HITECH, as reviewed in chapter 8, § 8.3.3, of this book, HIPAA regulations already governed what could be disclosed to “business associates” (45 C.F.R. § 160.103), with the “covered entity” (45 C.F.R. § 160.103) responsible for entering into an agreement with a business associate under rules set forth at 45 C.F.R. § 164.504. An example of such an agreement, an adjunct to a health records data service agreement (with, for example, a “service level” agreement for availability of data services), is included as Exhibit 13A.

Effective September 23, 2013, the HIPAA data security rules for covered entities were extended to business associates, 78 Fed. Reg. 5,566, 5,694 (Jan. 25, 2013), in what is sometimes known as the HIPAA Omnibus Rule. The security rules are set forth at 45 C.F.R. pt. 164, subpart C. Following are standards and “required” implementations (rather than “addressable” implementations—i.e., those required only if “reasonable” under the circumstances, or for which a reasonable alternative may be adopted, under 45 C.F.R. § 164.306(d)(3)):

• Administrative safeguards (45 C.F.R. § 164.308):
– Security management process. Implement policies and procedures to prevent, detect, contain, and correct security violations.
– Risk analysis. Conduct an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic protected health information held by the covered entity or business associate.
– Risk management. Implement security measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level to comply with 45 C.F.R. § 164.306(a).
– Sanction policy. Apply appropriate sanctions against workforce members who fail to comply with the security policies and procedures of the covered entity or business associate.
– Information system activity review. Implement procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports.
– Assigned security responsibility. Identify the security official who is responsible for the development and implementation of the policies and procedures required by this subpart for the covered entity or business associate.


– Isolating health-care clearinghouse functions. If a health-care clearinghouse is part of a larger organization, the clearinghouse must implement policies and procedures that protect the electronic protected health information of the clearinghouse from unauthorized access by the larger organization.
– Security awareness and training. Implement a security awareness and training program for all members of its workforce (including management).
– Security incident procedures. Implement policies and procedures to address security incidents.
– Response and reporting. Identify and respond to suspected or known security incidents; mitigate, to the extent practicable, harmful effects of security incidents that are known to the covered entity or business associate; and document security incidents and their outcomes.
– Contingency plan. Establish (and implement as needed) policies and procedures for responding to an emergency or other occurrence (for example, fire, vandalism, system failure, and natural disaster) that damages systems that contain electronic protected health information.
– Data backup plan. Establish and implement procedures to create and maintain retrievable exact copies of electronic protected health information.
– Disaster recovery plan. Establish (and implement as needed) procedures to restore any loss of data.
– Emergency mode operation plan. Establish (and implement as needed) procedures to enable continuation of critical business processes for protection of the security of electronic protected health information while operating in emergency mode.
– Evaluation. Perform a periodic technical and nontechnical evaluation, based initially upon the standards implemented under this rule and, subsequently, in response to environmental or operational changes affecting the security of electronic protected health information, that establishes the extent to which a covered entity’s or business associate’s security policies and procedures meet the requirements of this subpart.
– Written contract or other arrangement. Document the satisfactory assurances required by paragraph (b)(1) or (b)(2) of this section through a written contract or other arrangement with the business associate that meets the applicable requirements of 45 C.F.R. § 164.314(a).
• Physical safeguards (45 C.F.R. § 164.310):
– Facility access controls. Implement policies and procedures to limit physical access to its electronic information systems and the facility or facilities in which they are housed, while ensuring that properly authorized access is allowed.
– Workstation use. Implement policies and procedures that specify the proper functions to be performed, the manner in which those functions are to be


performed, and the physical attributes of the surroundings of a specific workstation or class of workstation that can access electronic protected health information.
– Workstation security. Implement physical safeguards for all workstations that access electronic protected health information, to restrict access to authorized users.
– Device and media controls. Implement policies and procedures that govern the receipt and removal of hardware and electronic media that contain electronic protected health information into and out of a facility, and the movement of these items within the facility.
– Disposal. Implement policies and procedures to address the final disposition of electronic protected health information, and/or the hardware or electronic media on which it is stored.
– Media reuse. Implement procedures for removal of electronic protected health information from electronic media before the media are made available for reuse.
• Technical safeguards (45 C.F.R. § 164.312):
– Access control. Implement technical policies and procedures for electronic information systems that maintain electronic protected health information to allow access only to those persons or software programs that have been granted access rights as specified in 45 C.F.R. § 164.308(a)(4).
– Unique user identification. Assign a unique name and/or number for identifying and tracking user identity.
– Emergency access procedure. Establish (and implement as needed) procedures for obtaining necessary electronic protected health information during an emergency.
– Audit controls. Implement hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use electronic protected health information.
– Integrity. Implement policies and procedures to protect electronic protected health information from improper alteration or destruction.
– Person or entity authentication. Implement procedures to verify that a person or entity seeking access to electronic protected health information is the one claimed.

In addition to these standards and requirements, 45 C.F.R. § 164.308 provides as follows:

(b)(1) Business associate contracts and other arrangements. A covered entity may permit a business associate to create, receive, maintain, or transmit electronic protected health information on the covered entity’s behalf only if the covered entity obtains satisfactory assurances, in accordance with


§ 164.314(a), that the business associate will appropriately safeguard the information. A covered entity is not required to obtain such satisfactory assurances from a business associate that is a subcontractor.

(2) A business associate may permit a business associate that is a subcontractor to create, receive, maintain, or transmit electronic protected health information on its behalf only if the business associate obtains satisfactory assurances, in accordance with § 164.314(a), that the subcontractor will appropriately safeguard the information.

45 C.F.R. § 164.314(a), in turn, provides as follows:

(a)(1) Standard: Business associate contracts or other arrangements. The contract or other arrangement required by § 164.308(b)(3) must meet the requirements of paragraph (a)(2)(i), (a)(2)(ii), or (a)(2)(iii) of this section, as applicable.

(2) Implementation specifications (Required).

(i) Business associate contracts. The contract must provide that the business associate will—

(A) Comply with the applicable requirements of this subpart;

(B) In accordance with § 164.308(b)(2), ensure that any subcontractors that create, receive, maintain, or transmit electronic protected health information on behalf of the business associate agree to comply with the applicable requirements of this subpart by entering into a contract or other arrangement that complies with this section; and

(C) Report to the covered entity any security incident of which it becomes aware, including breaches of unsecured protected health information as required by § 164.410.

(ii) Other arrangements. The covered entity is in compliance with paragraph (a)(1) of this section if it has another arrangement in place that meets the requirements of § 164.504(e)(3) [governmental entities, functionality required by law, limited data set].

(iii) Business associate contracts with subcontractors. The requirements of paragraphs (a)(2)(i) and (a)(2)(ii) of this section apply to the contract or other arrangement between a business associate and a subcontractor required by § 164.308(b)(4) [sic—no such subsection] in the same manner as such requirements apply to contracts or other arrangements between a covered entity and business associate.

45 C.F.R. § 164.504(e) sets forth other requirements of a business associate (and subcontractor) contract.
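As a purely illustrative aside, the “audit controls” standard quoted above (45 C.F.R. § 164.312(b)) calls for mechanisms that record and examine activity in systems containing electronic protected health information, without prescribing any particular format. The following Python sketch of an application-level audit record is one assumption about what such a mechanism might look like, not a statement of what the regulation requires; the field names are hypothetical.

    # Illustrative application-level audit record for access to electronic
    # protected health information (ePHI); field names are hypothetical.
    import datetime
    import json

    def audit_event(user_id, action, record_id):
        """Serialize one audit record; in practice it would be appended to
        write-once storage and reviewed under the entity's documented
        information system activity review procedures."""
        event = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user_id": user_id,      # supports unique user identification
            "action": action,        # e.g., "read", "update"
            "record_id": record_id,
        }
        return json.dumps(event)

    print(audit_event("clinician-042", "read", "chart-9001"))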

§ 13.3 “RED FLAGS” RULE

Under the Fair Credit Reporting Act, as amended by the Fair and Accurate Credit Transactions Act, 15 U.S.C. § 1681m, as discussed in chapters 4 and 8 of this book, the Federal Trade Commission (FTC) was charged with promulgating and enforcing a “Red Flags” rule to prevent identity theft by regulation of nonbank “creditors,” where a “creditor” is defined as one who

obtains or uses consumer reports . . . ; furnishes information to consumer reporting agencies . . . ; or advances funds to or on behalf of a person, based on an obligation of the person to repay the funds or repayable from specific property pledged by or on behalf of the person [but not a creditor who] advances funds on behalf of a person for expenses incidental to a service provided by the creditor to that person [thus excluding accountants, doctors, and lawyers].

Under the FTC Red Flags rule, 16 C.F.R. pt. 681, app. A(VI)(c) (Oversight of service provider arrangements),

[w]henever a financial institution or creditor engages a service provider to perform an activity in connection with one or more covered accounts the financial institution or creditor should take steps to ensure that the activity of the service provider is conducted in accordance with reasonable policies and procedures designed to detect, prevent, and mitigate the risk of identity theft. For example, a financial institution or creditor could require the service provider by contract to have policies and procedures to detect relevant Red Flags that may arise in the performance of the service provider’s activities, and either report the Red Flags to the financial institution or creditor, or to take appropriate steps to prevent or mitigate identity theft.

(Emphasis added.) Thus, while not mandatory, the FTC has directed creditors to contract with their service providers (business associates) to comply with data security measures.

§ 13.4 GRAMM-LEACH-BLILEY SAFEGUARDS RULE

As discussed in chapter 8, § 8.3.2(b), of this book, the FTC promulgated under the Gramm-Leach-Bliley Act, 15 U.S.C. § 6801, its “Safeguards” rule for financial institution protection of customer information at 16 C.F.R. pt. 314. Section 314.4(d) requires as part of the protection program that financial institutions

[o]versee service providers, by:

(1) Taking reasonable steps to select and retain service providers that are capable of maintaining appropriate safeguards for the customer information at issue; and


(2) Requiring your service providers by contract to implement and maintain such safeguards.

Thus, providers of services to financial institutions are required by their clients to meet data security standards.

§ 13.5 FEDERAL INFORMATION SECURITY MANAGEMENT ACT

The Federal Information Security Management Act (FISMA), Title III of the E-Government Act of 2002, Pub. L. No. 107-347, 116 Stat. 2899, charged the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) with developing security standards for the federal information system. Federal agencies have incorporated many of those standards into their procurement regulations as well as in solicitation documents, including security requirements for contractors. NIST developed Publication 800-53, entitled “Security and Privacy Controls for Federal Information Systems and Organizations” (rev. 4, available at http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf). More information is available at the NIST website at http://csrc.nist.gov/groups/SMA/fisma/index.html. FISMA was revised by the Federal Information Security Modernization Act of 2014, Pub. L. No. 113-283, 128 Stat. 3073, adding new 44 U.S.C. §§ 3551–3558. Ironically, it was disclosed in 2015 that the federal Office of Personnel Management was hacked, compromising the personal records of millions. See, e.g., Andrea Peterson, “OPM says 5.6 million fingerprints stolen in cyberattack, five times as many as previously thought,” Washington Post, Sept. 23, 2015, available at https://www.washingtonpost.com/news/the-switch/wp/2015/09/23/opm-now-says-more-than-five-million-fingerprints-compromised-in-breaches.

§ 13.6 MASSACHUSETTS DATA SECURITY

As discussed in chapters 9 and 10 of this book, Massachusetts, by regulation, 201 C.M.R. §§ 17.03 and 17.04, imposes duties upon owners and licensees of the personal information of Massachusetts residents to protect that information using appropriate administrative, technical, and physical safeguards and specific computer system security. Section 17.03(2)(f) requires that such owners or licensees

[o]versee service providers, by:

1. Taking reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect such personal information consistent with 201 CMR 17.00 and any applicable federal regulations; and

2. Requiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information. . . .


An issue that may arise is determination of whether a custodian of data is more than a “maintainer” and is a “licensee”—for example, when the custodian uses the data to target advertising for third parties.

§ 13.7 PAYMENT CARD INDUSTRY RULE

An important part of data security in commerce is the private security system used for payment cards (such as credit cards and debit cards) established by the Payment Card Industry Security Standards Council (PCI SSC), founded by the major credit card companies, American Express, Discover Financial Services, JCB International, MasterCard, and Visa. See https://www.pcisecuritystandards.org. Although certification of merchants and other providers is made by the individual card service providers, PCI SSC has developed a standard, generally called the PCI Data Security Standard (version 3.1, April 2015). The documentation for the standard is available under license at https://www.pcisecuritystandards.org/security_standards/documents.php. The “PCI DSS Quick Reference Guide” (PCI DSS QRG) is at https://www.pcisecuritystandards.org/documents/PCIDSS_QRGv3_1.pdf. Below is an outline of the procedural requirements of PCI DSS (version 3.1):

Build and Maintain a Secure Network and Systems
1. Install and maintain a firewall configuration to protect cardholder data
2. Do not use vendor-supplied defaults for system passwords and other security parameters

Protect Cardholder Data
3. Protect stored cardholder data
4. Encrypt transmission of cardholder data across open, public networks

Maintain a Vulnerability Management Program
5. Protect all systems against malware and regularly update antivirus software or programs
6. Develop and maintain secure systems and applications

Implement Strong Access Control Measures
7. Restrict access to cardholder data by business need to know
8. Identify and authenticate access to system components
9. Restrict physical access to cardholder data

Regularly Monitor and Test Networks
10. Track and monitor all access to network resources and cardholder data


11. Regularly test security systems and processes

Maintain an Information Security Policy
12. Maintain a policy that addresses information security for all personnel

PCI DSS QRG at 11. The details of the requirements (PCI DSS QRG at 13–25) are similar to other data security requirements, such as those of HIPAA, with a recognition that there are many small merchants in the payment systems that have limited resources but are needed to participate. As in the other regimes treated in this chapter, third-party providers who handle personal information are also vulnerabilities and need to be considered.

Use of Third Party Service Providers/Outsourcing

A service provider or merchant may use a third-party service to store, process, or transmit cardholder data on their behalf, or to manage CDE components. Parties should clearly identify the services and system components that are included in the scope of the service provider’s annual onsite PCI DSS assessment, the specific PCI DSS requirements covered by the service provider, and any requirements which are the responsibility of the service provider’s customers to include in their own PCI DSS reviews. If the third party undergoes their own PCI DSS assessment, they should provide sufficient evidence to their customers to verify that the scope of the service provider’s PCI DSS assessment covered the services applicable to the customer and that the relevant PCI DSS requirements were examined and determined to be in place. The service provider Attestation of Compliance for PCI DSS v3.0 includes a table that summarizes PCI DSS requirements covered and the specific service(s) assessed, and can be provided to customers as evidence of the scope of a service provider’s PCI DSS assessment. However, the specific type of evidence provided by the service provider to their customers will depend on the agreements/contracts in place between those parties. Merchants and service providers must manage and monitor the PCI DSS compliance of all associated third-party service providers with access to cardholder data.

PCI DSS QRG at 32.
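As a purely illustrative aside on requirement 3 (“[p]rotect stored cardholder data”), PCI DSS generally permits no more than the first six and last four digits of a primary account number (PAN) to be displayed. The following Python sketch of display masking is an assumption about one common approach and is not taken from the PCI DSS documentation:

    # Illustrative masking of a primary account number (PAN) for display,
    # leaving only the last four digits visible.
    def mask_pan(pan):
        digits = [c for c in pan if c.isdigit()]
        return "*" * (len(digits) - 4) + "".join(digits[-4:])

    print(mask_pan("4111 1111 1111 1111"))  # ************1111

Display masking of this kind is distinct from rendering the stored PAN unreadable (by encryption, truncation, or hashing), which the standard addresses separately.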


EXHIBIT 13A—HIPAA Privacy and Security Compliance Agreement

HIPAA PRIVACY & SECURITY COMPLIANCE AGREEMENT

This HIPAA Privacy & Security Compliance Agreement (“Agreement”) is entered into as of this _____ day of _____, 20__ (“Effective Date”) between Covered Entity (“Client”), a _____ with a principal address of _____, and Business Associate, a _____ with an address of 304 Newbury Street #497, Boston, MA (“Vendor”), in connection with the Services Agreement dated _____, 20__ (the “Arrangement”). This Agreement is effective as of the Effective Date.

RECITALS

WHEREAS, the Arrangement requires Vendor to have access to and/or to collect or create Electronic Protected Health Information and/or Protected Health Information (which are collectively referred to as “Protected Health Information”) in order to carry out Vendor’s obligations to Client under the Arrangement;

WHEREAS, Client and Vendor intend to protect the privacy and provide for the security of Protected Health Information disclosed, collected or created by Vendor in connection with the Arrangement in compliance with the Health Insurance Portability and Accountability Act of 1996, Public Law 104-191 (“HIPAA”), Subtitle D of Title XIII of Division A of the American Recovery and Reinvestment Act of 2009, Public Law 111-5 (“HITECH”), and the regulations promulgated under HIPAA and HITECH, including, without limitation, the Standards for Privacy of Individually Identifiable Health Information, C.F.R. at Title 45, Parts 160 and 164 (particularly subpart E, the “Privacy Rule”) and the Standards for the Security of Electronic Protected Health Information, C.F.R. at Title 45, Parts 160 and 164 (particularly subpart C, the “Security Rule”), collectively referred to hereinafter as “HIPAA”;

WHEREAS, HIPAA requires Client and Vendor to enter into an agreement containing certain requirements with respect to the use and disclosure of Protected Health Information, which are contained in this Agreement.

NOW, THEREFORE, in consideration of the mutual promises contained herein and the exchange of information pursuant to this Agreement, the parties agree as follows:

1. DEFINITIONS

Capitalized terms used, but not otherwise defined, in this Agreement shall have the same meanings as those terms in HIPAA, except that the terms “Protected Health Information” and “Electronic Protected Health Information” shall have the same meanings as set forth in 45 C.F.R. § 160.103, limited to the information created or received by Vendor from or on behalf of Client in connection with the Arrangement.


2. OBLIGATIONS OF VENDOR

Permitted Uses and Disclosures. Vendor agrees not to use or disclose Protected Health Information other than as permitted or required by the Arrangement, this Agreement or as Required By Law. Vendor shall not use Protected Health Information in any manner that would constitute a violation of HIPAA, or other applicable federal or State law if so used by a Covered Entity, except that Vendor may use Protected Health Information to the extent otherwise permitted by this Agreement: (i) for the proper management and administration of Vendor; (ii) to carry out the legal responsibilities of Vendor, provided that disclosures are Required By Law, or Vendor obtains reasonable assurances from the person to whom the information is disclosed that it will remain confidential and used or further disclosed only as Required By Law or for the purpose for which it was disclosed to the person, and the person notifies the Vendor of any instances of which it is aware in which the confidentiality of the information has been breached; (iii) for Data Aggregation purposes involving Client or other legal entities that Client controls, is controlled by or under common control with Client; or (iv) to report violations of law to appropriate Federal and State authorities.

Privacy Policy Representations and Warranties. Vendor represents and warrants to Client that no later than the effective date set by the Secretary for Business Associates to comply with privacy policies and procedures set forth in HIPAA, Vendor will implement those same privacy policies and procedures.

Security Safeguards Representations and Warranties. Vendor represents and warrants to Client that no later than the effective date set by the Secretary for Business Associates to comply with the standards and implementation specifications for security safeguards as set forth in HIPAA, Vendor agrees to use physical, administrative and technical safeguards that (i) meet the standards or required implementation of HIPAA, (ii) meet the “addressable” implementations of HIPAA under 45 C.F.R. § 164.306(d)(3), and (iii) otherwise reasonably and appropriately protect the confidentiality, integrity, and availability of Protected Health Information that it creates, receives, maintains or transmits on behalf of Client and prevent the use, disclosure of, or access to the Protected Health Information other than as provided for by this Agreement.

Reporting of Security Incident, Improper Use or Disclosure and Breach. Vendor agrees to report to Client: (i) any Security Incident; and (ii) any use or disclosure of the Protected Health Information not provided for by this Agreement, of which it becomes aware, including any security incident reportable under 45 C.F.R. § 164.314(a)(2)(i)(C). Such report shall be made without undue delay and no later than three (3) business days after Vendor’s discovery of the security incident or inconsistent use or disclosure, including incident or inconsistent use or disclosure by an agent or subcontractor of Vendor. Notwithstanding the foregoing, Vendor agrees to notify Client of attempted, unsuccessful security incidents on a reasonable basis, to be determined by


To the extent Vendor accesses, maintains, retains, modifies, records, stores, destroys or otherwise holds, uses or discloses Unsecured Protected Health Information, Vendor shall notify Client in accordance with 45 C.F.R. § 164.410 of any Breach of such Unsecured Protected Health Information. Such notification shall be made without undue delay and no later than three (3) business days after the Breach is discovered by Vendor. The notification of Breach shall be provided in writing and shall include, to the extent possible, the identification of each individual whose Unsecured Protected Health Information has been, or is reasonably believed by the Vendor to have been, accessed, acquired, used, or disclosed during the Breach. Vendor shall also provide the applicable Client any other information that the Client is required to include in notification to the individual under HIPAA at the time of the notification, or as promptly thereafter as such information becomes available. Vendor shall indemnify Client for expenses Client incurs, directly or indirectly, in investigating and remediating any Breach of Unsecured Protected Health Information caused by Vendor or its subcontractors or agents, as well as expenses Client incurs in notifying individuals of any Breach of Unsecured Protected Health Information, including, but not limited to, attorneys’ fees and credit monitoring and identity theft restoration.

Maintenance. In compliance with 45 C.F.R. § 164.306(e), Vendor will review and modify the security measures implemented under 45 C.F.R. Part 164, Subpart C as needed to continue provision of reasonable and appropriate protection of Electronic Protected Health Information, and update documentation of such security measures in accordance with 45 C.F.R. § 164.316(b)(2)(iii).

Mitigation. Vendor agrees to take commercially reasonable steps to mitigate harmful effects from any Breach of Unsecured Protected Health Information or other Security Incident or inconsistent use or disclosure of the Protected Health Information which Vendor is required to report to Client pursuant to this Agreement.

Agents and Subcontractors. In accordance with 45 C.F.R. §§ 164.308(b)(2) and 164.314(a)(2)(i)(B), Vendor agrees to ensure that any agent, including a subcontractor, that creates, receives, maintains or transmits Protected Health Information on behalf of Vendor agrees, by contract or arrangement compliant with 45 C.F.R. §§ 164.314(a)(2)(iii) and 164.504(e)(5): (i) to the same restrictions and conditions that apply through this Agreement to Vendor with respect to such information; (ii) to the implementation of reasonable and appropriate privacy and security safeguards to protect Protected Health Information; and (iii) to otherwise comply with the applicable requirements of 45 C.F.R. Part 164, Subpart C.

Access. To the extent Vendor holds information in a Designated Record Set, Vendor agrees to provide access to, at the request of Client, Protected Health Information in a Designated Record Set, to Client or, as directed by Client, to a Covered Entity or an Individual in order to meet the access requirements under HIPAA. To the extent that Vendor is deemed to use or maintain an Electronic Health Record on behalf of Client, then, to the extent an individual has the right to request a copy of the Protected Health Information maintained in such Electronic Health Record pursuant to HIPAA and makes such a request to Vendor, Vendor shall provide such individual with a copy of the information contained in such Electronic Health Record in an electronic format and, if the individual so chooses, transmit such copy directly to an entity or person designated by the individual.

Amendment. To the extent Vendor holds information in a Designated Record Set, Vendor agrees to make any amendment(s) to Protected Health Information in a Designated Record Set that Client directs or agrees to pursuant to HIPAA and as directed by Client.

Accounting. Vendor agrees to document disclosures of Protected Health Information and information related to such disclosures as would be required for Client to respond to a request by a Covered Entity or an Individual for an accounting of disclosures of Protected Health Information in accordance with the requirements under HIPAA. To the extent that Vendor is deemed to use or maintain an Electronic Health Record on behalf of Client, Vendor shall maintain an accounting of any disclosures made through an Electronic Health Record for treatment, payment and health care operations, as applicable and as of the effective date set by the Secretary. Such accounting shall comply with the other applicable requirements of HIPAA. Upon request by the Client, Vendor shall provide such accounting to the Client in the time and manner specified by HIPAA.

Government Access. Vendor agrees to make internal practices, books, and records, including policies and procedures and Protected Health Information, relating to the use and disclosure of Protected Health Information available to the Secretary for purposes of the Secretary determining Client’s compliance with HIPAA.

Minimum Necessary. To the extent required by HIPAA and any guidance issued by the Secretary, Vendor shall request, use and disclose only the Minimum Necessary Protected Health Information to accomplish its obligations under the Arrangement.

Remuneration. Vendor agrees that it shall not, directly or indirectly, receive remuneration in exchange for any Protected Health Information except as otherwise permitted by HIPAA.

Marketing. Vendor shall not use or disclose Protected Health Information for the purpose of making a communication about a product or service that encourages recipients of the communication to purchase or use the product or service, unless Vendor has obtained written agreement from Client that such communication otherwise complies with HIPAA.

Applicable State Law. Vendor agrees to comply with state laws regarding reporting of data breaches and identity theft protection to the extent that such laws are applicable to some or all Protected Health Information maintained by Vendor.


4. OBLIGATIONS OF CLIENT

Inform Vendor of Privacy Practices and Restrictions. Client shall notify Vendor of any limitation(s) in a relevant Covered Entity’s notice of privacy practices, as communicated to Client by the Covered Entity, to the extent that such limitation may affect Vendor’s use or disclosure of Protected Health Information. Client shall notify Vendor of any changes in, or revocation of, permission by an individual to use or disclose Protected Health Information, as communicated to Client by a relevant Covered Entity, to the extent that such changes may affect Vendor’s use or disclosure of Protected Health Information. Client shall notify Vendor of any restriction on the use or disclosure of Protected Health Information, as communicated to Client by a relevant Covered Entity, that the Covered Entity has agreed to in accordance with HIPAA, to the extent that such restriction may affect Vendor’s use or disclosure of Protected Health Information. Restrictions or limitations pursuant to the provisions of the Data Use Agreement require no further notification.

Minimum Necessary. To the extent required by HIPAA and any guidance issued by the Secretary, Client shall disclose only the Minimum Necessary Protected Health Information to Vendor to accomplish the purpose of the disclosure.

Permissible Requests. Client shall not request Vendor to use or disclose Protected Health Information in any manner that would not be permissible under the Privacy Rule.

5. TERM AND TERMINATION

Term. The term of this Agreement shall be effective as of the Effective Date, and shall terminate upon the earlier to occur of: (i) the termination of the Arrangement for any reason or (ii) the termination of this Agreement pursuant to the provisions herein.

Termination for Cause. In compliance with 45 C.F.R. § 164.504(e)(1)(ii), either party may terminate this Agreement due to a material breach of this Agreement by the other party upon giving the other party thirty (30) days prior written notice; provided the breaching party does not cure the breach prior to the effective date of termination.

Effect of Termination. Upon the termination of this Agreement, for any reason, in compliance with 45 C.F.R. § 164.504(e)(2)(ii)(J), Vendor shall destroy all Protected Health Information received from Client, or created or received by Vendor on behalf of Client, consistent with the terms of the Data Use Agreement entered into by and among Vendor and [governmental entities]. This provision shall apply to Protected Health Information that is in the possession of subcontractors or agents of Vendor. Vendor shall retain no copies of the Protected Health Information, unless otherwise authorized consistent with the terms of the Data Use Agreement. The provisions of this Section shall survive the termination of this Vendor Agreement.


6. AMENDMENT. The parties acknowledge that state and federal laws relating to data and Protected Health Information security and privacy are rapidly evolving and that amendment of this Agreement may be required to ensure compliance with such developments. Except as otherwise limited in this Agreement, the parties agree to take such action as is necessary to amend this Agreement from time to time as necessary for the parties to comply with the requirements of HIPAA.

7. INTERPRETATION. This Agreement and the Arrangement shall be interpreted as broadly as necessary to implement and comply with HIPAA. The parties agree that any ambiguity in this Agreement shall be resolved in favor of a meaning that complies with and is consistent with HIPAA.


CHAPTER 14

Cyber Risk Insurance and Risk Management

James A. Kitces, Esq.
Robins Kaplan LLP, Boston

Michael V. Silvestro, Esq.
Skarzynski Black LLC, Chicago, IL

§ 14.1 Introduction
§ 14.2 Cyber Risk Insurance Forms
§ 14.3 First-Party Cyber Risk Coverages
§ 14.3.1 Data Breach Response Costs
§ 14.3.2 Data Loss
§ 14.3.3 Network Service Interruption / Business Interruption
§ 14.3.4 Cyber Extortion and Ransom Coverage
§ 14.3.5 Reputational Damage
§ 14.3.6 Physical Loss or Damage to Property
§ 14.3.7 Crime
§ 14.4 Third-Party Cyber Risk Coverages
§ 14.4.1 Data Security Liability
§ 14.4.2 Privacy Liability
§ 14.4.3 Media Content Liability
§ 14.4.4 Products Liability
§ 14.5 First-Party Cyber Risk Insurance Case Law
§ 14.5.1 American Guarantee & Liability Insurance Co. v. Ingram Micro, Inc.
§ 14.5.2 Lambrecht & Associates v. State Farm Lloyds
§ 14.5.3 Apache Corp. v. Great American Insurance Co.
§ 14.5.4 Principle Solutions Group, LLC v. Ironshore Indemnity, Inc.
§ 14.5.5 InComm Holdings Inc. v. Great American Insurance Co.
§ 14.5.6 Taylor & Lieberman v. Federal Insurance Co.
§ 14.5.7 State Bank of Bellingham v. BancInsure Inc.
§ 14.5.8 Pestmaster Services Inc. v. Travelers Casualty and Surety Co. of America
§ 14.5.9 Aqua Star (USA) Corp. v. Travelers Casualty and Surety Co. of America
§ 14.5.10 American Tooling Center, Inc. v. Travelers Casualty and Surety Co. of America
§ 14.5.11 Medidata Solutions, Inc. v. Federal Insurance Co.
§ 14.6 Third-Party Cyber Risk Insurance Case Law
§ 14.6.1 First Bank of Delaware, Inc. v. Fidelity & Deposit Co. of Maryland
§ 14.6.2 Retail Ventures, Inc. v. National Union Fire Insurance Co.
§ 14.6.3 State Auto Property & Casualty Insurance Co. v. Midwest Computers & More
§ 14.6.4 Eyeblaster, Inc. v. Federal Insurance Co.
§ 14.6.5 First Commonwealth Bank v. St. Paul Mercury Insurance Co.
§ 14.6.6 Zurich American Insurance Co. v. Sony Corp. of America
§ 14.6.7 Recall Total Information Management, Inc. v. Federal Insurance Co.
§ 14.6.8 Travelers Indemnity Co. of America v. Portal Healthcare Solutions, LLC
§ 14.6.9 Hartford Casualty Insurance Co. v. Corcino & Associates
§ 14.6.10 Travelers Indemnity Co. of Connecticut v. P.F. Chang’s China Bistro, Inc.
§ 14.6.11 Travelers Property Casualty Co. of America v. Federal Recovery Services, Inc.
§ 14.6.12 P.F. Chang’s China Bistro, Inc. v. Federal Insurance Co.
§ 14.6.13 Camp’s Grocery, Inc. v. State Farm Fire & Casualty Co.
§ 14.7 Cyber Risk Management
§ 14.7.1 Evaluation of the Types of Electronically Stored Information
§ 14.7.2 Evaluation of Current Network Infrastructure and Security
§ 14.7.3 Media Publication Risks
§ 14.7.4 Development of Data Security Protocols
§ 14.7.5 Employee Devices
§ 14.7.6 Creating a Data Breach Response and Notification Protocol
§ 14.7.7 Evaluation of Current Insurance Programs and Coverage Gaps
§ 14.7.8 Employee Training
§ 14.7.9 Regulatory Compliance


Scope Note
This chapter provides an overview of cyber insurance forms and coverages, relevant first-party and third-party cyber insurance case law, and practical considerations for the management of cyber risks.

§ 14.1 INTRODUCTION

On a regular basis, we see news headlines announcing another breach of a company’s computer system. These security breaches can arise from any number of scenarios—malicious hacking by an unknown entity, malware, accidental disclosure of confidential or personal information, a lost laptop at the airport. Due to the ever-expanding volume and variety of data stored by businesses, the risks associated with data security breaches are real and often costly. According to one recent study, the average total cost of remediation efforts for 2017 was $3.6 million per incident. 2017 Cost of Data Breach Study, Ponemon Institute (2017).

These risks are often referred to as cyber risks. Broadly, cyber risks are risks related to or involving computers, computer networks, or the Internet. Cyber risks may involve direct exposures to insureds—such as business interruption losses, data breach response costs, and data asset losses—and potential liabilities to third parties, including regulatory and civil liabilities. Exposures to cyber risks are ubiquitous, as nearly every business, regardless of size or industry, maintains some electronically stored information on local or cloud-based computer systems.

The cyber insurance market continues to grow as more and more companies purchase cyber risk insurance to cover both first-party risks, such as damage to electronic systems or data and lost business income from cyber events, and third-party risks, such as the potential liability created when confidential or personal data is disseminated. Traditionally, purchasers of cyber risk insurance were mainly large companies, such as media and technology firms, manufacturers, retailers, and health-care and financial services providers. In recent years, however, increased demand from a variety of businesses, including smaller companies and professional services firms, has greatly expanded the cyber risk insurance market and the variety of coverage offerings.

§ 14.2 CYBER RISK INSURANCE FORMS

Unlike many forms of insurance, cyber risk coverages are relatively new. This is due in part to the dynamic nature of technologies used in business and their associated exposures. As a result, policy wordings and coverages are not standardized and may vary greatly. Many types of cyber risks are often excluded under insurance policies, including many common first-party and third-party forms. For example, Insurance Services Office, Inc., (ISO) commercial general liability (CGL) policy endorsement form CG 21 07 05 14 (2013), titled “Exclusion—Access or Disclosure of Confidential or Personal Information and Data-Related Liability—Limited Bodily Injury Exception Not Included,” adds exclusions to both Coverages A and B related to the disclosure of confidential or personal information. This endorsement adds exclusionary language to Coverage A, stating that the coverage provided does not apply to the following:

p. Access or Disclosure of Confidential or Personal Information and Data-related Liability

Damages arising out of:

(1) Any access to or disclosure of any person’s or organization’s confidential or personal information, including patents, trade secrets, processing methods, customer lists, financial information, credit card information, health information or any other type of non-public information; or

(2) The loss of, loss of use of, damage to, corruption of, inability to access, or inability to manipulate electronic data.

Practice Note
The term “electronic data” is defined in form CG 21 07 05 14 as “information, facts or programs stored as or on, created or used on, or transmitted to or from computer software, including systems and applications software, hard or floppy disks, CD-ROMS, tapes, drives, cells, data processing devices or any other media which are used with electronically controlled equipment.”

This exclusion applies even if damages are claimed for notification costs, credit monitoring expenses, forensic expenses, public relations expenses or any other loss, cost or expense incurred by you or others arising out of that which is described in Paragraph (1) or (2) above.

The endorsement also adds a similar exclusion to Coverage B, stating that the coverage provided excludes the following:

Access or Disclosure of Confidential or Personal Information

“Personal and advertising injury” arising out of any access to or disclosure of any person’s or organization’s confidential or personal information, including patents, trade secrets, processing methods, customer lists, financial information, credit card information, health information or any other type of nonpublic information.

This exclusion applies even if damages are claimed for notification costs, credit monitoring expenses, forensic expenses, public relations expenses or any other loss, cost or expense incurred by you or others arising out of any access to or disclosure of any person’s or organization’s confidential or personal information.


In light of such exclusions in a number of standard insurance forms, some insurers offer separate coverages for cyber risks, whether as endorsements to existing policies, as components of package programs, as standalone policies, or on a difference-in-conditions basis. Many such forms are tailored to the cyber risk exposures unique to specific industries or business classes.

§ 14.3 FIRST-PARTY CYBER RISK COVERAGES

This section discusses common first-party cyber risk coverages that may be offered under cyber insurance endorsements or standalone policies. These coverages vary greatly, although many forms include both first- and third-party coverages for cyber risks. In the case of endorsements, these coverages are often drafted to supplement existing coverages or to alter or remove exclusionary language in the underlying policy. Many standalone policies offer broader coverages, and some are positioned as end-to-end risk management products that may offer additional services, such as risk management consultations.

§ 14.3.1 Data Breach Response Costs

Responding to a data breach event can be a costly endeavor for affected businesses. Depending on the type and magnitude of a breach, response costs can run into the millions. Cyber insurance forms for the response to data breaches may include the following types of coverages:

• Notification costs—Costs associated with notifying individuals as required by applicable law. Large-scale data breaches of retailers, health-care providers and other consumer-facing businesses may require providing notice to millions of potentially affected individuals.

• Forensic computer analysis—Costs associated with the forensic analysis of affected computer systems or data to establish the existence of a data breach and/or to determine its cause.

• Data breach mitigation—Costs associated with prevention of further losses following a data breach. Examples include credit and identity monitoring services provided to individuals affected by a data breach.

• Call-center costs—Costs associated with the establishment of a call center to field inquiries from affected individuals and to provide information as required by applicable law.

• Public relations—Costs associated with public relations and crisis management related to a data breach.

• Legal fees—Costs associated with legal services related to compliance with notification or other applicable laws.

§ 14.3.2 Data Loss

Coverage may also be provided under cyber forms for the loss of or damage to data or other electronic assets. Coverage may apply where data is lost or damaged due to a cyber event, such as a virus, or where data storage equipment is damaged, such as by a fire that destroys a computer network, resulting in data loss. Importantly, loss of or damage to electronic data is typically excluded from standard commercial property insurance policies.

§ 14.3.3 Network Service Interruption / Business Interruption

Varying types of business interruption and business income coverages may also be included in cyber policies. These coverages are broadly designed to insure businesses against lost income arising from a cyber event. One example is network interruption coverage, which addresses loss of income from the suspension or interruption of business operations due to a network security failure or other network failure. Coverages may also extend beyond computer systems to include other kinds of operational technologies that may have cyber exposures, such as certain types of equipment used in manufacturing and utility settings. Similar to other forms of business interruption insurance, extra expense coverage may also be included.

As in other insurance lines, the terms of business interruption coverage for cyber risks can vary greatly. Some forms may require an actual cessation of business activities to trigger coverage, while others may require only a slowdown in business operations from a cyber event. Many forms also include waiting periods that require the interruption to persist for several hours or days before coverage is triggered. Some forms may also provide contingent business interruption coverage, which could apply in the event of a cyber event at an insured’s business partner that impacts the insured’s business.

§ 14.3.4 Cyber Extortion and Ransom Coverage

Under some forms, coverage may be provided for response to cyber extortionists. Conceptually, this type of insurance is akin to traditional ransom and extortion insurance, but for a computer network or data assets. Thus, coverage may include costs associated with paying ransom to an extortionist to prevent the release of surreptitiously acquired information or to regain access to hacked systems.

Cyber extortion can be crippling. For example, in 2015 it was widely reported that a Massachusetts police department was the victim of a computer system intrusion. The hacker encrypted the police department’s records and demanded payment to decrypt the data. The police department sought help from the FBI, but ultimately paid the ransom to regain access to its systems.

§ 14.3.5 Reputational Damage

Insurance coverage may also be available for reputational damage resulting from a cyber event. The coverage provided typically involves expenses related to crisis management communications and other public relations. Coverage for reputational threats may also be included. Reputational damage insurance may also take the form of coverage for lost value or revenue for certain industry segments, although such coverage is less common.


§ 14.3.6 Physical Loss or Damage to Property

As the “Internet of things” grows, so too do risks associated with networked equipment and machinery. Many typical first-party policy forms, such as commercial property insurance policies, contain exclusions that may apply to physical losses arising from hacking or malfunction of electronic or computer equipment. Supplemental coverages may be available for physical loss or damage to property arising from a cyber event. An example would be coverage for damage to equipment that was hacked and manipulated by an outside entity, resulting in damage to that equipment and/or other property. Some insurers have developed policy forms offering cyber risk coverage for property damage on a standalone or difference-in-conditions basis, with specialty offerings tailored to certain industries, including the energy sector and certain manufacturers.

§ 14.3.7 Crime

Some cyber risk coverages are also available for first-party financial losses caused by computer fraud, payments or transfers from fraudulent computer instructions, spoofing of credentials, and social engineering. Such coverages could apply to scenarios where an employee is fraudulently induced to transfer funds to a third party by electronic means, such as e-mail. As discussed below, a number of recent coverage disputes have been litigated over such coverages. However, some of these risks may also be covered (or excluded) in traditional computer crime policies, which sometimes offer limited coverage by endorsement for risks such as e-mail impersonation.

§ 14.4 THIRD-PARTY CYBER RISK COVERAGES

In the event that confidential or personal information is disclosed as the result of a data breach, businesses may face claims brought by individuals, businesses, and administrative or government agencies. The nature and extent of these exposures to third-party liability will vary depending on the manner of the data breach and the type of information that is at issue. This section discusses common coverages available for liability to third parties for cyber risk events.

§ 14.4.1 Data Security Liability

Liability for the failure to protect confidential information is one of the major risks associated with data breaches and other cyber events. This confidential information can take many forms, ranging from personal information (such as health records, Social Security numbers, or credit card information) to confidential business information involving third parties that may be in the care, custody, or control of an insured entity. Coverage may be provided under cyber forms for damages arising from the theft, loss, or unauthorized disclosure of such confidential third-party information or personally identifiable information. This type of coverage may also be extended to include claims against an insured for failure to prevent transmission of viruses or other malicious code from an insured’s computer systems to computer or network systems owned or operated by third parties.

Data security liability insurance may also provide coverage for certain errors or omissions by an insured entity. Some forms provide coverage for claims related to an insured’s failure to implement or comply with a privacy policy that prohibits or restricts the disclosure, sharing, or selling of confidential or personally identifiable information or that mandates procedures and requirements to prevent the loss of such information. Coverage may also be extended in some forms for liability arising from an insured’s failure to administer an identity theft prevention program or information disposal pursuant to applicable federal or state law. Failures to comply with security standards, such as the Payment Card Industry Data Security Standard (PCI DSS), may impact the availability of coverage for businesses that maintain consumer credit card information; however, some insurers offer forms that include PCI DSS coverages.

§ 14.4.2 Privacy Liability

Businesses that store personal or confidential information may also face exposure due to violation of state, federal, or local privacy statutes. For example, a medical provider or business may ultimately incur liability to customers or employees as the result of the publication of stolen medical records or other personally identifying information. Depending on the nature of the information that is disclosed, a single event may result in liability under privacy statutes in multiple jurisdictions. Some policy forms extend this coverage to information that is stored offline in addition to electronically stored information.

Many cyber insurance forms also offer coverage for regulatory penalties and associated regulatory defense costs. The potential exposure to regulatory actions at both the state and federal level will often depend on the type and cause of data breach, the manner and extent to which confidential or personal information is disseminated, and the type of information or data that is the subject of the breach. For example, businesses that maintain customer health, financial, or other personal information may be exposed to a range of regulatory actions or fines depending on the applicable state and federal law relating to disclosure of such information.

§ 14.4.3 Media Content Liability

Media content liability insurance addresses exposures faced by companies that publish content. In the context of cyber insurance, media content liability coverage may be limited to electronic publications, but traditional media content coverages may extend to print and other offline media. It is important to note that this exposure is not limited to traditional media companies, such as broadcasters or news organizations. Rather, any company that maintains a website or social media presence of any kind may have potential media content exposures from cyber risks. Examples include companies with hacked Twitter accounts and those with websites that publish employee- or user-generated content.

The types of claims that a third party may bring against an insured that publishes content are varied. Common claims include allegations of trademark or copyright infringement, defamation, product disparagement, or intentional infliction of emotional distress.

§ 14.4.4 Products Liability

Products liability coverage for cyber risks is an emerging area, particularly as connected “Internet of Things” (IoT) devices proliferate. Many standard products and completed operations forms contain broad electronic data exclusions that could bar coverage in the event that a connected device is affected, resulting in property damage or bodily injury. Some specialty coverage forms targeted to software developers and connected device manufacturers have already been introduced.

§ 14.5 FIRST-PARTY CYBER RISK INSURANCE CASE LAW

As discussed above, traditional insurance policies do not typically provide coverage for cyber risks. As cyber risks have emerged as a real and persistent threat to business, policy holders have sought coverage for these risks in the context of traditional insurance policies—whether or not the risks were suited to traditional coverage. Given that stand-alone cyber risk policies are still relatively new and varied, coverage issues under such forms have not yet been extensively litigated. Notably, Massachusetts courts have not yet adjudicated these types of coverage disputes.

In the early 2000s, however, both federal and state courts addressed this issue in the context of all-risk business insurance policies. The outcomes of these cases were generally favorable to policy holders while serving as a warning to insurers. Some courts began to interpret “property damage” broadly to include damage to traditionally intangible property like computer data. Moreover, even when courts declined to address whether cyber risks could constitute physical damage, they turned to other policy provisions to find coverage for policy holders.

There is still a somewhat limited body of case law nationwide concerning cyber coverage under first-party policies, and fewer still where coverage issues have been litigated under cyber risk–specific forms. As discussed below, the majority of significant court decisions have arisen in the context of fraud cases, often where an insured’s employee was fraudulently induced to transfer funds to a third party. Many of these coverage disputes have arisen under more-traditional forms, such as crime policies. At present, there has not been occasion for the courts to adjudicate many first-party coverages in the cyber context.

§ 14.5.1 American Guarantee & Liability Insurance Co. v. Ingram Micro, Inc.

The U.S. District Court for the District of Arizona has applied a broad definition of the term “physical damage.” In American Guarantee & Liability Insurance Co. v. Ingram Micro, Inc., 2000 U.S. Dist. LEXIS 7299, 2000 WL 726789 (D. Ariz. Apr. 18, 2000), American Guarantee denied coverage to its policy holder, Ingram Micro, a computer wholesaler, under a first-party all-risk insurance policy after Ingram’s computer processing and ordering system was rendered inoperable for eight hours by a power outage.

Ingram utilized a worldwide network of computers to send and receive orders and to track customers and products. One morning, the location where Ingram’s program was based suffered a power outage. The power outage did not disrupt electrical service to the building, but all of the electrical equipment in the building stopped working. While some of Ingram’s computer equipment began working again as soon as power was restored—a half hour after the outage—its three mainframe computers lost their programming information. The computers remained down until Ingram’s programmers restored the mainframe and bypassed a malfunctioning matrix switch, which took eight hours.

Ingram’s all-risk insurance policy insured against “[a]ll risks of direct physical loss or damage from any cause, however or wheresoever occurring, including general average, salvage charges or other charges, expenses and freight.” American Guarantee argued that Ingram’s computer system and related matrix switch did not suffer physical damage as a result of the power outage. The insurer reasoned that the system remained capable of performing its intended functions throughout the duration of the inoperable period insofar as it never lost the ability to receive and process data. Ingram countered that a broad interpretation of the term “physical damage” included loss of use and functionality of the mainframe computers and matrix switch, even if they could still accept and process information once the computer system was restored.

The court sided “with Ingram’s broader definition of ‘physical damage,’” holding that “‘physical damage’ is not restricted to the physical destruction or harm of computer circuitry but includes loss of access, loss of use, and loss of functionality.” As such, the court determined that Ingram’s computer system and matrix switch had been physically damaged for the eight hours between the power outage and when the system was up and running again. In support of its holding, the court looked to the federal computer fraud statute (18 U.S.C. § 1030 (1999)) and state computer crime laws, which provide that the unavailability of computer data, interruption of computer services, and alteration of computer software networks constitute damage.

§ 14.5.2 Lambrecht & Associates v. State Farm Lloyds

Alternatively, some courts have resolved the issue of whether a loss to tangible property occurred without addressing whether computer data is “physical” or “tangible.” For example, in Lambrecht & Associates v. State Farm Lloyds, 119 S.W.3d 16 (Tex. App.—Tyler 2003), a business insurance policy provided coverage for “accidental direct physical loss to business personal property at the premises described.” The policy also provided coverage for “the actual loss of ‘business income’ you sustained due to the necessary suspension of your ‘operations’ during this ‘period of restoration’ . . . caused by accidental direct physical loss to property at the described premises.” The policy holder, Lambrecht & Associates, an employment agency, sought coverage after its computer system slowed down, ultimately freezing and suspending its employees’ ability to communicate with prospective employers and employees until Lambrecht had replaced its server operating system and other software and manually reentered a substantial amount of data. State Farm denied the claim because a hacker had allegedly entered the computer system and released a computer virus, which resulted in the computer failure and subsequent business income loss. State Farm contended that the losses suffered by Lambrecht at the hands of the hacker were “neither physical nor accidental.”

The Court of Appeals disagreed with State Farm, reversing the trial court’s grant of summary judgment in favor of the insurer. The court reasoned that State Farm had “not alleged that Lambrecht participated in any conduct that, when viewed from Lambrecht’s perspective, would cause Lambrecht to reasonably believe that such conduct would result in its injury.” The court also found that there was no evidence that Lambrecht “was involved in any voluntary or intentional conduct or took any action which caused the damage Lambrecht suffered.” The court refused to impart the hacker’s intent on Lambrecht without any such evidence. In keeping with its reasoning, the court found “that the injection of the virus and the resulting damage was an unexpected and unusual occurrence and was, from Lambrecht’s viewpoint, accidental.”

Turning to the issue of whether the losses constituted “physical” losses, the court avoided confronting the issue of whether computer data was tangible or intangible property under the policy. Rather, the court looked to the plain language of the policy, which defined coverage for loss of income to include “electronic media and records,” including “electronic data processing, recording or storage media such as films, tapes, discs, drums or cells; . . . data stored on such media; or . . . programming records used for electronic data processing or electrically controlled equipment.” As such, the court determined that damage to Lambrecht’s server was physical damage as a matter of law as “the server falls within the definition of ‘electronic media and records’ because it contained a hard drive or ‘disc’ which could no longer be used for ‘electronic data processing, recording and storage.’” The court also held that the data stored on the servers constituted covered electronic media and records “because it was the ‘data stored on such media.’”

§ 14.5.3 Apache Corp. v. Great American Insurance Co.

The first of many fraud cases discussed herein, Apache Corp. v. Great American Insurance Co., No. 4:14-cv-237 (S.D. Tex. Aug. 7, 2015), involved a common fraud scheme in which criminals defrauded Apache Corporation of legitimate vendor invoice payments. An Apache employee in Scotland received a telephone call from a person identifying herself as a representative of one of Apache’s vendors. The caller instructed Apache to change the bank account information for its payments to the vendor. The Apache employee demanded a written request, which the criminals supplied via a fraudulent e-mail attachment. Apache confirmed the contents of the e-mail in a telephone call with the criminals masquerading as the vendor’s employees and made subsequent payments to the fraudulent account.

Apache discovered the fraud, recouped a portion of the monies, and submitted a claim for the balance under the “Computer Fraud” provision of its crime-protection insurance policy, which states: “[w]e will pay for loss of, and loss from damage to, money, securities and other property resulting directly from the use of any computer to fraudulently cause a transfer of that property from inside the premises or banking premises: a. to a person (other than a messenger) outside of those premises; or b. to a place outside those premises.” Great American Insurance Company (GAIC) denied the claim, asserting that the loss “did not result directly from the use of a computer nor did the use of a computer cause the transfer of the funds.”

Apache filed a declaratory judgment action in Texas state court, which was removed to federal court. The federal district court granted Apache’s motion for summary judgment, holding that “the intervening steps of the [post-e-mail] confirmation phone call and supervisory approval do not rise to the level of negating the email as being a ‘substantial factor.’” Apache Corp. v. Great. Am. Ins. Co., No. 4:14-cv-237 (S.D. Tex. Aug. 7, 2015).

On appeal, the Fifth Circuit reversed and rendered judgment for GAIC, Apache Corp. v. Great American Insurance Co., 662 Fed. Appx. 252 (5th Cir. 2016), holding that the e-mail, while part of a scheme, was merely incidental to the occurrence of the money transfer and insufficient to trigger coverage under the policy. To find coverage for any fraudulent scheme involving e-mail, the court concluded, would convert the computer-fraud provision into a general fraud provision. Apache Corp. v. Great Am. Ins. Co., 662 Fed. Appx. at 258. For these reasons, the court held that no coverage existed under the policy and rendered judgment for GAIC.

§ 14.5.4 Principle Solutions Group, LLC v. Ironshore Indemnity, Inc.

Another fraud case, Principle Solutions Group, LLC v. Ironshore Indemnity, Inc., No. 1:15-cv-4130-RWS (N.D. Ga. filed August 30, 2016), involved an illicit wire transfer of the insured entity’s corporate funds to a purported Chinese bank. The criminals induced the controller of Principle Solutions Group, LLC (Principle) to wire the funds based on a fraudulent e-mail and a series of fraudulent phone calls. Principle suffered a $1.7 million loss.

Principle was insured under a commercial crime policy which provided coverage for discrete categories of crimes, including “computer and funds transfer fraud.” The policy stated: “[w]e will pay for . . . [l]oss resulting directly from a ‘fraudulent instruction’ directing a ‘financial institution’ to debit your ‘transfer account’ and transfer, pay or deliver ‘money’ or ‘securities’ from that account.” Principle submitted a claim and sworn proof of loss. Ironshore denied coverage, and Principle filed a declaratory judgment action, seeking coverage and damages for alleged bad faith. Principle took the position that the fraudulent e-mail was a proximate cause of its loss. Ironshore countered that subsequent phone calls encouraged Principle’s employees to set up and approve the fraudulent wire transfer.

On cross motions for summary judgment, the court found the “computer and funds transfer fraud” clause to be ambiguous. Principle Solutions Grp., LLC v. Ironshore Indem., Inc., No. 1:15-cv-4130-RWS at *12 (N.D. Ga. filed August 30, 2016). Under Georgia rules of construction, the court construed the clause in favor of Principle to find coverage, holding that it was “reasonable for [Principle] to interpret the language of the policy to provide coverage even if there were intervening events between the fraud and the loss.” Of note, the court cited the Apache district court decision with approval, without reference to the Fifth Circuit opinion reversing that decision. Given the close coverage question, the court determined that Ironshore was entitled to judgment as a matter of law on Principle’s bad faith count.

§ 14.5.5 InComm Holdings Inc. v. Great American Insurance Co.

The issue presented in InComm Holdings Inc. v. Great American Insurance Co., 2017 U.S. Dist. LEXIS 38132, 2017 WL 1021749 (N.D. Ga. Mar. 16, 2017), is whether the “computer fraud” coverage afforded by a GAIC policy covers losses caused by an error in an insured debit card processing company’s code. The insured, InComm Holdings Inc. (InComm), wires money to certain banks whose debit card holders purchase and redeem “chits” from third-party retailers. The “chit” redemption process is strictly telephonic. “Chit” purchasers exploited a vulnerability in InComm’s processing code to obtain approximately $10.3 million in unauthorized telephonic redemptions. InComm sought coverage for the losses after discovering and remedying the code error.

The policy’s “computer fraud” coverage obligates GAIC to pay for “loss of, and loss from damage to, money, securities and other property resulting directly from the use of any computer to fraudulently cause a transfer of that property from inside the premises or banking premises: a. to a person (other than a messenger) outside those premises; or b. to a place outside those premises.” The policy defines “premises” as “the interior of that portion of any building you occupy in conducting your business.” “Banking premises” is defined as “the interior of that portion of any building occupied by a banking institution or similar safe depository.”

GAIC denied InComm’s claim, asserting no coverage because

• InComm’s alleged loss did not result from “the use of any computer” to access the telephonic interface;

• no funds were automatically transferred as a result of the code error; and

• InComm’s losses resulted from multiple discrete occurrences, none exceeding the policy’s deductible.

InComm filed a declaratory judgment action seeking coverage and pled separate counts sounding in breach of contract and bad faith. On cross motions for summary judgment, the court adopted GAIC’s policy construction and held that the subject telephonic “chit” redemptions did not involve “the use of any computer” and were therefore noncompensable under the policy’s “computer fraud” coverage as a matter of law. The court held separately that, even if computers were “used” in the “chit” redemption process, InComm’s loss did not result “directly” from the alleged computer use, which the court deemed incidental. The case is currently on appeal to the Eleventh Circuit Court of Appeals.


§ 14.5.6 Taylor & Lieberman v. Federal Insurance Co.

In Taylor & Lieberman v. Federal Insurance Co., 681 Fed. Appx. 627 (9th Cir. 2017), the U.S. Court of Appeals for the Ninth Circuit addressed coverage issues under a first-party crime policy where an accounting firm was induced to transfer funds to a third party based upon fraudulent e-mail instructions. The Ninth Circuit upheld the Central District of California’s decision that there was no coverage under the policy.

The subject crime policy provided coverage for an insured’s direct loss “resulting from Forgery or Alteration of a Financial Instrument by a Third Party.” The Ninth Circuit held that this language extended only to the forgery of a financial instrument, such as checks, drafts, etc., and that e-mails instructing the insured to wire money were beyond the scope of the forgery coverage.

The Ninth Circuit also held that the policy’s computer fraud coverage was inapplicable. The court held that sending an e-mail alone does not constitute unauthorized “entry into” the recipient’s computer systems. The court also held that the e-mails instructing the wiring of funds were not an “introduction of instructions” that “propagate[d] themselves” through a computer system, and therefore did not trigger coverage. Separately, the Ninth Circuit also held that the policy’s funds transfer fraud coverage did not apply because, even though the insured did not know the e-mailed instructions were fraudulent, it was aware of the wiring of funds.

§ 14.5.7 State Bank of Bellingham v. BancInsure Inc.

In State Bank of Bellingham v. BancInsure, Inc., 823 F.3d 456 (8th Cir. 2016), the U.S. Court of Appeals for the Eighth Circuit addressed coverage issues under a financial institution bond, which was treated as an insurance policy under Minnesota law. The bond provided coverage for losses arising from, among other things, employee dishonesty and computer fraud. At issue were fraudulent transfers made from the bank’s accounts, which were made after its computer system was infected by a Trojan horse virus. In addition, the bank employee who noticed the fraudulent transfers had apparently failed to comply with the bank’s security protocols and had left a computer running with her token credentials and those of another employee entered. The insurer denied coverage, and the bank was subsequently granted summary judgment in its favor.

Both the District Court and the Eighth Circuit applied the concurrent-causation doctrine to the claimed loss to find coverage under the policy. The Eighth Circuit explained that the efficient and proximate cause of the loss “was the illegal transfer of the money and not the employees’ violations of policies and procedures.” Further, the Eighth Circuit held that, even if the employees’ negligent actions “played an essential role” in the loss and created a risk of intrusion into the bank’s computer systems by the virus, the intrusion and ensuing loss of funds was not “certain” or “inevitable.” As such, the Eighth Circuit concluded that the “overriding cause” of the loss was the criminal activity of a third party.


§ 14.5.8 Pestmaster Services Inc. v. Travelers Casualty and Surety Co. of America

In another fraud case under a crime policy, the U.S. Court of Appeals for the Ninth Circuit found no coverage for the fraudulent transfer of funds. The subject policy included a funds transfer fraud provision, but the Ninth Circuit held that, even though the transfers were fraudulently induced, there was no coverage because the transfers themselves were expressly authorized. The Ninth Circuit also held that the policy language “fraudulently cause a transfer” required an unauthorized transfer of funds, and thus the transfers at issue were not fraudulently caused.

§ 14.5.9 Aqua Star (USA) Corp. v. Travelers Casualty and Surety Co. of America

Aqua Star (USA) Corp. v. Travelers Casualty and Surety Co. of America, 2016 WL 3655265 (W.D. Wash. July 8, 2016), involved a seafood importer whose vendor was hacked. The hacker apparently monitored e-mail communications between the importer and the vendor and then sent “spoofed” instructions for wire transfers. At issue in the litigation was whether a crime policy afforded coverage for the wire transfers resulting from the spoofed instruction e-mails.

The District Court granted summary judgment for the insurer based on an exclusion barring “loss resulting directly or indirectly from the input of Electronic Data by a natural person having the authority to enter the Insured’s Computer Systems.” Here, an employee of the insured updated a vendor spreadsheet containing payment information to include account information sent by the hacker. The District Court held that entering account data into the spreadsheet “was a necessary step to initiating any transfer,” and as such the entry of the data was an “intermediate step in the chain of events” that led to the transfer of funds to the hacker’s accounts. As such, the exclusion applied because the entry of the account information in the spreadsheet was an “indirect cause” of the loss. An appeal to the Ninth Circuit is pending.

§ 14.5.10 American Tooling Center, Inc. v. Travelers Casualty and Surety Co. of America

In American Tooling Center, Inc. v. Travelers Casualty and Surety Co. of America, 2017 WL 3263356 (E.D. Mich. Aug. 1, 2017), the Eastern District of Michigan considered yet another fraudulent e-mail scam involving “spoofed” wire transfer instructions that appeared to be from a vendor. After receiving the spoofed e-mails, the insured verified that certain production milestones had been met and then authorized payment to the accounts without independently verifying them first. The court held that there was no coverage, as the insured did not suffer a “direct” loss “directly caused” by the use of a computer as required under the policy, because of intervening events such as the insured’s verification of milestones and authorization of the transfers. In reaching its decision, the District Court relied upon and applied Apache Corp., Pestmaster, and InComm Holdings.


§ 14.5.11 Medidata Solutions, Inc. v. Federal Insurance Co.

The Southern District of New York reached a different conclusion in Medidata Solutions, Inc. v. Federal Insurance Co., 2017 WL 3268529 (S.D.N.Y. July 21, 2017). The case involved a cloud-based services provider whose employee received a spoofed e-mail purporting to be from the company president regarding a highly confidential potential acquisition. The e-mail further instructed that the employee would be contacted by an outside attorney, and that the employee should follow his instructions. The same day, the employee received a telephone call from a person purporting to be the outside attorney requesting a wire transfer. The employee explained that, in order to effectuate the transfer, she would need approval from senior managers, all of whom subsequently received a group e-mail purporting to be from the company president. The end result was that wire transfers to the fraudulent actor were approved.

The insured sought coverage under an executive protection policy which included forgery, computer fraud, and funds transfer coverage. The Southern District held, on competing summary judgment motions, that there was coverage under the policy’s computer fraud coverage. The court distinguished Pestmaster on the grounds that the fraud here was achieved by entry into the insured’s computer system with spoofed e-mails containing code that masked the hacker’s identity and changed data from the true e-mail address to the company president’s in order to achieve the e-mail spoof. The court also distinguished Apache, finding its causation analysis “unpersuasive,” as well as Taylor & Lieberman, on the grounds that the insured here did not suffer spoofed e-mails from one of its clients; rather, spoofed e-mails containing computer code were inserted into the insured’s e-mail systems. Separately, the court held that the funds transfer fraud coverage applied, while the forgery coverage did not, as the e-mails were not financial instruments.

§ 14.6

THIRD-PARTY CYBER RISK INSURANCE CASE LAW

There are a variety of potential sources of third-party exposure that a party may seek to insure against. Cyber threats range from security failures, including virus, malicious code, and malware attacks and internal security vulnerabilities, to privacy events, ranging from the failure to protect confidential and personal information to the violation of federal, state, and local privacy laws. As discussed above, a variety of third-party insurance coverage is available, including policies covering liability for security and privacy breaches, legal liability resulting from data breaches, and publication liabilities. Cyber Risk, NAIC (last updated April 3, 2017), available at http://www.naic.org/cipr_topics/topic_cyber_risk.htm; Cyber Risks: The Growing Threat, Insurance Information Institute (June 2014), available at http://www.iii.org/sites/default/files/docs/pdf/paper_cyberrisk_2014.pdf.

In recent years there have been a number of high-profile class-action lawsuits filed against corporations in the wake of large-scale data breaches. Customers and others affected by those data breaches have asserted breach of contract, negligence, breach of fiduciary duty, statutory, and other claims. A 2012 study identified more than 230 data security breach actions occurring between 2005 and 2010. Acquisti, Hoffman & Romanosky, Empirical Analysis of Data Breach Litigation (Temple Univ. Beasley Sch. of Law, Legal Studies Research Paper No. 2012-29). Many different types of claims were asserted in these cases. In an attempt to mitigate the damages incurred as a result of these claims, companies have looked to their traditional and cyber risk insurance policies for coverage. In the absence of illustrative Massachusetts case law, this section discusses cases from other jurisdictions that have had occasion to adjudicate these issues.

§ 14.6.1 First Bank of Delaware, Inc. v. Fidelity & Deposit Co. of Maryland

In First Bank of Delaware, Inc. v. Fidelity & Deposit Co. of Maryland, 2013 Del. Super. LEXIS 465, 2013 WL 6407603 (Del. Super. Ct. Oct. 30, 2013), First Bank of Delaware provided debit card transaction processing services for Visa and MasterCard. To facilitate that relationship and to complete debit card transactions, First Bank entered into a relationship with Data Access Systems (DAS). In 2008, DAS's Web server terminal was hacked; the hackers successfully accessed debit card numbers and corresponding personal identification numbers, which they used to make unauthorized withdrawals from consumer accounts. An investigation determined that DAS's system had not been in compliance with Visa's and MasterCard's security standards at the time of the data breach. Visa and MasterCard sought expenses from First Bank to address the breach and fraud loss compensation to customers. First Bank paid the amounts sought by Visa and MasterCard and sought coverage from its insurer, Fidelity & Deposit Co. of Maryland. Fidelity denied coverage, arguing that the losses fell outside the electronic risk liability coverage in First Bank's directors and officers liability policy. Fidelity reasoned that DAS had not processed card transactions on behalf of First Bank, but rather had done so on its own behalf.

The Superior Court of Delaware found coverage, granting summary judgment to First Bank. In order for First Bank's claims to trigger electronic risk liability coverage, they must have arisen out of a "loss event," defined by the policy as "any unauthorized use of, or access to electronic data or software within a computer system." The policy further defined a "computer system" as one "used by [First Bank] or used to transact business on behalf of the Company." Interpreting this policy language, the court reasoned that, for coverage to exist, First Bank did not need to have been the party to benefit primarily from the credit card transactions processed through the DAS computers. First Bank had met its initial burden of showing that coverage existed under the terms of the policy by showing that it was one of multiple parties that benefited from the use of DAS's computer system, as DAS had acted "on behalf of" First Bank. Moreover, while Fidelity correctly noted that the policy contained an exclusion for the "fraudulent use by any person or any entity of data," the court ruled that the exclusion could not apply to "swallow up" the grant of coverage. Where any unauthorized use of a computer system could have been, to some degree, fraudulent, application of the exclusionary language would serve to erase all coverage granted by the electronic risk liability coverage for "any unauthorized use of, or unauthorized access to electronic data . . . within a computer system." As such, the court held that First Bank had proved the existence of an exception to the exclusion and that there was coverage.

§ 14.6.2 Retail Ventures, Inc. v. National Union Fire Insurance Co.

In Retail Ventures, Inc. v. National Union Fire Insurance Co., 691 F.3d 821 (6th Cir. 2012) (Ohio law), the U.S. Court of Appeals for the Sixth Circuit addressed coverage under a commercial crime insurance policy for the theft of insured property by means of computer fraud. The court again used a broad interpretation to find coverage for a data breach loss. The loss occurred after hackers accessed Retail Ventures' computer system by means of a wireless network within Retail Ventures' shoe store. Retail Ventures submitted a claim to National Union for losses sustained after the policy holder incurred expenses for customer communications, public relations, lawsuits, attorney fees, and costs associated with consumer credit card information compromised by the breach. Retail Ventures claimed that its losses were covered under the terms of the policy's computer fraud rider. The U.S. District Court for the Southern District of Ohio agreed, awarding the policy holder almost $7 million in stipulated losses and prejudgment interest. Predicting how the Ohio Supreme Court would interpret Retail Ventures' loss, the District Court held that the court would apply a proximate cause standard based on the policy's "resulting directly from" language. Applying that standard, the District Court held that the link between the hackers' access to the computer system and Retail Ventures' resulting monetary loss was sufficient to trigger coverage.

National Union appealed, claiming that the District Court erred in determining that Retail Ventures suffered a loss "resulting directly from" the "theft of any Insured property by Computer Fraud," and that the court incorrectly used a proximate cause standard to evaluate causation. The Sixth Circuit affirmed the District Court's decision. The Sixth Circuit agreed with the District Court that Retail Ventures' loss resulted directly from the hacking and that the policy exclusion for the loss of confidential information did not apply to the loss of customer information. In reaching its findings, the court looked to the policy's computer and funds transfer fraud coverage, which provided coverage for loss resulting from "[t]he theft of any insured property by Computer Fraud." The court found that the provision was ambiguous. It then looked to Ohio case law to support the application of a proximate cause standard to determine whether Retail Ventures' loss resulted "directly from" the breach. The court held that the amounts paid to customers and credit card companies because of the data breach had been proximately caused by the breach. The court also rejected National Union's argument that coverage was barred under a policy exclusion for the "loss of proprietary information, Trade Secrets, Confidential Processing Methods or other confidential information of any kind." The court reasoned that Retail Ventures' customers' banking information was not Retail Ventures' confidential information.

§ 14.6.3 State Auto Property & Casualty Insurance Co. v. Midwest Computers & More

In State Auto Property & Casualty Insurance Co. v. Midwest Computers & More, 147 F. Supp. 2d 1113 (W.D. Okla. 2001), the insured, Midwest Computers & More, provided computer sales and repair services that business owners claimed had resulted in the loss of use of their computer system and the loss of data and business information stored on the computer system. Midwest Computers submitted a claim to its insurer, State Auto Property & Casualty, for defense and indemnity coverage in the underlying matter filed against it. State Auto, in turn, filed a declaratory judgment action in the U.S. District Court for the Western District of Oklahoma to determine its defense and indemnity obligations.

Addressing both parties' motions for summary judgment, the court held that damage to "tangible property" had occurred under the terms of the policy, but that the "your work" exclusion applied to bar coverage for Midwest Computers' claims. (The policy's "your work" exclusion excluded the following from coverage: "a. Work or operations performed by you or on your behalf; and b. Materials, parts or equipment furnished in connection with such work or operations. Your Work includes: a. Warranties or representations made at any time with respect to the fitness, quality, durability, performance or use of your work; and b. The providing of or failure to provide warnings or instructions.") The court reasoned that computer data is not tangible property because the data, unlike the medium on which it is stored, "cannot be touched, held, or sensed by the human mind; it has no physical substance." The court went on to find, however, that while computer data itself is not tangible property, the policy's covered "property damage" included "loss of use of tangible property," and the claimants had alleged that they had been left without the use of their computers.

After determining that there had been a loss within the boundaries of the policy's terms, the court next considered the policy exclusion for "property that must be restored, repaired, or replaced because 'your work' was incorrectly performed on it" in light of the policy's "completed operations hazard" exception. The court found that because the claimants alleged that the data loss occurred as a result of Midwest Computers' work on the computer system, but that it was not discovered until Midwest Computers returned to perform additional work on the system, the completed operations hazard exception to the "your work" exclusion did not apply to restore coverage. Therefore, the court ruled that State Auto had no duty to defend or indemnify Midwest Computers.

§ 14.6.4 Eyeblaster, Inc. v. Federal Insurance Co.

The U.S. Court of Appeals for the Eighth Circuit also applied the "tangible property" coverage term and the "your work" exclusion in Eyeblaster, Inc. v. Federal Insurance Co., 613 F.3d 797 (8th Cir. 2010). In that case, the court took a position contrary to that of the court in State Auto Property & Casualty Insurance Co. v. Midwest Computers & More, 147 F. Supp. 2d 1113 (W.D. Okla. 2001). In the underlying matter, a consumer sought to recover against Eyeblaster, Inc., for damages to his computer, computer software, and computer data that he allegedly suffered as a result of visiting Eyeblaster's website. Eyeblaster sought coverage from its insurer, Federal Insurance Co., under a general liability policy and a network technology errors or omissions liability policy. Federal Insurance denied coverage. Eyeblaster filed a declaratory judgment action, and the District Court granted summary judgment to Federal Insurance.

Eyeblaster appealed, arguing that the District Court failed to address its commercial general liability policy coverage for "loss of use of tangible property that was not physically injured." Eyeblaster's general liability policy provided coverage for "physical injury to tangible property, including resulting loss of use of that property . . . ; or loss of use of tangible property that is not physically injured." The policy excluded "any software, data or other information that is in electronic form" from the definition of "tangible property." The appellate court found that the underlying claimant had alleged damage to tangible property because a computer is tangible property, and the claimant had alleged that his computer was at various times frozen and inoperable. The court rejected Federal Insurance's assertion that, even if the claimant had lost the use of his computer, coverage was excluded because the computer was "impaired property." As grounds for its ruling, the court found that Federal Insurance had provided no evidence to support its contention that the computer could be restored upon the removal of Eyeblaster's work where there was no assertion that such work ever actually existed on the computer. Further, the fact that the claimant had unsuccessfully sought to repair the computer suggested that it was not merely impaired property that could have been restored upon removal.

The court turned next to the network technology errors or omissions policy, which expressly covered intangible property, including software, data, and electronic information. The court reasoned that while the complaint against Eyeblaster alleged that Eyeblaster had installed cookies, JavaScript, and Flash technology on the claimant's computer, it contained no allegation that Eyeblaster had done so intentionally. Therefore, Federal Insurance could not support any contention that its policy holder had committed an intentional wrongful act for which a denial of coverage would have been appropriate. Thus, the court reversed the District Court's finding of no coverage and ruled in favor of the policy holder.

§ 14.6.5 First Commonwealth Bank v. St. Paul Mercury Insurance Co.

In another recent case favoring the policy holder, First Commonwealth Bank v. St. Paul Mercury Insurance Co., Case No. 2:14-cv-00019-MPK (W.D. Pa. Oct. 6, 2014), the U.S. District Court for the Western District of Pennsylvania held that where a policy holder reimbursed consumers for losses sustained as a result of attacks on the policy holder's computer network, a liability policy did not bar coverage for those reimbursements where they were made pursuant to a statute. First Commonwealth Bank was attacked by hackers who obtained a client's username and password information, which the hackers then used to execute over $3 million in fraudulent wire transfers from the client's account. The bank reimbursed the client's stolen funds pursuant to a Pennsylvania statute. The court reasoned that such reimbursements cannot fall under policy exclusions for voluntary payments because a payment cannot be "voluntary" where it is compelled by statute.

§ 14.6.6 Zurich American Insurance Co. v. Sony Corp. of America

In Zurich American Insurance Co. v. Sony Corp. of America, No. 651982/2011 (N.Y. Sup. Ct. Feb. 21, 2014), appeal withdrawn per stipulation (N.Y. App. Div. Apr. 30, 2015), a New York trial court held that Zurich American Insurance Co. had no duty to defend its policy holder, Sony Corp. of America, for claims arising out of a large-scale data breach. Sony purchased a commercial general liability policy from Zurich that provided personal and advertising injury coverage for "oral or written publication in any manner of the material that violates a person's right of privacy." In April 2011, hackers breached Sony's online services and accessed millions of online videogame players' personal and credit card information. The breach resulted in dozens of class-action lawsuits against Sony, which sought defense and indemnification from Zurich.

The trial court first held that there was "publication" of the gamers' information under the policy terms "by just merely opening up that safeguard or that safe box where all of the information was." The court then turned to whether the publication had been done by Sony or the hackers, ultimately determining that it had been done by the hackers. "This is a case where Sony tried or continued to maintain security for this information. It was to no avail. Hackers criminally got in. They opened it up and they took the information." Therefore, the court found that there "was no act or conduct perpetrated by Sony, but it was done by third-party hackers illegally breaking into that security system." Because the court determined that coverage applied only to publication by Sony, the court sided with Zurich that hacking by third parties was not covered under the oral or written publication coverage provision. The court granted summary judgment for Zurich.

Sony appealed the trial court decision, arguing that the policy language contained no requirement that the underlying publication be done by the policy holder. The appellate court held oral arguments, and the parties reached a settlement before the court handed down a decision.

§ 14.6.7 Recall Total Information Management, Inc. v. Federal Insurance Co.

In Recall Total Information Management, Inc. v. Federal Insurance Co., 317 Conn. 46, 115 A.3d 458 (2015), the Supreme Court of Connecticut issued a decision in a cyber insurance case arising out of a commercial general liability policy dispute. In that case, the court affirmed a lower appellate court's holding that Federal Insurance had no duty to defend or indemnify Recall Total Information Management in underlying data breach litigation. The personal information of more than 500,000 IBM Corp. employees was exposed after IBM contracted with Recall Total to transport and store tapes containing the information. The tapes fell from a van that was traveling on the highway. IBM sought to recover more than $6 million that it had spent mitigating the loss of the tapes. Recall Total, in turn, sought coverage for the reimbursement from Federal Insurance as an additional insured on its subcontractor's insurance policy. Federal Insurance denied coverage, and Recall Total filed a breach of contract action.

The intermediate appellate court affirmed the trial court's grant of summary judgment for Federal Insurance, reasoning that the underlying loss of computer tapes did not constitute a "personal injury," which the policy defined as "electronic, oral, written or other publication of material that . . . violates a person's right to privacy." The court held that where there was no evidence that the lost computer tapes had been accessed by anyone, there had been no "publication" under the policy terms. Therefore, policy coverage was not triggered. The Connecticut Supreme Court affirmed.

§ 14.6.8 Travelers Indemnity Co. of America v. Portal Healthcare Solutions, LLC

The health care sector has proven to be especially vulnerable to cyber risks. In Travelers Indemnity Co. of America v. Portal Healthcare Solutions, LLC, 35 F. Supp. 3d 765 (E.D. Va. 2014), aff'd, 644 Fed. Appx. 245 (4th Cir. 2016), Travelers Indemnity Co. filed a declaratory judgment action to determine its duty to defend its policy holder, Portal Healthcare Solutions, a medical record keeper, against allegations that Portal Healthcare had posted its patients' confidential medical records on the Internet and made them publicly available. The crux of the issue was whether Portal Healthcare's posting of medical records in a searchable format constituted "publication" pursuant to the terms of the underlying policies. Travelers argued that the posting did not constitute publication, while Portal Healthcare argued that it did. The policies obligated Travelers to pay monies on behalf of its policy holder arising out of "the electronic publication of material that . . . gives unreasonable publicity to a person's private life" or the "electronic publication of material that . . . discloses information about a person's private life." After Portal Healthcare posted patient records online in a publicly searchable format, several patients found their own records after running Internet searches for their names. The patients then filed a class-action lawsuit against Portal Healthcare.

Because the underlying policies did not define "publication," the court gave the term its plain and ordinary meaning—"to place before the public (as through a mass medium)"—and reasoned that the exposure of medical records on the Internet in a place where the public could find them constituted "publication" under that definition. In reaching this conclusion, the court rejected Travelers' argument that there was no publication because Portal Healthcare never intended to release the records. The court reasoned that the policy holder's intent is not a relevant inquiry when determining whether there has been "publication," as the only analysis necessary is whether the information is placed before the public. Based on the same logic, the court also rejected Travelers' argument that there was no publication because the patient records were not viewed by a third party. Publication hinges on whether information was placed before the public, not whether a third party actually accessed it.

Turning to the second prong of the policies' publication coverage, the court found "that the public availability of a patient's confidential medical records gave 'unreasonable publicity' to that patient's private life and 'disclosed' information about that patient's private life, satisfying the policies' prerequisite for coverage." The court rejected Travelers' proposed narrow definition of publicity as an act designed to attract public interest in favor of a broader definition—that the policy holder "gave unreasonable publicity to patients' private lives when it posted their medical records online without security restriction." Ultimately, the court granted Portal Healthcare's motion for summary judgment and denied Travelers' motion for summary judgment "because exposing confidential medical records to public online searching placed highly sensitive, personal information before the public," so "the conduct falls within the Policies' coverage for 'publication' giving 'unreasonable publicity' to, or 'disclos[ing]' information about, a person's private life, triggering [the insurer's] duty to defend." The decision was affirmed by the Fourth Circuit in 2016.

§ 14.6.9 Hartford Casualty Insurance Co. v. Corcino & Associates

In Hartford Casualty Insurance Co. v. Corcino & Associates, 2013 U.S. Dist. LEXIS 152836, 2013 WL 5687527 (C.D. Cal. Oct. 7, 2013), Hartford Casualty Insurance Co. filed a declaratory judgment action against a policy-holder hospital, Corcino & Associates, after denying Corcino's indemnification claim for damages resulting from the hospital's alleged violation of patient privacy rights. The hospital gave more than 20,000 confidential patient records to a job candidate for use in performing tasks as part of an employment test. The candidate then posted the records to an online tutorial marketplace, seeking assistance in performing the tasks. The data remained publicly accessible on the website until a patient whose information had been posted saw the posting. The patients in the underlying litigation against the hospital alleged that the hospital violated their privacy rights by posting their confidential information, which "included patient names, medical records, hospital account numbers, admission/discharge rates, diagnosis codes, and billing charges."

The hospital's commercial general liability policy with Hartford covered amounts that the hospital became "legally obligated to pay as damages because of . . . electronic publication of material that violates a person's right of privacy." Hartford argued that the statutory relief sought by the underlying claimants was barred by a policy exclusion for "Personal and Advertising Injury . . . Arising out of the violation of a person's right to privacy created by any state or federal act." One of the forms of relief sought by the underlying claimants was statutory damages under the California Confidentiality of Medical Information Act and the California Lanterman-Petris-Short Act. Hartford conceded that the patients' allegations fell within the terms of the policy but argued that the statutory relief they sought was barred by the exclusion. The policy holder countered that "the plaintiffs in the underlying cases seek statutory remedies for breaches of privacy rights that were not themselves 'created by any state or federal act,' but which exist under common law and the California state Constitution."

The court, applying California law, dismissed Hartford's complaint, holding that the policy exclusion did not bar the relief sought by the underlying claimants pursuant to the two California statutes. The court reasoned that medical records were considered confidential long before the statutes were passed, and the statutes were not intended to create new privacy rights but to codify existing ones. As such, the court reasoned that "because the [statutes] do not create new privacy rights and because the policy exclusion by its terms 'does not apply to liability for damages that the insured would have in absence of such state or federal act,' the relief sought under these statutes can reasonably be interpreted to fall outside of Hartford's policy exclusion."

§ 14.6.10 Travelers Indemnity Co. of Connecticut v. P.F. Chang's China Bistro, Inc.

Another instructive case is Travelers Indemnity Co. of Connecticut v. P.F. Chang's China Bistro, Inc., Case No. 3:14-cv-01458-LVP (D. Conn. 2014). Travelers issued two consecutive commercial general liability insurance policies to P.F. Chang's. P.F. Chang's customers filed three separate class-action lawsuits alleging that P.F. Chang's had failed to put adequate safeguards in place to protect its customers' financial information when hackers were able to access customer credit and debit card data. Travelers filed a declaratory judgment action concerning its defense and indemnity obligations. Travelers argued that the class-action lawsuits failed to allege "bodily injury" or "property damage" caused by an "occurrence," or any "advertising injury" or "personal injury," under the policy terms. Travelers argued that the policies did not provide coverage for claims against P.F. Chang's for negligent security or violations of financial protection laws. On April 27, 2015, the parties filed a joint motion to stay the declaratory judgment action because the underlying lawsuits against P.F. Chang's had been dismissed and were pending appeal. The Seventh Circuit Court of Appeals subsequently revived the class-action lawsuits and permitted discovery to proceed. The underlying class actions remain pending, and it does not appear that either Travelers or P.F. Chang's has moved the Connecticut court to lift the stay to resolve the coverage questions.

§ 14.6.11 Travelers Property Casualty Co. of America v. Federal Recovery Services, Inc.

In Travelers Property Casualty Co. of America v. Federal Recovery Services, Inc., No. 2:14-CV-170 TS, slip op. (D. Utah May 11, 2015), the U.S. District Court for the District of Utah denied Federal Recovery Services' motion for partial summary judgment on Travelers' duty to defend Federal Recovery Services in underlying litigation. Federal Recovery Services entered into a contract to process, store, transmit, and otherwise handle electronic data on behalf of a fitness company in order to manage the fitness company's members' dues payments. The fitness company requested that Federal return customer credit card and banking information after the fitness company decided to transfer its business to a competitor of Federal. However, Federal declined to produce all of the records until the fitness company provided it with additional compensation. The fitness company filed suit against Federal, which, in turn, tendered its defense to Travelers. Travelers filed a declaratory judgment action to determine its duty to defend.

Under the terms of Federal's cyber liability insurance policy, Travelers had "the right and duty to defend the insured against any claim or 'suit' seeking damages for loss to which the insurance provided under one or more of 'your cyber liability forms' applies." Also under the policy, Travelers had "no duty to defend the insured against any claim or 'suit' seeking damages for loss to which the insurance provided under 'your cyber liability coverage forms' does not apply." The cyber liability forms included "a Network and Information Security Liability Form and a Technology Errors and Omissions Liability Form," which required an "error, omission or negligent act" to trigger coverage. Federal claimed that the allegation based on its refusal to provide the requested data to the fitness company without additional compensation triggered coverage for an "error, omission or negligent act." The court disagreed. Instead, the court held that the underlying complaint alleged that Federal knowingly, willfully, or maliciously withheld the information. As such, the underlying claimant did not allege errors, omissions, or negligence within the policy terms, and Travelers had no duty to defend Federal.

§ 14.6.12 P.F. Chang’s China Bistro, Inc. v. Federal Insurance Co. The issue before the court in P.F. Chang’s China Bistro, Inc. v. Fed. Ins. Co. 2016 U.S. Dist. LEXIS 70749, 2016 WL 3055111 (D. Ariz. May 31, 2016) was whether coverage existed for credit card association assessments that arose from a data breach that P.F. Chang’s suffered in 2013. Federal sold a “cybersecurity by Chubb policy” to P.F. Chang’s corporate parent. Federal marketed the product as “a flexible insurance solution designed by cyber risk experts to address the full breadth of risks associated with doing business in today’s technology-dependent world” that “[c]overs direct loss, legal liability, and consequential loss resulting from cyber security breaches.” P.F. Chang’s entered into a master service agreement with Bank of America Merchant Services to process credit card payments made by P.F. Chang’s customers. Bank of America, like other servicers, processes credit card payments under agreements with credit card associations. Bank of America’s agreement with MasterCard incorporated rules that obligated Bank of America to pay certain fees and assessments in the event of a data breach. Bank of America then passed these fees through to P.F. Chang’s under the master service agreement. MCLE, Inc. | 2nd Edition 2018

14–25

§ 14.6

Data Security and Privacy in Massachusetts

On June 10, 2014, P.F. Chang’s learned that computer hackers had obtained and posted on the Internet approximately 60,000 credit card numbers belonging to its customers. P.F. Chang’s notified Federal, which reimbursed P.F. Chang’s approximately $1,700,000 under the policy for forensic investigation expenses and litigation defense costs. MasterCard then sought reimbursement from Bank of America for • a fraud recovery assessment of $1,716,798.85 for fraudulent charges it incurred during the data breach; • an operational reimbursement assessment of $163,122.72 for cardholder notification and reissuance costs; and • a flat case management fee of $50,000. Bank of America requested that P.F. Chang’s pay these costs under the master service agreement, and P.F. Chang’s looked to the Federal policy for coverage. Federal denied coverage and P.F. Chang’s filed suit, seeking a judicial declaration of coverage. Federal moved for summary judgment on all claims and issues. The court found no coverage for the fraud recovery assessment under Insuring Clause A because the policy requires an “actual or potential unauthorized access to such Person’s Record, or exceeding access to such Person’s Record.” Because the customer information subject to the data breach was not part of Bank of America’s “Record,” but rather the “Record” of the issuing banks, the court held that Bank of America did not sustain a “Privacy Injury” covered by the policy. Coverage was available to P.F. Chang’s for the operational reimbursement assessment. The court held such costs to be covered “privacy notification expenses” under Insuring Clause B. Whether coverage exists for the case management fee under Insuring Clause D.2 was a disputed issue of material fact that could not be resolved on summary judgment. The policy covers “Extra Expenses an Insured incurs during the Period of Recovery of Services due to the actual or potential impairment or denial of Operations resulting directly from Fraudulent Access of Transmission.” The parties disputed whether P.F. Chang’s paid the case management fee within the defined “period of recovery of services,” and the court declined to determine the fact issue on summary judgment. The court then considered whether Exclusions D.3.b. and B.2., which are contractual-liability exclusions, and the definition of “loss” under the policy barred coverage for all claimed assessment costs. The court looked to case law interpreting commercial general liability policies for guidance and held that the policy’s contractualliability exclusions barred coverage for all claims arising under the master service agreement as a matter of law. The court found dispositive that the master service agreement required P.F. Chang’s to indemnify Bank of America for any “fees,” “fines,” “penalties,” or “assessments” imposed on Bank of America by MasterCard or any other credit card association. The contractual-liability exclusion barred coverage “unequivocally” for these costs as a matter of law. The court also rejected P.F. Chang’s “reasonable expectations” argument and adopted Federal’s construction of the plain language of the insuring agreement and its rele14–26

2nd Edition 2018 | MCLE, Inc.

Cyber Risk Insurance and Risk Management

§ 14.6

vant exclusions. The court eschewed P.F. Chang’s reliance on Federal’s marketing in favor of a plain language construction.

§ 14.6.13 Camp’s Grocery, Inc. v. State Farm Fire & Casualty Co. This hacking case involved the alleged dissemination of confidential financial information and resulting damages caused by Camp’s Grocery, Inc.’s (Camp’s) failure to provide adequate computer systems and employee training and/or to maintain adequate encryption and intrusion detection and prevention systems. Three credit unions sued Camp’s after hackers compromised confidential customer information on Camp’s network. Camp’s sought coverage from State Farm under an Inland Marine Computer Property Form. On cross motions for summary judgment on Camp’s declaratory judgment action seeking coverage, the court distinguished first- and third-party insurance coverage and concluded that the Inland Marine Computer Property Form was a type of firstparty coverage that did not obligate State Farm to defend Camp’s under any circumstances. Camp’s Grocery Inc. v. State Farm Fire & Cas. Co., No. 4:16-cv-0204-JEO at *15–16 (N.D. Ala. Oct. 25, 2016). The court held that language in the form granting State Farm the option to defend Camp’s was unambiguous and harmonious with other provisions in the policy providing liability coverage. Ultimately, the court concluded that the form afforded only first-party coverage for certain computer equipment and electronic data that did not create, recognize, or assume the existence of a duty to defend or indemnify against claims brought by third parties.

§ 14.7

CYBER RISK MANAGEMENT

Proactive management of cyber risks is important to both insurers and insured entities. Given the scope and costs of exposures associated with data breaches, as well as the types of cyber insurance products and capacity in current insurance markets, it is prudent for every business to develop a plan for managing cyber risks. This is true for every business that maintains confidential or personally identifiable information, whether a Fortune 500 company or a small family business. This section does not outline a template or recommendations for effective cyber risk management; instead, it presents some practical concerns that merit consideration in the development of a cyber risk management plan.

§ 14.7.1 Evaluation of the Types of Electronically Stored Information

An initial consideration is identifying the types of information maintained by a business. If the business is consumer facing, what types of confidential or personally identifiable information are stored? What types of information belonging to other customers are maintained? What types of confidential information relating to other business associates or vendors are maintained? Identifying and classifying the types of information maintained by an entity is a threshold inquiry in evaluating cyber risk exposure.

§ 14.7.2 Evaluation of Current Network Infrastructure and Security

Another important consideration is evaluating a company’s current network infrastructure and security protections. Periodic assessment of potential network vulnerabilities with staff or outside consultants is a powerful tool for prevention of data breaches. Identifying potential security risks or gaps and implementing remedial procedures may help prevent a future data breach or other cyber event. This is true not only for internal networks, but also for potential vulnerabilities in vendor or other third-party networks that interface with company systems.

§ 14.7.3 Media Publication Risks

Any company with a website or social media presence may have exposure for media publication liabilities. Determining which employees and third parties have access to publication media warrants consideration in developing a risk management plan. Similarly, exposures related to any websites or other media that allow employees or third parties to post information in the form of blogs, comments, or uploaded files warrant evaluation.

§ 14.7.4 Development of Data Security Protocols

Consideration should also be given to developing internal protocols for the protection of sensitive information and data. Implementing protocols for the security of data is prudent at the technical level to minimize the likelihood that important data is lost, stolen, or corrupted. Creating data storage and security procedures in advance of an event may be important in mitigating the impact on a business and minimizing interruption of business activities in the event that data is damaged or compromised. Another concern is identifying the types of data accessible to specific employees. Are certain classes of data restricted to certain employees? What is the risk that an employee could accidentally or intentionally access or transmit restricted data? And what types of procedures are in place for monitoring employee access to, or transmission of, sensitive information? A minimal code sketch of these access-restriction and monitoring concepts follows.
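At the technical level, the questions above often translate into deny-by-default access rules paired with audit logging. The following Python sketch is purely illustrative; the role names, data classes, and log destination are hypothetical assumptions, not drawn from this chapter or from any particular product:

```python
import logging
from datetime import datetime, timezone

# Hypothetical mapping of job roles to the data classes they may access;
# a real mapping would come from the data inventory discussed in § 14.7.1.
ROLE_PERMISSIONS = {
    "hr_staff": {"employee_records"},
    "billing": {"payment_cards", "customer_accounts"},
    "support": {"customer_accounts"},
}

# Every access decision is logged so that unusual patterns (repeated
# denials, off-hours activity) can later be reviewed by security staff.
logging.basicConfig(filename="data_access_audit.log", level=logging.INFO)


def can_access(role: str, data_class: str) -> bool:
    """Allow access only if the role is explicitly granted the data class."""
    # Unknown roles get an empty permission set, i.e., access is denied
    # by default rather than granted by default.
    allowed = data_class in ROLE_PERMISSIONS.get(role, set())
    logging.info(
        "%s role=%s data=%s allowed=%s",
        datetime.now(timezone.utc).isoformat(),
        role,
        data_class,
        allowed,
    )
    return allowed


# A billing employee may read stored payment card data; support may not.
assert can_access("billing", "payment_cards")
assert not can_access("support", "payment_cards")
```

The deny-by-default design choice mirrors the text's question of whether certain classes of data are restricted to certain employees: an employee outside an approved role is refused, and the refusal itself becomes part of the monitoring record.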

§ 14.7.5 Employee Devices

Many companies provide employees with laptops, tablets, and smartphones that may contain sensitive information. An increasing number of companies may also have "bring your own device" policies that allow employees to access data from their personal computers or phones. Developing internal standards to protect data maintained on employee laptops, phones, and other devices is an important consideration. Are employee devices and data password protected or encrypted? What protections are in place to prevent unauthorized access to employee devices by third parties? In the event that a device is lost or compromised, can the device or accounts be remotely deactivated to minimize risk?

§ 14.7.6 Creating a Data Breach Response and Notification Protocol

Despite the best-laid plans, data breaches and other cyber events may still occur. In the event of a data breach, is a plan already in place to respond? What procedures are in place to comply with insurance, consumer, and regulatory notification requirements? Identifying which employees will respond to a breach and implement required notifications is important to minimize the impact of a potential breach. Similarly, establishing relationships with nonemployees whose services may be required in the event of a breach, such as forensic computer analysts, attorneys, public relations staff, and insurance company personnel, may expedite resolution of a data breach and mitigate its impact. One way to keep such a plan actionable is sketched below.
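Reducing the response plan to a simple, reviewable checklist is one way to keep it actionable under the time pressure of an actual incident. The steps, owners, and structure in this Python sketch are hypothetical placeholders offered as a minimal illustration; an actual plan must track the insurance, consumer, and regulatory notification obligations that in fact apply to the business:

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class ResponseStep:
    task: str
    owner: str  # role responsible for completing the step
    done: bool = False


@dataclass
class BreachResponsePlan:
    steps: list[ResponseStep] = field(default_factory=list)

    def outstanding(self) -> list[str]:
        """List the tasks that still need attention."""
        return [s.task for s in self.steps if not s.done]


# Hypothetical skeleton; real steps should mirror the notification
# duties and vendor relationships identified before any breach occurs.
plan = BreachResponsePlan(steps=[
    ResponseStep("Engage forensic analysts to scope the intrusion", "CISO"),
    ResponseStep("Give notice to cyber insurer per policy terms", "Risk manager"),
    ResponseStep("Assess consumer and regulator notification duties", "Counsel"),
    ResponseStep("Coordinate public relations response", "Communications"),
])

plan.steps[0].done = True
print(plan.outstanding())  # tasks remaining for the response team
```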

§ 14.7.7 Evaluation of Current Insurance Programs and Coverage Gaps

Current insurance programs should also be evaluated internally and with insurance brokers, underwriters, and/or attorneys. As discussed above, many traditional first-party and third-party insurance programs exclude or greatly limit coverages in the event of a data breach or other cyber event. Determining the types and amount of available insurance coverage, as well as any gaps in coverage, is important to assessing potential exposure and the potential need for additional cyber insurance coverages.

§ 14.7.8 Employee Training

Developing employee training programs and procedures relating to the protection of corporate and consumer data is another important consideration. Data security may not be front of mind for many employees, who may not recognize the confidential nature of the information they can access or the risks associated with a data breach. Many data breaches are ultimately the result of accidental disclosure of data or passwords to third parties. Implementing procedures to increase employee vigilance on data security issues is an important risk mitigation tool.

§ 14.7.9 Regulatory Compliance

Massachusetts, like many other states, has regulated the protection of electronically stored personal information. Regulation 201 C.M.R. pt. 17.00, effective March 1, 2010, established both a duty to protect personal information and minimum standards for protecting personal information. Section 17.04 requires the creation of a "written, comprehensive information security program" that includes the establishment and maintenance of a computer and network security system. Specific requirements include, among others, secure user identification protocols, secure access control measures, data encryption, security software, monitoring of systems for unauthorized use of or access to personal information, and employee education and training on the security system and the importance of personal information security. Importantly, this regulation applies to entities that own or license personal information about Massachusetts residents—and not just companies located within Massachusetts. 201 C.M.R. § 17.00(1). Noncompliance and associated breaches have been investigated by the Massachusetts Attorney General's Office, in some cases resulting in fines. (For an illustration of the encryption requirement, see the sketch at the end of this section.)

New York's Department of Financial Services has also recently promulgated regulations, 23 NYCRR 500, entitled "Cybersecurity Requirements for Financial Services Companies." The New York regulation, effective March 1, 2017 (subject to phase-in periods for compliance), requires covered entities to implement a variety of cybersecurity measures, including cybersecurity programs, written internal policies, periodic security assessments, and written incident response plans. Although the regulation focuses on financial services companies, Section 500.11 also requires written policies to protect certain data and systems accessible to or held by third parties.

Compliance with relevant data protection and security laws is essential. It is important to note that many of these types of laws and regulations, such as Massachusetts', are drafted to protect the residents of a specific jurisdiction. For example, a Florida company with no physical presence in Massachusetts that stores the personal information of Massachusetts residents is required to protect that personal information pursuant to applicable Massachusetts law. Thus, analysis of stored data, particularly in the case of personal information, may be necessary to ensure compliance with applicable laws in all relevant jurisdictions.
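Among the safeguards listed in Section 17.04, encryption of stored personal information is the most readily illustrated in code. The sketch below uses the third-party Python cryptography library; the library choice, file name, and sample record are assumptions for illustration only, as the regulation does not prescribe any particular tool:

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# In practice, generate the key once and keep it in a secrets manager or
# hardware security module, never stored alongside the encrypted data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a record containing personal information before it is written
# to disk, so that data at rest is unreadable without the key.
record = b"Jane Doe,123-45-6789"  # hypothetical name and Social Security number
with open("customer_record.enc", "wb") as f:
    f.write(cipher.encrypt(record))

# Only a holder of the key can recover the plaintext; a stolen file or
# lost laptop yields ciphertext rather than personal information.
with open("customer_record.enc", "rb") as f:
    assert cipher.decrypt(f.read()) == record
```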


CHAPTER 15

Privacy and Security in M&A Transactions

Kathleen M. Porter, Esq.
Robinson & Cole LLP, Boston

§ 15.1 Introduction
§ 15.2 Due Diligence
§ 15.3 Transfer of Personal Data in Bankruptcy
§ 15.4 Risks of Disclosing Personal Data During Due Diligence
§ 15.5 Representations and Warranties
§ 15.6 Indemnification and Insurance
§ 15.7 Deal Structure
§ 15.8 Conclusion

EXHIBIT 15A—Sample Initial Due Diligence Request List for Privacy and Security M&A

EXHIBIT 15B—Sample Agreement Provisions for Privacy and Security

Scope Note
This chapter identifies issues and strategies, and suggests language, that can act as a starting point for protecting your client's interests in the areas of privacy and security in M&A transactions.

§ 15.1

INTRODUCTION

The personally identifiable information, protected health information, and other sensitive information and data (collectively, "data") held by a company is often a material component of the company's value. Today's buyer is frequently attracted to a merger and acquisition (M&A) deal because of the data and the business surrounding the data. A seller wants to maximize the return on its efforts building a business that includes valuable data. However, as discussed in other chapters in this book, there are many privacy and security laws, regulations, and policies in the United States and elsewhere that may restrict a seller's ability to transfer data. As a result, data, privacy, and security should be a focus in any M&A transaction.

There are privacy and security laws, requirements, and policies that apply generally to all data. In addition, many industries, such as health care, education, financial services, defense, and technology, and many types of data, such as financial, health, and student- or child-related, have heightened compliance requirements. A buyer wants to ensure that its ability to acquire data in a transaction is not prohibited or restricted by these laws or regulations. Additionally, a buyer wants to confirm that the seller has operated in compliance with any such general or heightened laws and requirements during the period prior to closing.

For these reasons, any M&A transaction should include privacy and security due diligence. The stock purchase, merger, or other acquisition agreement should include specific representations and warranties regarding a seller's practices and compliance regarding data. Finally, the indemnification provisions in the acquisition agreement should address some of the unique potential exposures and liability that may arise for a buyer and seller if the representations and warranties are inaccurate.

Without the proper due diligence and the appropriate privacy and security representations, warranties, and indemnification provisions, a buyer could unknowingly inherit a seller's liability for collecting or using personal data of a type or in a manner not disclosed in the seller's privacy policies, or a seller's failure to make a mandatory breach notification to regulators and customers. A buyer in this situation could also find the newly acquired business facing unwanted media scrutiny, governmental fines, breaches of contract, audits, damage to its reputation, notification and credit monitoring costs, and lawsuits. A buyer's officers and directors could be held personally accountable by regulators or aggrieved investors and customers. Privacy and security issues have also cost several chief information officers, and even some chief executive officers, their positions, particularly when the company is publicly traded.

Lastly, while this chapter focuses on U.S. federal and Massachusetts state privacy and data protection law, many sellers have ties elsewhere, by selling online or through resellers or agents, by outsourcing business processes or data storage functions, and/or by having partnerships with third parties. Therefore, a seller and a buyer should also consider whether the laws and procedures of other states and countries may be implicated by a proposed transaction. Many foreign countries have more rigorous privacy laws than those in the United States. However, many foreign countries lack the mandatory data breach notification requirement that exists under the law in most U.S. states, making it easier for the loss of customers or personal data to go undisclosed to a prospective buyer.

For example, Australian telecommunications firm Telstra Corp. Ltd. faced this situation in its recent acquisition of Asian-based undersea cable company Pacnet Ltd. from three investment management companies. After the $697 million deal was negotiated, and just weeks prior to the April 2015 closing, Pacnet was hacked when malicious software exploiting a system vulnerability was uploaded to Pacnet's IT network. The hack infiltrated Pacnet's e-mail and other administrative systems, potentially exposing credentials and nonpublic customer information to theft. See "Telstra Owned Pacnet Hit by Major Breach," available at http://www.msn.com/en-au/money/technology/telstra-owned-pacnet-hit-by-major-data-breach/ar-BBjZ1gl. In May 2015, Telstra alleged the sellers failed to mention the hacking attack until after the sale to Telstra closed. See "Pacnet Security Breach," available at http://exchange.telstra.com.au/2015/05/20/pacnet-security-breach. Even Pacnet's most important government customers reported receiving no notice about the breach until after the sale to Telstra. Because of the lack of mandatory breach notification reporting in Asian countries, Pacnet claimed no obligation under the law to publicly disclose the breach any earlier. Whether Pacnet had contractual obligations to disclose the breach or the vulnerability under the acquisition agreement with Telstra or under customer agreements remains unknown. Telstra has indicated it may sue the selling investment management companies for breach of contract. See "Telstra considers legal action over Pacnet acquisition," available at http://www.afr.com/business/telecommunications/telstra-considers-legal-action-over-pacnet-acquisition-20150521-gh6w2q.

Telstra reported incurring costs and expenses to have its team travel to Pacnet, test Pacnet's IT systems, remove any offending software, improve monitoring and incident response systems and procedures, and test the results. Telstra has not reported on whether the breach resulted in any loss of customers or prospects, or impacted Telstra's projected sales of its IT services to Pacnet's customer base. Those projected new market sales were Telstra's stated objective for making the Pacnet acquisition. It is unclear whether Telstra can recoup any costs, expenses, or lost sales under the indemnification provisions of the acquisition agreement, or whether any Pacnet executives will be forced to exit the business as a result.

For the reasons stated above, parties in a transaction would be well served to conduct privacy and security due diligence and address any resulting issues in the acquisition agreement.

§ 15.2

DUE DILIGENCE

After confidentiality agreements and letters of intent are signed, the transaction process generally continues with a buyer conducting due diligence on a seller's business. In the privacy and security area, the diligence process generally helps a buyer determine the data's quality, value, and transferability. A buyer wants to understand a seller's sources of collected, used, and stored data. A buyer wants to review a seller's internal policies and practices regarding data collection, use, storage, encryption, and destruction, as well as relevant laws and applicable agreements with third parties. The diligence process typically reveals whether a seller is in compliance with applicable privacy and security laws and regulations. A buyer wants to determine if there are any legal or contractual restrictions on its ability to acquire and use the data as intended. It is also helpful for a buyer's counsel to know the buyer's objectives for the seller's business and its data, as this will assist counsel in flagging issues that come out of the diligence process that could impact the buyer's willingness to proceed with the transaction as originally envisioned.

Any privacy and security due diligence should be tailored to the industry and nature of the proposed transaction. A sample initial due diligence request is included as Exhibit 15A. This sample request should be used only as a guide when assessing privacy- and security-related matters and related contractual provisions. Each client's privacy and security requirements should be reviewed in the context of the underlying transaction, the client's particular industry, and any controlling policies or agreements.

A typical initial high-level due diligence inquiry is whether a seller's privacy policies permit the seller to transfer data in the event of a merger, acquisition, divestiture, or equity investment. Remarkably, many privacy policies do not address such a transfer or, worse, require the customer's express consent or outright prohibit the transfer. The long-held position of the Federal Trade Commission (FTC) in this situation is that personal data must be treated pursuant to the privacy policy in effect at the time of its collection. This means that before any data can be transferred, the FTC wants a seller to obtain the affirmative "opt-in" consent of its customers. The parties to an M&A transaction generally object to having to seek opt-in consent because the response rate is so low. Customers or former customers often do not respond to such e-mails, in some cases because they have changed e-mail addresses.

§ 15.3

TRANSFER OF PERSONAL DATA IN BANKRUPTCY

A high-profile bankruptcy case several years ago involving an online toy retailer provides some insight on how to deal with this scenario, at least in the bankruptcy sale context. See Stipulated Consent Agreement and Final Order, Case No. 00-11341-RGS (D. Mass. July 21, 2000), available at http://www.ftc.gov/os/2000/07/toysmartconsent.htm. The debtor in Toysmart sought Bankruptcy Court approval to sell its customer list, notwithstanding the promises made at the time of collection that the customers' information would not be sold. The FTC objected to the proposed sale. The matter was resolved when a shareholder in a related business agreed to pay $50,000 for the debtor to destroy the customer list. Despite this resolution, Toysmart is largely seen as establishing the following framework around the sale of personal data by a debtor in bankruptcy:

• personal information will not be sold as a standalone asset;

• personal information will be sold only to a third party who is in the same business as the debtor, and who agrees to be bound by and follow the terms of the debtor's privacy policies as to the acquired information; and

• the buyer will obtain affirmative consent from consumers for any material changes to the applicable privacy policy.

Since Toysmart, other debtors have sought the approval of bankruptcy courts to sell personal data, even if the debtor's privacy policy restricts or prohibits such a sale. The FTC and the privacy ombudsman routinely object to these proposed sales, with mixed results. (Privacy ombudsman provisions are part of the Bankruptcy Abuse Prevention and Consumer Protection Act, 11 U.S.C. §§ 332, 363 (2005).)

In 2011, the U.S. Bankruptcy Court in Manhattan approved the transfer of customer and transaction data for 48 million former Borders customers to its former competitor, Barnes & Noble, for $13.9 million, over the objections of the FTC and the privacy ombudsman, and despite Borders' privacy policy language promising customers the right to consent to any transfer of their data. The chief focus in the Borders case was whether Borders would be required to obtain "opt-in" or "opt-out" consent from consumers to the transfer of their personal data to Barnes & Noble. In the end, the Bankruptcy Court held that an e-mail to consumers from the buyer Barnes & Noble offering fifteen calendar days to opt out of the transfer was sufficient, in part because consumers were unlikely to respond to an e-mail from Borders, a defunct company. The debtor and buyer also agreed to publish notices on their websites and give any customers without e-mail access thirty days to contact Barnes & Noble and opt out. When the privacy ombudsman then unsuccessfully challenged the wording of Barnes & Noble's e-mail notice on the grounds that it failed to clearly indicate how and why the consumer could opt out, the FTC published a press release warning consumers of their right to opt out. See https://www.ftc.gov/news-events/press-releases/2011/10/important-information-regarding-right-borders-customers-opt-out.

Conversely, after the FTC objected to the sale of customer information from a bankrupt gay youth magazine's website, a New Jersey Bankruptcy Court ordered that the information be destroyed even though the prospective buyer wanted to continue the business. In re Peter Ian Cummings, Case No. 10-14433 (Bankr. D.N.J.). (The FTC's objection is available at https://www.ftc.gov/system/files/documents/closing_letters/letter-xy-magazine-xy.com-regarding-use-sale-or-transfer-personal-information-obtained-during-bankruptcy-proceeding/100712xy.pdf. The order of destruction is available at http://www.technologylawdispatch.com/wp-content/uploads/sites/560/2011/09/Order.pdf.)

Most recently, the debtor RadioShack sought Bankruptcy Court approval to sell millions of its customers' names and physical and e-mail addresses to its largest shareholder for $26 million. RadioShack Corp., Case No. 1:15-bk-10197 (Bankr. D. Del. Feb. 2, 2015). The FTC and several state attorneys general objected to the proposed sale on the grounds that it violated the bankrupt retailer's privacy policy promising no transfers of the data. (See https://www.ftc.gov/system/files/documents/public_statements/643291/150518radioshackletter.pdf; In re RadioShack, Case No. 15-10197 (BLS) (Bankr. D. Del.). RadioShack's March 2015 privacy policy stated: "We will not sell or rent your personally identifiable information to anyone at any time.")

The Bankruptcy Court in RadioShack ultimately approved the sale after the debtor negotiated settlements with several state attorneys general and the FTC that limited the type and age of personal information being sold to the buyer, in exchange for the buyer's agreement to obtain customer consent for any material changes to RadioShack's prebankruptcy privacy policy. Only customer e-mails from the prior two years and a limited amount of other customer information (seven of the 170 fields of data collected by RadioShack) were approved for sale. Additionally, the personal information was sold to the debtor's shareholder, who, by partnering with Sprint, was launching a business that overlapped with a portion of the debtor's former business.

The debtor's plan to sell the data to its shareholder drew the objections of third-party manufacturers Apple and AT&T, whose products RadioShack had sold as a reseller. Apple and AT&T claimed their respective agreements with the electronics retailer restricted or prohibited the sale of certain customer data relating to their products. In the end, the court somewhat addressed Apple's and AT&T's objections by prohibiting the transfer of any customer or transaction data where only Apple's and AT&T's products or services were purchased. However, the debtor was allowed to transfer such customer and transaction data to the extent that it was commingled with other
However, the debtor was allowed to transfer such customer and transaction data to the extent that it was commingled with other purchases by the customer. The debtor also agreed to require the buyer to agree to abide by the terms of the Apple and AT&T reseller agreements.

The RadioShack case highlights two points relevant to data transfers in transactions. First, at least in the bankruptcy context, a company's privacy policy promises to consumers may not always be honored as written. Regulators will likely continue to intervene in Bankruptcy Court cases to try to protect the interests of consumers based on the Toysmart principles. However, debtors and buyers paying cash to the estate for data may successfully whittle away at the edges of the Toysmart principles, particularly the principle of obtaining opt-in consent from consumers.

In M&A transactions outside of bankruptcy, the FTC has consistently stated that a buyer should honor the privacy policy of the acquired business. The FTC has also said that if the buyer fails to honor these privacy policy promises, both the buyer and the acquired business could be in violation of Section 5 of the FTC Act for an unfair or deceptive trade practice. However, this does not mean that the FTC will always block acquisitions where the privacy policies of the seller and buyer differ. In February 2014, the FTC did not intervene to stop Facebook from acquiring the mobile messaging company WhatsApp for $19 billion. The FTC did warn both parties by letter that the failure by either company to honor WhatsApp's personal data promises to its customers after its sale to Facebook would violate Section 5 of the FTC Act. See https://www.ftc.gov/system/files/documents/public_statements/297701/140410facebookwhatappltr.pdf. The FTC may have taken this approach because Facebook was already subject to continuing FTC oversight under its 2011 consent order. It is unknown whether the FTC will take this wait-and-see approach of "warning" parties again in the future rather than intervening immediately.

The second point highlighted by the RadioShack case arises when a seller in an M&A transaction acts as a distributor, reseller, or agent. Third-party manufacturers or licensors are contractually asserting ownership of, or control over, the end-user or customer personal data collected and stored by a seller acting as reseller, distributor, or agent. Therefore, a buyer in an M&A transaction should request and carefully review any distributor, reseller, or similar contracts disclosed by a seller in due diligence for provisions regarding data ownership or rights or restrictions on transfer of data. To the extent such provisions exist, a buyer will want to understand whether the contract and/or the data will be transferred as part of the sale. A buyer should also consider the implications for the underlying value of the business, including whether the seller has commingled this data with other data.

§ 15.4 RISKS OF DISCLOSING PERSONAL DATA DURING DUE DILIGENCE

The parties in the M&A transaction should discuss and agree on a specific framework for privacy- and security-related due diligence disclosures. A nondisclosure agreement should already have been signed by the parties. The parties should consider whether vendors or advisors should also sign a nondisclosure agreement. The parties should also consider how disclosures should be made—for example, in person, by phone, or through an encrypted virtual data room.


A key point is determining what personal data a buyer actually needs to see as part of the due diligence process to complete its evaluation of the seller's business. For example, a buyer may not actually need to see the specific customer data, but a buyer may need to know what classes and types of customer data exist. The buyer would then include a representation and warranty in the acquisition agreement confirming the classes and types of customer data the seller has represented exist. A seller may be restricted, by contract or under its privacy policies, from sharing certain data during the due diligence process. For example, the parties want to avoid the seller disclosing employee data containing Social Security numbers or customer data containing credit card information, as such a disclosure could create a data breach notification obligation for the seller. Additionally, even if personal data needs to be disclosed, the parties might decide to stagger the disclosures. A seller would disclose aggregated or anonymized data initially, to permit a buyer to understand the range of data involved; a minimal technical sketch of this staggered approach appears at the end of this section. When absolutely necessary, after the transaction has been approved by both parties, any required clearances obtained, and most closing conditions satisfied, a seller might disclose the most sensitive data.

During due diligence, the parties should pay special attention to particular types of data, such as employee data, protected health information, or data collected outside of the United States, which might require the seller and buyer to satisfy additional requirements, such as notice and consent, or compliance with a foreign law, before disclosure or transfer can be made at closing.

Finally, there are increasing reports of hackers' attempts to access the networks of companies and advisory firms involved in pending M&A transactions. See "Hackers Targeted Companies' Merger Talks," available at http://www.wsj.com/articles/hackers-targeted-companies-merger-talks-1417484372. Hackers apparently seek information on pricing, timing, and other factors that they can use to their advantage in the markets. The parties to a transaction should take care when sharing due diligence documents, including when using online data rooms or encrypted flash drives, laptops, and mobile devices. A secure process for distributing password and login information for data rooms should be documented and followed by members of both the seller's and the buyer's teams. Additionally, parties should be on the lookout for potentially fraudulent e-mails purporting to come from outside advisors or senior members of the company's deal team, changing financial terms or enclosing wiring instructions linked to a third-party account controlled by hackers.
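The staggered-disclosure approach described above can be illustrated with a short sketch. The following is a minimal, hypothetical example only: it assumes a simple CSV export with field names ("email," "state") invented for illustration, and it shows the kind of pseudonymized, aggregated summary a seller might share in early diligence instead of raw records. Salted hashing is pseudonymization rather than true anonymization, and counsel should confirm that even summary-level disclosures are permitted under the seller's contracts and privacy policies.

import csv
import hashlib
import secrets
from collections import Counter

# A one-way salted hash lets reviewers count distinct customers without
# seeing actual identifiers. The salt stays with the seller.
SALT = secrets.token_bytes(16)

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a truncated salted SHA-256 digest."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:12]

def summarize(path: str) -> None:
    """Report classes, types, and volume of data, never raw values."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        rows = list(reader)
        fields = reader.fieldnames or []
    unique_customers = {pseudonymize(r["email"]) for r in rows}  # hypothetical field
    states = Counter(r["state"] for r in rows)                   # hypothetical field
    print(f"Fields collected: {fields}")
    print(f"Records: {len(rows)}; distinct customers: {len(unique_customers)}")
    print(f"Records by state: {dict(states)}")

if __name__ == "__main__":
    summarize("customers.csv")  # hypothetical export file

A seller running a script of this kind can answer a buyer's questions about the classes, types, and volume of customer data without exposing any identifier; because the salt never leaves the seller's control, the digests cannot be recomputed or matched by an outside party.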

§ 15.5 REPRESENTATIONS AND WARRANTIES

While due diligence may be ongoing, the parties in an M&A transaction will typically begin to negotiate the acquisition agreement and related documents. A buyer should ask a seller to make specific privacy and security representations and warranties in the acquisition agreement, based on its due diligence review to date. Sample privacy and security representation and warranty provisions are included as Exhibit 15B.


These sample provisions should be used only as a guide, as each client's privacy and security requirements differ. The provisions should be reviewed and modified by counsel to fit the underlying transaction, the client's particular industry, and any controlling policies or agreements. That said, a buyer frequently includes certain representations and warranties in some form in nearly every acquisition agreement. First, a buyer typically wants to confirm that the information and policies disclosed by a seller about its privacy and security practices as part of the due diligence are accurate and complete. Second, a buyer usually wants a seller to represent and warrant that the seller has complied and will comply with all applicable laws, regulations, and directives. Because the Massachusetts security regulation, 201 C.M.R. pt. 17.00, requires a party to take certain proactive steps as part of its data security program, a buyer may want a seller to make more specific representations and warranties regarding compliance with this regulation, such as representations that the seller performs security risk assessments and evaluates and monitors vendors. Third, a buyer will typically want a seller to make representations and warranties regarding any privacy or security disputes, incidents, or breaches. Depending on the situation, a buyer may ask a seller to make representations and warranties regarding the seller's history (or lack thereof) of transferring data into and out of the United States.

§ 15.6 INDEMNIFICATION AND INSURANCE

In addition to negotiating specific data and security representations and warranties, a buyer should also consider including specific indemnification provisions covering this area. These special provisions cover the unique data and security liabilities and damages that typically arise in the event the seller breaches its representations, warranties, and covenants. A buyer may also want to have a standalone survival period so that it can manage the business for a period of time and determine whether there are any undisclosed data or security issues. Finally, buyers often seek a separate holdback or escrow of a portion of the purchase price, perhaps with increased liability caps and without a basket, to cover privacy and security claims from dollar one. This is in part because a buyer usually wants to control the resolution of any privacy or security claim. Sample privacy and security indemnification provisions are included as Exhibit 15B.

When negotiating the indemnification provisions, a buyer should understand whether the seller has any insurance coverage for preclosing data and security breaches, and whether any "tail" coverage might be available in connection with these policies. This information might assist the buyer in evaluating the seller's request for higher liability thresholds and lower liability caps.

§ 15.7 DEAL STRUCTURE

As with any transaction, a buyer should consider the privacy and security risks when determining the transaction's structure. If the transaction is structured as an asset acquisition, the buyer can contractually limit its assumption of liability, including liability for privacy and security matters. In a merger or stock purchase structure, a buyer may be assuming the seller's past liabilities, including those for privacy and security compliance. However, the practical reality is that, as to liability for preclosing data and security matters, there may be no meaningful difference between an asset transaction and a stock or merger transaction. Regulators, the market, and customers will expect a buyer to take responsibility for a preclosing privacy or security issue, even if the buyer was unaware of the issue until after the closing. In most cases, a buyer will want to control the situation, convey confidence in the acquired business, and avoid losing customers. So, absent extraordinary circumstances, it is unlikely that data and security issues will drive the transaction structure. A buyer would benefit more from negotiating an indemnification provision that gives the buyer control of a situation, even if it means working with a seller on costs and expenses. A buyer with the ability to control any data or security situation postclosing can schedule routine compliance audits and take other measures to flush out any issues sooner rather than later. In that regard, a buyer and seller should also decide who will be responsible for security breach notifications between signing and closing, and during any transition service period.

The transaction structure may be relevant if there are nontransferable privacy or security certifications or approvals in place with the seller entity, or if there are restrictions on transferring data from the seller entity to a new one. In any case, if the data is going to be moved into a new company or location as part of the closing, a buyer needs to plan how and when to transfer the data postclosing, and whether a transition services agreement with the seller will be needed for that postclosing period.

§ 15.8 CONCLUSION

As discussed above, data privacy and security are an increasingly important aspect of any M&A transaction, and that importance is likely to continue to grow. Buyers and sellers should pay close attention to this area. By working together and investing time in due diligence, a seller and buyer can minimize surprises in the acquired business and maximize the value of the transaction for all parties.


EXHIBIT 15A—Sample Initial Due Diligence Request List for Privacy and Security M&A

Note

This is a sample initial due diligence request. It is to be used only as a starting guide when assessing privacy- and security-related matters and related contractual provisions. It is not intended to be a comprehensive list of the data privacy and security issues/risks associated with a particular transaction. Each client's privacy/security requirements should be reviewed in the context of the underlying transaction, the client's particular industry, and any controlling policies/agreements. Depending on the responses to any of the initial inquiry questions, additional inquiry and due diligence may be appropriate. This list is for informational purposes, does not contain or convey legal advice, and may or may not reflect the views of the author's firm or any particular client or affiliate of that firm. The information herein should not be used or relied upon in regard to any particular facts or circumstances without first consulting a lawyer. Drafters should use this list in conjunction with their own research on the applicable laws on privacy and security in the pertinent jurisdiction.

Data Sources

1. Describe what types of personal information and other information are obtained or collected by the business being acquired ("Target"), and how is this information collected or obtained? Please list the data fields.

2. List all personally identifiable information, protected health information, credit card numbers, social security numbers or other government-issued ID, or other sensitive information obtained, collected, or stored by Target. Describe why this information is collected, how it is used, and where this information is stored.

3. Does Target use data brokers to obtain personal information? If so, please describe Target's process and policy regarding the use of this information.

4. Does Target rent or purchase customer lists for marketing purposes? If so, please describe the process and policy regarding the use of this information and from whom the data is acquired.

5. Does Target sell or rent its customer lists to others? If so, please describe the process and policy regarding this information and to whom the data is sold or rented.

6. How does Target market and sell to its customers? For example, does Target sell directly or indirectly; use telemarketing or e-mail marketing; use its own or third-party websites?

7. What industries does Target market to or participate in? For example, defense, financial services, health care, education?

8. Does Target market, sell to, or collect data from children under the age of 13?

9. List the data classifications used by Target.

10. Where is personal data of Target's customers stored (i.e., the physical location of the databases/servers)?

11. How is personal information protected when stored and when in transit? For example, what security methods are used when data is sent outside Target's firewall?

General Policies, Protocols and Procedures

12. What websites and domain names does Target own or maintain? Will any of these websites or domain names not be transferred to Buyer as part of the closing?

13. Provide all website, offline, or other policies and written information security programs (WISPs) relating to privacy and data, administrative, and/or physical security. Please identify any material updates made in the last [_____ (__)] years. (The length of time a buyer will want to review ranges from two (2) to five (5) years, depending on how long a Target has been in business and the type and age of data held by Target. Often the most relevant modifications occur just prior to the business being offered for sale, as a seller and its counsel will review policies and procedures to address any deficiencies to maximize the purchase price.)

14. Provide copies of any other standard data privacy and/or security language Target has with customers. Please identify any material updates made in the last [_____ (__)] years.

15. Provide a copy of Target's data, information, and physical security program, as well as any current or scheduled strategic security updates or modifications.

16. Provide copies of Target's risk assessment(s) identifying reasonably foreseeable internal and external threats to the data, physical and administrative security, confidentiality, and integrity of its assets.

17. Is Target required to comply with any security or privacy practices or protocols, either by law or under contract or other obligation? If so, please describe.

18. Provide a copy of Target's breach notification or incident policy and procedures.

19. Provide a copy of Target's business continuity/disaster recovery plan, together with the results of any tests and a schedule of any further tests.

20. Provide a copy of Target's data back-up policies and procedures.

21. Provide a copy of Target's data and document retention/destruction policies and procedures.

22. Provide a copy of Target's mobile device use (including laptops) and remote access policies.

23. Do employees of Target sign a confidentiality agreement?

24. Does Target have controls in place for the proper disposal of confidential information?


25. Describe Target's encryption and password control policies.

Vendor Arrangements

26. Provide copies of standard data privacy and/or security language from agreements Target has with third parties, including vendors, service providers, suppliers, and other agents ("third-party agreements").

27. For each third-party agreement, describe the employee, customer, or other data shared, stored, or processed under the agreement.

28. For each third-party agreement, describe the services provided that are covered, e.g., backup and recovery, document handling, log reviews, telemarketing, e-mail marketing, customer service, the nature of outsourced function operations, cleaning services, mailroom services, or document destruction services.

29. For each third-party agreement, identify any material updates made in the last [_____ (__)] years.

30. Do vendors, service providers, suppliers, and other agents of Target sign a confidentiality agreement?

31. Does Target have a formal program for selecting, evaluating, auditing, and managing vendors? If so, describe the program used by Target. Include any questionnaires or reports used as part of the program. Include any revisions made to accommodate any new regulations.

32. Are vendors required to report data breaches to Target?

Technology

33. Describe the technology Target uses in its operations, including any encryption, mobile device management, remote access, or cloud-based technology.

34. Describe any technology used to monitor and remediate malware within Target's environment.

Cross Border

35. In what countries does Target do business or have customers, employees, partners, resellers, or offices? Please describe which activity applies for each such country.

36. Provide a list of all countries other than the U.S. in which Target shares or stores data, either directly or indirectly, through a customer, employee, partner, reseller, office, cloud provider, or other arrangement. Please describe any such arrangement and provide copies of the agreements.

37. Does Target transfer data from other countries to the U.S., and if so, what other countries?

38. Does Target have country-specific privacy policies? If so, please provide.

39. Describe Target's processes for complying with cross-border data flow restrictions worldwide (e.g., consent, intercompany agreement/model clauses, EU-US Safe Harbor, or binding corporate rules). Please provide copies of any form of consents, certifications, rules, or agreements.

Tracking Technology

40. Does Target track click stream, page hits, or other customer behavior or browsing history on its website or other websites?

41. Does Target sell or use tracking/data collection technologies that send any nonidentifiable or identifiable customer information back to Target or to others?

42. Is the customer given clear notice and choice at the time of installation or before such technologies collect or transmit information?

43. Please describe any products that combine Target cookie data with personally identifiable information, whether performed by Target or its vendors or service providers.

44. Provide a copy of Target's policy and protocols regarding tracking/data collection technologies or click stream, page hits, or other customer behavior or browsing history.

Personnel

45. Provide a list of employees, consultants, or contractors whose job responsibilities include privacy or security (data, information, or physical), either full- or part-time. Please include titles, job and geographic responsibilities, and an organizational chart for these functions.

46. List the industry groups that privacy and security team members participate in, including any certifications held by team members, the group, and/or the company (e.g., CIPP, CISSP).

Training

47. Describe any privacy and security training (including on mobile device use, remote access, document retention, etc.) that Target or its consultants have provided to employees or vendors over the last [_____ (__)] years, including copies of materials.

Breaches, Lawsuits

48. Describe any current or closed data, information, or physical security investigations, security breaches, exposures, or other incidents occurring in the last [_____ (__)] years, including outcomes and modifications made, if any, to Target's programs or practices.

49. Describe any lawsuits or formal or informal regulatory complaints or inquiries made against Target in relation to privacy or data security for the past [_____ (__)] years, including any complaints, agreements, or settlements. Please include the identity of the Target employee(s) and/or outside counsel who handled the lawsuit, inquiry, or internal investigation, as well as the status and/or results.


Insurance

50. Provide copies of insurance policies for cyber-liability, property and casualty, and directors and officers' coverage.

Security, Audits, Certifications

51. Provide the results of any internal or third-party security audits, evaluations, or reviews for the last [_____ (__)] years. Describe any efforts to close any gaps.

52. Describe any intrusion detection system or intrusion prevention system used by Target.

53. Provide copies of internal or third-party vulnerability scans or penetration testing for the past [_____ (__)] years, with a status report on measures taken to address any issues.

54. Have Target's privacy policies and practices been externally audited and/or certified by any third party, such as TRUSTe or BBB Accredited Business? If so, please provide evidence.

55. If Target collects payment card information, is Target PCI-compliant? If so, please provide compliance documentation. If not, explain why not.

56. Provide copies of any certifications obtained by Target regarding its information security policies, procedures, and controls (e.g., ISO 27001).

57. Does Target receive any data or information security information from the U.S. Government that is restricted or classified? If so, please explain why such information is received.

58. Is Target a participant in industry-based or other cyber information sharing and analysis groups with other companies? If so, describe.


EXHIBIT 15B—Sample Agreement Provisions for Privacy and Security

Note

This sample assumes an asset transaction where Seller is selling all of its business assets to Buyer. These are sample M&A agreement provisions for privacy and security. They are to be used only as a starting guide when assessing privacy- and security-related matters and related contractual provisions. They are not intended to comprehensively address the data privacy and security issues/risks associated with a particular transaction. Each client's privacy/security requirements should be reviewed in the context of the underlying transaction, the particular industry, and any controlling policies/agreements. These provisions are for informational purposes, do not contain or convey legal advice, and may or may not reflect the views of the author's firm or any particular client or affiliate of that firm. The information herein should not be used or relied upon in regard to any particular facts or circumstances without first consulting a lawyer. Drafters should use these provisions in conjunction with their own research on the applicable laws in the pertinent jurisdiction.

Definitions

"Business" [means the business being acquired by Buyer, which could be the Seller's business or a portion thereof, or a target company].

"Employee Data" means any Personal Data or other data or information collected by or on behalf of Seller from employees of the Business.

"Laws and Policies" means (tailor as appropriate):

(a) the Health Insurance Portability and Accountability Act of 1996 ("HIPAA") and the Health Information Technology for Economic and Clinical Health Act, enacted as part of the American Recovery and Reinvestment Act of 2009, Public Law 111-5 ("HITECH"), and the associated implementing regulations (45 C.F.R. parts 142 and 160–164), including the Final Rule issued by the Department of Health and Human Services, effective March 26, 2013, as such regulations may be amended from time to time (collectively referred to herein as the "HIPAA Standards");

(b) the Gramm-Leach-Bliley Act (GLBA) and the accompanying regulations, as such regulations may be amended from time to time;

(c) the Family Educational Rights and Privacy Act (FERPA) and accompanying regulations, as such regulations may be amended from time to time;

(d) the Children's Online Privacy Protection Act of 1998 and accompanying regulations, as such regulations may be amended from time to time;


(e) the Fair Credit Reporting Act, the CAN-SPAM Act, telemarketing regulations, and the Telephone Consumer Protection Act and accompanying regulations, as such regulations may be amended from time to time;

(f) Mass. Gen. Laws ch. 93H and MA 201 CMR 17.00, as such regulations may be amended from time to time (collectively, the "MA Security Standards");

(g) the policies and regulations of the Federal Trade Commission, the Federal Financial Institutions Examination Council, the Consumer Financial Protection Bureau, the National Security Agency, the Securities and Exchange Commission, the Department of Energy, and the Federal Energy Regulatory Commission; and/or

(h) any other applicable [international], federal, state, or local privacy or customer data protection law, policy, or industry standard governing the collection, use, or disclosure of, spam in connection with, or the privacy and/or security of, any Personal Data, and, in the case of (a) through (g), as each is amended, modified, and supplemented from time to time.

"Personal Data" means a natural person's first name or first initial, last name, street address, telephone number, e-mail address, social security number, driver's license number, passport number or other government-issued identification number, financial account number, or credit or debit card number, with or without any required security code, access code, or other customer or account number, personal information, cardholder data, PHI, or other personal data or information, or any other piece of information that relates to such natural person.

"PHI" or "Protected health information" means any information about health status, provision of health care, or payment for health care that can be linked to a person [or, alternatively, as defined in the HIPAA Standards].

"Privacy Policy" or "Privacy Policies" means Seller's external or internal, website and off-line, current or past, privacy policy or policies, including any policy relating to the collection, storage, disclosure, transfer, and destruction of any User Data or Employee Data.

"User Data" means any information or data, including Personal Data, collected by or on behalf of Seller from users or customers of the Business, including from any website(s) operated in connection with the Business.

"Unsuccessful Security Incident" means, without limitation, pings and other broadcast attacks on a firewall, port scans, unsuccessful log-on attempts, denials of service, and any combination of the above, so long as no such incident results in unauthorized access, use, or disclosure of any information or data of Seller or the Business.

Representations and Warranties

(a) Schedule __ contains a list of Privacy Policies and identifies, with respect to each Privacy Policy, (i) the effective dates and periods for each such privacy policy, (ii) whether the terms of a later Privacy Policy apply to the data or information collected under such privacy policy, and (iii) if applicable, the mechanism (such as opt-in, opt-out, or notice by e-mail or posting only) used to apply a later Privacy Policy to data or information previously collected under such privacy policy.

(b) Seller has provided Buyer true and complete copies of each Privacy Policy.

(c) Seller has at all times complied with all Laws and Policies, and has protected and maintained the privacy and integrity of User Data and Employee Data accordingly, and in accordance with the Privacy Policies with respect to such collection, use, processing, security, disclosure, and destruction.

(d) Seller has received no written notices of any violation of Laws and Policies or of any of the Privacy Policies. Each of the Privacy Policies, in accordance with its express terms, may be amended and supplemented from time to time. Each of the Privacy Policies permits Seller to disclose and transfer User Data and Employee Data to third parties in connection with the sale of the Business.

(e) Neither the execution, delivery, or performance of this Agreement (or any of the other documents or agreements executed, delivered, or performed in connection with the transaction contemplated by this Agreement) nor the consummation hereof or thereof, nor Buyer's possession or use of Personal Data, will result in any violation of any Privacy Policy or any applicable Laws and Policies pertaining to privacy, security, User Data, or Employee Data.

(f) Seller has in place and will maintain effective and appropriate safeguards and reasonable information security procedures and policies that include administrative, technical, and physical safeguards to: (i) protect the confidentiality and security of Personal Data; (ii) prevent the theft and/or unauthorized access, use, or disclosure of Personal Data; and (iii) provide for the proper disposal of Personal Data, all in compliance with Laws and Policies.

(g) Seller's personnel, vendors, and agents have been appropriately screened and trained in the implementation of Seller's WISP and other information security policies and procedures. Seller regularly audits and reviews the information security policies and procedures of Seller and its vendors and agents to ensure their continued effectiveness and determine whether adjustments are necessary in light of the circumstances, including, without limitation, changes in technology or in the Seller's WISP, information systems, reasonably foreseeable threats or hazards to Personal Data, and/or changes in any Laws and Policies or any agreements to which Seller is a party.

(h) Other than an Unsuccessful Security Incident, neither Seller nor its vendors have had a situation that constituted a security breach or security incident or an unauthorized acquisition or unauthorized use of data or electronic data under any applicable security breach notification laws.

(i) Seller has in place the appropriate level of backup and record protection to protect Personal Data from undue harm resulting from unforeseen product failures, power surges, or other disasters. Such backup policies will include, but not be limited to, equipment, program files, and data files. Seller periodically reviews and updates its backup and record protection policies to maintain a reasonable level of protection for its customers and employees.

(MA Security Standards Specific) Seller hereby represents and agrees that it: (i) has developed and maintained a comprehensive "written information security program" ("WISP") or equivalent as required under the MA Security Standards, a current copy of which has been provided to Buyer; (ii) has implemented and maintained the security measures set forth in its WISP in conformance with the MA Security Standards, will continue to do so until the Closing, and will update such security measures from time to time as necessary and appropriate; (iii) engages in ongoing training of its employees and contractors about its WISP, the MA Security Standards, and the proper handling of "personal information" as defined under the MA Security Standards; and (iv) encrypts all records containing personal information that reside on laptops or other portable media, or are transmitted wirelessly or across public networks, in accordance with the MA Security Standards.

Covenants

(a) From the date hereof until the Closing, Seller agrees to provide Buyer with prompt notice of any breach or unauthorized acquisition or use of Personal Data and to cooperate in responding to such breach or unauthorized acquisition/use in compliance with any Laws and Policies.

(b) Seller will cooperate with Buyer, as may be necessary or reasonably requested from time to time, between the date hereof and continuing after Closing, to ensure or demonstrate its compliance with Laws and Policies.

Indemnification

(a) Seller shall indemnify and hold harmless Buyer from and against any and all claims, enforcement actions, losses, suits, proceedings, liabilities, fines, penalties, costs, and other expenses (including without limitation any expenses incurred by or on behalf of Buyer or the Business in investigating or providing required breach notifications) (collectively, the "Claims") resulting from, or relating to, the acts or omissions of Seller, its employees, agents, and subcontractors prior to or at Closing, in connection with any collection, use, disclosure, or destruction of Personal Data under the Laws and Policies or any other applicable state or federal law.

(b) Costs and expenses subject to indemnification under subsection (a) hereunder shall include, without limitation, reasonable costs and expenses incurred by or on behalf of Buyer in investigating and providing breach notification, including, but not limited to, any administrative costs associated with providing notice, printing and mailing costs, and costs of mitigating the harm (which may include the costs of obtaining credit monitoring services and identity theft insurance) for affected individuals whose information has or may have been compromised as a result of the breach. Without limitation as to any other remedies available to Buyer under this Agreement or the law, Seller shall pay, or reimburse Buyer for, all costs, including attorneys' fees, incurred in connection with providing notifications, including, without limitation, all costs incurred to mitigate the harmful effects, or potentially harmful effects, of Seller's breach.

(c) The Buyer, with qualified counsel of its own selection, shall control any Claims; provided, however, that with respect to any enforcement proceeding commenced by a regulator against the Seller on the one hand, and Buyer and the Business on the other hand, the Seller shall control such Claim against Seller and be entitled to defend itself with qualified counsel of its own selection, at its expense. In no event shall Seller agree to or settle any fines or penalties or admit to any fault on the part of the Business or Buyer without Buyer's prior written consent.

(d) This provision shall survive the [Closing] of the transactions contemplated by the Agreement.
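As a purely illustrative companion to clause (iv) of the MA Security Standards representation above, the following sketch shows one common way to encrypt a record containing personal information before it is written to portable media or sent across a public network. This is a minimal example, not part of the sample provisions: it uses symmetric (Fernet) encryption from the widely used Python cryptography library, the record contents are invented, and neither the MA Security Standards nor these provisions mandate any particular algorithm or tool.

# pip install cryptography
from cryptography.fernet import Fernet

# Generate the key once and store it separately from the encrypted
# records (for example, in a key management system); losing the key
# makes the records unrecoverable, and exposing it defeats the control.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a record containing "personal information" before writing it
# to a laptop or portable drive or transmitting it over a public network.
record = b"name=Jane Doe; ssn=000-00-0000"  # hypothetical record
token = fernet.encrypt(record)

# Decrypt only when an authorized user needs the record back.
assert fernet.decrypt(token) == record

In practice, key management (generation, storage, rotation, and access control) matters as much as the cipher itself, which is why diligence item 25 in Exhibit 15A asks about encryption and password control policies together.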


CHAPTER 16

Current Developments

Stephen Y. Chow, Esq.
Burns & Levinson LLP, Boston

§ 16.1 Introduction
§ 16.2 Congressional Activity
§ 16.3 The FTC Framework
§ 16.4 The States
§ 16.5 Identity Management
§ 16.6 Conclusion
EXHIBIT 16A—House of Representatives Resolution 835

Scope Note
This chapter alerts the reader to the current directions of developments in data security and privacy, as reflected in legislation and policy. The chapter provides a helpful overview of many statutory and regulatory developments, and presages forthcoming changes in this dynamic area.

§ 16.1 INTRODUCTION

The two years since the first edition of this volume have seen continuing breaches of the security of large repositories of consumer information, notably the Equifax credit reporting agency. They have also seen the rise and acceptance of (relatively) new technologies such as virtual currencies, drones, highly automated vehicles, and data-collecting home appliances, with their attendant benefits and privacy-security vulnerabilities. They have also seen the influence of e-mail leaks, the establishment of tweets and retweets as the norm of presidential communication, and the targeting (and siloing) of purportedly factual information based on political positions. The law has not kept up.

The 114th Congress (2015–2016), faced with continuing reports of data breaches—including that of the Office of Personnel Management—was presented with multiple bills providing for breach notification and secure management of personal information.

Practice Note
House Amendment 515 to H.R. 2596 (Intelligence Authorization Act for Fiscal Year 2016), to require the President to present to Congress "a report on the data breach of the Office of Personnel Management disclosed in June 2015," was proposed and accepted. Cong. Rec. H4410-11 (June 16, 2015).

The 114th Congress responded to leaked reports of routine NSA monitoring of domestic communications with the Uniting and Strengthening America by Fulfilling Rights and Ensuring Effective Discipline Over Monitoring (USA FREEDOM) Act of 2015, extending the USA PATRIOT Act. Many bills have been introduced, but even the long-talked-of harmonization of forty-eight state data breach notification laws did not proceed, and standards for data security could not be reached, reflecting the fundamental issue of cost to (routine) productivity versus benefits. The 114th Congress did enact two substantive laws in different responses to breaches of cybersecurity:

• The Cybersecurity Act of 2015 was enacted as Division N of the Consolidated Appropriations Act, 2016, on December 18, 2015, after the chambers failed to otherwise agree on a more comprehensive cybersecurity act. Among other things, the enactment provides for sharing of security breach information among private parties and with the government (subject to certain protections), with immunity from the antitrust laws and from liability to private citizens. Privacy advocates criticized earlier bills as well as this enactment, and there is at least one bill to repeal the enactment. Because the enactment does not mandate sharing or reporting, it is not considered a burden on businesses.

• The Defend Trade Secrets Act of 2016 (DTSA) amendments to the Economic Espionage Act of 1996 were motivated by an estimated $300–480 billion of trade secrets and 2.1 million jobs lost by the United States to foreign theft, notably Chinese-government-sponsored "hacking"; the DTSA was passed 87 to 0 in the Senate and 410 to 2 in the House of Representatives and signed May 11, 2016. It does not actually address cyber theft, but adapts an ex parte seizure procedure from the seizing of stashes of counterfeit goods to seizing stolen information, exemplified by a thumb drive on its way to an airport rather than a third-party node on a network. The legislation was expected to achieve "harmonization" by creating a federal common law of contracts, quasi-contracts, restitution, and agency—and of important issues in privacy law, such as expectation of privacy (in the commercial or personal context) and "consent" or "authorization," that are still disputed under the Computer Fraud and Abuse Act and the Stored Communications Act and have not been clarified by Congress. The DTSA, which also includes protections for "employee mobility" and for "whistleblowers" (including notification requirements), does have an immediate effect on businesses.

The Obama White House (through the work of the Department of Commerce) released, on February 27, 2015, "Administration Discussion Draft: Consumer Privacy Bill of Rights Act of 2015," adopting seven privacy rights. They are the same as the seven presented in "Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy": transparency, individual control, respect for context, focused collection and responsible use, security, access and accuracy, and accountability.


The Draft Consumer Privacy Bill of Rights follows the development of the Fair Information Practices Principles, which trace to the 1973 Department of Health, Education and Welfare–sponsored report on Records, Computers and the Rights of Citizens, which recommended the enactment of a federal Code of Fair Information Practices resting on five principles for dealing with personal data, generally described as notice, access, consent, integrity, and security. Framework at 9, n.9 (citing Records, Computers and the Rights of Citizens—Report of the Secretary's Advisory Committee on Automated Personal Data Systems, p. xx (U.S. Dep't of Health, Education and Welfare, July 1973), available at http://www.justice.gov/opcl/docs/rec-com-rights.pdf). See chapter 1, § 1.3.2, of this book.

In implementing the Health Insurance Portability and Accountability Act of 1996 (HIPAA), 104 Pub. L. No. 191, 110 Stat. 1936–2103 (Aug. 21, 1996), the Department of Health and Human Services promulgated 45 C.F.R. pt. 164 (Security and Privacy) on December 28, 2000, applying the Fair Information Practices Principles, 65 Fed. Reg. 82,462–829 (Dec. 28, 2000), generally addressing the following:

• notice, 45 C.F.R. § 164.520 (notice of privacy practices for protected health information);

• consent, 45 C.F.R. §§ 164.502–164.514, particularly Section 164.508 (uses and disclosures for which an authorization is required) and Section 164.510 (uses and disclosures requiring an opportunity for the individual to agree or to object);

• access, 45 C.F.R. § 164.524 (access of individuals to protected health information);

• integrity, 45 C.F.R. § 164.526 (amendment of protected health information); and

• security, 45 C.F.R. pt. 164, subpt. C (security standards for the protection of electronic protected health information).

The Trump White House removed the Obama Draft Consumer Privacy Bill of Rights from the White House website. In line with the president's election platform of deregulation, the new chairs of the Federal Communications Commission and the Federal Trade Commission have proceeded to attempt to undo some of the privacy and net neutrality initiatives of the Obama administration. See chapter 12, § 12.2.1, of this book.

On the security side, President Trump did issue on May 11, 2017, his "Executive Order: Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure," among other things, to improve the security of federal networks by providing agency head accountability for cybersecurity risks and implementing risk management measures aligned with the cybersecurity framework developed by the National Institute of Standards and Technology (NIST), available at https://www.whitehouse.gov/the-press-office/2017/05/11/presidential-executive-order-strengthening-cybersecurity-federal.

§ 16.2 CONGRESSIONAL ACTIVITY

Despite frequent reports of cyberattacks and breaches—even before the allegations of Russian "hacking" of the 2016 presidential elections—Congress has not been able to agree on a federal breach notification statute to harmonize the state measures. Because of the consumer and equity markets' sensitivity to questions of enterprise data security, enterprises are reluctant to acknowledge incidents, so even information-sharing bills have limited reach. Much of the congressional debate has centered on the balance between privacy and homeland security in accessing data in communication devices or monitoring of telecommunications.

The last major congressional action touching upon data security remediation for private breaches was the Cybersecurity Act of 2015, enacted as Division N of the Consolidated Appropriations Act, 2016, on December 18, 2015, Pub. L. No. 114-113, 129 Stat. 2935, after the chambers failed to agree on a more comprehensive cybersecurity act. Among other things, the enactment provides for sharing of security breach information among private parties and with the government (subject to certain protections), with immunity from the antitrust laws and from liability to private citizens. Early in the first session of the 115th Congress, on January 31, 2017, the House passed H.R. 584, the Cyber Preparedness Act of 2017, to enhance coordination between state and federal cybersecurity officials, improve the flow of information from the private sector to the government, and provide homeland security grants to local and state entities to support cyber capabilities; it was referred to the Senate, which has not acted.

Practice Note
The sharing of information about breaches is valued by security professionals to combat future breaches but is resisted by businesses that believe such sharing is an acknowledgment of errors or weaknesses and an exposure of proprietary operations that jeopardizes their competitive and public confidence positions. Consumers are also skeptical about possible collusion between businesses as to security and privacy measures.

In addition to failing to harmonize data breach notification, Congress has also failed to clarify its main antihacking statute, the Computer Fraud and Abuse Act of 1986, 18 U.S.C. § 1030 (CFAA), which has retained much of its substantive language relative to authorized access to computers, notwithstanding a dispute among the circuits over whether minor violations of workplace computer-use policies made further access unauthorized. See chapter 1, § 1.3.4, and chapter 3 of this book.

With respect to another piece of 1986 legislation, the Electronic Communications Privacy Act (ECPA) (see chapter 2 of this book), which has also remained largely the same since before the Internet era, the House of Representatives repeated in the first session of the 115th Congress its updating of the Stored Communications Act (SCA) title of the ECPA, among other things, to cease treating communications stored by service providers for more than 180 days as abandoned and thus accessible by law enforcement without a warrant. H.R. 387 (E-Mail Privacy Act) (referred to the Senate Feb. 7, 2017). The Senate had let a similar bill die in the 114th Congress, and has not acted in the 115th.

In line with the Republican platform of rolling back the "administrative state," Congress did pass, on April 3, 2017, Senate Joint Resolution 34:

Resolved by the Senate and House of Representatives of the United States of America in Congress assembled, That Congress disapproves the rule submitted by the Federal Communications Commission relating to "Protecting the Privacy of Customers of Broadband and Other Telecommunications Services" (81 Fed. Reg. 87274 (December 2, 2016)), and such rule shall have no force or effect.

Pub. L. No. 115-22, 131 Stat. 88.

However, Congress has begun to respond to public outrage over the immunity granted by the 1996 Communications Decency Act, 47 U.S.C. § 230(c)(1) (CDA § 230) (see chapter 7, § 7.2, of this book), to "interactive computer service" providers for traditional publishing activities, including curating and commenting, of sex-trafficking sites (Doe v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016), cert. denied, 137 S. Ct. 622 (mem.) (Jan. 9, 2017)), and the posting of terrorist-enabling messages on sites (Fields v. Twitter, Inc., 217 F. Supp. 3d 1116 (N.D. Cal. 2016); Gonzales v. Google, Inc., 2017 U.S. Dist. LEXIS 175327 (N.D. Cal. Oct. 23, 2017)). Two bills were filed to carve out sex trafficking from the publishing immunity of CDA § 230: the Allow States and Victims To Fight Online Sex Trafficking Act of 2017, H.R. 1865 (introduced Apr. 3, 2017), and the Stop Enabling Sex Traffickers Act of 2017, S. 1693 (introduced Aug. 1, 2017). On November 8, 2017, the Senate Committee on Commerce, Science & Transportation reported out S. 1693 for a floor vote with a substitute version including the following:

[Section 3(a)(2) addition to 47 U.S.C. § 230(e) (carve-outs from immunity):]

(5) NO EFFECT ON SEX TRAFFICKING LAW.—Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—

(A) any claim in a civil action brought under section 1595 of title 18, United States Code [sex trafficking of children], if the conduct underlying the claim constitutes a violation of section 1591 of that title;

[where 18 U.S.C. § 1591(e) is amended by Section 5(2) to clarify that:]

(4) The term "participation in a venture" [from which the defendant "benefits, financially or by receiving anything of value" under subsection (a)(2)] means knowingly assisting, supporting, or facilitating a violation of subsection (a)(1) [sex trafficking of a child].

Available at https://www.commerce.senate.gov/public/index.cfm/2017/11/senate-commerce-approves-the-stop-enabling-sex-traffickers-act-nominations.


Practice Note
Service provider opponents argue that this would force them not to monitor, for fear of "knowingly facilitating" sex trafficking. There is also concern that Congress would expand the carve-out to terrorist activities. Although there are concerns about the immunization of service providers from liability for posting (and maintaining) unauthorized intimate images, or revenge porn (see chapter 9, § 9.2.3(e), of this book), it seems unlikely that Congress would go that far.

§ 16.3

THE FTC FRAMEWORK

As reviewed in chapter 8 of this book, the FTC has been the primary watchdog for U.S. consumer privacy, but it is limited in its powers to those granted under the FTCA over “unfair or deceptive trade practices” that has suffered congressional limitation from time to time. While occasionally attempting to punish major breaches of security on the “fairness” ground, the FTC has enforced more in the privacy area, particularly pursuing those who promise too much in their privacy policies and thus have acted deceptively. The change of administration, including the change in chairing of the FTC, has cast a shadow on continuing efforts of the FTC to pursue enforcement actions and rulemaking on the basis of “fairness.” See webpage for the FTC Bureau of Consumer Protection, https://www.ftc.gov/about-ftc/bureaus-offices/ bureau-consumer-protection (previously more-prominently featuring privacy); also LabMD, Inc. v. FTC, 678 Fed. Appx. 816, 2016 U.S. App. LEXIS 23559 (11th Cir. Nov. 10, 2016) (staying enforcement of unfair practices order pending appeal). However, the FTC is still statutorily charged, as reviewed in chapter 8 and throughout this book, with rulemaking and enforcement responsibility relating to consumer privacy. See “Privacy and Data Security Update 2016” (Jan. 2017), available at https://www.ftc.gov/system/files/documents/reports/privacy-data-security-update2016/privacy_and_data_security_update_2016_web.pdf. Thus, it is useful to refer to such of its privacy policies that, although downplayed, are still available on the FTC site. The FTC, in its 2000 report to Congress entitled “Privacy Online: Fair Information Practices in the Electronic Marketplace,” also expressly referred to the Fair Information Practices Principles, although combining “access” with “correction” or “integrity” into four principles. See Privacy Online: Fair Information Practices in the Electronic Marketplace at 3–4 (FTC, May 2000), available at https://www.ftc.gov/ sites/default/files/documents/reports/privacy-online-fair-information-practiceselectronic-marketplace-federal-trade-commission-report/privacy2000text.pdf. Those four principles are as follows: (1) Notice. Web sites would be required to provide consumers clear and conspicuous notice of their information practices, including what information they collect, how they collect it (e.g., directly or through non-obvious means such as cookies), how they use it, how they provide Choice, Access, and Security to consumers, whether they disclose the information collected to 16–6


other entities, and whether other entities are collecting information through the site.

(2) Choice. Web sites would be required to offer consumers choices as to how their personal identifying information is used beyond the use for which the information was provided (e.g., to consummate a transaction). Such choice would encompass both internal secondary uses (such as marketing back to consumers) and external secondary uses (such as disclosing data to other entities).

(3) Access. Web sites would be required to offer consumers reasonable access to the information a Web site has collected about them, including a reasonable opportunity to review information and to correct inaccuracies or delete information.

(4) Security. Web sites would be required to take reasonable steps to protect the security of the information they collect from consumers.

Privacy Online: Fair Information Practices in the Electronic Marketplace (FTC, May 2000) at iii.

The FTC expanded on these principles in its 2012 report entitled "Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policy Makers" (hereinafter FTC Framework). See FTC Framework (FTC, Mar. 2012), available at https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf. It noted public comments as follows:

The "notice-and-choice model," which encouraged companies to develop privacy policies describing their information collection and use practices, led to long, incomprehensible privacy policies that consumers typically do not read, let alone understand. The "harm-based model," which focused on protecting consumers from specific harms—physical security, economic injury, and unwarranted intrusions into their daily lives—had been criticized for failing to recognize a wider range of privacy-related concerns, including reputational harm or the fear of being monitored. . . .

FTC Framework at 2. The FTC Framework is intended to apply to "all commercial entities that collect or use consumer data that can be reasonably linked to a specific consumer, computer, or other device, unless the entity collects only non-sensitive data from fewer than 5,000 consumers per year and does not share the data with third parties." FTC Framework at vii.


Relative to the "security" principle, the FTC Framework recommends "privacy by design," such that, among other things, "[c]ompanies should incorporate substantive privacy protections into their practices, such as data security, reasonable collection limits [in "context" of the purpose to a transaction], sound retention and disposal practices, and data accuracy." FTC Framework at vii.

As to the "choice" principle, the FTC Framework divides data collection into two categories: (A) practices that do not require consumer choice, and (B) other practices that do.

[A] Companies do not need to provide choice before collecting and using consumer data for practices that are consistent with the context of the transaction or the company's relationship with the consumer, or are required or specifically authorized by law.

FTC Framework at vii. "Commonly accepted" practices include product fulfillment. FTC Framework at 36. However, consumers must be offered a choice whether to be tracked across third-party websites—for example, by deep packet inspection (DPI) and social media ("like") plug-ins. FTC Framework at 40. The FTC Framework examined, but did not provide, rigid principles relative to requiring choice for "enhanced data" (such as facial recognition) or "sensitive data" collected in context. FTC Framework at 42–48.

[B] For practices requiring choice, companies should offer the choice at a time and in a context in which the consumer is making a decision about his or her data. Companies should obtain affirmative express consent before (1) using consumer data in a materially different manner than claimed when the data was collected; or (2) collecting sensitive data for certain purposes.

FTC Framework at viii. The FTC Framework treats the "notice" and "access" principles under "transparency," as follows:

Privacy notices should be clearer, shorter, and more standardized to enable better comprehension and comparison of privacy practices. Companies should provide reasonable access to the consumer data they maintain; the extent of access should be proportionate to the sensitivity of the data and the nature of its use.

FTC Framework at viii. The "integrity" principle is, in a sense, embedded in access—if one has access, one may demand correction. The FTC Framework considered access to "teen data" and a "right to be forgotten," and supported the idea of an "eraser button" for teens. FTC Framework at 70–71.

The FTC undertook to focus policy making over the following year on "do not track" options on Web browsers, mobile services, data brokers, large-platform providers such as Internet service providers, and social media networks that collect data for many purposes, and on promoting enforceable self-regulatory codes. FTC Framework at v–vi. To date, the FTC has published reports on mobile privacy disclosures, and on


considerations discussed at a workshop on the "Internet of things." See Mobile Privacy Disclosures: Building Trust Through Transparency: A Federal Trade Commission Staff Report (FTC, Feb. 2013), available at https://www.ftc.gov/sites/default/files/documents/reports/mobile-privacy-disclosures-building-trust-through-transparency-federal-trade-commission-staff-report/130201mobileprivacyreport.pdf; see also Federal Trade Commission Staff Report on the November 2013 Workshop Entitled The Internet of Things: Privacy and Security in a Connected World (FTC, Jan. 2015), available at https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf.

§ 16.4

THE STATES

As detailed in chapter 9 of this book, the states, particularly California, continue to be active in legislating protections for their citizens in cyberspace. Data breach notification laws have spread among the states, as have antimalware and antiphishing laws. There have been state initiatives on a wide range of digital information privacy issues, including fiduciary (e.g., executor of estate) access to digital accounts, prohibition of coerced employer or school access to private (mostly social media) accounts, and prosecution of perpetrators of "revenge porn."

The American Law Institute (ALI)—which has had great influence over judicial decision-making, particularly through its "restatements" of the common law of the states, as in Dean Prosser's Restatement (Second) of Torts formulation of common law privacy—embarked in 2013 on a project of a Restatement of Data Privacy Principles that has since been recast as "Principles of the Law, Data Privacy" (where "restatements" are intended to influence court decisions while "principles" are intended to influence best practices of private entities). As stated in the public-facing web page on the project,

The Principles of the Law, Data Privacy (Data Privacy Principles) are designed to guide the protection of data privacy in various areas and types of law. This area of law is generally referred to as information privacy law or data privacy law in the United States and data protection law in most other countries.

. . . .

This Principles project is organized around key Fair Information Practice Principles (FIPPs) that establish duties and responsibilities for entities that process personal information. They also describe the rights that people should have regarding their data. These Principles draw on statutory expressions of FIPPs as found in federal laws such as the Fair Credit Reporting Act (1970), the Privacy Act (1974), the Video Privacy Protection Act (1988), the Health Insurance Portability and Accountability Act (1996), the Children's Online Privacy Protection Act (1998),


and the Gramm-Leach-Bliley Act (1999). States have also incorporated important statutory expressions set out in FIPPs.

Project Feature: Principles of the Law: Data Privacy, The ALI Adviser, http://www.thealiadviser.org/data-privacy/. The particular data privacy principles are organized as

• transparency,
• individual notice,
• consent,
• confidentiality,
• use limitation,
• data quality,
• access and correction,
• data portability,
• data destruction,
• data security, and
• accountability.

Project Feature: Principles of the Law: Data Privacy, The ALI Adviser, http://www.thealiadviser.org/data-privacy/. The project is still in its early stages as of this writing.

The National Conference of Commissioners on Uniform State Law (NCCUSL, also known as the Uniform Law Commissioners, or ULC)—creators of the Uniform Commercial Code (whose updating is shared with the ALI), the Uniform Trade Secrets Act, and the Uniform Electronic Transactions Act (enabling the use of electronic signatures) (see chapter 1, § 1.3.4, of this book), among others—has also been involved in harmonizing state laws addressing aspects of data privacy. Recent projects include the following:

• Revised Uniform Fiduciary Access to Digital Assets Act (2015)—See http://www.uniformlaws.org/Act.aspx?title=Fiduciary Access to Digital Assets Act, Revised (2015). This was a response to the growing problem of access to personal online accounts when the account holder dies or is incapacitated without clear and effective directions to the account service provider. Fiduciaries such as personal representatives are required by state law to marshal the assets (i.e., estate) of the decedent and, without access to the login information, are prevented from searching the accounts for assets or directions to assets. On the other hand, some service providers assert that federal law (notably the Stored Communications Act, 18 U.S.C. §§ 2701–2709 (SCA)), their terms of service, and consent orders with the FTC bar them from providing access without "lawful consent." But see Ajemian v. Yahoo!, Inc., 478 Mass. 169 (2017) (SCA does not preempt state estate administration). There was also concern that some individuals wished to keep information in their accounts inaccessible


to survivors. The promulgated Act provides the custodians (service providers) with "lawful consent" and limits the fiduciary's uses of "digital assets," subject to the subscriber's designation on the provider's "online tool," which overrides even later wills.

• Uniform Employee and Student Online Privacy Protection Act (2016)—See http://www.uniformlaws.org/Act.aspx?title=Employee and Student Online Privacy Protection Act. This was a response to the momentum of state legislation protecting against coercion of current and prospective employees and students to disclose the login information for their personal social media and other online communications accounts. See chapter 9, § 9.2.3(c), of this book.

• Uniform Civil Remedies for Unauthorized Disclosure of Intimate Images (final reading and promulgation expected in 2018)—See http://www.uniformlaws.org/Committee.aspx?title=Unauthorized Disclosure of Intimate Images. This drafting committee was formed in response to the issue of "revenge porn" that many states addressed through criminal statutes (see chapter 9, § 9.2.3(e), of this book). As discussed in § 16.2 above, CDA § 230 presents a barrier to suits against publishing sites that accept and post the content provided by others, so the legislation is primarily directed against the poster.

• Study Committee on Data Breach Notification (formed 2017)—See http://www.uniformlaws.org/Committee.aspx?title=Data Breach Notification. This committee is to address the possibility of harmonizing the forty-eight state measures for notification of data breaches (see chapter 9, § 9.2.1, of this book), particularly given the new types of personal information now threatened, the age of the decade-old enactments, and Congress's failure to act.

Practice Note
In Massachusetts, two bills, S. 95 and H. 1985, were introduced in the 190th General Court (2016–18) to include "biometric indicators" among the personal information protected under Chapter 93H (Security Breaches) of the Massachusetts General Laws.

§ 16.5

IDENTITY MANAGEMENT

While the threat of identity theft has continued through breaches of large databases of common identity credentials (passwords, Social Security numbers, fingerprints, etc.), efforts have been renewed to increase the robustness of credentials used to authenticate transactions and other communications. These efforts include the use of "multi-factor" authentication, which multiplies together the (small) probabilities of compromise of the following factors held by the authenticator, reducing the overall chance of unauthorized use:

• something the authenticator has (e.g., device signature, passport, hardware device containing a credential, private key);
• something the authenticator knows (e.g., password, PIN);
• something the authenticator is (e.g., biometric characteristic);
• something the authenticator typically does (e.g., behavior pattern).

Recommendation ITU-T X.1254: Entity authentication assurance framework at 1 (Sept. 2012), available for free download at http://www.itu.int/rec/T-REC-X.1254-201209-I.
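By way of illustration only (this sketch is not drawn from the ITU-T recommendation, and the function names and parameters are hypothetical), a verifier might combine a knowledge factor with a possession factor, here a password check plus a time-based one-time password (TOTP, per RFC 6238) generated by a device the user holds:

import hashlib
import hmac
import struct
import time

def totp(device_secret: bytes, digits: int = 6, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238): HMAC over the current 30-second window."""
    counter = struct.pack(">Q", int(time.time() // step))
    digest = hmac.new(device_secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def authenticate(password: str, salt: bytes, stored_hash: bytes,
                 otp: str, device_secret: bytes) -> bool:
    """Grant access only if BOTH factors check out; compromising one alone is not enough."""
    knows = hmac.compare_digest(
        hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000), stored_hash)
    has = hmac.compare_digest(otp, totp(device_secret))
    return knows and has

Because an attacker must defeat both checks, the probability of a successful impersonation is roughly the product of the probabilities of compromising each factor separately.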


As described in chapter 1, § 1.3.4, of this book, for over a generation, public key infrastructure (PKI) (using "digital signatures") had been considered an approach to ensure very high confidence of authentication in electronic (online) transactions. However, NCCUSL in UETA and Congress in E-SIGN pursued a technology-neutral approach to electronic signatures. Ultimately, because of the risk that certification authorities needed to assume, and the attendant price needed to support them, PKI enjoyed only limited adoption (such as the certification of federal documents supported by Adobe, which has not been adopted by, or perhaps not offered to, other governmental entities).

In the past decade, entrepreneurs have seen an opportunity in "federated identity management," in which third parties (with a more limited role than the certification authorities of PKI proposed in the 1990s) perform authentication functions for transactions that are not necessarily tied to such networks as credit card services. Virginia in 2015 enacted a law to provide a "safe harbor" to incentivize the general scheme, Va. Code §§ 2.2-436 to 2.2-437, http://law.lis.virginia.gov/vacode/title2.2/chapter4.3/ (identity management standards), and §§ 59.1-550 to 59.1-555, http://law.lis.virginia.gov/vacode/title59.1/chapter50/ (Electronic Identity Management Act), including

• identity management standards (by administrative agency);
• "identity providers";
• "attribute [of identity] providers";
• "identity proofers"; and
• "trustmarks" as parts of an "identity trust framework."

The ULC convened a study committee on "Identity Management in Electronic Commerce," see http://www.uniformlaws.org/Committee.aspx?title=Identity Management in Electronic Commerce, which considered the Virginia legislation but has not proceeded with a drafting project.

Meanwhile, the concept of a "blockchain" of blocks of transactions added in a chain to a complete ledger of all transactions, each authenticated by any one of multiple parties with access (thus a "distributed ledger"), was floated in the late 1990s and introduced as a basis for the "virtual currency" of "Bitcoin" around 2008. In the Bitcoin scheme, a member of the public who solves the problem of finding a "nonce," that is, a number which, combined with a hash of the new ledger, results in a set, easily verifiable number, is deemed to have "mined" a Bitcoin; that member authenticates the ledger and is rewarded.
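A minimal sketch of this "proof of work" idea follows (illustrative only; Bitcoin's actual protocol hashes a structured block header with double SHA-256, and the difficulty target below is simplified to a count of leading zero hex digits):

import hashlib
from itertools import count

def mine(block_data: str, difficulty: int = 5):
    """Search nonces until the hash meets the target (costly by design)."""
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest  # proof that work was done

def verify(block_data: str, nonce: int, difficulty: int = 5) -> bool:
    """Anyone can check the proof with a single hash (cheap by design)."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce, digest = mine("ledger-block-1")
assert verify("ledger-block-1", nonce)

The asymmetry (expensive to find, trivial to verify) is what lets any member of the public check the miner's work.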


Other approaches to authenticating a common ledger have been proposed, including a closed system of round-robin authentication among, for example, participating banks, which may eliminate back-office resources now expended to reconcile disparate ledgers. See Nathaniel Popper, "Business Giants to Announce Creation of a Computing System Based on Ethereum," N.Y. Times, Feb. 27, 2017, available at https://www.nytimes.com/2017/02/27/business/dealbook/ethereum-alliance-business-banking-security.html?_r=0.

The blockchain "ledger" might include other information, such as a "smart contract," such that administration may be automatic or authenticated by distributed entities. An early example of the use of a "smart contract" for distributing proceeds in crowdfunding involved an embarrassing "hack." Nathaniel Popper, "A Hacking of More Than $50 Million Dashes Hopes in the World of Virtual Currency," N.Y. Times, June 17, 2016, available at https://www.nytimes.com/2016/06/18/business/dealbook/hacker-may-have-removed-more-than-50-million-from-experimental-cybercurrency-project.html.

The advantages of blockchain also raise issues. The purported anonymity of the Bitcoin scheme has made it a choice for extortionists using ransomware, such as in the recent and ongoing WannaCry compromise of unpatched Microsoft operating systems, as well as for other money-laundering operations. Concern about these issues for consumers at the on and off "ramps" led the Treasury Department's Financial Crimes Enforcement Network (FinCEN) to issue a 2013 guidance: Guidance: Application of FinCEN's Regulations to Persons Administering, Exchanging, or Using Virtual Currencies, FIN-2013-G001 (Dept. of the Treasury, Financial Crimes Enforcement Network Mar. 18, 2013), https://www.fincen.gov/sites/default/files/shared/FIN-2013-G001.pdf. New York followed with a virtual currency business regulation, N.Y. Rules and Regulations Code §§ 200.1–200.22, which inspired the 2017 promulgation by the ULC of the Uniform Regulation of Virtual-Currency Business Act, http://www.uniformlaws.org/shared/docs/regulation%20of%20virtual%20currencies/URVCBA_Final_2017oct9.pdf.

In 2016, after discussion drafts for amending E-SIGN to include "blockchain" and "smart contracts" were introduced, the U.S. House of Representatives adopted Resolution 835, which states,

blockchain technology with the appropriate protections has the potential to fundamentally change the manner in which trust and security are established in online transactions through various potential applications in sectors including financial services, payments, health care, energy, property management, and intellectual property management.

(The full text of H. Res. 835 is included as Exhibit 16A.)

Arizona is the first state to have enacted a law expressly addressed to enabling the use of blockchain and smart contract technology:

§ 44-7061. Signatures and records secured through blockchain technology; smart contracts; ownership of information; definitions


A. A signature that is secured through blockchain technology is considered to be in an electronic form and to be an electronic signature.

B. A record or contract that is secured through blockchain technology is considered to be in an electronic form and to be an electronic record.

C. Smart contracts may exist in commerce. A contract relating to a transaction may not be denied legal effect, validity or enforceability solely because that contract contains a smart contract term.

D. Notwithstanding any other law, a person that, in or affecting interstate or foreign commerce, uses blockchain technology to secure information that the person owns or has the right to use retains the same rights of ownership or use with respect to that information as before the person secured the information using blockchain technology. This subsection does not apply to the use of blockchain technology to secure information in connection with a transaction to the extent that the terms of the transaction expressly provide for the transfer of rights of ownership or use with respect to that information.

E. For the purposes of this section:

1. "Blockchain technology" means distributed ledger technology that uses a distributed, decentralized, shared and replicated ledger, which may be public or private, permissioned or permissionless, or driven by tokenized crypto economics or tokenless. The data on the ledger is protected with cryptography, is immutable and auditable and provides an uncensored truth.

2. "Smart contract" means an event-driven program, with state, that runs on a distributed, decentralized, shared and replicated ledger and that can take custody over and instruct transfer of assets on that ledger.

Ariz. Rev. Stat. § 44-7061, added by 2017 Ariz. Sess. Laws 97, § 2.
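The statutory definition describes an event-driven program, with state, that can take custody of assets. Purely as a toy illustration of that definition (hypothetical code, far simpler than any real on-chain contract), an escrow might be modeled as:

class EscrowContract:
    """A toy 'smart contract': event-driven, stateful, holding custody of an asset."""

    def __init__(self, seller: str, buyer: str, price: int):
        self.state = "AWAITING_PAYMENT"
        self.seller, self.buyer, self.price = seller, buyer, price
        self.held = 0  # funds in the contract's custody

    def on_event(self, event: str, amount: int = 0) -> str:
        # Transitions fire automatically as ledger events arrive.
        if self.state == "AWAITING_PAYMENT" and event == "payment" and amount >= self.price:
            self.held, self.state = amount, "AWAITING_DELIVERY"
        elif self.state == "AWAITING_DELIVERY" and event == "delivery_confirmed":
            self.held, self.state = 0, "COMPLETE"  # instructs transfer of funds to seller
        return self.state

escrow = EscrowContract("seller-address", "buyer-address", price=100)
escrow.on_event("payment", 100)        # state -> AWAITING_DELIVERY
escrow.on_event("delivery_confirmed") # state -> COMPLETE

On a real distributed ledger, every authenticating node runs the same transitions, so no single party can divert the held funds; the 2016 crowdfunding "hack" noted above exploited flawed transition logic of just this kind.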


Nevada followed with a simple amendment of its Uniform Electronic Transactions Act to include a "blockchain" as an "electronic record," relying on the existing enablement and rules relative to security procedures. Nev. Rev. Stat. 719.090, amended by 2017 Nev. Stat. 391, § 3 (existing definition of "electronic record": "The term includes, without limitation, a blockchain").

Delaware amended its general corporate law to allow blockchain records:

Any records administered by or on behalf of the corporation in the regular course of its business, including its stock ledger, books of account, and minute books, may be kept on, or by means of, or be in the form of, any information storage device, method, or one or more electronic networks or databases (including one or more distributed electronic networks or databases), provided that the records so kept can be converted into clearly legible paper form within a reasonable time, and, with respect to the stock ledger, that the records so kept (i) can be used to prepare the list of stockholders specified in § 219 and § 220 of this title, (ii) record the information specified in § 156, § 159, § 217(a) and § 218 of this title, and (iii) record transfers of stock as governed by Article 8 of subtitle I of Title 6. Any corporation shall convert any records so kept into clearly legible paper form upon the request of any person entitled to inspect such records pursuant to any provision of this chapter. When records are kept in such manner, a clearly legible paper form prepared from or by means of the information storage device, method, or one or more electronic networks or databases (including one or more distributed electronic networks or databases) shall be valid and admissible in evidence, and accepted for all other purposes, to the same extent as an original paper record of the same information would have been, provided the paper form accurately portrays the record.

Del. Code Ann. tit. 8, § 224 (emphasis added), amended by 2017 Del. Laws 86, § 7.

Blockchain is designed as an authentication-security tool that takes advantage of the increased computing power and storage available today compared to the PKI of the 1990s. Although the current public excitement is over blockchain as a basis for virtual currencies, the broader adoption may be as a common ledger for banks, avoiding the multiple ledgers of today and the associated back-office resources needed to maintain and reconcile them. It is argued that such simplification may enhance security because of the encryption components.
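Why a blockchain ledger is "immutable and auditable," in the words of the Arizona definition, can be seen in a toy hash chain (again purely illustrative; no real network uses this exact format):

import hashlib
import json

def block_hash(block: dict) -> str:
    """Canonical SHA-256 fingerprint of a block."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    """Each new block commits to the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def audit(chain: list) -> bool:
    """Tampering with any earlier block breaks every later link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, ["Alice pays Bob 5"])
append_block(ledger, ["Bob pays Carol 2"])
assert audit(ledger)

ledger[0]["transactions"] = ["Alice pays Mallory 500"]  # attempted tampering
assert not audit(ledger)

Because every participant can recompute the hashes, alteration of a replicated ledger is detectable by anyone, which is the auditability the statutes describe.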

§ 16.6

CONCLUSION

The American public has continued to be inundated with stories of widespread data breaches—including the compromise of 145.5 million American consumer accounts in the hands of one of the credit agencies that determine the creditworthiness, and thereby the cost of credit, for those consumers. In the nearly fifteen years since California enacted the first breach notification law and forty-seven states followed, Congress has supported such notification with credit freezes under the Fair and Accurate Credit Transactions Act (FACTA) and authorized breach notification under the Health Information Technology for Economic and Clinical Health (HITECH) Act, but it has failed to harmonize breach notification or to update older laws such as the CFAA, the ECPA, and the CDA. On November 8, 2017, the Senate Committee on Commerce, Science & Transportation held a hearing on "Protecting Consumers in the Era of Major Data Breaches," at which present and former leaders of Equifax, Yahoo!, and Verizon testified and were questioned about security. See https://www.commerce.senate.gov/public/index.cfm/2017/11/protecting-consumers-in-the-era-of-major-data-breaches


(including video recording and submitted testimony). Perhaps this will lead to congressional action, at least the harmonization of breach notification. As the American public continues to embrace mobile commerce with real-time geolocation information, home appliance intelligence, drone photography, social media posting, messaging, "liking," and forwarding, and as providers to billions find ever greater uses for the "Big Data" collected, new scams and regrets emerge, some of which may capture the attention of state or national legislators or regulators.


EXHIBIT 16A—House of Representatives Resolution 835

H. RES. 835

In the House of Representatives, U.S., September 12, 2016.

Whereas technology solutions have the potential to improve consumers' ability to control their economic well-being, to encourage their financial literacy, and improve their knowledge base and increase their options to manage their finances and engage in commerce;

Whereas new payment methods and new payment strategies reflect new commercial opportunities;

Whereas the United States is the world leader in software development and technology creation;

Whereas financial technology is creating new opportunities for the 24,800,000 underbanked households in the United States;

Whereas the growth of consumers' use of mobile devices and the deployment of broadband access has supported the growth of financial technology products and services outside of traditional products and services offered by banks and other financial institutions in the United States increasing commerce and job growth;

Whereas identity theft is a rising concern for people in the United States as their personal information is targeted by criminal enterprises for monetization on the black market;

Whereas cyberattacks against domestic and international financial institutions and cooperatives continue;

Whereas emerging payment options, including alternative non-fiat currencies, are leveraging technology to improve security through increased transparency and verifiable trust mechanisms to supplant decades old payment technology deployed by traditional financial institutions; and

Whereas blockchain technology with the appropriate protections has the potential to fundamentally change the manner in which trust and security are established in online transactions through various potential applications in sectors including financial services, payments, health care, energy, property management, and intellectual property management: Now, therefore, be it

Resolved, That it is the sense of the House of Representatives that—


(1) the United States should develop a national policy to encourage the development of tools for consumers to learn and protect their assets in a way that maximizes the promise customized, connected devices hold to empower consumers, foster future economic growth, create new commerce and new markets;

(2) the United States should prioritize accelerating the development of alternative technologies that support transparency, security, and authentication in a way that recognizes their benefits, allows for future innovation, and responsibly protects consumers' personal information;

(3) the United States should recognize that technology experts can play an important role in the future development of consumer-facing technology applications for manufacturing, automobiles, telecommunications, tourism, health care, energy, and general commerce;

(4) the United States should support further innovation, and economic growth, and ensure cybersecurity, and the protection of consumer privacy; and

(5) innovators in technology, manufacturing, automobiles, telecommunications, tourism, health care, and energy industries should commit to improving the quality of life for future generations by developing safe and consumer protective, new technology aimed at improving consumers' access to commerce.

Attest: Clerk.


Table of Cases

References are to section numbers of this book, unless otherwise indicated.

A Acara v. Banks, 6.5.1, 6.6.2(b) Acosta v. Byrum et al, 6.6.2(b) Adams v. Congress Auto Ins. Agency, Inc., 10.2.8 Advanced Micro Devices, Inc. v. Feldstein, 3.2.4 Agu v. Rhea, 4.1.2 Ajemian v. Yahoo!, Inc., 16.4 Alleruzzo v. SuperValu, Inc., 9.3 Amendment of Section 64.701 of the Commission’s Rules & Regulations, In re, 12.2.1(b) America Online, Inc. v. National Health Care Discount, Inc., 3.3.2, 3.3.4 American Bar Assoc. v. FTC, 4.2.5 American Express Co. v. Italian Colors Rests., 9.3 American Guar. & Liab. Ins. Co. v. Ingram Micro, Inc., 14.5.1 American Library Ass’n, United States v., 7.2.1 American Tooling Center, Inc. v. Travelers Cas. & Sur. Co. of Am., 14.5.10 Amerifirst Bank v. TJX Cos. (In re TJX Cos. Retail Sec. Breach Litig.), 9.3 Aminpour v. Arbella Mut. Ins. Co., 10.2.8 Andersen Consulting LLP v. UOP, 2.5.4, 2.5.6(b) Anderson v. Hannaford Bros. Co., 9.3 Andritz, Inc. v. S. Maint. Contractor, 3.2.5 Anthony v. Experian Info. Solutions, Inc., 4.3.4(a) Apache Corp. v. Great Am. Ins. Co., 14.5.3, 14.5.10, 14.5.11 Apple, Inc. v. Superior Ct. of Los Angeles County, 9.3


Application of United States of Am. for an Order Authorizing the Use of a Pen Register & Trap on [xxx] Internet Service Account, In re, 2.2 Aqua Star (USA) Corp. v. Travelers Cas. & Sur. Co. of Am., 14.5.9 ASUSTeK Computer, Inc., In re, 8.5 Atlantic Recording Corp. v. Project Playlist, Inc., 7.2.3 Atlantic States Legal Found., Inc. v. Tyson Foods, Inc., 6.5.6 Attias v. CareFirst, Inc., 9.3 Auernheimer, United States v. (2012), 3.5.1 Auernheimer, United States v. (2014), 3.5.1 Augustine, Commonwealth v., 2.5.7, 2.5.8, 2.6 Austin v. Certegy Payment Recovery Servs., Inc., 4.3.4(a) Avetisyan v. Experian Info. Solutions, Inc., 4.3.4(a)

B Backhaut v. Apple Inc., 2.3.3 Baker v. TransUnion LLC, 4.3.3(a) Barrepski v. Capital One Bank (U.S.A.) N.A. (2011), 4.1.2 Barrepski v. Capital One Bank (U.S.A.) N.A. (2014), 4.1.2 Barrington, United States v., 2.4 Beck v. McDonald, 9.3 Belle Chasse Auto. Care, Inc. v. Advanced Auto Parts, Inc., 9.3 Berger v. New York, 1.3.1 Berkson v. Gogo LLC, 11.2 Biasucci, United States v., 2.3.4 BJ's Wholesale Club, Inc., In re, 1.2.2(b)


Board of Overseers of the Bar v. Ferris, 2.5.2 Boggio v. USAA Fed. Sav. Bank, 4.1.2 Bollaert, People v., 7.2.6 Bormes v. United States, 4.2.1 Bower v. Bower, 2.5.9 Boyd v. United States, 1.3.1, 9.1.1 Boyle, United States v., 6.5.3 Briggs v. American Air Filter Co., Inc., 2.3.2 Brown v. Wal-Mart Stores, Inc., 4.2.9 Burgos, Commonwealth v., 2.4.2 Byrne v. Avery Center of Obstetrics and Gynecology, 6.6.2(b)

C C.S. v. United Bank, Inc., 9.3 Camp’s Grocery Inc. v. State Farm Fire & Cas. Co., 14.6.13 Carpenter, United States v., 2.5.8 Carvalho v. Equifax Info. Serv., 4.1.2 Cassara v. DAC Servs., Inc., 4.1.2 Chamberlain Group, Inc. v. Skylink Techs., Inc., 7.3.9 Cheng v. Romo, 2.5.3 Cherny v. Emigrant Bank, 9.3 Chiang v. Verizon New England, Inc., 4.1.2 Cloudpath Networks, Inc. v. SecureW2 B.U., 3.2.4 Collins v. Experian Credit Reporting Serv., 4.3.4(a) Comcast Corp. v. FCC, 12.2.1(c) Commonwealth v., see name of party Cordas v. Uber Techs., Inc., 11.2 Cortez v. Trans Union, 4.1.2 Councilman, United States v., 2.3.3 Credit Protection Assoc., United States v., 8.4.2 Crispin v. Christian Audigier, Inc., 2.5.9 Cullinane v. Uber Techs., Inc., 11.2 Cumis Ins. Soc’y, Inc. v. BJ’s Wholesale Club, Inc., 9.3 Cushman v. Trans Union, 4.1.2 Czubinski, United States v., 3.3.4


D Dalton v. Capital Associated Indus., 4.1.2, 4.2.9 Daniels v. Experian Info. Solutions, Inc., 4.2.2 DeAndrade v. Trans Union LLC, 4.1.2 Devries v. Experian Info. Solutions, Inc., 11.2 DiGiulio, State v., 3.2.4 Dish Network, LLC, United States v., 8.4.2 Dobson v. Grendahl, 4.2.8 Doe v. Backpage.com, LLC, 12.1.1, 16.2 Doe v. Friendfinder Network, Inc., 7.2.3 Doe v. Internet Brands, Inc., 7.2.5 Domestic Air Transp. Antitrust Litig., In re, 2.5.9 Dreher v. Experian Info. Solutions, Inc., 9.3 Drew, United States v., 3.5.2

E Earls, State v., 2.5.7 EF Cultural Travel BV v. Explorica, Inc., 3.2.4, 3.3.5 Eli Lilly & Co., In re, 1.2.2(b) Eller v. Trans Union LLC, 4.1.2 Enargy Power Co. Ltd. v. Wang, 3.2.4 Enslin v. Coca-Cola Co, 9.3 Epps v. St. Mary’s Hosp., 2.3.2 Equifax, Inc., Customer Data Sec. Breach Litig., In re, 1.1 Experian Info. Solutions, Inc. v. Lifelock, Inc., 4.3.3(a) Eyeblaster, Inc. v. Federal Ins. Co., 14.6.4

F Fair Hous. Council of San Francisco Valley v. Roommates.com, LLC, 7.2.5 Falls, United States v., 2.3.4 Feeney v. Dell, Inc., 9.3 Fields v. Twitter, Inc., 16.2


First Bank of Delaware, Inc. v. Fidelity & Deposit Co. of Maryland, 14.6.1 First Commonwealth Bank v. St. Paul Mercury Ins. Co., 14.6.5 Flagg v. City of Detroit, 2.5.9 Formal Complaint of Free Press & Pub. Knowledge Against Comcast Corp. for Secretly Degrading Peer-to-Peer Applications, In re, 12.2.1(c) Fraser v. Nationwide Mut. Ins. Co., 2.5.3 Freedman v. America Online, Inc., 2.5.6(c) Fregoso v. Wells Fargo Dealer Servs., Inc., 4.3.4(b) FTC v. A+Financial Ctr., LLC, 8.4.2 FTC v. All US Marketing LLC, 8.4.2 FTC v. H.N. Singer, 8.2 FTC v. Jones, 8.4.2 FTC v. LifeLock, Inc., 8.5 FTC v. Ruby Corp., 8.5 FTC v. U.S. Oil & Gas Corp., 8.2 FTC v. VIZIO, Inc., 8.5 FTC v. World Travel Vacation Brokers, Inc., 8.2 FTC v. Wyndham Worldwide Corp. (2012), 8.5 FTC v. Wyndham Worldwide Corp. (2014), 8.5 FTC v. Wyndham Worldwide Corp. (2015), 8.5

G Galaria v. Nationwide Mut. Ins. Co., 9.3 Gateway Learning Corp., In re, 1.2.2(b) Golan v. Veritas Entm't, LLC, 8.4.2 Goldman v. United States, 1.3.1 Gonzalez v. Google, Inc., 16.2 Goode v. LexisNexis Risk & Info. Analytics Grp., Inc., 4.2.4 Gordon v. Virtumundo, Inc., 7.5.3 Gordon, Commonwealth v., 2.4.1, 2.4.3 Gorman v. Wolpoff & Abramson, LLP, 4.1.2 Greene v. DirecTV, Inc., 4.3.3(d) Gresham v. Swanson, 8.4.2

Griswold v. Connecticut, 1.2.1(a) Guest-Tek Interactive Entm’t Inc. v. Pullen, 3.2.4, 3.6.9

H Haley v. TalentWise, Inc., 4.1.2 Hall, United States v., 2.5.1 Hannaford Bros., Co. Customer Data Sec. Breach Litig., In re (2009), 9.3 Hannaford Bros., Co. Customer Data Sec. Breach Litig., In re (2010), 9.3 Hannaford Bros., Co. Customer Data Sec. Breach Litig., In re (2013), 9.3 Harrold v. Levi Strauss & Co., 9.3 Hartford Cas. Ins. Co. v. Corcino & Assocs., 14.6.9 Herbst v. Able, 2.5.9 Home Depot, Inc. Customer Data Breach Litig., In re, 9.3 Hyde, Commonwealth v., 2.4.1 Hypertouch, Inc. v. ValueClick, Inc., 9.2.3(a)

I In re, see name of party InComm Holdings Inc. v. Great Am. Ins. Co., 14.5.5, 14.5.10 InMobi Pte Ltd., United States v., 8.4.4 International Airport Ctrs., LLC v. Citrin, 1.3.4, 3.2.4 International Harvester, In re, 8.5 IO Group, Inc. v. Veoh Networks, Inc., 7.3.6 ITT Cont’l Baking Co., United States v., 6.5.6

J Jackson, Commonwealth v., 2.4.1, 2.4.2 Jacobsen, United States v., 2.5.1 James v. Newspaper Agency Corp., 2.3.2 Jane Doe No. 1 v. Backpage.com, LLC, 7.2.6


Jarosch v. American Family Mut. Ins. Co., 3.2.5 Jerk, LLC, In re, 8.5 Joffe v. Google, Inc., 2.3.1 Johnson v. MBNA Am. Bank, NA, 4.1.2 Jones v. Dirty World Entm’t Recordings LLC (2013), 7.2.5 Jones v. Dirty World Entm’t Recordings LLC (2014), 7.2.5 Jones, United States v., 2.5.8

K Katz v. Pershing, LLC (2011), 10.2.8 Katz v. Pershing, LLC (2012), 9.3, 10.2.8 Katz v. United States, 1.3.1, 9.1.1 Kennedy, United States v., 2.5.1 Kewanee Oil Co. v. Bicron Corp., 1.1 Klumb v. Goan, 2.3.3 Konop v. Hawaiian Airlines, 2.3.3, 2.5.2, 2.6 Koyomejian, United States v., 2.3.4 Krottner v. Starbucks Corp., 9.3 Kuvayev, Commonwealth v., 7.5.3 Kwiatowski, Commonwealth v., 9.2.3(d)

L LabMD, Inc., In re, 8.5 LabMD, Inc. v. FTC, 16.3 LAI Systems, LLC, United States v., 8.4.4 Lambrecht & Assocs. v. State Farm Lloyds, 14.5.2 Langevin, United States v., 3.6.1 Lanteri v. Credit Prot. Assoc. L.P., 8.4.2 Larios, United States v., 2.3.4 Lenovo (United States), Inc., In re, 8.5 Lenox v. Equifax Info. Servs., 4.2.2 Lenz v. Universal Music Corp., 7.3.4 Lexmark Int'l, Inc. v. Static Control Components, Inc., 7.3.9 Lilly Mgmt. & Mktg., LLC, United States v., 8.4.2 Lone Star Nat'l Bank N.A. v. Heartland Payment Sys., Inc., 9.3 Lujan v. Defenders of Wildlife, 9.3

LVRC Holdings LLC v. Brekka, 3.2.4

M Martin v. Evans, 2.4.1 Matot v. CH, 3.5.2 MBTA v. Anderson, 3.2.3 McDermott v. Marcus, Errico, Emmer & Brooks, P.C., 8.5 McLoughlin v. People’s United Bank, Inc., 9.3 McQuade v. Michael Gassner Mech. & Elec. Contractors, Inc., 2.3.2 Medidata Solutions, Inc. v. Federal Ins. Co., 14.5.11 Metter v. Uber Techs., 11.2 Meyer v. Kalanick, 11.2 Millar v. Taylor, 1.1 Miller v. Johnson & Johnson, Janssen Pharm., Inc., 4.2.4 Miller, United States v., 2.5.7 Mintz v. Mark Bartelstein & Assoc. Inc., 2.5.9

N NAACP v. Alabama, 1.2.1(a), 9.1.1 Nguyen v. Barnes and Noble, Inc., 11.2 Nickelodeon Consumer Privacy Litig., In re, 8.4.4 Nicosia v. Amazon.com, Inc., 11.2 Nix v. O’Malley, 2.3.2 Nosal, United States v. (2009), 3.2.4 Nosal, United States v. (2012), 1.3.4, 3.2.4, 3.5.3 Nosal I, 3.2.4 Nosal II, 1.3.4, 3.2.4, 3.5.3 Novak v. Experian Info. Solutions, Inc., 4.3.2

O O’Grady v. Super. Ct., 2.5.9 Olmstead v. United States, 1.3.1, 2.1 Oppenheimer v. Allvoices, Inc., 7.3.3 Order Authorizing the Release of Historical Cell-Site Data, In re United States of Am. for an, 2.6


P P.F. Chang’s China Bistro, Inc. v. Fed. Ins. Co., 14.6.12 Patton v. Experian Data Corp., 9.3 Peabody v. Norfolk, 9.1.2 People v., see name of party Perez v. Portfolio Recovery Assocs., LLC, 4.2.8 Perfect 10, Inc. v. CCBill, LLC, 7.2.3 Perry v. First Nat. Bank, 4.2.4, 4.2.5, 4.2.6 Pestmaster Servs. Inc. v. Travelers Cas. and Sur. Co. of Am., 14.5.8, 14.5.10, 14.5.11 Peter Ian Cummings, In re, 15.3 Pharmatrak Litig., In re, 3.2.5 Pineda v. Williams-Sonoma, Stores, Inc., 9.3 Pinero v. Jackson Hewitt Tax Serv. Inc., 9.3 Pintos v. Pac. Creditors Ass’n, 4.2.8 Pisciotta v. Old Nat’l Bancorp, 9.3 Ponder v. Prizer, Inc., 9.3 Practice Fusion, Inc., In re, 8.5 Premera Blue Cross Customer Data Sec. Breach Litig., In re, 9.3 Preserving the Open Internet, In re (2010), 12.2.1(c) Preserving the Open Internet, In re (2015), 12.2.1(c), 12.2.1(d) Principle Solutions Group, LLC v. Ironshore Indem., Inc., 14.5.4 ProCD, Inc. v. Zeidenberg, 11.2 Project Veritas Action Fund v. Conley, 2.4.1

R R.K. v. St. Mary's Med. Center, 6.6.2(b) RadioShack Corp., In re, 15.3 Randolph v. ING Life Ins. & Annuity Co., 9.3 Recall Total Info. Mgmt., Inc. v. Fed. Ins. Co., 14.6.7 Reilly v. Ceridian Corp., 9.3 Remijas v. Neiman Marcus Grp., LLC, 9.3

Reno v. American Civil Liberties Union, 7.2.1 Retail Ventures, Inc. v. Nat’l Union Fire Ins. Co., 14.6.2 Retro Dreamer, United States v., 8.4.4 Rich v. Rich, 2.4 Ridenour v. Multi-Color Corp., 4.2.9 Roberson v. Rochester Folding-Box Co., 9.1.2 Rodriguez, United States v., 3.2.4 Roe v. Wade, 1.2.1(a) Roof & Rack Prod., Inc. v. GYB Investors, LLC, 7.3.8 Rosolowski v. Guthy-Renker, LLC, 9.2.3(a) Rousseau, Commonwealth v., 2.5.7 Ruffin-Thompkins v. Experian Info. Solutions, Inc., 4.1.2 Ruiz v. Gap, Inc., 9.3 Rushdan, United States v., 3.3.6

S Safeco Ins. Co. of Am. v. Burr, 4.1.1, 4.1.2 Salameno v. Gogo Inc., 11.2 Schweitzer v. Equifax Info. Solutions LLC, 4.1.2 Seidlitz, United States v., 3.6.1 Shlahtichman v. 1-800 Contacts, Inc., 4.2.1 Shurgard Storage Ctrs., Inc. v. Safeguard Self Storage, Inc., 3.2.1 Simonoff v. Expedia, Inc., 4.2.1 Skinner, United States v., 2.5.8 Smith v. Maryland, 2.5.7, 2.5.8 Snapchat, Inc., In re, 8.5 Sony Corp. of Am. v. Universal City Studios, Inc., 7.3.4 Sony Gaming Networks & Customer Data Sec. Breach Litig., 9.3 Soundboard Assoc. v. U.S. Fed. Trade Comm’n, 8.4.2 South Carolina Med. Ass’n v. Thompson, 8.3.3 Soutter v. Equifax Info. Servs., LLC, 4.1.2


Specht v. Netscape Communications Corp., 11.2 Spokeo, Inc. v. Robins, 4.1.2, 4.2.9, 8.4.2, 9.3 Sprint Communications Co. L.P. v. APCC Serv., Inc., 9.3 State Auto Prop. & Cas. Ins. Co. v. Midwest Computers & More, 14.6.3, 14.6.4 State Bank of Bellingham v. BancInsure, Inc., 14.5.7 State v., see name of party Steiger, United States v., 2.5.1 Stipulated Consent Agreement & Final Order, 15.3 Subpoena Duces Tecum to AOL, LLC, In re, 2.5.9 SuperValu, Inc., Customer Data Sec. Breach Litig., In re, 9.3 Swartz, United States v., 3.5.1

T Target Corp. Customer Data Sec. Breach Litig., In re, 9.3 TaxSlayer, LLC, In re, 8.3.2(c) Taylor & Lieberman v. Federal Ins. Co., 14.5.6, 14.5.11 Theofel v. Farey-Jones, 2.5.3 Thomas v. Early Warning Servs., LLC, 4.3.4(a) Thorpe, Commonwealth v., 2.4.2 TJX Cos. Retail Sec. Breach Litig., In re, 9.3 Travelers Indem. Co. of Am. v. Portal Healthcare Solutions, LLC, 14.6.8 Travelers Indem. Co. of Connecticut v. P.F. Chang’s China Bistro, Inc., 14.6.10 Travelers Prop. Cas. Co. of Am. v. Fed. Recovery Servs., Inc., 14.6.11 True Ultimate Standards Everywhere, Inc., In re, 8.5


TRW Inc. v. Andrews, 4.1.2 Tuteur v. Crosley-Corcoran, 7.3.4 Tyler v. Michaels Stores, Inc., 9.3

U Uber Techs., Inc., In re, 8.5 United States v., see name of party United States Telecom Ass’n v. FCC, 12.2.1(c) Universal Communications Sys., Inc. v. Lycos, Inc., 7.2.3 Unum Group v. Loftus, 10.2.8

V Valle, United States v., 1.3.4, 3.5.3 Verizon v. FCC, 12.2.1(c) Viacom Int’l Inc. v. YouTube Inc., 2.5.9

W Warshak, United States v., 2.5.8, 2.6 Watkins v. L.M. Berry & Co., 2.3.2 WEC Carolina Energy Solutions, LLC v. Miller, 1.3.4, 3.2.4 Willey v. J.P. Morgan Chase, N.A., 9.3 Williams v. Poulos, 2.3.2

Y Yahoo! Inc. Customer Data Sec. Breach Litig., In re, 9.3 Yahoo Mail Litig., In re, 2.3.3 Yath v. Fairview Clinics, 6.6.2(b) Yelp, Inc., United States of Am. (on behalf of the FTC) v., 8.4.4

Z Zacchini v. Scripps-Howard Broad. Co., 1.1 Zeran v. America Online, Inc., 7.2.2 Zurich Am. Ins. Co. v. Sony Corp. of Am., 14.6.6


Table of Statutes, Rules, and References

References are to section numbers of this book, unless otherwise indicated.

FEDERAL Administrative Procedure Act, 9.3 American Recovery and Reinvestment Act of 2009, 1.3.5, Exhibit 8c Atomic Energy Act of 1954 § 11, Exhibit 3A § 11y, Exhibit 3A Children's Internet Protection Act, 7.2.1 Children's Online Privacy Protection Act, 1.3.7, 7.4, 8.1, 8.4.4, 11.2, 11.4, 16.4 Clayton Act, 8.1 Code of Federal Regulations (C.F.R.) 12 C.F.R. pt. 40, 8.3.2(b) pt. 41, 8.3.1(b) pt. 205, 11.1 pt. 216, 8.3.2(b) pt. 222, 8.3.1(b) pt. 226, 11.1 pt. 332, 8.3.2(b) pt. 334, 8.3.1(b) pt. 364, 8.3.1(b) pt. 571, 8.3.1(b) pt. 573, 8.3.2(b) pt. 614, 8.3.1(b) § 716.65, 8.3.2(b) pt. 717, 8.3.1(b) pt. 1016, 8.3.2(b) §§ 1016.1–1016.17, 5.2.2, 8.3.2(b) § 1016.3, 5.3.1, 5.3.3 § 1022.3, 4.3.1 § 1022.123, 8.3.1(b)

§ 1022.140, 8.3.1(b) 16 C.F.R. § 0.17, 8.2 pt. 2, 8.2 § 2.1, 8.2 § 2.7, 8.2 pt. 3, 8.2 § 3.2, 8.2 § 160.03, 13.2 § 164.306, 13.2 § 164.308, 13.2 § 164.310, 13.2 § 164.312, 13.2 § 164.314, 13.2 § 164.410, 13.2 § 164.504, 13.2 pt. 248, 8.3.2(a) § 248.30, 8.3.2(a) pt. 310, 8.4.2 §§ 310.1–310.6, 8.4.2 § 310.3, 8.4.2 § 310.4, 8.4.2, 8.4.3 pt. 312, 7.4, 8.4.4 §§ 312.1–312.13, 8.4.4 § 312.2, 8.4.4 § 312.3, 8.4.4 § 312.5, 8.4.4 § 312.9, 8.4.4 § 312.11, 8.4.4 pt. 313, 8.3.2(b) §§ 313.1–313.18, 8.3.2(b) § 313.3, 8.3.2(a) pt. 314, 1.3.6, 8.3.1(b), 8.3.2(a), 13.4 § 314.1–314.5, 1.3.6 §§ 314.1–314.5, 5.2.1 § 314.3, 5.4.5, 8.3.2(a) § 314.4, 5.4.5, Exhibit 8B pt. 316, 8.4.1 §§ 316.1–316.6, 8.4.1


Code of Federal Regulations (C.F.R.) 16 C.F.R. (cont’d) § 316.3, 8.4.1 § 316.4, 8.4.1 § 316.5, 8.4.1 pt. 318, 1.3.8, 8.3.3, Exhibit 8C pt. 504, 13.2 § 603.2, 8.3.1(b) pt. 611, 8.3.1(b) pt. 642, 8.3.1(a), 8.3.1(b) pt. 680, 8.3.1(b) pt. 681, 4.2.5, 8.3.1(b), 13.3, Exhibit 8A § 681.1, 4.2.5, 8.3.1(b), Exhibit 8A pt. 682, 8.3.1(b) § 682.3, 8.3.1(b) pt. 698, 8.3.1(b) 17 C.F.R. pt. 160, 8.3.2(b) pt. 162, 4.2.5 § 162.30, 4.2.5 pt. 248, 4.2.5, 8.3.2(b) § 248.30, 5.2.1 § 248.201, 4.2.5 31 C.F.R. § 103.121, Exhibit 8A 37 C.F.R. § 201.38, 7.3.3 § 201.40, 7.3.8 § 201.70, 7.3.8 45 C.F.R. pt. 142, Exhibit 15B pt. 160, 1.3.2, 8.3.3, Exhibit 13A pts. 160–164, Exhibit 15B § 160.102, 8.3.3 § 160.103, 6.3, 6.4.1(c), 6.4.4, 8.3.3, Exhibit 8C §§ 160.306–160.312, 8.3.3 § 160.402(c), 6.5.4 pt. 164, 1.3.2, 8.3.3, 16.1, Exhibit 13A § 164.10, 8.3.3 § 164.12, 8.3.3 § 164.14, 8.3.3 § 164.103, Exhibit 13A § 164.300, 6.1 §§ 164.302–164.318, 8.3.3

S–2

§ 164.306, 6.4.1(g), 6.4.2, Exhibit 13A § 164.306(a), 6.4.2 § 164.306(b), 6.4.1(d), 6.4.1(g), 6.4.2 § 164.306(d), 6.4.1(g), 8.3.3 § 164.308, 8.3.3, Exhibits 6A, 13A § 164.310, 8.3.3, Exhibit 6A § 164.312, 8.3.3, Exhibit 6A § 164.314, 8.3.3, Exhibit 13A § 164.316, 8.3.3, Exhibit 13A §§ 164.400–164.414, 1.3.8, 8.3.3 § 164.410, Exhibit 13A §§ 164.500–164.534, 8.3.3 § 164.502, 8.3.3 §§ 164.502–164.514, 16.1 § 164.504, 8.3.3, Exhibit 13A § 164.506, 8.3.3 § 164.508, 8.3.3 § 164.510, 16.1 § 164.520, 8.3.3, 16.1 § 164.522, 8.3.3 § 164.524, 8.3.3, 16.1 § 164.526, 8.3.3, 16.1 § 164.528, 8.3.3, Exhibit 8C § 502(e), 8.3.3 Communication Decency Act, 1.3.7, 7.2, 9.2.3(E), 12.1.1, 12.2.3, 16.2, 16.6 Comprehensive Drug Abuse Prevention and Control Act of 1970 § 413, Exhibit 3A Comprehensive Smokeless Tobacco Health Education Act, 8.1 Computer Fraud and Abuse Act, Ch. 3, 8.4.4, 16.1, 16.2, 16.6 Congressional Record 161 Cong. Rec. 2304 (2015), 3.5 H4410-11 (June 16, 2015), 16.1 Consumer Credit Protection Act, 1.3.3, 9.2 Consumer Credit Reform Act of 1996, 8.3.1(A) Consumer Credit Reporting Reform Act of 1996, 4.1.1 2nd Edition 2018 | MCLE, Inc.

Table of Statutes, Rules, and References

Consumer Financial Protection Act of 2010, 8.3.1(A), 8.3.2(A)

Economic Espionage Act of 1996, 16.1

Consumer Leasing Act of 1976, 8.1

Electronic Communications Privacy Act of 1986 (ECPA), Ch. 2, 1.3.2, 11.2, 11.4, 12.1.1, 12.2.2, 16.2, 16.6

Consumer Reporting Employment Clarification Act of 1998, 8.3.1(A), 9.2 Controlling the Assault of NonSolicited Pornography and Marketing Act of 2003 (CAN-SPAM), 1.3.7, 7.5, 8.4.1, 9.2.3(A), 11.4

Electronic Fund Transfer Act (EFTA), 1.3.3, 8.1 Electronic Signatures in Global and Interstate Commerce Act (E-SIGN), 1.3.4, 16.5

Copyright Act of 1976, 1.2.1(A)

Equal Credit Opportunity Act, 8.1

Counterfeit Access Device and Abuse Act, 3.1

Fair and Accurate Credit Transactions Act of 2003 (FACTA), Ch. 4, 1.3.8, 1.3.9, 8.3.1(A), 8.3.1(B), 9.2, 16.6

Credit and Debit Card Receipt Clarification Act of 2007, 8.3.1(A) Credit Card Accountability Responsibility and Disclosure Act of 2009, 8.3.1(A) Crimes Against Charitable Americans Act of 2001, 8.4.2

Fair Credit Billing Act, 8.1 Fair Credit Reporting Act (FCRA), Ch. 4, 1.3.3, 1.3.8, 1.3.9, 8.1, 8.3.1, 9.2, 9.3, 11.1, 16.4, Exhibit 3a Fair Debt Collection Practices Act, 8.1

Cyber Preparedness Act of 2017, 16.2

Fair Packaging and Labeling Act, 8.1

Cybersecurity Act of 2015, 16.1, 16.2

Farm Credit Act of 1971, Exhibit 3a

Defend Trade Secrets Act of 2016, 1.2.1(A), 12.1.1, 16.1

Federal Cigarette Labeling and Advertising Act, 8.1

Digital Millennium Copyright Act, 1.3.7, 7.2.4, 7.3, 11.3.3

Federal Privacy Act, 9.3

Do-Not-Call Implementation Act of 2003, 8.4.3 Do-Not-Call Improvement Act of 2007, 8.4.3 Do-Not-Call Registry Act of 2003, 8.4.3 Do-Not-Call Registry Fee Extension Act of 2007, 8.4.3 Dodd-Frank Wall Street Reform and Consumer Protection Act, 4.1.1, 5.2, 8.3.1(A)

MCLE, Inc. | 2nd Edition 2018

Federal Register (Fed. Reg.) 29 Fed. Reg. 8,324, 8.5 29 Fed. Reg. 8,355, 8.5 60 Fed. Reg. 40,843, 8.4.2 60 Fed. Reg. 43,865, 8.4.2 64 Fed. Reg. 59,911, 8.4.4 64 Fed. Reg. 59,912, 8.4.4 65 Fed. Reg. 31,740, 8.3.2(b) 65 Fed. Reg. 33,645, 8.3.2(a) 65 Fed. Reg. 33,658, 8.3.2(a) 65 Fed. Reg. 33,677, 8.3.2(b) 65 Fed. Reg. 33,680, 8.3.2(a) 65 Fed. Reg. 35,196, 8.3.2(b) 65 Fed. Reg. 35,206, 8.3.2(b) 65 Fed. Reg. 35,216, 8.3.2(b) 65 Fed. Reg. 35,226, 8.3.2(b) S–3

Data Security and Privacy in Massachusetts

Federal Register (Fed. Reg.) (cont’d) 65 Fed. Reg. 40,334, 8.3.2(a) 65 Fed. Reg. 40,362, 8.3.2(b) 65 Fed. Reg. 40,362–372, 8.3.2(a) 65 Fed. Reg. 82,462–829, 1.3.2, 8.3.3, 16.1 65 Fed. Reg. 82,464–469, 8.3.3 65 Fed. Reg. 82,785, 6.5.4 66 Fed. Reg. 12,434, 8.3.3 66 Fed. Reg. 21,236, 8.3.2(b) 67 Fed. Reg. 18,818, 18,821, 8.4.4 67 Fed. Reg. 36,484, 8.3.2(a), Exhibit 8B 67 Fed. Reg. 36,493–494, 8.3.2(a), Exhibit 8B 67 Fed. Reg. 53,182, 8.3.3 67 Fed. Reg. 53,267, 8.3.3 67 Fed. Reg. 36464, 1.3.6 68 Fed. Reg. 4,580, 4.670, 8.4.2 68 Fed. Reg. 8,334, 8.3.3 68 Fed. Reg. 8,336, 6.4.4 68 Fed. Reg. 8,338, 6.4.1(d) 68 Fed. Reg. 8,341, 6.4.1(d), 6.4.1(g) 68 Fed. Reg. 8,343, 6.4.1(e), 6.4.1(g) 68 Fed. Reg. 8,344, 6.4.4 68 Fed. Reg. 8,345, 6.4.1(f) 68 Fed. Reg. 8,346, 6.4.4 68 Fed. Reg. 8,350, 6.4.4 68 Fed. Reg. 8,356, 6.4.6 68 Fed. Reg. 8,357, 6.4.1(e) 69 Fed. Reg. 16,368, 8.4.2, 16 69 Fed. Reg. 16,373, 8.4.2 69 Fed. Reg. 29,061, 8.3.1(b) 69 Fed. Reg. 29,063–064, 8.3.1(b) 69 Fed. Reg. 68,690, 8.3.1(b) 69 Fed. Reg. 68,697, 8.3.1(b) 69 Fed. Reg. 69,776, 8.3.1(b) 69 Fed. Reg. 69,784–787, 8.3.1(b) 70 Fed. Reg. 3,110, 8.4.1 70 Fed. Reg. 3,127, 8.4.1 70 Fed. Reg. 5,032, 8.3.1(a), 8.3.1(b) 71 Fed. Reg. 21,104, 8.4.4 71 Fed. Reg. 21,106, 8.4.4 72 Fed. Reg. 61,455, 8.3.1(b) 72 Fed. Reg. 63,718, 8.3.1(b), Exhibit 8A

S–4

72 Fed. Reg. 63,771, 8.3.1(b), Exhibit 8A 73 Fed. Reg. 29,654, 8.4.1 73 Fed. Reg. 29,677, 8.4.1 73 Fed. Reg. 51,164, 8.4.2 73 Fed. Reg. 51,204, 8.4.2 74 Fed. Reg. 22,639, Exhibit 8A 74 Fed. Reg. 22,646, Exhibit 8A 74 Fed. Reg. 42,740, 8.3.3 74 Fed. Reg. 42,743–744, 6.6.2(a) 74 Fed. Reg. 42,962, 8.3.3, Exhibit 8C 74 Fed. Reg. 42,980–981, 8.3.3 74 Fed. Reg. 62,890, 8.3.2(b) 75 Fed. Reg. 22,639, 8.3.1(b) 75 Fed. Reg. 22,644–646, 8.3.1(b) 75 Fed. Reg. 48,458, 8.4.2 75 Fed. Reg. 48,516, 8.4.2 76 Fed. Reg. 58,716, 8.4.2 76 Fed. Reg. 79,025, 8.3.2(b) 77 Fed. Reg. 72,712, 8.3.1(b) 77 Fed. Reg. 72,715, 8.3.1(b) 78 Fed. Reg. 3,972, 8.4.4 78 Fed. Reg. 4,008, 8.4.4 78 Fed. Reg. 5,566, 8.3.3, 13.2 78 Fed. Reg. 5,575, 6.4.1(b) 78 Fed. Reg. 5,581, 6.5.4 78 Fed. Reg. 5,582, 6.5.6 78 Fed. Reg. 5,584, 6.5.6 78 Fed. Reg. 34,264, 8.3.3 78 Fed. Reg. 34,266, 8.3.3 80 Fed. Reg. 19,737, 12.2.1(c) 80 Fed. Reg. 77,520, 8.4.2 80 Fed. Reg. 77,559, 8.4.2 81 Fed. Reg. 63,435, 8.3.1(b) 81 Fed. Reg. 87274, 16.2 82 Fed. Reg. 25568, 12.2.1(c) 82 Fed. Reg. 43013, 8.5 Federal Reserve Act § 25, Exhibit 3A Federal Rules of Civil Procedure (Fed. R. Civ. Proc.) Rule 23(e), 2.3.3 Rule 26(b), 2.5.9 Rule 34, 2.5.9 Rule 45, 2.5.9

2nd Edition 2018 | MCLE, Inc.

Table of Statutes, Rules, and References

Federal Trade Commission Acts Amendments of 1994, 8.5

Intelligence Reform and Terrorism Protection Act of 2004, 8.3.1(A)

Federal Trade Commission Improvement Act, 8.1

International Banking Act of 1978 § 1(b), Exhibit 3A

Financial Modernization Act of 1999 See Gramm-Leach-Bliley Act (GLBA)

Magnusson-Moss Warranty-Federal Trade Commission Improvement Act, 8.1

Financial Services Regulatory Relief Act of 2006, 8.3.1(A), 8.3.2(B) Fur Products Labeling Act, 8.1 Genetic Information Nondiscrimination Act of 2008, 8.3.3 Gramm-Leach-Bliley Act (GLBA), Ch. 5, 1.3.6, 8.1, 8.3.2, 9.2.2(B), 9.3, 11.1, 16.4 Hart-Scott-Rodino Antitrust Improvements Act of 1976, 8.1 Health Information Technology Economic and Clinical Health (HITECH) Act, Ch. 6, 1.3.5, 1.3.8, 1.3.9, 8.1, 8.3.3, 16.6 Health Insurance Portability and Accountability Act of 1996 (HIPAA), Ch. 6, 1.3.2, 1.3.5, 1.3.9, 8.3.3, 9.2.1, 9.2.2(C), 10.2.8, 16.1, 16.4, Exhibit 8c Hobby Protection Act, 8.1 House Bills H.R. 387, 16.2 H.R. 584, 16.2 H.R. 1865, 16.2 H.R. 2596, 16.1 H.R. 5616, 3.6.1 House of Representatives Reports H.R. Rep. No. 98-894 at 5, 3.6.1 H.R. Rep. No. 99-647 at 64–65, 2.5.3 H.R. Rep. No. 114-109, pt. 1 at 2, 1.2.2(b) House of Representatives Resolutions H.R. 835, 16.5, Exhibit 16A Identity Theft and Assumption Deterrence Act of 1998, 1.2.2(B) MCLE, Inc. | 2nd Edition 2018

Omnibus Crime Control and Safe Streets Act of 1968, 1.3.2 Patient Protection and Affordable Care Act, 8.3.3 Petroleum Marketing Practices Act, 8.1 Postal Reorganization Act, 8.1 Privacy Act, 1.2.1(A), 16.4 Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act of 2001, 1.2.2(B), 8.3.1(A), 16.1 Public Laws (Pub. L. No.) 73-416, 48 Stat. 1064, 1103-04, 2.1 90-321, 8.3.1(a), 9.2 tit. I, §§ 101–103, 1.3.3 tit. IX, §§ 902–920, 1.3.3 90-351 § 802, 1.3.2 82 Stat. 197, 211-23, 2.1 Title III, § 802, 82 Stat. 212, 1.2.1(a) 91-508, tit. VI, 8.3.1(a), 9.2 § 601, 1.3.3, 8.3.1(a) 93-153, tit. IV, § 408(f), 8.2 93-579, § 3, 88 Stat. 1897, 1.2.1(a) 93-637 tit. II, § 201(a), 8.1 tit. II, § 206(a), 8.2 94-435, tit. II, § 201, 8.1 94-553 § 102(a), 90 Stat. 2544-45 (Oct. 19, 1976), 1.1 § 301, 90 Stat. 2572, 1.1 95-630, Title XX, § 2001, 1.3.3 96-252, § 13, 8.2 S–5

Data Security and Privacy in Massachusetts

Public Laws (Pub. L. No.) (cont’d) 98-473, tit. 22, ch. XXI, § 2102(a), 3.6.2 99-474, tit. XXIX, § 290001, 3.6.3 99-508, 1.3.2 100 Stat. 1848, 2.1 102-537, § 2(b), 8.3.1(a), 9.2 103-297, § 2, 8.4.2 103-312 § 5, 8.3.1(a), 8.5 § 9, 8.3.1(a), 8.5 103-322 tit. XIX, § 290001, 3.6.4 tit. XXX, §§ 300001–300005, 9.2.2(a) 104-191, 1.3.2, 6.2, 8.3.3, 16.1, Exhibits 8C, 13A § 110 Stat. 1936-2103, 1.3.5 § 1173(d), 6.2 § 1179, 6.4.1(b) tit., subtit. F, §§ 261–264, 8.3.3 104-208, div. A, tit. II, subtit. D, ch. 1, 8.3.1(a), 9.2 104-294, tit. II, 3.6.5 105-277, div. C, tit. XIII, §§ 1301– 1306, 8.4.4 105-318, 1.2.2(b), 8.3 § 3, 8.3 § 5, 8.3 105-347, 8.3.1(a), 9.2 106-102 § 101, 5.1 §§ 501–527, 1.3.6, 8.3.2 §§ 522–527, 8.3.2(c) 106-229, § 114 Stat. 464, 1.3.4 106-346, § 101(a), 9.2.2(a) 106-554, 7.2.1 107-56, 1.2.2(b) § 808(2)(i), 3.6.6 tit. X, § 1101, 8.4.2 107-296, § 225(b)(2), 3.6.7 107-347, 13.5 108-10, §§ 2–4, 8.4.3 108-82, 8.4.3 108-159, 1.3.8, 8.3.1(a), 8.3.1(b), 9.2 108-177, tit. III, subtit. D, § 361(j), 8.3.1(a) 108-187, 8.4.1 S–6

§§ 2–15, 9.2.3(a) 108-187, §§ 2–15, 117, 1.3.7 108-458, tit. VI, subtit. C, § 6203(1), 8.3.1(a) 109-2, § 4, 9.3 109-177, tit. I (2006) § 116(c), 8.3.1(a) § 118(b), 8.3.1(a) 109-178, § 4(c)(2), 8.3.1(a) 109-351 tit. VI, § 609, 8.3.2(b) tit. VII § 719, 8.3.1(a) § 728, 8.3.2(b) 110-161, div. D, tit. VII, § 743, 8.3.1(a) 110-187, § 2, 8.4.3 110-188 § 2, 8.4.3 § 3, 8.4.3 110-233, tit. I, § 105(a), 8.3.3 110-241, § 3, 8.3.1(a) 110-326 § 204, 3.6.8 § 205, 3.6.8 § 206, 3.6.8 § 207, 3.6.8 § 208, 3.6.8 § 209, 3.6.8 111-5, 8.3.3, Exhibits 13A, 15B § 13410(d), 6.5.6 Div. A, tit. XIII, 8.3.3 Div. A, Title XIII, 1.3.5 111-24 tit. II, § 205, 8.3.1(a) tit. III, § 302, 8.3.1(a) 111-148 tit. I, subtit. B, § 1104(b), 8.3.3 tit. X, subtit. A, § 10109, 8.3.3 111-203 tit. X, subtit. A–F, §§ 1011–1067, 8.3.1(a) tit. X, subtit. H, § 1088(a), 8.3.1(a) 111-319, 8.3.1(a) § 2, 8.3.1(b) 113-283, 13.5 114-23, 1.2.2(b) 114-113, 129 Stat. 2935, 16.2 2nd Edition 2018 | MCLE, Inc.

Table of Statutes, Rules, and References

Public Laws (Pub. L. No.) (cont’d) 114-153 § 2(b)(3), 130 Stat. 380, 1.2.1(a) § 2(g), 130 Stat. 182, 12.1.1 115-22, 131 Stat. 88, 16.2 Red Flag Program Clarification Act of 2010, 8.3.1(A), 8.3.1(B) Regulation E, 11.1 Regulation Z, 11.1 Restore Online Shoppers’ Confidence Act, 8.1 Robinson-Patman Act, 8.1 Securities Exchange Act of 1934 § 15, Exhibit 3A Senate Bills S. 1693, 16.2 Senate Reports S. Rep. 99-541, U.S. Code & Cong. News 3555, 1.2.1(a) Senate Reports (S. Rep. No.) 99-132 at 6, 1986 U.S.C.C.A.N. at 2484-85, 3.6.3 104-357, 3.6.5 105-190,105th Cong., 7.3.1 Sherman Act, 8.1 Stored Communications Act, 1.2.1(A), 2.5, 8.4.4, 9.3, 12.1.1, 12.2.2, 16.1, 16.2, 16.4 Telemarketing and Consumer Fraud and Abuse Prevention Act, 8.1, 8.4.2 Telephone Consumer Protection Act (TCPA), 8.4.2, 11.4 Telephone Disclosure and Dispute Resolution Act of 1992, 8.1 Truth in Lending Act, 1.3.3, 8.1, 11.1 United States Code (U.S.C.) 5 U.S.C. § 101, Exhibit 3A § 552a, 1.2.1(a) MCLE, Inc. | 2nd Edition 2018

11 U.S.C. § 332, 15.3 § 363, 15.3 12 U.S.C. § 601 et seq., 5.5.4 § 611 et seq., 5.5.4 § 1751 et seq., 5.5.4 § 1813(z), 5.2.3 § 1818, 5.5.4 § 1843(k), 5.3.1 § 3401 et seq., 5.4.1 § 5565, 5.6.2(a) 15 U.S.C. §§ 1–11, 8.1 §§ 12–27, 8.1 § 15a, 8.5 § 18a, 8.1 § 21, 6.5.6, 8.1 § 41, 8.2 §§ 41–58, 8.1 § 43, 8.2 § 44, 8.5 § 45, 6.5.6, 8.1 § 45(a), 8.1 § 45(b), 8.2 § 45(c), 8.2 § 45(l), 8.2 § 45(m), 8.2 § 45(n), 8.3.1(a), 8.5 § 46(a), 8.2 § 46(c), 8.1 § 46(d), 8.1 § 46(e), 8.1 § 47, 8.1 § 53(b), 8.2 § 57a, 8.3.1(a), 8.4.2, 8.4.4, 8.5, Exhibit 8C § 57b, 8.2 § 57b-1, 8.2 §§ 68–68j, 8.1 §§ 69–69j, 8.1 § 77a et seq., 5.5.3 § 77aaa et seq., 5.5.3 § 78a et seq., 5.5.3 § 78aaa et seq., 5.5.3 § 78c(a)(47), 8.3.2(c)

United States Code (U.S.C.) 15 U.S.C. (cont’d) § 80a-1 et seq., 5.5.3 § 80b-1 et seq., 5.5.3 § 80b et seq., 5.5.3 § 681b, 8.3.1(a) §§ 1333–1340, 8.1 §§ 1451–1461, 8.1 §§ 1601–1666, 11.1 §§ 1601–1693r, 8.1 § 1602(n), Exhibit 3A § 1643, 8.4.2 §§ 1666–1666j, 8.1 §§ 1667–1667e, 8.1 § 1671b, 8.3.1(a) § 1681 et seq., 5.4.1, Exhibit 3A §§ 1681–1681t, 8.1 §§ 1681–1681v, 8.3, 8.3.1(a), 9.2 §§ 1681–1681x, 4.1.1, 8.3 § 1681a, 4.2.4, 4.2.8, 4.3.1, 4.3.3, 4.3.3(b), 4.3.3(c), 5.4.1, 8.3.1(a), 8.3.1(b) § 1681b, 4.2.3, 4.2.4, 4.2.8, 8.3.1(a), 8.3.1(b) § 1681c, 4.2.1, 4.2.7, 8.3.1(b) § 1681c-1, 4.3.3, 4.3.3(a), 4.3.3(b), 4.3.3(c), 4.3.3(d), 8.3.1(b), Exhibit 8A § 1681c-2, 4.3.4(a), 4.3.4(b), 8.3.1(b) § 1681d, 8.3.1(a) § 1681e, 4.1.2, 4.2.8, 4.2.9 § 1681g, 4.2.2, 4.3.2, 8.3.1(a), 8.3.1(b) § 1681h, 8.3.1(a) § 1681i, 4.1.2, 4.3.4(a), 8.3.1(a) § 1681j, 4.1.2, 8.3.1(b) § 1681m, 4.1.1, 4.2.4, 4.2.5, 4.2.6, 4.3.2, 4.3.4(b), 8.3.1(a), 8.3.1(b), 13.3, Exhibit 8A § 1681n, 4.1.2, 4.2.1, 4.2.4, 4.2.7, 4.2.8, 4.3.3(e), 4.3.4(a) § 1681o, 4.1.2, 4.2.1, 4.2.4, 4.2.7, 4.2.8, 4.3.3(e), 4.3.4(a) § 1681s, 4.3.4(a), 8.3.1(a), 8.3.1(b) § 1681s-2, 4.1.2, 4.3.4(b), 8.3.1(b), Exhibit 8A § 1681s-3, 8.3.1(b)

§ 1681t, 4.1.2, 4.2.1, 4.2.2, 4.2.4, 4.2.5, 4.2.6, 4.2.7, 4.3.2, 4.3.3(e), 4.3.4(a), 4.3.4(b) § 1681w, 4.1.1, 8.3.1(a), 8.3.1(b), 9.3 § 1681x, 8.3.1(b) §§ 1691–1691f, 8.1 § 1692, 4.3.2 §§ 1692–1692o, 8.1 §§ 1693–1693r, 1.3.3, 8.1, 11.1 §§ 2101–2106, 8.1 §§ 2301–2312, 8.1, 11.1 §§ 2821–2824, 8.1 §§ 4401–4408, 8.1 §§ 5701–5724, 8.1 § 6081 et seq., 8.3.1(b) §§ 6101–6108, 8.1, 8.4.2 § 6102, 8.4.2 § 6105, 8.4.2 § 6107, 8.4.2 § 6151, 8.4.3 §§ 6152–6154, 8.4.3 § 6154, 8.4.3 § 6155, 8.4.3 § 6501, 8.4.4 §§ 6501–6506, 1.3.7, 7.4, 8.1, 8.4.4 § 6502, 8.4.4 § 6503, 8.4.4 § 6505, 8.4.4 § 6801, 8.3.2(a), 8.3.2(b), 13.4 §§ 6801–6809, 8.3, 8.3.2 §§ 6801–6827, 8.1 § 6801(b), 5.1, 5.4 § 6802, 5.1, 5.4.1, 5.4.2, 5.4.3, 5.4.4, 8.3.2(b) § 6803, 5.4.1, 8.3.2(b) § 6804, 5.4.1, 5.4.2, 8.3.2(b) § 6805, 8.3.2(a), 8.3.2(b) § 6809, 5.3.3, 8.3.2(a) § 6821, 5.5.5 §§ 6821–6827, 8.3, 8.3.2, 8.3.2(c) § 6821(a), 5.1, 5.5.1, 5.5.2, 8.3.2(c) § 6821(b), 8.3.2(c) § 6821(c), 5.5.3, 8.3.2(c) § 6821(d), 5.5.3, 8.3.2(c) § 6821(e), 5.5.3, 8.3.2(c) § 6821(f), 5.5.3, 8.3.2(c) § 6821(g), 5.5.3, 8.3.2(c)

United States Code (U.S.C.) 15 U.S.C. (cont’d) § 6822, 5.5.4, 8.3.2(c) § 6823(a), 5.5.5 § 6823(b), 5.5.5 § 6825, 5.2.3, 8.3.2(c) § 6827, 8.3.2(c) §§ 7001–7006, 11.2 § 7002(a), 11.2 § 7002(a)(1), 1.3.4 § 7201 et seq., 5.5.3 §§ 7701–7713, 1.3.7, 7.5, 8.4.1, 9.2.3(a) § 7702, 7.5.1 § 7704, 8.4.1 § 7706, 8.4.1, 9.2.3(a) § 7707, 9.2.3(a) § 7708, 8.4.1, 9.2.3(a) § 7711, 8.4.1, 9.2.3(a) § 7712, 8.4.1 §§ 8401–8405, 8.1 17 U.S.C. § 102(a), 1.1 § 107, 11.3.3 § 512, 1.3.7, 7.3, 7.3.4, 7.3.6, 11.3.3 § 1201, 7.3.8 §§ 1201–1205, 7.3 § 1202, 7.3.7, 7.3.8 §§ 1301–1322, 7.3 18 U.S.C., 5.5.5 § 1028, 1.2.2(b), 8.3 § 1028(a), 8.3 § 1029, Exhibit 3A § 1029(e), 4.3.1 § 1030, 1.3, 3.1, 3.6.2, 3.6.7, 14.5.1, 16.2, Exhibit 3A § 1030(a), 3.3.1, 3.4 § 1030(a)(1), 3.3.1 § 1030(a)(2), 3.3.2, 3.5.4 § 1030(a)(3), 3.3.3 § 1030(a)(4), 3.3.4 § 1030(a)(5), 3.3.5 § 1030(a)(6), 3.3.6 § 1030(a)(7), 3.3.7 § 1030(c)(4), 3.4 § 1030(e)(1), 3.2.2 § 1030(e)(2), 3.2.1, 3.6.5, 8.4.1

§ 1030(e)(6), 3.2.4 § 1030(e)(8), 3.2.5 § 1030(e)(11), 3.2.5 § 1030(g), 3.4 § 1037, 8.4.1 § 1591, 16.2 § 1595, 16.2 § 1833, 12.1.1 § 1839(5), 1.2.1(a) § 1839(5)(A), 1.2.1(a) § 1839(5)(B), 1.2.1(a) § 2258A, 2.5.6(b) § 2510, 2.2, 2.3.1, 2.3.2, 2.4.1, 2.5.2, 2.5.3, 2.5.5, 12.1.1 §§ 2510–2520, 1.2.1(a), 1.3.2, 2.1 §§ 2510–2522, 2.2 § 2511, 2.3.1, 2.3.2, 2.5.6(b) § 2516, 2.3.1 § 2517, 2.5.6(b) § 2518, 2.3.1, Exhibit 2A § 2519, 2.3.1 § 2520, 2.3.1 § 2701, 2.5.2 §§ 2701–2709, 16.4 §§ 2701–2710, 1.2.1(a) §§ 2701–2711, 12.1.1, 12.2.2 §§ 2701–2712, 2.2 § 2702, 2.5.3, 2.5.4, 2.5.6, 2.5.6(b), 2.5.6(c), 2.5.9, 9.3 § 2703, 2.5.2, 2.5.3, 2.5.5, 2.5.6, 2.5.6(a), 2.5.6(b), 2.5.6(c), 2.5.7 § 2711, 2.5.3, 2.5.4, 12.1.1 § 2721, 9.2.2(a) §§ 2721–2725, 9.2.2(a) § 2722, 9.2.2(a) § 2723, 9.2.2(a) § 2724, 9.2.2(a) § 2725, 9.2.2(a) §§ 3121–3127, 2.2 § 3121(d), 2.2 § 3123, 2.2 § 3127, 2.2 § 3571, 5.5.5 20 U.S.C. § 1232g, 6.4.1(a) § 6777, 7.2.1 § 6801, 7.2.1

United States Code (U.S.C.) 20 U.S.C. (cont’d) § 9134, 7.2.1 21 U.S.C. § 853, Exhibit 3A 28 U.S.C. § 1332, 9.3 § 1407, 1.1 § 4001, 7.3 31 U.S.C. § 5318, Exhibit 8A 35 U.S.C. § 287(a), 11.3.3 39 U.S.C. § 3009, 8.1 42 U.S.C. § 300gg-91(c)(1), 6.4.1(b) § 1320d, 8.3.3, Exhibit 8C § 1320d-5, 6.5.1, 6.5.6 § 1320d et seq., 9.2.1 § 1983, 2.4.1 § 2014(y), Exhibit 3A § 17921, 8.3.3 §§ 17921–17954, 8.3.3 §§ 17924–17954, 1.3.8 § 17931, 8.3.3 § 17932, 8.3.3 § 17934, 8.3.3 § 17937, 8.1, 8.3, 8.3.3 § 17938, 8.3.3 § 17942, 8.3.3 § 17954, 8.3 44 U.S.C. §§ 3551–3558, 13.5 47 U.S.C. § 151, 12.2.1(a) § 151 et seq., 12.2.1(a) § 153, 9.2.3(e), 12.2.1(b) §§ 201–276, 12.2.1(a) § 222, 12.2.1(d) § 230, 7.2.2, 9.2.3(e), 12.1.1, 16.2, 16.4 § 230(c)(1), 1.3.7, 12.2.3 § 254, 7.2.1 § 605, 2.1 United States Constitution Bill of Rights, 1.2.1(a) First Amendment, 2.4.1, 7.2.1, 8.4.2, 9.2.3(e) Fourth Amendment, 1.3, 2.1, 2.5.1, 2.5.7, 2.5.8, 2.6, 9.1.1 Fifth Amendment, 1.3, 9.1.1

Uniting and Strengthening America by Fulfilling Rights and Ensuring Effective Discipline Over Monitoring (USA FREEDOM) Act of 2015, 1.2.2(B), 8.4.2, 16.1 USA PATRIOT Improvement and Reauthorization Act of 2005, 8.3.1(A) Video Privacy Protection Act, 16.4 Wiretap Act, 2.3 Wool Products Labeling Act of 1939, 8.1

MASSACHUSETTS Code of Massachusetts Regulations (C.M.R.) 201 C.M.R. § 17.00, 10.2.1, 10.4, 10.4.2(b), 10.4.2(c), 13.6, 14.7.9, 15.5, Exhibits 10A, 10B, 10C, 10D, 15B § 17.00 et seq., 10.1 § 17.01, 12.2.4, Exhibits 10A, 10C § 17.02, 10.2.1, 12.2.4, Exhibit 10C § 17.03, 9.2.2(c), 10.4.1, 10.4.2, 12.2.4, 13.6, Exhibit 10C § 17.04, 9.2.2(c), 13.6, Exhibit 10C § 17.05, Exhibit 10C House Bills H. 1985, 16.4 Massachusetts Acts and Resolves (Mass. Acts) 1973 Mass. Acts c. 941, 9.1.1 1974 Mass. Acts c. 193, § 1, 9.1.1 1992 Mass. Acts 31, 9.2.3(d) 1996 Mass. Acts 298, § 11, 9.2.3(d) 1997 Mass. Acts 238, 9.2.3(d) 2000 Mass. Acts 164, § 1, 9.2.3(d) 2007 Mass. Acts c. 82 §§ 3–15, 9.2.1 § 16, 9.2.1, 9.2.2(c), 10.1 § 17, 9.2.2(c) 2010 Mass. Acts 92, 9.2.3(d) § 5, 9.2.3(d) § 10, 9.2.3(d)

Massachusetts Acts and Resolves (Mass. Acts) 2010 Mass. Acts 92 (cont’d) § 12, 9.2.3(d) 2013 Mass. Acts 38, § 72, 9.2.3(d) Massachusetts Constitution Art. 14, 2.5.7, 2.6, 9.1.1 Massachusetts General Laws (G.L. c.) c. 71, § 37O, 9.2.3(d) c. 93 § 50, 9.2.1 § 54A, 4.1.2 § 56, 9.2.1, 10.2.5(a) § 58, 9.2.1 § 62A, 4.3.3(f), 9.2.1, 10.2.5(a) § 63, 9.2.1 § 105, 9.3 c. 93A, 7.5.3, 8.5, 9.3, 10.2.6, 10.2.8 § 2, 7.5.3, 8.5 § 9, 8.5 § 11, 8.5 c. 93H, 10.1, 10.2, 10.2.6, 10.2.8, 10.2.9, 10.3.1, 10.4, 10.4.1, Exhibits 10C, 15B § 1, 10.2.2 §§ 1–6, 9.2.1 § 1(a), 10.2.1, 10.2.2, 10.2.5(a) § 2(a), 9.2.2(c) § 3, 10.2.2, 10.2.3, Exhibit 10B § 3(a), 10.2.4 § 3(b), 9.2.1, 10.2.4, 10.2.5(a) § 4, 10.2.4 § 6, 9.2.1, 9.3 c. 93I, 10.1, 10.2.1, 10.3.1, 10.4.1, Exhibit 10B § 2, 9.2.2(c) c. 209A, 9.2.3(d) c. 214, § 1B, 9.1.1 c. 265 § 43, 9.2.3(d) § 43A, 9.2.3(d) § 50, 7.2.6 c. 269, § 14A, 9.2.3(d) c. 272 § 99, 2.4, 2.4.1

§ 99(B), 2.4.1, 2.4.2 Massachusetts Wiretap Act, 2.4.1 Senate Bills S. 95, 16.4

OTHER STATES Alabama 2001 Ala. Acts 312 § 11, 9.2.1 § 12, 9.2.1 2006 Ala. Acts 611, 9.2.2(b) 2012 Ala. Acts 432, § 5, 9.2.3(b) 2017 Ala. Acts 414, § 1, 9.2.3(e) Ala. Code § 13A-6-72 (2017), 9.2.3(e) § 13A-8-114 (2015), 9.2.3(b) § 13A-8-200 (2017), 9.2.1 § 13A-8-201 (2017), 9.2.1 § 41-13-6 (2015), 9.2.2(b) Alaska 2003 Alaska Sess. Laws 14, § 2, 9.2.3(a) 2005 Alaska Sess. Laws 97 § 1, 9.2.3(b) § 2, 9.2.3(b) § 3, 9.2.3(b) 2007 Alaska Sess. Laws 24, § 6, 9.2.3(b) 2008 Alaska Sess. Laws 92 § 4, 9.2.1, 9.2.2(c) art. 1, 9.2.2(b) art. 2, 9.2.2(b) art. 3, 9.2.2(b) art. 4, 9.2.2(b) 2010 Alaska Sess. Laws 18, § 9, 9.2.3(b) 2011 Alaska Sess. Laws 20 § 8, 9.2.3(b) § 12, 9.2.3(e) Alaska Stat. § 11.41.452 (2016), 9.2.3(b) § 11.61.116, 9.2.3(e) § 11.61.128 (2016), 9.2.3(b) §§ 11.61.116–11.61.120 (2017), 9.2.3(e) § 42.20.300, 2.4.1 §§ 45.45.792–45.45.798 (2016), 9.2.3(b)

Alaska Alaska Stat. (cont’d) §§ 45.48.010–45.48.995 (2014), 9.2.1 § 45.48.090, 9.2.1 §§ 45.48.100–45.48.290, 9.2.1 §§ 45.48.400–45.48.480, 9.2.1 §§ 45.48.400–45.48.480 (2017), 9.2.2(b) § 45.48.400(a), 9.2.2(b) § 45.48.430(a), 9.2.2(b) § 45.48.480(b), 9.2.2(b) § 45.48.500 (2016), 9.2.2(c) §§ 45.48.500–45.48.590, 9.2.1 § 45.48.510 (2016), 9.2.2(c) §§ 45.48.600–45.48.690, 9.2.1 § 45.48.750, 9.2.1 § 45.50.479 (2016), 9.2.3(a) Arizona 2003 Ariz. Sess. Laws 229, § 2, 9.2.3(a) 2005 Ariz. Sess. Laws 136, § 1, 9.2.3(b) 2006 Ariz. Sess. Laws 183, § 2, 9.2.2(b) 208, § 1, 9.2.2(c) 232 § 1, 9.2.1 § 3, 9.2.1 2007 Ariz. Sess. Laws 23, § 1, 9.2.1 2016 Ariz. Sess. Laws 80 § 1, 9.2.1 § 3, 9.2.1 § 3E, 9.2.3(b) § 15, 9.2.1 § 102, 9.2.1 102, § 2, 9.2.2(c) 2017 Ariz. Sess. Laws 97, § 2, 16.5 Ariz. Rev. Stat. § 13-1425, 9.2.3(e) §§ 18-501–18-504 (2015), 9.2.3(b) § 18-502, 9.2.3(b) § 18-504, 9.2.3(b) § 18-545, 9.2.1 §§ 44-1372–44-1372.05 (2017), 9.2.3(a)

§ 44-1372.01, 9.2.3(a) § 44-1372.02, 9.2.3(a) § 44-1373 (2017), 9.2.2(b) § 44-7061, 16.5 §§ 44-7301–44-7304 (2015), 9.2.3(b) § 44-7501 (2015), 9.2.1 § 44-7601 (2017), 9.2.2(c) Arkansas 2003 Ark. Acts 1019, § 1, 9.2.3(a) 2005 Ark. Acts 1295, § 1, 9.2.2(b) 1526, § 1, 9.2.1, 9.2.2(c) 2255, § 1, 9.2.3(b) 2013 Ark. Acts 998, § 1, 9.2.3(c) 1480, § 1, 9.2.3(c) 2015 Ark. Acts 304, § 2, 9.2.3(e) Ark. Code Ann. § 4-86-107 (2017), 9.2.2(b) §§ 4-88-601–4-88-607 (2017), 9.2.3(a) § 4-88-603, 9.2.3(a) § 4-88-606, 9.2.3(a) §§ 4-110-101–4-110-108 (2017), 9.2.1 § 4-110-103(7), 9.2.1 § 4-110-104 (2017), 9.2.2(c) §§ 4-111-101–4-111-105 (2015), 9.2.3(b) § 5-26-314 (2017), 9.2.3(e) § 6-60-104 (2017), 9.2.3(c) § 11-2-124 (2017), 9.2.3(c) California 1992 Cal. Stat. 564, § 1, 9.2.3(a) 1998 Cal. Stat. 863, § 2, 9.2.3(a) 865, § 1, 9.2.3(a) 2000 Cal. Stat. 1039, § 1, 9.2.2(c) 2001 Cal. Stat. 720, § 7, 9.2.2(b) 2002 Cal. Stat. 664, § 42, 9.2.2(b) 699, § 1, 9.2.3(a) 915 § 2, 9.2.1 § 4, 9.2.1

California (cont’d) 2002 Cal. Stat. 1054, § 4, 1.3.7 2003 Cal. Stat. 487 § 1, 9.2.3(a) § 3, 9.2.3(a) 532, § 1, 9.2.2(b) 829, § 3, 9.2.2(d) 1054, § 2, 9.2.1 2004 Cal. Stat. 183 § 14, 9.2.3(a) § 15, 9.2.3(a) § 35, 9.2.2(b) 282, § 1, 9.2.2(b) 843, § 2, 1.3.7, 9.2.3(b) 2005 Cal. Stat. 437, § 1, 1.3.7, 9.2.3(b) 711, § 1, 9.2.3(a) 2007 Cal. Stat. 627 § 1, 9.2.2(b) § 2, 9.2.2(b) § 8, 9.2.2(b) 699 § 4, 9.2.1 § 6, 9.2.1 2008 Cal. Stat. 179, § 36, 9.2.2(b) 2009 Cal. Stat. 134, § 2, 9.2.2(c) 552 § 1, 9.2.2(b) § 2, 9.2.2(b) 2011 Cal. Stat. 197 § 1, 9.2.1 § 2, 9.2.1 2012 Cal. Stat. 618, § 1, 9.2.3(c) 619, § 2, 9.2.3(c) 2013 Cal. Stat. 76, § 142, 9.2.3(c) 103, § 1, 9.2.2(b) 390, § 1, 9.2.2(d) 395, § 1, 9.2.1 396 § 1.5, 9.2.1

§ 2, 9.2.1 466, § 1, 9.2.3(e) 2014 Cal. Stat. 71, § 125, 9.2.3(e) 855, § 3, 9.2.2(b) 875, § 2, 9.2.1 2015 Cal. Stat. 522 § 1, 9.2.1 § 2, 9.2.1 532 § 1, 9.2.1 § 2.3, 9.2.1 543 § 1.3, 9.2.1 2016 Cal. Stat. 86 § 20, 9.2.1 § 21, 9.2.1 337 § 1, 9.2.1 § 2, 9.2.1 654, § 1.4, 9.2.3(e) 724, § 1.3, 9.2.3(e) 734, § 1.4, 9.2.3(e) Cal. [Bus. & Prof.] Code §§ 17529–17529.9 (2017), 9.2.3(a) § 17529.2, 9.2.3(a) § 17529.4, 9.2.3(a) § 17529.5, 9.2.3(a) § 17538.41 (2017), 9.2.3(a) § 17538.45 (2017), 9.2.3(a) § 22575, 9.2.2(d) § 22575 et seq., 9.3 §§ 22575–22579 (2015), 9.2.2(d) § 22577, 9.2.2(d) § 22947, 9.2.3(b) §§ 22947–22947.6 (2017), 1.3.7, 9.2.3(b) § 22947.2, 9.2.3(b) § 22947.3, 9.2.3(b) §§ 22948–22948.3 (2015), 1.3.7, 9.2.3(b) § 22948.2, 9.2.3(b) § 22948.3 (2014), 9.2.3(b) Cal. [Civ.] Code §§ 1748.01–1748.94, 9.3 § 1748.08, 9.3 § 1748.70, 9.3

California Cal. [Civ.] Code (cont’d) § 1770(a), 9.3 § 1785.25(a), 4.1.2 § 1798.29, 9.2.1 § 1798.80 et seq., 9.3 § 1798.81 (2017), 9.2.2(c) § 1798.82, 1.3.7, 9.2.1, 13.1 § 1798.84, 9.3 § 1798.85 (2017), 9.2.2(b) § 1798.89 (2017), 9.2.2(b) § 1798.90.5, 9.2.1 Cal. [Com.] Code § 19526.5 (2017), 9.2.2(b) Cal. Const. art. I, § 1, 9.1.1 Cal. [Educ.] Code § 99120, 9.2.3(c) § 99121 (2017), 9.2.3(c) § 99122, 9.2.3(c) Cal. [Gov’t] Code §§ 27300–27307 (2017), 9.2.2(b) Cal. [Lab.] Code § 980 (2017), 9.2.3(c) Cal. Penal Code § 647(j)(4) (2017), 9.2.3(e) California Unfair Competition Law, Cal. [Bus. & Prof.] Code § 17200, 9.3 Colorado 2004 Colo. Sess. Laws 1959, § 2, 9.2.2(c) 2006 Colo. Sess. Laws 274, § 1, 9.2.2(b) 2008 Colo. Sess. Laws 536, § 1, 9.2.1 593, § 1, 9.2.3(a) 2010 Colo. Sess. Laws 419, § 8, 9.2.2(b) 2064, § 9, 9.2.1 2013 Colo. Sess. Laws 195, § 1, 9.2.3(c) 2014 Colo. Sess. Laws 283, § 1, 9.2.3(e) Colo. Rev. Stat. § 6-1-702.5 (2017), 9.2.3(a) § 6-1-713 (2017), 9.2.2(c) § 6-1-715 (2017), 9.2.2(b) § 6-1-716 (2017), 9.2.1 § 8-2-127 (2016), 9.2.3(c) § 18-7-107 (2017), 9.2.3(e) § 18-7-108 (2017), 9.2.3(e)

Connecticut 1999 Conn. Acts 160, § 1, 9.2.3(a) 2003 Conn. Acts 128, § 2, 9.2.3(a) 156, § 13, 9.2.2(b) 2005 Conn. Acts 148, § 3, 9.2.1 2006 Conn. Acts 50, § 1, 9.2.3(b) 2008 Conn. Acts 167, § 1, 9.2.2(c) 2009 Conn. Acts 71, § 1, 9.2.2(c) 239, § 13, 9.2.2(c) 2009 Conn. Acts 239, §§ 13–19, 9.2.2(b) 2012 Conn. Acts 1, § 130, 9.2.1 2014 Conn. Acts 213, § 8, 9.2.3(e) 2015 Conn. Acts 6, § 1, 9.2.3(c) 142, § 6, 9.2.1 2016 Conn. Acts 169, § 21, 9.2.3(c) 2017 Conn. Acts 137, § 1, 9.2.2(c) Conn. Gen. Stat. § 31-40x (2017), 9.2.3(c) § 36a-701b (2017), 9.2.1 § 42-110g, 9.3 §§ 42-470–42-472d (2017), 9.2.2(b) § 42-471 (2017), 9.2.2(c) § 52-570c (2014), 9.2.3(a) § 53-451 (2017), 9.2.3(a) § 53-454 (2017), 9.2.3(b) § 53a-189c (2017), 9.2.3(e) Delaware 19 Del. Code Ann. § 709A (2017), 9.2.3(c) 29 Del. Code Ann. §§ 9014C–9019C (2015), 9.2.2(d) §§ 9033C–9020C, 9.2.2(d) 72 Del. Laws 135, § 1 (1999), 9.2.3(a) 297, § 1 (2000), 9.2.1 73 Del. Laws 53, § 1 (2001), 9.2.2(b) 74 Del. Laws 127, § 1 (2003), 9.2.2(d) 425, § 1 (2004), 9.2.1 75 Del. Laws 61, § 1 (2005), 9.2.1 88, § 21(13) (2005), 9.2.2(d) 328, § 1 (2006), 9.2.1 78 Del. Laws 354, § 1 (2012), 9.2.3(c)

Delaware (cont’d) 79 Del. Laws 109, § 1 (2013), 9.2.1 260, § 7 (2014), 9.2.1 80 Del. Laws 146, § 1 (2015), 9.2.3(c) 2014 Del. Laws 415, § 1, 9.2.3(e) 2017 Del. Laws 79, § 11, 9.2.3(e) 86, § 7, 16.5 Del. Code Ann. tit. 6 § 12B-102 (2017), 9.2.1 §§ 2201–2204 (2017), 9.2.1 tit. 8, § 224, 16.5 tit. 11 § 854 (2017), 9.2.1 § 937 (2017), 9.2.3(a) § 938 (2017), 9.2.3(a) § 1335 (2017), 9.2.3(e) tit. 14, §§ 8101–8104 (2017), 9.2.3(c) tit. 18, § 535 (2017), 9.2.2(b) District of Columbia 2014 D.C. Stat. 20-275, 9.2.3(e) D.C. Code § 28-3852 (2017), 9.2.1 § 28-3853 (2014), 9.3 D.C. Code Ann. §§ 22-3051–22-3057, 9.2.3(e) D.C. Law 16-237, § 2 (2007), 9.2.1 Florida 1999 Fla. Laws ch. 335, § 1, 9.2.1 2001 Fla. Laws ch. 233, § 1, 9.2.1 2003 Fla. Laws ch. 71, § 1, 9.2.1 2004 Fla. Laws ch. 233, § 1, 9.2.3(a) 2005 Fla. Laws ch. 2, § 132, 9.2.3(a) ch. 229 § 1, 9.2.1 § 2, 9.2.1 2006 Fla. Laws ch. 151, § 8, 9.2.1 ch. 232 § 2, 9.2.3(a) § 3, 9.2.3(a) ch. 232, § 1, 9.2.3(b)

2010 Fla. Laws ch. 209, § 41, 9.2.1 2014 Fla. Laws ch. 189, § 3, 9.2.1 ch. 190, § 1, 9.2.1 ch. 200, § 2, 9.2.1 ch. 221 § 9, 9.2.1 § 16, 9.2.1 ch. 289 § 4, 9.2.1 § 5, 9.2.1 2015 Fla. Laws ch. 24, § 1, 9.2.3(e) ch. 166, § 17, 9.2.1 Fla. Stat. § 282.0041 (2017), 9.2.1 § 282.381 (2017), 9.2.1 § 501.171 (2017), 9.2.1 §§ 668.701–668.705 (2017), 9.2.3(b) § 784.049 (2017), 9.2.3(e) § 817.568 (2017), 9.2.1 § 817.5681 (2015), 9.2.1 Fla. Stat. Ann. §§ 668.60–668.610 (2017), 9.2.3(a) § 668.603, 9.2.3(a) Georgia 1996 Ga. Laws 1505, § 1, 9.2.3(a) 2002 Ga. Laws 551 § 2, 9.2.1 § 8, 9.2.2(c) 2005 Ga. Laws 46, § 4, 9.2.3(a) 127, § 1, 9.2.3(b) 851, § 1, 9.2.1 2006 Ga. Laws 562, § 5, 9.2.1 603, 9.2.2(b) 2007 Ga. Laws 103, § 16, 9.2.3(b) 241, §§ 4–6, 9.2.1 450 § 2, 9.2.1 § 3, 9.2.1 2008 Ga. Laws 568, § 1, 9.2.1

Georgia 2008 Ga. Laws (cont’d) 570, § 1, 9.2.1 2010 Ga. Laws 501, § 1, 9.2.1 2011 Ga. Laws 252, § 1, 9.2.1 2013 Ga. Laws 334, § 1, 9.2.1 2014 Ga. Laws 519, § 1, 9.2.3(e) 611 § 1, 9.2.1 § 2, 9.2.1 2015 Ga. Laws 9, § 16, 9.2.3(e) 187, § 11, 9.2.2(c) Ga. Code Ann. § 10-1-393.8 (2017), 9.2.2(b) §§ 10-1-910–10-1-915 (2017), 9.2.1 § 10-1-911, 9.2.1 § 10-1-914, 9.2.1 § 10-1-914.1, 9.2.1 § 10-15-2 (2017), 9.2.2(c) § 16-9-93.1 (2017), 9.2.3(a) §§ 16-9-100–16-9-107 (2017), 9.2.3(a) § 16-9-101, 9.2.3(a) §§ 16-9-120–16-9-132 (2017), 9.2.1 § 16-9-124.1, 9.2.1 § 16-9-129, 9.2.1 § 16-9-130, 9.2.1 § 16-9-152 (2017), 9.2.3(b) § 16-11-62, 2.4.1 § 16-11-90 (2017), 9.2.3(e) § 46-5-214 (2017), 9.2.1 Hawaii 2006 Haw. Sess. Laws 135, § 2, 9.2.1 136, § 2, 9.2.2(c) 137, § 2, 9.2.1, 9.2.2(b) 2008 1st Sp. Sess. 10, § 5, 9.2.1 2008 Haw. Sess. Laws 10, § 683, 9.2.1, 9.2.2(b) 19 § 69, 9.2.1 § 70, 9.2.1 § 72, 9.2.2(c) 2012 Haw. Sess. Laws 191, § 2, 9.2.1, 9.2.2(b)

2013 Haw. Sess. Laws 225, § 3, 9.2.1, 9.2.2(b) 2014 Haw. Sess. Laws 116, § 1, 9.2.3(e) Haw. Rev. Stat. §§ 487J-1–487J-3 (2017), 9.2.1, 9.2.2(b) §§ 487N-1–487N-4 (2017), 9.2.1 § 487N-2(d)(1), 9.2.1 § 487N-3(b), 9.3 § 487R-2 (2017), 9.2.2(c) § 487R-3 (2017), 9.2.2(c) § 711-1110.9(1)(b) (2017), 9.2.3(e) § 712-1210, 9.2.3(e) Idaho 2000 Idaho Sess. Laws 423, § 1, 9.2.3(a) 2002 Idaho Sess. Laws 5, § 1, 9.2.2(b) 2006 Idaho Sess. Laws 258, § 1, 9.2.1 2008 Idaho Sess. Laws 177 § 1, 9.2.1 § 2, 9.2.1, 9.2.2(b) 2010 Idaho Sess. Laws 170, § 1, 9.2.1 2014 Idaho Sess. Laws 97, § 13, 9.2.1 173, § 1, 9.2.3(e) 2015 Idaho Sess. Laws 141, § 151, 9.2.1 Idaho Code § 14-1334 (2017), 9.2.2(b) § 18-6609(2)(b) (2017), 9.2.3(e) §§ 28-51-104–28-51-107 (2017), 9.2.1 §§ 28-52-101–28-52-109 (2017), 9.2.1 § 28-52-108 (2017), 9.2.2(b) § 48-603E (2017), 9.2.3(a) Illinois 5 Ill. Comp. Stat. 175/1-101–175/99-1, 1.3.4 177/1–177/15 (2017), 9.2.2(d) 179/1–179/55 (2017), 9.2.2(b) 179/10, 9.2.2(b) 179/30, 9.2.2(b) 720 Ill. Comp. Stat. 5/11-23.5 (2017), 9.2.3(e) 5/17-51 (2017), 9.2.3(a) 5/17-52.5 (2017), 9.2.3(b)

Illinois (cont’d) 740 Ill. Comp. Stat. 7/1–7/15 (2015), 9.2.3(b) 7/10, 9.2.3(b) 7/15, 9.2.3(b) 14/1–14/30 (2017), 9.2.2(b) 815 Ill. Comp. Stat. 511/1–511/15 (2017), 9.2.3(a) 511/10, 9.2.3(a) 530/1–530/40 (2017), 9.2.1 530/20, 9.3 530/40 (2017), 9.2.2(c) 530/45, 9.2.1 530/50, 9.2.1 820 Ill. Comp. Stat. 55/10 (2015), 9.2.3(c) 75/5–75/20 (2017), 9.2.3(c) 1989 Ill. Laws 762, § 1, 9.2.3(a) 1999 Ill. Laws 90-759, 1.3.4 233 §§ 1–15, 9.2.3(a) § 900, 9.2.3(a) 2003 Ill. Laws §§ 1–15, 9.2.2(d) 199, § 5, 9.2.3(a) 2005 Ill. Laws 36, §§ 1–20, 9.2.1 947, § 5, 9.2.1 2007 Ill. Laws 326, § 5, 9.2.3(a) 350, §§ 1–15, 9.2.3(b) 942, § 5, 9.2.3(b) 994, 9.2.2(b) 2009 Ill. Laws 874, 9.2.2(b) 1000, § 600, 9.2.3(a) 1551 § 5.5, 9.2.3(b) §§ 17–51, 9.2.3(a) 2011 Ill. Laws 483, 9.2.1 § 5, 9.2.2(c) 875, § 5, 9.2.3(c) 2013 Ill. Laws 129, §§ 5–20, 9.2.3(c) 501, § 5, 9.2.3(c)

1138, § 5, 9.2.3(e) 2015 Ill. Laws 460 § 10, 9.2.3(c) § 15, 9.2.3(c) 610, § 10, 9.2.3(c) 775, § 20, 9.2.3(a) 2016 Ill. Laws 503, 9.2.1 Indiana 1996 Ind. Acts 89, 9.2.2(a) 2001 Ind. Acts 180, § 1, 9.2.1 2003 Ind. Acts 22, § 2, 9.2.1 36, § 1, 9.2.3(a) 2004 Ind. Acts 97, § 91, 9.2.3(a) 2005 Ind. Acts 115, § 1, 9.2.3(b) 2006 Ind. Acts 125, 9.2.2(a), 9.2.2(b) § 6, 9.2.1 § 9, 9.2.1 2007 Ind. Acts 2, § 317, 9.2.3(a) 2008 Ind. Acts 104, § 2, 9.2.2(a) 136, § 2, 9.2.1 2009 Ind. Acts 137 § 5, 9.2.2(c) § 14, 9.2.1 § 15, 9.2.1 2012 Ind. Acts 132, § 6, 9.2.3(a) 2016 Ind. Acts 183, 9.2.2(a) 2017 Ind. Acts 76, § 4, 9.2.2(c) Ind. Code § 4-33-5-1.5 (2017), 9.2.2(a) §§ 9-14-3.5-1–9-14-3.5-15 (2015), 9.2.2(a) §§ 24-4-14-1–24-4-14-8 (2016), 9.2.2(b) § 24-4-14-7, 9.2.2(b) §§ 24-4.8-1-1–24-4.8-3-2 (2016), 9.2.3(b) § 24-4.8-2-2, 9.2.3(b) § 24-4.8-2-3, 9.2.3(b) § 24-4.8-3-1, 9.2.3(b) § 24-4.9-1, 9.2.1 §§ 24-4.9-1-1–24-4.9-5.1 (2017), 9.2.1 § 24-4.9-3-3.5 (2016), 9.2.2(c)

Indiana Ind. Code (cont’d) §§ 24-5-22-1–24-5-22-10 (2016), 9.2.3(a) § 24-5-22-7, 9.2.3(a) § 24-5-22-8, 9.2.3(a) § 24-5-22-10, 9.2.3(a) § 35-43-5-3.5 (2017), 9.2.1 § 35-43-5-3.8, 9.2.1 Iowa 1996 Iowa Acts 108, § 3, 9.2.2(a) 1102, § 1, 9.2.2(a) 1998 Iowa Acts 1035, § 1, 9.2.2(a) 1999 Iowa Acts 47, § 1, 9.2.1 198, § 4, 9.2.2(a) 2000 Iowa Acts 1133 § 2, 9.2.2(a) § 18, 9.2.2(a) 2001 Iowa Acts 90, § 1, 9.2.2(a) 2005 Iowa Acts §§ 1–8, 9.2.3(a) 18, § 2, 9.2.1 94, § 4, 9.2.3(b) 2008 Iowa Acts 1154 § 1, 9.2.1 § 2, 9.2.1 1172, § 17, 9.2.2(b) 2013 Iowa Acts 90 § 191, 9.2.3(b) § 192, 9.2.3(b) 2014 Iowa Acts 1062, §§ 1–4, 9.2.1 2017 Iowa Acts 117, § 2, 9.2.3(e) Iowa Code § 321.11 (2017), 9.2.2(a) § 321.11A(2) (2017), 9.2.2(b) § 708.7(1)(a)(5) (2017), 9.2.3(e) § 714.16B (2017), 9.2.1 § 715.4 (2017), 9.2.3(b) §§ 715C.1–715C.2 (2017), 9.2.1 § 715C.1(11), 9.2.1 § 715C.2(5)(a), 9.2.1 §§ 716A.1–716A.2 (2017), 9.2.3(a) § 716A.2, 9.2.3(a) § 716A.6, 9.2.3(a)

Kansas 1999 Kan. Sess. Laws 144, § 1, 9.2.2(b) 2000 Kan. Sess. Laws 147, § 46, 9.2.2(b) 2002 Kan. Sess. Laws 140, § 1, 9.2.3(a) 2006 Kan. Sess. Laws 149 § 1, 9.2.1, 9.2.2(b) § 2, 9.2.1, 9.2.2(b) § 3, 9.2.1 § 4, 9.2.1, 9.2.2(b) § 6, 9.2.2(b) §§ 11–13, 9.2.2(b) § 14, 9.2.1, 9.2.2(b), 9.2.2(c) § 15, 9.2.1 2016 Kan. Sess. Laws 7, § 7, 9.2.2(c) 96, § 5, 9.2.3(e) Kan. Stat. Ann. § 21-6101(a)(8) (2017), 9.2.3(e) § 40-2404 (2017), 9.2.2(b) § 50-6,107 (2017), 9.2.3(a) §§ 50-7a01–50-7a04 (2017), 9.2.1 § 50-7a03 (2013), 9.2.2(c) § 50-669A (2017), 9.2.2(b) § 75-3520 (2017), 9.2.2(b) Kentucky 2000 Ky. Acts 428, § 1, 9.2.2(b) 2006 Ky. Acts 42, § 5, 9.2.2(c) 101, § 1, 9.2.2(b) 2009 Ky. Acts 100, § 8, 9.2.3(b) 2014 Ky. Acts 74, §§ 1–4, 9.2.1 84, § 1, 9.2.1 2016 Ky. Acts 132, § 1, 9.2.2(b) Ky. Rev. Stat. Ann. §§ 61.931–61.934 (2017), 9.2.1 § 365.725 (2017), 9.2.2(c) § 365.732 (2017), 9.2.1 § 402.100 (2017), 9.2.2(b) § 434.697 (2014), 9.2.3(b) Louisiana 2003 La. Acts 534, § 1, 9.2.2(b) 1275, § 1, 9.2.3(a) 2005 La. Acts 499, § 1, 9.2.1

Louisiana (cont’d) 2006 La. Acts 201, § 1, 9.2.3(b) 392, § 1, 9.2.3(b) 549, § 1, 9.2.3(b) 2014 La. Acts 165, 9.2.3(c) 2015 La. Acts 231, § 1, 9.2.3(e) La. Admin. Code §§ 55:III.551–55:III.565 (2017), 9.2.2(a) La. Rev. Stat. § 9:224(a)(6) (2017), 9.2.2(b) § 14:283.2 (2017), 9.2.3(e) § 32:409.1 (2017), 9.2.2(b) § 37:23 (2017), 9.2.2(b) § 51:1421 (2017), 9.2.2(b) La. Rev. Stat. Ann. §§ 51:1741–51:1741.3, 9.2.3(a) §§ 51:1951–51:1955 (2017), 9.2.3(c) § 51:1952, 9.2.3(c) §§ 51:2001–51:2004 (2017), 9.2.3(a) § 51:2002, 9.2.3(a) § 51:2003, 9.2.3(a) § 51:2004, 9.2.3(a) §§ 51:2006–51:2014 (2017), 9.2.3(b) §§ 51:2021–51:2025 (2017), 9.2.3(b) § 51:2022, 9.2.3(b) § 51:2023, 9.2.3(b) § 51:2024, 9.2.3(b) §§ 51:2031–51:2034 (2017), 9.2.3(b) § 51:2033, 9.2.3(b) § 51:2034, 9.2.3(b) §§ 51:3071–51:3077 (2015), 9.2.1 § 51:3075, 9.2.1, 9.3 Maine 2005 Me. Laws 379, § 1, 9.2.1 384, §§ 1–5, 9.2.1 2007 Me. Laws 634, § 1, 9.2.1 2009 Me. Laws 161, § 1, 9.2.1 2015 Me. Laws 339, § 1, 9.2.3(e) 343, § B-1, 9.2.3(c) 2016 Me. Laws 394, § 3, 9.2.3(e) 410, § A-1, 9.2.3(e) Me. Rev. Stat. tit. 17-A, § 511-A(1) (2017), 9.2.3(e)

tit. 26, §§ 616–618 (2017), 9.2.3(c) Me. Rev. Stat. Ann. § 1272 (2017), 9.2.2(b) § 1272-B (2017), 9.2.2(b) tit. 10 §§ 1346–1350-A (2017), 9.2.1 § 1347(6), 9.2.1 § 1350-B (2017), 9.2.1 § 1497 (2017), 9.2.3(a) Maryland 2002 Md. Laws 323, § 1, 9.2.3(a) 324, § 1, 9.2.3(a) 2005 Md. Laws 521, § 1, 9.2.2(b) 2007 Md. Laws 531, 532, 9.2.1, 9.2.2(c) 2010 Md. Laws 452, § 1, 9.2.2(a), 9.2.2(b) 2012 Md. Laws 233, 9.2.3(c) 234, 9.2.3(c) 2013 Md. Laws 43, § 5, 9.2.1 224, § 1, 9.2.3(c) 2014 Md. Laws 583, 9.2.3(e) 2015 Md. Laws 465 (May 12, 2015), 9.2.3(c) 466 (May 12, 2015), 9.2.3(c) Md. Code Ann. [Com. Law] §§ 14-2901–14-2903, 9.2.3(a) §§ 14-3001–14-3003 (2017), 9.2.3(a) § 14-3002, 9.2.3(a) § 14-3003, 9.2.3(a) § 14-3402 (2017), 9.2.2(b) §§ 14-3501–14-3508 (2017), 9.2.1 § 14-3502 (2017), 9.2.2(c) § 14-3503 (2017), 9.2.2(c) § 14-3504(g), 9.2.1 § 14-3508 (2014), 9.3 § 14-3508 (2017), 9.2.1 §§ 22-101–22-816, 11.2 Md. Code Ann. [Crim. Law] § 3-809(c) (2017), 9.2.3(e) Md. Code Ann. [Cts. & Jud. Proc.] § 1205 (2017), 9.2.2(a), 9.2.2(b) Md. Code Ann. [Educ.] § 26-401, 9.2.3(c)

Maryland (cont’d) Md. Code Ann. [Lab. & Empl.] § 3-712 (2017), 9.2.3(c) Md. Code Ann. [Real Prop.] § 3-111 (2014), 9.2.2(b) § 3-111 (2017), 9.2.2(a) Md. Code Ann. [State Gov’t] § 2-1804 (2017), 9.2.2(a), 9.2.2(b) § 8-504 (2017), 9.2.2(a), 9.2.2(b) Michigan 1997 Mich. Pub. Acts 100, 101, 9.2.2(a) 2003 Mich. Pub. Acts 42, §§ 1–8, 9.2.3(a) 2004 Mich. Pub. Acts 452 §§ 1–17, 9.2.1 §§ 2–6, 9.2.2(b) 2006 Mich. Pub. Acts 246, 566, 9.2.1 566, 9.2.2(c) 2010 Mich. Pub. Acts 315, 9.2.1 318, 9.2.1 § 7a, 9.2.3(b) 2012 Mich. Pub. Acts 478, §§ 1–8, 9.2.3(c) 2016 Mich. Pub. Acts 89, 9.2.3(e) Mich. Comp. Laws §§ 37.271–37.278 (2017), 9.2.3(c) § 37.272, 9.2.3(c) §§ 257.208a–257.208d (2017), 9.2.2(a) § 257.232 (2017), 9.2.2(a) § 380.1135 (2015), 9.2.2(b) §§ 445.61–445.77 (2015), 9.2.1 § 445.67, 9.2.1 § 445.67a (2017), 9.2.3(b) § 445.72, 9.2.1 § 445.72(6)(c), 9.2.1 § 445.72a (2017), 9.2.2(c) § 445.83 (2017), 9.2.2(b) § 445.86(2) (2017), 9.2.2(b) §§ 445.2501–445.2508 (2017), 9.2.3(a) § 445.2503, 9.2.3(a) § 445.2504, 9.2.3(a) § 445.2505, 9.2.3(a) § 445.2508, 9.2.3(a)

§ 750.145e, 9.2.3(e) § 750.539a, 2.4.1 § 750.539k (2017), 9.2.2(b) Minnesota 2002 Minn. Laws 395 § 1, 9.2.3(a) §§ 1-1–1-9, 9.2.2(d) 2003 Minn. Laws 136, art. 17, § 35, 9.2.3(b) 2005 Minn. Laws 163, § 85, 9.2.2(b) 167, § 1, 9.2.1 2006 Minn. Laws 212, art. 1 § 17, 9.2.1 § 23, 9.2.1 233 § 7, 9.2.1 § 8, 9.2.1 253, § 19, 9.2.2(b) 2007 Minn. Laws 129, § 55, 9.2.2(b) 2008 Minn. Laws 333, § 1, 9.2.2(b) 2016 Minn. Laws 126, §§ 1–2, 9.2.3(e) Minn. Stat. § 13.386 (2017), 9.2.2(b) § 168.346 (2017), 9.2.2(a), 9.2.2(b) § 325E.59 (2017), 9.2.2(b) § 325E.61 (2017), 9.2.1 § 325F.675 (2017), 9.2.2(b) § 325F.694 (2017), 9.2.2(d), 9.2.3(a) §§ 325M.01–325M.09 (2017), 9.2.2(d) § 325M.02, 9.2.2(d) § 325M.03, 9.2.2(d) § 325M.04, 9.2.2(d) § 325M.05, 9.2.2(d) § 325M.07, 9.2.2(d) § 604.30 (2017), 9.2.3(e) § 604.31 (2017), 9.2.3(e) § 609.527 (2017), 9.2.1 § 609.527(5a) (2017), 9.2.3(b) Mississippi 2010 Miss. Laws 489, § 1, 9.2.1 Miss. Code Ann. § 75-24-29 (2017), 9.2.1

Missouri 2000 Mo. Laws 763, 9.2.3(a) 2003 Mo. Laws 61, 9.2.2(b) 228, 9.2.3(a) 2005 Mo. Laws 353, 9.2.2(b) 2009 Mo. Laws HB 62, 9.2.1 2012 Mo. Laws 1318, 9.2.2(b) Mo. Rev. Stat. § 32.055 (2017), 9.2.2(a) §§ 407.1120–407.1132 (2017), 9.2.3(a) § 407.1123, 9.2.3(a) § 407.1129, 9.2.3(a) §§ 407.1135–407.1141 (2017), 9.2.3(a) § 407.1138, 9.2.3(a) § 407.1355 (2017), 9.2.2(b) § 407.1500 (2014), 9.2.1 § 407.1500.1(9), 9.2.1 § 407.1500(4)(a), 9.2.1 Montana 2001 Mont. Laws 363, §§ 1–10, 9.2.2(a) 2005 Mont. Laws 55, § 1, 9.2.1 518 § 2, 9.2.1, 9.2.2(c) § 3, 9.2.1, 9.2.2(c) § 5, 9.2.1 § 6, 9.2.1, 9.2.2(c) § 7, 9.2.1 2007 Mont. Laws 138, §§ 1–11, 9.2.1 180, § 3, 9.2.1 195, § 2, 9.2.1 276 § 1, 9.2.3(b) § 4, 9.2.1 2009 Mont. Laws 163 § 2, 9.2.2(b) § 4, 9.2.1 2011 Mont. Laws 42 § 1, 9.2.1 § 2, 9.2.1 2015 Mont. Laws 74, § 3, 9.2.1 263, § 1, 9.2.3(c)

Mont. Code Ann. § 2-6-502 (2017), 9.2.2(b) § 2-6-504 (2017), 9.2.1 § 30-14-1702 (2017), 9.2.1 § 30-14-1703 (2017), 9.2.1, 9.2.2(c) § 30-14-1704 (2017), 9.2.1 § 30-14-1712 (2017), 9.2.3(b) § 30-14-1721 (2017), 9.2.1, 9.2.2(c) § 30-14-1722 (2017), 9.2.1, 9.2.2(c) §§ 30-14-1726–30-14-1736 (2017), 9.2.1 § 39-2-3 (2017), 9.2.3(c) § 46-24-220 (2017), 9.2.1 §§ 61-11-501–61-11-516 (2017), 9.2.2(a) Nebraska 1997 Neb. Laws 635, §§ 1–13, 9.2.2(a) 2006 Neb. Laws 876, §§ 1–7, 9.2.1 2016 Neb. Laws 385, §§ 27–29, 9.2.1 821, §§ 1–11, 9.2.3(c) Neb. Rev. Stat. §§ 48-3501–48-3511 (2017), 9.2.3(c) § 48-3503, 9.2.3(c) § 48-3506, 9.2.3(c) § 48-3507, 9.2.3(c) § 48-3511, 9.2.3(c) §§ 60-2901–60-2913 (2017), 9.2.2(a) §§ 87-801–87-807 (2017), 9.2.1 § 87-802(5), 9.2.1 Nevada 1997 Nev. Stat. 341, §§ 2–8, 9.2.3(a) 1999 Nev. Stat. 530, § 20, 9.2.3(a) 2001 Nev. Stat. 274, § 7, 9.2.3(a) 2003 Nev. Stat. 12, § 1, 9.2.3(a) 2005 Nev. Stat. 485 § 22, 9.2.2(c) § 23, 9.2.2(c) § 24, 9.2.1 486, § 5, 9.2.3(b) 2009 Nev. Stat. 355, § 1, 9.2.2(c) 2011 Nev. Stat. 331, § 4, 9.2.1 354, § 6, 9.2.2(c) 2013 Nev. Stat. 548, § 2, 9.2.3(c)

Nevada (cont’d) 2015 Nev. Stat. 399, §§ 2–6.5, 9.2.3(e) 2017 Nev. Stat. 391, § 3, 16.5 Nev. Admin. Code §§ 481.550–481.600 (2016), 9.2.2(a) Nev. Rev. Stat. §§ 41.705–41.735 (2017), 9.2.3(a) § 41.730 (2014), 9.2.3(a) § 205.492 (2017), 9.2.3(a) § 205.4737 (2017), 9.2.3(b) § 603A.200 (2017), 9.2.2(c) § 603A.210 (2017), 9.2.2(c) § 603A.215 (2017), 9.2.2(c) § 603A.220 (2017), 9.2.1 § 719.090, 16.5 Nev. Rev. Stat. Ann. § 200.604 (2017), 9.2.3(e) § 200.780 (2017), 9.2.3(e) § 242.183 (2017), 9.2.1 § 613.135 (2017), 9.2.3(c) New Hampshire 1996 N.H. Laws 295, 9.2.2(a) 2005 N.H. Laws 238:1, 9.2.3(b) 2006 N.H. Laws 242, 9.2.1 2014 N.H. Laws 305, § 1, 9.2.3(c) 2015 N.H. Laws 270, § 1, 9.2.3(c) 2016 N.H. Laws 126, 9.2.3(e) N.H. Code Admin. R. [Dep’t Safety] 5601.01–5608.02 (2015), 9.2.2(a) N.H. Rev. Stat. § 189:70 (2017), 9.2.3(c) N.H. Rev. Stat. Ann. § 260:14 (2017), 9.2.2(a) §§ 275:73–275:75 (2015), 9.2.3(c) §§ 359-C:19–359-C:21 (2017), 9.2.1 § 359-C:20(IV)(a), 9.2.1 § 359-C:21, 9.2.1, 9.3 §§ 359-H:1–359-H:7 (2015), 9.2.3(b) § 644:9-a (2017), 9.2.3(e) New Jersey 1997 N.J. Laws 166, § 2, 9.2.2(a) 2005 N.J. Laws 99, § 1, 9.2.2(b) 226 §§ 10–15, 9.2.1 § 11, 9.2.2(c) 266, § 13, 9.2.2(b)

2012 N.J. Laws 75, §§ 1–4, 9.2.3(c) 2013 N.J. Laws 155, §§ 1–6, 9.2.3(c) 2015 A. 3146, 9.2.1 2016 N.J. Laws 2, 9.2.3(e) N.J. Admin. Code § 13:45F-4.1 (2017), 9.2.2(b) N.J. Stat. §§ 18A:3-29–18A:3-31 (2017), 9.2.3(c) §§ 34:6B-5–34:6B-10 (2017), 9.2.3(c) § 39:2-3.4 (2017), 9.2.2(a) § 56:8-166, 9.3 N.J. Stat. Ann. § 47:1-16 (2017), 9.2.2(b) §§ 56:8-161–56:8-166 (2017), 9.2.1 § 56:8-162 (2017), 9.2.2(c) § 56:8-164 (2017), 9.2.2(b), 9.2.2(c) § 56:8-165 (2017), 9.2.2(c) § C.2A:58D-1 (2017), 9.2.3(e) § C.2C:14-9(c) (2017), 9.2.3(e) New Mexico 2003 N.M. Laws 168 § 2, 9.2.3(a) § 3, 9.2.3(a) 169, §§ 1–3, 9.2.2(b) 2005 N.M. Laws 127, § 2, 9.2.2(b) 296, § 1, 9.2.3(b) 2009 N.M. Laws 95, § 3, 9.2.3(b) 2013 N.M. Laws 222, § 1, 9.2.3(c) 223, § 1, 9.2.3(c) 2015 N.M. Laws 42, § 1, 9.2.3(e) N.M. Stat. § 21-1-46(A)–(C) (2017), 9.2.3(c) N.M. Stat. Ann. § 30-16-24.1 (2017), 9.2.3(b) § 30-37A-1 (2017), 9.2.3(e) § 57-12-23, 9.2.3(a) §§ 57-12-23–57-12-24 (2017), 9.2.3(a) § 57-12-24, 9.2.3(a) § 57-12C-2, 9.2.1 § 57-12C-4, 9.2.1 § 57-12C-5, 9.2.1

New Mexico N.M. Stat. Ann. (cont’d) §§ 57-12B-1–57-12B-3 (2017), 9.2.2(b) § 57-12B-3 (2017), 9.2.2(b) § 57-12B-4 (2017), 9.2.2(b) §§ 57-12C-1–57-12C-12 (2017), 9.2.1 §§ 57-12C-6–57-12C-12, 9.2.1 New York 2000 N.Y. Laws 214, § 1, 9.2.2(b) 2001 N.Y. Laws 578, § 1, 9.2.2(d) 2002 N.Y. Laws 17, § 1, 9.2.2(d) 619, § 8, 9.2.1 2005 N.Y. Laws 442 § 3, 9.2.1 § 4, 9.2.1 491 §§ 2–5, 9.2.1 §§ 5–7, 9.2.1 2006 N.Y. Laws 63, § 2, 9.2.1 64, § 1, 9.2.3(b) 65, § 1, 9.2.1, 9.2.2(c) 414, § 1, 9.2.3(b) 2007 N.Y. Laws 679, § 1, 9.2.2(c) 2008 N.Y. Laws 279 § 2, 9.2.1 § 6, 9.2.2(b) 406, § 1, 9.2.1 516 § 1, 9.2.1, 9.2.2(c) § 2, 9.2.1, 9.2.2(c) 2011 N.Y. Laws 27 (Part A), 9.2.1 62 § 36 (Part A), 9.2.1 § 43, 9.2.1 2013 N.Y. Laws 44, § 5 (Part N), 9.2.1 55, § 6 (Part N), 9.2.1 N.Y. Educ. Law § 2-b (2017), 9.2.2(b) N.Y. Gen. Bus. Law § 380-s (2017), 9.2.1

§ 380-t (2017), 9.2.1 § 390-b (2017), 9.2.3(b) § 399-h (2017), 9.2.1, 9.2.2(c) § 899-aa (2017), 9.2.1 § 899-aaa (2017), 9.2.2(c) § 899-bbb (2017), 9.2.2(c) N.Y. Lab. Law § 203-d (2017), 9.2.2(b) N.Y. Rules and Regulations Code §§ 200.1–200.22, 16.5 N.Y. State Tech. Law §§ 201–208 (2017), 9.2.2(d) § 208 (2017), 9.2.1 North Carolina 1997 N.C. Sess. Laws 443, § 32.25, 9.2.2(a) 1999 N.C. Sess. Laws 212, §§ 1–4, 9.2.3(a) 2000 N.C. Sess. Laws 125 § 3, 9.2.3(a) § 7, 9.2.3(a) 2002 N.C. Sess. Laws 157, § 1, 9.2.3(a) 2005 N.C. Sess. Laws 414, § 1, 9.2.1, 9.2.2(b), 9.2.2(c) 2006 N.C. Sess. Laws 158, § 1, 9.2.1, 9.2.2(b) 173, § 2, 9.2.2(b) 2007 N.C. Sess. Laws 534, § 2, 9.2.1 2009 N.C. Sess. Laws 355 § 1, 9.2.1, 9.2.2(b) § 2, 9.2.1, 9.2.2(b) 550, § 5, 9.2.1, 9.2.2(b) 551, § 2, 9.2.3(a) 573, § 10, 9.2.1, 9.2.2(b) 2012 N.C. Sess. Laws 142, § 6A.7A, 9.2.1 149, § 4, 9.2.3(d) 2015 N.C. Sess. Laws 250, 9.2.3(e) 2016 N.C. Sess. Laws 90, § 10(b), 9.2.2(a) N.C. Gen. Stat. § 1-75.4 (2017), 9.2.3(a) § 1-539.2A (2017), 9.2.3(a) § 14-113.20(b), 9.2.1 § 14-190.5A (2017), 9.2.3(e) § 14-453 (2017), 9.2.3(a)

North Carolina N.C. Gen. Stat. (cont’d) § 14-458 (2017), 9.2.3(a) § 14-458.2 (2017), 9.2.3(d) § 20-43.1 (2014), 9.2.2(a) §§ 75-60–75-65 (2017), 9.2.1, 9.2.2(b) § 75-61(10), 9.2.1 § 75-62, 9.2.2(b) § 75-62 (2017), 9.2.2(c) § 75-64 (2017), 9.2.2(c) § 75-65(d)(1), 9.2.1 § 75-66 (2017), 9.2.1 § 132-1.10 (2017), 9.2.2(b) North Dakota 1997 N.D. Laws 349, § 2, 9.2.2(a) 1999 N.D. Laws 128, § 1, 9.2.2(b) 2001 N.D. Laws 355, § 2, 9.2.2(a) 2003 N.D. Laws 382 § 1, 9.2.2(b) § 9, 9.2.2(b) 439, § 1, 9.2.3(a) 2005 N.D. Laws 116, § 1, 9.2.2(b) 447, § 3, 9.2.1 448, § 1, 9.2.1 2007 N.D. Laws 388, § 10, 9.2.2(b) 436, § 2, 9.2.3(a) 438, § 1, 9.2.1 439, § 1, 9.2.1 2013 N.D. Laws 105, § 1, 9.2.2(b) 106 § 1, 9.2.2(b) § 2, 9.2.1 § 3, 9.2.1 107, § 1, 9.2.2(b) 2015 N.D. Laws 106, 9.2.3(e) 352 § 1, 9.2.1 § 2, 9.2.1 N.D. Cent. Code § 11-18-23 (2017), 9.2.2(b) § 12.1-17-07.2 (2017), 9.2.3(e)

§ 12.1-23-11 (2017), 9.2.2(b) § 32-03-58 (2017), 9.2.3(e) § 39-33-02 (2017), 9.2.2(a) § 44-04-28 (2017), 9.2.2(b) §§ 51-27-01–51-27-10 (2017), 9.2.3(a) § 51-27-03, 9.2.3(a) § 51-27-04, 9.2.3(a) § 51-27-05, 9.2.3(a) § 51-27-06, 9.2.3(a) § 51-30-01, 9.2.1 §§ 51-30-01–51-30-07 (2017), 9.2.1 §§ 51-31-01–51-31-05 (2014), 9.2.1 §§ 51-33-1–51-33-14 (2017), 9.2.1 §§ 51-34-01–51-34-07 (2014), 9.2.1 SB 2214, 9.2.1 Ohio 1991 Ohio Laws HB 20, § 1, 9.2.1, 9.2.2(b) 1993 Ohio Laws HB 266, § 1, 9.2.1, 9.2.2(b) 1995 Ohio Laws HB 353, 9.2.2(a) 1999 Ohio Laws SB 7, § 1, 9.2.2(b) 2001 Ohio Laws 8, § 1, 9.2.3(a) 2003 Ohio Laws 204, § 1, 9.2.3(a) 361, § 1, 9.2.3(a) 383, § 1, 9.2.3(a) 2005 Ohio Laws 241, § 1, 9.2.3(a) HB 48, § 1, 9.2.1 HB 104, § 1, 9.2.1 HB 279, § 1, 9.2.2(b) SB 126, § 1, 9.2.1 2007 Ohio Laws HB 46, § 1, 9.2.1, 9.2.2(b) 2010 Ohio Laws 86, § 1, 9.2.3(a) 2011 Ohio Laws 360, § 1, 9.2.3(a) Ohio Admin. Code § 4501:1-12-02 (2017), 9.2.2(a) Ohio Rev. Code Ann. § 109.94 (2017), 9.2.1 § 149.45 (2017), 9.2.2(b) § 317.082 (2017), 9.2.2(b) § 1349.17 (2017), 9.2.1, 9.2.2(b) § 1349.18 (2017), 9.2.1, 9.2.2(b) §§ 1349.19–1349.21 (2017), 9.2.1

Ohio Ohio Rev. Code Ann. (cont’d) § 1349.52 (2015), 9.2.1 § 2307.64 (2017), 9.2.3(a) § 2913.49 (2017), 9.2.2(b) § 2913.421 (2017), 9.2.3(a) § 4501.27 (2017), 9.2.2(a) Oklahoma 2006 Okla. Sess. Laws 56 § 1, 9.2.3(a), 9.2.3(b) § 2, 9.2.3(a), 9.2.3(b) §§ 3–7, 9.2.3(b) §§ 4–7, 9.2.3(a) 183, §§ 1–11, 9.2.1 2007 Okla. Sess. Laws 167, § 1, 9.2.2(b) 2008 Okla. Sess. Laws 86, §§ 1–6, 9.2.1 2014 Okla. Sess. Laws 315, § 1, 9.2.3(c) 2016 Okla. Sess. Laws 262, 9.2.3(e) Okla. Stat. tit. 15 §§ 776.1–776.6 (2017), 9.2.3(b) §§ 776.1–776.7 (2017), 9.2.3(a) § 776.6, 9.2.3(a) § 776.7, 9.2.3(a) §§ 776.8–776.11 (2017), 9.2.3(a), 9.2.3(b) tit. 21 § 1040.13b (2017), 9.2.3(e) § 1533.1 (2017), 9.2.2(b) tit. 22, § 19b (2017), 9.2.1 tit. 24 §§ 151–159 (2017), 9.2.1 §§ 161–166 (2017), 9.2.1 tit. 40, § 173.2 (2017), 9.2.3(c) tit. 74 § 3111, 9.2.1 § 3113, 9.2.1 § 3113.1, 9.2.1 Oregon 1997 Or. Laws 678, §§ 2–10, 9.2.2(a) 2003 Or. Laws 610, § 3, 9.2.2(b) 2007 Or. Laws 759 §§ 1–14, 9.2.1 § 11, 9.2.2(b), 9.2.2(c) § 12, 9.2.2(c)

2013 Or. Laws 204, § 2, 9.2.3(c) 408 § 1, 9.2.3(c) § 2, 9.2.3(c) 415, §§ 1–6, 9.2.1 2015 Or. Laws 121, § 1, 9.2.3(b) 229, § 1, 9.2.3(c) 357 §§ 1–4, 9.2.1 § 3, 9.2.2(c) 379, § 1, 9.2.3(e) Or. Rev. Stat. § 161.005 (2017), 9.2.3(e) § 326.51 (2015), 9.2.3(c) § 326.54 (2015), 9.2.3(c) § 350.272 (2017), 9.2.3(c) §§ 646A.600–646A.628 (2017), 9.2.1 § 646A.604(5)(a) (2015), 9.2.1 § 646A.620 (2017), 9.2.2(b), 9.2.2(c) § 646A.622, 9.2.1 § 646A.622 (2017), 9.2.2(c) § 646A.808 (2017), 9.2.3(b) § 659A.330 (2017), 9.2.3(c) §§ 802.175–802.191 (2017), 9.2.2(a) § 802.195 (2017), 9.2.2(b) Pennsylvania 2000 Pa. Laws 25, § 1, 9.2.3(a) 2002 Pa. Laws 222, §§ 1–8, 9.2.3(a) 226, § 3, 9.2.3(a) 2005 Pa. Laws 94, §§ 1–29, 9.2.1 2006 Pa. Laws 60, § 1, 9.2.2(b) 160, §§ 1–8, 9.2.2(b) 163, §§ 1–10, 9.2.1 2010 Pa. Laws 86, §§ 1–9, 9.2.3(b) 2014 Pa. Laws 115, 9.2.3(e) 18 Pa. Cons. Stat. § 3131(a) (2017), 9.2.3(e) § 5903 (2014), 9.2.3(a) § 7661 (2017), 9.2.3(a) 42 Pa. Cons. Stat. § 8316.1 (2017), 9.2.3(e) 71 Pa. Cons. Stat. §§ 2601–2608 (2017), 9.2.2(b)

Pennsylvania 71 Pa. Cons. Stat. (cont’d) § 2603, 9.2.2(b) § 2605, 9.2.2(b) 73 Pa. Cons. Stat. §§ 2250.1–2250.8 (2017), 9.2.3(a) § 2250.3 (2017), 9.2.3(a) §§ 2301–2329 (2017), 9.2.1 §§ 2330.1–2330.9 (2017), 9.2.3(b) §§ 2501–2510 (2017), 9.2.1 74 Pa. Cons. Stat. § 201 (2017), 9.2.2(b) Rhode Island 1993 R.I. Pub. Laws 351, § 1, 9.2.2(b) 1997 R.I. Pub. Laws 26, § 1, 9.2.2(a) 1999 R.I. Pub. Laws 427, § 1, 9.2.3(a) 2002 R.I. Pub. Laws 292, § 89, 9.2.2(a) 2005 R.I. Pub. Laws 225, § 1, 9.2.1 2006 R.I. Pub. Laws 226, § 1, 9.2.1, 9.2.2(b) 270, § 1, 9.2.1, 9.2.2(b) 558, § 1, 9.2.3(a), 9.2.3(b) 583, § 1, 9.2.3(b) 628, § 1, 9.2.3(a) 2008 R.I. Pub. Laws 467, § 1, 9.2.3(b) 475, § 95, 9.2.2(a) 2009 R.I. Pub. Laws 247, § 1, 9.2.2(c) 285, § 1, 9.2.2(c) 310, § 8, 9.2.2(a) 2011 R.I. Pub. Laws 57, § 2, 9.2.1, 9.2.2(b) 69, § 2, 9.2.1, 9.2.2(b) 2014 R.I. Pub. Laws 138, § 2, 9.2.1 148, § 2, 9.2.1 188 § 1, 9.2.3(c) § 2, 9.2.3(c) § 3, 9.2.3(c) 207 § 1, 9.2.3(c) § 2, 9.2.3(c) § 3, 9.2.3(c) 528 § 9, 9.2.2(b)

§ 35, 9.2.1 § 36, 9.2.3(a) § 39, 9.2.2(c) 2015 R.I. Pub. Laws 138, § 1, 9.2.1 R.I. Gen. Laws § 6-13-16 (2017), 9.2.2(b) § 6-47-2 (2017), 9.2.3(a) § 6-48-1 (2017), 9.2.2(b) §§ 6-48-1–6-48-9 (2017), 9.2.1 §§ 6-49-1–6-49-6 (2017), 9.2.3(a) § 6-49-5, 9.2.3(a) § 6-52-2 (2017), 9.2.2(c) §§ 11-49.2-3–11-49.2-7 (2014), 9.2.1 §§ 11-49.3-1–11-49.3-6 (2017), 9.2.1 §§ 11-52.1-1–11-52.1-5 (2017), 9.2.3(a), 9.2.3(b) §§ 11-52.2-1–11-52.2-8 (2017), 9.2.3(b) §§ 11-52.3-1–11-52.3-5 (2017), 9.2.3(b) § 16-103-1, 9.2.3(c) §§ 16-103-1–16-103-6 (2017), 9.2.3(c) § 16-104-1 (2017), 9.2.3(c) § 27-49-3.1 (2017), 9.2.2(a) §§ 28-56-1–28-56-6 (2017), 9.2.3(c) South Carolina 1999 S.C. Acts 33, § 1, 9.2.2(a) 100, Part II, § 53, 9.2.2(a) 2002 S.C. Acts 225, § 1, 9.2.2(b) 2008 S.C. Acts 190 § 2, 9.2.1, 9.2.2(b), 9.2.2(c) § 3.B, 9.2.2(b), 9.2.2(c) § 7, 9.2.1 2013 S.C. Acts 15, §§ 1–3, 9.2.1 S.C. Code Ann. §§ 30-2-10–30-2-50 (2017), 9.2.2(b) § 30-2-310 (2017), 9.2.2(c) § 30-4-160 (2017), 9.2.2(a) § 30-4-165(a) (2017), 9.2.2(a) §§ 37-20-110–37-20-200 (2013), 9.2.1 § 37-20-180 (2017), 9.2.2(b) § 37-20-190 (2017), 9.2.2(c) § 39-1-90 (2013), 9.3 § 39-1-90 (2017), 9.2.1

South Dakota 2001 S.D. Laws 165, §§ 1–8, 9.2.2(a) 2002 S.D. Laws 185, § 1, 9.2.3(a) 2004 S.D. Laws 151, § 1, 9.2.3(e) 2006 S.D. Laws 246, § 3, 9.2.1 2007 S.D. Laws 226, §§ 1–8, 9.2.3(a) 2011 S.D. Laws 116, § 1, 9.2.3(e) 2016 S.D. Laws 123, § 1, 9.2.3(e) S.D. Codified Laws § 22-21-4 (2017), 9.2.3(e) §§ 32-5-143–32-5-150 (2017), 9.2.2(a) § 37-24-6 (2016), 9.2.3(a) §§ 37-24-41–37-24-48 (2016), 9.2.3(a) § 37-24-42, 9.2.3(a) § 37-24-44, 9.2.3(a) § 37-24-45, 9.2.3(a) § 37-24-47, 9.2.3(a) § 37-24-48, 9.2.3(a) § 54.15.3 (2014), 9.2.1 Tennessee 1999 Tenn. Pub. Acts 201, §§ 2–7, 9.2.1 475 § 2, 9.2.3(a) § 3, 9.2.3(a) 2001 Tenn. Pub. Acts 745, §§ 2–16, 9.2.2(a) 2003 Tenn. Pub. Acts 15, §§ 2–7, 9.2.3(a) 293, § 1, 9.2.2(b) 317 § 4, 9.2.3(a) § 8, 9.2.3(a) 2005 Tenn. Pub. Acts 473, § 2, 9.2.1 2006 Tenn. Pub. Acts 566 §§ 2–6, 9.2.3(b) 2007 Tenn. Pub. Acts 170 § 4, 9.2.1 § 5, 9.2.1 § 6, 9.2.2(b) 2009 Tenn. Pub. Acts 269 § 1, 9.2.2(b) § 2, 9.2.2(b) 2014 Tenn. Pub. Acts 826, §§ 2–5, 9.2.3(c)

2016 Tenn. Pub. Acts 692, §§ 1–4, 9.2.1 872, § 1, 9.2.3(e) 2017 Tenn. Pub. Acts 91, § 1, 9.2.1 633, §§ 1–2, 9.2.1 1120, § 8, 9.2.1 Tenn. Code Ann. § 10-7-515 (2017), 9.2.2(b) § 39-14-603 (2017), 9.2.3(a) § 39-17-318 (2017), 9.2.3(e) §§ 47-18-2101–47-18-2106, 9.2.1 § 47-18-2107 (2017), 9.2.1 § 47-18-2108, 9.2.1 § 47-18-2109, 9.2.1 § 47-18-2110 (2014), 9.2.1 § 47-18-2110 (2017), 9.2.2(b) § 47-18-2501 (2017), 9.2.3(a) § 47-18-2502 (2017), 9.2.3(a) § 47-18-2503, 9.2.3(b) § 47-18-2504, 9.2.3(b) §§ 47-18-5101–47-18-5206 (2017), 9.2.3(b) § 47-18-5202, 9.2.3(b) §§ 50-1-1001–50-1-1004 (2017), 9.2.3(c) § 50-1-1002, 9.2.3(c) §§ 55-25-101–55-25-112 (2017), 9.2.2(a) Texas 2003 Tex. Gen. Laws 1053, § 1, 9.2.3(a) 2005 Tex. Gen. Laws 298, 9.2.3(b) 397, § 1, 9.2.2(b) 544, 9.2.3(b) 2007 Tex. Gen. Laws 3, § 1, 9.2.2(b) 885 § 2.01, 9.2.1, 9.2.2(b), 9.2.2(c), 9.2.3(a), 9.2.3(b) § 2.47, 9.2.3(a) 2009 Tex. Gen. Laws 90, § 1, 9.2.3(b) 419 § 2, 9.2.2(c) § 3, 9.2.1

Texas 2009 Tex. Gen. Laws (cont’d) 718, §§ 1–5, 9.2.3(b) 2011 Tex. Gen. Laws 1126, § 14, 9.2.1 2013 Tex. Gen. Laws 183, § 2, 9.2.2(b) 1368, § 1, 9.2.1 2015 Tex. Gen. Laws 852 § 2, 9.2.3(e) § 3, 9.2.3(e) Tex. Bus. & Com. Code § 501.001 (2017), 9.2.2(b) § 501.002 (2017), 9.2.2(b) § 521.002, 9.2.1 § 521.053 (2017), 9.2.1 Tex. Bus. & Com. Code Ann. §§ 46.001–46.011, 9.2.3(a) § 48.003 (2007), 9.2.3(b) § 48.004 (2007), 9.2.3(b) § 48.051 (2007), 9.2.3(b) § 48.052 (2007), 9.2.3(b) § 48.053 (2007), 9.2.3(b) § 72.004 (2017), 9.2.2(c) §§ 321.001–321.114 (2017), 9.2.3(a) § 321.051, 9.2.3(a) § 321.052, 9.2.3(a) § 321.104, 9.2.3(a) § 321.105, 9.2.3(a) §§ 324.001–324.102 (2017), 9.2.3(b) §§ 325.001–325.006 (2017), 9.2.3(b) §§ 501.001–501.102 (2017), 9.2.3(b) § 521.052 (2017), 9.2.2(c) Tex. Civ. Prac. & Rem. Code Ann. §§ 98B.001–98B.007 (2017), 9.2.3(e) Tex. Gov’t Code § 552.147 (2014), 9.2.2(b) Tex. Penal Code Ann. § 21.16 (2017), 9.2.3(e) Utah 1995 Utah Laws ch. 61, §§ 1–27, 1.3.4 2003 Utah Laws 97, §§ 1–5, 9.2.2(b) 2004 Utah Laws 363, §§ 1–5, 9.2.3(b) 2005 Utah Laws 168, §§ 1–5, 9.2.3(b) 2006 Utah Laws 21, § 13, 1.3.4

343, §§ 1–5, 9.2.1 § 3, 9.2.2(c) 344, §§ 1–9, 9.2.1 § 8, 9.2.2(b) 2008 Utah Laws 29, §§ 1–5, 9.2.1 2009 Utah Laws 61, § 1, 9.2.1 2010 Utah Laws 200 §§ 1–12, 9.2.3(b) § 4, 9.2.3(b) 2013 Utah Laws 94 §§ 1–6, 9.2.3(c) §§ 7–12, 9.2.3(c) 2014 Utah Laws 124, § 1, 9.2.3(e) Utah Code Ann. §§ 13-37-201–13-37-203 (2017), 9.2.2(b) §§ 13-40-101–13-40-401 (2007), 9.2.3(b) §§ 13-40-101–13-40-402 (2017), 9.2.3(b) § 13-40-201 (2017), 9.2.3(b) § 13-40-301, 9.2.3(b) §§ 13-44-101–13-44-301 (2017), 9.2.1 § 13-44-201 (2017), 9.2.2(c) §§ 13-45-101–13-45-401 (2017), 9.2.1 § 13-45-301(a) (2017), 9.2.2(b) §§ 34-48-101–34-48-301 (2017), 9.2.3(c) § 34-48-102, 9.2.3(c) §§ 46-3-101–46-3-504 (2001), 1.3.4 §§ 53B-24-101–53B-24-301 (2017), 9.2.3(c) § 53B-24-102, 9.2.3(c) § 70-3a-309 (2017), 9.2.3(b) § 76-5b-203 (2017), 9.2.3(e) Vermont 2003 Vt. Acts & Resolves 155, § 3, 9.2.2(b) 2005 Vt. Acts & Resolves 162, § 1, 9.2.1, 9.2.2(b), 9.2.2(c) 2011 Vt. Acts & Resolves 78, § 2, 9.2.2(c) 109, § 4, 9.2.1 2015 Vt. Acts & Resolves 62 § 2, 9.2.3(e)

Vermont 2015 Vt. Acts & Resolves 62 (cont’d) § 3, 9.2.3(e) Vt. Stat. Ann. tit. 9 § 2430, 9.2.1 §§ 2430–2445 (2017), 9.2.1 § 2435(4)(A), 9.2.1 § 2440 (2017), 9.2.2(b), 9.2.2(c) § 2445 (2017), 9.2.2(c) § 2480m (2017), 9.2.2(b) § 4191 (2017), 9.2.3(e) tit. 13, § 2606 (2017), 9.2.3(e) Virginia 1999 Va. Acts 886, § 1, 9.2.3(a) 2001 Va. Acts 844, 9.2.2(d) 2003 Va. Acts 987, § 1, 9.2.3(a) 1016, § 1, 9.2.3(a) 2005 Va. Acts 640, 9.2.2(b) 747, 9.2.3(b) 760, 9.2.3(b) 761, 9.2.3(b) 827, 9.2.3(b) 837, 9.2.3(b) 2008 Va. Acts 566, § 1, 9.2.1 2009 Va. Acts 213, 9.2.2(b) 2010 Va. Acts 489, § 1, 9.2.3(a) 852, § 1, 9.2.1 2014 Va. Acts 399, 9.2.3(e) 789, 9.2.2(b) 795, 9.2.2(b) 2015 Va. Acts 576, § 1, 9.2.3(c) 2016 Va. Acts 597, § 1, 9.2.3(c) 2017 Va. Acts 419, § 1, 9.2.1 Va. Code §§ 2.2-436–2.2-437, 16.5 §§ 59.1-550–59.1-555, 16.5 Va. Code Ann. § 2.2-3803 (2017), 9.2.2(d) § 2.2-3815 (2017), 9.2.2(b) § 2.2-3816 (2017), 9.2.2(b) § 8.01-328.1 (2017), 9.2.3(a) § 18.2-152.2 (2017), 9.2.3(a)

§ 18.2-152.3:1, 9.2.3(a) § 18.2-152.4 (2017), 9.2.3(a) § 18.2-152.5:1 (2017), 9.2.3(b) § 18.2-152.12 (2017), 9.2.3(a) § 18.2-186.6 (2015), 9.3 § 18.2-186.6 (2017), 9.2.1 § 18.2-386.1 (2017), 9.2.3(e) § 18.2-386.2 (2017), 9.2.3(e) § 23.1-405 (2017), 9.2.3(c) § 32.1-127.1:05 (2017), 9.2.1 § 40.1-28.7:5 (2017), 9.2.3(c) § 59.1-442 (2017), 9.2.2(b) § 59.1-443.2 (2017), 9.2.2(b) § 59.1-443.3 (2017), 9.2.2(b) §§ 59.1-501.1–59.1-509.2, 11.2 Washington 1996 Wash. Laws ch. 250, 1.3.4 1998 Wash. Laws 149, §§ 1–8, 9.2.3(a) 1999 Wash. Laws 289, §§ 1–4, 9.2.3(a) 2002 Wash. Laws 90, § 3, 9.2.2(c) 2005 Wash. Laws 342, §§ 1–5, 9.2.1 368 § 1, 9.2.1 § 2, 9.2.1 500, §§ 1–9, 9.2.3(b) 2007 Wash. Laws 197, § 9, 9.2.1 499, § 1, 9.2.1 2008 Wash. Laws 66, §§ 1–5, 9.2.3(b) 2010 Wash. Laws 151, § 2, 9.2.1 2013 Wash. Laws 330 § 1, 9.2.3(c) § 2, 9.2.3(c) 2015 Wash. Laws 64 § 2, 9.2.1 § 3, 9.2.1 2015 Wash. Laws 2d Sp. Sess. 7, § 1, 9.2.3(e) 8, § 1, 9.2.3(e) 2016 Wash. Laws 91, § 1, 9.2.3(e) Wash. Rev. Code § 4.24.795 (2017), 9.2.3(e) § 9A.86.010 (2017), 9.2.3(e) §§ 19.34.010–19.34.903, 1.3.4

Washington Wash. Rev. Code (cont’d) §§ 19.182.170–19.182.210, 9.2.1 §§ 19.190.005–19.190.050 (2017), 9.2.3(a) § 19.190.020, 9.2.3(a) § 19.190.040, 9.2.3(a) § 19.215.020 (2017), 9.2.2(c) § 19.255.010 (2015), 9.3 § 19.255.010 (2017), 9.2.1 § 19.255.020 (2017), 9.2.1 §§ 19.270.010–19.270.900 (2017), 9.2.3(b) § 19.270.020, 9.2.3(b) § 19.270.030, 9.2.3(b) § 19.270.040, 9.2.3(b) § 19.270.060, 9.2.3(b) § 42.56.590 (2017), 9.2.1 § 49.44.200 (2017), 9.2.3(c) § 49.44.205 (2017), 9.2.3(c) West Virginia 1996 W. Va. Acts 182, 9.2.2(a) 1999 W. Va. Acts 119, 9.2.3(a) 2001 W. Va. Acts 200, 9.2.2(a) 2004 W. Va. Acts 178, 9.2.2(a) 2005 W. Va. Acts 122, 9.2.2(b) 2007 W. Va. Acts 45, 9.2.1 2008 W. Va. Acts 37, 9.2.1 2016 W. Va. Acts 143, 9.2.3(c) 2017 W. Va. SB 420, 9.2.3(e) W. Va. Code § 5A-8-22 (2017), 9.2.2(b) §§ 17A-2A-1–17A-2A-14 (2017), 9.2.2(a) § 21-5H-1 (2017), 9.2.3(c) §§ 46A-2A-101–46A-2A-105 (2017), 9.2.1 § 46A-2A-102(d), 9.2.1 § 46A-2A-104, 9.3 §§ 46A-6G-1–46A-6G-5 (2017), 9.2.3(a) § 46A-6G-2, 9.2.3(a) § 46A-6G-5, 9.2.3(a) §§ 46A-6L-101–46A-6L-105 (2017), 9.2.1 § 61-8-28A (2017), 9.2.3(e)

Wisconsin 1995 Wis. Laws 353, 9.2.3(a) 1999 Wis. Laws 9, 9.2.1 § 3113n (895.505), 9.2.2(c) 2001 Wis. Laws 16, 9.2.3(a) 2005 Wis. Laws 138 (895.507), 9.2.1 155, § 52, 9.2.1, 9.2.2(c) 2007 Wis. Laws 20, § 3778m, 9.2.1 97, § 238, 9.2.1 2013 Wis. Laws 208, § 5, 9.2.3(c) 243, § 4, 9.2.3(e) Wis. Stat. § 134.97 (2017), 9.2.1, 9.2.2(c) § 134.98 (2015), 9.3 § 134.98 (2017), 9.2.1 § 942.09(3m)(a)(1) (2017), 9.2.3(e) § 944.25 (2017), 9.2.3(a) § 947.0125 (2017), 9.2.3(a) § 995.55 (2017), 9.2.3(c) Wyoming 2003 Wyo. Sess. Laws 86, § 1, 9.2.3(a) 2007 Wyo. Sess. Laws 162, § 1, 9.2.1 2014 Wyo. Sess. Laws 73, § 1, 9.2.3(b) 2015 Wyo. Sess. Laws 63, § 1, 9.2.1 S.F. 35, 36, 9.2.1 Wyo. Stat. Ann. § 6-3-506 (2017), 9.2.3(b) §§ 40-12-401–40-12-404 (2017), 9.2.3(a) § 40-12-402, 9.2.3(a) §§ 40-12-501–40-12-509 (2017), 9.2.1

ADDITIONAL REFERENCES AND RESOURCES “A Brief Overview of the Federal Trade Commission’s Investigative and Law Enforcement Authority,” Fed. Trade Comm’n (July 2008), 8.2 Acquisti, Hoffman & Romanosky, Empirical Analysis of Data Breach Litigation (Temple Univ. Beasley Sch. of Law, Legal Studies Research Paper No. 2012-29), 14.6 Administration Discussion Draft: Consumer Privacy Bill of Rights Act of 2015, 16.1 Barr, Lynne B. & Barbara J. Ellis, “The New FCRA: An Assessment of the First Year,” 54 Bus. Law 1343 (1999), 4.2.4 Bernard, Tara & Stacy Cowley, “Equifax Breach Caused by Lone Employee’s Error, Former C.E.O. Says,” N.Y. Times, Oct. 3, 2017, 1.2.2(b) Buerkle, Tom & Richard Bels, “Cyberattack Casts a Long Shadow on Equifax’s Earnings,” N.Y. Times, Nov. 10, 2017, 1.1 Bureau of Justice Statistics, “17.6 Million U.S. Residents Experienced Identity Theft in 2014” (Sept. 27, 2015), 4.3.1 “Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy,” 16.1 Cowley, Stacy, “2.5 Million More People Potentially Exposed in Equifax Breach,” N.Y. Times, Oct. 2, 2017, 1.1 Cyber Risk, NAIC (last updated June 13, 2015), 14.6 Cyber Risks: The Growing Threat, Insurance Info. Inst. (June 2014), 14.6 “Cybersecurity Requirements for Financial Services Companies,” New York Department of Financial Services, 23 NYCRR 500, 14.7.9 “Draft Restatement of the Law of Consumer Contracts” (Amer. L. Inst. Discussion Draft Apr. 17, 2017), 11.2

FCC Enforcement Advisory Open Internet Privacy Standard Enforcement, Enforcement Advisory No. 2015-03; Release No. DA 15-603 (May 20, 2015), 12.2.1(d) Federal Financial Institutions Examination Council, Financial Institution Letters (FIL-27-2005), 5.4.6, 5.4.6(a) Federal Trade Commission, Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business (2017), 7.4 Federal Trade Commission, “FTC Policy on Unfairness,” letter from Fed. Trade Comm’n to Hon. Wendell Ford, chairman, and Hon. John Danforth, ranking minority member, S. Comm. on Commerce, Science, and Transportation, Consumer Subcomm. (Dec. 17, 1980), 8.5 Federal Trade Commission, “FTC Releases Annual Summary of Consumer Complaints” (Mar. 3, 2015), 4.3.1 Federal Trade Commission, “Privacy and Data Security Update 2016” (Jan. 2017), 16.3 Financial Crimes Enforcement Network, Guidance: Application of FinCEN’s Regulations to Persons Administering, Exchanging, or Using Virtual Currencies, FIN-2013-G001 (Dept. of the Treasury, Financial Crimes Enforcement Network, March 18, 2013), 16.5 Fishman, Clifford S. & Anne T. McKenna, Wiretapping and Eavesdropping § 3:9 (Disclosure by attorney) (Database updated Nov. 2014), 2.3.2 40 Years of Experience with the Fair Credit Reporting Act (FTC Staff report July 2011), 8.3.1(a)

“FTC Approves Final Order In TRUSTe Privacy Case,” FTC Press Release, May 18, 2015, 8.5 “FTC Charges Operators of ‘Jerk.com’ Website With Deceiving Consumers,” FTC Press Release, Apr. 7, 2014, 8.5 “FTC Files Complaint Against LabMD for Failing to Protect Consumers’ Privacy,” FTC Press Release, Aug. 29, 2013, 8.5 “FTC Policy Statement on Deception,” Letter from Fed. Trade Comm’n to Hon. John D. Dingell, Chairman H. Comm. on Energy & Commerce (Oct. 14, 1983), 8.5 “FTC Targets Identity Theft,” FTC Press Release, Feb. 17, 2000, 1.2.2(b) Greenberg, Andy, “Security Researchers Cry Foul Over Conviction of AT&T iPad Hacker,” Forbes (Nov. 21, 2012), 3.5.1 “Hackers Targeted Companies’ Merger Talks,” 15.4 Honan, Mat, “The Nightmare on Connected Home Street,” Wired (June 13, 2014), 3.2.2 The Internet of Things: Privacy and Security in a Connected World (FTC, Jan. 2015), 16.3 Javelin Strategy & Research, “2017 Identity Fraud: Securing the Connected Life” (Feb. 1, 2017), 4.3.1 Keeton, W. Page et al., Prosser & Keeton on the Law of Torts, § 30, at 165 (W. Page Keeton ed., 5th ed. 1984), 9.3 Kerr, Orin S., “Lifting the “Fog” of Internet Surveillance: How a Suppression Remedy Would Change Computer Crime Law,” 54 Hastings L.J. 805 (2003), 2.6

Lessig, Lawrence, Code and Other Laws of Cyberspace 142–63 (Basic Books 1999), 1.2.2(b) Mihm, Stephen, “Dumpster-Diving for Your Identity,” N.Y. Times, Dec. 31, 2003, 1.1 Mobile Privacy Disclosures: Building Trust Through Transparency: A Federal Trade Commission Staff Report (FTC, Feb. 2013), 16.3 National Consumer Law Center, Fair Credit Reporting (8th ed. 2013), 4.1.2 “Pacnet Security Breach,” 15.1 “PCI DSS Quick Reference Guide” (PCI DSS QRG), 13.7 Perlroth, Nicole, “All 3 Billion Yahoo Accounts Were Affected by 2013 Attack,” N.Y. Times, Oct. 3, 2017, 1.1 Peterson, Andrea, “OPM says 5.6 million fingerprints stolen in cyberattack, five times as many as previously thought,” Washington Post, Sept. 23, 2015, 13.5 Ponemon Institute LLC, 2015 Cost of Data Breach Study: Global Analysis at 1 (May 2015), 1.2.2(b) Ponemon Institute LLC, 2017 Cost of Data Breach Study: Global Analysis at 1 (2017), 14.1 Ponemon Institute LLC, 2017 Cost of Data Breach Study: Global Overview at 1 (June 2017), 1.1 Popper, Nathaniel, “A Hacking of More Than $50 Million Dashes Hopes in the World of Virtual Currency,” N.Y. Times, June 17, 2016, 16.5 Popper, Nathaniel, “Business Giants to Announce Creation of a Computing System Based on Ethereum,” N.Y. Times, Feb. 27, 2017, 16.5 Privacy Online: Fair Information Practices in the Electronic Marketplace (FTC, May 2000), 16.3

Project Feature: Principles of the Law: Data Privacy, The ALI Advisor, 16.4 Project Feature: Restatement of the Law, Consumer Contracts, The ALI Advisor, 11.2 Prosser, William, Law of Torts 831-32 (West Publishing 3d ed. 1964), 1.2.1(a), 9.1.2 Prosser, William, “Privacy,” 48 Calif. L. Rev. 383 (1960), 1.2.1(a), 9.1.2 “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policy Makers” (FTC, Mar. 2012), 16.3 Recommendation ITU-T X.1254: Entity authentication assurance framework at 1 (Sept. 2012), 16.5 Records, Computers and the Rights of Citizens—Report of the Secretary’s Advisory Committee on Automated Personal Data Systems (U.S. Dep’t of Health, Education & Welfare, July 1973), 1.3.2, 16.1 Restatement of Data Privacy Principles, 16.4 Restatement (Second) of Agency, § 112, 3.2.4 Restatement (Second) of Torts (1977), 1.3.2, 9.1, 9.1.2, 16.4 §§ 652B–652E, 1.2.1(a) Restatement (Third) of Unfair Competition (1995), 1.1, 1.2.1(a), 9.1.2 Revised Uniform Fiduciary Access to Digital Assets Act, 12.2.2, 16.4 Richards & Solove, “Privacy’s Other Path: Recovering the Law of Confidentiality,” 96 Geo. L. J. 124 (2007), 1.2.1(a) Sanger, David E., Nicole Perlroth & Michael D. Shear, “Attack Gave Chinese Hackers Privileged Access to U.S. Systems,” N.Y. Times, June 21, 2015, 1.2.2(b)

“Security and Privacy Controls for Federal Information Systems and Organizations” (rev. 4, NIST), 13.5 Senate Hearing, “Protecting Consumers in the Era of Major Data Breaches,” 16.6 “Snapchat Settles FTC Charges That Promises of Disappearing Messages Were False,” FTC Press Release, May 8, 2014, 8.5, 11.3 Solove, Daniel J., Nothing to Hide: The False Tradeoff Between Privacy and Security 167–68 (Yale University Press 2007), 2.6 Solove, Daniel J., “Reconstructing Electronic Surveillance Law,” 72 Geo. Wash. L. Rev. 1264 (2004), 2.5.5 Sprenger, Polly, “Sun on Privacy: ‘Get Over It,’” Wired, Jan. 26, 1999, 1.1 “Statement of Enforcement Principles Regarding ‘Unfair Methods of Competition’ Under Section 5 of the FTC Act” (FTC, 2015), 8.1 “Telstra considers legal action over Pacnet Acquisition,” 15.1 “Telstra Owned Pacnet Hit by Major Breach,” 15.1 2 Story, Commentaries on Equity Jurisprudence, as Administered in England and America (Fred. B. Rothman & Co. 13th ed. 1988), 9.1.2 Uniform Civil Remedies for Unauthorized Disclosure of Intimate Images, 16.4 Uniform Commercial Code (UCC), 1.3.3, 11.1, 11.2, 16.4 § 4A-201, 1.3.3 Art. 4A, 1.3.4 Uniform Electronic Transactions Act (UETA), 1.3.4, 11.2, 16.4 § 2(8), 1.3.4 § 2(14), 1.3.4 § 9, 1.3.4

Uniform Employee and Student Online Privacy Protection Act (UESOPPA), 9.2.3(c)
Uniform Fiduciary Access to Digital Assets Act, 11.3.2
Uniform Regulation of Virtual-Currency Businesses Act, 16.5
Uniform Trade Secrets Act, 16.4
  § 1(2), 1.2.1(a)
  § 1(2)(i), 1.2.1(a)
  § 1(2)(ii), 1.2.1(a)
“Utility-Like Regulation of the Internet” in Notice of Proposed Rulemaking In the Matter of Internet Freedom, W.C. Docket No. 17-108, 82 Fed. Reg. 25568 (June 2, 2017), 12.2.1(c)
Warren & Brandeis, “The Right to Privacy,” 4 Harv. L. Rev. 193 (1890), 1.1
“Yelp, TinyCo Settle FTC Charges Their Apps Improperly Collected Children’s Personal Information,” FTC Press Release, Sept. 17, 2014, 8.4.4

Index

References are to section numbers of this book, unless otherwise indicated.

A Aaron’s Law Act, 3.5, 3.5.4 Advertising Adware, 1.2.2(a) Targeted advertisements, 1.2.2 Alabama, 11.2 Identity theft protection, 9.2.1 Phishing protection, 9.2.3(b) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Alaska Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Personal Information Protection Act, 9.2.1 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) American Recovery and Reinvestment Act of 2009, 1.3.5, 8.3.3, Exhibits 8C, 15B Arizona Anti-spam legislation, 9.2.3(a) Blockchain records, use of, 16.5 Breach notification, 9.2.1 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) Arkansas Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Consumer Protection Against Computer Spyware Act, 9.2.3(b)

Employees, coercion of access to social media, 9.2.3(c) Personal Information Protection Act, 9.2.1 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) Students, coercion of access to social media, 9.2.3(c) Unsolicited Commercial and Sexually Explicit Electronic Mail Prevention Act, 9.2.3(a) Atomic Energy Act, 3.3.1, Exhibit 3A

B Bank Holding Company Act of 1956, 5.3.1 Banking Act of 1933, 5.1 Bankruptcy Transfers of personal data in, 15.3 Bankruptcy Abuse Prevention and Consumer Protection Act, 15.3 Blockchain Technology, 16.5 Breach Notification Class actions, 9.3 Current developments, 16.4 Massachusetts See Massachusetts Breach Notice Law Other states, 9.2.1, 9.3 Private right of action, 9.3 Breach of Security See also Breach Notification Costs, 1.1 Bureau of Consumer Financial Protection See Consumer Financial Protection Bureau (CFPB)

Bureau of Consumer Protection, 8.2 Business Associates Issues, Ch. 13 Federal information system safeguards, 13.5 Financial institution safeguards, 13.4 FTC Red Flags rule, 13.3 HIPAA regulations Omnibus Rule, 13.2 Privacy and Security Compliance Agreement, Exhibit 13A Massachusetts data security, 13.6 Payment card standards, 13.7

C California Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 California Anti-Phishing Act of 2005, 9.2.3(b) Consumer Legal Remedies Act, 9.3 Consumer Protection Against Computer Spyware Act, 9.2.3(b) Customer Records Act, 9.3 Employees, coercion of access to social media, 9.2.3(c) False Advertising Law, 9.3 Internet, protection from data collection on, 9.2.2(d) Online Privacy Protection Act, 9.2.2(d), 16.3 Phishing protection, 9.2.3(b) Privacy Rights for California Minors in the Digital World Act, 16.3 Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Shine the Light Law, 16.3 Social Security number protection, 9.2.2(b) Song-Beverly Credit Card Act of 1971, 9.3 Spyware Act, 9.2.3(b) Spyware protection, 9.2.3(b) Students, coercion of access to social media, 9.2.3(c) Unfair Competition Law, 9.3

CAN-SPAM See Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act CDA See Communications Decency Act (CDA) CFAA See Computer Fraud and Abuse Act (CFAA) CFPB See Consumer Financial Protection Bureau (CFPB) CFTC See Commodity Futures Trading Commission (CFTC) Children’s Internet Protection Act (CIPA), 7.2.1 Children’s Online Privacy Protection Act of 1998 (COPPA), 1.3.7, 7.4, 8.1, 8.4.4, 11.4, Exhibit 15B Access to information, 7.4.2 Children’s Online Privacy Protection Rule, 8.4.4 Choice by parent, 7.4.2 Enforcement, 8.4.4 Notice to parent, 7.4.1, 7.4.3, 11.2 Penalties, 7.4.4 Privacy policy, 7.4.1 Protection of information, 7.4.5 Student privacy, 7.4.6 Class Action Fairness Act of 2005, 9.3 Clayton Act, 8.1 Colorado Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Employees, coercion of access to social media, 9.2.3(c) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spam Reduction Act of 2008, 9.2.3(a) Commodity Futures Trading Commission (CFTC) Authority, 4.2.5, 5.2.1, 5.2.2, Exhibit 5A

Enforcement Fair and Accurate Credit Transactions Act, 8.3.1(b) Gramm-Leach-Bliley Act, 5.6.1, 8.3.2 Gramm-Leach-Bliley Act regulations, 8.3.2(b) Communications Decency Act (CDA), 1.3.7, 7.2, 7.2.2, 7.2.3, 7.2.5, 12.1.1, 12.2.3, 16.2 Anti-indecency provisions, 7.2.1 Anti-obscenity provisions, 7.2.1 Defamation liability shield, 7.2.3 Escort services, advertising of, 7.2.6, 16.2 Online service providers sheltered from liability, 7.2.2 Qualification for safe harbor, 7.2.4 Revenge porn, 7.2.6 Sex trafficking, facilitating, 7.2.6, 16.2 “Source of the content at issue,” 7.2.5 Comprehensive Drug Abuse Prevention and Control Act of 1970, Exhibit 3A Comprehensive Smokeless Tobacco Health Education Act, 8.1 Computer Abuse Amendments Act of 1994, 3.6.4 Computer Fraud and Abuse Act (CFAA), Ch. 3, 1.3.4, 8.4.1, 9.2.2(a), 16.2 Application of, 3.6.9 Criminal prohibitions, 3.3 Computer fraud, 3.3.4 Confidentiality, 3.3.2 Damage to data, 3.3.5 Espionage, 3.3.1 Extortion, 3.3.7 Malware, 3.3.5 Password trafficking, 3.3.6 Trespass to government computers, 3.3.3 Viruses, 3.3.5 Criticism of, 3.5 Cyber-bullying, 3.5.2 Hacktivism, 3.5.1

Internet advocacy, 3.5.1 Proposed reforms, 3.5.4 Use of employer resources for personal use, 3.5.3 Legislative history, 3.6 1984 Act, 3.6.2 1986 Amendments, 3.6.3 1994 Amendments, 3.6.4 1996 Amendments, 3.6.5 2001 Amendments, 3.6.6 2002 Amendments, 3.6.7 2008 Amendments, 3.6.8 H.R. 5616, 3.6.1 Private right of action, 3.4 Prohibitions under, 3.1 Criminal, 3.3 Scope of, 3.6.9 Terminology, 3.2 Access, 3.2.3 Computer, 3.2.2 Damage, 3.2.5 Exceeding authorized access, 3.2.4 Loss, 3.2.5 Protected computer, 3.2.1 Unauthorized access, 3.2.4 Connecticut An Act Concerning Consumer Computer Equipment Leases and Unsolicited Electronic Mail Advertising Material, 9.2.3(a) An Act Concerning Employee Online Privacy, 9.2.3(c) Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Connecticut Unfair Trade Practices Act, 9.3 Employees, coercion of access to social media, 9.2.3(c) Phishing protection, 9.2.3(b) Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Consumer Credit Protection Act, 1.3.3 Consumer Credit Reform Act of 1996, 8.3.1(a), 9.2

Consumer Credit Reporting Reform Act of 1996, 4.1.1, 8.3.1(a) Consumer Financial Protection Act of 2010, 8.3.1(a), 8.3.2, 8.3.2(a), 8.3.2(b), 8.4.2 Consumer Financial Protection Bureau (CFPB) Authority, 5.2, 5.2.2, Exhibit 5A Enforcement of GLBA, 4.1.1, 5.1, 5.6.1, 5.6.2(a) Gramm-Leach-Bliley Act regulations, 8.3.2, 8.3.2(b) TCFAPA enforcement, 8.4.2 Consumer Issues, Ch. 11 Arbitration clauses, 11.2 Contracting considerations, 11.2 Drafting terms, 11.3 Acceptable use policy, 11.3.2 Intellectual property notices, 11.3.3 Privacy policy or notice, 11.3.1 Privacy issues, 1.3.2, 11.4 Data collection, 11.4.1 Use of data collected, 11.4.2 User access to information, 11.4.3 Consumer Leasing Act of 1976, 8.1 Consumer Reporting Employment Clarification Act of 1998, 8.3.1(a), 9.2 Consumer Reports Accuracy of, 4.2.9 Addresses Change, notification of, 4.2.6 Discrepancies, 4.2.7 Employer’s use of, 4.2.4 Incorrect information on, 4.1.2 Liability for incorrect information on, 4.1.2 Medical information on, 4.2.3 Purpose for, 4.2.8 Social Security numbers on, 4.2.2 Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act, 1.3.7, 7.5, 8.4.1, 9.2.3(a), 11.4, Exhibit 15B Compliance requirements, 7.5.2 E-mails covered by, 7.5.1 Enforcement, 8.4.1 “Opt-out” requests, 7.5.2 Penalties, 7.5.3 Cookies, 1.2.2

COPPA See Children’s Online Privacy Protection Act of 1998 (COPPA) Copyright Law, 1.1, 7.3, 7.3.1, 7.3.7 Counterfeit Access Device and Computer Fraud and Abuse Act of 1984, 3.1 Credit and Debit Card Receipt Clarification Act of 2007, 8.3.1(a) Credit Card Accountability Responsibility and Disclosure Act of 2009, 8.3.1(a) Credit Reports See Consumer Reports Crimes Against Charitable Americans Act of 2001, 8.4.2 Current Developments, Ch. 16 Congressional activity, 16.1, 16.2, Exhibit 16A Draft consumer privacy bill of rights, 16.1 FTC framework, 16.3 Identity management, 16.5 Legislation, 16.1, 16.2 Restatement of Data Privacy Principles, 16.4 State laws, 16.4 Trump administration, 16.1 Cyber-Bullying Computer Fraud and Abuse Act, 3.5.2 State laws, 9.2.3(d) Cyber Preparedness Act of 2017, 16.2 Cyber Risk Insurance, Ch. 14 Evaluation of coverage, 14.7.7 First-party coverages, 14.3 Business interruption, 14.3.3 Case law, 14.5 Crime, 14.3.7 Cyber extortion and ransom coverage, 14.3.4 Data breach response costs, 14.3.1 Data loss, 14.3.2

Fraud, 14.3.7 Network service interruption, 14.3.3 Physical loss or damage to property, 14.3.6 Property damage, 14.3.6 Reputational damage, 14.3.5 In merger and acquisition, 15.6 Insurance forms, 14.2 Third-party coverages, 14.4 Case law, 14.6 Data security liability, 14.4.1 Media content liability, 14.4.3 Privacy liability, 14.4.2 Products liability, 14.4.4 Cyber Security Enhancement Act of 2002, 3.6.7 Cybersecurity Act of 2015, 16.1, 16.2

D Defend Trade Secrets Act of 2016, 16.1 Delaware Anti-spam legislation, 9.2.3(a) Blockchain records, use of, 16.5 Breach notification, 9.2.1 Clean Credit and Identity Theft Prevention Act, 9.2.1 Education Privacy Act, 9.2.3(c) Internet, protection from data collection on, 9.2.2(d) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Students, coercion of access to social media, 9.2.3(c) Department of Health and Human Services HIPAA regulations, 6.2, 8.3.3 Digital Millennium Copyright Act (DMCA), 1.3.7, 7.3, 7.3.2, 7.3.6 Agent designation, 7.3.3 Anti-circumvention exemptions, 7.3.8, 7.3.9 Anti-copyright management circumvention, 7.3.7

Cases, 7.3.9 Qualification for safe harbor, 7.3.2 Safe harbor, 7.2.3, 7.3.1 Secondary liability safe harbor, 7.3.6 Takedown request, 11.3.3 Requirements, 7.3.4 Response to, 7.3.5 District of Columbia Breach notification, 9.2.1 Consumer Personal Information Security Breach Notification Act, 9.2.1 Private right of action, 9.3 Revenge porn protection, 9.2.3(e) DMCA See Digital Millennium Copyright Act (DMCA) Do-Not-Call Implementation Act of 2003, 8.4.3 Do-Not-Call Improvement Act of 2007, 8.4.3 Do-Not-Call Registry Act of 2003, 8.4.3 Do-Not-Call Registry Fee Extension Act of 2007, 8.4.3 Dodd-Frank Wall Street Reform and Consumer Protection Act, 4.1.1, 5.2, 8.3.1(a) Driver’s License Information, 9.2.2(a) Driver’s Privacy Protection Act of 1994 (DPPA), 9.2.2(a)

E E-Government Act of 2002, 13.5 ECPA See Electronic Communications Privacy Act of 1986 (ECPA) Electronic Communications Privacy Act of 1986 (ECPA), Ch. 2, 1.3.2, 2.1, 2.2, 11.2, 16.2 See also Pen Register Act; Stored Communications Act (SCA); Wire and Electronic Communications Interception and Interception of Oral Communications Act Definitions, 2.2

History of, 2.1 Electronic Funds Transfer Act, 1.3.3, 8.1, 8.4.2, 11.1 Electronic Signatures in Global and National Commerce Act of 2000, 11.2 Equal Credit Opportunity Act, 8.1 Espionage, 3.3.1

F FACTA See Fair and Accurate Credit Transactions Act (FACTA) of 2003 Fair and Accurate Credit Transactions Act (FACTA) of 2003, 1.3.3, 1.3.8, 4.1.1, 8.3.1(b), 9.2, 13.3, 16.6 Enforcement, 8.3, 8.3.1(b) History of, 4.1.1 Protection provisions, 4.2 Employer’s use of consumer report, 4.2.4 Payment card numbers, truncation of, 4.2.1 Red flag guidelines, 4.2.5 Fair Credit Billing Act, 8.1 Fair Credit Reporting Act (FCRA), 1.3.3, 4.1.1, 5.4.1, 8.3.1(a), 8.3.1(b), 9.2, 9.3, 11.1, Exhibits 3A, 15B See also Fair and Accurate Credit Transactions Act (FACTA) of 2003 FTC implementation of, 8.3.1 History of, 4.1.1 Protection provisions, 4.2 Address discrepancies, 4.2.7 Credit report dissemination, 8.3.1, 8.3.1(a) Employer’s use of consumer report, 4.2.4 Enforcement, 8.1, 8.3 Fraud alerts, 4.3.3 Medical information restrictions, 4.2.3 Payment card Change of address notification, new card, 4.2.6

Numbers, truncation of, 4.2.1 Reasonable procedures by CRAs, 4.2.9 Red flag guidelines, 4.2.5, 13.3 Social Security numbers, truncation of, 4.2.2 Use of consumer reports, 4.2.8 Victims of identity theft, information available to, 4.3.2 Fair Debt Collection Practices Act (FDCPA), 4.3.2, 8.1 Fair Information Practices Principles, 16.1 Fair Packaging and Labeling Act, 8.1 Family Educational Rights and Privacy Act (FERPA), 6.4.1(a), 7.4.6, Exhibit 15B Farm Credit Act of 1971, Exhibit 3A FCC See Federal Communications Commission (FCC) FCRA See Fair Credit Reporting Act (FCRA) Federal Cigarette Labeling and Advertising Act, 8.1 Federal Communications Act of 1934 (FCA), 2.1, 12.2.1(a), 12.2.1(c), 12.2.1(d) Common carriers under, 12.2.1(a) Federal Communications Commission (FCC), 8.4.1 Comcast order, 12.2.1(c) Creation of, 12.2.1(a) Internet regulation by, 12.2.1(c) Open Internet order, 12.2.1(c), 12.2.1(d) Rollback of Open Internet order, proposed, 12.2.1(c) Service provider differentiation by, 12.2.1(b) Federal Credit Union Act, 5.5.4 Federal Deposit Insurance Act, 5.5.4 Federal Deposit Insurance Corporation (FDIC) Gramm-Leach-Bliley Act regulations, 5.5.4, 8.3.2(b)

Federal Financial Institutions Examination Council, 5.4.6, 5.4.6(a) Federal Information Security Management Act (FISMA), 13.5 Federal Information Security Modernization Act of 2014, 13.5 Federal Reserve Act, 5.4.4, Exhibits 3A, 5A Federal Reserve Board, 8.3.2(b) Federal Reserve System, 5.5.4 Federal Trade Commission (FTC), Ch. 8, 5.1 Authority, 5.2.1, Exhibit 5A Bureau of Consumer Protection, 8.2 Current developments, 16.3 Data breach notification, guidance on, 5.4.6(b) Enforcement, 8.1, 8.3, 8.4 CAN-SPAM, 8.4.1, 9.2.3(a) Children’s Online Privacy Protection Act, 7.4, 8.4.3, 8.4.4 Do-Not-Call legislation, 8.4.3 Fair and Accurate Credit Transactions Act, 8.3.1(b) Fair Credit Reporting Act, 8.3.1 Gramm-Leach-Bliley Act, 5.3.1, 5.6.1, 5.6.2(b), 8.3.2, 8.3.2(b), Exhibit 8B HIPAA, 8.3.3, Exhibit 8C Telemarketing and Consumer Fraud and Abuse Prevention Act, 8.4.2 Unfair or deceptive acts or practices, 8.5 Financial institution safeguard rules, 13.4 Framework, 16.3 History of, 8.1 Operation of, 8.2 Red flags rule, 4.2.5, 8.3.1(b), 13.3, Exhibit 8A Unfairness Policy Statement, 8.5 Federal Trade Commission Act (FTCA), 8.1, 8.2, 8.3.1(a), 8.4.2, 8.4.4, 8.5, 15.3, 16.2, Exhibit 8C Federal Trade Commission Act Amendments of 1994, 8.3.1(a) Federal Trade Commission Improvements Act of 1980, 8.2 Financial Modernization Act of 1999 See Gramm-Leach-Bliley Act of 1999 (GLBA)

Financial Services Regulatory Relief Act of 2006, 8.3.1(a), 8.3.2(b) Florida Anti-spam legislation, 9.2.3(a) Antiphishing Act, 9.2.3(b) Breach notification, 9.2.1 Electronic Mail Communications Act, 9.2.3(a) Florida Information Protection Act of 2014, 9.2.1 Revenge porn protection, 9.2.3(e) Fraud Alerts, 4.3.3 Active duty, 4.3.3(c) Effects of, 4.3.3(d) Extended, 4.3.3(b) Initial, 4.3.3(a) Liability of CRAs for, 4.3.3(e) One-call, 4.3.3(a) Security freezes, 4.3.3(f) FTC See Federal Trade Commission (FTC) Fur Products Labeling Act, 8.1

G GLBA See Gramm-Leach-Bliley Act of 1999 (GLBA) Genetic Information Nondiscrimination Act of 2008, 8.3.3 Georgia Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Georgia Computer Security Act of 2005, 9.2.3(b) Georgia Slam Spam E-Mail Act, 9.2.3(a) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b)

Glass-Steagall Act, 5.1 Gramm-Leach-Bliley Act of 1999 (GLBA), Ch. 5, 1.3.6, 5.1, 8.1, 8.3.2, 8.3.2(a), 8.3.2(c), 9.2.2(b), 9.3, 11.1, Exhibit 15B Consent requirements, 5.4.2 Data breach notification requirements, 5.4.6 Disclosure requirements, 5.4, 5.4.1, 5.4.4 Nonpublic personal information, 8.3.2(a) Enforcement, 5.6.1, 8.3.1(b), 8.3.2(b) Bureau of Consumer Financial Protection, 8.3.2 CFPBA, 5.6.2(a) Commodity Futures Trading Commission, 5.6.1, 8.3.2 Federal Trade Commission, 5.6.2(b), 8.3, 8.3.2, 8.3.2(b) Judicial, 5.6.2(a) Model Privacy Notice Form No Opt Out, Exhibit 5C Opt Out, Exhibit 5B Nonpublic personal information Disclosure of, 8.3.2(b) Safeguarding of, 8.3.2(a) Notice requirements, 5.4, 5.4.1 Overview, 5.1 Pretexting, 5.5, 8.3.2(c) Authority, 5.2.3 Criminal penalty, 5.5.5 Enforcement, 5.5.4, 8.3.2(c) Exceptions, 5.5.3 Overview, 5.5.1 Prohibition, 5.5.2 Regulations, 8.3.2(b) Reuse of NPPI, 5.4.3 Rule-making authority, 5.2 Financial privacy rule, 5.2.2 Pretexting rule, 5.2.3 Safeguards rule, 5.2.1 Safeguarding requirements, 5.4, 5.4.1, 5.4.5, 13.4 Information protection program elements, Exhibit 8B Nonpublic personal information, 8.3.2(a)

Sanctions, 5.6.2 Scope, 5.3 Entities subject to, 5.3.1 Individuals protected under, 5.3.3 Information protected under, 5.3.2 Written information security plan (WISP), 5.4.5

H Hart-Scott-Rodino Antitrust Improvements Act of 1976, 8.1 Hawaii Breach notification, 9.2.1 Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Health Information See also Health Information Technology for Economic and Clinical Health (HITECH) Act; Health Insurance Portability and Accountability Act of 1996 (HIPAA) Consumer reports, in, 4.2.3 Health Information Technology for Economic and Clinical Health (HITECH) Act, 1.3.5, 1.3.8, 6.1, 6.5.1, 8.3.3, 13.1, 16.6, Exhibit 15B Data breach rule, 6.1, 6.6, Exhibit 8C Breach as violation of security rule, 6.6.2(a) Impact on information security, 6.6.2 Overview, 6.6.1 Private litigation, 6.6.2(b) Publicity, 6.6.2(b) Enforcement, 6.5 Daily penalties, 6.5.6 Federal Trade Commission, 8.1, 8.3, 8.3.3, Exhibit 8C Office of Civil Rights, enforcement by, 6.5.6 Open issues, 6.5.6 Vicarious liability for acts of business associates, 6.5.4 Penalties, 6.5, 6.5.3 Statutory authority, 6.2

Health Insurance Portability and Accountability Act of 1996 (HIPAA), 1.3.5, 8.3.3, 9.2.1, 13.1, 16.1, Exhibits 8C, 15B See also Health Information Technology for Economic and Clinical Health (HITECH) Act Data security standards, 13.2 Health Information Breach Notification Rule, Exhibit 8C Matrix of, Exhibit 6A Enforcement, 6.5, 8.3 FTC implementation of, 8.3.3 Guidance, sources of, 6.4.7 Office of Civil Rights, enforcement by, 6.5.5 Penalties, 6.5 Daily, 6.5.6 Prior to HITECH, 6.5.2 Subsequent to HITECH, 6.5.3 Vicarious liability for acts of business associates, 6.5.4 Privacy and Security Compliance Agreement, sample, Exhibit 13A Regulations, 8.3.3 Requirement, 6.4.2 Risk assessment, 6.4.3 Safeguards Administrative, 6.4.4 Guidance, sources of, 6.4.7 Physical, 6.4.5 Technical, 6.4.6 Security rule, 6.1, 6.3, 6.4 Addressable specifications, 6.4.1(g) Business associates, 6.4.1(b) Concepts, key, 6.4.1(a) Electronic personal health information (EPHI), 6.4.1(c) Entities covered, 6.4.1(b) Industry standards, relationship to, 6.4.1(f) Protected health information (PHI), 6.1, 6.4.1(a) Required specifications, 6.4.1(g) Scalability, 6.4.1(d) Technology neutral, 6.4.1(e) Statutory authority, 6.2

HIPAA See Health Insurance Portability and Accountability Act of 1996 (HIPAA) HITECH Act See Health Information Technology for Economic and Clinical Health (HITECH) Act Hobby Protection Act, 8.1

I Idaho Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Identity Theft, 1.2.2(b), 1.3.8, 4.1.1, 9.2.1, 11.4 See also Fair and Accurate Credit Transactions Act (FACTA) of 2003; Fair Credit Reporting Act (FCRA) Authentication techniques, 16.5 Blocking fraudulent information, 4.3.4 CRAs, 4.3.4(a) Furnishers, 4.3.4(b) Current developments in identity management, 16.5 Definition of, 4.3.1 Fraud alerts, 4.3.3 Prevalence of, 4.3.1 Red flags rules, 8.3.1(b) Victims, information available to, 4.3.2 Identity Theft and Assumption Deterrence Act of 1998, 1.2.2(b), 8.3 Identity Theft Enforcement and Restitution Act of 2008, 3.6.8 Illinois, 11.2 Anti-Phishing Act, 9.2.3(b) Anti-spam legislation, 9.2.3(a) Biometric Information Privacy Act, 9.2.2(b) Breach notification, 9.2.1 Electronic Mail Act, 9.2.3(a) Employees, coercion of access to social media, 9.2.3(c)

Data Security and Privacy in Massachusetts

Illinois (cont’d) Identity Protection Act, 9.2.2(b) Internet, protection from data collection on, 9.2.2(d) Personal Information Protection Act, 9.2.1 Phishing protection, 9.2.3(b) Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) State Agency Web Site Act, 9.2.2(d) Students, coercion of access to social media, 9.2.3(c) Indiana Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Records, security and disposal of, 9.2.2(c) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) Insurance See Cyber Risk Insurance Intelligence Reform and Terrorism Protection Act of 2004, 8.3.1(a) International Banking Act of 1978, Exhibit 3A Investment Advisers Act of 1940, 5.5.3 Investment Company Act of 1940, 5.5.3 Iowa, 11.2 An Act Relating to the Transmission, Installation and Use of Computer Software Through Deceptive or Unauthorized Means and Providing for Penalties, 9.2.3(b) Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) I–10

Spyware protection, 9.2.3(b)

K Kansas Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Commercial Electronic Mail Act, 9.2.3(a) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Kentucky Breach notification, 9.2.1 Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c) Social Security number protection, 9.2.2(b)

L Louisiana Anti-Phishing Act of 2006, 9.2.3(b) Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Database Security Breach Notification Law, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Personal Online Account Privacy Act, 9.2.3(c) Phishing protection, 9.2.3(b) Private right of action, 9.3 Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) Students, coercion of access to social media, 9.2.3(c)

M Magnuson-Moss Warranty—Federal Trade Commission Improvement Act, 8.1, 8.2, 11.1 Maine Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1

Employees, coercion of access to social media, 9.2.3(c) Notice of Risk to Personal Data Act, 9.2.1 Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Malware, 1.2.2(a) State protection laws, 9.2.3(b) Maryland, 11.2 Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Maryland Commercial Electronic Mail Act, 9.2.3(a) Personal Information Protection Act, 9.2.1 Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Students, coercion of access to social media, 9.2.3(c) Massachusetts Cyber-bullying, protection from, 9.2.3(d) Harassment, protection from online, 9.2.3(d) Stalking, protection from online, 9.2.3(d) Massachusetts Breach Notice Law, 9.2.1, 10.2 Case law, 10.2.8 Content of notice, 10.2.5 Licensors of personal information, 10.2.5(a) Owners of personal information, 10.2.5(a) Storers of information, 10.2.5(b) Covered entities, 10.2.2 Enactment, 10.1 Enforcement, 9.3, 10.2.6 Federal law, compliance with, 10.2.7 Penalties, 10.2.6

Personal information, 10.2.1 Responding to breach, 10.2.9 Contract evaluation, 10.2.9(e) Credit monitoring, offer of, 10.2.9(f) Damage minimization, 10.2.9(a) Investigation, 10.2.9(a) Notice preparation, 10.2.9(c) Postincident review, 10.2.9(g) Public relations in, 10.2.9(d) Timing of notice, 10.2.4 Triggers for notice, 10.2.3 Massachusetts Consumer Protection Act, 8.5 Massachusetts Data Security Regulations, 10.4, 13.6 See also Written Information Security Program (WISP) Action plan for complying with, Exhibit 10F Application of, 12.2.4 Assessment of organization, 10.4.4(a) Compliance Checklist, Exhibit 10D Computer system security requirements, 10.4.3 Encryption, 10.4.3(c) Monitoring, 10.4.3(d) Secure access control measures, 10.4.3(b) Secure user authentication protocols, 10.4.3(a) Security protections, 10.4.3(e) Data security binder, 10.4.4(b) Frequently Asked Questions, 10.4.2(b), 10.4.2(c), Exhibit 10A Personal information, Exhibit 10E Practical tips, 10.4.4, Exhibit 10G Requirements, 10.4.1 Resources, Exhibit 10H Massachusetts Destruction/Disposal Law, 10.3 Cleaning of equipment, 10.3.3 Enforcement, 10.3.2 Penalties, 10.3.2 Requirements, 10.3.1 Massachusetts Electronic Surveillance Statute, 2.4 Consent of all parties, 2.4.2

Oral communications, 2.4.1 Video surveillance, 2.4.3 Merger and Acquisition Transactions, Ch. 15 Bankruptcy, transfers of personal data in, 15.3 Deal structure, 15.7 Due diligence, 15.2 Need for, 15.1 Request list for privacy and security, sample, Exhibit 15A Risks of disclosing personal data during, 15.4 Indemnification, 15.6 Agreement provisions, sample, Exhibit 15B Insurance, cyber risk, 15.6 Representations and warranties, 15.5 Agreement provisions, sample, Exhibit 15B Michigan Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Freedom of Information Act, 9.2.2(b) Identity Theft Protection Act, 9.2.1 Internet Privacy Protection Act, 9.2.3(c) Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Social Security Privacy Act, 9.2.2(b) Students, coercion of access to social media, 9.2.3(c) Unsolicited Commercial E-Mail Protection Act, 9.2.3(a) Minnesota Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a)

Internet, protection from data collection on, 9.2.2(d) Phishing protection, 9.2.3(b) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Mississippi, 11.2 Breach notification, 9.2.1 Missouri Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Social Security number protection, 9.2.2(b) Montana An Act Prohibiting an Employer from Requesting Online Passwords or User Names for an Employee’s or Job Applicant’s Personal Social Media Account, 9.2.3(c) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Montana Driver Privacy Protection Act, 9.2.2(a) Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c) Social Security number protection, 9.2.2(b)

N National Association of Insurance Commissioners (NAIC), 5.2 National Conference of Commissioners on Uniform State Laws (NCCUSL), 9.2.3(c), 11.2, 16.4 National Credit Union Administration Enforcement Fair and Accurate Credit Transactions Act, 8.3.1(b) Gramm-Leach-Bliley Act, 5.2.3, Exhibit 5A

Gramm-Leach-Bliley Act regulations, 8.3.2(b), Exhibit 5A National Information Infrastructure Protection Act of 1996, 3.6.5 National Institute of Standards and Technology, 6.4.1(f) Nebraska Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Financial Data Protection and Consumer Notification of Data Security Breach Act of 2006, 9.2.1 Uniform Motor Vehicle Records Disclosure Act, 9.2.2(a) Workplace Privacy Act, 9.2.3(c) Network Service Providers, Ch. 12 Common carriers, as, 12.2.1(c) Consumer of Contract negotiation, 12.3 Representing, 12.1.2 Definition of, 12.1.1 Historical background, 12.1.1 Negotiating contracts with, 12.3 Data audit, 12.3.2(a) Data mapping, 12.3.2(a) Data Security Questionnaire, Exhibit 12A Framework for, setting, 12.3.2 Legal obligations, understanding, 12.3.2(b) Privacy obligations, 12.3.1 Security obligations, 12.3.1 Technical assessment of provider, 12.3.2(c) Representing, 12.1.3 Services offered, 12.1.1 Statutes governing, 12.1.1, 12.2 Communications Decency Act, 12.2.3 Communications Act, 12.2.1(a) Electronic Communications Privacy Act, 12.2.2

Federal law, 12.2.1 Internet regulation, 12.2.1(c) Privacy provisions, 12.2.1(d) Security provisions, 12.2.1(d) State law, 12.2.4 Stored Communications Act, 12.2.2 Telecommunications Act, 12.2.1(b), 12.2.1(d) Nevada Anti-spam legislation, 9.2.3(a) Blockchain records, use of, 16.5 Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Spyware protection, 9.2.3(b) New Hampshire Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Private right of action, 9.3 Revenge porn protection, 9.2.3(e) Spyware protection, 9.2.3(b) Students, coercion of access to social media, 9.2.3(c) New Jersey Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Identity Theft Prevention Act, 9.2.1, 9.2.2(b) Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Students, coercion of access to social media, 9.2.3(c)

New Mexico Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Employees, coercion of access to social media, 9.2.3(c) Phishing protection, 9.2.3(b) Privacy Protection Act, 9.2.2(b) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Students, coercion of access to social media, 9.2.3(c) New York, 11.2 Anti-Phishing Act, 9.2.3(b) Breach notification, 9.2.1 Civil Rights Law, 9.1.2 Information Security Breach and Notification Act, 9.2.1 Internet, protection from data collection on, 9.2.2(d) Internet Security and Privacy Act, 9.2.2(d) New York Deceptive Trade Practices Act, 9.3 Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c) Social Security number protection, 9.2.2(b) Nonpublic Personal Information (NPPI) Disclosure of to third parties, 5.4.4, 8.3.2(b) Protection of, 5.1, 5.3.2 Reuse of, 5.4.3 Safeguards, 8.3.2(a) North Carolina, 11.2 Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Identity Theft Protection Act of 2005, 9.2.1, 9.2.2(b) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b)

North Dakota, 11.2 Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) NPPI See Nonpublic Personal Information (NPPI)

O OCABR See Office of Consumer Affairs and Business Regulation (OCABR) OCR See Office of Civil Rights (OCR) Office of Civil Rights (OCR), 6.1, 6.4.1(b), 6.5.1, 6.5.5, 6.5.6 Office of Consumer Affairs and Business Regulation (OCABR), 9.2.2(c), 10.1, 10.2.5(a) 201 C.M.R. § 17.00 Compliance Checklist, Exhibit 10D Frequently Asked Questions Regarding 201 C.M.R. § 17.00, 10.4.2(b), 10.4.2(c), Exhibit 10A Office of the Comptroller of the Currency Gramm-Leach-Bliley Act regulations, 5.1, 8.3.2(b) Office of Thrift Supervision, 8.3.2(b) Ohio Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Social Security number protection, 9.2.2(b) Oklahoma Anti-Phishing Act, 9.2.3(b) Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1

Employees, coercion of access to social media, 9.2.3(c) Phishing protection, 9.2.3(b) Revenge porn protection, 9.2.3(e) Security Breach Notification Act, 9.2.1 Social Security number protection, 9.2.2(b) Omnibus Crime Control and Safe Streets Act of 1968, 1.3.2, 2.1 Oregon Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Employees, coercion of access to social media, 9.2.3(c) Oregon Consumer Identity Theft Protection Act, 9.2.1 Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Students, coercion of access to social media, 9.2.3(c)

P Patient Protection and Affordable Care Act, 8.3.3 Payment Cards, 13.7 Payment Card Industry Security Standards Council (PCI SSC), 13.7 Pen Register Act, 2.2, 2.3.1, 2.6 Pennsylvania Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Breach of Personal Information Notification Act, 9.2.1 Consumer Protection Against Computer Spyware Act, 9.2.3(b) Revenge porn protection, 9.2.3(e) Social Security Number Privacy Act, 9.2.2(b) Social Security number protection, 9.2.2(b)

Spyware protection, 9.2.3(b) Unsolicited Telecommunication Act, 9.2.3(a) Personal Information See also Nonpublic Personal Information (NPPI) Internet collection of, 9.2.2(d) Security of, 9.2.2 Driver’s license information, 9.2.2(a) Records containing, 9.2.2(c) Social Security numbers, 9.2.2(b) Personally Identifiable Information, 7.4 Petroleum Marketing Practices Act, 8.1 Pharming, 1.2.2(a) State protection laws, 9.2.3(b) Phishing, 1.2.2(a) State protection laws, 9.2.3(b) Postal Reorganization Act, 8.1 Pretexting, 5.1, 5.2.3, 5.5, 8.3.2(c) Criminal penalty, 5.5.5 Enforcement, 5.5.4, 8.3.2(c) Exceptions, 5.5.3 Overview, 5.5.1 Prohibition, 5.5.2 Privacy Behavioral targeting, 1.2.2(b) Consumer issues, 11.4 Data collection, 11.4.1 Use of data collected, 11.4.2 User access to information, 11.4.3 Cyberspace, challenges of, 1.2.2(b) Definitions, 1.2.1(a) Historical background of legal framework, 1.3.1 Legal framework, 1.2.1(a) Right to, 1.1, 9.1.2 Security compared, 1.2.1, 12.3.1 Sources of law on, 1.3 Torts of invasion of, 1.3.2 Protection of Pupil Rights Amendment, 7.4.6 Public Health Service Act, 6.4.1(b)

Public Key Infrastructure, 1.3.4, 16.5

R Records Containing Personal Information See also Massachusetts Data Security Regulations Disposal of FTC regulation, 8.3.1(b) Massachusetts law, 9.2.2(c), 10.3 State laws, 9.2.2(c) Security of, 9.2.2(c) Red Flag Program Clarification Act of 2010, 8.3.1(a), 8.3.1(b) Regulation E, 8.4.2, 11.1 Regulation P, 5.2.2, 8.3.2(b) Regulation S-P, 8.3.2(a), 8.3.2(b) Regulation Z, 8.4.2, 11.1 Restore Online Shoppers’ Confidence Act, 8.1 Retail Issues See Consumer Issues Revenge Porn Civil remedies, uniform, 16.4 Communications Decency Act, 7.2.6 State laws, 9.2.3(e), 16.4 Revised Uniform Fiduciary Access to Digital Assets Act, 16.4 Rhode Island Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Consumer Empowerment and Identity Theft Prevention Act of 2006, 9.2.1 Driver’s license information protection, 9.2.2(a) Electronic Mail Fraud Regulatory Act, 9.2.3(a) Employees, coercion of access to social media, 9.2.3(c) Internet Misrepresentation of Business Affiliation Act, 9.2.3(a) Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c)

Rhode Island Identity Theft Protection Act of 2015, 9.2.1 Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) Students, coercion of access to social media, 9.2.3(c) Right to Financial Privacy Act of 1978, 5.4.1 Risk Management, 14.7 See also Cyber Risk Insurance Employee devices, 14.7.5 Employee training, 14.7.8 Evaluation Electronically stored information, 14.7.1 Insurance programs, 14.7.7 Network infrastructure and security, 14.7.2 Media publication risks, 14.7.3 Protocols Data breach response and notification, 14.7.5 Data security, 14.7.4 Regulatory compliance, 14.7.9 Robinson-Patman Act, 8.1

S Sarbanes-Oxley Act of 2002, 5.5.3 SCA See Stored Communications Act (SCA) SEC See Securities and Exchange Commission (SEC) Securities Act of 1933, 5.5.3 Securities and Exchange Commission (SEC) Authority, 5.2.1, 5.2.2, 5.2.3, Exhibit 5A Enforcement Fair and Accurate Credit Transactions Act, 4.2.5, 8.3.1(b) Gramm-Leach-Bliley Act, 5.1, 5.6.1 Gramm-Leach-Bliley Act regulations, 8.3.2(b)

Securities Exchange Act of 1934, 5.5.3, 8.3.2(c), Exhibit 3A Securities Investor Protection Act of 1970, 5.5.3

Security, 9.1.2, 12.3.1 Cyberspace, challenges of, 1.2.2(a) Definitions, 1.2.1(b) Nature of data security, 1.2.1(b) Privacy compared, 1.2.1 Sexual Content CAN-SPAM See Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act Revenge porn See Revenge Porn Sherman Act of 1890, 8.1 Smart Contract Technology, 16.5 Social Media Coercion of access to, 9.2.3(c) Contracting, 11.2 Data collection, 11.4.1 Privacy issues, 11.4 Use of data collected, 11.4.2 User access to information, 11.4.3 Social Security Act, Exhibit 8C Social Security Numbers, 9.2.2(b) South Carolina Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Family Privacy Protection Act, 9.2.2(b) Financial Identity Fraud and Identity Theft Protection Act, 9.2.1 Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Social Security number protection, 9.2.2(b) South Dakota An Act to Restrict Unsolicited Commercial Electronic Mail Advertisements, 9.2.3(a) Anti-spam legislation, 9.2.3(a) Driver’s license information protection, 9.2.2(a) Identity theft protection, 9.2.1 Revenge porn protection, 9.2.3(e) Spam See also Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act State protection laws, 9.2.3(a) Spoofing State protection laws, 9.2.3(b)

Spyware, 1.2.2(a) State protection laws, 9.2.3(b) State Laws, Ch. 9, 1.3.7 See also specific states “Baby FTC” acts, 9.3 Breach notification, 9.2.1, 9.3 Common law, 9.1.2 Constitutions, 9.1.1 Current developments, 16.4 Foundations of, 9.1 Common law, 9.1.2 Constitutions, 9.1.1 Litigation, 9.3 Personal information security, 9.2.2 Driver’s license information, 9.2.2(a) Internet collection of data, 9.2.2(d) Records containing, 9.2.2(c) Social Security numbers, 9.2.2(b) Phishing protection laws, 9.2.3(b) Revenge porn protection, 9.2.3(e), 16.4 Social media, coerced access to, 9.2.3(c) Spam prevention laws, 9.2.3(a) Spyware protection laws, 9.2.3(b) Stored Communications Act (SCA), 2.2, 2.3.1, 2.5, 2.6, 12.2.2, 16.2 Communications in transit compared, 2.3.3 Consent to disclosure, 2.5.9 Content information, 2.5.5 Discovery under, 2.5.9 E-mails, access to, 2.3.3 Entities subject to, 2.5.3 Fourth Amendment, interaction with, 2.5.8

History of, 2.5.1 Massachusetts Constitution, interaction with, 2.5.7 Noncontent information, 2.5.5 Nonpublic providers, 2.5.4 Privacy protections of, 2.5.6 Compelled disclosure, 2.5.6(a), 2.5.6(c) Voluntary disclosure, 2.5.6(b), 2.5.6(c) Public providers, 2.5.4 Unauthorized access prohibited, 2.5.2 Video surveillance, 2.3.4 Surveillance See Electronic Communications Privacy Act of 1986 (ECPA); Massachusetts Electronic Surveillance Statute; Pen Register Act; Stored Communications Act (SCA); Wire and Electronic Communications Interception and Interception of Oral Communications Act

T TCFAPA See Telemarketing and Consumer Fraud and Abuse Prevention Act (TCFAPA) Telecommunications Act of 1996, 12.2.1(c) Security provisions, 12.2.1(d) Service providers, distinguishing between types of, 12.2.1(b) Telemarketing and Consumer Fraud and Abuse Prevention Act (TCFAPA), 8.1, 8.4.2, 8.4.3 Enforcement, 8.4.2 Telemarketing Sales Rule, 8.4.2 Telemarketing and Telephone Consumer Protection Act, Exhibit 15B Telephone Disclosure and Dispute Resolution Act of 1992, 8.1 Tennessee Anti-Phishing Act, 9.2.3(b) Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1

Credit Security Act of 2007, 9.2.1, 9.2.2(b) Driver’s license information protection, 9.2.2(a) Employee Privacy Protection Act of 2014, 9.2.3(c) Employees, coercion of access to social media, 9.2.3(c) Phishing protection, 9.2.3(b) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Tennessee Identity Theft Deterrence Act of 1999, 9.2.1 Uniform Motor Vehicle Records Disclosure Act, 9.2.2(a) Texas Anti-Phishing Act, 9.2.3(b) Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Consumer Protection Against Computer Spyware Act, 9.2.3(b) Identity Theft Enforcement and Protection Act, 9.2.1 Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware protection, 9.2.3(b) Trans-Alaska Pipeline Act of 1973, 8.2 Trust Indenture Act of 1939, 5.5.3 Truth in Lending Act, 1.3.3, 8.1, 8.3.1(a), 8.4.2, 9.2, 11.1

U U.S. Department of Commerce National Institute of Standards and Technology (NIST), 13.5 Uniform Commercial Code (UCC), 1.3.3 Uniform Computer Information Transactions Act (UCITA), 11.2

Uniform Electronic Transactions Act (UETA), 1.3.4, 11.2 Uniform Employee and Student Online Privacy Protection Act (UESOPPA), 9.2.3(c), 16.4 Uniform Fiduciary Access to Digital Assets Act (2014), 11.3.2, 16.4 Uniting and Strengthening America by Fulfilling Rights and Ensuring Effective Discipline Over Monitoring (USA FREEDOM) Act of 2015, 1.2.2(b), 16.1 Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act of 2001, 1.2.2(b), 3.6.6, 8.4.2, 16.1 USA PATRIOT Improvement and Reauthorization Act of 2005, 8.3.1(a) USA PATRIOT Improvement and Reauthorizing Amendments Act of 2005, 8.3.1(a) Utah Breach notification, 9.2.1 Consumer Credit Protection Act, 9.2.1, 9.2.2(b) Employees, coercion of access to social media, 9.2.3(c) Internet Employment Privacy Act, 9.2.3(c) Internet Postsecondary Institution Privacy Act, 9.2.3(c) Notice of Intent to Sell Nonpublic Personal Information Act, 9.2.2(b) Phishing protection, 9.2.3(b) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Spyware Control Act, 9.2.3(b) Spyware protection, 9.2.3(b) Students, coercion of access to social media, 9.2.3(c) Utah E-Commerce Integrity Act, 9.2.3(b)

V Vendor Issues See Business Associates Issues Vermont, 11.2 Breach notification, 9.2.1 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Social Security Number Protection Act, 9.2.2(b) Video Surveillance Massachusetts law, 2.4.3 Stored Communications Act, 2.3.4 Wiretap Act, 2.3.4 Virginia, 11.2 Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Employees, coercion of access to social media, 9.2.3(c) Internet, protection from data collection on, 9.2.2(d) Phishing protection, 9.2.3(b) Private right of action, 9.3 Protection of Social Security Numbers Act, 9.2.2(b) Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Students, coercion of access to social media, 9.2.3(c) Virginia Computer Crimes Act, 9.2.3(a) Viruses, 1.2.2(a)

W Washington, 11.2 Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Employees, coercion of access to social media, 9.2.3(c) Private right of action, 9.3 Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Spyware protection, 9.2.3(b)

West Virginia, 11.2 Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Driver’s license information protection, 9.2.2(a) Electronic Mail Protection Act, 9.2.3(a) Employees, coercion of access to social media, 9.2.3(c) Enforcement, 9.3 Revenge porn protection, 9.2.3(e) Social Security number protection, 9.2.2(b) Uniform Motor Vehicle Records Disclosure Act, 9.2.2(a) Wire and Electronic Communications Interception and Interception of Oral Communications Act, 1.3.2, 2.1, 2.2, 2.3, 2.6 Communications covered, 2.4.1 Communications in transit, stored communications compared, 2.3.3 Exceptions, 2.3.2 Interception of communications, 2.3.1 Massachusetts surveillance law compared, 2.4 Video surveillance, 2.3.4, 2.4.3 Wi-Fi communications, 2.3.1 Wiretap Report 2016, Exhibit 2A Wiretap Act See Wire and Electronic Communications Interception and Interception of Oral Communications Act

Wisconsin Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Data security breach protection, 9.3 Employees, coercion of access to social media, 9.2.3(c) Records, security and disposal of, 9.2.2(c) Revenge porn protection, 9.2.3(e) Students, coercion of access to social media, 9.2.3(c) WISP See Written Information Security Program (WISP) Wool Products Labeling Act of 1939, 8.1 Worms, 1.2.2(a) Written Information Security Program (WISP), 10.4.1, 10.4.2 Data security coordinator, 10.4.2(a) Employee policies, 10.4.2(c) Guide to writing, Exhibit 10B Restrictions, 10.4.2(f) Risk assessment, 10.4.2(b) Safeguard improvement, 10.4.2(b) Terminated employees, 10.4.2(d) Third-party service providers, 10.4.2(e) Wyoming Anti-spam legislation, 9.2.3(a) Breach notification, 9.2.1 Spyware protection, 9.2.3(b)
