Ninth Circuit Rules on Meaning of “Without Authorization” under Computer Fraud and Abuse Act

Last month, the Ninth Circuit affirmed the criminal conviction of an individual for accessing a computer “without authorization” in violation of the Computer Fraud and Abuse Act (“CFAA”).  U.S. v. Nosal (9th Cir., July 5, 2016).

The CFAA imposes criminal penalties on whoever “accesses a protected computer without authorization, or exceeds authorized access . . .” 18 U.S.C. § 1030.

“Without authorization”, the court ruled, means “accessing a protected computer without permission.”

In Nosal, a former employee of an executive search firm conspired with former colleagues to obtain confidential source lists and client contact data to start a competing search firm. Among other counts, Nosal was charged with violating the CFAA. The question before the three-judge panel was whether Nosal conspired to access a protected computer “without authorization” when he and his accomplices used the login credentials of Nosal’s former assistant to access the search firm’s proprietary information. Affirming Nosal’s conviction, the court held that Nosal did act “without authorization” when he continued to access data after his former employer rescinded permission to access its computer system.

Judge Reinhardt issued a vehement dissent in the case. He argued that the case was about simple password sharing, and that the majority opinion “threatens to criminalize all sorts of innocuous conduct engaged in daily by ordinary citizens.” He emphasized that the CFAA is a criminal statute and must be construed more narrowly than a civil statute. A better interpretation of “without authorization”, he urged, is accessing a computer without the permission of either the system owner or a legitimate account holder. Though the facts of this case were distasteful, he noted that the court’s ruling had broader legal implications and thus he could not endorse the majority’s opinion.

Interestingly, this is the second time the Ninth Circuit has interpreted provisions of the CFAA in this case, and that earlier decision touched on the very issues the dissent addressed. In the prior opinion, an en banc panel of the Ninth Circuit ruled on the meaning of “exceeds authorized access.” The court held that “exceeds authorized access” applies to restrictions on access to information, not to restrictions on how that information is used. Accordingly, the court determined that Nosal did not violate the CFAA when he had his former colleagues access information from the firm’s confidential databases and send it to him, because those colleagues were authorized to access the data. The CFAA, the court ruled, was not intended to impose criminal liability for violations of private computer use policies. The en banc panel construed the statute narrowly, “so that Congress will not unintentionally turn ordinary citizens into criminals.”

The majority’s recent opinion will likely not be the last word on this issue. Nosal will be filing a petition for rehearing and rehearing en banc. If the full court takes up the question, it remains to be seen whether Judge Reinhardt’s narrow construction of the term “without authorization” will be adopted.

Regulators Nationwide Weigh in on CPUC Litigation

In May, we posted a blog on litigation filed by telecom providers and trade associations to prevent the California Public Utilities Commission (CPUC) from requiring Plaintiffs to turn over competitively sensitive data to a third party. Plaintiffs allege that disclosure of that data would violate regulations issued by the Federal Communications Commission (FCC) regarding the confidential status of that information. There is now a new party at the table. The National Association of Regulatory Utility Commissioners (NARUC) filed an amicus brief asking the Court to side with the CPUC, and permit disclosure of the data.

For background, after the complaint was filed, the Northern District of California granted Plaintiffs’ motion for a preliminary injunction, blocking the CPUC from sharing the companies’ ostensibly sensitive data with the third party. That order rested on the Court’s findings that the telecom Plaintiffs were likely to prevail on their argument that the CPUC order is preempted by FCC regulations, and that they had “overwhelmingly” demonstrated they would face irreparable harm if disclosure occurred.

NARUC then filed its amicus brief requesting that the Court deny Plaintiffs’ claims and permit the CPUC order to stand. NARUC framed the question before the Court as follows:

[D]o States have the ability to obtain and to use under state law broadband data, including granular, disaggregated, carrier-specific subscription data, which telecommunications carriers may (or may not) also submit to the FCC on the FCC’s Form 477?

NARUC argues in the affirmative. In particular, NARUC cites a 2010 opinion and order from the FCC that concluded the collection and use of broadband data by states was not preempted by federal law. NARUC also points to the great need for disclosure of data by regulated entities to state regulators, suggesting that it would be poor policy to prevent public utility regulators from accessing the data at issue in this litigation.

Interestingly, NARUC’s amicus brief focuses on the preemption argument and does not attempt to address or reconcile the impetus for the litigation—privacy concerns.

This litigation highlights two areas of tension in the privacy sphere. The first is the tension between ensuring privacy and data security and conducting regulatory activities, whether for the promotion of health, safety, or environmental wellbeing. The second tension is whether privacy protections should stem from the federal government, the states, or the industry itself. Without clear guidelines governing how to balance these competing policies, courts are often asked to decide significant privacy questions through legal doctrine and not the substance of the implicated rights. That is the case here, where the Court is left to grapple with a traditional legal question—whether the FCC’s regulations preempt state regulatory actions—rather than the propriety of requiring disclosure of sensitive information in the context of a public utility regulatory proceeding.

Despite Adoption of EU-US Privacy Shield, Uncertainties Remain

You may recall our previous posts about the drafting and negotiation of the EU-US Privacy Shield, the framework that sets specific privacy standards for companies sharing information between the European Union and the United States.  The Privacy Shield has now been adopted and will go into effect on August 1.  After that date, companies can apply for Privacy Shield certification.  While it may seem time to rejoice for consumers and for American companies doing business overseas, quite a few questions about the new framework remain.

First, it is uncertain whether the Privacy Shield will be able to withstand judicial scrutiny.  The prior framework governing information sharing between the United States and the European Union, the EU-US Safe Harbor program, was invalidated by the Court of Justice of the European Union, necessitating the development of the Privacy Shield in the first place.  Certainly the new framework, which the Wall Street Journal has described as “more robust than its predecessor”, was drafted to comport with the EU’s recent General Data Protection Regulation and with that prior ruling in mind.  Nonetheless, there will be legal challenges to the Privacy Shield by consumers.  United States laws on privacy and data security get tougher by the day, but they remain market friendly and fall short of the consumer protections offered by European courts.  Based on that history, one cannot say with full confidence that the new framework will survive a legal challenge.

Second, the recent Brexit vote complicates the situation.  Will a framework similar to the Privacy Shield be adopted in Great Britain?  If not, will the US and Great Britain adopt different standards to govern the transfer of information?  Will those standards be higher or lower than the EU’s?  And what impact will discussions between the EU and Great Britain in the privacy realm have on United States companies?  Presently, these questions remain unanswered, although one can certainly presume that some standards will be negotiated and put in place in the near future.

Despite those questions, the adoption of the Privacy Shield still provides companies with some comfort because they now know the standards under which they may operate.  With respect to the EU, companies may now do away with whatever patchwork solution they crafted after the Safe Harbor program was invalidated, and they know the framework for any information sharing going forward.  In addition, those standards will mesh better with existing EU privacy laws.  Companies that do business in Europe and seek Privacy Shield certification should take this as a signal to review their policies and procedures on the storage and transfer of information.

White House Commission on Cybersecurity Seeks Input from Stakeholders on Future of Digital Landscape

Last week, the White House Commission on Enhancing National Cybersecurity held its third of six scheduled meetings around the nation.  President Obama established the Commission by Executive Order in February 2016 with the goal of “recommending bold, actionable steps that the government, private sector, and the nation as a whole can take to bolster cybersecurity in today’s digital world.”

Key private and public stakeholders were invited to be panelists to discuss three main topics: Addressing Security Challenges to the Digital Economy; Collaborating to Secure the Digital Economy; and Innovating to Secure the Future of the Digital Economy.   Among the invited panelists were executives and security officers from Google, Facebook, the University of California, The William and Flora Hewlett Foundation, and Kleiner Perkins Caufield & Byers, who provided their input and recommendations on research & development, technology, and innovation.

One of the key points addressed by the panel was the trade-off between mandated security standards and innovation.  The Commission asked panelists how companies can be incentivized to keep investing in security instead of simply meeting the minimum standards.  Weighing safety against costs is an issue for any company, but that balancing is particularly pronounced in the fast-moving technology industry where it could be prohibitively expensive to ensure security standards keep up with evolving technology.  In response, a few panelists recommended the creation of a new federal agency to oversee cybersecurity.

Throughout the day, the panelists emphasized the need for education about cyber threats and security, both for current consumers and for the current and future workforce.  Panelists also called for greater transparency and information sharing about cyber attacks and user data requests, and for greater international collaboration.

One particularly interesting portion of the day occurred when the Commission asked panelists what the next “cyber-Pearl Harbor” might be, quoting former Secretary of Defense Leon Panetta.  One panelist explained that he was “terrified” of the growth of the Internet of Things, and called for technology to be maintained and secured for the lifetime of the product it is attached to, which in many cases, such as cars, is 10 years or more.  Another panelist warned about increasing machine intelligence.

Ultimately, as one panelist commented, technology alone will not solve cyber threats; there will also need to be smart policy to both solve the crises of the moment and plan for the future.

An audio recording of the meeting is available from UC Berkeley’s Center for Long-Term Cybersecurity.

The Commission will deliver its final report to the President by December 1, 2016.

Data Breach Response Costs Continue to Rise

SEC Chair Mary Jo White recently opined that cyber security is the biggest risk facing the United States financial system.  Companies should take heed of that warning in light of the release of the 2016 Cost of Data Breach Study by IBM and the Ponemon Institute, which showed that average response costs for data breaches continue to rise.

According to that study, the total average cost of data breach incidents to companies located in the United States increased from $6.53 million to $7.01 million between 2014 and 2015.  That represents an approximate 7% increase in the average cost of responding to a data breach.  The study observed that, over its 11-year history, there has not been significant fluctuation in response costs.  While this might seem comforting, IBM and the Ponemon Institute note that this indicates another chilling likelihood: “it is a permanent cost organizations need to be prepared to deal with and incorporate in their data protection strategies.”  In addition, the biggest financial consequence to organizations that have experienced a breach is not the cost of responding to the data breach (which is onerous), but the loss of business.
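For readers who want to check the arithmetic behind the roughly 7% figure, here is a minimal sketch in Python of the year-over-year calculation, using only the two averages reported in the study:

```python
# Year-over-year change in the average U.S. data breach response cost,
# using the averages reported in the 2016 Cost of Data Breach Study.
prior_avg = 6.53e6    # prior-year average total cost, in USD
current_avg = 7.01e6  # current-year average total cost, in USD

pct_increase = (current_avg - prior_avg) / prior_avg * 100
print(f"Average response cost rose {pct_increase:.1f}%")  # about 7.4%, i.e. roughly 7%
```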

This sobering analysis supports Chair White’s concerns about data security risks and underscores the severity of the problem.  Given the staggering cost of data breaches, a firm should plan well in advance to mitigate its exposure to such costs.  For example, the study also found that while half of all data breaches resulted from a malicious attack, “23 percent of incidents were caused by negligent employees, and 27 percent involved system glitches that included both IT and business process failures.”  That means the other half of breaches might have been avoided with proper training and IT protocols.  Given that data breaches aren’t going to go away, companies should consider the factors noted in the study that decrease the cost of responding to data breaches, including “[i]ncident response plans and teams in place, extensive use of encryption, employee training, BCM involvement or extensive use of DLP” (that is, business continuity management and data loss prevention).  Companies may also wish to consider cyber-insurance policies to help bear the costs associated with data breaches.  But given the alarming costs detailed in the study, companies should keep in mind the old adage that an ounce of prevention is worth a pound of cure.

SEC Fines Morgan Stanley for Data “Breach”

Last week, the SEC fined Morgan Stanley $1 million for two data security failures that resulted in the exposure of personal information of 730,000 of its customers. Andrew Ceresney, the SEC’s director of enforcement, stated, “Given the dangers and impact of cyberbreaches, data security is a critically important aspect of investor protection…we expect SEC registrants of all sizes to have policies and procedures that are reasonably designed to protect customer information.” But this wasn’t your typical data breach situation, and it offers an important lesson for companies.

That data “breach” occurred when a former employee downloaded customer records, which were then hacked and posted on-line. That former employee has been suspended by the SEC, but the fine against Morgan Stanley is the real story, because this was not a breach of the company’s defenses.

The SEC’s fine was based on two control failures.

First, the SEC asserted that Morgan Stanley failed to prevent its employees from accessing customer records they were not authorized to view. While there were some controls in place to prevent such access in its internal data portals, a glitch permitted Morgan Stanley employees to access all customer records through a certain type of report.

Second, while the company had a control in place to prevent the use of thumb drives to download customer records, there was no corresponding control for uploading records on-line. Combined, those holes allowed one employee to access customer records and upload them to the internet.

This is an important lesson for companies because, unlike most breaches we hear about in the news, this was not a hack of the company itself. Morgan Stanley was fined for its employee’s bad acts.  Would the SEC fine Morgan Stanley if one of its employees printed customer records and then left them in a briefcase on a subway train? Perhaps.

The takeaway for companies is that they may be liable for cyber theft even where their own external defenses are in good shape and have not been breached. Time to brush up on internal policies and protocols.

Courts Continue to Grapple with Data Breach Claims

Our last few blogs have focused on litigation under the Video Privacy Protection Act, including the recent ruling from the First Circuit in Yershov v. Gannett Satellite Information Network, Inc., 2016 U.S. App. LEXIS 7791 (1st Cir. Apr. 29, 2016).  Elsewhere, courts have been trying to figure out what to do with consumer data breach lawsuits. In the multi-district class action arising from the enormous data breach at Anthem Blue Cross, Judge Lucy Koh of the Northern District of California recently issued an order denying in part a motion to dismiss, allowing claims relating to that breach to survive.

That lawsuit arose after Anthem Blue Cross suffered a massive data breach in which 80 million of its users had their data hacked and compromised.  Naturally, many lawsuits followed, naming upwards of 40 entities affiliated with Anthem Blue Cross.  The plaintiffs alleged over a dozen state and federal claims, but those myriad causes of action boil down to a simple premise: the plaintiffs would like to be compensated for costs associated with the data breach.  Anthem Blue Cross asserted a number of defenses, two of which are of particular interest.

First, on this motion Anthem Blue Cross contended that the plaintiffs had no connection with Anthem Blue Cross that would provide standing for a breach of contract claim.  Judge Koh rejected this argument.  For the plaintiffs proceeding under California law, she accepted the argument that individual or group insurance policies sufficed to meet that standard.  More interesting was her ruling on the New Jersey breach of contract claims, where she stated that a privacy policy referenced in an informational booklet provided to a consumer was sufficient.  That ruling would seemingly give consumers quite a bit of latitude to allege a contractual relationship with entities that have suffered a data breach.

Second, Anthem Blue Cross, like other data breach defendants, has argued that consumers lack standing because no one has actually been hurt or damaged as a result of the data breach.  This argument has been bolstered by the fact that Anthem Blue Cross has been bearing all of the costs and fees to date for credit monitoring and other services.  The argument is worth noting because the issue of standing and damages in the data breach context has been a difficult one for courts to resolve.  While the claims in the Anthem Blue Cross litigation survived, other courts weighing this issue have gone the other way.  For example, in a recent Third Circuit decision, Storm, et al. v. Paytime Inc., the Court followed its prior Reilly v. Ceridian decision in affirming a district court ruling that the class plaintiffs did not have standing until the stolen data had been used.  The important takeaway on this point is that there really is no single standard.

So what is the right answer?  Should Anthem Blue Cross be forced to pay damages that may be hypothetical?  But isn’t it likely that, at some point, those consumers will be harmed?  And must those consumers really wait until something bad happens and then file litigation piecemeal?  This dilemma will continue to cause headaches for judges.

First Circuit Decision Creates Ambiguity in Consumer VPPA Class Actions

The vast majority of courts confronted with “free app” cases under the Video Privacy Protection Act (“VPPA”) have dismissed those claims.  A recent First Circuit decision, however, signals a change in that trend (http://media.ca1.uscourts.gov/pdf.opinions/15-1719P-01A.pdf).

The District Court Dismisses the Complaint

In Yershov v. Gannett Satellite Information Network, Inc., 2016 U.S. App. LEXIS 7791 (1st Cir. Apr. 29, 2016), the Plaintiff filed a putative class action under the VPPA against Gannett, an international media company that produces, among other things, online content through a mobile software application called the USA Today Mobile App (the “App”).   The Plaintiff alleged that each time he viewed a video on the App, Gannett sent information about his viewing habits to a third-party data analytics company, including the videos he watched, the GPS coordinates of his phone at that time, and his unique Android ID.  Using this information, the data analytics company would create a “digital dossier” to help Gannett create targeted advertisements.

The District Court dismissed the Complaint. While it found that the information collected by Gannett was “personally identifiable information” (“PII”), it determined that Plaintiff was not a “consumer” under the VPPA in large part because the App was free to download.  The District Court’s ruling followed a number of decisions, including the Eleventh Circuit’s ruling in Ellis v. Cartoon Network, Inc., 803 F.3d 1251 (11th Cir. 2015).

The First Circuit Reverses

The First Circuit reversed the District Court ruling, and remanded the case for further proceedings. The First Circuit agreed with the District Court that the information collected constituted PII.  Critical to the Court’s determination was the collection of Yershov’s GPS coordinates and his unique Android ID.  The Court imagined a situation in which Gannett disclosed a user’s video history on a single device at two sets of specified GPS coordinates.  In light of “how easy it is to locate a GPS coordinate on a street map, this disclosure would enable most people to identify what are likely the home and work addresses of the viewer.” Id. at *8.

The First Circuit reversed the lower court’s decision because it determined that Plaintiff qualified as a “consumer” under the VPPA. See  18 U.S.C. § 2710(a)(1) (“consumer” is defined to mean “any renter, purchaser, or subscriber of goods or services from a video tape service provider”).  The court determined that Plaintiff was a “subscriber” within the meaning of the VPPA, and declined to interpret that provision as requiring some form of monetary payment. Id. at *13.

The Yershov court distinguished this case from the Ellis ruling, where the Eleventh Circuit defined the term “subscriber” to “involve some or [most] of the following [factors]: payment, registration, commitment, delivery, [expressed association,] and/or access to restricted content.” Ellis, 803 F.3d at 1256.  In contrast, the Yershov court noted that the Plaintiff was required to provide personal information (his Android ID and GPS location), so while he did not pay for the use of the App, “access was not free of a commitment to provide consideration in the form of that information, which was of value to Gannett.” Id. at *14.

What does this mean for future consumer class actions under the VPPA? It certainly seems that this gives mobile app user plaintiffs a big leg up.  After all, any plaintiff could argue that they were not “free of commitment” when downloading a mobile app, whether they paid for it or not.  Perhaps courts will look at this decision as fact specific, and determine that it only applies where phone identification and GPS records are collected.  But we’ll just have to wait and see.

California Telecom Providers File Suit to Protect Competitively Sensitive Information

On May 5, AT&T Mobility along with several other telecommunications providers and trade associations filed a complaint in the Northern District of California against the California Public Utilities Commission (“CPUC”) Commissioners and an Administrative Law Judge (“ALJ”). The complaint alleges that the Assigned Commissioner and ALJ improperly ordered Plaintiffs to turn over competitively sensitive data to a third party in violation of the Supremacy Clause of the U.S. Constitution and the Fourth Amendment protection against unreasonable searches and seizures.  This is an important case because it could impact the scope of information the CPUC, and other government agencies in California, may compel from the entities they regulate.

The Federal Communications Commission requires broadband, wireless, and telephone companies to periodically collect and report specific information to the FCC.  In particular, the FCC created “Form 477” to collect data regarding the availability and characteristics of a provider’s services and its subscription levels.  The FCC recognizes that Form 477 requires highly detailed information and that disclosure of such information could allow a competitor to gain an unfair market advantage.  Thus, the FCC has determined that the confidentiality of such information is necessary and does not share the information with state public utility commissions unless a commission agrees to abide by FCC-prescribed confidentiality rules.  One of those rules is that the state agency will not disclose any confidential information. See 47 C.F.R. § 1.7001(d).

As part of a CPUC ratemaking proceeding, a third party, The Utility Reform Network (“TURN”), requested Form 477 data and market share data through a discovery request.  Over Plaintiffs’ objections, the Assigned Commissioner and ALJ granted TURN’s motion to compel and ordered the production of that data subject to a protective order.

The question, then, is whether requiring disclosure of the information in question violates the FCC’s confidentiality rules.  Plaintiffs say yes.  Defendants and TURN say no.

Plaintiffs assert that this is a case of clear conflict preemption: the Assigned Commissioner and ALJ issued a ruling that conflicts with, and is therefore preempted by, federal law.  Moreover, Plaintiffs assert that compelling disclosure of such information would violate their Fourth Amendment rights because the CPUC lacks the jurisdiction to compel such disclosure and because the information is not “reasonably relevant” to the ratemaking proceeding.

Defendants, however, argue that the disclosure would be solely to TURN pursuant to a strict protective order.  Defendants assert that disclosure is permissible because the FCC’s regulatory regime takes the form of “cooperative federalism”, which leaves states significant freedom to achieve the goals of the Federal Telecommunications Act—to monitor and promote competition for telecom services.  Defendants also argue that the CPUC is required to monitor competition in California, and the data requested is essential to fulfill that purpose.

Finally, Defendants state that courts will not presume that information sought for the purpose of a government investigation will be put to improper use; thus, Plaintiffs have not established that there is a violation of the Fourth Amendment.

The Northern District heard Plaintiffs’ application for a temporary restraining order today.  How this litigation plays out could impact the relationship between federal rules governing the confidentiality of certain information and overlapping state rules. Stay tuned.
