*This article was adapted from “Global Overview,” appearing in The Privacy, Data Protection and Cybersecurity Law Review (7th Ed. 2020) (Editor: Alan Charles Raul), published by Law Business Research Ltd., and first published in the International Association of Privacy Professionals Privacy Perspectives series on September 28, 2020.*
Privacy, like everything else in 2020, was dominated by the COVID-19 pandemic. Employers and governments have been required to consider privacy in adjusting workplace practices to account for who has a fever and other symptoms, who has traveled where, who has come into contact with whom, and which community members have tested positive or been exposed.
As a result of all this need for tracking and tracing, governments and citizens alike have recognized the inevitable trade-offs between an exclusive focus on privacy and an exclusive focus on public health and safety.
Even the European Data Protection Board conceded that data protection measures, like the EU General Data Protection Regulation, “do not hinder measures taken in the fight against the coronavirus pandemic. The fight against communicable diseases is a valuable goal shared by all nations and … [i]t is in the interest of humanity to curb the spread of diseases and to use modern techniques.”
Accordingly, the European Data Protection Board agreed that an “[e]mergency is a legal condition which may legitimise restrictions of [privacy and data protection] freedoms provided these restrictions are proportionate and limited to the emergency period.”
And while privacy is not considered an absolute right in any jurisdiction, it is important to acknowledge that no democratic country has taken the position or acted in a manner suggesting that individual privacy is irrelevant or dispensable — even during the pandemic emergency. To the contrary, privacy rights have been taken into account in fighting the virus nearly everywhere.
But COVID-19 was not the only shock to the privacy system in 2020. On July 16, the Court of Justice of the European Union invalidated the Privacy Shield framework agreed between the European Commission and U.S. Department of Commerce to facilitate data flows to the United States. While the CJEU upheld the validity of using standard contractual clauses to transfer personal data to countries not yet deemed “adequate” by the EU, such as the United States, the CJEU imposed considerable new obligations on organizations looking to transfer data and for data protection authorities to consider when approving such transfers.
Specifically, the court agreed with Austrian privacy advocate Max Schrems that there is a theoretical possibility that users of social media networks could have their communications secretly transferred to the U.S. National Security Agency without the benefit of privacy protections available in Europe.
Never mind that there is no proof or reason to believe that the NSA is interested in collecting communications to or from average European social media users — in other words, from anybody other than a terrorist, spy or hostile actor. And never mind that the privacy protections and legal redress opportunities under U.S. surveillance laws are considerably stronger than those EU Member states accord their own citizens. In fact, Presidential Policy Directive 28 requires the NSA and other U.S. agencies involved in signals intelligence to protect the privacy of persons outside the United States in a manner reasonably comparable to the protections U.S. citizens receive. And, notably, the independent Privacy and Civil Liberties Oversight Board provided a thorough assessment of the implementation of PPD-28 on Oct. 16, 2018.
Remarkably, the CJEU deemed U.S. surveillance safeguards and remedies to be less than “essentially equivalent” to those of the EU without comparing EU member state surveillance laws or practices to those of the U.S. at all. While the CJEU’s assessment of U.S. surveillance safeguards was superficial and woefully incomplete, its treatment of EU surveillance was entirely absent. The CJEU did not so much as ask whether any EU member state has an oversight body to examine and judge the privacy or civil rights implications of electronic surveillance the way the PCLOB and the Foreign Intelligence Surveillance Court do — with full national security clearance to access the deepest secrets of signals intelligence.
Though the CJEU’s decision may seem frivolous (to say nothing of dangerous) to many observers, its potential impact on trillions of dollars of trans-Atlantic trade is anything but. The European Commission and U.S. Department of Commerce are publicly committed to solving this judicial conundrum. In the meantime, companies are going through a metaphysical process of trying to demonstrate (to themselves, data protection authorities, and ultimately maybe Max Schrems) that, under the Schrems decision, they can transfer personal data to the United States — and other non-“adequate” countries — without such data being disproportionately accessible to national security surveillance by the data importing country. In other words, there are no real standards at all.
In contrast, the United States provided a model for international comity and respect for the rule of law in the 2018 Clarifying Lawful Overseas Use of Data Act. The act authorizes U.S. communications service providers to produce the contents of electronic communications they store outside the United States in response to U.S. legal requests, and to do the same for foreign governments that have entered into executive agreements with the United States (which, so far, only the U.K. has done, effective July 2020).
Importantly, the CLOUD Act requires foreign governments wishing to enter into such agreements to demonstrate their respect for the rule of law and for international human rights, privacy, free speech rights, data minimization (equivalent to what the United States applies under the Foreign Intelligence Surveillance Act), the principles of non-discrimination, accountability, transparency, independent oversight and numerous other detailed safeguards. Moreover, where the requested data concerns a non-U.S. person who is outside the United States, the relevant communications service provider is authorized to file a motion in federal court to quash the government request. The provider may do so if it believes that the laws of the U.S. and of the foreign government, including their privacy laws, conflict with respect to the government’s demand for the communications of its customers. Once the motion has been filed, the court must conduct a detailed “comity” analysis to resolve the conflict of international laws concerning data privacy rights.
The legal standards for such comity analysis are set forth with considerable specificity in the CLOUD Act. Courts must consider the nature of the legal conflict at stake, the materiality of the alleged violation of the foreign law, the respective interests of the two countries in the matter at hand, the contacts of the service provider and the individual in question with the United States, and the importance of the individual’s information to the criminal or national security interest at issue.
The thoughtful nature of this comity analysis required by the CLOUD Act is not reciprocated in the CJEU’s “Schrems II” decision or the EU General Data Protection Regulation. Europe and the rest of the world would be well served to study the U.S. model of safeguards, checks and balances, independent oversight and international comity for government access to electronic communications.
In any event, privacy developments in the United States have not been all about government access to information. In this past year and a half, U.S. regulators and litigators have obtained the largest fines and legal awards ever collected for alleged violations of privacy and data security requirements.
The U.S. Federal Trade Commission has obtained a $5 billion settlement and imposed unprecedented corporate governance requirements following an investigation of the Cambridge Analytica affair. And private plaintiffs have obtained a settlement of more than half a billion dollars in connection with alleged violation of state biometric information privacy law. The FTC and state attorneys general (and, in some cases, private plaintiffs) have collected hundreds of millions of dollars in financial recoveries concerning data breaches as well as alleged violations of the U.S. Children’s Online Privacy Protection Act. Significantly, many of these proceedings are predicated on the theory that the data “controller” is legally responsible for the allegedly invasive or abusive practices of third parties that operate on or through the “controller’s” platform. This trend toward extended responsibility is well worth watching.
Even the U.S. Securities and Exchange Commission is increasingly focused on digital practices and risks. The SEC is now actively enforcing the accuracy and reliability of privacy and cybersecurity disclosures by public companies. Companies could face regulatory action if they materially understate their digital risks — or avoid discussing significant incidents they have already experienced — or if they publicly overstate their data security or privacy practices.
As a result, many companies — especially tech companies — are considerably expanding their discussion of how U.S. and international privacy laws like GDPR or the newly effective Brazilian privacy law are affecting or could affect their global regulatory risk profile or the economic viability of their current and future business models.
But perhaps the most important U.S. privacy development is the new California Consumer Privacy Act. It took effect in 2020 and, as of July 1, may be enforced by the state’s attorney general.
Not only does the CCPA require privacy disclosures, grant privacy rights and impose privacy restrictions comparable to the GDPR, but, in typical American fashion, the CCPA dangles the prospect of statutory damages (regardless of actual injury) to incentivize lawyers to file litigation over data breaches affecting the personal information of California residents — but only if the breach results from a company’s failure to implement and maintain “reasonable security.”
Even the CCPA, however, may not be enough for California. Its progenitor, real estate executive Alastair Mactaggart, has already advanced a new privacy initiative that will replace and go beyond the CCPA — the California Privacy Rights Act. CPRA will be placed before the state’s voters as part of the November election, bypassing the state’s legislature entirely.
In the end, as groundbreaking for the United States as the CCPA has been, its most consequential impact may be to impel other states to act, and perhaps then the U.S. Congress will finally enact a comprehensive national privacy law.
The most reasonable outcome for America would be the establishment of stable, federal privacy and security standards that identify and target actual injuries caused by abusive data practices. If COVID-19 has taught us anything about privacy, it is that there can be real trade-offs (for health, safety, innovation, the economy and security), on the one hand, as well as real harms (for pocketbooks, personal dignity and autonomy), on the other, from regulating either too much or too little.