After months of wrangling, the California legislature has finally passed a set of significant amendments to the California Consumer Privacy Act (CCPA), a sweeping data privacy and security law commonly referred to as “California’s GDPR” (Europe’s General Data Protection Regulation). Employee personal information and personal information obtained in business-to-business (B2B) interactions are now mostly out of scope. Personal information in credit reports and other data covered by the Fair Credit Reporting Act is also largely exempt. Only personal information that is “reasonably” capable of being associated with a consumer or household is subject to the act. And aggregate or deidentified information definitively does not qualify as CCPA personal information.
This article first appeared on Thomson Reuters Regulatory Intelligence.
The summer of 2018 may be regarded as a pivotal time in the history of data privacy laws. The European Union’s General Data Protection Regulation (GDPR) came into effect in May 2018, the California Consumer Privacy Act (CCPA) was signed into law in June 2018 (and comes into effect on January 1, 2020), and a draft of India’s Personal Data Protection Bill (India DP Bill) was released in July 2018 (and is now under review by India’s government).
These developments, and more generally, the recent proliferation of data privacy laws around the world (notably, in Australia, China, Brazil, Hong Kong, and Singapore) represent a compliance challenge for many multinational organizations.
The National Association of Insurance Commissioners (NAIC) held its Summer 2019 National Meeting (Summer Meeting) in New York City from August 3 to 6, 2019. The Summer Meeting was highlighted by the following activities.
*This article was first published by Bloomberg Law in August 2019.
Companies doing business with California consumers are subject to the California Consumer Privacy Act (CCPA), effective Jan. 1, 2020. The CCPA’s private right of action provision gives California residents the right to sue companies when their personal information is subject to unauthorized access and exfiltration, theft, or disclosure due to a company’s failure “to implement and maintain reasonable security procedures and practices.”
Under this provision, consumers may seek actual damages, declaratory or injunctive relief, and statutory damages, which begin at $100 and continue up to $750 “per consumer per incident.” The potential aggregated exposure through consumer class actions could be significant, and companies are searching for ways to mitigate private lawsuits.
The flurry of state legislative activity in the wake of the enactment of the California Consumer Privacy Act (CCPA) continues, with the New York legislature recently passing two bills to increase accountability for the processing of personal information. On July 25, 2019, Governor Cuomo signed the two bills into law, one of which amended the state’s data breach notification law, and another that created additional obligations for data breaches at credit reporting agencies. Together, the new laws require the implementation of reasonable data security safeguards, expand breach reporting obligations for certain types of information, and require that a “consumer credit reporting agency” that suffers a data breach provide five years of identity theft prevention services for impacted residents. Meanwhile, the more comprehensive New York Privacy Act, which many viewed as even more expansive than the CCPA, failed to gather the necessary support in the most recent legislative session.
With less than three months to go before amendments to California’s far-reaching data privacy law need to be signed into law, the CCPA landscape may be changing yet again, as several amendments debated in the state Senate Judiciary Committee on July 9 underwent significant modifications. Eight proposed CCPA amendments were on the committee’s agenda, and several were hotly debated in an hours-long session that extended late into the night. In the end, two of the bills received substantive modifications, another was stalled, one was defeated, and the rest made it out of the committee with limited changes. Here we summarize the highlights.
Since the passage of the California Consumer Privacy Act (Cal. Civ. Code §1798.100 et seq.) (“CCPA”), several states are following in California’s footsteps and adopting privacy bills that would allow consumers to object to the sale of their personal information.
Sidley has consolidated its materials and resources on the CCPA, including an amendment tracker, on the new Sidley CCPA Monitor.
Explore the law and Sidley insights, available now.
With about half a year to go until the California Consumer Privacy Act (CCPA)’s effective date, and with significant amendments still percolating to define the scope and impact of the CCPA come 2020, other states continue to consider whether to adopt new and broader privacy laws of their own, with Nevada recently taking the distinction of being the first to follow the CCPA trend. While the scope and obligations of the Nevada law are significantly narrower than the CCPA’s, and thus largely will align with current CCPA implementation projects, the new Nevada law does expand upon the CCPA in one particularly notable way—it moves the deadline to facilitate opt-outs of sales of personal information up to October 2019.
More and more entities are deploying machine learning and artificial intelligence to automate tasks previously performed by humans. Such efforts carry real benefits, such as enhanced operational efficiency and reduced costs, but they also raise a number of concerns about their potential impact on society, particularly as computer algorithms are increasingly used to determine important outcomes such as individuals’ treatment within the criminal justice system.
This mixture of benefits and concerns is starting to attract the interest of regulators. Efforts in the European Union, Canada, and the United States have initiated an ongoing discussion around how to regulate “automated decision-making” and what principles should guide it. And while not all of these regulatory efforts will directly implicate private companies, they may nonetheless provide insight for companies seeking to build consumer trust in their artificial intelligence systems or better prepare themselves for the overall direction that regulation is taking.