On April 26, 2021, the European Commission announced that its draft proposal for the new EU Artificial Intelligence Regulation (“Draft AI Regulation”) is open for feedback until June 22, 2021. The Draft AI Regulation was published on April 21. Please refer to our blog post here, which provides an overview of the Draft AI Regulation and its potential impact.
On April 21, 2021, the European Commission (EC) issued its eagerly awaited draft proposal for the EU Artificial Intelligence Regulation (Draft AI Regulation) – the first formal legislative proposal regulating Artificial Intelligence (AI) on a standalone basis. The Draft AI Regulation is accompanied by a revision of the EU’s rules on machinery products, which lay down the safety requirements machinery products must meet before being placed on the EU market. The new draft Machinery Products Regulation – proposed by the EC on the same day – is intended to tackle safety issues that arise in emerging technologies.

The Draft AI Regulation (which appears to have borrowed a number of principles from existing EU legislation, including the EU General Data Protection Regulation 2016/679 (GDPR)) has an intentionally broad scope and regulates the use of AI according to the level of risk an AI system presents to fundamental human rights and other key values to which the EU adheres. AI systems considered to present an “unacceptable” level of risk are banned from the EU, and “high-risk” systems are subject to strict requirements. AI systems considered to present a lower level of risk are subject to transparency requirements or are not regulated at all. Companies engaged in the development, manufacturing, importation, distribution, servicing, and use of AI – irrespective of industry – should assess to what extent their products are implicated and how they will address any regulatory requirements to which they are subject. The Draft AI Regulation foresees maximum administrative fines of up to €30 million or 6% of total worldwide annual turnover in the event of non-compliance – higher than the fines available under the GDPR.
On March 17, 2021, California officials announced the appointment of five board members of the California Privacy Protection Agency (the “CPPA”), the first data protection agency in the United States. The CPPA, created by the California Privacy Rights Act (“CPRA”), which California voters approved in November 2020, is charged with promulgating the CPRA regulations; enforcing the CCPA and CPRA; and educating consumers about their privacy rights.
On 5 March 2021, the Federal Data Protection and Information Commissioner (FDPIC) published a short position paper on the revised Swiss Data Protection Act (revDPA). The position paper provides guidance for companies that are subject to the revDPA as to how to meet its requirements once it enters into force, which is expected to be in the second half of 2022 after the Federal Administration has completed drafting the associated implementing ordinances.
On February 12, 2021, the European Commission (Commission) published an “Assessment of the EU Member States’ rules on health data in the light of GDPR” (the Assessment). The Assessment concludes, among other things, that the EU General Data Protection Regulation (GDPR) has been implemented inconsistently at the national level with regard to the processing of health data. In turn, this has led to a fragmented approach to the processing of health data for health and research purposes across the EU. To avoid further fragmentation, the Assessment proposes various future EU-level actions, including stakeholder-driven Codes of Conduct as well as new targeted and sector-specific legislation.
For over two and a half years, California has enjoyed the spotlight of having the most comprehensive data privacy law in the United States. On March 2, 2021, Virginia forced California to share the honors, when Democratic Gov. Ralph Northam signed into law the Virginia Consumer Data Protection Act (VCDPA).
The VCDPA, which will not enter into effect until January 1, 2023, borrows heavily from the California Consumer Privacy Act (CCPA) and the European Union (EU) General Data Protection Regulation (GDPR). Perhaps because Virginia was able to benefit from the experience of businesses that have spent the better part of the last five years implementing the GDPR or the CCPA, the Virginia law is less prescriptive and more straightforward than its predecessors, with (one would hope) a correspondingly lighter implementation burden on companies. Nonetheless, the VCDPA differs just enough from those laws that businesses with a connection to Virginia will need to evaluate whether it applies to them and how they will comply.
While an exegesis of the VCDPA is beyond the scope of today’s Data Matters post, this alert is designed to assist such efforts in three ways. First, we lay out the VCDPA’s scope, providing preliminary insight into which businesses the law will cover. Second, we highlight the key ways the VCDPA differs from — and, more important, extends beyond — the CCPA and GDPR so that businesses will have an initial sense of what, if any, unique obligations the VCDPA will place on them. Finally, for completeness’s sake, the post briefly summarizes the law’s key elements.
Amidst significant economic and legal concerns, on February 12, 2021, the Maryland Senate joined the House in voting to override Republican Gov. Larry Hogan’s veto of House Bill 732 (HB 732) to adopt a Digital Advertising Gross Revenues Tax (Tax), the nation’s first tax targeting digital advertising. The override was successful despite significant pushback from a coalition of more than 200 businesses and Republican legislators who sought to sustain the veto. HB 732 is intended to provide significant revenues to support education reforms in the state.
The Tax is likely to affect large technology-based and online companies that derive revenue from advertisements on their websites and platforms (rather than companies deriving their revenues entirely from subscription services). Such companies, as well as their owners and sponsors, should therefore carefully consider the information below and the impact of the Tax on their business models.
Taking a step into the digital age, the European Commission announced that the 2020s will be the EU’s Digital Decade. The EU’s digitalization, including in the area of health, is one of the Commission’s key priorities and covers a wide range of actions and related initiatives.
Building on prior initiatives, in 2019 the Commission announced six key priorities (since supplemented by the COVID-19 recovery plan) that would shape the coming five years of policy making. One of these six key priorities is to create a Europe fit for the digital age and work on a digital strategy that will empower people with a new generation of technologies.
On December 15, 2020, the European Commission (Commission) proposed drafts of two landmark digital legislative packages — the Digital Markets Act (DMA), which proposes new competition rules for so-called “gatekeeper” platforms to address alleged unfair practices and make them more contestable by competitors, and the Digital Services Act (DSA), which recommends revamping content moderation rules for “very large online platforms.”
The new rules, if they pass into law in their current form, would impose a stringent regulatory regime on Big Tech and give the Commission new enforcement powers. The draft regulations foresee severe fines for noncompliance — up to 10% of a company’s global revenues under the DMA and up to 6% under the DSA. The Commission would also be able to impose structural remedies, such as obliging a gatekeeper to sell all or part of a business, on companies that repeatedly engage in anticompetitive behavior prohibited by the DMA.
The proposals mark the beginning of a legislative process that is likely to be controversial and hotly contested, as there are marked differences of opinion on whether these proposals go too far, do not go far enough, or are necessary at all in light of preexisting competition powers.
On November 20, 2020, the Singapore Personal Data Protection Commission (PDPC) published a set of draft advisory guidelines (the Advisory Guidelines) to clarify recent amendments to the Personal Data Protection Act (the PDPA Amendments). We summarized the PDPA Amendments in our previous client update. The Advisory Guidelines address operational details of the key amendments, as summarized below.