The European Commission issued its Financial Data Access (FIDA) framework proposal in June 2023. FIDA will create a legislative framework that aims to “bring payments and the wider financial sector into the digital age” by facilitating the sharing of, and access to, customer financial data (whether of businesses or consumers).
The CMA has set out its emerging thinking on the functioning of competition and consumer protection in the market for foundation models.
On 4 July 2023, the EU Commission proposed a new Regulation laying down procedural rules to standardize and streamline cooperation between EU Member State Data Protection Authorities (DPAs) when enforcing the EU General Data Protection Regulation (GDPR) in cross-border cases (GDPR Procedural Regulation). The GDPR adopts a decentralized enforcement model: national EU Member State DPAs are competent to enforce the GDPR on their respective territories. However, in cases with cross-border elements, the GDPR requires all concerned DPAs to cooperate under the GDPR’s “one-stop-shop,” supported by its cooperation and consistency mechanisms. Although these mechanisms establish key principles of cooperation and provide the basis for consistent application of the GDPR throughout the EU, the EU Commission determined that further legislative action was needed to increase the efficiency and harmonization of cross-border GDPR enforcement.
Following the EU’s increased focus on generative AI, with foundation models and generative AI included in the latest text of the EU AI Act (see our post here), the UK is now following suit: on 15 June 2023, the UK’s Information Commissioner’s Office (“ICO”) communicated its intention to “review key businesses’ use of generative AI.” The ICO warned businesses not to be “blind to AI risks,” especially in a “rush to see opportunity” with generative AI. Generative AI is capable of generating content (e.g., complex text, images, audio, or video) and is viewed as involving more risk than other AI models because it can be used across many different sectors (e.g., law enforcement, immigration, employment, insurance, and health), and so can have a greater impact across society, including on vulnerable groups.
On July 13, Sidley and OneTrust DataGuidance hosted a webinar titled “The Finalization of the EU-U.S. Data Privacy Framework.” The discussion with key players in international data transfers covered topics including the significant points and implications of the European Commission’s Adequacy Decision for the Data Privacy Framework, what organizations should know about the Framework’s Principles, the factors and logistics to consider when signing up for the Framework (including the interplay with current Privacy Shield membership), next steps in the EU and UK processes, and other international data transfer developments, including the adequacy decision for the UK-U.S. Data Bridge.
On July 10, 2023, the European Commission issued its Final Implementing Decision granting the U.S. adequacy (“Adequacy Decision”) with respect to companies that certify to the EU-U.S. Data Privacy Framework (“DPF”).
On July 10, 2023, the European Commission published its final Adequacy Decision for EU-U.S. data transfers. The decision reflects the multi-year coordination between the EU and U.S. to identify and implement a lasting solution to facilitate international data transfers following the Court of Justice of the European Union’s judgment in Schrems II. The Adequacy Decision determines that the U.S., through the newly created EU-U.S. Data Privacy Framework, provides safeguards comparable to those of the EU and ensures an adequate level of protection for personal data transferred from the EU to certified organizations in the U.S.
On 14 June 2023, the European Parliament adopted, by a large majority, its compromise text for the EU’s Artificial Intelligence Act (“AI Act”), paving the way for the three key EU institutions (the Council of the European Union, the European Commission, and the European Parliament) to start the ‘trilogue negotiations’. This is the last substantive step in the legislative process, and it is now expected that the AI Act will be adopted and become law in or around December 2023 / January 2024. The AI Act will be a first-of-its-kind piece of AI legislation with extraterritorial reach.
On 29 March 2023, the UK’s Department for Science, Innovation and Technology (“DSIT”) published its long-awaited White Paper on its “pro-innovation approach to AI regulation” (the “White Paper”), along with a corresponding impact assessment. The White Paper builds on the “proportionate, light touch and forward-looking” approach to AI regulation set out in the policy paper published in July 2022. Importantly, the UK has decided to take a different approach to regulating AI from the EU, opting for a decentralised, sector-specific approach, with no new legislation expected at this time. Instead, the UK will regulate AI primarily through sector-specific, principles-based guidance and existing laws, with an emphasis on an agile and innovation-friendly approach. This is in significant contrast to the EU’s proposed AI Act, which is a standalone piece of horizontal legislation regulating all AI systems, irrespective of industry.
The European Union is moving closer to adopting the first major legislation to horizontally regulate artificial intelligence. Today, the European Parliament (Parliament) reached a provisional agreement on its internal position on the draft Artificial Intelligence Regulation (AI Act). The text will be adopted by the Parliament committees in the coming weeks and by the Parliament plenary in June. The plenary adoption will trigger the next legislative step: trilogue negotiations with the Council of the European Union to agree on a final text. Once adopted, according to the Parliament’s text, the AI Act will become applicable 24 months after its entry into force (or 36 months according to the Council’s position), which is currently expected in the second half of 2025 at the earliest.