The International Association of Privacy Professionals (IAPP) held its annual Europe Data Protection Congress in Brussels on November 15 & 16, 2023. Whilst the Congress covered a wide range of topics related to privacy, cybersecurity and the regulation of data more broadly, unsurprisingly a recurring theme throughout was the responsible development, commercialization and use of AI. Panelists explored, amongst other things, what practical and effective AI governance may look like, the role of a Digital Ethics Officer, how to strike a balance between enabling innovation and safeguarding individual rights, and how AI may be used to automate data breach detection and response.
The Information Commissioner’s Office (“ICO”) has introduced a toolkit on data sharing with law enforcement (“Toolkit”), which supplements the ICO’s existing guidance on sharing personal data with law enforcement authorities. The Toolkit is designed to help smaller organisations make an informed decision about whether to share personal data with law enforcement. Larger organisations with data protection expertise are encouraged to refer to the ICO’s data sharing code of practice, but the Toolkit is intended to provide clarity for all organisations making decisions about this type of sharing.
On October 16, 2023, the U.S. Securities and Exchange Commission (SEC) Division of Examinations (EXAMS or Division) issued its annual examination priorities, which, for the first time, were published at the start of the SEC’s fiscal year to “better inform investors and registrants of key risks, trends, and examination topics” the Division intends to focus on in the coming year.1
On September 29, 2023 — the last business day of its fiscal year — the U.S. Securities and Exchange Commission (SEC) issued the latest in a series of actions charging 10 firms with recordkeeping failures in connection with employees’ use of unapproved applications on personal devices to engage in communications relating to the firms’ business (known as “off-channel communications”).1 The firms charged included broker-dealers, investment advisers, and dually registered broker-dealers and investment advisers as well as one family of firms that self-reported conduct to the SEC. To date, the SEC has charged over 40 registrants and leveled over $1.6 billion in penalties as part of its off-channel communications matters. Other regulators, including the Commodity Futures Trading Commission (CFTC), have brought similar cases.
On September 21, 2023, the UK and the U.S. announced the UK extension to the EU-U.S. Data Privacy Framework (DPF), which will come into effect on October 12. A new UK adequacy regulation provides that the UK Secretary of State for Science, Innovation and Technology has determined that the U.S. provides adequate levels of protection for personal data in certain transfers and brings the UK within the DPF announced in July 2023. The U.S. Attorney General also designated the UK as a “qualifying state” under an Executive Order on September 18 for the purposes of the DPF. This means that on October 12, UK businesses will be able to transfer personal data to U.S. organizations self-certified under the DPF.
On 4 July 2023, the EU Commission proposed a new Regulation establishing procedural rules to standardize and streamline cooperation between EU Member State Data Protection Authorities (DPAs) when enforcing the EU General Data Protection Regulation (GDPR) in cross-border cases (GDPR Procedural Regulation). The GDPR adopts a decentralized enforcement model: national EU Member State DPAs are competent to enforce the GDPR on their respective territories. However, in cases with cross-border elements, the GDPR requires all concerned DPAs to cooperate under the “one-stop-shop” cooperation and consistency mechanisms. Although these mechanisms establish key principles of cooperation and provide the basis for consistent application of the GDPR throughout the EU, the EU Commission determined that more legislative action was needed to increase the efficiency and harmonization of cross-border GDPR enforcement action.
Following the EU’s increased focus on generative AI with the inclusion of foundation and generative AI in the latest text of the EU AI Act (see our post here), the UK now also follows suit, with the UK’s Information Commissioner’s Office (“ICO”) communicating on 15 June 2023 its intention to “review key businesses’ use of generative AI.” The ICO warned businesses not to be “blind to AI risks,” especially in a “rush to see opportunity” with generative AI. Generative AI is capable of producing content such as complex text, images, audio or video, and is viewed as involving more risk than other AI models because it can be used across many different sectors (e.g., law enforcement, immigration, employment, insurance and health) and so can have a greater impact across society, including on vulnerable groups.
Just before Americans began their Fourth of July holiday, the U.S. Commodity Futures Trading Commission (CFTC) Division of Enforcement Director announced that the division has established two key task forces: the Cybersecurity and Emerging Technologies Task Force and the Environmental Fraud Task Force.1 Both task forces will be staffed with attorneys and investigators from across the Division of Enforcement, with the goal of serving as subject matter experts and prosecuting cases. As a result, CFTC registrants should be prepared for a heightened focus on cybersecurity and environmental fraud, particularly in the derivatives and relevant spot markets.
On July 10, 2023, the European Commission issued its Final Implementing Decision granting the U.S. adequacy (“Adequacy Decision”) with respect to companies that subscribe to the EU-U.S. Data Privacy Framework (“DPF”).
On 14 June 2023, the European Parliament adopted – by a large majority – its compromise text for the EU’s Artificial Intelligence Act (“AI Act”), paving the way for the three key EU institutions (the Council of the European Union, the Commission and the Parliament) to start the ‘trilogue negotiations’. This is the last substantive step in the legislative process, and the AI Act is now expected to be adopted and become law in or around December 2023 / January 2024. The AI Act will be a first-of-its-kind AI legislation with extraterritorial reach.