The UK’s Information Commissioner’s Office (“ICO”) has recently issued a draft version of its statutory code of practice on the sharing of personal data between controllers under the GDPR and the UK Data Protection Act 2018 (“DPA”) (the “Draft Code”), which provides a number of practical recommendations that controllers should take into account when sharing personal data.
The High-Level Expert Group on Artificial Intelligence (“AI HLEG”), an independent expert group set up by the European Commission in June 2018 as part of its AI strategy, has published its final Ethics Guidelines for Trustworthy Artificial Intelligence (“AI”) (the “Guidelines”).
These Guidelines form part of a wider focus by the Commission on AI, with President-elect of the European Commission, Ursula von der Leyen commenting most recently on July 16, in her proposed political guidelines, that: “In my first 100 days in office, I will put forward legislation for a coordinated European approach on the human and ethical implications of Artificial Intelligence…”.
Just a day after the ICO provided notice of its intention to fine British Airways £183m ($228m) over a separate breach (please see our blog post here), on Tuesday, July 9, 2019, the ICO released another statement of its intention to fine Marriott International, Inc. (“Marriott”) over £99m ($123m) in relation to a security incident affecting the reservation database of Starwood, which Marriott had acquired in 2016. The incident was discovered in November 2018. The statement came in response to Marriott’s filing with the US Securities and Exchange Commission disclosing that the ICO intended to fine it for breaches of the GDPR.
On July 3, 2019, the UK’s Information Commissioner’s Office (“ICO”) published new guidance on cookies and similar technologies (the “Guidance”) in conjunction with a new blog post, “Cookies – what does ‘good’ look like?”, which aims to provide “myth-busting” advice on common uncertainties around cookies. You can find a full copy of the new Guidance here and a link to the ICO’s blog post here. With its new Guidance, the ICO has formally recognised the stricter standards of consent and transparency now in force under the GDPR.
Today we saw the ICO issue a notice of its intention to fine British Airways (“BA”) £183.39m for infringements of the GDPR – a record fine and the largest yet seen in the UK or the EU. The proposed fine relates to a cyber incident that BA notified to the ICO (acting as BA’s lead data protection authority) in September 2018. The incident involved the theft of customers’ personal data from the BA website and mobile app over a two-week period. In terms of next steps, BA now has an opportunity to make representations to the ICO on the proposed findings and sanction.
Sidley has consolidated its materials and resources on the CCPA, including an amendment tracker, on the new Sidley CCPA Monitor.
Explore the law and Sidley insights, available now.
May 25, 2019 marked a year since the EU General Data Protection Regulation (“GDPR”) became applicable. For most privacy professionals, involvement with the GDPR has been ongoing for well over a year, but on the first anniversary of the GDPR we take the opportunity to look back and reflect on where we are now in relation to some key areas of interest, including enforcement action, privacy litigation, breach notification and developing guidance from the European Data Protection Board (“EDPB”).
More and more entities are deploying machine learning and artificial intelligence to automate tasks previously performed by humans. Such efforts carry real benefits, such as enhanced operational efficiency and reduced costs, but they also raise a number of concerns regarding their potential impact on human society, particularly as computer algorithms are increasingly used to determine important outcomes like individuals’ treatment within the criminal justice system.
This mixture of benefits and concerns is starting to attract the interest of regulators. Efforts in the European Union, Canada, and the United States have initiated an ongoing discussion around how to regulate “automated decision-making” and what principles should guide it. And while not all of these regulatory efforts will directly implicate private companies, they may nonetheless provide insight for companies seeking to build consumer trust in their artificial intelligence systems or better prepare themselves for the overall direction that regulation is taking.
Recently, the Dutch Supervisory Authority (the “Autoriteit Persoonsgegevens” or “Dutch SA”) has taken the position that the use of so-called “cookie walls,” whereby website access is made conditional upon the provision of consent to tracking cookies, is not compliant with the EU General Data Protection Regulation (“GDPR”).
William Long, partner and global co-leader of Sidley’s Privacy and Cybersecurity practice, has been working on global data privacy and information security matters for a number of years. In particular, William advises international clients on a wide variety of General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’), data protection, cybersecurity and financial services issues.
DataGuidance by OneTrust spoke with William about data protection issues in the financial services sector, and in particular about approaching compliance with the GDPR, sector-specific challenges, issues around Big Data, and cybersecurity.