This summer, the Federal Trade Commission (“FTC”) hosted its sixth annual PrivacyCon, an event focused on the latest research and trends related to consumer privacy and data security. This year’s event was divided into six panels: Algorithms; Privacy Considerations and Understandings; Adtech; Internet of Things; Privacy – Children and Teens; and Privacy and the Pandemic. Welcoming attendees and kicking off the event, Commissioner Rebecca Kelly Slaughter called for minimizing data abuses and for a move away from the notice-and-consent model of privacy in favor of data minimization. PrivacyCon topics are selected by the FTC and are often seen as an indication of enforcement priorities.
Algorithms. AI and algorithms received substantial attention at PrivacyCon 2021. Commissioner Slaughter highlighted the FTC’s April 19, 2021 guidance on algorithmic bias. Erie Meyer, Chief Technologist of the FTC, spoke about the FTC’s role in combating data abuse and referenced potential approaches to penalties, including forced disclosure of algorithms used to wrongfully retrieve data and potential internet access bans. The event also featured a panel of three research papers on algorithms, as well as a separate presentation.
The panel highlighted three issues in particular related to algorithms: advertising, fairness, and transparency. The panel began with research concerning advertising algorithms that create potentially discriminatory outcomes in a variety of sectors, including housing and job recruiting. Additional research indicated that machine learning models that aim to bolster fairness by imposing known constraints during the training phase run a greater risk of compromising the information of underprivileged subgroups. Finally, research was shared on the tension between bolstering the transparency of machine learning models to ensure fairness and the risk of releasing proprietary information concerning machine learning training data. The three panelists agreed that technical, legal, and policy experts need to collaborate actively to review algorithmic models and shape regulations in this space. A separate presentation discussed research on racial bias and recommended that companies take specific steps to combat algorithmic bias: designate a high-level, strategic employee, along with a diverse group of advisors, to serve as steward for the organization’s algorithms; maintain and regularly update an inventory of all algorithms developed by the organization; document each algorithm’s purpose, ideal target, and performance; and fix or delete biased algorithms.
Privacy – Considerations and Understanding. This panel focused on the challenges of ensuring that consumers find and understand privacy notices, what can be learned from organizations’ responses to data breaches, and different approaches to improving user awareness of data rights. The panel called for companies to consider going beyond the legal requirements for privacy policies to develop more meaningful and concise notices, especially around privacy rights and choices. The panel also focused on the accessibility and usability of opt-out choices, highlighting that many opt-out choices online were broken or slow to load, and recommended developing or updating standards for opt-out choices. The panel also criticized current data breach notification laws as insufficient to guide data breach response.
AdTech. A perennially hot topic in recent years, the AdTech discussion highlighted the variety of ways in which consumers interact with online advertising, including unidentified trackers and pixels, smart TVs, and social media. The panel began with a discussion of tracking techniques used online, highlighting that most browser extensions that purport to help users block tracking miss a significant number of third-party trackers. Research was also presented on tracking in the smart TV advertising space, finding that hundreds of apps collect and share personally identifiable information with third parties.
Additional research was presented on the use of data subject requests in connection with targeted advertising. The discussion highlighted the benefits of advertisement explanations that make clear to users how and why they are targeted. Researchers noted that it took them months to receive clarifications on how to interpret the responses to privacy rights requests with respect to advertising, and called for more explainability in profiling for advertising purposes.
Internet of Things (IoT). This panel primarily focused on voice-activated devices and argued that consumers are often unaware of how much of their personal data is captured and disclosed to third parties. Research was also presented demonstrating that when IoT devices disclosed that personal data would be shared or sold to third parties without consumer control over data access, consumers were less willing to purchase those devices. Researchers also called for privacy labels on IoT devices, similar to nutrition labels for food, to provide consumers with sufficient privacy and security information to make better-informed choices.
Privacy – Children and Teenagers. Interestingly, the panel on children and teens highlighted potential failures in the very mechanisms and processes designed to enhance children’s privacy. Specifically, the panel addressed various design and implementation vulnerabilities found in parental-control solutions that increase the risks of device compromise, account takeover, data leakage, and disclosure of personal information. Suggestions for improving privacy practices to adequately protect sensitive data included conducting regular security audits, implementing stronger password policies, limiting data collection from children to what is strictly necessary, securing communications, and limiting third-party trackers in apps intended for children.
The panel also presented research identifying that teenagers are exposed to privacy risks in mobile apps at a higher rate than the general population. Discussion focused on how the use of targeted advertisements, in-app purchases, and third-party trackers allows digital platforms to collect data and build full profiles of their teenage users, which could later be monetized by app developers, advertisers, and other third-party technology companies. Panelists called for stronger privacy protections to minimize exploitation of children and teens online.
Privacy and the Pandemic. The final panel addressed a unique set of privacy risks affecting the Internet infrastructure that arose during the COVID-19 pandemic, including the increased prevalence of phishing attacks and the rapid spread of COVID-19 misinformation. Panelists discussed how phishers have capitalized on people’s fear of the pandemic to lure them into disclosing sensitive personal information. Researchers highlighted how phishing websites use actual COVID-19 statistics and other COVID-19-related information promulgated by government agencies to appear more legitimate to unsuspecting users. Turning to the rapid rise of COVID-19 misinformation on social media platforms, researchers highlighted the platforms’ use of various intervention methods to combat misinformation, ranging from specific “false information” labels on posts to general banners linking users to authoritative resources. The research found that social media users generally viewed such fact-checking and post labels in a positive light, although some users raised censorship concerns. The panel called for more proactive, rather than reactive, measures to decrease the spread of phishing attacks and COVID-19 misinformation.
*The authors thank Sidley Summer Associates Liamarie Quinde, Kunal Jhaveri, Sydney Volanski, Philip Robbins, Hannah Brown, Tyler Wood, Rimsha Syeda, Casey Grant and Ada Dovell for their contributions to this article.
This post is as of the posting date stated above. Sidley Austin LLP assumes no duty to update this post or post about any subsequent developments having a bearing on this post.