FTC’s New Biometric Policy Statement Articulates New Governance Standards and an Expansive View of Biometric Data

On May 18, 2023, the Federal Trade Commission (“FTC”) issued its 2023 Policy Statement on Biometric Information and Section 5 of the FTC Act (the “Policy Statement”), describing the agency’s concerns about these fast-proliferating technologies and articulating a set of compliance obligations for businesses that develop or use biometric technologies.  To address the potential bias, discrimination, and data security risks associated with the collection or use of biometric information, the FTC wants businesses to, among other things, conduct pre-release risk assessments evaluating the potential for bias and other consumer harms, assess these risks on an ongoing basis, and evaluate and potentially audit third parties with access to a business’s biometric data.

These requirements may apply to businesses that have traditionally been out of scope for biometric privacy laws, because the Policy Statement appears to expand the definition of “biometric information” to cover biometric technologies used “for any purpose.”  This contrasts with the more limited definition commonly used in state biometric laws, which treats information as “biometric” only when it is actually used to identify individuals.

Below we summarize the key features of the Policy Statement, including a brief overview of other developments in biometric regulation and technologies, highlight key concerns the FTC expressed in the Policy Statement regarding the collection and use of biometrics, and outline some of the steps the FTC recommends businesses take to mitigate risks associated with the use of biometric technologies.

Biometric Information Technologies and Risks Are on the Rise

The FTC Policy Statement arrives against a backdrop of proliferating legislative and regulatory actions aimed at the processing of biometric information by private companies.  For example, Illinois has seen a spike in the number of lawsuits under its Biometric Information Privacy Act following two recent state Supreme Court decisions that expanded liability under the law.  New York City now requires businesses that handle biometric information to post a sign at the entrance to their establishments.  These developments sit atop general state data privacy laws that identify biometric information as personal information (and, under some laws, sensitive personal information), and a recently enacted health privacy law in Washington state that sweeps biometric information into that law’s extensive requirements.  At the same time, the use of biometric data continues to expand, whether to provide access and security for devices and apps, or to facilitate the spatial computing experience on augmented reality, virtual reality, and gaming platforms.

FTC Policy Statement Concerns About Biometric Information and Technologies

The Policy Statement begins by cataloguing some of the FTC’s concerns about the use of biometric technologies, including ways in which the technologies might be susceptible to bias, produce discriminatory effects when used across different genders and races, and be misused to create “deep fakes” or to surreptitiously track or identify individuals in sensitive locations (e.g., religious institutions, medical practices, and political meetings).  The Policy Statement then sets forth a “non-exhaustive” list of practices related to the use or marketing of biometric technologies that might violate the FTC Act and provides guidance about how companies using these technologies can mitigate risks of potential violations.  We summarize key points of that guidance below.

Conduct holistic risk assessments with “real world” data before and while using technologies. The Policy Statement advises that, “prior to collecting consumers’ biometric information, or deploying a biometric information technology,” companies should conduct a “holistic” risk assessment that takes into account “real world” conditions and considers the role of human operators (suggesting human intervention may not always suffice, without more, to mitigate potential consumer harms). The Policy Statement further advises that these assessments evaluate whether the technology is likely to have a disproportionate impact on particular demographic groups and the extent to which “technical components of the system,” such as algorithms, have been “specifically tested for differential performance across demographic groups—including intersectionally.”  Additionally, businesses will need to revisit assessments and risk mitigation measures based on how the technologies operate in practice.

Conspicuously disclose collection and use of biometric information and avoid unsubstantiated claims.  The Policy Statement further advises that businesses “clearly and conspicuously” disclose when consumers’ biometric information is being collected, or when access to “essential goods and services” is conditioned on providing that information.  Clear disclosures in this regard give consumers the opportunity to avoid having their information collected.  Relatedly, businesses will want to tread carefully when making statements about these technologies, including claims of a lack of bias.  In this regard, the Policy Statement highlights that claims about the efficacy of a technology that are true only for certain populations can be deceptive if the limitations of the technology are not clearly stated.

Avoid surreptitious and “unexpected” collection or use of biometric information. The Policy Statement provides that certain uses of biometric information or technology may be per se “unfair,” including if a business uses or facilitates the use of biometric information or technology to secretly identify or track a consumer in a manner that exposes them to risks such as stalking, reputational harm, or social stigma.  The Policy Statement suggests businesses can mitigate potential harms in this regard by, among other things, disclosing the collection and use of biometrics so that consumers can avoid collection if they so desire, and putting in place a mechanism to accept and address consumer complaints or disputes relating to the collection or use of consumers’ biometric information.

Use reasonable data security practices. The Policy Statement includes familiar statements that businesses should implement “reasonable privacy and data security measures,” and specifically calls out the need for access controls and technical measures such as making timely updates for both hardware and software systems to ensure they are operating effectively and not putting consumers at risk.

Evaluate and oversee third parties with access to biometric information. The Policy Statement advises businesses to evaluate the practices and capabilities of third parties (including affiliates) that have access to consumer biometric information, to include contractual measures requiring third parties to minimize privacy and security risks associated with biometric data and technologies, and to then “supervise, monitor or audit” third parties’ compliance with those requirements.

Additionally, the Policy Statement identifies the need to have appropriate training for employees and contractors whose job duties involve interacting with biometric information, citing recent enforcement actions where the agency has alleged the failure to provide such training constituted an “unfair” practice.

The Policy Statement makes clear that biometric technologies are on the FTC’s radar and that the FTC expects businesses to take a proactive approach when evaluating and addressing the unique risks associated with biometric information.  Businesses should evaluate whether and where biometric data is used (including where it is not used to identify individuals), make sure appropriate biometric disclosures and policies are in place (including under state biometric privacy laws), and integrate regular risk assessments into biometric privacy compliance programs.

This post is as of the posting date stated above. Sidley Austin LLP assumes no duty to update this post or post about any subsequent developments having a bearing on this post.