Compliance Updates for Employers’ Use of Automated Decisionmaking Tools: New York City Finalizes Rules on Automated Employment Decision Tools and Sets Enforcement Date of July 5, 2023; Upcoming California Regulations; and Federal Guidance

Employers in New York City may soon be subject to a new law, Local Law 144, that regulates employers’ use of automated employment decision tools (“AED tools” or “AEDT”) – software and other programs used to make decisions about whom to hire, whom to promote, and other employment matters.  Local Law 144, the first law of its kind regulating these AED tools, was originally scheduled to take effect on January 1, 2023; however, because needed regulatory guidance had not been issued, the effective date was repeatedly pushed back and is now set for July 5, 2023.  Final rules were released on April 6, 2023, so further delays are unlikely.  We summarize below the key provisions of Local Law 144 and what employers need to know to prepare.

Local Law 144 may be the first of its kind, but it will not be the last law to regulate AED tools.  The California Consumer Privacy Act now applies to employee data, and draft regulations governing automated decisionmaking, access, and transparency rights are expected to be issued any day.  At the same time, California’s Civil Rights Council (part of the Civil Rights Department, formerly the Department of Fair Employment and Housing) has been drafting proposed regulations concerning AED tools since July 2022, and in early April 2023 voted to initiate the formal rulemaking process to finalize proposed regulations.  The Federal government has also been taking a closer look at these tools in the employment context, with the Equal Employment Opportunity Commission (“EEOC”) issuing guidance regarding the interplay with the Americans with Disabilities Act and holding hearings on more comprehensive regulation.  The National Labor Relations Board (“NLRB”) General Counsel joined the conversation in an October 31, 2022 memo indicating there would be “vigorous enforcement” of labor law principles against what she characterized as “intrusive or abusive electronic monitoring and automated management practices” that may interfere with union organizing and similar activities.

Below we highlight key provisions of the New York City law and provide details about regulations that are likely to be issued in California later this year.  However, employers in other jurisdictions will want to take note, as implementation and enforcement of these laws and regulations will likely test the viability of principles long discussed around automated tools and AI, but not yet operationalized, including transparency, bias audits, and opt-out provisions for those who do not want to be subject to these technologies.

Who and What New York City’s AED Tool Law Applies To:

Local Law 144 will apply to all New York City employers that use AED tools to “substantially assist or replace discretionary decision making” in the employment context.  The law defines an AED tool as a “computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence [a defined term], that issues simplified output, including a score, classification, or recommendation” and such tool must be used in a manner that “substantially assists” or entirely replaces individual discretion in making employment decisions that impact individuals.  The rule defines “to substantially assist or replace discretionary decision making” as follows:

  1. Relying solely on a simplified output (score, tag, classification, ranking, etc.), with no other factors considered; or
  2. Using a simplified output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set; or
  3. Using a simplified output to overrule conclusions derived from other factors including human decisionmaking.

Thus, the law only applies to AED tools where the output of such tools is a substantial factor or the only factor driving a promotion or hiring decision.

What Local Law 144 Requires:

Bias Audit: Local Law 144 will require New York City employers that use AED tools to conduct an annual bias audit of the AED tool’s use. The law defines a bias audit as an “impartial evaluation by an independent auditor” to assess an AED tool’s potential “disparate impact” on a candidate or employee’s sex, race, and ethnicity.  The final rules provide examples of bias audits in chart form, which include the required components listed below.

When an employer selects or scores candidates or employees for promotion, the bias audit should include the following at a minimum:

  1. Either (i) the calculated selection rate for each category if selecting individuals, or (ii) the median score for the full sample of applicants if scoring individuals;
  2. The calculated impact ratio for each category;
  3. Calculations considering the impact on sex categories, race and ethnicity categories and intersectional categories of sex, ethnicity, and race;
  4. Consideration that such calculations are conducted for “each group, if an AEDT classifies candidates for employment or employees being considered for promotion into specified groups (e.g., leadership styles)”; and
  5. Listing “the number of individuals the AEDT assessed that are not included in the required calculations because they fall within an unknown category.”
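As an illustration only (not legal guidance), the selection-rate and impact-ratio calculations described above can be sketched as follows.  The category names and counts are hypothetical; the sketch assumes that a category’s impact ratio compares its selection rate to the selection rate of the most selected category.

```python
# Illustrative sketch of the bias audit arithmetic described above.
# Category labels and counts are hypothetical, for demonstration only.

def selection_rates(counts):
    """counts: {category: (selected, total_assessed)} -> {category: rate}"""
    return {cat: sel / total for cat, (sel, total) in counts.items()}

def impact_ratios(rates):
    """Impact ratio = a category's selection rate divided by the
    highest selection rate among all categories."""
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}

# Hypothetical data: (candidates selected, candidates assessed) per category
counts = {"Category A": (40, 100), "Category B": (24, 100)}
rates = selection_rates(counts)   # e.g., "Category A" -> 0.40
ratios = impact_ratios(rates)     # e.g., "Category B" ratio ≈ 0.6 (0.24 / 0.40)
```

A low impact ratio for a category is the kind of result a bias audit summary would surface; what ratio warrants remediation is a legal and statistical judgment the rules leave to employers and their auditors.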

Publishing of Bias Audit: Local Law 144 requires employers to make information publicly available concerning their use of AED tools, and employers will need to post: (1) the date of their most recent bias audit, (2) a summary of the results that includes “the source and explanation of the data used to conduct the bias audit, the number of individuals the AEDT assessed that fall within an unknown category, and the number of applicants or candidates, the selection or scoring rates, as applicable, and the impact ratios for all categories,” and (3) the date that they began using the AED tool. Such information must be posted for at least 6 months after the latest use of the AED tool.

Candidate and Employee Notices: Local Law 144 requires employers to notify candidates or employees about the use of AED tools in their assessment or evaluation for hire or promotion, as well as the job qualifications and characteristics that will be used by the automated employment decision tool.  Such notices must:

  1. Include instructions on how an individual can request reasonable accommodations such as an alternative selection process, if available;
  2. Include information on the AED tool’s data retention policy, the types of data collected for the AED tool, and the sources of that data;
  3. Describe the job qualifications and characteristics that the AED tools will use in its assessment; and
  4. Provide notice 10 business days before the use of the AED tool either (i) on a company’s website, (ii) in a job posting, or (iii) via U.S. mail or e-mail.

Penalties: Violations of Local Law 144 will be subject to civil penalties of $500 for the first violation and $1,500 for each additional violation.  Each day an AED tool is used in violation of the law will give rise to a separate violation.  Local Law 144 is enforceable only by the New York City Department of Consumer and Worker Protection; there is no private right of action.
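To make the exposure concrete, here is a hypothetical back-of-the-envelope sketch (not legal advice) applying the stated penalty amounts, assuming each day of noncompliant use of a single AED tool counts as one additional violation:

```python
# Hypothetical penalty-exposure sketch based on the statute's stated amounts:
# $500 for the first violation and $1,500 for each subsequent one, with each
# day of noncompliant use treated as a separate violation.

def penalty_exposure(days_in_violation):
    if days_in_violation <= 0:
        return 0
    return 500 + 1500 * (days_in_violation - 1)

penalty_exposure(30)  # a 30-day lapse could total $44,000
```

Because each provision violated may accrue separately, actual exposure could be higher; the point is simply that per-day accrual makes even a short compliance lapse expensive.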

Upcoming California Rules Regulating Automated Decisions in the Employment Context

On February 21, 2023, the Civil Rights Council within the California Civil Rights Department approved Proposed Modifications to Employment Regulations Regarding Automated-Decision Systems, the first step in initiating formal rulemaking.  Similar to New York City’s law concerning AED tools, if these rules are finalized, they would make it “unlawful for an employer or other covered entity to use qualification standards, employment tests, automated-decision systems, proxies, or other selection criteria if such use has a disparate impact on or constitutes disparate treatment of an applicant or employee or a class of applicants or employees on the basis of one or more characteristics protected by the Act, unless the qualification standards, employment tests, automated-decision systems, proxies, or other selection criteria, as used by the employer or other covered entity, are shown to be job related for the position in question and are consistent with business necessity.”  Bias audits are not an express feature of these proposed rules; however, they may be an important element of a defensible compliance plan.

In an April 3, 2023 Council meeting, staff reported they were in the process of preparing supporting documentation (e.g., Initial Statement of Reasons) that must be submitted to California’s Office of Administrative Law (“OAL”) before the formal notice and comment period can begin.  At that meeting, the Council made clear that they consider this rulemaking package to be a priority, especially because it has been languishing in committee for several years.

At the same time, the California Privacy Protection Agency (“CPPA”) is expected to soon issue proposed rules around automated decisionmaking, and there is a possibility the proposed rules could cover employment data because the CCPA now applies in full to employees and job applicants.  The CPPA conducted preliminary rulemaking around automated decisionmaking (and other topics) during February and March 2023, and the agency is now in the process of drafting proposed regulations.  Before formal rulemaking can begin, the CPPA Board must approve proposed regulations and prepare a supporting rulemaking package for submission to the OAL (just as the Civil Rights Council is doing).

It remains to be seen how the two sets of proposed regulations—from the Civil Rights Council and the CPPA—will relate to each other.  However, in comments made in February 2023, CPPA Executive Director Ashkan Soltani suggested he was not intending to duplicate rules being proposed by the Civil Rights Council.

Recent Federal Guidance on Automated Decisionmaking for Employers

The EEOC has been focused on issues of automated decision practices in employment decisions for some time.  In 2021, it launched the Artificial Intelligence and Algorithmic Fairness Initiative and, soon thereafter, issued guidance describing how the use of ADM tools may negatively impact individuals with disabilities and trigger compliance obligations under the Americans with Disabilities Act.  On January 31, 2023, the EEOC held a public hearing to examine the use of automated systems, including AI, in employment decisions, which addressed the same themes we are seeing in the New York City law and proposed regulations in California: the importance of auditing, considerations of data used for AI and training, explainability and transparency, and the interplay with existing law.  For a further discussion on the EEOC hearing, see our analysis here.

The NLRB has also joined the conversation of late.  On October 31, 2022, the NLRB General Counsel published a memo on employee monitoring and automated decision practices, indicating that she will urge the Board to set standards that would define such practices as unfair labor practices to the extent they interfere with union organizing or other activities protected by labor law.  The memo says she will do so by “vigorously enforcing extant law and by urging the Board to apply settled labor-law principles in new ways.”  Employers should review employee monitoring programs to ensure they do not interfere with rights protected by the National Labor Relations Act.

This post is as of the posting date stated above. Sidley Austin LLP assumes no duty to update this post or post about any subsequent developments having a bearing on this post.