New York City Local Law 144 (NYC 144) prohibits employers from using an Automated Employment Decision Tool (AEDT) to screen a candidate or employee for an employment decision unless the tool has been subject to a bias audit conducted within the prior year, and the employer complies with other notice requirements to the candidate or employee. A summary of the results of the audit, including the distribution date of the AEDT, must also be made publicly available on the employer's website.

Employers operating in New York City will need to review whether they are using technology that falls under the definition of AEDT for hiring and promotion decisions. If the technology used falls within the definition, employers who continue to rely on AEDTs will need to conduct a bias audit and comply with the notice provisions under NYC 144.

Enforcement of NYC 144 went into effect on July 5, 2023.

What will be considered an AEDT?

The first question for employers is whether they use a tool in hiring or promotions that would be considered an AEDT. NYC 144 defines AEDT as:

Any computational process, derived from machine learning, statistical modeling, data analytics or artificial intelligence, that issues simplified output, including a score, classification or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.

The definition of AEDT excludes any tool that does not "automate, support, substantially assist or replace discretionary decision-making processes and that does not materially impact natural persons, including, but not limited to, a junk email filter, firewall, antivirus software, calculator, spreadsheet, database, data set or other compilation of data." NYC Admin. Code 20-870.

The final regulations provide further detail on the level of AEDT usage that NYC 144 covers, defining "substantially assist or replace discretionary decision making" as meaning:

to rely solely on a simplified output (score, tag, classification, ranking, etc.), with no other factors considered; or

to use a simplified output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set; or

to use a simplified output to overrule conclusions derived from other factors including human decision making.

The final regulations also provide further detail on tools that would qualify as an AEDT, defining "Machine learning, statistical modeling, data analytics or artificial intelligence" as a group of mathematical, computer-based techniques:

that generate a prediction, meaning an expected outcome for an observation, such as an assessment of a candidate's fit or likelihood of success or that generate a classification, meaning an assignment of an observation to a group, such as categorizations based on skill sets or aptitude; and

for which a computer at least in part identifies the inputs, the relative importance placed on those inputs and, if applicable, other parameters for the models in order to improve the accuracy of the prediction or classification.

The definition is broad. If employers use tools that generate either a decision or a heavily weighted factor, they will likely need to comply with the requirements of NYC 144. The law does not apply only to final hiring or promotion decisions; it also applies broadly to screening.

If the employer uses an AEDT for hiring and promotion decisions, what does it need to do to comply with NYC 144?

Bias audit requirements

To use an AEDT in compliance with NYC 144, the AEDT must have been the subject of a bias audit conducted no more than one year before its use. NYC Admin. Code 20-870. For the audit, employers must separately calculate the AEDT's selection rate and impact ratio for sex categories, race/ethnicity categories, and intersectional categories of sex and race/ethnicity.
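The arithmetic behind those two metrics is straightforward: a category's selection rate is the share of individuals in that category who were selected, and its impact ratio is that rate divided by the highest selection rate across categories. A minimal sketch in Python, with hypothetical category labels and counts (the actual audit must be performed by an independent auditor, not by this kind of in-house calculation alone):

```python
def selection_rate(selected: int, total: int) -> float:
    """Share of applicants in a category who were selected or moved forward."""
    return selected / total

def impact_ratios(counts: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Impact ratio = a category's selection rate divided by the
    selection rate of the most-selected category."""
    rates = {cat: selection_rate(sel, tot) for cat, (sel, tot) in counts.items()}
    best = max(rates.values())
    return {cat: rate / best for cat, rate in rates.items()}

# Illustrative counts only: (selected, total applicants) per category.
counts = {"category_a": (40, 100), "category_b": (30, 100)}
print(impact_ratios(counts))  # category_a -> 1.0, category_b -> 0.75
```

An impact ratio well below 1.0 for a category is the kind of disparity the published audit summary would surface, though NYC 144 itself does not set a pass/fail threshold.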

The bias audit must use historical data, which may come from one or more employers or employment agencies that use the AEDT. An employer may rely on a bias audit that uses other employers' historical data only if it has provided historical data from its own use of the AEDT to the independent auditor, or if it has never used the AEDT. If there is insufficient historical data available to conduct a statistically significant bias audit, test data may be used; in that case, the regulations require the summary of the audit results to explain why historical data was not used.

The bias audit must be conducted by an "independent auditor" as defined in the regulations.

NYC 144 does not prescribe a standard that the bias audit results must meet. The corresponding FAQs expressly state that NYC 144 does not require any specific action based on the results of the bias audit, but they also note the potential applicability of federal, state and local discrimination laws.

Notice requirements

The employer must publish a summary of the results of the most recent bias audit, including the distribution date of the AEDT. The employer must also notify the employee or candidate that an AEDT will be used in connection with their assessment or evaluation. This notice must be given at least ten business days before use of the AEDT, and the candidate may request an alternative selection process. The employer must likewise notify the employee or candidate of the job qualifications and characteristics the AEDT will use, again at least ten business days before use of the AEDT. Notice can be provided in a job posting or by mail or email; alternatively, employers may provide notice on their website or in their written policies.

If an employee or candidate requests the data collected and used by the AEDT, the information must be provided within 30 days of the written request, unless the disclosure would violate local, state or federal law, or interfere with a law enforcement investigation.

Similar proposed legislation

NYC 144 is the first law of its kind aimed specifically at regulating the use of artificial intelligence tools in employment decisions. Several other legislatures have also proposed similar laws.

  • New York (Proposed): Bill A00567 would require employers to conduct a disparate impact analysis annually. The disparate impact analysis may be used as evidence to initiate investigations by the attorney general.
  • New Jersey (Proposed): Bill A4909 would require businesses that sell automated employment decision tools to (i) conduct bias testing within the year before selling the tool, (ii) provide annual bias auditing services to the purchaser at no additional cost and (iii) provide notice that the tool is subject to pending state legislation.
  • California (Proposed): Bill AB331 would require deployers of automated decision tools to perform an impact assessment for any automated decision tool they use.
  • Massachusetts (Proposed): Bill H.1873 would require employers to complete algorithmic impact assessments to evaluate potential risks of discrimination by automated decision systems.

Additional considerations

NYC 144 provides a helpful reference point for how future legislation could approach artificial intelligence in employment decisions. Notably, the law focuses on the results of the tool as opposed to how the tool achieves those results. Focusing on impact as opposed to process is a mechanism we anticipate will be used in future legislation since impact is more readily measurable.

In reviewing the impact of these tools, it is also important for employers to remain cognizant of privacy considerations for the data sets that fuel the tools, and to review the tools themselves for bias.

This area is rapidly developing, and employers currently using or considering artificial intelligence will need to ensure that their reliance on these technologies complies with applicable requirements and, more importantly, does not subject them to increased legal risk.
