Key takeaways:
- To help celebrate the new year, we have another proposal on regulating the use of artificial intelligence for hiring to consider. On Friday, December 23, 2022, the New York City Department of Consumer and Worker Protection released revised proposed rules (the “Revised Proposed Rules”) to implement its Automated Employment Decision Tool Law.
- The Revised Proposed Rules include a number of notable changes, some of which appear to be direct responses to suggestions and concerns raised during the initial public comment period. The most significant changes relate to the types of data that must be used to conduct the bias audit and the circumstances under which an employer can rely on a vendor’s bias audit.
- Employers using covered AI employment tools should consider assessing any compliance gaps they may have under the Revised Proposed Rules and how those gaps can be closed before the AEDT Law becomes enforceable.
To help celebrate the new year, we have another proposal on regulating the use of artificial intelligence for hiring to consider. On Friday, December 23, 2022, the New York City Department of Consumer and Worker Protection (the “DCWP”) released revised proposed rules (the “Revised Proposed Rules”) to implement its Automated Employment Decision Tool Law (the “AEDT Law” or the “Law”). We recently wrote about the first set of proposed rules for this Law, which requires employers to conduct annual independent bias audits and to publicly post a summary of those results. We also wrote about the City’s decision to postpone enforcement of the new law until April 15, 2023 and to hold a second public hearing on the implementing rules. This second hearing will be held on Monday, January 23, 2023 at 11 AM ET.
The Revised Proposed Rules include a number of notable changes, some of which appear to be direct responses to suggestions and concerns raised during the initial public comment period. The most significant changes relate to the types of data that must be used to conduct the bias audit and the circumstances under which an employer can rely on a vendor’s bias audit. In this Debevoise Data Blog post, we discuss these proposed changes and how they will impact employers’ compliance obligations in New York City and beyond.
Bias Audit Data Requirements. The Revised Proposed Rules resolve an ambiguity regarding what data should be used to conduct the bias audit required under the Law. It is now clear that employers must use “historical data,” which is defined as data collected during an employer’s or an employment agency’s use of an AEDT to assess candidates for employment or employees for promotion. If there is not sufficient historical data available for a statistically significant bias audit, “test data” may be used, but the public summary of the bias audit results must then explain why historical data was not used and describe how the test data was generated. “Test data” is defined as any data other than historical data.
A key challenge in applying the AEDT Law is that most employers do not collect demographic information (such as race, gender and ethnicity) from job applicants. So, to the extent that the term “historical data” includes demographic data, most employers will not have sufficient historical data to conduct a statistically significant bias audit. The question then is whether (1) those employers can use test data (including sample data, synthetic data or other data that they did not collect) to conduct their bias audits, or (2) those employers must use the historical data that they do have and infer the missing demographic data, for example, from candidate names and whatever location data they collect.
There are several ways that employers can infer the race, gender and ethnicity of job applicants from name and location data. For example, the Consumer Financial Protection Bureau (the “CFPB”) has been using the Bayesian Improved Surname Geocoding (“BISG”) proxy methodology since 2014 to infer the likelihood that a credit applicant is a member of one of several racial or ethnic groups. And recently, the CFPB has been experimenting with a more refined Bayesian Improved First Name Surname Geocoding (“BIFSG”) proxy methodology, as well as a gender proxy methodology. There are also private companies that purport to be able to infer demographic information from a person’s name. It is unclear whether the DCWP expects employers to use one of these methods and, if so, how much uncertainty their use will introduce into the audit results, considering that these methodologies are not 100% accurate at inferring race, ethnicity or gender.
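To illustrate the Bayesian updating at the core of BISG, here is a minimal Python sketch. The probability tables, surname and census tract below are invented for illustration; the CFPB’s actual methodology draws the surname priors from Census surname tables and the geography distributions from geocoded Census data.

```python
# A minimal sketch of BISG-style Bayesian updating with made-up tables;
# not the CFPB's implementation or data.
# P(race | surname, geography) is proportional to
#     P(race | surname) * P(geography | race)

# Hypothetical P(race | surname) from a surname table.
p_race_given_surname = {
    "GARCIA": {"Hispanic": 0.92, "White": 0.05, "Black": 0.01, "Asian": 0.02},
}

# Hypothetical P(geography | race): the share of each group's national
# population that lives in the given census tract.
p_geo_given_race = {
    "tract_123": {"Hispanic": 0.004, "White": 0.001, "Black": 0.002, "Asian": 0.001},
}

def bisg_posterior(surname: str, tract: str) -> dict[str, float]:
    """Combine the surname prior with the geography likelihood and normalize."""
    prior = p_race_given_surname[surname]
    likelihood = p_geo_given_race[tract]
    unnormalized = {race: prior[race] * likelihood[race] for race in prior}
    total = sum(unnormalized.values())
    return {race: value / total for race, value in unnormalized.items()}

print(bisg_posterior("GARCIA", "tract_123"))
```

As the output shows, the result is a probability distribution over groups, not a definitive label, which is one reason the accuracy concerns noted above matter for the reliability of audit results.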
Leveraging the Vendor’s Bias Audit. The examples provided in the Revised Proposed Rules appear to resolve an ambiguity in the DCWP’s first set of proposed rules by suggesting that employers need not conduct their own bias audit and can instead, under certain circumstances, rely on an independent bias audit conducted for the vendor that provided the employer with the AEDT. But the Revised Proposed Rules clarify that, where multiple employers use the same AEDT, an employer may rely on a bias audit of that tool only if it provided historical data from its own use of the AEDT to the independent auditor for the bias audit or if it has never used the AEDT. Again, it is unclear how to treat historical data that does not include demographic information. For example, is the employer obligated to infer demographic information using BISG or another method before sharing that data with the independent auditor in order to be able to rely on the audit being conducted for the AEDT provider?
Revised Definition of AEDT. The Revised Proposed Rules also narrow the Law’s scope to employment decisions that are made (i) based solely on the AEDT’s output; (ii) based on a number of factors, where the AEDT’s output is weighted more than any other criterion; or (iii) based on the AEDT’s output overruling conclusions derived from other factors, including human decision making.
Enhanced Requirements for an “Independent” Auditor. The Revised Proposed Rules provide some additional insight as to who can and cannot qualify as an independent auditor by listing three circumstances where an auditor would not be considered independent:
- If an auditor is or was involved in using, developing or distributing the AEDT;
- If an auditor, at any point during the bias audit, has an employment relationship with an employer or employment agency that seeks to use or continue to use the AEDT or with a vendor that developed or distributes the AEDT; or
- If an auditor, at any point during the bias audit, has a direct financial interest or a material indirect financial interest in an employer or employment agency that seeks to use or to continue to use the AEDT or in a vendor that developed or distributes the AEDT.
The independent auditor will, of course, be paid for its work, so a disqualifying financial interest must be something more than the audit fee itself. But it is unclear whether, for example, an auditor is prohibited from having any other ongoing business relationship with the employer seeking to rely on the audit.
Clarity on Groups and Intersectional Categories Subject to Bias Audit Calculations. The DCWP’s first set of proposed rules provided that, as part of the bias audit, calculations needed to be conducted for intersectional groupings (e.g., Asian females, Black males, etc.). The Revised Proposed Rules make clear that such calculations must also be conducted separately for standalone sex and race/ethnicity categories (e.g., male, female, Hispanic, Black, Asian, White, etc.).
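For a sense of what these groupings mean in practice, the following Python sketch (with invented applicant records, not DCWP-prescribed code) computes selection rates and impact ratios separately for standalone sex categories, standalone race/ethnicity categories and intersectional categories:

```python
# Hypothetical selection-rate calculations for standalone and
# intersectional categories; the records below are invented.
from collections import defaultdict

# (sex, race/ethnicity, selected?) for each applicant.
applicants = [
    ("Female", "Asian", True),
    ("Female", "Black", False),
    ("Female", "White", True),
    ("Male", "Asian", False),
    ("Male", "Hispanic", True),
    ("Male", "White", True),
]

def selection_rates(records, key):
    """Selection rate per category: number selected / number of applicants."""
    selected, total = defaultdict(int), defaultdict(int)
    for sex, race, chosen in records:
        category = key(sex, race)
        total[category] += 1
        selected[category] += chosen
    return {cat: selected[cat] / total[cat] for cat in total}

# Standalone sex, standalone race/ethnicity and intersectional groupings.
groupings = {
    "sex": lambda sex, race: sex,
    "race/ethnicity": lambda sex, race: race,
    "intersectional": lambda sex, race: (sex, race),
}
for name, key in groupings.items():
    rates = selection_rates(applicants, key)
    top = max(rates.values())
    # Impact ratio: each category's selection rate relative to the highest.
    print(name, {cat: round(rate / top, 2) for cat, rate in rates.items()})
```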
Adjustment to the “Impact Ratio” Calculation for Scoring Tools. Under the DCWP’s first set of proposed rules, if an AI hiring tool’s output produced scores that ranked employees or candidates, the impact ratio for each demographic group had to be calculated by taking the average score of individuals in each sex, ethnicity, race or intersectional category and dividing that by the average score of individuals in the highest-scoring demographic group.
Under the Revised Proposed Rules, when an AEDT scores candidates, employers must:
- Calculate the median score for the full sample of applicants;
- Calculate the rate at which individuals in each category receive a score above the median score (the “scoring rate”); and
- Calculate the impact ratio for each category by dividing the scoring rate of each group by the scoring rate of the highest-scoring group.
The method for calculating an “impact ratio” for a tool that selects, rather than scores, employees and candidates remains the same.
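To make the revised scoring-rate calculation concrete, here is a minimal Python sketch of the three steps, using invented (category, score) records:

```python
# Hypothetical scoring-rate impact ratio under the Revised Proposed Rules;
# the categories and scores below are invented for illustration.
from statistics import median

scores = [
    ("Female", 0.82), ("Female", 0.44), ("Female", 0.71),
    ("Male", 0.91), ("Male", 0.38), ("Male", 0.66),
]

# Step 1: the median score for the full sample of applicants.
overall_median = median(score for _, score in scores)

# Step 2: the "scoring rate" -- the share of each category scoring above
# the sample-wide median.
scoring_rate = {}
for category in {cat for cat, _ in scores}:
    category_scores = [s for cat, s in scores if cat == category]
    scoring_rate[category] = (
        sum(s > overall_median for s in category_scores) / len(category_scores)
    )

# Step 3: the impact ratio -- each category's scoring rate divided by the
# scoring rate of the highest-scoring category.
top_rate = max(scoring_rate.values())
impact_ratio = {cat: rate / top_rate for cat, rate in scoring_rate.items()}

print(f"median={overall_median}", scoring_rate, impact_ratio)
```

With these made-up numbers, the overall median is 0.685, so the Female scoring rate is 2/3 and the Male scoring rate is 1/3, for impact ratios of 1.0 and 0.5, respectively.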
Next Steps. As mentioned above, the DCWP’s second public hearing will take place on January 23, 2023, less than three months before the new enforcement date (April 15, 2023). The Revised Proposed Rules provide employers with a clearer roadmap to compliance, but they also create new hurdles for some employers, with little time to come into full compliance (and the possibility remains that additional changes will be made before the rules are final). Employers using covered AI employment tools should consider assessing any compliance gaps they may have under the Revised Proposed Rules and how those gaps can be closed before the AEDT Law becomes enforceable.