Bias in Recruitment Software To Be ‘Illegal’ in New York, Vendors Will Need Bias Audit

A new law from New York’s City Council will in effect make algorithmic bias in recruitment software illegal. It demands that vendors of such candidate-filtering tools submit to a bias audit, with civil penalties for those who fail to conduct an ‘impartial evaluation’.

The new rules will come into force in just under two years and add weight to the strengthening legal position around the world that algorithms must be transparent and that their designers, or in this case their sellers, should be held accountable.

The move could generate a new stream of work for lawyers as New York, and other parts of the US (see more below), legislate against the imperfections of automated decision-making.

The rules specifically target ‘any system [used for employment decisions] whose function is governed by statistical theory’, a very wide remit that covers ‘linear regression, neural networks, decision trees, and other learning algorithms’.

The law, Int. No. 1894 (see full text below), demands that any company selling such automated systems conduct a ‘bias audit’ to show that the software complies with local employment law (see more below on why this will be a tricky challenge).

In practical terms this makes algorithmic bias illegal, as software that has been audited and shown to be biased, following what the new law calls an ‘impartial evaluation’, would presumably be unsellable.

Moreover, if a software company went ahead and sold a system that had been classified as ‘biased’, it would logically leave the vendor open to serious legal challenges and civil claims. In short, the obligatory audit will inevitably mean that programmed bias is ‘illegal’.

The law also states that candidates must be told that an automated system was used to assess and filter their application, and must be told what job qualifications or characteristics the software assessed.

Moreover, the civil penalties are designated as targeting the ‘person’ who sells such software without an audit, which presumably scoops up not just the company making and selling the software, but resellers as well.

Overall, this is a major challenge for anyone providing recruitment software.

This move in New York is itself the result of several years of legislative steps there declaring the need to remove bias from software. Elsewhere in the US we have also seen [1]:

  • On March 28, 2019, Idaho signed into law a bill requiring public disclosure of all documents, data, and other information used to create a pretrial risk assessment tool.
  • Illinois’ Artificial Intelligence Video Interview Act was signed into law on August 9, 2019.
  • The New Jersey Algorithmic Accountability Act, which would require impact assessments of automated decision systems.

But, will it work? Here are a few challenges that Artificial Lawyer can see:

  • First, who is qualified to do such audits of what can be very complex software?
  • The bias audit in New York is designed to be measured against existing employment rules, but these can be open to interpretation; otherwise we wouldn’t need employment lawyers. How does a software expert, for example, handle the subjective interpretation of employment law?
  • Where does the test sample of CVs come from? To test for bias you need to run a large number of sample CVs through a recruitment system. But where do these come from, and what balance of CVs do you include? If, for example, 600 of 1,000 test CVs were from male candidates, you would already have a bias in the test sample (see the sketch after this list for how such a skew feeds into a standard bias check). Yet a perfect 50/50 split of test CVs for every characteristic defined under the law, from disabilities to gender, would be hard to create, would likely be ‘over-manufactured’, and would not provide a ‘real world’ test.
  • Who guards the guards? Bias in human decisions is always a subjective matter, dependent on current cultural positions. The auditors may ignore what others would see as bias, and then we have a fight over auditing the bias auditors. And so on…
  • Companies that use such software, if it is machine learning software, will need to train it on their own data, i.e. the CVs of the people they already employ. Stopping bias at the point of sale may therefore be pointless, as companies will reshape the software’s behaviour by training it themselves. So, this may demand further legislation against bias in users’ training efforts.
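
On the sampling point above, note that the law defines what a bias audit is, but not how one should be scored. Below is a minimal sketch, in Python, of the kind of check an auditor might run: it applies the EEOC’s ‘four-fifths’ rule of thumb, under which a selection-rate ratio between groups below 0.8 is treated as evidence of adverse impact. The sample composition and shortlisting numbers are invented purely for illustration.

```python
from collections import Counter

def disparate_impact(results):
    """results: iterable of (group, selected) pairs, e.g. ("male", True).

    Returns (ratio, rates): the per-group selection rates and the
    ratio of the lowest rate to the highest. The EEOC 'four-fifths'
    rule of thumb flags ratios below 0.8 as evidence of adverse impact.
    """
    totals, shortlisted = Counter(), Counter()
    for group, selected in results:
        totals[group] += 1
        if selected:
            shortlisted[group] += 1
    rates = {g: shortlisted[g] / totals[g] for g in totals}
    return min(rates.values()) / max(rates.values()), rates

# Hypothetical audit sample: 600 male and 400 female test CVs,
# of which the tool shortlists 90 and 45 respectively.
sample = ([("male", True)] * 90 + [("male", False)] * 510
          + [("female", True)] * 45 + [("female", False)] * 355)

ratio, rates = disparate_impact(sample)
print(rates)             # {'male': 0.15, 'female': 0.1125}
print(round(ratio, 2))   # 0.75 -- below the 0.8 threshold
```

Note that the 0.75 result here is driven as much by the made-up test sample as by the tool being audited, which is exactly the sampling problem raised in the list above.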

And that’s just the start of it. Eliminating bias is a philosophical impossibility, as everyone has biases; it’s just that some types of bias are legislated against and/or socially unacceptable. But bias of some type, which can often be ‘benign’, will always be there.

Conclusion: trying to make sure that automated systems obey the law is totally the right thing to do. However, actually achieving this is likely a lot harder than it looks on paper.

Welcome to the new world – and a new practice area for lawyers – of the algorithmic bias audit.

Algorithmic Bias Local Law for New York, to take effect Jan 2022

Int. No. 1894 – By Council Members Cumbo, Ampry-Samuel, Rosenthal, Cornegy, Kallos, Adams, Louis, Chin and Ulrich

A Local Law to amend the administrative code of the city of New York, in relation to the sale of automated employment decision tools.

Be it enacted by the Council as follows:

Section 1. Chapter 5 of title 20 of the administrative code of the city of New York is amended by adding a new subchapter 21 to read as follows:

Subchapter 21

Sale of Automated Employment Decision Tools

§ 20-840 Definitions. For the purposes of this subchapter, the following terms have the following meanings:

Automated employment decision tool. The term “automated employment decision tool” means any system whose function is governed by statistical theory, or systems whose parameters are defined by such systems, including inferential methodologies, linear regression, neural networks, decision trees, random forests, and other learning algorithms, which automatically filters candidates or prospective candidates for hire or for any term, condition or privilege of employment in a way that establishes a preferred candidate or candidates.

Bias audit. The term “bias audit” means an impartial evaluation, including but not limited to testing, of an automated employment decision tool to assess its predicted compliance with the provisions of section 8-107 and any other applicable law relating to discrimination in employment.

Employment decision. The term “employment decision” means to screen candidates for employment or otherwise to help to decide compensation or any other terms, conditions or privileges of employment in the city.

§ 20-841 Requirements for automated employment decision tools.

a. It shall be unlawful to sell or offer for sale in the city an automated employment decision tool that does not comply with the provisions of this subdivision.

1. Such tool shall be the subject of a bias audit conducted in the past year prior to selling or offering for sale such tool.

2. Every sale of such tool shall include, at no additional cost, an annual bias audit service that provides the results of such audit to the purchaser.

3. Such tool shall be sold or offered for sale with a notice stating that such tool is subject to the provisions of the local law that added this subchapter.

b. Candidate notice required. Any person who uses an automated employment decision tool to screen a candidate for an employment decision shall notify each such candidate of the following within 30 days of such use:

1. That an automated employment decision tool required by this local law to be audited for bias was used in connection with the candidate’s candidacy; and

2. The job qualifications or characteristics that such tool was used to assess in the candidate.

§ 20-842 Penalties.

a. Any person that violates any provision of this subchapter or any rule promulgated pursuant to this subchapter is liable for a civil penalty of not more than $500 for that person’s first violation and each additional violation occurring on the same day as the first violation, and not less than $500 nor more than $1,500 for each subsequent violation.

b. Violations shall accrue on a daily basis for each automated employment decision tool that is sold or offered for sale in violation of subdivision a of section 20-841.

c. Each instance in which notice is not provided to a candidate within 30 days in violation of subdivision b of section 20-841 constitutes a single violation, and each 30-day period thereafter in which such notice is not provided to such candidate constitutes a separate violation.

d. A proceeding to recover any civil penalty authorized by this subchapter is returnable to any tribunal established within the office of administrative trials and hearings or within any agency of the city designated to conduct such proceedings.

§ 20-843 Enforcement. The commissioner may initiate in any court of competent jurisdiction any action or proceeding that may be appropriate or necessary for correction of any violation issued pursuant to this subchapter, including mandating compliance with the provisions of this chapter or such other relief as may be appropriate.

§ 20-844 Rules. The department, the Commission on Human Rights and any other agency designated by the mayor may promulgate such rules as it deems necessary to implement and enforce the provisions of this subchapter.

§ 20-845 Construction. The provisions of this subchapter shall not be construed to limit any right of any candidate for an employment decision to bring a civil action in any court of competent jurisdiction, or to limit the authority of the city commission on human rights to enforce the provisions of title 8, in accordance with law.

§ 2. This local law takes effect on January 1, 2022.

[1] Thanks to US law firm Proskauer Rose for these examples.