The Paradox of Automation as Anti-Bias Intervention

Artificial Intelligence and Privacy and Security

Article Snapshot


Ifeoma Ajunwa


Cardozo Law Review, Vol. 41, No. 5, pp. 1671-1742, 2020


Automated decision-making systems may facilitate bias. Job seekers lack access to the data and algorithms used by automated hiring systems, hindering plaintiffs' efforts to prove "disparate impact" discrimination under Title VII.

Policy Relevance

New legal doctrines are needed to support equal opportunity in employment.

Main Points

  • Some expect that removing humans from decision-making processes and replacing them with automated decision-making systems will eliminate bias; however, automated systems sometimes amplify bias.
  • A case study of algorithmic systems used in the hiring process reveals problematic features of these systems at odds with the principle of equal opportunity in employment.
    • Automated background checks of social media incorporate unwarranted assumptions that an applicant’s private behavior (like swearing) is relevant to their professional behavior.
    • Systems that analyze facial expressions struggle to read the expressions of those with darker skin.
    • Background checks reveal information that employers are not supposed to consider, such as pregnancy status.
  • Bias is introduced in the hiring process by the legal system's deference to employers, who use nebulous criteria such as "cultural fit"; some firms now prefer to focus more on "values fit."
  • Legal frameworks that assure accountability for technological hiring tools are lacking, making bias difficult to detect.
  • New legal approaches could support the liability of employers and makers of algorithmic hiring systems; for example, hiring platforms could serve as “information fiduciaries” of job applicants.
  • A new doctrine of discrimination per se should be created, modelled on the idea of negligence per se.
    • An employer's failure to audit and correct automated hiring systems that have a disparate impact should serve as prima facie evidence of discriminatory intent.
    • The employer could rebut this evidence by showing business necessity.
  • Legal protections should be established for consumers modelled on the Fair Credit Reporting Act, so that consumers may access the information collected and used by automated hiring systems.
