ACADEMIC ARTICLE SUMMARY
The Paradox of Automation as Anti-Bias Intervention
Article Source: Cardozo Law Review, Vol. 41, No. 5, pp. 1671-1742, 2020
ARTICLE SUMMARY
Automated decision-making systems may facilitate bias. Job seekers lack access to the data and algorithms used by automated hiring systems, hindering plaintiffs' efforts to prove "disparate impact" discrimination under Title VII.
POLICY RELEVANCE
New legal doctrines are needed to support equal opportunity in employment.
KEY TAKEAWAYS
- Some expect that replacing human decision-makers with automated decision-making systems will eliminate bias; in practice, automated systems sometimes amplify it.
- A case study of algorithmic systems used in the hiring process reveals problematic features of these systems at odds with the principle of equal opportunity in employment.
- Automated background checks of social media incorporate unwarranted assumptions that an applicant’s private behavior (like swearing) is relevant to their professional behavior.
- Systems that analyze facial expressions struggle to read the expressions of those with darker skin.
- Such checks also reveal information that employers are not supposed to consider, such as pregnancy status.
- Bias is introduced in the hiring process by the legal system's deference to employers, who use nebulous criteria such as "cultural fit"; some firms now prefer to focus more on "values fit."
- Legal frameworks that ensure accountability for technological hiring tools are lacking, making bias difficult to detect.
- New legal approaches could support the liability of employers and makers of algorithmic hiring systems; for example, hiring platforms could serve as “information fiduciaries” of job applicants.
- A new doctrine of discrimination per se should be created, modelled on the idea of negligence per se.
- An employer's failure to audit and correct automated hiring systems that have a disparate impact should serve as prima facie evidence of discriminatory intent.
- The employer could rebut this evidence by showing business necessity.
- Legal protections should be established for consumers modelled on the Fair Credit Reporting Act, so that consumers may access the information collected and used by automated hiring systems.