Algorithmic Bias? An Empirical Study of Apparent Gender-Based Discrimination in the Display of STEM Career Ads

Privacy and Security, Innovation and Economic Growth, and Artificial Intelligence

Article Snapshot

Author(s)

Anja Lambrecht and Catherine Tucker

Source

Management Science, Vol. 65, Issue 7, pp. 2947–3448 (July 2019). Published online April 10, 2019.

Summary

The use of algorithms to make decisions can lead to bias: an ad designed to be gender neutral was displayed to more men than women. The ad-delivery algorithm was designed to spend its budget cost-effectively; because advertisers pay more to display ads to young women, the ad was shown to fewer women.

Policy Relevance

Algorithms may generate biased results unintentionally. Firms’ efforts to correct bias may not be permitted if the effort requires targeting one gender.

Main Points

  • An ad promoting job opportunities in science, technology, engineering, and math (STEM) fields was designed to be gender neutral, but the algorithm running the ad campaign displayed the ad to 20% more men than women.
     
  • Had women been less likely than men to click on the ad, observers might conclude that the algorithm had learned to discriminate in order to maximize clicks; in fact, women were more likely than men to click on the ad, so this explanation is unlikely.
     
  • The algorithm’s behavior cannot be attributed to historic discrimination against women; the algorithm displayed no greater bias in countries where women are strongly discouraged from entering STEM fields than in countries with less discrimination.
     
  • The best explanation for the algorithm’s choices is that advertisers worldwide must pay more for the privilege of showing ads to women, who control a large proportion of household expenditures; a cost-effective ad campaign therefore displays ads to more men than women (a minimal sketch of this mechanism follows this list).
     
  • Some economists argue that market forces limit or erode discrimination; this study shows, however, that economic forces can sometimes reinforce it.
     
  • Algorithmic bias will be hard to regulate; in this case, giving outsiders access to the algorithm (greater transparency) would not have helped to predict a biased outcome.
     
  • When an algorithm generates a biased result, observers should not assume that this was intentional.
     
  • Because firms are liable for gender discrimination, their efforts to correct for unintentional bias might be disallowed; for example, running one ad campaign targeted at women and another at men, to ensure that the ad is displayed to equal numbers of each, would not be permitted.
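
The pricing mechanism described in the fourth point above can be illustrated with a minimal sketch in Python. This is not the paper’s model; the prices and the even budget split are illustrative assumptions. The point is that even a campaign whose objective never mentions gender delivers more impressions to men when impressions shown to women cost more.

# Minimal sketch, not the paper’s model: a “gender neutral” campaign that
# splits its budget evenly across audience segments still buys more
# impressions in whichever segment is cheaper to reach.

def impressions_from_even_split(budget, price_per_impression):
    """Spend the same dollar amount on each segment and return how many
    impressions that amount buys per segment."""
    share = budget / len(price_per_impression)
    return {seg: round(share / price) for seg, price in price_per_impression.items()}

# Hypothetical prices in dollars per impression: advertisers pay more to
# reach women than to reach men.
prices = {"women": 0.05, "men": 0.04}
print(impressions_from_even_split(200.0, prices))
# -> {'women': 2000, 'men': 2500}: 25% more impressions go to men, even
#    though gender never enters the campaign’s objective.

Under these assumed prices the skew is 25%; the study observed a skew of roughly 20% in the actual campaign.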
