When Machine Learning is Facially Invalid

Privacy and Security; Artificial Intelligence

Article Snapshot

Author(s)

Frank Pasquale

Source

Communications of the ACM, Vol. 61, No. 9, pp. 25-27, 2018

Summary

Some researchers claim that systems based on artificial intelligence (AI) can identify criminals from their facial features. Such technologies may be biased, and could lead to greater surveillance and social control.

Policy Relevance

AI-based systems should be used with extreme caution in criminal law.

Main Points

  • Some researchers claim that machine learning systems can infer a subject’s sexual orientation, intelligence, or tendencies to commit criminal acts from the subject’s facial features; this idea is deeply troubling.
     
  • Critics of facial inference studies point out that these studies can be misleading; for example, a training dataset consisting of prisoners' faces omits the large percentage of criminals who are never caught, so the system may learn what predicts being caught rather than what predicts offending (a minimal simulation of this sampling bias appears after this list).
     
  • Facial inferences of criminality can set up self-fulfilling prophecies; if surveillance of an area is increased because an AI system predicts its residents will commit crimes, more crimes will be detected there simply because of the added surveillance, even if the prediction was unfounded (see the second sketch after this list).
     
  • When AI is used to analyze natural phenomena such as whether diamonds are likely to be found at a certain site, "whatever works" is an acceptable approach; however, when analyzing human behavior, AI developers should be able to explain why and how their analysis works.
     
  • Use of AI to identify criminals based on their facial features will tend to reinforce social structures that encourage criminal behavior.
     
  • Use of AI to predict criminality based on facial features is inconsistent with the rule of law; the Fourth Amendment requires police to explain why they suspect an individual, and a statistical projection will not suffice.
     
  • Because everyone shares some characteristics with criminals, “black box” predictive technologies could give the police an excuse to investigate nearly anyone; when true offenders are rare, even a seemingly accurate predictor flags mostly innocent people (a worked base-rate calculation follows this list).
     
  • Europe's General Data Protection Regulation (GDPR) gives citizens a "right to explanation" of how automated systems in the private sector make decisions; a commitment to science and justice requires that such rights be taken seriously.
     

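Illustrative Sketches

The sketches below are not from the article; they are minimal Python illustrations of three of the points above, using invented numbers chosen only for demonstration.

The first sketch illustrates the sampling-bias critique. Suppose some visible trait has nothing to do with offending but raises the chance of being caught (for instance, because policing is heavier where the trait is common). A training set built from prisoners' faces will then over-represent that trait among "criminals," and a model trained on it will treat the trait as a marker of criminality. Every quantity here is an assumption.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    # Assumed setup: a visible trait unrelated to offending, but correlated
    # with the probability of being caught.
    trait = rng.normal(size=n)
    offender = rng.random(n) < 0.10                   # 10% offend, independent of trait
    p_caught = np.clip(0.2 + 0.3 * trait, 0.0, 1.0)   # chance of arrest tracks the trait
    caught = offender & (rng.random(n) < p_caught)

    # A training set of prisoners' faces sees only the caught offenders.
    print(f"mean trait, all offenders:    {trait[offender].mean():+.3f}")  # near 0
    print(f"mean trait, caught offenders: {trait[caught].mean():+.3f}")    # well above 0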
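The second sketch illustrates the self-fulfilling prophecy. Two districts generate identical numbers of true crimes, but a hypothetical risk model flags district B, so B receives twice the patrol hours. Because detection requires police presence, B's recorded crime comes out higher, which appears to confirm the model. The crime counts, patrol hours, and detection rate are all invented.

    # Both districts generate the same number of true crimes.
    true_crimes = {"A": 200, "B": 200}
    # The model flags B, so B receives twice the patrol hours.
    patrols = {"A": 50, "B": 100}
    # Detection requires presence: each patrol hour adds 0.4% detection probability.
    detect_prob = {d: min(1.0, 0.004 * patrols[d]) for d in patrols}
    recorded = {d: round(true_crimes[d] * detect_prob[d]) for d in patrols}
    print(recorded)  # {'A': 40, 'B': 80}: the flagged district looks twice as criminal
    # Retraining the model on these recorded counts would reinforce the
    # unfounded prediction about B rather than correct it.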
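The third sketch is the base-rate calculation behind the "excuse to investigate nearly anyone" point, again with assumed figures: if only 1% of people are offenders, a detector that is right 90% of the time on both offenders and innocents still flags far more innocent people than offenders.

    # Bayes' rule with assumed numbers.
    prevalence = 0.01    # assume 1% of the population are offenders
    sensitivity = 0.90   # P(flagged | offender)
    specificity = 0.90   # P(not flagged | innocent)

    p_flagged = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    ppv = sensitivity * prevalence / p_flagged
    print(f"flagged people who are actually offenders: {ppv:.1%}")  # about 8.3%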