Privacy, Algorithms, and Artificial Intelligence

Privacy and Security; Innovation and Economic Growth; Artificial Intelligence

Article Snapshot

Author(s)

Catherine Tucker

Source

in The Economics of Artificial Intelligence: An Agenda, Ajay K. Agrawal, Joshua Gans, and Avi Goldfarb, eds., University of Chicago Press, 2019 (forthcoming)

Summary

Many economists’ models assume that consumers understand how their data will be used, and these models do not account for how one consumer’s decision to share data affects others. Some artificial intelligence (AI) systems appear to have learned discriminatory behavior, which such simple models do not address.

Policy Relevance

Economists should develop more sophisticated models of privacy. Because AI can use data in ways consumers cannot anticipate, models that assume consumers know how their data will be used become less relevant.

Main Points

  • Economists think of data privacy as an “intermediate good,” assuming that an individual’s desire for privacy depends on how the data will affect her economic situation; however, consumers often understand privacy differently (for example, some avoid data collection even if the data will never be used).
     
  • When AI is used to make decisions based on consumer data, three characteristics of these systems drive privacy concerns: data persistence, unanticipated uses of data, and spillover effects (the processing of one person’s data affects others).
     
  • Once stored, data are difficult to delete completely, leading to “data persistence.”
     
    • Consumers’ privacy preferences often change over time.
       
    • Data such as genetic information can be used to make projections far into the future.
       
  • As AI advances, data are more likely to be used in ways that an individual cannot anticipate.
     
  • An aggregation of data can be sensitive and lead to the identification of an individual, even if a single unit of the data (zip code or gender, for example) is neither sensitive nor identifiable; a brief sketch of this re-identification risk follows this list.
     
  • Economists’ models assume that an individual’s privacy choices depend on the consequences of revealing the information in her interaction with a firm, but do not consider spillovers (how one person’s choices affect others’ preferences or behavior).
     
    • One person’s decision to share genetic information can affect family members.
       
    • AI systems can learn from behavioral data to discriminate (for example, one algorithm was found to have charged Asians more for test preparation software).
       
  • Spillover effects appeared when researchers studied why one AI system was less likely to show a gender-neutral, STEM-related ad to women, even though women were more likely to click on it; because other advertisers pay more to show ads to women (who make many household purchasing decisions), the system showed the ad to fewer women to reduce the overall cost of the campaign (see the second sketch following this list).
     
  • In the future, economists should consider issues such as data persistence, the repurposing of data, and spillover effects when modeling privacy.
     
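A minimal sketch of the aggregation point above (the records, field names, and values here are invented for illustration and are not from the article): counting how many records share each combination of a few innocuous attributes shows that the combination can single a person out even though each attribute on its own appears many times.

    from collections import Counter

    # Hypothetical records: each individual value (zip, gender, birth year)
    # appears several times, so no single field identifies anyone.
    records = [
        {"zip": "02139", "gender": "F", "birth_year": 1985},
        {"zip": "02139", "gender": "F", "birth_year": 1991},
        {"zip": "02139", "gender": "M", "birth_year": 1985},
        {"zip": "02142", "gender": "F", "birth_year": 1985},
        {"zip": "02142", "gender": "M", "birth_year": 1991},
    ]

    # Count how many records share each combination of quasi-identifiers.
    combo_counts = Counter(
        (r["zip"], r["gender"], r["birth_year"]) for r in records
    )

    # Any combination held by exactly one record singles that person out.
    unique = [combo for combo, n in combo_counts.items() if n == 1]
    print(f"{len(unique)} of {len(records)} records are pinned down "
          f"by (zip, gender, birth_year) together.")

In this toy data every individual value is shared by several records, yet every combination is unique; this is the same intuition behind quasi-identifier and k-anonymity analyses in the privacy literature.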

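The ad-delivery spillover can be seen with a back-of-the-envelope sketch (the prices, click rates, and budget below are invented for illustration and are not figures from the study): if impressions shown to women cost more because other advertisers bid for them, a budget-constrained optimizer gets more clicks per dollar from men even though women’s click rate is higher, so it tilts delivery away from women.

    # Hypothetical numbers, for illustration only.
    cost_per_impression = {"women": 0.90, "men": 0.50}  # dollars; women cost more to reach
    click_rate          = {"women": 0.04, "men": 0.03}  # women click the ad more often

    budget = 1_000.0  # total campaign spend, in dollars

    # Clicks per dollar if the budget is spent entirely on one group.
    for group in ("women", "men"):
        impressions = budget / cost_per_impression[group]
        clicks = impressions * click_rate[group]
        print(f"{group}: {impressions:,.0f} impressions, "
              f"{clicks:.0f} expected clicks, "
              f"{clicks / budget:.4f} clicks per dollar")

Under these invented numbers, men yield 0.06 clicks per dollar versus about 0.044 for women, so a cost-minimizing campaign shows the ad to fewer women; the skew is a spillover from other advertisers’ bidding rather than anything about the ad itself.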
 
