ACADEMIC ARTICLE SUMMARY

Prediction, Preemption, Presumption: How Big Data Threatens Big Picture Privacy

Article Source: Stanford Law Review Online, September 2013
Written By:

 Jessica Earle


ARTICLE SUMMARY

Future search engines could use artificial intelligence (AI) and big data to predict users’ desires and needs. Governments and others could use big data to predict people’s behavior. This could undermine our right to travel, the presumption of innocence, and other rights.

POLICY RELEVANCE

The greatest danger from big data is its potential to reduce the opportunities of “high-risk” people.

KEY TAKEAWAYS
  • The use of “big data” promises to enable us to anticipate future needs and concerns, plan strategically, avoid loss, and manage risk; Google is developing intelligent search capabilities to offer information to users before they are aware that they desire it.
  • Presently, our legal system imposes penalties on wrongdoers only after a wrong has been committed; big data could shift our focus to preventing wrongs before they occur.
  • Big data enables three types of predictions:
    • Consequential predictions allow individuals to avoid harm by outlining the consequences of a certain course of action.
    • Preferential predictions allow others to anticipate one’s desires, usually to sell goods and services.
    • Preemptive predictions are used to reduce one’s range of future actions.
  • Preemptive predictions threaten privacy and the value our society places on due process.
    • “High-risk” individuals could lose the right to travel through an expanded “no-fly” list.
    • Firms could narrow the pool of job candidates by using big data to identify prospects rather than requesting resumes.
  • The presumption of innocence and other due process values prevent some people from being excluded from society; privacy rights help to restrict what others may assume about us.
  • Our legal framework provides no accountability for the use of preemptive predictions to restrict our rights or opportunities.


Ian Kerr

About Ian Kerr

Ian Kerr holds the Canada Research Chair in Ethics, Law & Technology at the University of Ottawa, Faculty of Law, with cross appointments to the Faculty of Medicine, Department of Philosophy, and School of Information Studies. He teaches in the areas of moral philosophy and applied ethics, internet and e-commerce law, contract law, and legal theory. Professor Kerr has published books and articles on topics at the intersection of ethics, law, and technology, and is currently engaged in research on two broad themes: privacy and surveillance, and human-machine mergers.