Notice and Consent in a World of Big Data

By TAP Guest Blogger

Posted on November 26, 2012

Written by Professor Fred H. Cate, Maurer School of Law, Indiana University and Professor Viktor Mayer-Schönberger, Oxford Internet Institute.
In 1980 the OECD adopted Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. Those guidelines, which form the basis of most privacy legislation around the world, require that the processing of personal information be lawful. In practice this means that either the processing is explicitly permissible under law or the individual whose personal data is being processed—after being informed of the reason, context, and purpose of the processing—has given consent.
Over the years, and especially in the context of the Internet, this system of “notice and consent” has become the dominant mechanism for data protection, assuming undue importance in policy debates and popular discussions about information privacy. As a result, or perhaps as a cause, ensuring individual control over personal data is widely perceived as the goal of data protection and is often highlighted as such by political leaders and commentators.
Today, almost everywhere we venture, especially online, individuals are presented with long and complex privacy notices routinely written by lawyers for lawyers, and then requested to either “consent” or abandon the desired service. That binary choice is not what the privacy architects envisioned three decades ago when they imagined empowered individuals making informed decisions about the processing of their personal data. In practice, it certainly is not the optimal mechanism to ensure that either information privacy or the free flow of information is being protected.
Even more challenging is the fact that in the age of “Big Data,” data are collected and processed so often as to make opportunities to consent an unacceptable burden for most individuals. (To take just one example, the New York Times reported this summer that one U.S. company that practically no one has ever heard of engages in more than 50 trillion transactions involving recorded personal data every year.) Moreover, the value of personal information is rarely apparent at the time of collection, when notice and consent are normally sought; going back to consumers for “re-consent” is often prohibitively expensive, especially if the subsequent user is not the original collector.
These realities challenge the dominant current privacy mechanism of notice and consent. They can leave individuals’ privacy badly exposed, as individuals are forced to make overly complex decisions based on limited information, while data processors can perhaps too easily point to the formality of notice and consent and thereby abdicate much of their responsibility. At the same time, current privacy mechanisms can unduly interfere with the innovation potential of data use. These challenges require a rational reassessment of the privacy landscape, as well as an evaluation of the optimal mix of mechanisms available to protect information privacy in a world that is beginning to realize the latent value of Big Data.
To help provide that reassessment, between May and August of 2012, Microsoft sponsored a series of regional privacy dialogues in Washington, D.C., Brussels, Singapore, Sydney and São Paulo, featuring small groups of leading regulators, industry executives, public interest advocates, and academic experts. These events culminated in a global privacy summit in Redmond, Washington, at which more than 70 privacy and data protection experts convened from 19 countries on five continents to consider the future of data sources and uses and practical steps to enhance privacy protection.
The participants offered practical suggestions for how to move data protection forward, beyond notice and choice, to provide better protection for both information privacy and valuable data flows in the emerging world of Big Data and cloud computing:
  • A key sentiment expressed in all of the discussions was that new approaches must shift responsibility away from data subjects and toward data users, with a focus on institutional accountability for responsible data stewardship rather than mere compliance.
  • One of the most widely discussed alternatives was focusing more attention on the “use” of personal information rather than on its “collection,” given the increasingly pervasive nature of data collection and surveillance, inexpensive data storage and sharing, and the development of valuable new uses for personal data. Focusing on the use of personal data does not mean that there should be no responsibilities or regulation relating to data collection, nor should a focus on data collection in specific or sensitive circumstances be abandoned. Rather, in many situations, especially in the context of Big Data, a more practical and sensitive balancing of valuable data flows and more effective privacy protection is likely to be obtained by focusing more attention on appropriate, accountable use.
  • What constitutes a “use” of personal data, what uses should be permitted or prohibited (or should require some additional safeguards), and by what standards these determinations should be made were the focus of considerable discussion. There was nearly universal agreement that the “harms” or “impacts” that data protection laws should be designed to avoid must not only include physical and financial injury but also broader concepts consistent with protecting privacy as a human right—such as reputational or social harm and the chilling effect of surveillance.
  • Another prevalent theme was the need for an updated or enhanced framework for protecting personal data. The OECD Privacy Guidelines were crafted more than 30 years ago, before the advent of the World Wide Web, cloud computing, smartphones, or Big Data. While the guidelines have continuing relevance, they are no longer adequate as a guide for 21st-century data protection or as the basis for greater interoperability among national data protection regimes. For example, the Purpose Specification Principle, which requires that “the purposes for which personal data are collected should be specified not later than at the time of data collection,” and the Use Limitation Principle, which restricts uses of data to those originally specified “except with the consent of the data subject or by the authority of law,” seem inconsistent with the ways in which data are used today and will be used in the future; many participants suggested these two principles might be omitted entirely or at the very least dramatically reshaped. Of course there should be limits on uses of data, but those limits need not necessarily be linked to the purposes for which the data were originally collected.
Microsoft has released a report containing more highlights of the regional dialogues and global privacy summit in an effort to help foster a crucial debate about how to update data protection laws for the 21st century. Read the report: “Notice and Consent in a World of Big Data.”
Fred H. Cate is a Distinguished Professor and C. Ben Dutton Professor of Law at the Indiana University Maurer School of Law and director of the Indiana University Center for Applied Cybersecurity Research and Center for Law, Ethics and Applied Research in Health Information. He specializes in privacy, security, and other information law issues, and appears regularly before Congress, government agencies, and professional and industry groups on these matters.
Viktor Mayer-Schönberger is Professor of Internet Governance and Regulation at the Oxford Internet Institute. His research focuses on the role of information in a networked economy. Professor Mayer-Schönberger’s most recent book, the award-winning 'Delete: The Virtue of Forgetting in the Digital Age' (Princeton University Press, 2009), has been published in four languages, and the ideas proposed in the book have now become official policy of the European Union.