The Digital Privacy Paradox – Professors Athey and Tucker Investigate “Notice and Consent”

By TAP Staff Blogger

Posted on August 25, 2017



A new article by Stanford’s Susan Athey and MIT’s Christian Catalini and Catherine Tucker investigates distortions in consumers’ behavior when they are faced with notice and consent options regarding their privacy. “The Digital Privacy Paradox: Small Money, Small Costs, Small Talk” utilizes data from a digital currency experiment conducted at MIT. Their results present a “privacy paradox: Consumers say they care about privacy, but at multiple points in the process end up making choices that are inconsistent with their stated preferences.”


Below are excerpts from “The Digital Privacy Paradox: Small Money, Small Costs, Small Talk.”


Is “Notice and Choice” Effective?

Since the initial formalization of privacy policy towards consumer data in the Privacy Act of 1974, there has been an emphasis on ‘Notice and Choice’ to safeguard privacy. ‘Notice’ gives consumers information about data collection and use, and then consumers make a ‘Choice’ about whether or not to allow their data to be collected or used in that way. These mechanisms may not be sufficient. In this paper, we present evidence about a variety of distortions in the notice and choice process, relating to consumer decisions to share data and choose more or less privacy-protective technologies.


Research Methodology

We use data from the Massachusetts Institute of Technology digital currency experiment, in which every MIT undergraduate student was offered $100 in Bitcoin in the fall of 2014 (Catalini and Tucker, 2017). The main focus of the experiment was establishing a cryptocurrency community at MIT. However, as part of the experiment, students had to make at least three digital privacy choices: whether they wanted to disclose the contact details of their closest friends; whether they wanted to maximize the privacy of their transactions from the public, a commercial intermediary, or the government; and whether they subsequently wanted to take additional actions to protect their transaction privacy when using Bitcoin. We use randomized elements of the experiment, often not directly focused on the question of privacy itself, to understand how responsive this demographic is to small changes in incentives, costs, and information.


Key Findings

We have three main findings. First, the effect small incentives have on disclosure may explain the privacy paradox: whereas people say they care about privacy, they are willing to relinquish private data quite easily when incentivized to do so. Second, small frictions in navigation costs surrounding privacy choices can have large effects on technology adoption, even in the presence of transparent information about the privacy consequences of those choices. Third, our information treatment on encryption, possibly by giving participants an illusion of protection, surprisingly did not increase privacy-enhancing behavior as we expected, but actually reduced it. After being randomly exposed to irrelevant but reassuring information about a tangential technology, students were less likely to avoid surveillance in their use of the technology. In all these cases, privacy-decreasing decisions take place regardless of stated preferences for privacy.


Policy Implications

Though our paper has policy implications, it is important to emphasize that our empirical results can be used to support two highly contrasting stances towards privacy protection.


The first policy stance is that our results could be taken as suggesting that consumers’ revealed behavior regarding privacy is slanted away from their actual normative preferences, as expressed in their stated privacy preferences in our surveys (Beshears et al., 2008). This might suggest that consumers need to be protected from themselves, above and beyond the protection given by a notice and choice regime, to ensure that small incentives, search costs, or misdirection are not able to slant their choices away from their actual normative preferences.


The second policy stance our results support is that there is indeed a disconnect between stated and revealed privacy preferences, but that revealed preference is actually closest to the normative preference. When expressing a preference for privacy is essentially costless, as it is in surveys, consumers are eager to express such a preference; but when faced with even small costs, this taste for privacy quickly dissipates. This would suggest that basing privacy protection on stated preferences for privacy expressed in surveys is misguided, especially since such policies have been documented to impose costs on firms (Miller and Tucker, 2011; Kim and Wagman, 2015).


Conclusion

Privacy policy in both the US and the OECD [Organisation for Economic Co-operation and Development] has focused on the idea that with enough transparency and enough choice, consumers will make better privacy decisions. We explore consumers’ attitudes and revealed preferences towards digital privacy in the context of a large-scale experiment involving all MIT undergraduate students. We take advantage of the nature of the underlying technology (Bitcoin) to explore how privacy-protective behavior is moderated by whether privacy is sought from a commercial firm, the government, or the public.


Our results highlight a digital privacy paradox: Consumers say they care about privacy, but at multiple points in the process end up making choices that are inconsistent with their stated preferences.


The implications of our findings for policy are nuanced. Our finding that small incentives, costs, or misdirection can lead people to safeguard their data less has two interpretations. On the one hand, it might lead policy makers to question the value of stated preferences for privacy when determining privacy policy. On the other hand, it might suggest the need for more extensive privacy protections, on the grounds that people need to be protected from their willingness to share data in exchange for relatively small monetary incentives.


Moreover, whenever privacy requires additional effort or comes at the cost of a less smooth user experience, participants are quick to abandon technology that would offer them greater protection. This suggests that privacy regulation must be careful not to inadvertently confront consumers with additional effort or a less smooth experience when they make a privacy-protective choice.


Read the full article: “The Digital Privacy Paradox: Small Money, Small Costs, Small Talk” on NBER (membership is required for access). (NBER Working Paper No. 23488, June 2017. © 2017 by Susan Athey, Christian Catalini, and Catherine Tucker)


Susan Athey, an economic theorist who has made significant contributions to the study of industrial organization, is the Economics of Technology Professor at Stanford Graduate School of Business. She also is Professor of Economics (by courtesy), School of Humanities and Sciences, and Senior Fellow, Stanford Institute for Economic Policy Research. Her current research focuses on the economics of the Internet, marketplace design, auction theory, the statistical analysis of auction data, and the intersection of computer science and economics. She is an expert in a broad range of economic fields – including industrial organization, econometrics, and microeconomic theory – and has used game theory to examine firm strategy when firms have private information.


Professor Athey was awarded the Jean-Jacques Laffont Prize in Toulouse, France in 2016. The annual award recognizes an internationally renowned economist whose research is in the spirit of the work undertaken by Professor Jean-Jacques Laffont, combining both the theoretical and the empirical. In 2007, Professor Athey was named the first female recipient of the American Economic Association’s prestigious John Bates Clark Medal, awarded every other year to the most accomplished American economist under the age of 40 “adjudged to have made the most significant contribution to economic thought and knowledge.”


Catherine Tucker is the Sloan Distinguished Professor of Management and Professor of Marketing at MIT Sloan. She is also Chair of the MIT Sloan PhD Program. Her research interests lie in how technology allows firms to use digital data to improve their operations and marketing, and in the challenges this poses for regulations designed to promote innovation. She has particular expertise in online advertising, digital health, social media, and electronic privacy. Generally, most of her research lies at the interface between Marketing, Economics, and Law.


Christian Catalini is the Theodore T. Miller Career Development Professor and Assistant Professor of Technological Innovation, Entrepreneurship, and Strategic Management at the MIT Sloan School of Management. His main areas of interest are the economics of innovation, entrepreneurship, and scientific productivity. His research focuses on crowdfunding and online entrepreneurial finance, blockchain technology, digital currencies, how proximity affects the recombination of ideas, the adoption of technology standards, and science and technology interactions.


