Key Privacy Papers Selected as “Must Reads” for Policymakers

By TAP Staff Blogger

Posted on December 7, 2015



The Future of Privacy Forum (FPF) has announced its top privacy papers of 2015. FPF's goal in compiling this annual digest is to showcase papers that present leading analytical thinking on current and emerging privacy issues and that either propose achievable short-term solutions or offer new means of analysis that could lead to solutions.


The selected articles will be compiled into an accessible digest, the sixth edition of “Privacy Papers for Policymakers,” which will be showcased to policymakers, privacy professionals, and the public. FPF’s purpose is to inform conversations about privacy among policymakers in Congress, at the Federal Trade Commission, and in other government agencies.


Each of the selected papers counts at least one TAP scholar among its authors. Below are highlights from the papers.


Future of Privacy Forum’s Top Privacy Papers for 2015


A Design Space for Effective Privacy Notices
By Florian Schaub, Carnegie Mellon University; Rebecca Balebako, RAND Corporation; Adam L. Durity, Google; and Lorrie Faith Cranor, Carnegie Mellon University


Notifying users about a system's data practices is supposed to enable users to make informed privacy decisions. Yet current notice and choice mechanisms, such as privacy policies, are often ineffective because they are neither usable nor useful, and are therefore ignored by users. Constrained interfaces on mobile devices, wearables, and smart home devices connected in an Internet of Things exacerbate the issue. Much research has studied usability issues of privacy notices, and many proposals for more usable privacy notices exist. Yet there is little guidance for designers and developers on the design aspects that can impact the effectiveness of privacy notices. In this paper, we make multiple contributions to remedy this issue. We survey the existing literature on privacy notices and identify challenges, requirements, and best practices for privacy notice design. Further, we map out the design space for privacy notices by identifying relevant dimensions. This provides a taxonomy and consistent terminology of notice approaches to foster understanding and reasoning about the notice options available in the context of specific systems. Our systematization of knowledge and the developed design space can help designers, developers, and researchers identify notice and choice requirements and develop a comprehensive notice concept for their system that addresses the needs of different audiences and considers the system's limitations and opportunities for providing notice.


Anonymization and Risk
By Ira S. Rubinstein, New York University; and Woodrow Hartzog, Samford University


Perfect anonymization of data sets has failed. But the process of protecting data subjects in shared information remains integral to privacy practice and policy. While the deidentification debate has been vigorous and productive, there is no clear direction for policy. As a result, the law has been slow to adopt a holistic approach to protecting data subjects when data sets are released to others. Currently, the law is focused on whether an individual can be identified within a given data set. We argue that the better locus of data release policy is on the process of minimizing the risk of reidentification and sensitive attribute disclosure. Process-based data release policy, which resembles the law of data security, will help us move past the limitations of focusing on whether data sets have been “anonymized.” It draws upon different tactics to protect the privacy of data subjects, including accurate deidentification rhetoric, contracts prohibiting reidentification and sensitive attribute disclosure, data enclaves, and query-based strategies to match required protections with the level of risk. By focusing on process, data release policy can better balance privacy and utility where nearly all data exchanges carry some risk.


A Precautionary Approach to Big Data Privacy
By Arvind Narayanan, Princeton University; Joanna Huey, Princeton University; and Edward W. Felten, Princeton University


Once released to the public, data cannot be taken back. As time passes, data analytic techniques improve and additional datasets become public that can reveal information about the original data. It follows that released data will become increasingly vulnerable to re-identification—unless methods with provable privacy properties are used for the data release.


Because currently released datasets are de-identified only by ad hoc methods, the chances of re-identification depend heavily on the progress of re-identification tools and on the auxiliary datasets available to an adversary. The probability of a future privacy violation is essentially unknowable. In general, a precautionary approach deals with uncertain risk by placing the burden of proof that an action is not harmful on the person taking the action. Here, we argue for a weak version of the precautionary approach: the principle that the burden of proof falls on data releasers should guide policies that incentivize them not to default to full, public releases of datasets protected only by ad hoc de-identification.
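
The “methods with provable privacy properties” that the authors point to include differential privacy, whose guarantee holds no matter which auxiliary datasets an adversary later obtains. As a purely illustrative sketch in Python (this example, including the dp_count name and the epsilon value, is ours rather than the paper's), a differentially private counting query adds Laplace noise calibrated to the query's sensitivity:

    import random

    def dp_count(records, predicate, epsilon):
        """Differentially private count of records matching `predicate`.

        A counting query has sensitivity 1: adding or removing one record
        changes the true count by at most 1, so Laplace noise with scale
        1/epsilon yields epsilon-differential privacy.
        """
        true_count = sum(1 for r in records if predicate(r))
        # The difference of two Exponential(epsilon) draws is Laplace(0, 1/epsilon).
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return true_count + noise

    # Hypothetical usage: release a noisy count instead of the raw records.
    ages = [34, 29, 51, 47, 62, 38]
    print(dp_count(ages, lambda a: a >= 40, epsilon=0.5))

Because the guarantee comes from the mechanism itself rather than from guesses about future re-identification tools, such releases sidestep the precautionary burden the authors describe.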


Privacy and Markets: A Love Story
By Ryan Calo, University of Washington


Law and economics tends to be skeptical of privacy, finding privacy overrated, inefficient, and perhaps even immoral. On this view, law should not protect privacy because privacy inhibits the market by allowing people to hide useful information.


Privacy law scholars tend to be skeptical of markets. Markets “unravel” privacy by penalizing consumers who prefer it, degrade privacy by treating it as just another commodity to be traded, and otherwise interfere with the values or processes that privacy exists to preserve.


This mutual and longstanding hostility obscures the significant degree to which privacy and markets assume and rely upon one another in order to achieve their respective ends.


For example, in a world without privacy, traditional market criteria such as price and quality can be overwhelmed by salient but extraneous information such as the political or social views of market participants. Meanwhile, imagine how much a government must know about its citizens to reject markets and distribute resources according to the maxim “from each according to his ability, to each according to his need.”


Conceiving of privacy and markets as sympathetic helps justify or explain certain legal puzzles, such as why the Federal Trade Commission — an agency devoted to free and open markets and replete with economists — has emerged as the de facto privacy authority in the United States. The account also helps build a normative case for political and other laws that enforce a separation between market information and other kinds of information.


Taking Trust Seriously in Privacy Law
By Neil Richards, Washington University in St. Louis; and Woodrow Hartzog, Samford University


Trust is beautiful. The willingness to accept vulnerability to the actions of others is the essential ingredient for friendship, commerce, transportation, and virtually every other activity that involves other people. It allows us to build things, and it allows us to grow. Trust is everywhere, but particularly at the core of the information relationships that have come to characterize our modern, digital lives. Relationships between people and their ISPs, social networks, and hired professionals are typically understood in terms of privacy. But the way we have talked about privacy has a pessimism problem: privacy is conceptualized in negative terms, which leads us to mistakenly look for “creepy” new practices, focus excessively on harms from invasions of privacy, and place too much weight on the ability of individuals to opt out of harmful or offensive data practices.


But there is another way to think about privacy and shape our laws. Instead of trying to protect us against bad things, privacy rules can also be used to create good things, like trust. In this paper, we argue that privacy can and should be thought of as enabling trust in our essential information relationships. This vision of privacy creates value for all parties to an information transaction and enables the kind of sustainable information relationships on which our digital economy must depend.


Drawing by analogy on the law of fiduciary duties, we argue that privacy laws and practices centered on trust would enrich our understanding of the existing privacy principles of confidentiality, transparency, and data protection. Reconsidering these principles in terms of trust would move them from procedural means of compliance for data extraction towards substantive principles to build trusted, sustainable information relationships. Thinking about privacy in terms of trust also reveals a principle that we argue should become a new bedrock tenet of privacy law: the Loyalty that data holders must give to data subjects. Rejuvenating privacy law by getting past Privacy Pessimism is essential if we are to build the kind of digital society that is sustainable and ultimately beneficial to all: users, governments, and companies. There is a better way forward for privacy. Trust us.


Additionally, two papers have been selected for Notable Mention:


Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy (Testimony, Senate Judiciary Committee Hearing, July 8, 2015)
By Peter Swire, Georgia Tech


My testimony on “Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy” is in three parts:

  • First, the Review Group report concluded that strong cybersecurity and strong encryption should be vital national priorities.
  • Second, it is more accurate to say that we are in a “Golden Age of Surveillance” than for law enforcement to assert that it is “Going Dark.”
  • Third, government-mandated vulnerabilities would threaten severe harm to cybersecurity, privacy, human rights, and U.S. technological leadership, while not preventing effective encryption by adversaries.


The Transparent Citizen
By Joel R. Reidenberg, Fordham University


This article shows that the transparency of personal information online through ubiquitous data collection and surveillance challenges the rule of law both domestically and internationally. The article makes three arguments. First, the transparency created by individuals’ interactions online erodes the boundary between public and private information and creates a “transparent citizen.” Second, the transparent citizen phenomenon undermines the state’s faithfulness to the ideals of the rule of law and citizens’ respect for the rule of law. Transparency enables government to collect and use personal information from the private sector in ways that circumvent traditional political and legal checks and balances. Transparency encourages the development of anonymity tools that empower wrongdoers to evade legal responsibility and the rule of law. And transparency puts national security, public safety, and legal institutions at risk in ways that will jeopardize and corrode the public’s faith in the rule of law. Third, transparency challenges international norms and data flows. National data privacy law is anchored in local constitutional culture, and the transparency of personal information across borders creates deep-seated political instability that will only be resolved through political treaties.


Read more about the Future of Privacy Forum’s Privacy Papers for Policymakers program and this year’s selections in “What Privacy Papers Should Policymakers Be Reading in 2016?”


