Privacy Protection Without Law: How Data Privacy Is Shaped by Market Forces

By Omri Ben-Shahar

Posted on February 15, 2017


[Note: this article was originally published January 30, 2017.]


You may not have noticed, but this week we celebrated National Data Privacy Day. A holiday declared by Congress in 2009, it aspires to promote people’s control over their personal information. It is a day for us to reflect on how much information we share online and how safe it is, and an opportunity to “create dialogues among stakeholders interested in advancing data protection and privacy.”


Anxiety over online data privacy has produced more than just a national holiday with its highfalutin “dialogues” and “be privacy aware” clichés. Data privacy anxiety is also manifesting itself in a host of legal initiatives aimed primarily at improving the notices and disclosures that consumers receive, in the hope that better “transparency” will produce informed choice and individual control.


Recently, this anxiety was directed at Google, with allegations that it misleadingly changed its privacy policy to collect more information for its tailored ads. Google’s updated legal notice explained (correctly) to its users that they now have more choice to opt out of ad personalization. But in the same breath, and in small print, Google also increased the amount of data it compiles on users, creating more individualized profiles and curating more relevant (and more effective) ads. The FTC and the European Union are investigating whether consumers’ consent was acquired deceptively, without clear notice to users.


I have deep doubts about whether data privacy anxiety is justified, and I have expressed them elsewhere, writing about privacy paranoia and about the benefit of allowing people to sell their privacy in return for free services. But let’s imagine that the anxiety reflects real risks: that people might suffer privacy injuries from the way firms collect, use, and profit from their clients’ personal profiles. What should be done?


It is important to dispel the first and most common legal solution that comes to mind: more transparency. It is a solution that lawmakers particularly revere because it is so superficially sensible, and so apolitical. If firms are required to tell people what information they collect, and to do so in a simple and conspicuous manner, people would be able to avoid doing business with those that engage in abusive privacy practices.


There is one problem with the transparency solution. There is no evidence that it works! (And there are mountains of evidence that it fails.) Even when the information people are asked to share is highly private and sensitive, and even when the notices about the ways firms collect, use, and share this information are delivered in the simplest and most concise manner, people still don’t read the notices and don’t change their behavior. A recent experiment tried delivering privacy notices to users in the format of a simple “nutrition facts” box, to no avail. The notice still went unread, and people shared the same amount of sensitive personal information as they do when the notices are long and cluttered.


If simple notices are not read or used, the hopes for informed choice crumble. Users are not going to opt out of Google’s personalized ads or adjust Facebook’s privacy settings. These consumers might comparison-shop among services based on various quality and service measures, but not on the basis of privacy features. Notwithstanding Data Privacy Days and the occasional privacy nudges that the law requires, if people remain uninformed about privacy, this dimension will not affect their choices.


It is also important to dispel a second perceived solution to privacy risks: lawsuits. Numerous class action lawsuits are percolating in the courts, alleging that websites have violated privacy statutes. Google, for example, has long been defending against complaints that its Gmail service, which scans the text of its users’ emails, violates the Wiretap Act. Many of these lawsuits ultimately fail because they cannot prove a concrete injury. But even the few that succeed are not going to change the behavior of firms. They will only teach firms to write more comprehensive privacy notices and to require users to click “I Agree” more often.


A third legal solution fueled by privacy anxiety is the “right to be forgotten” mandated in Europe. It gives users the right to request that search engines remove links to personal information that is no longer accurate or relevant. Viewed by its advocates as a major landmark in privacy protection, the right-to-be-forgotten mandate is ultimately proving to be a storm in a teacup. In one year, Google reported receiving only 218,000 requests (of which it granted about half). Only a negligible fraction of users are sufficiently sensitive to privacy issues to exercise the cherished right.


While legal solutions to privacy anxiety, from disclosures to a right to anonymity, are helping at most those who need help least (the few privacy-sensitive users who know how to take care of themselves), competition and market forces are providing an additional layer of protection with far greater reach and impact. Consider, for example, internet patrons of erotica and other adult sites. Surely, by the very nature of the content being exchanged, these customers have heightened privacy concerns. They don’t want the websites to share the information they collect with unknown third parties. Or consider cloud-computing websites. Again, customers who store all their files with such services have heightened privacy and security concerns. Are these websites offering their clientele more privacy?


Yes. A recent study found that adult websites dramatically outperform other websites in their privacy protection. They have far more restrictive data-sharing policies. Likewise, cloud-computing sites adhere to higher data security standards than other online services, and indeed they receive the highest security compliance ratings in periodic reviews. The law does not require adult and cloud services to provide extra protection, but they do. These businesses are competing to meet consumers’ preferences and are providing enhanced privacy protection in those areas where their customers demand it, where it truly matters.


It would be a naive exaggeration to say that markets guarantee privacy. Firms collect data for profit, and they guard privacy as much as the fox guards the henhouse. But competition responds to consumer demand, and demand arises from the real needs of consumers, not from laws that pay tribute to transparency or declare national holidays. It is an excellent sign that in areas of real privacy vulnerability, firms do more to provide privacy protections.



The preceding is republished on TAP with permission by its author, Omri Ben-Shahar, law professor and Kearney Director of the Coase-Sandor Institute for Law and Economics at the University of Chicago Law School. “Privacy Protection Without Law: How Data Privacy Is Shaped by Market Forces” was originally published January 30, 2017 in Forbes.