I’ve previously written on the regulation of European data processing here. I’ll be presenting on the “right to be forgotten” (RtbF) in Chicago this spring. I’ll be writing a series of posts here to prepare for that lecture.
Julia Powles offers an excellent summary of the right in question. As she explains, the European Court of Justice (ECJ) has ruled that, “in some circumstances—notably, where personal information online is inaccurate, inadequate, irrelevant, or excessive in relation to data-processing purposes—links should be removed from Google’s search index.” The Costeja case, which led to this ruling, involved Google’s prominent display of results relating to the plaintiff’s financial history.
Unfortunately, some US commentators’ views are rapidly congealing toward a reflexively rejectionist position on such regulation of search engine results, despite the Fair Credit Reporting Act’s extensive regulation of consumer reporting agencies in very similar situations. Jeffrey Toobin’s recent article mentions some of these positions. For example, Jules Polonetsky says, “The decision will go down in history as one of the most significant mistakes that Court has ever made.” I disagree, and I think the opposite result would itself have been far more troubling.
Internet regulation must recognize the power of certain dominant firms to shape impressions of individuals. The impressions they convey can be extraordinarily misleading, even malicious, and the potential for harm is only growing as hacking becomes more widespread. Consider the following possibility: What if a massive theft of medical records occurs, the records are made public, and they are then shared virally among different websites? Are the critics of the RtbF really willing to shrug and say, “Well, they’re true facts and the later-publishing websites weren’t in on the hack, so leave them up”? And in the case of future intimate photo hacks, do we simply let firms keep the photos available in perpetuity?
These examples may sound far-fetched. But shady markets are already developing in individuals’ “full medical histories” (via stolen life insurance applications). HealthCare.gov is a prime target for integrated health and citizenship records. Admittedly, if a state in the United States tried to make the re-publication of such records illegal, it might well run into the First Amendment strictures of Bartnicki v. Vopper, which permits the publication of some illegally intercepted communications by those who did not themselves participate in the illegal interception. On the other hand, as my co-blogger Danielle Citron explains in her excellent new book Hate Crimes in Cyberspace, Vopper is not a First Amendment absolutist opinion:
As the Court suggested [in Vopper], the state interest in protecting the privacy of communications may be “strong enough to justify” regulation if the communications involve “purely private” matters. Built into the Court’s decision was an exception: a lower level of First Amendment scrutiny applies to the nonconsensual publication of “domestic gossip or other information of purely private concern.”
Relying on that language, appellate courts have affirmed the constitutionality of civil penalties under the wiretapping statute for the unwanted disclosures of private communications involving “purely private matters.” Along similar lines, lower courts have upheld claims for public disclosure of private fact in cases involving the nonconsensual publication of sex videos.
Perhaps critics of the RtbF want to sweep away these penalties, too. But if they succeed, there will be real human costs. Consider this story from a Stanford data breach:
[DD], of Santa Clara, Calif., said her “jaw dropped” on Saturday when she intercepted the letter from Ms. Meyer addressed to her 21-year-old son, who she said had received emergency psychiatric treatment at Stanford in 2009. Ms. [D] said it could have been disastrous if her son, who lives at home, had learned that his name was linked to a mental health diagnosis. “My son, I can tell you, is fragile and confused enough that this would have sent him over the edge,” Ms. [D] said, saying she decided to speak publicly now because of her frustration with the breach. “Everyone with an electronic medical record is at risk, and that means everyone.”
Internet firms can be held legally responsible for preventing, say, the endless republication of this man’s psychiatric record. It is a purely private matter, and such responsibility is congruent with venerable protections like those available to consumers under the Fair Credit Reporting Act. It’s time to stop treating the RtbF as some bizarre European dirigisme and to recognize instead its pivotal role in guaranteeing a digital future where our reputations aren’t at the mercy of malicious hackers and careless search engines.
Frank Pasquale
Frank is Professor of Law at the University of Maryland. His research agenda focuses on challenges posed to information law by rapidly changing technology, particularly in the health care, internet, and finance industries.
The preceding is republished on TAP with the permission of its author, Professor Frank Pasquale, University of Maryland. “The Right to be Forgotten: Not an Easy Question” was originally published on September 23, 2014, on Concurring Opinions.