Lorrie Faith Cranor on Digital Privacy and Why It Is Out of Control

By TAP Staff Blogger

Posted on June 5, 2015



Today’s online world and ever-connected devices have made managing personal privacy a never-ending battle. Professor Lorrie Faith Cranor, director of Carnegie Mellon’s CyLab Usable Privacy and Security Laboratory, has an extensive body of work studying online privacy, usable security, and technology and public policy. In a recent interview with Wired magazine, she explains what digital privacy is, why it is so complicated for individuals to manage, and offers ideas for making privacy “user friendly.”


Below are a few excerpts from “Lorrie Faith Cranor: Digital Privacy Is Out of Control.”


According to Cranor, a meaningful sense of digital privacy is beyond our control, because privacy policies today are mostly developed by data-hoarding companies that lack business incentives to make them user-friendly. They’re confusing, impenetrable, and invariably a big pain in the ass to the people they were designed to protect. In 2012, Cranor and a colleague found that it would take users 76 workdays to read all of the privacy policies they encounter in a year. The problem has only gotten worse, she says.


But Cranor is not surrendering to that fate long-term. In fact, she’s on a mission to address the huge disconnect between digital privacy tools and human behavior. She believes we can regain the upper hand with a combination of smart regulation and design that makes privacy a dramatically more intuitive part of our digital experiences.


What does privacy mean to you?

I view privacy as very much something about control over personal information. That lens of control makes sense to me, and we each have different views on what we want to keep private and what we want to share. We should each be able to adjust the controls the way that we want to.


How has technology changed that definition?

The threat model is changing. I’m not just worried about people who are physically located in proximity getting access to images, and sounds, and information that I may broadcast. Now there are people and companies that are far removed from my physical location—that I may have no idea even exist—that now have access to my information.


In the physical world, if I want to keep something private, I can close the door, close the windows, adjust the blinds. There are very obvious, tangible things I can do to keep things private.


When we talk about digital privacy, a lot of the time when our information is being collected and shared, we’re not even aware of it.


What would meaningful controls of digital privacy look like?

It has to be in some sense as easy as closing the blinds. We need to both be aware of what’s going on and have easy steps that we can take to make those adjustments.


It’s thinking about what it is that the consumer actually wants to know. How would a consumer use this information? It shouldn’t be the case that if I want a mortgage, first I have to go do the mortgage search and get all the rates, and then I have to go to the mortgage privacy website and look at that. It should be in one place. When I go and I look up mortgage rates, I should also be able to see, at a glance, what the privacy issues are with the banks that are offering me mortgages.


The Platform for Privacy Preferences, or P3P, is basically a standard for computer-readable privacy policies. The idea is that if companies take their privacy policy and encode it in a computer-readable language and post it on their website, then your web browser can go and fetch that policy automatically and do useful things with it. For example, Internet Explorer will automatically go through this policy and make cookie-blocking decisions based on the policy.
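As a rough illustration of the mechanism Cranor describes, the sketch below shows how a P3P-aware client could discover a site’s machine-readable policy. The well-known location /w3c/p3p.xml and the P3P compact-policy response header come from the W3C P3P specification; example.com is a placeholder host, and real deployments vary.

```python
# A minimal sketch of P3P policy discovery. The well-known location
# /w3c/p3p.xml and the "P3P" response header are defined by the W3C
# P3P specification; example.com is a placeholder host.
import urllib.request
import xml.etree.ElementTree as ET

HOST = "https://example.com"

# 1. Fetch the policy reference file from its well-known location.
with urllib.request.urlopen(HOST + "/w3c/p3p.xml") as resp:
    ref_xml = resp.read()

# 2. Parse it to find where the site's full XML policy lives.
ns = {"p3p": "http://www.w3.org/2002/01/P3Pv1"}
ref = ET.fromstring(ref_xml).find(".//p3p:POLICY-REF", ns)
print("Full policy at:", ref.get("about") if ref is not None else "none found")

# 3. Sites can also send a short "compact policy" with every response;
#    Internet Explorer used these tokens for its cookie-blocking decisions.
with urllib.request.urlopen(HOST) as resp:
    print("Compact policy:", resp.headers.get("P3P", "none sent"))
```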


If you could set a national, or even global, privacy policy, what would be at the top of your list?

We need some baselines as far as what’s acceptable and what’s not acceptable. I don’t know exactly where the baseline should be, but … right now we don’t really have that as far as how far companies can go with your data. It’s pretty much as far as they want.


That regulation could set baseline standards for certain types of things that consumers just shouldn’t have to worry about and that companies shouldn’t be allowed to do. You could also have tools where one control applies to a lot of different things, rather than having to find a separate control on every different website you go to.
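One real-world instance of that single-control idea (not one Cranor names in this excerpt) is the Do Not Track header: the browser attaches a single user preference to every request instead of the user hunting down each site’s separate opt-out. A minimal sketch, using a placeholder site:

```python
# A minimal sketch of a browser-wide privacy signal: the Do Not Track
# (DNT) header is set once on the client and sent with every request.
# Whether a site honors it is voluntary; example.com is a placeholder.
import urllib.request

req = urllib.request.Request(
    "https://example.com",
    headers={"DNT": "1"},  # one control, applied to every site visited
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```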


It may be that what we need is regulation that says we’re going to do this. That’s what happened in the financial industry. Nine federal agencies got together and came up with the standard for the banks, and they basically created a safe harbor: if banks adopted this, then they were complying with the notice requirement. There was a big incentive for banks to adopt it. It’s not perfect, but that’s kind of how you have to get it done.


Read the full article: “Lorrie Faith Cranor: Digital Privacy Is Out of Control.”


Lorrie Faith Cranor is a Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS) and co-director of the MSIT-Privacy Engineering master’s program. She is also a co-founder of Wombat Security Technologies, Inc. She has authored over 100 research papers on online privacy, usable security, phishing, spam, electronic voting, anonymous publishing, and other topics.


Professor Cranor plays a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P. She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals. In 2003 she was named one of the top 100 innovators aged 35 or younger by Technology Review.
