Do Not Track, Smartphone Privacy and Privacy-by-Design
Publication Date: June 06, 2011
By Deirdre K. Mulligan and Nick Doty
April's W3C Web Tracking and User Privacy workshop brought together a large, diverse group of stakeholders: advertisers and self-regulatory group members, consumer advocates, academics, regulators and browser makers. As you might expect, unanimity is not easily found among such a diverse group. Yet there was surprising consensus on at least a starting point for defining tracking: a broad definition with exemptions for common practices, as in the Center for Democracy and Technology's proposal. This still leaves debate on the details of a definition and the technical mechanism (W3C intends to form a Working Group to pursue technical standards in this area), but at least suggests a way forward.
Differences remain not only on the substance of Do Not Track (what tracking is and how people should be able to control it) but also on the process. Beyond the perpetual debate of industry self-regulation vs. government regulation, it remains unclear how questions of technical mechanism, semantic definition and regulatory enforcement can best be coordinated between different groups with varying expertise and legitimacy. The hand-off between code and policy, and between the bodies responsible for each, may be as contentious a negotiation as the definitions themselves, and with less precedent to guide it. (We touched on the same issues at the Browser Privacy Roundtable in Berkeley this February.) As is common in the technical standards process, we expect to see standards bodies coming to a rough consensus over a specification while vendors concurrently push forward with running code. Those two moving pieces will also have to accommodate action from legislatures (bills have been introduced at the Federal level and in California) and regulatory agencies (the FTC intends to publish its final report on behavioral advertising within the year).
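To make the "running code" piece concrete, here is a minimal sketch of the header-based approach browsers were experimenting with at the time, in which the user's preference rides along as an HTTP request header ("DNT: 1") and sites check for it before tracking. The URL, dictionary and helper function below are purely illustrative and not part of any specification.

    # Client side: attach the Do Not Track preference to an ordinary request.
    import urllib.request

    request = urllib.request.Request(
        "http://example.com/article",   # hypothetical page
        headers={"DNT": "1"},           # "1" = the user asks not to be tracked
    )

    # Server side: a site or ad network inspects the incoming headers (shown
    # here as a plain dict) and adjusts its tracking behavior accordingly.
    def wants_do_not_track(headers):
        # HTTP header names are case-insensitive, so normalize before comparing.
        normalized = {name.lower(): value for name, value in headers.items()}
        return normalized.get("dnt") == "1"

    print(wants_do_not_track({"DNT": "1"}))   # True

The hard part, of course, is not sending or reading the bit but agreeing on what a site that receives it is obliged to do.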
The same questions of coordination arise in other ongoing privacy debates. US government regulators have turned their attention to smartphone and location privacy with a recent Senate hearing and an upcoming FCC forum. Ashkan Soltani (formerly of the UC Berkeley School of Information and subsequently of the Federal Trade Commission) testified on the geolocation capabilities of cellular phones and the access to that information currently afforded to applications, web sites and downstream partners and advertisers. How will Senator Franken's request that mobile OS providers require location-based applications to have privacy policies coordinate with the FTC's push for standardized, just-in-time notices or with technical mechanisms for machine-readable privacy policies and preferences? (For examples of the latter, see P3P, GeoPriv and Nick Doty's comments to the Department of Commerce.) The impending implementation of the so-called EU “cookie directive” and the questions it raises about browser settings and web site design show that coordinating privacy across policy and technology is a global concern.
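To give a sense of what a machine-readable mechanism like GeoPriv aims at, here is a rough sketch in Python rather than GeoPriv's actual XML-based format: the core idea is that location data travels together with usage rules the user has set, and recipients are expected to consult those rules before retaining or re-sharing the data. All field names, values and the helper function below are illustrative, not the protocol itself.

    # An illustrative sketch of the GeoPriv idea, not its wire format:
    # location data is bound to machine-readable usage rules that travel with it.
    location_object = {
        "position": {
            "latitude": 37.8716,     # example coordinates (Berkeley, CA)
            "longitude": -122.2727,
        },
        "usage_rules": {
            "retransmission_allowed": False,              # don't pass downstream
            "retention_expires": "2011-06-07T00:00:00Z",  # discard after this time
        },
    }

    def may_share_with_partner(obj):
        # A recipient (application, web site or ad network) is expected to
        # consult the user's rules before passing the data along.
        return obj["usage_rules"]["retransmission_allowed"]

    print(may_share_with_partner(location_object))   # False

Whether applications and advertisers actually honor such rules is exactly the kind of question that requires coordination between technical design and regulatory enforcement.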
All this ties together in the concept of Privacy-by-Design (PbD): the idea that privacy ought to be considered a basic value throughout the development of a system, rather than treated as a liability-protection measure after deployment. The concerns with online tracking and smartphone privacy underline the point that privacy-by-design must coordinate technical design (software development, technical standardization) with policy design (legislation, rulemaking, self-regulation). While much work has been done on data-protection-by-design, we believe privacy-by-design requires both a more capacious definition of privacy and methodologies and tools to inform the design process. Understanding PbD and learning to implement it as part of engineering is a key near-term privacy challenge.
Deirdre K. Mulligan is an Assistant Professor at the UC Berkeley School of Information.
Nick Doty is a researcher at the School of Information and works on privacy at the World Wide Web Consortium.