Understanding Challenges for Developers to Create Accurate Privacy Nutrition Labels

Article Source: CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
Time to Read: 2 minute read
Written By:

Jason I. Hong

Kayla Reiman

Tianshi Li

Yuvraj Agarwal




Apple’s privacy nutrition labels require app developers to complete a form to make data collection practices more transparent to users. Developers support greater transparency but find the labelling process challenging.


Policy Relevance:

Creating a privacy nutrition label spurs developers to think more about privacy.


Key Takeaways:
  • The concept of a privacy nutrition label was first used on a large scale by Apple, which added "App Privacy Details" to its App Store listings in 2020.
    • The first layer of the display shows the broad categories of data each app collects (such as location).
    • Users can click to see a second layer with more details, such as whether the data is used for advertising.
  • Twelve iOS app developers were observed as they filled out Apple’s privacy label form.
  • Most developers saw the creation of the privacy label positively, as something that would promote trust in the app; some felt that privacy teams rather than developers should be responsible for privacy labelling.
  • Some developers changed their apps' privacy practices, reducing or altering the app’s data usage to make labelling easier or to conform better to the standardized information conveyed by the label; the labelling process spurred developers to think more about privacy.
  • One misconception that led developers to underreport data collection was the belief that "data linked to users" means only data such as phone numbers, which can be linked to a real person; however, any data stored alongside data that can be linked to a real person is also "data linked to users."
  • Overreporting data collection occurred when developers failed to understand that Apple uses "tracking" to mean only specific types of data sharing with third parties for purposes such as advertising; some developers failed to realize that data that was never stored was not “collected.”
  • Challenges for creating accurate labels included the following:
    • Developers' preconceptions might not match Apple's expectations and definitions.
    • Developers found Apple's documentation confusing.
    • Developers responsible for only part of the development process did not know all of an app’s data practices.
    • Developers were overwhelmed by the volume of information and concepts they needed to understand and remember.
  • In the future, researchers should consider working on issues such as the following:
    • How to verify that self-reported privacy labels are accurate, giving developers feedback to help correct mistakes.
    • How confusing differences between the privacy labels of major providers such as Google and Apple could be reconciled.



About Lorrie Faith Cranor

Lorrie Faith Cranor is the Director and Bosch Distinguished Professor in Security and Privacy Technologies of CyLab and the FORE Systems Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University. She also directs the CyLab Usable Privacy and Security Laboratory (CUPS) and co-directs the MSIT-Privacy Engineering master's program. She teaches courses on privacy, usable security, and computers and society.
