ACADEMIC ARTICLE SUMMARY

Informing the Design of a Personalized Privacy Assistant for the Internet of Things

Article Source: CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Paper No. 262, April, 2020
Written By:

Jessica Colnago
Megan Ung
Norman Sadeh
Sarah Pearman
Tharangini Palanivel
Yuanyuan Feng

ARTICLE SUMMARY

Personalized Privacy Assistants (PPAs) can help users manage data collection by Internet of Things (IoT) devices. The best PPAs will learn from users and offer recommendations from unbiased sources.

POLICY RELEVANCE

Most users will react positively to recommendations from PPAs.

KEY TAKEAWAYS
  • PPAs can help users manage large numbers of privacy decisions; in evaluating PPAs, users weigh their desire for control of their personal information against fear of cognitive overload.
  • Interviews with 17 participants revealed users’ views on different PPA designs.
    • “Notification PPAs” notify users when a nearby device is collecting data, and give the user control over nearby data collection.
    • “Recommendation PPAs” notify users when a nearby device is collecting data, and suggest whether the user should allow or disallow collection.
    • “Auto PPAs” make decisions for the user based on user preferences.
  • Participant reactions to “recommendation PPAs” were mostly positive.
    • Participants thought this type of PPA could serve an educational purpose.
    • Participants wanted recommendations from unbiased, knowledgeable sources.
  • About two-thirds of participants reacted positively to the idea of “auto PPAs”; many reacted negatively to “notification PPAs,” fearing they would be overwhelmed by choices.
  • Good PPA designs would include the following features:
    • Allow users to choose from crowd-sourced recommendations, manufacturer recommendations, and recommendations from independent nonprofit organizations.
    • Include a "trusted location" feature where notifications would be turned off.
    • Allow users to specify situations in which they always allow or always deny sharing.
    • Explain the risks and benefits of data collection to users.
    • Record and learn from users' decisions.
    • Provide an audit mechanism so users can verify and adjust decisions made on their behalf.
  • Some participants thought that the benefits of IoT (such as traffic control) would be reduced if people could opt out; policymakers should consider how to reduce the chance that people will opt out of public data collection and bypass safety and security devices.

TAGS

Alessandro Acquisti

Alessandro Acquisti is a Professor of Information Technology and Public Policy at the Heinz College, Carnegie Mellon University. He is the co-director of the CMU Center for Behavioral Decision Research (CBDR), a member of Carnegie Mellon Cylab, and is currently a Principal Investigator on the Usable Privacy Policy Project, a multi-year collaborative project funded by the National Science Foundation and involving Fordham Center on Law and Information Policy and computer scientists from Carnegie Mellon University and Stanford.


Lorrie Faith Cranor


Lorrie Faith Cranor is the Director and Bosch Distinguished Professor in Security and Privacy Technologies of CyLab and the FORE Systems Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University. She also directs the CyLab Usable Privacy and Security Laboratory (CUPS) and co-directs the MSIT-Privacy Engineering master's program. She teaches courses on privacy, usable security, and computers and society.
