Author(s)
David Choffnes, Johanna Gunawan,
Woodrow Hartzog and Christo Wilson
Source
Seton Hall Law Review, Vol. 51, pp. 1505-1533, 2021
Summary
Pandemic responses were hindered by a lack of public trust in information technology. The public was reluctant to use contact tracing apps because of inadequate privacy protections.
Policy Relevance
Privacy law should require technology companies to protect users’ interests.
Main Points
- Efforts by technology firms and governments to use technology to fight the pandemic often failed because people do not trust technological systems and devices.
- Pre-existing privacy issues include manipulative interfaces, the lack of meaningful rights to consent to data collection, and devices that collect too much data; social media also spreads considerable disinformation.
- Technological efforts to address the pandemic failed to protect users’ privacy.
- Project Baseline, based in the United States, required users to consent to participation in Google’s data ecosystem.
- In Singapore, contact tracing data from the TraceTogether app was made available to Singapore police.
- Online interfaces and information are beset with problems such as "dark patterns," that is, confusing and manipulative user interfaces; the California legislature addressed dark patterns in the California Privacy Rights Act of 2020, but federal privacy law has not.
- Before the next public health emergency, lawmakers could ensure that technologies are more trustworthy by requiring data collectors to serve as data stewards for technology users.
- New rules should prohibit technological designs or uses of data that conflict with users' best interests.
- Under such rules, privacy policies and software licenses would be empowering, meaningful, and transparent.
- Australia's pandemic response shows that substantive privacy rules can build trust.
- Australia quickly reversed a determination that data from the COVIDSafe contact tracing app could be used by prosecutors.
- Australia barred uses of contact tracing data for any purpose other than contact tracing.