Author(s)
Margot Kaminski and Gianclaudio Malgieri
Source
University of Colorado Law Legal Studies Research Paper No. 19-28
Summary
The European Union's General Data Protection Regulation (GDPR) addresses concerns about automated decision-making. Impact assessments support both individual privacy rights and systemic oversight of automated decision-making.
Policy Relevance
Continuous impact assessments reveal the effects of algorithms on individuals and society.
Main Points
- Under the GDPR, individuals have the right to be notified when a decision about them is based solely on automated processing, to receive an explanation of how the decision was made, and to contest the decision.
- Even when individuals do not exercise their GDPR rights, firms must conduct Data Protection Impact Assessments (DPIAs), appoint data protection officers, and submit to third-party audits; together, these regulatory tools create a systemic governance and oversight regime.
- In the context of automated decision-making, DPIAs support both individual rights and systemic governance.
- The DPIA could play a key role in enforcing individual rights.
- DPIA content could explain algorithmic decisions to individuals.
- The DPIA requires firms to describe what individual rights will look like in practice.
- The DPIA's main shortcoming is that the report need not be shared with the public; however, individuals could obtain and share DPIA content in the course of exercising their rights, enhancing accountability.
- The GDPR does not address the impact of surveillance or automated decision-making on groups and society; ideally, impact assessments should also mitigate social harms that extend beyond individuals.
- The GDPR calls for firms to offer multi-layered explanations of automated decision-making.
- The DPIA should describe the algorithmic model used, input data, testing, and performance metrics.
- The DPIA should explain what changes would alter the outcome of individual decisions.
- The DPIA should explain the effect of the process on entire groups.