Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations

Article Source: University of Colorado Law Legal Studies Research Paper No. 19-28
Written By: Gianclaudio Malgieri

The European Union's General Data Protection Regulation (GDPR) addresses concerns with automated decision-making. Impact assessments support both individual privacy rights and systemic oversight of automated decision-making as a whole.


Policy Relevance:

Continuous assessments reveal the effect of algorithms on society and individuals.


Key Takeaways:
  • Under the GDPR, an individual has the right to be notified when decisions are solely automated, to receive an explanation of how the decision was made, and to contest the decision.
  • Even when individuals do not exercise their GDPR rights, firms must conduct Data Protection Impact Assessments (DPIAs), appoint data protection officers, and submit to third-party audits; these regulatory tools create a systemic governance and oversight regime.
  • In the context of automated decision-making, DPIAs support both individual rights and systemic governance.
  • The DPIA could play a key role in enforcing individual rights.
    • DPIA content could explain algorithmic decisions to individuals.
    • The DPIA requires firms to describe what individual rights will look like in practice.
  • The DPIA's main shortcoming is that the report need not be shared with the public; however, individuals could receive and share DPIA content in the course of exercising their rights, enhancing accountability.
  • The GDPR does not address the impact of surveillance or automated decision-making on groups and society; ideally, impact statements should mitigate social harms that go beyond individuals.
  • The GDPR calls for firms to offer multi-layered explanations for automated decision-making.
    • The DPIA should describe the algorithmic model used, input data, testing, and performance metrics.
    • The DPIA should explain what changes would alter the outcome of individual decisions.
    • The DPIA should explain the effect of the process on entire groups.



Margot Kaminski

About Margot Kaminski

Margot Kaminski is an Associate Professor at the University of Colorado Law School and the Director of the Privacy Initiative at Silicon Flatirons. She specializes in the law of new technologies, focusing on information governance, privacy, and freedom of expression. Recently, her work has examined autonomous systems, including AI, robots, and drones (UAS). 

In 2018, Professor Kaminski conducted research on comparative data privacy law as a recipient of the Fulbright-Schuman Innovation Grant.