ACADEMIC ARTICLE SUMMARY

Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations

Article Source: University of Colorado Law Legal Studies Research Paper No. 19-28
Written By: Gianclaudio Malgieri

ARTICLE SUMMARY

The European Union's General Data Protection Regulation (GDPR) addresses concerns about automated decision-making. Impact assessments support both individual privacy rights and oversight of the decision-making system as a whole.

POLICY RELEVANCE

Continuous assessments reveal the effect of algorithms on society and individuals.

KEY TAKEAWAYS

  • Under the GDPR, an individual has the right to be notified when decisions are solely automated, to receive an explanation of how the decision was made, and to contest the decision.
  • Even when individuals do not exercise their GDPR rights, firms must conduct Data Protection Impact Assessments (DPIAs), appoint data protection officers, and submit to third-party audits; together, these regulatory tools create a systemic governance and oversight regime.
  • In the context of automated decision-making, DPIAs support both individual rights and systemic governance.
  • The DPIA could play a key role in enforcing individual rights.
    • DPIA content could explain algorithmic decisions to individuals.
    • The DPIA requires firms to describe what individual rights will look like in practice.
  • The DPIA's main shortcoming is that the report need not be shared with the public; however, individuals could receive and share DPIA content in the course of exercising their rights, enhancing accountability.
  • The GDPR does not address the impact of surveillance or automated decision-making on groups and society; ideally, impact statements should mitigate social harms that go beyond individuals.
  • The GDPR calls for firms to offer multi-layered explanations for automated decision-making.
    • The DPIA should describe the algorithmic model used, input data, testing, and performance metrics.
    • The DPIA should explain what changes would alter the outcome of individual decisions.
    • The DPIA should explain the effect of the process on entire groups.

About Margot Kaminski

Margot Kaminski is an Associate Professor of Law at Colorado Law, where she researches and writes on law and technology. Her work has focused on privacy, speech, and online civil liberties, in addition to international intellectual property law and legal issues raised by AI and robotics. Recently, much of her work has focused on domestic drones (UAVs or UASs).