The Issues

  • Artificial Intelligence
  • Competition Policy and Antitrust
  • Innovation and Economic Growth
  • Intellectual Property
  • Interoperability
  • Networks, the Internet, and Cloud Computing
  • Privacy and Security

TAP Highlights

Can Algorithms that Automate Decisions Be Held Accountable?

“A Model Algorithmic Impact Assessment process should deliberately widen the lens from algorithms as a technology in isolation, to algorithms as systems embedded in human systems—both those that design the technology, and those that use it.” – Margot Kaminski, University of Colorado Law, and Gianclaudio Malgieri, Vrije Universiteit Brussel

What Is Contextual Integrity?

Consider this: a smartphone user may be okay with their location data being sent to a website when looking for restaurant suggestions, but not with that same data also being used for advertising. Contextual integrity is a new model for privacy that focuses on societal norms and the appropriateness of information flows.

What Is the Promise of AI?

Artificial intelligence (AI) technologies are transforming society and changing processes in business and government. At the same time, AI poses challenges that policymakers and the stakeholders involved in its development and deployment need to address. Read TAP’s new fact sheet on the issues that arise in discussions of AI.

A Tribute to Ian Kerr

TAP honors Professor Ian Kerr, a highly regarded scholar of law, technology, and philosophy, who passed away in August 2019.

New TAP Blog Posts

How the GDPR Approaches Algorithmic Accountability

Posted on November 8, 2019
In “Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations”, Colorado Law Professor Margot Kaminski and Gianclaudio Malgieri, Vrije Universiteit Brussel, explore how a Data Protection Impact Assessment (DPIA) links the two faces of the GDPR’s approach to algorithmic accountability: individual rights and systemic collaborative governance.

Connect With Us: @TAPolicy