ACADEMIC ARTICLE SUMMARY

Accountable Algorithms

Article Source: University of Pennsylvania Law Review, Vol. 165, No. 3, pp. 633-705, 2017
Time to Read: 2 minute read
Written By:

David G. Robinson

Harlan Yu

Joanna Huey

Joshua A. Kroll

Solon Barocas

ARTICLE SUMMARY

Computers now make many decisions formerly made by humans, but the procedures used to oversee human decision-makers cannot simply be applied to automated systems. This article describes technological tools that help developers design algorithms consistent with social goals.

POLICY RELEVANCE

Fairness and accountability should be built into computerized processes from the start. Policymakers and computer scientists should work together to ensure accountability.

KEY TAKEAWAYS

  • Computers use algorithms to approve loan applications, target travelers for search, grant visas, and more; the public and the courts are ill-equipped to ensure that these algorithms are fair.
  • Some assert that transparency would promote accountability, but transparency alone is often ineffective.
    • Firms could disclose their source code, but only experts could understand it.
    • Users could game the system; for example, disclosure of code intended to target tax audits could enable cheaters to evade detection.
  • Randomization improves some computerized processes, but makes them harder to evaluate.
    • The Roomba floor cleaner moves in random patterns, so its programmers need not specify every possible type of motion in advance.
    • It is hard to detect when a corrupt developer skews “random” results.
  • Computer decisions should be made with “procedural regularity,” that is, the same procedure should be applied to everyone.
  • Many technological tools can improve automated decisions, including software that tests every possible outcome, and the use of encryption to “seal” files.
  • The U.S. State Department uses a lottery to award “green cards” granting permanent residency, and some question the fairness of the process.
    • The State Department should publish commitments to its source code and proofs of the code’s fairness.
    • The State Department could work with a trusted third party to ensure that random selections are truly random, and to audit compliance with its commitments (a minimal commit-and-reveal sketch follows these takeaways).
  • Technological tools can ensure that machine learning systems such as those used to target police searches do not discriminate by race.
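
To illustrate the commitment idea above, the following is a minimal sketch, not drawn from the article itself, of a hash-based commit-and-reveal scheme in Python. An agency publishes commitments to its selection code and to a secret random seed before a drawing, then reveals both afterward so observers can confirm the reveal matches the commitments and can reproduce the selection. The stand-in code string, applicant names, and parameters are illustrative assumptions, not the State Department’s actual procedure.

import hashlib
import random
import secrets

def commit(data: bytes, nonce: bytes) -> str:
    # Hash-based commitment: binds the committer to `data` without revealing it.
    return hashlib.sha256(nonce + data).hexdigest()

# Before the drawing: the agency commits to its selection code and a secret seed.
selection_code = b"winners = rng.sample(applicants, k=2)"  # stand-in for the real source file
seed = secrets.token_bytes(32)                             # secret random seed
code_nonce = secrets.token_bytes(16)
seed_nonce = secrets.token_bytes(16)
published_code_commitment = commit(selection_code, code_nonce)
published_seed_commitment = commit(seed, seed_nonce)

# The drawing: run the committed procedure with the committed seed.
applicants = ["alice", "bob", "carol", "dave"]             # illustrative applicant pool
rng = random.Random(seed)
winners = rng.sample(applicants, k=2)

# After the drawing: the agency reveals the code, seed, and nonces.
# Any observer or auditor can check the reveal against the prior commitments
# and rerun the committed procedure to confirm it reproduces the winners.
assert commit(selection_code, code_nonce) == published_code_commitment
assert commit(seed, seed_nonce) == published_seed_commitment
assert random.Random(seed).sample(applicants, k=2) == winners
print("Commitments verified; the selection is reproducible from the committed seed.")

One limitation this sketch does not address: the agency could still choose a seed that favors known applicants, which is why, as the takeaway above notes, the authors suggest involving a trusted third party in generating or verifying the randomness.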

TAGS

Edward Felten

About Edward Felten

Professor Edward Felten's research interests include computer security and privacy, and public policy issues relating to information technology. Specific topics include software security, Internet security, electronic voting, cybersecurity policy, technology for government transparency, network neutrality and Internet policy.

Joel R. Reidenberg

About Joel R. Reidenberg

Joel R. Reidenberg holds the Stanley D. and Nikki Waxberg Chair in Law at Fordham University where he is the Founding Academic Director of the Center on Law and Information Policy at Fordham Law School. His current research focuses on privacy in public, information surveillance, privacy and cloud computing in public schools, and the impact of patents on the smartphone industry.