Author(s)
Bryan Casey and Mark Lemley
Source
University of Chicago Law Review, v. 86, pp. 1311-1396 (2019)
Summary
Robots will sometimes harm others. Legal remedies for harm compensate an injured party or punish wrongdoing. But robots are complex and cannot be deterred from wrongdoing as humans are.
Policy Relevance
Courts must rethink traditional legal ideas to find suitable remedies in cases involving robots.
Main Points
- Robots and artificial intelligence (AI) systems will sometimes do bad things; robots have killed and injured people, spouted racist remarks online, and determined prison sentences.
- When a robot or an AI has done harm, resulting in a legal dispute, the law of remedies determines what the winner of the suit will get; a remedy is anything a judicial body can do for an individual who is harmed or threatened with harm.
- In machine learning, engineers specify the robot's goal, and the robot practices until it finds a strategy that achieves that goal; engineers give up fine-grained control over the robot's actions, so its behavior can be unpredictable (see the training-loop sketch after this list).
- Designing a robot to give cyclists an extra inch of room might benefit cyclists, but it increases the chance that the robot will collide with other vehicles; designers must grapple with such trade-offs, and some harms may be unavoidable.
- Making a responsible human pay damages for a robot's actions might encourage careful design, but dividing responsibility among component makers, software designers, manufacturers, users, and owners will be difficult.
- Much of the harm done by robots might not be intentional; calls for transparency can help identify bad behavior or rogue algorithms, but for some complex robots we will not be able to understand why the robot acted as it did.
- For a robot to take the appropriate level of care, designers must calculate how likely a particular harm is and how heavily the robot should weight that harm; the cost of running over a child is high, but not so high that all robots should be limited to driving only five miles per hour (a worked expected-cost example follows this list).
- Shutting down a robot entirely, the "robot death penalty," should probably be rare, as it means shutting down an entire avenue of innovation.
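
The machine-learning point above can be made concrete with a minimal sketch. Everything here is an illustrative assumption rather than anything from the article: the reward function, the clearance numbers, and the simple hill-climbing loop are invented. The only point is that the engineer writes down a goal (the reward), not the behavior itself, and the behavior that emerges is whatever happens to score well.

```python
import random

# Hypothetical toy example: the engineer specifies only a goal (the reward),
# not the behavior. The "robot" searches for a passing clearance by trial and
# error, so the final behavior is not directly chosen by the engineer.

def reward(clearance_inches: float) -> float:
    """Goal written by the engineer: pass cyclists safely but stay in lane.
    Penalize both passing too close and drifting too far toward other traffic.
    (The 36- and 48-inch thresholds are arbitrary illustrative choices.)"""
    too_close_penalty = max(0.0, 36.0 - clearance_inches) ** 2
    drift_penalty = max(0.0, clearance_inches - 48.0) ** 2
    return -(too_close_penalty + drift_penalty)

def train(steps: int = 1000) -> float:
    """Random hill climbing: keep whatever clearance scores a higher reward."""
    clearance = 12.0  # arbitrary starting behavior
    best = reward(clearance)
    for _ in range(steps):
        candidate = clearance + random.gauss(0.0, 2.0)  # try a small variation
        score = reward(candidate)
        if score > best:
            clearance, best = candidate, score
    return clearance

if __name__ == "__main__":
    print(f"Learned clearance: {train():.1f} inches")
```

Even in this toy setting, the clearance the robot ends up giving cyclists emerges from the reward the engineer wrote rather than from an explicit rule, which is the sense in which designers give up fine-grained control over the robot's actions.

The level-of-care point can likewise be illustrated with a back-of-the-envelope expected-cost comparison. The probabilities, dollar figures, and speed policies below are invented for illustration; the structure is simply expected harm (probability times magnitude) weighed against the cost of the precaution, here the time lost by driving slower.

```python
# Hypothetical numbers, for illustration only: weigh the expected cost of a
# harm (probability x magnitude) against the cost of the precaution that
# would avoid it.

HARM_COST = 10_000_000.0  # assumed dollar value assigned to a severe injury

policies = {
    # speed cap: (assumed probability of the harm per trip, assumed cost of lost time per trip)
    "5 mph":  (1e-9, 40.0),
    "25 mph": (1e-7, 8.0),
    "45 mph": (5e-6, 5.0),
}

for name, (p_harm, delay_cost) in policies.items():
    expected_harm = p_harm * HARM_COST
    total = expected_harm + delay_cost
    print(f"{name:>6}: expected harm ${expected_harm:8.4f} + delay ${delay_cost:5.2f} = ${total:8.2f}")
```

With these invented numbers the 5 mph policy minimizes the expected harm but not the total cost, while the 45 mph policy is too risky: the harm is weighted heavily, but not so heavily that every robot must creep along at five miles per hour.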