ACADEMIC ARTICLE SUMMARY
Seeing Like an Algorithmic Error: What Are Algorithmic Mistakes, Why Do They Matter, How Might They Be Public Problems?
Article Source: Yale Journal of Law & Technology, Vol. 24, pp. 1-21, 2022
ARTICLE SUMMARY
The errors made by machine-learning systems such as remote proctoring tools can reveal deeper economic and policy problems, such as bias against students of color or low-income students.
POLICY RELEVANCE
Some errors present deeper public problems and require collective solutions.
KEY TAKEAWAYS
- Algorithmic errors, like other social problems, are made, not found; they are the product of people, perspectives, experiences, and assumptions.
- Some errors are idiosyncratic one-off mistakes, easily corrected.
- Other errors result from systemic, structural forces.
- Some algorithmic errors make good public problems, revealing systemic failures and how they can be remedied; policymakers would benefit from classifying errors more precisely.
- A case study of errors involving a university’s use of remote proctoring tools (RPTs) illustrates the different ways in which errors are perceived and corrected; RPTs use machine learning to detect cheating by students taking tests online.
- When using the RPT, dark-skinned students were told to use extra lighting and to hold their heads still to avoid being flagged as cheaters; the RPT also required exams to be taken in rooms free of audio and visual distractions.
- Technically, the RPT system erred in being unable to detect some faces accurately (a hypothetical sketch of this failure mode appears after this list).
- The requirement that rooms be free of distractions is a type of error, as it biases the system against low-income students living in shared housing.
- The RPT might erroneously cause faculty to believe that students of color cheat more often.
- One could see the university’s economic model, which depends on large classes using standardized tests for evaluation, as another type of error.
- In response to concerns about the RPT, the university discontinued use of one proctoring system; alternatively, it could have decided that mass surveillance was incompatible with the school’s educational mission, reducing class sizes and/or abandoning RPTs entirely.
- Depending on how an error is framed, different aspects of socio-technical systems appear more alterable or worthy of reform.
- One may more easily make a biased dataset larger and more inclusive than question whether algorithmic surveillance is acceptable at all.
- One may more easily ask students of color to illuminate themselves than ask whether models of learning that require surveillance respect autonomy and ethical standards.
- Greater precision in diagnosing algorithmic errors will create communities of people who diagnose and remedy errors in similar ways.
- When correction of algorithmic errors requires collective action, the errors become public problems.
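The detection failure noted above can be made concrete with a small simulation. The following Python sketch is purely hypothetical: the group names, confidence values, and threshold are illustrative assumptions, not figures from the article or any real RPT. It shows how one fixed detection threshold, tuned on an unrepresentative training set, yields disparate false flags across groups.

```python
import random

# Hypothetical sketch (not the article's system or any real RPT): how a
# single detection threshold, calibrated on an unrepresentative training
# set, produces disparate false "face not visible" flags.
# Every name and number below is an illustrative assumption.

random.seed(0)

# Assumed mean face-detection confidence per group, reflecting a training
# set dominated by well-lit, light-skinned faces.
MEAN_CONFIDENCE = {"well_represented": 0.93, "under_represented": 0.84}
SIGMA = 0.05            # assumed frame-to-frame noise in confidence
FLAG_THRESHOLD = 0.80   # frames below this are flagged as possible cheating

def flag_rate(group: str, n_frames: int = 10_000) -> float:
    """Return the fraction of simulated video frames flagged for a student."""
    flags = sum(
        random.gauss(MEAN_CONFIDENCE[group], SIGMA) < FLAG_THRESHOLD
        for _ in range(n_frames)
    )
    return flags / n_frames

for group in MEAN_CONFIDENCE:
    print(f"{group}: {flag_rate(group):.1%} of frames flagged")
```

Under these assumed numbers, the identical threshold flags the under-represented group at many times the rate of the well-represented group. Asking students to add lighting treats the symptom; enlarging the dataset, or questioning the surveillance design itself, are the structural remedies the article contrasts.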