ACADEMIC ARTICLE SUMMARY
Allocating Responsibility in Content Moderation: A Functional Framework
Article Source: Berkeley Technology Law Journal, Vol. 36, No. 3, pp. 1091-1172, 2021
Publication Date: 2021
ARTICLE SUMMARY
Observers have criticized the fairness of private-sector platforms’ content moderation policies. In response, some platforms have developed processes that mimic those of public institutions.
POLICY RELEVANCE
Analysts should focus on function in assessing content moderation efforts, not on formal or symbolic processes.
KEY TAKEAWAYS
- Scholars are concerned that using technology to automate moderation of online content produces overbroad and unfair outcomes and lacks transparency.
- Some online private-sector platforms responded to critics by adopting formal legal structures to evoke the legitimacy of public governance methods.
- Google's Advisory Council implements the right of European data subjects to be forgotten; unlike the courts, Google makes the right dependent on a harm analysis.
- Facebook's Oversight Board mimics the United States Supreme Court.
- To safeguard public values, analysis of the role of private-sector entities in content moderation should emphasize functions rather than form; key functions include:
- Defining the content subject to moderation.
- Identifying subject content on the platform.
- Applying the platform’s definitions and rules to the subject content.
- Resolving each case by amplifying, removing, de-prioritizing, or labeling content.
- Two perspectives promote understanding of the role of private-sector platforms in content moderation; these include:
- “Zooming out” to consider the laws that encourage private platforms to moderate online content.
- “Zooming in” to consider specific content moderation tasks, and how those tasks are allocated between human actors and technologies.
- Case studies highlight constraints on the actors involved in content moderation.
- Under provisions of the Digital Millennium Copyright Act concerning platform liability for copyright infringement, the courts ultimately decide whether content alleged to infringe is in fact infringing; users rarely question the legitimacy of the platform's actions.
- By contrast, under Section 230 of the Communications Act, which limits platform liability for unlawful content posted by users, public processes play no role in defining the content subject to moderation, and users find the process opaque or biased.
- In assessing the elements of content moderation, three questions arise:
- What subfunction is the structure or element intended to perform?
- What competencies and public values are relevant to performance of that subfunction?
- What constraints on the actor performing that element will best protect public values?
- Google's Advisory Council could be seen as a move to usurp authority allocated to the public sector; some note that the council lacks features needed for legitimacy, such as representation of all stakeholders.
- Some platforms responded to criticism by producing transparency reports; however, these reports contain an overwhelming amount of information, and they are not produced by an objective authority.