All Article Summaries
These article summaries are written by TAP staff members. TAP’s purpose for this section of the site is to present information, points of view, research, and debates.
Proactive Moderation of Online Discussions: Existing Practices and the Potential for Algorithmic Support
Human moderators can intervene to stop users from posting offensive content, but proactive intervention is labor intensive. Algorithmic tools can help moderators identify problematic conversations at scale.
Allocating Responsibility in Content Moderation: A Functional Framework
Observers have criticized the fairness of private-sector platforms’ content moderation policies. In response, some platforms have developed processes that mimic those of public institutions.
The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age
Intimate privacy concerns the extent to which others may access information about our health, sexuality, gender, and close relationships. The law does not adequately protect intimate privacy.
It’s Time to Update Section 230
Section 230 of the 1996 Communications Decency Act (CDA) makes online platforms immune from liability for harmful content posted by third parties. Platforms should enjoy this immunity only if they take reasonable steps to prevent harm.
The Internet as a Speech Conversion Machine and Other Myths Confounding Section 230 Reform Efforts
Policymakers are now revisiting the responsibility of online platforms for harmful content. Under Section 230 of the Communications Decency Act (CDA), online platforms are immune from liability for content posted by users.
The Ten Most Important Section 230 Rulings
Under Section 230 of the Communications Decency Act of 1996, websites are not legally responsible for content posted on the site by others. A few cases suggest that this immunity does not extend to sites that encourage unlawful content.