ACADEMIC ARTICLE SUMMARY
The Internet as a Speech Conversion Machine and Other Myths Confounding Section 230 Reform Efforts
Article Source: University of Chicago Legal Forum, Vol. 2020, Article 3, pp. 45-75, 2020
Publication Date: 2020
Time to Read: 2 minutes
ARTICLE SUMMARY
Policymakers are now revisiting the responsibility of online platforms for harmful content. Under Section 230 of the Communications Decency Act (CDA), online platforms are immune from liability for content posted by users.
POLICY RELEVANCE
Online platforms should be liable for encouraging illegal activity or knowingly leaving up unambiguously harmful content.
KEY TAKEAWAYS
- Section 230 of the CDA was intended to encourage online service providers to clean up online content by insulating providers from liability for wrongful content posted by others; Section 230 is considered vital to online innovation and free speech.
- Courts have extended online platforms' immunity from liability to cases in which the platform encourages illegal action, refuses to take down harmful content, or profits from users' illegal actions.
- Courts rarely analyze whether content posted online should actually qualify as protected speech under the First Amendment.
- Some speech matters for self-expression or the search for truth, but not all speech does.
- Conduct that would not be protected speech under the First Amendment offline should not be transformed into speech merely because it occurs online.
- Some view a platform's removal of user-generated content as a form of governmental censorship barred by the First Amendment; however, treating online platforms like Facebook as governmental actors would deprive them of important liberties.
- One proposal to amend Section 230 would carve out certain types of content (such as content related to sex trafficking) so that platforms lose immunity for knowingly hosting it; however, this would do more harm than good, as platforms would either block everything related to sex or stop monitoring their platforms entirely.
- Section 230 should be amended so that online platforms that deliberately leave up harmful content, or that host or encourage illegal activity, lose their immunity.
- Alternatively, providers could retain immunity from liability if they could demonstrate that their content moderation policies are reasonable.