The Internet as a Speech Conversion Machine and Other Myths Confounding Section 230 Reform Efforts

Article Source: University of Chicago Legal Forum, Vol. 2020, Article 3, pp. 45-75, 2020
Written By: Mary Anne Franks

Policymakers are now revisiting the responsibility of online platforms for harmful content. Under Section 230 of the Communications Decency Act (CDA), online platforms are immune from liability for content posted by users.


Policy Relevance:

Online platforms should be liable for encouraging illegal activity or knowingly leaving up unambiguously harmful content.


Key Takeaways:
  • Section 230 of the CDA was intended to encourage online service providers to clean up online content, by insulating the provider from liability for wrongful content posted by others; Section 230 is considered vital to online innovation and free speech.
  • Courts have extended online platforms' immunity from liability to cases in which a platform encourages illegal action, refuses to take down harmful content, or profits from users' illegal actions.
  • Courts rarely analyze whether content posted online should actually qualify as protected speech under the First Amendment.
    • Some speech matters for self-expression or the search for truth, but not all speech does.
    • Conduct that would not be protected speech under the First Amendment if it occurred offline should not be transformed into speech merely because it occurs online.
  • Some view a platform's removal of user-generated content as a form of governmental censorship barred by the First Amendment; however, treating online platforms like Facebook as governmental actors would deprive them of important liberties.
  • One proposal to amend Section 230 would carve out certain categories of content (such as content related to sex trafficking) so that platforms would lose immunity for knowingly hosting it; however, this approach could do more harm than good, as platforms might respond either by blocking everything related to sex or by avoiding monitoring their platforms entirely.
  • Section 230 should be amended so that online platforms that deliberately leave up harmful content, or that host or encourage illegal activity, lose their immunity.
  • Alternatively, providers could enjoy immunity from liability if they could demonstrate that their content moderation policies are reasonable.



About Danielle Citron

Danielle Citron is the Jefferson Scholars Foundation Schenck Distinguished Professor in Law at the University of Virginia School of Law. She writes and teaches about privacy, free expression and civil rights. She is an Affiliate Scholar at the Stanford Center on Internet and Society, Affiliate Fellow at the Yale Information Society Project, Senior Fellow at Future of Privacy, Affiliate Faculty at the Berkman Klein Center at Harvard Law School, and a Tech Fellow at the NYU Policing Project.