It’s Time to Update Section 230

Article Source: Harvard Business Review, 2021
Time to Read: 2-minute read
Written By: Michael D. Smith and Marshall Van Alstyne



The 1996 Communications Decency Act (CDA) makes online platforms immune from liability for harmful content posted by third parties. The authors argue that platforms should enjoy this immunity only if they take reasonable steps to prevent harm.


Policy Relevance:

Removal of harmful content will not violate free speech rights.


Key Takeaways:
  • Section 230 of the CDA gave online platforms like Twitter a safe harbor from liability for content posted by third-party users; this grant of immunity is out of date and should be revised.
  • Social media platforms benefit us, providing forums for the Me Too and Black Lives Matter movements; however, these platforms also supported planning for the Capitol riot, may enable terrorist recruiting, and facilitate the sexual exploitation of children.
  • Section 230(c)(1) protects platforms from liability for harmful content posted by third parties, protection which the platforms need to remain in business.
  • Section 230(c)(2) allows platforms to police their sites for harmful content, but does not require its removal; this provision stops courts from classifying platforms as publishers, who are liable for all user-generated content.
  • In practice, platforms rarely strive to remove harmful content, as they benefit economically from it while suffering little damage to their reputations.
  • The CDA should be reformed so that online service providers are immune from liability for harmful third-party content only if they take reasonable steps to address content known to be harmful; some courts have accepted this revised duty-of-care standard.
  • Content that causes harm by advocating the violent overthrow of the government or by presenting obscene and child sex-abuse material is not protected by constitutional rights of free speech; a duty-of-care standard does not violate free speech rights.
  • Responsible platforms would benefit from clear boundaries between their service and the harmful conduct of bad actors; by contrast, broader regulation would impose costs on all businesses, responsible or not.



About Marshall Van Alstyne

Marshall Van Alstyne is the Questrom Professor in Management, Professor of Information Systems, and Department Chair of Information Systems at Boston University's Questrom School of Business. He is one of the leading experts in network business models. Professor Van Alstyne conducts research on information economics, covering such topics as communications markets, the economics of networks, intellectual property, social effects of technology, and productivity effects of information.
