ACADEMIC ARTICLE SUMMARY

Content Moderation Remedies

Article Source: Michigan Technology Law Review, Vol. 28, pp. 1-59, 2021
Written By: Eric Goldman

ARTICLE SUMMARY

Sometimes users’ online content or actions violate Internet service providers’ rules. Services should moderate content effectively, but should also promote free expression, competition, and community.

POLICY RELEVANCE

Services should consider remedies other than removal of problematic content.

KEY TAKEAWAYS

  • Remedies used by Internet service providers differ from those imposed by governments.
    • Unlike governments, private firms cannot impose taxes, use police to compel compliance with rules, or compel citizens to use their services.
    • Unlike the government, private firms are not subject to Constitutional limits.
  • The Digital Millennium Copyright Act (DMCA) gives Internet services a safe harbor from liability for copyright-infringing content posted by users, so long as the service removes infringing content once notified by the copyright owner; European laws require services to remove illegal content shortly after notification.
  • Regulators’ obsession with removal of unlawful content hinders development of other remedies.
    • Removal can destroy evidence of offenses.
    • Account cancellation deprives the cancelled user of a wide range of content and services.
    • Non-removal remedies reduce the perception that service providers censor users.
  • There are five general types of non-removal remedies:
    • Actions against content (such as adding a warning, or providing educational materials).
    • Actions against an online account (such as reduced service, shaming, or suspension).
    • Actions that make violations less visible (such as removal from search engine indexes).
    • Financial consequences (such as fines or withholding of earnings).
    • Miscellaneous remedies (such as community service).
  • Miscellaneous remedies include the following:
    • Tracking warnings by assigning “strikes.”
    • Outing a pseudonymous user by revealing the user’s identity.
    • Reporting violations (such as livestreaming of a crime) to police.
    • Blocklisting certain users across an entire industry; for example, Uber and Lyft share data to block drivers accused of abuse.
  • In designing remedies, services should consider factors such as the severity of the violation and the strength of the evidence that the violation actually occurred.
    • Considering a remedy’s effect on others includes recognizing the author’s free speech rights.
    • Unmasking pseudonymous users might cause disproportionately harsh parallel sanctions to be imposed by courts or by the user’s employer.
  • According to one model, systems should begin with light sanctions and escalate to harsher sanctions if the user repeats the offense; however, this model does not fit situations in which the user causes harm once and never returns, or can easily return under a new identity (see the illustrative sketch after this list).
  • Normative factors can guide service providers and policymakers in designing remedies.
    • Regulators should not standardize remedies across the Internet unless necessary.
    • System design (such as "kindness reminders") can reduce the need for remedies.
    • Private approaches are usually faster and less costly than judicial remedies.
  • Technological filters that let users see only the content they want create the risk of “filter bubbles,” but filter bubbles may be better than suppressing content across the board for everyone.
  • Some urge Internet services to be more transparent about their content moderation activity, but moderation reports can be complex and expensive to produce; regulators should consider whether reporting requirements would really be beneficial.
  • Many observers have called for crackdowns on bad Internet content; however, because content moderation is difficult, regulatory intervention might increase costs but leave everyone dissatisfied.
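
The escalation model referenced above can be illustrated in code. The following Python sketch is purely hypothetical and not from the article; the strike thresholds, sanction names, and the StrikeTracker class are assumptions chosen for illustration.

# Hypothetical sketch of a graduated-sanctions ("strikes") system.
# The thresholds and sanction names below are illustrative assumptions,
# not taken from the article.

SANCTION_LADDER = [
    (1, "warning"),               # first offense: light sanction
    (2, "temporary suspension"),  # repeat offense: harsher sanction
    (3, "account termination"),   # persistent offender: harshest sanction
]

class StrikeTracker:
    """Tracks violations per account and escalates sanctions."""

    def __init__(self):
        self.strikes = {}  # account id -> number of recorded violations

    def record_violation(self, account_id):
        """Record one violation and return the resulting sanction."""
        self.strikes[account_id] = self.strikes.get(account_id, 0) + 1
        count = self.strikes[account_id]
        for threshold, sanction in SANCTION_LADDER:
            if count <= threshold:
                return sanction
        return SANCTION_LADDER[-1][1]  # cap at the harshest sanction

tracker = StrikeTracker()
print(tracker.record_violation("user_a"))  # warning
print(tracker.record_violation("user_a"))  # temporary suspension
print(tracker.record_violation("user_a"))  # account termination

# The model's weakness: a user who causes harm once and never returns
# experiences only the lightest sanction, and a banned user who creates
# a fresh identity starts the ladder over.
print(tracker.record_violation("user_a_2"))  # warning (escalation evaded)

The final call illustrates the limitation the article notes: a one-time offender experiences only the lightest sanction, and escalation is easily evaded by switching identities.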

TAGS

Eric Goldman

About Eric Goldman

Eric Goldman is a Professor of Law at Santa Clara University School of Law, where he is also Director of the school’s High Tech Law Institute. His research and teaching focus on Internet law, intellectual property, and marketing law.