Policymakers and the public struggle to navigate misinformation about topics like climate change, COVID-19, and politics. Disinformation campaigns and ideologically motivated convictions are harder to fight than ordinary mistaken beliefs.
Online platforms can add warnings to misinformation, but some corrections backfire.
- Misinformation usually arises from an isolated incident or misunderstanding; disinformation, by contrast, is spread deliberately by multiple coordinated actors pursuing strategic financial or political goals.
- Fighting misinformation requires only that an observer identify and correct the misleading message, a task a machine learning system could perform.
- Fighting disinformation requires observers to understand the motivations and strategies of many different participants; diplomacy and economic sanctions might best address disinformation campaigns organized by foreign actors.
- The Constitution of the United States does not allow the government to censor speech, even if it is misleading; however, the government can prosecute harmful conduct such as fraud, even if speech is involved.
- Section 230 of the Communications Decency Act immunizes online platforms from liability for third-party speech; platforms can address content such as coronavirus misinformation by adding warnings or removing it without being held liable as a “publisher.”
- A user’s mistaken belief can be corrected by new facts, but users resist correction of convictions supported by ideology.
- If platforms or officials distribute information about the COVID-19 vaccine's safety and efficacy early on, some who hesitate to accept the vaccine might change their minds, but those who refuse it for religious or political reasons will likely continue to refuse.
- Some efforts to correct misinformation produce a "backfire effect" that entrenches the false belief; research is needed to identify correction methods that do not make matters worse.