Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations

Article Snapshot

Author(s)

Ahmer Arif, Kate Starbird and Tom Wilson

Source

Proceedings of the ACM on Human-Computer Interaction, Vol. 3, pp. 1-26, 2019

Summary

Disinformation campaigns may be studied as a form of collaborative crowd-work. Case studies show that disinformation and conspiracy theories are often spread by sincere actors.

Policy Relevance

Social media platforms’ efforts to address misinformation should be substantive, not procedural.

Main Points

  • "Strategic information operations" include efforts by state and non-state actors to manipulate public opinion using fake news, disinformation, information warfare, harassment, and similar acts.
     
  • Disinformation campaigns do not necessarily aim to foster support for any particular idea; some offer distractions or foment uncertainty to hamper productive debate.
     
  • A study of the Russian Internet Research Agency’s efforts to shape discourse in the United States from 2015 to 2016 shows that Russian agents targeted both the Black Lives Matter community and Trump supporters, turning some sincere activists into unwitting supporters of Russia's goals.
     
  • The White Helmets are a humanitarian group providing rescue assistance and medical aid to people affected by the Syrian Civil War.
     
    • State and non-state actors use Twitter and YouTube to undermine the White Helmets’ efforts to win support from Western audiences.
       
    • Unwitting actors include activists and "journalists" whose views are based on misinformation.
       
  • Conspiracy theories concerning events such as school shootings and terrorist attacks are not directed by outside actors; rather, they emerge organically in exchanges among people who share the belief that world events are controlled by sinister forces.
     
    • Those who spread these theories are mainly true believers, though some have other motivations.
       
    • Many sites host not just one conspiracy theory, but many, as low-budget websites have incentives to fill their pages with free or low-cost content.
       
  • Information operations may not follow a top-down model in which coordinators deliberately spread disinformation; rather, campaigns rely on persuading audience members to become unwitting agents who spread disinformation at the grassroots level.
     
  • Social media platforms combat disinformation mainly by targeting inauthentic user accounts, but because real, sincere actors also spread disinformation, these efforts will be ineffective against mature disinformation campaigns.
     
  • Platforms could address disinformation by targeting content that furthers the intent of a disinformation campaign, but this means censoring content produced by true believers, which is at odds with values such as freedom of speech.
     
  • Platforms’ efforts to address disinformation must be substantive, not neutral or merely procedural.
     
  • Computer-Supported Cooperative Work researchers can advance understanding of information operations beyond accounts of "bots" and "trolls"; they should consider both "big data" and "small data" (that is, the qualitative experiences of social media actors) to address disinformation.
     