Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations

Article Source: Proceedings of the ACM on Human-Computer Interaction, Vol. 3, pp. 1-26, 2019
Publication Date:
Time to Read: 2-minute read
Written By:

 Ahmer Arif

Tom Wilson



Disinformation campaigns may be studied as a form of collaborative crowd-work. Case studies show that disinformation and conspiracy theories are often spread by sincere actors.


Policy Relevance:

Social media platforms’ efforts to address misinformation should be substantive, not procedural.


Key Takeaways:
  • "Strategic information operations" include efforts by state and non-state actors to manipulate public opinion using fake news, disinformation, information warfare, harassment, and similar acts.
  • Disinformation campaigns do not necessarily aim to foster support for any particular idea; some offer distractions or foment uncertainty to hamper productive debate.
  • A study of the Russian Internet Research Agency’s efforts to affect discourse in the United States from 2015 to 2016 shows that Russian agents targeted both the #BlackLivesMatter community and Trump supporters, making some activists unwitting supporters of Russia's goals.
  • The White Helmets are a humanitarian group providing rescue assistance and medical aid to people affected by the Syrian Civil War.
    • State and non-state actors use Twitter and YouTube to undermine the White Helmets’ efforts to gain support from Western audiences.
    • Unwitting actors include activists and "journalists" whose views are based on misinformation.
  • Conspiracy theories concerning events such as school shootings and terrorist attacks are not controlled by outside actors, but emerge organically in messages among those who share the belief that world events are controlled by sinister actors.
    • These actors are mainly true believers, though some have other motivations.
    • Many sites host not just one conspiracy theory, but many, as low-budget websites have incentives to fill their pages with free or low-cost content.
  • Information operations may not follow a top-down model in which coordinators deliberately spread disinformation; rather, the campaigns rely on persuading audiences to become unwitting agents who spread disinformation at the grassroots level.
  • Social media platforms mainly combat disinformation by targeting inauthentic user accounts; because real, sincere actors also spread disinformation, these efforts will be ineffective against mature disinformation campaigns.
  • Platforms could address disinformation by targeting content that furthers the intent of a disinformation campaign, but this means censoring content produced by true believers, which is at odds with values such as freedom of speech.
  • Platforms’ efforts to address disinformation must be substantive, not neutral or merely procedural.
  • Computer-Supported Cooperative Work researchers can advance understanding of information operations beyond accounts of "bots" and "trolls"; they should consider both "big data" and "small data" (that is, the qualitative experiences of social media actors) to address disinformation.



Kate Starbird

About Kate Starbird

Kate Starbird is an Associate Professor at the University of Washington (UW) in the Department of Human Centered Design & Engineering (HCDE). She is Co-founder and Director of the Center for an Informed Public (CIP) and the Director of the Emerging Capacities of Mass Participation (emCOMP) Laboratory, both at UW. She is also adjunct faculty in the Paul G. Allen School of Computer Science & Engineering and the Information School, and a data science fellow at the eScience Institute.