Making Up Political People: How Social Media Create the Ideals, Definitions, and Probabilities of Political Speech

Artificial Intelligence; Networks, the Internet, and Cloud Computing; Internet; Media and Content

Article Snapshot

Author(s)

Mike Ananny

Source

Georgetown Law Technology Review, Vol. 4, pp. 351-366

Summary

Facebook’s fact-checking service assumes that the public is rational and seeks the truth, but people are more strongly influenced by emotion. Some are concerned that false news is not banned, but merely ranked lower in users’ feeds.

Policy Relevance

Social media platforms should reconsider how their systems shape political speech.

Main Points

  • Online speech platforms operate on a large scale, making human oversight difficult; digital platforms must operate using actuarial systems and probabilities to manage content.
     
  • Platforms such as Facebook create the conditions of public life according to their own vision of a desirable future; we should examine the assumptions that platform infrastructures make about public life.
     
  • Science and technology scholars use "infrastructure" to refer to the set of relationships among people and materials (including algorithms and databases) that creates the conditions under which social systems operate.
     
  • In the wake of the 2016 U.S. presidential election, Facebook formed a partnership with five news and fact-checking organizations, creating an infrastructure of political speech.
     
  • Facebook’s fact-checking partnership was driven by an image of an ideal "good citizen" who seeks out high-quality information in a marketplace of ideas; this image of a rational citizen sits uneasily with the role of emotion in driving the popularity of social media content, and with the partnership’s own reliance on popularity metrics.
     
  • Instead of trying to make the public into something it is not, platforms could acknowledge the role of emotion and ask how to create a defensible and desirable public life.
     
  • Digital platforms’ governance systems rely on ill-defined categories such as "false news," "misleading content," "popularity," "impact," and "politician"; fact checkers themselves are often unsure what these terms mean, and the terms’ use cannot be publicly debated.
     
  • When news is rated false, it is ranked lower, purportedly reducing future viewership; this frustrates some fact checkers, who think that false information should be banned outright.
     
  • Probabilistic governance of speech raises many questions, such as how much certainty is needed in eliminating deepfakes to ensure that people see elections as legitimate.
     
  • Key questions for the future include whether the ideal public is a rational, deliberative, truth-seeking public, or a participatory public that is more concerned with the exchange of opinions.
     

Get The Article

Find the full article online
