Siva Vaidhyanathan: How Facebook Disconnects Us and Undermines Democracy

By TAP Staff Blogger

Posted on August 14, 2018


In “Antisocial Media: How Facebook Disconnects Us and Undermines Democracy,” Professor Siva Vaidhyanathan provides a comprehensive account of the effects that Facebook has had on our lives and our world, and he explains how social media is undermining progress and thought.

Professor Vaidhyanathan writes that the "story of Facebook is a story of the hubris of good intentions, a missionary spirit and an ideology that sees computer code as the universal solvent for all human problems. And it’s an indictment of how social media has fostered the deterioration of democratic and intellectual culture around the world."

The Guardian recently reviewed Antisocial Media. Below are a few excerpts from “Anti-Social Media: How Facebook Disconnects Us and Undermines Democracy by Siva Vaidhyanathan – review”.

There are, says Vaidhyanathan, “two things wrong with Facebook: how it works and how people use it”. It works by monitoring its users – hoovering up their data trails and personal information in order to paint virtual targets on their backs at which advertisers (Facebook’s real customers) can take aim. People use it for all kinds of things, many of them innocuous, but some of them absolutely pernicious: disseminating hate speech that leads to ethnic cleansing in Myanmar, for example; spreading white supremacist propaganda in the US or Islamophobic or antisemitic messages in innumerable countries; and so on. People also use it to try to influence democratic elections, to threaten and harass others, to spread fake news, publish revenge porn and perform a host of other antisocial acts.

Vaidhyanathan argues that the central problem with Facebook is the pernicious symbiosis between its business model – surveillance capitalism – and the behaviour of its users. Because Facebook provides “free” services, it derives its revenues solely by monetising the data trails of its users – the photographs they upload, the status updates they post, the things they “like”, their friendship groups, the pages they follow, etc. This enables it to build detailed profiles of each user (containing 98 data points, according to one report), which can then be used for even more precisely targeted advertising.

Facebook “farms” its users for data: the more they produce – the more “user engagement” there is, in other words – the better. Consequently, there is an overriding commercial imperative to increase levels of engagement. And it turns out that some types of pernicious content are good for keeping user-engagement high: fake news and hate speech are pretty good triggers, for example. So the central problem with Facebook is its business model: the societal downsides we are experiencing are, as programmers say, a feature, not a bug.

Read the full article: “Anti-Social Media: How Facebook Disconnects Us and Undermines Democracy by Siva Vaidhyanathan – review” (The Guardian, June 25, 2018)

Professor Vaidhyanathan also discussed his new book with The Washington Post. Below are a few excerpts from that interview, “It’s No Accident that Facebook Is So Addictive.”

WP: Your book suggests that Facebook uses the same kinds of techniques to keep you coming back as a casino does. What does this mean?

SV: Facebook engineers were for many years influenced by a strain of thought that emerged from Stanford University, where in the early 2000s scholars of human-computer interaction, design and behavioral economics were promoting the idea that games could generate “stickiness” among users, giving users just enough positive feedback to want to return to the game but denying them enough pleasure to become satiated. As technology consultant Nir Eyal explains in his revealing and, frankly, frightening book, “Hooked: How to Build Habit-Forming Products,” this idea spread quickly through Silicon Valley, uniting game designers, application engineers, advertising professionals and marketing executives.

Facebook played this game better than most. It’s perfectly designed, like a fruit machine in a casino, to give us a tiny sliver of pleasure when we use it and introduce a small measure of anxiety when we do not use it. A Facebook user says, “What am I missing out on? Did anyone ‘like’ my joke?” A casino patron says, “I wonder if THIS is my lucky moment or lucky pull of the lever.”
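
The casino mechanic Vaidhyanathan describes is, at bottom, a variable-ratio reward schedule. The short Python sketch below is purely illustrative (the check_feed function and its reward probability are hypothetical stand-ins, not anything drawn from Facebook's systems), but it shows why an unpredictable trickle of rewards keeps people pulling the lever.

    import random

    # Toy model of a variable-ratio reward schedule (illustrative only;
    # not Facebook's actual mechanism). The payoff, a "like" or a new
    # notification, arrives unpredictably and never often enough to satiate.

    def check_feed(reward_probability: float = 0.25) -> bool:
        """One pull of the lever: occasionally a reward, usually nothing."""
        return random.random() < reward_probability

    random.seed(42)  # reproducible demo
    pulls = ["*" if check_feed() else "." for _ in range(30)]
    print("".join(pulls))  # a sparse, unpredictable scatter of hits

Because the hits never arrive on a fixed schedule, there is no natural stopping point, the same property that makes slot machines so hard to walk away from.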

WP: For Facebook’s model to work, you suggest that users can’t have real control over their personal information. Why is this so, and what consequences does it have for politics?

SV: From Facebook’s point of view, users shouldn’t know all the ways that Facebook uses and distributes their basic data and records of interactions for two reasons. One, it’s just too vast a collection of uses; two, users might get turned off if their experience on the surface of Facebook — all the “likes,” clicks, videos, comments and messages — [is] interrupted by the reality of what’s behind the wizard’s curtain. So Facebook keeps assuring us we “have control” of our information. But that is limited to superficial control over the audiences within Facebook with which we share posts.

I can limit a post to friends, friends of friends, or everybody on Facebook. I can exclude certain Facebook users from seeing certain posts. But it takes vigilance to manage all that. And it only controls the flow among Facebook users. All the back-end data, the really valuable and sensitive stuff, gets mined by Facebook and used to help target ads and content at you. And for more than five years, Facebook gave away our valuable back-end data to thousands of applications and developers who ranged from scientists working for Cambridge Analytica to the Obama campaign to the people who made Words With Friends.

The consequences for politics are stunning: In 2012, the head of state of a country with massive surveillance and military power had sensitive personal data on millions of Americans, and no one cared. When my colleagues in the social media scholarship world and privacy world tried to raise this issue, no one responded with interest. We could not get reporters to pay attention or editors to run op-eds. The Obama campaign was seen as this supercool digital pioneer, a happy, friendly phenomenon. No one thought about what might happen if a not-so-friendly campaign got the same sort of information on millions of Americans. Then it happened in 2016.
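
The contrast Vaidhyanathan draws between surface-level audience controls and back-end collection can be made concrete with a toy model. The sketch below is hypothetical (Post, visible_to and BACK_END_LOG are invented names, not Facebook's API): audience settings gate what other users see, while publishing logs everything regardless.

    # Hypothetical model (not Facebook's actual code) contrasting the
    # audience controls users are shown with the back-end collection
    # they cannot see or switch off.

    class Post:
        def __init__(self, author, text, audience="friends", excluded=()):
            self.author = author
            self.text = text
            self.audience = audience        # "friends" or "everyone"
            self.excluded = set(excluded)   # users barred from this post

    def visible_to(post, viewer, friends_of):
        """The control users are given: who sees the post in their feed.
        ("Friends of friends" is omitted here for brevity.)"""
        if viewer in post.excluded:
            return False
        if post.audience == "everyone":
            return True
        return viewer in friends_of[post.author]

    BACK_END_LOG = []  # the "really valuable and sensitive stuff"

    def publish(post):
        # The audience setting never reaches this code path: whatever the
        # user chose, the post is logged for profiling and ad targeting.
        BACK_END_LOG.append((post.author, post.text, post.audience))
        return post

However narrow the audience, publish records the same entry, which is why control over who sees a post is not control over the data itself.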

WP: Facebook’s algorithms, like YouTube’s and others’, try to maximize user “engagement.” What does this focus on engagement mean for Facebook’s broader business model and impact on society?

SV: Facebook is in the social engineering business. It constantly tries to manipulate our experience and, thus, our perspective on our friends, issues and the world. It does so haphazardly and incoherently, it seems at first. But, in fact, there is a coherent driving force. Facebook wants to maximize something close to “happiness.” It has fallen under the sway of those who believe one can measure affective states and make changes that can increase satisfaction or joy. It turns out that Bentham’s Panopticon was not his major influence on 21st-century digital culture. It was the idea of maximizing happiness by counting “hedons,” or units of pleasure. Well, you can only dial up something you can count. You can’t really count happiness. So you count a proxy.

For Facebook, that proxy is “engagement,” the number of clicks, shares, “likes” and comments. If a post or a person generates a lot of these measurable actions, that post or person will be more visible in others’ News Feeds. You can already see how this could go wrong. Unsurprisingly, items advocating hatred and bigotry, conspiracy theories or wacky health misinformation generate massive reactions — both positive and negative. A false post about the danger of vaccines would generate hundreds of comments, most of them arguing with the post. But the very fact of that “engagement” would drive the post to more News Feeds. That’s why you can’t argue against the crazy. You just amplify the crazy. Such are algorithms and feedback mechanisms.
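
That feedback mechanism is simple enough to sketch. The toy ranking function below is an illustration only (the equal weighting of likes, shares and comments is a hypothetical scoring rule, not Facebook's actual formula), but it shows why three hundred rebuttals promote a false post just as effectively as three hundred endorsements would.

    # Toy engagement-ranked feed (illustrative only; not Facebook's formula).
    # Every measurable action counts toward visibility, including comments
    # that argue against a post.

    from dataclasses import dataclass, field

    @dataclass
    class Post:
        text: str
        likes: int = 0
        shares: int = 0
        comments: list = field(default_factory=list)

        @property
        def engagement(self) -> int:
            # The proxy: count what is countable, agree or disagree.
            return self.likes + self.shares + len(self.comments)

    def rank_feed(posts):
        """High-engagement posts surface in more News Feeds."""
        return sorted(posts, key=lambda p: p.engagement, reverse=True)

    false_post = Post("A false claim about vaccine dangers")
    cat_photo = Post("My cat asleep on the keyboard", likes=40)

    # Hundreds of users argue with the false post...
    false_post.comments.extend(f"user{i}: Not true." for i in range(300))

    # ...and the arguing itself drives it to the top of the ranking.
    for post in rank_feed([false_post, cat_photo]):
        print(post.engagement, post.text)

Under such a scoring rule, arguing with the crazy is indistinguishable from endorsing it, which is exactly the amplification Vaidhyanathan describes.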

Read the full article: “It’s No Accident that Facebook Is So Addictive.” (The Washington Post, August 6, 2018)

Siva Vaidhyanathan is the Robertson Professor and the Chair of the Department of Media Studies at the University of Virginia. He also teaches in the University of Virginia School of Law. In addition to his most recent book, Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, Professor Vaidhyanathan is the author of The Googlization of Everything (And Why We Should Worry) (University of California Press, 2011), Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity (New York University Press, 2001), and The Anarchist in the Library: How the Clash Between Freedom and Control Is Hacking the Real World and Crashing the System (Basic Books, 2004).
