Danielle Citron Discusses How Deepfakes Undermine Truth and Threaten Democracy

By TAP Staff Blogger

Posted on September 12, 2019



Deepfakes have the potential to cause grave individual and societal harm. Imagine a deepfake that shows American soldiers in Afghanistan burning a Koran. You can imagine that that deepfake would provoke violence against those soldiers. And what if the very next day there's another deepfake that drops, that shows a well-known Imam based in London praising the attack on those soldiers? We might see violence and civil unrest, not only in Afghanistan and the United Kingdom, but across the globe.
  -  Danielle Citron, from her TED Talk in July 2019

 

At this summer’s TED Summit, Boston University law professor Danielle Citron explained how the malicious use of deepfake technology to manipulate video and audio is becoming a real threat. Below are a few excerpts from her talk, “How Deepfakes Undermine Truth and Threaten Democracy”.

 

Political Elections

 

Deepfakes have the potential to corrode the trust that we have in democratic institutions. So, imagine the night before an election. There's a deepfake showing one of the major party candidates gravely sick. The deepfake could tip the election and shake our sense that elections are legitimate.

 

Deepfakes can exploit and magnify the deep distrust that we already have in politicians, business leaders and other influential leaders. They find an audience primed to believe them. And the pursuit of truth is on the line as well. Technologists expect that with advances in AI, soon it may be difficult if not impossible to tell the difference between a real video and a fake one.

 

The Liar’s Dividend

 

How can the truth emerge in a deepfake-ridden marketplace of ideas? Will we just proceed along the path of least resistance and believe what we want to believe, truth be damned? And not only might we believe the fakery, we might start disbelieving the truth. We've already seen people invoke the phenomenon of deepfakes to cast doubt on real evidence of their wrongdoing. We've heard politicians say of audio of their disturbing comments, "Come on, that's fake news. You can't believe what your eyes and ears are telling you." And it's that risk that Professor Robert Chesney and I call the "liar's dividend": the risk that liars will invoke deepfakes to escape accountability for their wrongdoing.

 

As she continues in her talk, Professor Citron offers suggestions for safeguarding the truth: “We're going to need a proactive solution from tech companies, from lawmakers, law enforcers and the media. And we're going to need a healthy dose of societal resilience.”

 

Watch Professor Citron’s TED Talk: “How Deepfakes Undermine Truth and Threaten Democracy”.

 


 

Danielle Citron is a professor of law at Boston University School of Law, where she teaches and writes about privacy, free speech, and civil procedure. She is an internationally recognized information privacy expert. Her current projects concern the recognition of sexual privacy as a foundational privacy interest; the national security, privacy, and free speech implications of deep fakes; and the legitimacy crisis raised by the automation of agency work. Her book, Hate Crimes in Cyberspace (Harvard University Press), explored the phenomenon of cyber stalking and was named one of the “20 Best Moments for Women in 2014” by Cosmopolitan magazine. In June 2019, Professor Citron testified before the House Permanent Select Committee on Intelligence about the challenges of misinformation and deep fakes.

 

