If you have ever seen hyper-realistic videos of Barack Obama hurling obscenities at President Donald Trump during a televised interview or Bill Hader physically morphing into the subjects of his impersonations, then you’ve seen Deepfake at work.
Deepfake software superimposes video or still images of a subject onto existing footage and then pairs the result with realistic audio. The software is known mostly for generating funny memes, but experts are worried about the nascent technology being used for more sinister purposes.
A panel of LSU professors came together Tuesday in Hill Memorial Library to discuss the dangers of Deepfake. Among them was Mass Communication Librarian Rebecca Kelley, who said that the library is committed to helping students sort fact from fiction.
“We work a lot with students by helping them to evaluate the information they read and the news they consume,” Kelley said. “We don’t have all the answers, but we’ll hopefully get some of them today.”
Dr. Seungwon Yang, an assistant professor at the Center for Computation and Technology, described some of the mechanics of Deepfake’s algorithm.
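The article does not reproduce the algorithmic details Yang presented, but the classic deepfake face-swap tools are generally built around a pair of autoencoders that share one encoder: the shared latent code learns pose and expression, while a per-identity decoder learns each person's appearance. Below is a minimal, purely illustrative sketch of the swap path with untrained random weights (the sizes, names, and architecture are simplifying assumptions, not a working deepfake):

```python
import numpy as np

rng = np.random.default_rng(0)

DIM_FACE = 64 * 64   # flattened face crop (toy size)
DIM_CODE = 128       # shared latent code

# One shared encoder; one decoder per identity (person A, person B).
W_enc = rng.normal(scale=0.01, size=(DIM_CODE, DIM_FACE))
W_dec_a = rng.normal(scale=0.01, size=(DIM_FACE, DIM_CODE))
W_dec_b = rng.normal(scale=0.01, size=(DIM_FACE, DIM_CODE))

def encode(face):
    # Compress a face crop into the shared pose/expression code.
    return np.tanh(W_enc @ face)

def decode(code, W_dec):
    # Render a latent code back into pixels using one identity's decoder.
    return W_dec @ code

# Training (not shown here) reconstructs A-faces through decoder A and
# B-faces through decoder B; the swap is then just crossing the wires:
face_a = rng.random(DIM_FACE)               # a frame of person A
swapped = decode(encode(face_a), W_dec_b)   # re-rendered as person B
print(swapped.shape)  # (4096,)
```

The "superimposing" the article describes corresponds to pasting each swapped crop back into the original video frame by frame.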
Yang’s prior research has included studying the dissemination of “fake news” on social media, trying to discern among thousands of accounts which ones are “fake” and which are real. This work offers him some insight into the patterns of disinformation campaigns.
Yang was not optimistic about society’s ability to counter future disinformation campaigns. According to Yang, social media outlets need to invest more effort into detecting and filtering fake news and disinformation accounts. He encouraged skepticism in all media consumption.
Professor Len Apcar, Wendell Gray Switzer Jr. Endowed Chair in Media Literacy at the Manship School of Mass Communication and former editor at both the Wall Street Journal and the New York Times, said this trend was worrying.
Apcar cited a presumed Russian attack in 2014 on the Columbian Chemicals Plant in St. Mary Parish, in which hundreds of tweets and text messages spread the hoax that the plant had suffered an explosion.
The specific motivations behind the attack are unclear, but Apcar emphasized that the attack is evidence that cyber-warfare is not some issue of the distant future.
“The Mueller Investigation indicted several individuals in St. Petersburg involved in that attack,” Apcar said. “For anyone who thinks this [cyber warfare] is all just abstract: this is real, and it happens quite often.”
To demonstrate possible lasting impacts, Apcar brought up data showing last-minute confusion and uncertainty among voters in the 2016 general election spurred by developments very late in the campaign, like those involving Clinton’s email investigation.
Apcar also described some of the effects digital impersonation has had on people around the world using editing software like Deepfake.
“One example involves pornographic videos. In India, a news reporter, Rana Ayyub, had been crusading and investigating a number of different controversies for over a year. Opponents edited her face into a pornographic video that went viral and devastated her career and reputation,” Apcar said.
Apcar explained that using Deepfake doesn’t require skill with the software itself: such services can be purchased for less than $20 on online marketplaces like Fiverr, making the technology all the more ripe for weaponization.
In his speech, Associate Professor of Theatre History John Fletcher focused on abstract social conditions at play in Deepfake and other staples of postmodern humor in our “post-truth society.”
Fletcher asserted that, at its core, the culture surrounding Deepfake’s popular uses is driven by a love of irony and satire. Ironic internet memes involving Deepfake, whether they make people laugh or they sow anger and confusion, typically employ dramatic irony for their target audiences.
But the line between comedy and malice is easily blurred, and Fletcher said that ambiguity has opened the door to disinformation campaigns on both sides of the political spectrum. Fletcher ultimately advised reflecting on what feeling a piece of media is intended to evoke in you.
“I suggest to you that we no longer have to play the game,” Fletcher said. “No longer focus on is this real or is this fake, but what is the effect this has on real people, and where do we go from there?”