Deepfakes are an occupational hazard for journalists. Beware.

There are many words to describe President Trump. “Expressive” is certainly one of them. But on January 8, during the border security address from the Oval Office, viewers of Seattle’s Fox affiliate Q13 found themselves watching an especially expressive president.

Beyond his skin looking more orange-hued than usual, President Trump’s gestures were exaggerated, his mouth movements looked cartoonish, and his tongue hung lazily out of his mouth. It was not a flattering video, to say the least. But before long, some alert viewers started comparing footage across TV stations and realized that the video aired by Q13 didn’t quite match what the rest of the nation saw.

Editors in the newsroom looked into it, and within a day the truth came out: An employee at the station had aired a doctored version of the speech, purposefully designed to make President Trump look bad. That unnamed culprit was immediately fired, and an apology was issued. Q13 – and likely some viewers – had unknowingly fallen victim to a deepfake.

Deepfakes aren’t very difficult to create. By harnessing cutting-edge machine learning and video editing software, anyone with the patience to learn a little bit of coding can take existing video of a person and edit it so that the subject appears to say or do things they never actually did. These forgeries can be so convincing that telling them apart from reality becomes almost impossible.
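How can so little expertise produce such convincing fakes? The architecture behind the original face-swap tools, as it has been widely described, is surprisingly simple: one shared encoder learns to compress two people’s faces into the same latent space, and a separate decoder is trained for each identity. To swap faces, you encode person A and decode with person B’s decoder. Here is a minimal, illustrative PyTorch sketch of that idea; the layer sizes are arbitrary, and random tensors stand in for real, aligned face crops:

```python
import torch
import torch.nn as nn

# Shared encoder: both identities get compressed into the same latent space.
encoder = nn.Sequential(
    nn.Flatten(),
    nn.Linear(64 * 64 * 3, 512),
    nn.ReLU(),
    nn.Linear(512, 128),
)

def make_decoder():
    # One decoder per identity, trained only to reconstruct that person.
    return nn.Sequential(
        nn.Linear(128, 512),
        nn.ReLU(),
        nn.Linear(512, 64 * 64 * 3),
        nn.Sigmoid(),
        nn.Unflatten(1, (3, 64, 64)),
    )

decoder_a, decoder_b = make_decoder(), make_decoder()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.MSELoss()

# Random tensors stand in for aligned 64x64 face crops of persons A and B.
faces_a = torch.rand(32, 3, 64, 64)
faces_b = torch.rand(32, 3, 64, 64)

for step in range(100):  # real training runs for many thousands of steps
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode person A's face, then decode it as person B.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

Publicly available tools wrap this core in face detection, alignment and blending steps, but the learning itself is no more exotic than the loop above.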

To see what this looks like in action, watch this short clip, produced by Jordan Peele and BuzzFeed to demonstrate the alarming capabilities of deepfake technology.

It doesn’t take much of a leap to realize how this could be a terrible tool in the wrong hands. With strong evidence already indicating that “fake news” was a crucial factor in the 2016 election, deepfakes could serve as the next horizon for those determined to wreak havoc.

Hany Farid, a professor and digital forensics expert at Dartmouth College, understands the risks better than most—much of his career has been spent creating technologies designed to detect digital forgeries. “If the trends of the past few years continue, [deep fakes] will pose a huge challenge to us as a society and democracy,” he said. “We have, in fact, already seen serious issues ranging from horrific violence in Myanmar and Sri Lanka and election tampering around the world, all being fueled by fake news, fake images, and fake video.”

Farid says it isn’t hard to envision other worst-case scenarios involving deepfakes. He rattled off a number of troubling possibilities: “Just off the top of my head… a video of President Trump saying that he launched nuclear weapons against North Korea leading to a geopolitical crisis. A video of Jeff Bezos saying that Amazon’s profits are down leading to drop in the stock market opening the door for stock manipulators. A video of Scarlett Johansson in a hard-core pornographic video.”

That last one has already happened. In fact, it was part of the origin story of deepfakes. In 2017, a Redditor with the username “Deepfakes” posted a number of videos in which well-known celebrities had their faces edited into pornographic scenes. Johansson, Daisy Ridley, Emma Watson, Taylor Swift and others were all victimized by the scheme, which appears to have been the first public use of deepfake technology.

The potential damage of deepfakes is clear. But reporters, in particular, will need to place a special emphasis on deepfake awareness.

In a profession driven by the dual needs of verification and speed, they will almost surely emerge as an immense occupational hazard. If journalists are gatekeepers of the truth, then deepfakes are the monsters trying to smash the gate open.

So how can the news media prepare itself to deal with these high-quality forgeries? Farid believes that part of the answer lies in amping up the technological firepower. “It will become increasingly more difficult for the average person to differentiate the real from the fake. We are, therefore, going to need effective and accessible technology to help journalists,” he said.  

On this front, we are already seeing some positive developments. Last summer, Department of Defense forensics experts debuted software that could successfully detect deepfakes. They’re working on making the program even stronger.
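The Pentagon hasn’t published how its tool works, but one cue academic researchers reported around the same time is unnatural blinking: early deepfakes were trained largely on photos of people with their eyes open, so the synthesized faces rarely blink. As a rough illustration of that kind of heuristic (not the DoD method), here is a Python sketch that counts blinks per minute using dlib’s standard 68-point facial landmarks. The input filename and threshold are placeholders, and the pre-trained landmark model must be downloaded separately from dlib.net:

```python
import cv2
import dlib
from scipy.spatial import distance

detector = dlib.get_frontal_face_detector()
# Standard pre-trained 68-landmark model from dlib.net.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(pts):
    # EAR: eye height over eye width; it drops sharply when the eye closes.
    a = distance.euclidean(pts[1], pts[5])
    b = distance.euclidean(pts[2], pts[4])
    c = distance.euclidean(pts[0], pts[3])
    return (a + b) / (2.0 * c)

EAR_THRESHOLD = 0.21  # illustrative; tune on real footage

cap = cv2.VideoCapture("suspect_clip.mp4")  # hypothetical input file
fps = cap.get(cv2.CAP_PROP_FPS)
frames = blinks = 0
eyes_closed = False

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        # Landmarks 36-41 are the left eye, 42-47 the right eye.
        pts = [(shape.part(i).x, shape.part(i).y) for i in range(36, 48)]
        ear = (eye_aspect_ratio(pts[:6]) + eye_aspect_ratio(pts[6:])) / 2
        if ear < EAR_THRESHOLD:
            eyes_closed = True
        elif eyes_closed:  # eye reopened: count one blink
            blinks += 1
            eyes_closed = False

minutes = frames / fps / 60
print(f"{blinks / minutes:.1f} blinks/min")  # humans average roughly 15-20
```

A tell this simple is fragile, of course: newer generators have learned to blink convincingly, which is why detection software has to keep improving just to stay even.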

But the war against deepfakes won’t just be fought by computers. Reporters will also need to keep themselves educated and lean on tried-and-true verification techniques.

“Journalists are going to have to start by being aware of the nature of deepfake technology and what types of content it can and cannot create,” Farid said. “They will also have to revert to traditional methods for validating content.” That means remembering lessons learned in Journalism 101: Checking with sources to see where they got an image or video, independently verifying every story, etc.

It could also be helpful for publications to add a deepfake detection wing to their organizational infrastructure. The Wall Street Journal, for one, created a “Media Forensics Committee” last fall and has trained a team of editors and journalists to find deepfakes. In an interview with Nieman Lab, deputy editor Christina Glancey said, “Raising awareness in the newsroom about the latest technology is critical. We don’t know where future deepfakes might surface so we want all eyes watching out for disinformation.”

Deepfakes are more than a curiosity of the internet. They are a fundamental threat to the news ecosystem and consequently a threat to our democracy. The exaggerated Trump seen by Q13 viewers in Seattle may have been somewhat comical, but the fact that the doctored footage made it onto the airwaves shows that newsrooms are unprepared to identify and handle the issue. This time, it’s a funny Trump. Next time, it could be something far worse.

Alexander Frandsen

