Deepfake videos: Inside the Pentagon’s race against disinformation

When seeing is no longer believing


Advances in artificial intelligence could soon make creating convincing fake audio and video – known as “deepfakes” – relatively easy. Making a person appear to say or do something they did not has the potential to take the war of disinformation to a whole new level. Scroll down for more on deepfakes and what the US government is doing to combat them.

Sources: Stanford University/Michael Zollhöfer, The Max Planck Institute for Informatics, University of Washington, Carnegie Mellon, University of Colorado Denver

What is a deepfake, explained

Take the quiz: Can you spot the deepfake?

Below are four pairs of videos; in each pair, one of the videos is a deepfake. Click on the video that you believe has been manipulated.

Credit: Stanford University/Michael Zollhöfer

Manipulating video is nothing new — just look at Hollywood

It’s been possible to alter video footage for decades, but doing it took time, highly skilled artists, and a lot of money. Deepfake technology could change the game. As it develops and proliferates, anyone could have the ability to make a convincing fake video, including some people who might seek to “weaponize” it for political or other malicious purposes.


See how deepfakes are different: computers, not humans, do the hard work

Now deepfake technology is on the US government's radar

The Pentagon, through the Defense Advanced Research Projects Agency (DARPA), is working with several of the country’s biggest research institutions to get ahead of deepfakes.

But in order to learn how to spot deepfakes, you first have to make them. That work takes place at the University of Colorado Denver, where researchers on DARPA’s program are trying to create convincing deepfake videos. These will later be used by other researchers who are developing technology to detect what’s real and what’s fake.

How are they made?

Spotting a deepfake

A thousand miles west of Denver, a team at SRI International in Menlo Park, California, is developing the crucial second component of DARPA’s program: technology to spot a deepfake.

How are they detected?

By feeding computers examples of real videos alongside deepfakes, researchers are training them to tell authentic footage from manipulated footage.
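The approach described here amounts to supervised binary classification. The sketch below is a minimal illustration of the idea, not DARPA’s or SRI’s actual system: it trains a logistic-regression model on synthetic, made-up feature vectors standing in for measurements (such as blink rates or lighting inconsistencies) that a real detector might extract from video frames.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: in a real detector these would be feature
# vectors extracted from authentic and manipulated video frames.
# The numbers below are illustrative only.
rng = np.random.default_rng(0)
real = rng.normal(loc=0.0, scale=1.0, size=(500, 8))
fake = rng.normal(loc=0.8, scale=1.0, size=(500, 8))  # fakes drift in feature space

X = np.vstack([real, fake])
y = np.array([0] * 500 + [1] * 500)  # 0 = real, 1 = deepfake

# Train on part of the data, then measure how well the model
# separates real from fake on examples it has never seen.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

The harder, unsolved part is the feature extraction this sketch glosses over: as generation techniques improve, the statistical traces that separate the two classes keep shrinking.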

What about fake audio?

Training computers to recognize visual inconsistencies is one way researchers at SRI are working to detect deepfakes. They’re also focusing on fake audio.

Who else is studying deepfake technology?

Researchers at academic institutions like Carnegie Mellon, the University of Washington, Stanford University, and the Max Planck Institute for Informatics are also experimenting with deepfake technology. While not part of DARPA’s program, their work, some of which is featured above, highlights different ways artificial intelligence can be used to manipulate video. *Note: these clips do not have audio.

What happens if we can no longer trust our eyes or our ears?

For more than a century, audio and video have functioned as a bedrock of truth. Not only have sound and images recorded our history, they have also informed and shaped our perception of reality.

Credits: National Archives, NASA, CNN

Some people already question the facts around events that unquestionably happened, like the Holocaust, the moon landing and 9/11, despite video proof. If deepfakes make people believe they can’t trust video, the problems of misinformation and conspiracy theories could get worse. While experts told CNN that deepfake technology is not yet sophisticated enough to fake large-scale historical events or conflicts, they worry that the doubt sown by a single convincing deepfake could permanently erode our trust in audio and video.

What if we can dismiss real events as fake?

Think about it. Would history be different if these recordings were claimed as fake?

In this excerpt from the now-infamous “smoking gun” White House tape, President Nixon is heard agreeing to have administration officials ask the director of the CIA to press the FBI to halt its investigation into the Watergate break-in. Once the tape was made public, Nixon’s political support all but vanished and he ultimately resigned.
In this clip released by Mother Jones in 2012, Mitt Romney was caught on camera at a campaign fundraiser saying that 47 percent of the country is dependent on the government. The video was a setback for Romney’s presidential ambitions.

What's next?

The emergence of deepfake technology has prompted members of the U.S. Congress to request a formal report from the Director of National Intelligence. Senator Marco Rubio worries about the global fallout if a convincing deepfake goes viral before it can be detected.

A new kind of arms race?

Correction: An earlier version of this story misstated the location of SRI International.