A Zelensky Deepfake Was Quickly Defeated. The Next One Might Not Be

    Other conflicts and political leaders may be less fortunate and more vulnerable to deepfake disruption, says Sam Gregory, who works on deepfake policy at the nonprofit Witness.

    Zelensky’s high profile helped Ukraine’s deepfake warning win international coverage two weeks ago, and it helped his rebuttal on Wednesday spread quickly. His fame may also have prompted social networking companies to respond to the video swiftly. Meta spokesperson Aaron Simpson declined to say how the company detected the video; so did YouTube’s Choi. Twitter’s Kennedy cited unspecified “external investigative reporting” in the company’s statement.

    Not all people targeted by deepfakes will be able to respond as deftly as Zelensky — or find their denial so widely believed. “Ukraine was well positioned to do this,” Gregory says. “This is very different from other cases, where even a poorly made deepfake can create uncertainty about authenticity.”

    Gregory refers to a video released in Myanmar last year that showed a former government minister in custody saying he had provided cash and gold to the country’s former leader, Aung San Suu Kyi.

    The military government that ousted Aung San Suu Kyi in a coup used those images to accuse her of corruption. But in the video, the former minister’s face and voice were distorted, leading many journalists and citizens to suggest that the clip was fake.

    Technical analysis hasn’t solved the mystery, in part because the video is of low quality and because the former minister and others familiar with the truth didn’t speak out as freely, or before as large an audience, as Zelensky did on Wednesday. While automatic deepfake detectors may one day help fight bad actors, they are still a work in progress.

    Deepfakes are generally still used more for titillation or intimidation than grand deception, in part because cruder fakes are easier to make. A deepfake of Russian President Vladimir Putin also circulated on Twitter this week, though it was identified as inauthentic from the start. However, Zelensky’s deepfake and the associated hacks could represent a disturbing new frontier. The quick and successful response to the clip shows how, with a few tweaks and better timing, a deepfake attack could become an effective political weapon.

    “If this was a more professional video and released early in a more successful Russian advance to Kiev, it could have caused a lot of confusion,” said Samuel Bendett, who follows Russian defense technology at the nonprofit CNA. As deepfake technology becomes more easily accessible and more convincing, it is unlikely that Zelensky will be the last political leader to be targeted by fake video.
