GitHub's deepfake porn crackdown still isn't working

    “When we look at the misuse of intimate images, the vast majority of tools and weaponized uses come from the open source space,” says Ajder. But they often start with well-meaning developers, he says. “Someone creates something they think is interesting or cool, and someone with bad intentions recognizes its evil potential and weaponizes it.”

    Some, like the repository that was disabled in August, have purpose-built communities around them for explicit use. The model positioned itself as a tool for deepfake porn, Ajder claims, becoming a “funnel” for abuse, which overwhelmingly targets women.

    Other videos uploaded to the porn streaming site by an account crediting AI models downloaded from GitHub include the faces of popular deepfake targets, the celebrities Emma Watson, Taylor Swift, and Anya Taylor-Joy, as well as other lesser-known but very real women, inserted into sexual situations.

    The creators freely described the tools they used, including two that GitHub has since taken down but whose code survives in other existing repositories.

    Perpetrators seeking deepfakes congregate in many places online, from covert community forums on Discord to plain sight on Reddit, further complicating efforts to stop them. One Redditor offered their services on September 29 using the archived repository's software. “Can someone do my cousin,” asked another.

    Torrents of the main repository that GitHub banned in August are also available in other corners of the internet, showing how difficult it is to police open-source deepfake software across the board. Other deepfake porn tools, such as the app DeepNude, have been similarly taken down, only for new versions to emerge.

    “There are so many models, so many different forks in the models, so many different versions, that it can be difficult to track them all,” says Elizabeth Seger, director of digital policy at the cross-party British think tank Demos. “Once a model is made open source and publicly available for download, there's no way to publicly roll that back,” she adds.

    One deepfake porn creator with 13 manipulated explicit videos of female celebrities cited a prominent GitHub repository that was marketed as an “NSFW” version of another project that encouraged responsible use and explicitly asked users not to use it for nudity. “Learn all available Face Swap AI from GitHUB, without using online services,” their profile on the tube site brazenly states.

    GitHub had already disabled this NSFW version by the time WIRED identified the deepfake videos. But as of January 10, other repositories billing themselves as “unlocked” versions of the model were available on the platform, including one with 2,500 “stars.”

    “It's technically true that once [a model is] out there, it cannot be reversed. But we can still make it harder for people to get access,” says Seger.

    If left unchecked, she adds, the potential harms of deepfake porn aren't just psychological. Knock-on effects include the intimidation and manipulation of women, minorities, and politicians, as has been seen with political deepfakes affecting female politicians worldwide.

    But it's not too late to get the problem under control, and platforms like GitHub have options, Seger says, including intervening at the time of upload. “If you put a model on GitHub and GitHub says no, and all the hosting platforms say no, then it makes it harder for a normal person to get that model.”

    Curbing deepfake porn made with open source models also depends on policymakers, tech companies, developers, and, of course, the creators of abusive content themselves.

    At least 30 US states also have some form of deepfake porn legislation, including bans, according to the nonprofit Public Citizen's legislation tracker, although definitions and policies vary and some laws cover only minors. Deepfake creators in Britain will also soon feel the force of the law, after the government announced on January 7 that it would criminalize the creation of sexually explicit deepfakes as well as the sharing of them.