Kate Ruane, director of the Center for Democracy and Technology's Free Expression Project, says most major tech platforms now have policies banning the non-consensual distribution of intimate images, with many of the largest agreeing to a set of principles to tackle deepfakes. “I would say it's actually not clear whether the creation or distribution of non-consensual intimate images is prohibited on the platform,” Ruane said of Telegram's terms of service, which are less detailed than those of other major tech platforms.
Telegram's approach to removing harmful content has long been criticized by civil society groups, with the platform historically hosting scammers, far-right groups and terrorism-related content. Since Telegram CEO and founder Pavel Durov was arrested in France in August and charged with a range of alleged offenses, Telegram has begun making some changes to its terms of service and providing data to law enforcement agencies. The company did not respond to WIRED's questions about whether it specifically prohibits explicit deepfakes.
Doing the damage
Ajder, the researcher who discovered deepfake Telegram bots four years ago, says the app is almost uniquely positioned for deepfake abuse. “Telegram gives you the search functionality, so you can identify communities, chats and bots,” says Ajder. “It provides the bot hosting functionality, so it's the place where the actual tool is delivered. Then it is also the place where you can share it and actually do the damage in terms of the end result.”
In late September, several deepfake channels began posting that Telegram had removed their bots. It is unclear what prompted the removals. On September 30, a channel with 295,000 subscribers posted that Telegram had “banned” its bots, but shared a new bot link for users to switch to. (The channel was removed after WIRED sent questions to Telegram.)
“One of the things that's really worrying about apps like Telegram is that it's so difficult to track and monitor, especially from a survivor perspective,” said Elena Michael, co-founder and director of #NotYourPorn, a campaign group dedicated to protecting people from image-based sexual abuse.
Michael says Telegram is “notoriously difficult” to discuss security issues with, but notes that there has been some progress at the company in recent years. However, she says the company needs to be more proactive in moderating and filtering the content itself.
“Imagine being a survivor who has to do that themselves; the burden shouldn't be on one person,” Michael says. “Surely the burden should be on the company to build something that is proactive rather than reactive.”