Layoffs stripped Twitter’s child safety team

    Eliminating child exploitation is “priority #1,” declared Twitter’s new owner and CEO Elon Musk last week. Yet after widespread layoffs and resignations, only one employee remains on a key team dedicated to removing child sexual abuse content from the site, according to two people with knowledge of the matter, both of whom asked to remain anonymous.

    It’s unclear how many people were on the team prior to Musk’s takeover. On LinkedIn, WIRED identified four Singapore-based employees specializing in child safety who said publicly that they left Twitter in November.

    The importance of in-house child safety experts cannot be overstated, researchers say. Based at Twitter’s Asian headquarters in Singapore, the team enforces the company’s ban on child sexual abuse material (CSAM) across the Asia-Pacific region. At the moment, that team has only one full-time employee. The Asia-Pacific region is home to about 4.3 billion people, roughly 60 percent of the world’s population.

    The Singapore team is responsible for some of the platform’s busiest markets, including Japan. Twitter has 59 million users in Japan, second only to the United States, according to data aggregator Statista. Yet the Singapore office has also been hit by widespread layoffs and resignations following Musk’s takeover of the company. Last month, Twitter laid off half its workforce, then emailed the remaining staff asking them to choose between working “long hours at high intensity” or leaving with three months of severance pay.

    The impact of the layoffs and resignations on Twitter’s ability to address CSAM is “deeply concerning,” said Carolina Christofoletti, a CSAM researcher at the University of São Paulo in Brazil. “It is delusional to think there will be no impact on the platform if people who worked on child safety within Twitter are laid off or allowed to resign,” she says. Twitter did not immediately respond to a request for comment.

    Twitter’s child safety experts aren’t fighting CSAM on the platform alone. They receive help from organizations such as the UK’s Internet Watch Foundation and the US-based National Center for Missing & Exploited Children, which also search the web for CSAM content shared on platforms such as Twitter. The IWF says the data it sends to tech companies can be removed automatically by companies’ systems, with no human moderation required. “This ensures that the blocking process is as efficient as possible,” said Emma Hardy, IWF communications director.

    But these third-party organizations focus on the end product and don’t have access to internal Twitter data, Christofoletti says. She describes internal dashboards as crucial for analyzing metadata that helps the people writing detection code identify CSAM networks before the content is shared. “The only people who can see that [metadata] are the ones on the platform,” she says.

    Twitter’s efforts to crack down on CSAM are complicated by the fact that people can share pornography consensually on the platform. The tools platforms use to scan for child abuse struggle to differentiate between a consenting adult and a non-consenting child, according to Arda Gerkens, who leads EOKM, the Dutch foundation for reporting CSAM online. “The technology isn’t good enough yet,” she says, adding that this is why human staff are so important.

    Twitter’s struggle to suppress the spread of child sexual abuse material on its site predates Musk’s takeover. In its latest transparency report, covering July to December 2021, the company said it had suspended more than half a million accounts for CSAM, a 31 percent increase over the previous six months. In September, brands such as Dyson and Forbes suspended advertising campaigns after their promotions appeared alongside child abuse content.

    Twitter was also forced to postpone its plans to monetize consensual adult content and compete with OnlyFans over concerns that doing so could exacerbate the platform’s CSAM problem. “Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” according to an April 2022 internal report obtained by The Verge.

    Researchers are nervous about how Twitter will handle the CSAM issue under its new ownership. Those concerns only deepened when Musk asked his followers to “reply in comments” if they saw issues on Twitter that needed addressing. “This question shouldn’t be a Twitter thread,” says Christofoletti. “That’s exactly the question he should be asking the child safety team he fired. That’s the contradiction here.”