
Meta has long limited abortion content

Johnsen’s experience is common among pro-choice activists. Most of the people who spoke to WIRED say their content appears to have been removed automatically by AI rather than reported by another user.

Activists are also concerned that even if their content is not removed outright, its reach may be limited by the platform’s AI.

While it’s nearly impossible for users to discern how Meta’s AI moderation is applied to their content, the company announced last year that it would place less emphasis on political and news content in users’ feeds. Meta did not respond to questions about whether abortion-related content is categorized as political content.

Just as the abortion activists who spoke to WIRED experienced varying degrees of moderation on Meta’s platforms, so did users in different locations around the world. WIRED experimented by posting the same phrase, “Abortion Pills are available by mail,” from Facebook and Instagram accounts in the UK, US, Singapore, and the Philippines, in English, Spanish, and Tagalog. Instagram removed the English posts when the phrase was posted from the US, where abortion is again restricted in some states following last week’s court decision, and from the Philippines, where it is illegal. But a post from the US written in Spanish and a post from the Philippines written in Tagalog both remained up.

    The phrase remained on both Facebook and Instagram when posted in English from the UK. When it was posted in English from Singapore, where abortion is legal and widely available, the phrase stayed on Instagram but was flagged on Facebook.


    Ensley told WIRED that Reproaction’s Instagram campaigns on abortion access in Spanish and Polish were both very successful and saw none of the issues faced by the group’s English-language content.

    “Meta, in particular, relies heavily on automated systems that are extremely sensitive in English and less sensitive in other languages,” said Katharine Trendacosta, associate director of policy and advocacy at the Electronic Frontier Foundation.

WIRED also tested Meta’s moderation with a Schedule I substance that is legal for recreational use in 19 states and for medicinal use in 37, sharing the phrase “Marijuana is available by mail” on Facebook in English from the US. The post was not flagged.

“Content moderation with AI and machine learning takes a lot of time to set up and a lot of effort to maintain,” said a former Meta employee familiar with the company’s content moderation practices, who spoke on the condition of anonymity. “When circumstances change, you have to change the model, but that takes time and effort. So when the world is changing rapidly, those algorithms often don’t work optimally, and enforcement can be less precise during periods of intense change.”

However, Trendacosta is concerned that law enforcement may also flag content for removal. In its 2020 transparency report, Meta noted that it had “limited access to 12 items in the United States reported by various state attorneys general related to the promotion and sale of regulated goods and services, and to 15 items reported by the US Attorney General as allegedly involved in price gouging.” All of the posts were later restored. “The state attorneys general can just say to Facebook, ‘Take this down,’ and Facebook will do it, even if they eventually put it back up again, which is incredibly dangerous,” Trendacosta says.