How the death of a British teenager changed social media

    Instagram also hides search terms, but only if the term or phrase itself promotes or encourages self-harm, said Tara Hopkins, head of EMEA Public Policy at Instagram. “For other suicide/self-harm search terms that are not inherently violative, we show a message of support before showing results.” The company declined to share how many search terms were blocked.

    Instagram parent company Meta says child safety concerns must be balanced against young people’s free speech. The company admitted that two posts Molly had seen, which were shown to the court, violated Instagram’s policy at the time. But Elizabeth Lagone, head of health and wellness policy at Meta, told the court last week that it is “important to give people that voice” when they are struggling with suicidal thoughts. When the Russell family’s attorney, Oliver Sanders, asked Lagone whether she agreed that the content viewed by Molly and seen by the court was “not safe”, Lagone replied: “I think it is safe for people to express themselves.”

    These comments epitomize what researchers say are major differences between the two platforms. “Pinterest is much more concerned with being decisive, being clear, and de-platforming content that isn’t up to their standards,” said Samuel Woolley, program director of the propaganda research lab at the University of Texas at Austin. “Instagram and Facebook… tend to be much more concerned with running up against free speech.”

    Pinterest hasn’t always worked that way. Hoffman told the court that Pinterest’s guidance used to be “when in doubt, lean toward … lighter content moderation.” But Molly’s death in 2017 coincided with the fallout from the 2016 US presidential election, when Pinterest was implicated in the spread of Russian propaganda. Around that time, Pinterest started banning entire topics that didn’t fit the platform’s mission, such as vaccines and conspiracy theories.

    That is in stark contrast to Instagram. “Meta platforms, including Instagram, are guided by the dictum of wanting to exist as infrastructural information tools, [like] the phone or [telecoms company] AT&T, rather than as social media companies,” said Woolley. Facebook should not be “the arbiter of truth,” Meta founder and CEO Mark Zuckerberg argued in 2020.

    The inquest also revealed differences in how transparent the two platforms were willing to be. “Pinterest has helpfully provided material about Molly’s activities on Pinterest in one go, including not only pins Molly had saved, but also pins she clicked on and scrolled past,” says Varney. Meta never gave the court that level of detail, and much of the information the company did share was redacted, she adds. For example, the company revealed that in the six months before her death, Molly had been recommended 30 accounts with names that hinted at sad or depressing themes. But the actual names of those accounts were redacted, with the platform citing the privacy of its users.

    Varney acknowledges that both platforms have made improvements since 2017. Pinterest results for self-harm search terms don’t contain the same level of graphic material as they did five years ago, she says. But Instagram’s changes have been too little, too late, she argues, noting that Meta didn’t ban graphic self-harm and suicide content until 2019.