Coroner names Instagram algorithm as a cause of British teenager’s death [Updated]


In a London court this week, coroner Andrew Walker had the difficult task of assessing a question that child safety advocates have been asking for years: How responsible is social media for the content its algorithms serve to minors? The case before Walker involved a 14-year-old named Molly Russell, who died in 2017 after viewing thousands of posts on platforms such as Instagram and Pinterest that promoted self-harm. At one point during the inquest, Walker described the content Russell liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch.”

    Today, Walker concluded that Russell’s death could not safely be ruled a suicide, Bloomberg reports. Instead, he described her cause of death as “an act of self-harm while suffering from depression and the negative effects of online content.”

    Bloomberg reported that Walker based this decision on Russell’s prolific use of Instagram (liking, sharing, or saving 16,300 posts in the six months before her death) and Pinterest (saving 5,793 Pins over the same period), coupled with how the platforms served that content in ways that contributed to Russell’s depressive state.

    “The platforms operated using algorithms in such a way that, in some circumstances, they resulted in binge periods of images, video clips and text,” which “romanticized self-injury” and “sought to isolate and discourage discussion with those who could have helped,” Walker said.

    Following Walker’s ruling, Russell’s family issued a statement to Ars calling it a historic decision and saying the court hadn’t even reviewed the most disturbing content Molly encountered.

    “The past two weeks have been extremely painful for our family,” the Russell family said in the statement. “We miss Molly even more painfully than usual, but we hope that the scrutiny this case has received will help prevent similar deaths encouraged by the disturbing content that is still available to this day on social media platforms, including Meta’s.”

    Bloomberg reports that the family’s attorney, Oliver Sanders, has asked Walker to send instructions on preventing this from happening again to Pinterest, Meta, the UK government, and the UK communications regulator. In their statement, the family urged UK regulators to swiftly pass and enforce the UK Online Safety Bill, which, according to The New York Times, could set “new safeguards for younger users worldwide.”

    Pinterest and Meta mounted different defenses

    During the inquest, Pinterest and Meta took different approaches to defending their policies. Pinterest apologized, saying it did not have the technology it currently has to moderate the content Molly was exposed to more effectively. But Meta’s head of health and wellness, Elizabeth Lagone, frustrated the family by telling the court that the content Molly viewed was considered “safe” by Meta’s standards.

    “We’ve heard a senior Meta executive describe this deadly stream of content that the platform’s algorithms pushed to Molly as ‘SAFE’ and not against the platform’s policies,” the Russell family wrote in their statement. “If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive.”

    A Meta spokesperson told Bloomberg that the company is “committed to making sure Instagram is a positive experience for everyone, especially teens,” promising to “carefully consider the coroner’s full report when it is provided.”

    Molly’s family made a point of praising Pinterest for its transparency during the inquest, urging other social media companies to treat Pinterest as a model when responding to investigations into their content policy decisions.

    “For the first time today, tech platforms have been formally held responsible for the death of a child,” the Russells’ statement said. “Going forward, as a family, we hope that any other social media companies called to an inquest follow Pinterest’s lead by taking steps to learn lessons and engaging genuinely and respectfully with the inquest process.”

    Bloomberg reported that Pinterest said “Molly’s story has strengthened our commitment to creating a safe and positive space for our Pinners.” In response to the ruling, Pinterest said it has “continued to strengthen” its “policies around self-harm content.”

    Neither Pinterest nor Meta immediately responded to Ars’ request for comment. [Update: Pinterest told Ars that its thoughts are with the Russell family, saying it has listened carefully to the court and the family throughout the inquest. Pinterest said it is “committed to making ongoing improvements to help ensure that the platform is safe for everyone” and that, internally, “the Coroner’s report will be considered with care.” Since Molly’s death, Pinterest said it has taken steps to improve content moderation, including blocking more than 25,000 self-harm-related search terms and, since 2019, combining “human moderation with automated machine learning technologies to reduce policy-violating content on the platform.”]