Section 230 protects TikTok in “Blackout Challenge” child lawsuit

    As lawsuits continue to pile up against social media platforms for alleged harm to children, a Pennsylvania court has ruled that TikTok is not liable in one case in which a 10-year-old named Nylah Anderson died after attempting to complete a “Blackout Challenge” she discovered on her “For You” page.

    The challenge encourages users to choke themselves until they pass out, and Nylah’s mother, Tawainna Anderson, initially claimed that TikTok’s faulty algorithm was responsible for knowingly recommending the deadly video to her child. The mother hoped that Section 230 protection under the Communications Decency Act, which grants social platforms immunity for content published by third parties, would not apply in the case, but ultimately the judge ruled that TikTok was immune.

    TikTok’s “algorithm was a way to bring the challenge to the attention of those likely to be most interested in it,” Judge Paul Diamond wrote in a memorandum accompanying his order dismissing the case. “By promoting the work of others, Defendants published that work—the very activity that Section 230 protects from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.”

    This isn’t the only lawsuit seeking to hold TikTok liable for children’s deaths from the “Blackout Challenge.” Other lawsuits filed in California this summer are still pending, and they contain similar arguments regarding TikTok’s allegedly faulty algorithm. Diamond suggested that Nylah’s mother “can’t beat Section 230’s immunity” simply “by creatively labeling her claims.” His ruling suggests those other pending lawsuits won’t fare any better in overcoming the effective shield that Section 230 provides social media companies as publishers, regardless of how their algorithms are designed to recommend content.

    “Because Anderson’s design flaw and failure to warn claims are ‘inseparable’ from how defendants choose to publish third-party user content, Section 230 immunity applies,” Diamond wrote.

    Anderson’s attorneys at Jeffrey Goodman, Saltz Mongeluzzi & Bendesky PC, gave Ars a statement on the ruling:

    “The Anderson family will continue to fight to make social media safe so that no other child is killed by the reckless behavior of the social media industry. The federal Communications Decency Act was never intended to allow social media companies to send dangerous content to children, and the Andersons will continue to advocate for the protection of our children from an industry that exploits youth in the name of profit.”

    TikTok did not immediately provide Ars with a statement about the ruling.

    While this ruling is likely to be considered a major loss for child safety advocates, the law firm Seeger Weiss LLP recently announced what it describes as “a new approach to lawsuits against social media companies that circumvents the protections afforded by Section 230.” The firm said it has already filed dozens of lawsuits representing more than 1,000 clients, and it is still seeking more, “alleging that the design of social media platforms causes serious harm to children, including anxiety, depression, eating disorders, sexual exploitation, and suicide.” These lawsuits reach well beyond TikTok, targeting other social platforms from Meta and Snap.

    Seeger Weiss LLP did not immediately respond to Ars’s request for comment to clarify how effective its “new approach” could be in actually defeating Section 230 protections, given this week’s ruling in favor of TikTok.

    Rejecting Anderson’s claims that TikTok has a responsibility not to design its algorithm to recommend dangerous content to children, Diamond quoted the Second Circuit in his ruling as finding that “tools like algorithms designed to match information with a consumer’s interests” are “well within the scope of publishing functions that fall under Section 230.”