Supreme Court ready to reconsider key principles of online speech

    Partisanship has made the stalemate worse. Republicans, some of whom have accused Facebook, Twitter and other sites of censoring them, have pushed the platforms to take down less content. Democrats, on the other hand, have said the platforms should remove more content, such as health misinformation.

    The Supreme Court case challenging Section 230 of the Communications Decency Act is likely to have many ramifications. While newspapers and magazines can be sued for what they publish, Section 230 protects online platforms from lawsuits over most content posted by their users. It also protects platforms from lawsuits when they delete posts.

    For years, judges have cited the law to dismiss claims against Facebook, Twitter and YouTube, ensuring the companies did not assume new legal liability with every status update, post and viral video. Critics say the law has been a Get Out of Jail Free card for the tech giants.

    “If they don’t have any liability on the back end for the damage being facilitated, they basically have a mandate to be as reckless as possible,” said Mary Anne Franks, a law professor at the University of Miami.

    The Supreme Court previously declined to hear several cases challenging the statute. In 2020, it turned away a lawsuit brought by the families of people killed in terrorist attacks who alleged that Facebook was responsible for promoting extremist content. In 2019, it declined to hear the case of a man who said his ex-boyfriend had used the dating app Grindr to send people to harass him; the man sued the app, arguing it was a defective product.

    But on February 21, the court is set to hear Gonzalez v. Google, brought by the family of an American killed in an attack by Islamic State followers in Paris. In the lawsuit, the family argued that Section 230 should not protect YouTube against allegations that the video site supported terrorism when its algorithms recommended Islamic State videos to users. The lawsuit contends that recommendations can count as their own form of content produced by the platform, placing them outside Section 230's protections.

    A day later, the court is scheduled to hear a second case, Twitter v. Taamneh, which addresses a related question: when platforms are legally responsible under federal law for supporting terrorism.