YouTube has been accused of facilitating terrorist recruitment for years. This allegedly happens when a user clicks on a terrorist video hosted on the platform and then goes down a rabbit hole of extremist content automatically queued up "next" by YouTube's recommendation engine. In 2016, the family of Nohemi Gonzalez — who was killed in a 2015 Paris terrorist attack after extremists reportedly relied on YouTube for recruitment — sued YouTube owner Google, forcing courts to consider YouTube's alleged role in aiding and abetting terrorists. Google has been defending YouTube ever since. Then, last year, the Supreme Court agreed to hear the case.
Now the Gonzalez family hopes the Supreme Court will agree that Section 230 protections, designed to shield websites from liability for hosting third-party content, should not be expanded to also protect platforms' recommendations of harmful content.
However, Google thinks this is exactly how the liability shield should work. Yesterday, Google argued in a court filing that Section 230 protects YouTube's recommendation engine as a legitimate tool "intended to facilitate the communication and content of others."
"Section 230 covers sorting content through algorithms by defining 'interactive computer service' to include 'tools' that 'pick, choose,' 'filter,' 'search, subset, organize,' or 'reorganize' content," Google argued. "Congress intended to protect these functions, not just the hosting of third-party content."
Google claimed that denying that Section 230 protections apply to YouTube's recommendation engine would remove the shield protecting all websites that use algorithms to sort and surface relevant content — from search engines to online shopping sites. This, Google warned, would create "devastating spillovers" that would turn the Internet "into a disorganized mess and a litigation minefield" — which is exactly what Section 230 was designed to prevent.
According to Google, a ruling against it would turn the internet into a dystopia where all websites, and even individual users, could be sued for sharing links to content deemed offensive. In a statement, Google's general counsel, Halimah DeLaine Prado, said such liability would lead some larger websites to over-censor content out of extreme caution, while websites with fewer resources would likely go the other way and not censor anything.
“A decision that undermines Section 230 would cause websites to remove potentially controversial material or turn a blind eye to objectionable content to avoid knowledge of it,” DeLaine Prado said. “You would be forced to choose between overly curated mainstream sites or fringe sites overrun with offensive content.”
The Supreme Court will begin oral arguments in this case on February 21.
Google has asked the court to uphold the 9th Circuit Court of Appeals ruling, which found that Section 230 does indeed shield YouTube's recommendation engine. The Gonzalez family wants a ruling that Section 230 immunity does not extend to YouTube's recommendations of terrorist videos posted by third parties.
Ars could not immediately reach either legal team for comment.
Next: Deciding the fate of Section 230
In its filing, Google argued that YouTube is already working to counter recruiting efforts through community guidelines banning content that promotes terrorist organizations.
Since 2017, Google has taken steps to remove violative content and limit its reach, including fine-tuning YouTube's algorithms to better recognize extremist content. Perhaps most relevant to this case, YouTube at that time also implemented a "redirect method," using targeted ads to divert potential ISIS recruits away from radicalization videos.
Today, Google said in the filing, YouTube functions differently than it did in 2015, with the video-sharing platform investing more heavily in stronger enforcement of its violent extremism policies. In the last quarter of 2022, YouTube automatically detected and removed about 95 percent of videos that violated those policies, the filing said.
According to Google, companies operating under the protection of Section 230 are already motivated to make the internet safer, and the Supreme Court must consider how any decision reinterpreting Section 230 threatens to upset that delicate balance.
Google argues that it should not be for the Supreme Court to reform Section 230, but for Congress. So far, lawmakers' recent attempts to reform Section 230 have failed, but this week Joe Biden urged Congress to join him in rolling back the liability shield. If Biden has his way, platforms like YouTube could be held liable for hosting objectionable third-party content in the future. Such a rule change could give the Gonzalez family peace of mind, knowing that YouTube would be legally required to proactively block all terrorist videos, but Google's argument suggests such an extreme reform of Section 230 would inevitably "turn the internet upside down."