The Supreme Court on Monday sidestepped final resolution in a pair of cases challenging state laws aimed at limiting social media companies’ power to moderate content. The ruling left in limbo an effort by Republicans who had promoted the legislation as a remedy for what they say is a bias against conservatives.
It was the latest case in which the Supreme Court considered, and then sidestepped, a landmark decision on the parameters of expression on social media platforms.
State laws vary in their details. Florida’s prohibits platforms from permanently banning candidates for political office in the state, while Texas’s prohibits platforms from removing content based on a user’s point of view.
The justices unanimously agreed to send the cases back to the lower courts for further analysis. Justice Elena Kagan, writing for the majority, noted that neither lower court had properly analyzed the First Amendment challenges to the Florida and Texas laws.
“In short, there is much work to be done in both cases,” Justice Kagan wrote, adding, “But that work must be done consistent with the First Amendment, which is not suspended when it comes to social media.”
Under the limited ruling, the state laws remain on the books, but the lower court injunctions also remain in effect. This means that both laws are still blocked from being enforced.
Although the justices voted 9-0 to send the cases back to the lower courts, they disagreed on the reasoning, with several justices writing separate concurrences explaining their positions. Justice Kagan was joined by Chief Justice John G. Roberts Jr., along with Justices Sonia Sotomayor, Brett M. Kavanaugh and Amy Coney Barrett. Justice Ketanji Brown Jackson joined the opinion in part.
In a separate concurrence, Justice Barrett outlined how lower courts might analyze the cases.
Justice Barrett wrote that the federal appeals court hearing the Florida case demonstrated an “understanding of First Amendment protections for editorial discretion” that was “generally correct,” while the appeals court hearing the Texas case did not.
A unanimous three-judge panel of the U.S. Court of Appeals for the 11th Circuit had largely upheld a preliminary injunction that temporarily blocked the Florida law.
In contrast, a divided three-judge panel of the Fifth Circuit overturned a lower court ruling that blocked the Texas law.
Because the justices did not make any major rulings on the underlying issue, both sides were able to declare victory.
Chris Marchese, director of the litigation center at NetChoice, one of the trade groups that challenged the laws, said in a statement that the “Supreme Court agreed with all of our First Amendment arguments.”
Florida Attorney General Ashley Moody suggested on social media that the outcome was in the state's favor. “While there are aspects of the decision that we disagree with, we look forward to continuing to defend the state law,” she said.
The Biden administration supported the social media companies in both cases, Moody v. NetChoice, No. 22-277, and NetChoice v. Paxton, No. 22-555.
In the majority opinion, Justice Kagan noted how quickly the Internet has evolved. Less than 30 years ago, she wrote, the justices still felt the need to define the Internet in their opinions, describing it as “an international network of interconnected computers.”
Today, she wrote, “Facebook and YouTube alone each have over two billion users.”
She described a flood of content that has prompted major platforms to filter and curate posts. The platforms sometimes remove posts altogether or add warnings or labels, often in accordance with community standards and guidelines that help the sites determine how to handle different content.
Because such sites can create “unprecedented opportunities and untold dangers,” she added, it’s no surprise that lawmakers and government agencies are grappling with how and whether to regulate them.
Government agencies are generally better able to respond to these challenges, Justice Kagan noted, but courts still play an integral role in protecting those entities’ rights of expression, just as courts have historically protected the rights of traditional media.
The laws at issue in these cases, statutes passed by legislatures in Florida and Texas in 2021, differ in which businesses they cover and what activities they restrict. But Justice Kagan wrote that both limit platforms’ choices about what user-generated content is shown to the public. Both laws also require platforms to provide reasons for their content moderation choices.
Justice Kagan then provided an indication of how most justices think about applying the First Amendment to such laws.
While it was too early for the court to draw conclusions in the cases, she wrote, the underlying records suggested that some platforms, at least some of the time, were engaged in expression.
“In curating particular feeds, these platforms make choices about which third-party expression to display and how to display it,” Justice Kagan wrote. “They include and exclude, organize and prioritize — and in making millions of those decisions every day, they produce their own distinctive compilations of expression.”
She added that while social media is a newer format, “the essence” is familiar, comparing the platforms to traditional publishers and editors who select and shape the expressions of others.
“We have repeatedly held that laws restricting their editorial choices must meet First Amendment requirements,” Justice Kagan wrote. “The principle does not change because the compilation has moved from the physical to the virtual world.”
So far, however, the courts have avoided definitively ruling on social media platforms’ responsibility for content, even as they continue to recognize the networks’ enormous power and reach.
Last year, the Supreme Court declined to hold tech platforms liable for user content in a pair of rulings, one involving Google and the other involving Twitter. Neither ruling clarified the scope of Section 230 of the Communications Decency Act, the law that shields platforms from liability for users’ posts.
The Florida and Texas laws at issue Monday were prompted in part by the decision of some platforms to ban President Donald J. Trump after the Jan. 6, 2021, attack on the Capitol.
Supporters of the laws said they were an effort to combat what they called Silicon Valley censorship. The laws, they added, promoted free speech and gave the public access to all points of view.
Opponents said the laws trample on the platforms' own rights as enshrined in the First Amendment and turn the platforms into cesspools of filth, hate and lies.
A ruling that tech platforms have no editorial discretion to decide what posts to allow would expose users to a greater diversity of viewpoints. But it would also almost certainly amplify the ugliest aspects of the digital age, including hate speech and disinformation.
The two trade associations challenging the state laws — NetChoice and the Computer & Communications Industry Association — said the actions that the Fifth Circuit Court of Appeals called censorship in upholding the Texas law were editorial statements protected by the First Amendment.
The groups argued that social media companies are entitled to the same constitutional protections as newspapers, which are generally free to publish without government interference.
A majority of the justices were sharply critical of the Fifth Circuit's decision to overturn a lower court order that had blocked the Texas law.
Justice Kagan wrote that the Texas law prevents social media platforms from using content moderation standards “to remove, modify, organize, prioritize, or disclaim posts in their News Feed.” That legislation, she wrote, blocks precisely the types of editorial choices that the Supreme Court has previously held to be protected by the First Amendment.
She said it was unlikely that this particular application of the law would withstand First Amendment scrutiny.
But in separate concurrences, Justices Jackson and Barrett acknowledged the difficulty of making sweeping statements about how free speech protections online should work.
Justice Barrett raised one hypothetical: A social media platform might be protected by the First Amendment if it set rules for what content was allowed on its feed, and then used an algorithm to automate enforcement of those policies. But she said it might be less clear that the First Amendment protected software that determined for itself what content was harmful.
“And what about AI, which is evolving rapidly?” she wrote. “What if platform owners hand over the reins to an AI tool and simply ask it to remove ‘hateful’ content?”
Olivier Sylvain, a law professor at Fordham University, said Monday's ruling could open the door for the court or regulators to consider those more complicated issues. That could include how to deal with online commercial speech, such as platforms that amplify discriminatory advertising, rather than the political positions at the center of Monday's ruling.
“Texas and Florida have been caught up in an ideological political fight over the claim that social media companies are biased against conservative viewpoints,” he said. “I hope at least this has taken that nonsense out of the way and we can start thinking about the many questions that are much more interesting.”