Rising concerns about youth mental health have prompted state lawmakers across the country to propose a slew of age restrictions aimed at protecting minors online. Lawmakers say the rules would shield young people from online pornography, predators and harmful social media posts.
The current push for age restrictions on certain online content mirrors a similar legislative push three decades ago when the Internet was in its infancy. In 1996, Congress passed a major telecommunications bill that made it illegal to knowingly transmit or display “obscene or indecent” material to anyone under the age of 18.
That law had a long-standing precedent: federal broadcast rules dating to the 1920s that barred radio, and later TV, programs from airing obscene language, intended to keep a child who wandered into the living room from overhearing it.
The anti-pornography rules in the 1990s had strong bipartisan support. But civil liberties groups argued that the ban on online indecency violated the First Amendment and stifled free speech. Among other things, they said it was too difficult and expensive for websites to verify a visitor’s age. That could have led sites simply to remove anything inappropriate for kids, creating a Disneyfied internet.
To protect Americans’ access to information that could potentially be considered indecent under the new law, such as educational materials about AIDS, the American Civil Liberties Union sued the government, challenging a section of the law known as the Communications Decency Act.
The ACLU wanted its name on the lawsuit, said Chris Hansen, a former senior attorney for the group. But to be a plaintiff, the group had to face direct threats from the law, and there was nothing on its website that could “harm” children. So the ACLU uploaded a Supreme Court ruling on a riff by the comedian George Carlin about the seven dirtiest words in the English language, which included a transcription of Mr. Carlin’s monologue in all its unbleeped glory.
The ACLU also posted a quiz asking readers to guess the seven obscenities.
After a federal court in Philadelphia temporarily blocked the law, the government appealed, and the case, Reno v. ACLU, named for President Bill Clinton’s attorney general, Janet Reno, was heard by the Supreme Court. There, the ACLU argued that the law’s speech restrictions could curb the Internet’s unique potential and prevent people, including minors, from accessing all kinds of information.
The ACLU argued that the Internet, where users typed or clicked to go to a Web page, was more like a book or newspaper than radio or TV, recalled Ann Beeson, a former assistant legal director for the group. Language in print, which individuals could freely peruse, was more lightly regulated than in broadcast media, where the public had less control over what they were exposed to.
The justices were not particularly familiar with the internet at the time. So court officials staged a demonstration to show how easy it was to find pornography. Senator Ted Cruz, who was a Supreme Court clerk at the time, later recounted how he, along with Justice Sandra Day O’Connor, had looked at “hardcore, explicit” image results for a search for a fruit that is sometimes used as a bawdy euphemism for breasts.
The Supreme Court ultimately sided with the ACLU, finding that the federal restrictions would suppress constitutionally protected speech.
The justices said the blanket restrictions were unacceptable because parents would soon be able to use content-filtering software to protect their children, and because age-verification systems, which at the time typically relied on checking a user’s credit card, were not yet generally available. (That has changed; many online age-verification systems today use credentials such as a driver’s license to verify a user’s age. One vendor said they were now easy to integrate and cost only 10 cents per visitor.)
In its ruling, the Supreme Court upheld a long-standing principle in U.S. law that “you cannot censor speech to adults in the name of protecting minors,” Mr. Hansen said. If the ACLU had lost, “the Internet wouldn’t be what it is today.”
But that was before the current, “extremely online” era, in which critics say powerful social media algorithms have fostered hateful, divisive comments; scaled disinformation; and recommended posts about anorexia and self-harm to young girls.
To try to strengthen online protections for children, California passed the Age-Appropriate Design Code Act last year. The law requires that online services likely to be used by young people, such as social media and video game platforms, have the highest possible privacy settings for minors by default.
It also requires those services to turn off, by default, features that could put minors at risk, such as friend finders that allow adult strangers to contact children.
A tech industry association, NetChoice, has now filed a lawsuit to prevent the child-protection law from taking effect next year. In a legal complaint filed in December, NetChoice said the restrictions would choke off important resources for users of all ages, echoing arguments made by the ACLU in the 1990s.
In March, the Congressional Research Service, a public policy agency that serves Congress, also weighed in, urging lawmakers to consider the potential unintended consequences of new online age restrictions, such as companies collecting more user data and restricting content.
Yet lawmakers continue to propose new online age and content rules.
Last week, Senator Brian Schatz, Democrat of Hawaii, pledged on the Senate floor that his new child online protection bill “will help us stop the growing social media health crisis among children by setting a minimum age.”