Tech Trade Group is suing California to stop its online child safety law

    A tech industry trade association sued the state of California on Wednesday in an effort to stop a new law on child online safety, a legal challenge that comes amid growing public concern about the risks that content on popular platforms like Instagram and TikTok can pose to younger users.

    The new law, called the California Age-Appropriate Design Code Act, will require many online services to install strong safeguards for minors, including protecting children from potentially harmful content and disabling friend-finding features that allow adult strangers to interact with young people. Gov. Gavin Newsom signed the measure into law in September, making it the first of its kind in the country.

    The industry association, called NetChoice, is suing to block the law before it takes effect in 2024. The trade group’s members include Amazon; Pinterest; TikTok; Google, the owner of YouTube; and Meta, the parent company of Facebook and Instagram.

    In a legal complaint filed in U.S. District Court for the Northern District of California, NetChoice said the legislation would require online services to act as content censors, in violation of constitutional protections of free speech. The group also argued that the law would harm minors and others by restricting their access to free and open online resources.

    The law “pressures companies to serve as roving censors of Internet speech,” according to the NetChoice complaint. “Such excessive moderation,” it added, “will limit the availability of information for users of all ages and stifle important resources, especially for vulnerable youth who rely on the internet for life-saving information.”

    In recent years, children’s groups, parents and researchers have raised concerns that algorithms on platforms such as TikTok and Instagram have promoted harmful content about eating disorders and self-harm to younger users. In response, legislators and regulators in the United States and Europe have strengthened safeguards for children’s online privacy and safety.

    California’s child safety law was a bipartisan effort that both houses of the state legislature passed by unanimous vote. It was based on online child safety rules that came into effect in Britain last year.

    The UK rules require online services that are likely to have underage users to prioritize child safety. In practice, this means that many popular social media and video game platforms must enable the highest privacy settings for younger users in the UK. They must also disable certain features that could encourage children to stay online for hours, such as autoplay, in which videos automatically play one after another.

    Last year, as the UK rules were about to come into force, Google, Instagram, Pinterest, TikTok, Snap, YouTube and others introduced new safeguards for younger users worldwide. YouTube, for example, disabled video autoplay by default for minors.

    The California rules also require online services to disable features such as autoplay for children.

    In the complaint, NetChoice argued that such rules were too broad, would affect too wide a range of online services, and would hinder platforms’ ability to freely select and promote content for users. In particular, the tech trade group argued that systems such as autoplay and content recommendation algorithms were commonly used, “benign” features.

    In response to a reporter’s question about why the group wanted to block the California law when many of its members already complied with similar UK rules, NetChoice said the state law was unconstitutional under the First Amendment.

    “While the UK has a similar law, it has neither a First Amendment nor a long tradition of protecting online speech,” said Chris Marchese, counsel at NetChoice.