The privacy experts who spoke to WIRED described Rumble, Quora and WeChat as unusual suspects, but declined to speculate on the reasoning behind their participation in the investigation. Josh Golin, executive director of the nonprofit Fairplay, which advocates for digital safety for children, says the concerns aren't always obvious. Few advocacy groups, for example, were concerned about Pinterest until the case of a British teenager who died from self-harm after exposure to sensitive content on the platform, he says.
Paxton's press release last month called the new investigation “a critical step to ensure that social media and AI companies comply with our laws designed to protect children from exploitation and harm.”
The United States Congress has never passed a comprehensive privacy law, and the rules for children's online safety have not been significantly updated in a quarter century. That has led to state lawmakers and regulators playing a major role.
Paxton's investigation focuses on compliance with Texas' Securing Children Online through Parental Empowerment Act, or SCOPE, which went into effect in September. The law applies to any website or app that has social media or chat features and registers users under the age of 18, making it more comprehensive than the federal law, which only covers services for users under the age of 13.
SCOPE requires services to ask users' ages and give parents or guardians power over children's account settings and user data. Companies are also not allowed to sell information about minors without parental consent. In October, Paxton sued TikTok for allegedly violating the law by not providing adequate parental controls and disclosing data without consent. TikTok has denied the allegations.
The investigation announced last month also cited the Texas Data Privacy and Security Act, or TDPSA, which went into effect in July and requires parental consent before processing data about users under the age of 13. Paxton's office asked the companies under investigation to detail their compliance with both laws, the SCOPE Act and the TDPSA, according to the regulatory demands obtained through a public records request.
In total, the companies must answer eight questions by next week, including how many Texas minors they count as users and how many accounts they have banned for registering an inaccurate date of birth. They must also submit lists of the parties to whom minors' data is sold or shared. It could not be determined whether any of the companies had already responded.
Technology industry lobby groups are challenging the constitutionality of the SCOPE Act in court. In August, they won an initial, partial victory when a federal judge in Austin, Texas, ruled that a provision requiring companies to take steps to prevent minors from seeing self-harm and other harmful content was too vague.
But even a complete victory may not be a salve for tech companies. States such as Maryland and New York are expected to begin enforcing similar laws later this year, says Ariel Fox Johnson, an attorney and director of the consulting firm Digital Smarts Law & Policy. And attorneys general could fall back on bringing narrower cases under their time-tested laws against deceptive business practices. “What we're seeing is that information is often shared, sold, or disclosed in ways that families did not expect or understand,” Johnson says. “As more laws are passed that set strict requirements, it seems to be becoming increasingly clear that not everyone is adhering to them.”