The Federal Trade Commission escalated its battle with the tech industry’s biggest companies on Wednesday when it proposed a blanket ban on the monetization of personal information collected from young people by Meta, Facebook’s parent company.
The commission wants to significantly expand a record $5 billion consent order with the company from 2020 and said Meta had not fully complied with legal commitments it made to review its privacy practices to better protect its users.
Regulators also said Meta had misled parents about its ability to control who their children communicated with through the Messenger Kids app and misrepresented the access it gave some app developers to users’ private data.
The proposed changes mark the agency’s third time taking action against the social media giant over privacy concerns.
“The company’s recklessness has put young users at risk,” Samuel Levine, the director of the FTC’s Bureau of Consumer Protection, said in a press release. “Facebook must answer for its failures.”
The FTC’s administrative action, an internal agency procedure called an “order to show cause,” serves as a preliminary warning to Meta that regulators believe the company has violated its 2020 privacy agreement. The document contains the commission’s allegations against Meta and the proposed restrictions.
Meta, which has 30 days to challenge the filing, was not given advance notice of the action by the FTC. Once Meta responds, the commission will consider the company’s arguments and issue a decision. Meta could then appeal that decision to a federal appeals court.
The FTC’s proposed changes would prevent Meta from profiting from the data it collects from users under the age of 18, and would apply to Meta’s businesses, including Facebook, Instagram and Horizon Worlds, the company’s new virtual reality platform. Regulators want to prevent the company from monetizing that data even after those users turn 18.
That means Meta could be barred from using details about young people’s activities to show them ads based on their behavior or to sell them digital items, such as virtual clothing for their avatars.
Whether a court would approve such changes is unclear. In a statement on Wednesday, Alvaro M. Bedoya, a commissioner who voted to issue the administrative order, said he had concerns about whether the agency’s proposal to limit Meta’s use of young people’s data was sufficiently related to the original case.
In a statement, Meta called the FTC’s administrative warning “a political stunt” and said the company had introduced a “leading” privacy program under the agreement with the FTC. The company promised to fight the agency’s action.
“Despite three years of ongoing engagement with the FTC surrounding our agreement, they provided no opportunity to discuss this new, totally unprecedented theory,” Meta said in a statement.
Meta had already announced limits on targeting ads to users under the age of 18. In 2021, the company said advertisers would still be able to customize ads based on the location, age and gender of minors, but would no longer be able to target ads based on young people’s interests or their activities on other apps and websites. And this year, Meta said it would also stop ad targeting based on the gender of minors.
The FTC’s aggressive action is the first time the commission has proposed such a blanket ban on the use of data to try to protect the online privacy of minors. And it comes amid the most sweeping government push to protect young Americans online since the 1990s, when the commercial Internet was in its infancy.
Fueled by growing concerns about childhood depression and the role online experiences could play in exacerbating it, lawmakers in at least two dozen states introduced bills in the past year that would require certain sites, such as social networks, to bar or limit young people on their platforms. Regulators are also stepping up their efforts, imposing fines on online services whose use or misuse of data could put children at risk.
In recent years, critics have faulted Meta for recommending self-harm and extreme-dieting content to teenage girls on Instagram and for failing to adequately protect young users from child sexual exploitation.
The FTC’s case against the social media giant dates back more than a decade.
In 2011, the agency accused Facebook of misleading users about privacy. In a settlement, Facebook agreed to implement a comprehensive privacy program, including a commitment not to misrepresent its privacy practices.
But after news reports in 2018 that a voter profiling company, Cambridge Analytica, had collected the data of millions of Facebook users without their knowledge, the FTC acted again.
In a consent order finalized in 2020, Facebook agreed to overhaul its privacy practices and to allow an independent reviewer to examine the effectiveness of the company’s privacy program. The company also paid a record $5 billion fine to settle the agency’s charges.
The FTC now says Meta violated that agreement. In its administrative order on Wednesday, the agency cited reports from the privacy reviewer, noting that it had found “gaps and weaknesses” in Meta’s privacy program that required significant additional work.
While much of the report was redacted, it indicated that the reviewer found issues with how Meta assessed user data privacy risks and managed privacy incidents. It also mentioned Meta’s oversight of its data-sharing arrangements with third parties.
The FTC’s crackdown on Meta is the latest signal that the agency is following through on commitments made by Lina M. Khan, its chair, to rein in the power of the technology industry’s dominant firms. In December, the agency moved to halt consolidation among video game makers when it filed a lawsuit to block Microsoft’s $69 billion acquisition of Activision Blizzard, the company behind the popular Call of Duty franchise.
The FTC has also become more aggressive on privacy regulation. Rather than simply trying to protect consumers from increasingly powerful surveillance tools, regulators are seeking to ban certain types of data collection and use that they deem highly risky.
The FTC in December accused Epic Games, the company behind the popular Fortnite game, of illegally collecting data from children and endangering children by matching them with strangers and enabling live chat. Epic agreed to pay a $520 million fine to settle those and other charges. The settlement order also required Epic to disable live voice and text chat by default — the first time regulators had mandated such a fix.
But the data restrictions that the agency now wants to impose on Meta go much further.
The FTC’s proposed changes would prevent Meta’s sites and products from monetizing youth data. The company’s platforms, such as Horizon Worlds, would be allowed to collect and use minors’ information only to provide services to users and for security purposes.
The FTC also wants to prohibit Meta from releasing any new products or features until the company can demonstrate, through written confirmation from an independent privacy reviewer, that its privacy program is fully compliant with the 2020 consent order.