
Meta is trying to keep mentions of mental health – and Zuckerberg's Harvard past – out of the child safety trial

    As Meta faces trial in the state of New Mexico for allegedly failing to protect minors from sexual exploitation, the company is making an aggressive effort to exclude certain information from the legal proceedings.

    The company has asked the judge to exclude certain studies and articles about social media and the mental health of young people; any mention of a recent high-profile teen suicide case and social media content; and all references to Meta's financial resources, employees' personal activities, and Mark Zuckerberg's time as a student at Harvard University.

    Meta's requests to exclude information, called motions in limine, are a standard part of pretrial proceedings, in which a party can ask a judge to determine in advance what evidence or arguments are admissible in court. The goal is to ensure that the jury is presented with relevant facts rather than irrelevant or prejudicial information, and that the defendant receives a fair trial.

    Meta has emphasized in pretrial proceedings that the only question before the jury is whether Meta violated New Mexico's Unfair Practices Act in the way it allegedly handled child safety and youth mental health, and that other information — such as Meta's alleged election interference, misinformation, or privacy violations — should not be taken into account.

    But some requests appear unusually aggressive, two legal experts tell WIRED, including a request that the court exclude any mention of the company's AI chatbots, and the extensive reputation protection Meta is seeking. WIRED was able to review Meta's motions in limine through a public records request to the New Mexico courts.

    These motions are part of a landmark case filed by New Mexico Attorney General Raúl Torrez in late 2023. The state alleges that Meta failed to protect minors from online solicitation, human trafficking, and sexual abuse on its platforms. It also alleges that the company proactively served pornographic content to minors on its apps and failed to implement certain child safety measures.

    The state's complaint describes how investigators were able to easily set up fake Facebook and Instagram accounts posing as underage girls, and how these accounts quickly received explicit messages and were shown algorithmically amplified pornographic content. In another test case cited in the complaint, investigators created a fake account for a mother seeking to traffic her young daughter. According to the complaint, Meta did not act on the suggestive comments other users left in response to her posts, nor did it close any of the accounts that were reported for violating Meta's policies.

    Meta spokesperson Aaron Simpson told WIRED via email that the company has been listening to parents, experts, and law enforcement agencies for more than a decade and has done deep research to “understand the issues that matter most” and to “use these insights to make meaningful changes, like introducing teen accounts with built-in security and providing parents with tools to manage their teens' experiences.”

    “While New Mexico advances sensational, irrelevant and distracting arguments, we are focused on demonstrating our long-standing commitment to supporting young people,” Simpson said. “We are proud of the progress we have made and we are always working to do better.”

    In its pretrial motions in New Mexico, Meta asked the court to exclude any reference to a public advisory on social media and youth mental health issued by Vivek Murthy, the former U.S. surgeon general. It also asked the court to exclude an opinion article by Murthy and Murthy's call for warning labels on social media. Meta argues that the former surgeon general's statements treat social media companies as a monolith and are “irrelevant, inadmissible hearsay and unnecessarily harmful.”