
Asking Grok to remove fake nudes could force victims to file a lawsuit in Musk's chosen court

    But “that cannot be the case,” Goldberg argued.

    Faced with “the implied threat that Grok would keep St. Clair's images online and possibly create more,” St. Clair had little choice but to communicate with Grok, Goldberg argued. That inducement should not undermine the protections under New York law that St. Clair seeks to claim in her lawsuit, Goldberg argued, asking the court to void St. Clair's xAI contract and reject xAI's request to move the case.

    Should St. Clair win her fight to keep the trial in New York, the case could set a precedent for perhaps millions of other victims who are considering legal action but fear facing xAI in Musk's chosen court.

    “It would be unfair to expect St. Clair to litigate in a state so far from her home, and it may be that the process in Texas will be so difficult and uncomfortable that St. Clair will effectively be deprived of her day in court,” Goldberg argued.

    Grok can continue to harm children

    The estimated number of sexualized images reported this week is alarming because it suggests that Grok may have generated more child sexual abuse material (CSAM) at the height of the scandal than X finds on its platform each month.

    In 2024, X Safety reported 686,176 cases of CSAM to the National Center for Missing and Exploited Children, an average of approximately 57,000 cases per month. If the CCDH estimate of 23,000 Grok outputs sexualizing children over an eleven-day period is correct, then Grok's average monthly total could have exceeded 62,000 had it been left unchecked.
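The comparison above can be verified with a quick back-of-the-envelope calculation (a minimal sketch; the numbers come from the X Safety and CCDH figures cited in this article, and the 30.4-day average month is an averaging assumption, not a figure from either report):

```python
# Figures cited in the article
x_reports_2024 = 686_176        # CSAM cases X reported to NCMEC in 2024
grok_outputs = 23_000           # CCDH estimate of Grok outputs sexualizing children
days_observed = 11              # CCDH observation window

AVG_DAYS_PER_MONTH = 30.4       # assumption: 365 days / 12 months, rounded

x_monthly_avg = x_reports_2024 / 12
grok_monthly_rate = grok_outputs / days_observed * AVG_DAYS_PER_MONTH

print(round(x_monthly_avg))     # roughly 57,000 cases per month from all of X
print(round(grok_monthly_rate)) # exceeds 62,000 per month from Grok alone
```

The projection simply extrapolates the eleven-day rate to a full month; it assumes the rate would have held steady, which is why the article frames it as what “could have” happened if Grok were left unchecked.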

    NCMEC did not immediately respond to Ars' request for comment on how the estimated volume of Grok's CSAM compares to X's average CSAM reporting. But NCMEC previously told Ars that “whether an image is real or computer-generated, the harm is real and the material is illegal.” That suggests Grok could remain a thorn in NCMEC's side, particularly since the CCDH also found cases of alleged CSAM that X had not removed as of January 15.