A spokesperson told The Wall Street Journal that “non-consensual pornography and the tools to create it are expressly prohibited by Telegram's terms of service and will be removed as soon as discovered.”
For the teen filing a lawsuit, ClothOff itself remains the primary target. Her lawyers believe it's possible she could get the app and its associated sites blocked in the US, the WSJ reports, if ClothOff doesn't respond to the suit and the court rules against the company in absentia.
But whatever the outcome of the lawsuit, the teen expects she will forever be “haunted” by the fake nudes that a high school student created without ever facing charges.
According to the WSJ, the teenage girl sued the boy who, she said, made her want to drop out of school. Her complaint noted she was informed that “the responsible individuals and other potential witnesses failed to cooperate with, communicate with, or provide access to their electronic devices to law enforcement authorities.”
The teen was left feeling “mortified and emotionally distraught, and she has since suffered lasting consequences,” according to her complaint. She does not know whether ClothOff can continue to spread the harmful images, or how many teens may have posted them online. Because of these unknowns, she is certain she will spend “the rest of her life” monitoring for “the resurfacing of these images.”
“Knowing that the CSAM images of her will almost inevitably find their way onto the Internet and be retransmitted to others, such as pedophiles and human traffickers, has created a sense of hopelessness” and “an ongoing fear that her images could reappear at any time and be viewed by countless others, possibly even friends, family members, future partners, colleges and employers, or the general public,” her complaint said.
The teen's lawsuit is the latest front in a broader effort to crack down on AI-generated CSAM and NCII. It follows an earlier lawsuit filed last year by San Francisco City Attorney David Chiu that targeted ClothOff, one of 16 popular apps used to “nudify” photos of mostly women and young girls.
About 45 states have criminalized fake nudes, the WSJ reported, and earlier this year Donald Trump signed into law the Take It Down Act, which requires platforms to remove both real and AI-generated NCII within 48 hours of reports from victims.