San Francisco City Attorney David Chiu is filing a lawsuit to shut down 16 of the most popular websites and apps that allow users to “nude” or “undress” photos — primarily of women and girls — who are increasingly being harassed and exploited by malicious actors online.
These sites, Chiu alleged in his lawsuit, are “intentionally” designed to “create fake nude photos of women and girls without their consent,” and boast that any user can upload a photo to “see everyone naked” using technology that realistically replaces the faces of real victims with explicit images generated by AI.
“In California and across the country, there has been a sharp increase in the number of women and girls being harassed and victimized by AI-generated ‘nonconsensual intimate images’ (NCII) and ‘this disturbing trend shows no sign of abating,’” Chiu’s lawsuit states.
“Given the widespread availability and popularity” of nudity sites, “residents of San Francisco and California are at risk of having themselves or their loved ones victimized in this manner,” Chiu warned in his complaint.
At a press conference, Chiu said this “first-of-its-kind lawsuit” is being filed to defend not just California residents, but “a shocking number of women and girls around the world” — from celebrities like Taylor Swift to high school girls. If Chiu wins, each nudify site could face a $2,500 fine for each violation of California’s consumer protection law that the court finds.
In addition to alarming media reports about the damage caused by AI, law enforcement officials are also calling for a ban on so-called deepfakes.
Chiu said the harmful deepfakes are often created “by leveraging open-source AI image generation models,” such as earlier versions of Stable Diffusion, which can be refined or “tuned” to easily “strip” images of women and girls often plucked from social media. While later versions of Stable Diffusion make such “troubling” forms of abuse much more difficult, San Francisco city officials noted at the news conference that refined earlier versions of Stable Diffusion are still widely available for malicious actors to exploit.
In the US alone, law enforcement is currently so inundated with reports of fake AI child pornography that it’s becoming difficult to investigate child abuse cases offline, and these AI cases are expected to continue to grow “exponentially.” AI abuse has become so widespread that “the FBI has warned of a rise in extortion schemes using AI-generated nonconsensual pornography,” Chiu said at the press conference. “And the impact on victims has been devastating,” damaging “their reputations, damaging their mental health,” leading to “loss of autonomy” and “in some cases, suicidal thoughts.”
Chiu is suing on behalf of California residents, seeking an injunction requiring nudify site owners to cease and desist from operating “any and all websites they own or control that are capable of creating AI-generated” nonconsensual intimate images of identifiable individuals. It’s the only way, Chiu said, to hold these sites “accountable for creating and disseminating AI-generated NCII of women and girls and for aiding and abetting others in engaging in this behavior.”
He also wants an order requiring “domain name registrars, domain name registries, web hosts, payment processors, or companies that provide user authentication and authorization services or interfaces” to “restrict” operators of nudify sites from launching new sites, to prevent further misconduct.
Chiu's lawsuit withholds the names of the most damaging sites his investigation uncovered, but he claims the sites were visited “more than 200 million times” in the first six months of 2024.
While victims typically have little legal recourse, Chiu believes that state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography, as well as California’s unfair competition law, could be used to take down all 16 sites. Chiu expects a victory would serve as a warning to other nudify site operators that more takedowns are likely to come.
“We're filing this lawsuit to get these websites shut down, but we also want to sound the alarm,” Chiu said at the press conference. “Generative AI has tremendous promise, but as with all new technologies, there are unforeseen consequences and criminals who try to exploit it. We need to make it clear that this is not innovation. This is sexual abuse.”