San Francisco City Attorney David Chiu is suing to shut down 16 of the most popular websites and apps that allow users to “nudify” or “undress” photos of people, mostly women and girls, who have been increasingly harassed and exploited by bad actors online.
These sites, Chiu’s suit claimed, are “intentionally” designed to “create fake, nude images of women and girls without their consent,” boasting that users can upload any photo to “see anyone naked” by using tech that realistically swaps the faces of real victims onto AI-generated explicit images.
“In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated” non-consensual intimate imagery (NCII) and “this distressing trend shows no sign of abating,” Chiu’s suit said.
“Given the widespread availability and popularity” of nudify websites, “San Franciscans and Californians face the threat that they or their loved ones may be victimized in this manner,” Chiu’s suit warned.
In a press conference, Chiu said that this “first-of-its-kind lawsuit” was brought to defend not just Californians, but “a shocking number of women and girls across the globe”—from celebrities like Taylor Swift to middle and high school girls. Should the city official win, each nudify site risks fines of $2,500 for each violation of California consumer protection law.
On top of media reports sounding alarms about these AI-generated harms, law enforcement has joined the call to ban so-called deepfakes.
Chiu said the harmful deepfakes are often created “by exploiting open-source AI image generation models,” such as earlier versions of Stable Diffusion, that can be honed or “fine-tuned” to easily “undress” photos of women and girls that are frequently yanked from social media. While later versions of Stable Diffusion make such “disturbing” forms of misuse much harder, San Francisco city officials noted at the press conference that fine-tunable earlier versions of Stable Diffusion are still widely available to be abused by bad actors.
In the US alone, cops are currently so bogged down by reports of fake AI child sex images that it’s hampering investigations of offline child abuse cases, and these AI cases are expected to continue spiking “exponentially.” The AI abuse has spread so widely that “the FBI has warned of an uptick in extortion schemes using AI-generated non-consensual pornography,” Chiu said at the press conference. “And the impact on victims has been devastating,” harming “their reputations and their mental health,” causing “loss of autonomy,” and “in some instances causing individuals to become suicidal.”
Suing on behalf of the people of the state of California, Chiu is seeking an injunction requiring nudify site owners to cease operation of “all websites they own or operate that are capable of creating AI-generated” non-consensual intimate imagery of identifiable individuals. It’s the only way, Chiu said, to hold these sites “accountable for creating and distributing AI-generated NCII of women and girls and for aiding and abetting others in perpetrating this conduct.”
He also wants an order requiring “any domain-name registrars, domain-name registries, webhosts, payment processors, or companies providing user authentication and authorization services or interfaces” to “restrain” nudify site operators from launching new sites to prevent any further misconduct.
Chiu’s suit redacts the names of the most harmful sites his investigation uncovered but claims that in the first six months of 2024, the sites “have been visited over 200 million times.”
While victims typically have little legal recourse, Chiu believes that state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography, as well as California’s unfair competition law, can be wielded to take down all 16 sites. Chiu expects that a win will serve as a warning to other nudify site operators that more takedowns are likely coming.
“We are bringing this lawsuit to get these websites shut down, but we also want to sound the alarm,” Chiu said at the press conference. “Generative AI has enormous promise, but as with all new technologies, there are unanticipated consequences and criminals seeking to exploit them. We must be clear that this is not innovation. This is sexual abuse.”