San Francisco City Attorney David Chiu is suing to shut down 16 of the most popular websites and apps that allow users to “nudify” or “undress” photos of mostly women and girls, who are increasingly harassed and exploited by bad actors online.
These sites, according to Chiu’s lawsuit, are “intentionally” designed to “create fake, nude images of women and girls without their consent,” and boast that users can upload any photo to “see anyone naked,” employing technology that realistically swaps the faces of real victims onto AI-generated explicit images.
“In California and across the country, there has been a sharp increase in the number of women and girls harassed and victimized by AI-generated” non-consensual intimate imagery (NCII), and “this distressing trend shows no sign of slowing down,” Chiu’s lawsuit said.
“Given the wide availability and popularity” of nudification websites, “San Francisco and Californians are at risk of having themselves or their loved ones victimized in this way,” Chiu’s lawsuit warns.
At a news conference, Chiu said the “first-of-its-kind lawsuit” was filed to protect not only Californians but also “a shocking number of women and girls around the world,” from celebrities like Taylor Swift to middle and high school girls. If the city official wins, each nudify site faces $2,500 in fines for each violation of California’s consumer protection law.
In addition to media reports sounding the alarm about AI-generated harms, law enforcement has joined the call to ban “deepfakes.”
Chiu said that harmful deepfakes are often created “by exploiting open-source AI image generation models,” such as earlier versions of Stable Diffusion, which can be honed or “fine-tuned” to easily “undress” photographs of women and girls that are frequently pulled from social media. While later versions of Stable Diffusion make such “disturbing” forms of misuse much harder, San Francisco city officials noted at the press conference that fine-tunable earlier versions of Stable Diffusion are still widely available to be abused by bad actors.
In the United States alone, police have lately been so inundated with reports of fake AI child sex images that it has become difficult to investigate child abuse cases offline, and those AI cases are expected to continue piling up “exponentially.” The abuse has spread so widely that “the FBI has warned of an uptick in extortion schemes using AI-generated non-consensual pornography,” Chiu said at the news conference. “And the impact on victims has been devastating,” harming “their reputations and their mental health,” causing a “loss of autonomy,” and “in some instances, causing individuals to become suicidal.”
Suing on behalf of the people of the state of California, Chiu is seeking a court order requiring nudify site owners to cease operation of “all websites they own or operate that are capable of creating AI-generated non-consensual intimate images of identifiable individuals.” This, Chiu said, is the only way to hold these sites accountable “for the creation and distribution of AI-generated NCII of women and girls and for aiding and abetting others in perpetrating this conduct.”
He is also seeking an order requiring “any domain name registrars, domain name registries, web hosts, payment processors, or companies providing user authentication and authorization services or interfaces” to “restrain” nudify site operators from launching new sites, in order to prevent any further misconduct.
Chiu’s complaint names the most harmful sites uncovered by his investigation and claims that in the first six months of 2024, those sites “were visited more than 200 million times.”
Although victims typically have little legal recourse, Chiu believes that federal and state laws banning deepfake pornography, revenge pornography, and child pornography, as well as California’s unfair competition law, can be used to take down the 16 sites. Chiu hopes a victory will warn other nudify site operators that more takedowns are likely.
“We are bringing this lawsuit to shut these websites down, but we also want to sound the alarm,” Chiu said at the press conference. “Generative AI holds enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit them. We have to be clear that this is not innovation. This is sexual abuse.”