
Government, Technology

Aug. 21, 2024

'Deepfake nudes' lawsuit could be attack on Section 230

One legal expert said that the lawsuit could be deliberately targeting small players in the hope of creating precedent to later go after major tech companies.

San Francisco City Attorney David Chiu

Some legal experts believe that San Francisco's lawsuit, which accuses websites that let consumers create "deepfake nude" photos of real people of violating sexual abuse and revenge porn laws, could increase scrutiny of Section 230 of the Communications Decency Act, internet companies' primary litigation shield.

To combat what is being called AI-generated "sexual abuse," lawmakers have proposed several pieces of legislation to skirt Section 230's limitation of websites' liability for content posted on their platforms. Most common among them are proposed bills that would criminalize AI-generated deepfake nudes by targeting the statute's federal crimes and sex trafficking exceptions.

In California, Senate Bill 926 and Assembly Bill 1831 both seek to expand the Penal Code to cover the types of conduct at issue in San Francisco City Attorney David Chiu's lawsuit, in particular the creation and dissemination of AI-generated nude images of real women and girls without their consent. Both bills are pending.

Chiu did not respond to queries about any stealthy aims behind his litigation.

Since its passage in 1996, opinions from California judges have been influential in interpreting and expanding Section 230's broad protections. Rebecca Delfino, a Loyola Law School professor, said Chiu could be attempting to develop new case law to help pierce the Section 230 shield. People of the State of California et al. v. Sol Ecom, Inc. et al., CGC24617237 (S.F. Super. Ct. filed Aug. 15).

Delfino said that a defendant who appears in the lawsuit could remove the case to federal court and claim immunity under Section 230. "I was waiting for this to happen. . . . Section 230, that's one of the major stumbling blocks to bringing these sorts of claims," she said.

Preparing a stealth attack on Section 230 could explain why Chiu's lawsuit named relatively unknown companies with a limited presence in California, rather than "big actors who might come in with big law firms," she speculated.

"Rather than going after the Googles and the other internet service providers who also have a lot of this content on their websites, this might be one way to test the water and test the law," she said.

Delfino acknowledged the growing push by government officials to create "express or side door exceptions" to Section 230, such as the "Take It Down Act" proposed by U.S. Senator Ted Cruz, R-Texas, on June 18, to increase avenues for the creators of these "deep nudes" to be held liable both criminally and civilly.

David Hoppe, founder and managing partner of the Web3 and media technology firm Gamma Law, credited Chiu's ability to work around the current lack of legislation and bring this suit but said there was a need for federal intervention on the issue.

"While California was a leader in enacting a deepfakes law in 2019 providing for civil (but not criminal) liability, current law has not kept up with the development of AI. Mr. Chiu deserves credit for using the limited tools available by constructing a civil action based on the Business & Professions Code. A pending bill in the California Assembly would help close this gap if passed by the end of the current session, but this should be addressed comprehensively at the federal level."

Hoppe, whose firm has previously worked for clients that create AI-generated adult content, said he doesn't believe that Section 230 could prevent Chiu from pursuing the case.

"To my understanding, Section 230 would not apply, or there's a good case it would not apply, given that the website, rather, in this case, is acting as a publisher by really participating very much in the production of these images, by providing the tools for that," he said.

In addition to the large swath of proposed and enacted legislation seeking to curb internet service providers' ability to escape liability under Section 230, federal law enforcement officials have been critical of the law.

In response to a May 2020 executive order from former President Donald Trump on "Preventing Online Censorship," the U.S. Department of Justice published a review of Section 230 that critiqued the law for the lack of civil and criminal tools available to victims of online content such as AI-generated explicit content targeting children.

"Criminals and other wrongdoers are increasingly turning to online platforms to engage in a host of unlawful activities, including child sexual exploitation, selling illicit drugs, cyberstalking, human trafficking, and terrorism. At the same time, courts have interpreted the scope of Section 230 immunity very broadly, diverging from its original purpose," the DOJ wrote. "This expansive statutory interpretation, combined with technological developments, has reduced the incentives of online platforms to address illicit activity on their services and, at the same time, left them free to moderate lawful content without transparency or accountability."
