Politics

With AI sexual abuse on the rise, the White House is tapping Big Tech for support

The call to action comes as the issue has intensified in recent years, affecting students to public figures like Taylor Swift and AOC.

Conceptual illustration of a human head overlaid with binary digits.
Stopping image-based sexual abuse, real or AI-generated, requires action from both the government and tech companies, says the White House. (Getty Images)

Nadra Nittle

Education reporter

Published August 2, 2024, 7:00 a.m.



“This is an issue that affects everybody — from celebrities to high school girls.”

That’s how Jen Klein, director of the White House Gender Policy Council, describes the pervasiveness of image-based sexual abuse, a problem that artificial intelligence (AI) has intensified in recent years, touching everyone from students to public figures like Taylor Swift and Rep. Alexandria Ocasio-Cortez.


In May, the Biden-Harris administration announced a call to action to curb such abuse, which disproportionately targets girls, women and LGBTQ+ people. Stopping these images, whether real or AI-generated, from being circulated and monetized requires action not just from the government but from tech companies as well, according to the White House.

“We’re inviting technology companies and civil society to consider what steps they can take to prevent image-based sexual abuse, and there’s really a spectrum of actors who we hope will get involved in addressing the problem,” Klein said. “So that can be anything from the payment processors, to mobile app stores, to mobile app and operating system developers, cloud providers, search engines, etc. They all have a particular part of the sort of ecosystem in which this problem happens.”

Responding to the White House’s call to action, the Center for Democracy & Technology, the Cyber Civil Rights Initiative and the National Network to End Domestic Violence announced in June that they would form a working group to counteract the circulation and monetization of image-based sexual abuse. In late July, Meta, owner of Facebook and Instagram, removed 63,000 accounts linked to the “sextortion” of children and teens.  

While older forms of this abuse include the leaking of intimate photos without the consent of all parties, the AI version includes face swapping, whereby the head of one individual is placed on another person’s naked body, Klein said. Both Swift and Ocasio-Cortez have been victims of this kind of sexual abuse. In March, Ocasio-Cortez introduced the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act of 2024. The legislation provides recourse for people, more than 90 percent of whom are women, who have had their likenesses used in intimate “digital forgery.” The Senate passed the DEFIANCE Act on July 23.


Such images have also garnered repeated headlines this year after spreading at schools. The White House’s appeal to tech companies follows the Biden-Harris administration’s recent updates to Title IX, the law that bars educational institutions that receive federal funds from engaging in sex discrimination. Under the new regulations that took effect Thursday, sex-based harassment includes sexually explicit deepfake images if they create a hostile school environment. 

The National Women’s Law Center is one of 37 organizations applauding this development in a letter sent Monday to the Department of Education by the Sexual Violence Prevention Association (SVPA). The coalition of groups represented by SVPA expressed concern, however, that many school administrators don’t know about image-based sexual abuse or how to address it. 

“We respectfully urge the Department of Education to issue guidance delineating Title IX procedures and protocols specifically tailored to addressing digital sexual harassment within educational institutions,” the letter states. “This guidance should provide clear direction on how schools can effectively handle cases of digital sexual harassment including support mechanisms for victims, investigation procedures, research and referrals, and prevention strategies.”

The Biden-Harris administration’s effort to prevent the proliferation of explicit deepfake images coincides with states taking action.

“There’s a patchwork of laws across the country, and there are 20 states that have passed laws penalizing the dissemination of nonconsensual AI-generated pornographic material,” Klein said. “But there’s a lot of work to be done, both at the state level and at the federal level to really make that work a whole quilt to continue the process.”

One state lawmaker who’s been concerned about deepfakes for years is California Assemblyman Marc Berman. A 2018 AI-generated video of former President Barack Obama, created by comedian and film director Jordan Peele, alarmed him because he felt that bad actors could use digitally manipulated videos to influence political races. The next year, Berman authored legislation to regulate the use of deepfake technology involving political candidates around election time. 

“It was pretty tricky because of the various First Amendment arguments that get raised,” he said. “The bill, to be honest, got watered down more than I wanted as it went through the process. But it has since been copied in other states, and then frankly, made stronger in other states.”

In May, Berman announced that similar legislation he’d introduced to prevent deepfakes from interfering with elections had advanced in California’s assembly. During the current legislative session, he introduced multiple bills related to digital forgery and artificial intelligence. AB 1831 seeks to prohibit child sex abuse deepfakes, while AB 2876 would require the state’s Instructional Quality Commission to consider incorporating AI literacy content into state mathematics, science, and history-social science curriculum standards when they’re up for revision next year.

Berman decided to file legislation to prohibit child sex abuse deepfakes when the California District Attorneys Association informed his office that they’re increasingly catching people who are creating, disseminating or possessing such images. 

“Their interpretation of California law currently is that it is not specifically illegal, because it doesn’t involve an image of an actual child — because AI takes thousands of images of real children and then spits out this artificial image,” Berman said. “So they said, ‘We need to close this loophole in California law and make sure that the law explicitly states that child sexual abuse material, even if it’s created by artificial intelligence, is illegal.’ I was shocked that people were even using AI to create this type of content, and then I found out just how pervasive it is, especially on the dark web. It’s terrifying.”

Possessing or distributing such images online may lead perpetrators to sexually exploit minors offline, Berman said, making it all the more important to address AI-generated versions of this content before the problem spirals out of control for the nation’s young people.

Multiple schools in California have been rocked by deepfake scandals, often related to images created by students of their peers. In March, a Calabasas High School student accused her onetime friend of disseminating actual and AI-generated nudes of her to their peers. That same month, a Beverly Hills middle school expelled five students for allegedly circulating AI-generated nudes of their classmates. 

Such incidents are one reason Berman believes students need to be taught to use AI responsibly. “AB 2876 will equip students with the skills and the training that they need to both harness the benefits of AI, but also to mitigate the dangers and the ethical considerations of using artificial intelligence,” he said. 

The legislation has been ordered to a third reading, the bill’s final phase before it leaves the state Assembly and moves to the state Senate. Meanwhile, his bill to prohibit child sex abuse deepfakes, AB 1831, has been referred to the suspense file, meaning that its potential fiscal impact on the state is being reviewed. The legislation would take effect January 1 if enacted.

“It’d be great if Congress can pass some federal standards on this,” Berman said. “It’s always an ideal when it comes to legislation that really applies to every state and to kids in every state.”

Pending national legislation addressing the issue includes the SHIELD Act and the Kids Online Safety and Privacy Act (KOSA), which the Senate passed July 30; it still awaits a vote in the House of Representatives. The former would make the nonconsensual sharing of intimate images a federal offense, while the latter would require social media companies to take steps to prevent children and teens from being sexually exploited online, among other measures. KOSA, however, has sparked fears that lawmakers could use it to censor content they dislike, particularly LGBTQ+ content, under the guise of protecting children. Civil liberties groups like the ACLU say the bill raises privacy concerns, may limit young people’s access to important online resources and could silence needed conversations.

Evan Greer, director at Fight for the Future, a nonprofit advocacy group focused on digital rights, objected to KOSA’s Senate passage in a statement. “We need legislation that addresses the harm of Big Tech and still lets young people fight for the type of world that they actually want to grow up in,” she said. 

AI-generated image-based sexual abuse also affects college students, according to Tracey Vitchers, executive director of It’s On Us, a nonprofit that addresses college sexual assault. She called it an emerging issue on college campuses.

“It really started with the emergence of nonconsensual image-sharing involving an individual sharing a private photo with someone that they thought they could trust,” she said. “We are now starting to see this challenge come forward with AI and deepfakes, and unfortunately, many schools are not equipped to investigate gender-based harassment and violence that occurs as a result of deepfakes.”

Vitchers appreciates that the new Title IX regulations touch on the issue, but said that colleges need more guidance from the Department of Education about how to respond to these incidents, and students need more prevention education.

“It’s something that we have begun discussing with some of our partners, particularly those in the online dating space,” Vitchers said. “We are hearing that fear, among particularly young women on campus, about someone who can just take a picture of you from Instagram and use AI to superimpose it onto porn. Then it gets circulated and it feels impossible to get it removed from the internet.”

Some tech companies have already offered their support to the White House’s effort to stop image-based sexual abuse, Klein said, but she would like to hear from others. Although state and national lawmakers are working to enact legislation and regulations, Klein said that the Biden-Harris administration is calling on tech companies to intervene because they can take action now. 

“Given the scale that image-based abuse has been rapidly proliferating with the advent of generative AI, we need to do this while we continue to work toward longer-term solutions,” she said.

The 19th is a reader-supported nonprofit news organization. Our stories are free to republish with these guidelines.