
Politics

AI enters Congress: Sexually explicit deepfakes target women lawmakers

A first-of-its-kind study highlights the stark gender disparity in AI-generated nonconsensual intimate images — and puts into focus the evolving risks for women in politics and public life.

Dozens of U.S. Congress members have had their likenesses used in nonconsensual intimate imagery, otherwise known as deepfake porn. The majority of those impacted are women. (Patrick Semansky/AP Photo)

By Barbara Rodriguez and Jasmine Mithani

Published December 11, 2024, at 5:00 a.m.


More than two dozen members of Congress have been the victims of sexually explicit deepfakes — and an overwhelming majority of those impacted are women, according to a new study that spotlights the stark gender disparity in this technology and the evolving risks for women’s participation in politics and other forms of civic engagement.

The American Sunlight Project (ASP), a think tank that researches disinformation and advocates for policies that promote democracy, released findings on Wednesday identifying more than 35,000 mentions of nonconsensual intimate imagery (NCII) depicting 26 members of Congress — 25 women and one man — recently found on deepfake websites. Most of the imagery was quickly removed as researchers shared their findings with impacted members of Congress.

“We need to kind of reckon with this new environment and the fact that the internet has opened up so many of these harms that are disproportionately targeting women and marginalized communities,” said Nina Jankowicz, an online disinformation and harassment expert who founded The American Sunlight Project and is an author on the study.


Nonconsensual intimate imagery, also known colloquially as deepfake porn (though advocates prefer the former), can be created through generative AI or by overlaying headshots onto media of adult performers. There is currently limited policy to restrict its creation and spread.

ASP shared the first-of-its-kind findings exclusively with The 19th. The group collected data in part by developing a custom search engine to find members of the 118th Congress by first and last name, abbreviations or nicknames on 11 well-known deepfake sites. Neither party affiliation nor geographic location had an impact on the likelihood of being targeted for abuse, though younger members were more likely to be victimized. The largest factor was gender, with women members of Congress being 70 times more likely than men to be targeted.
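The search methodology can be illustrated with a minimal Python sketch of generating name-variant search terms and tallying mentions across collected pages. Everything in it (the variant rules, function names and sample data) is an assumption for illustration; none of it is drawn from ASP's actual tooling.

```python
# Hypothetical sketch of the name-variant search described above. The
# functions, variant rules and data are illustrative assumptions, not
# ASP's published methodology.

def name_variants(first: str, last: str, nicknames: list[str]) -> set[str]:
    """Build the search terms to check for one lawmaker."""
    variants = {
        f"{first} {last}",      # full name
        f"{first[0]}. {last}",  # abbreviated first name
    }
    variants.update(f"{nick} {last}" for nick in nicknames)
    return variants

def count_mentions(pages: list[str], terms: set[str]) -> int:
    """Count case-insensitive occurrences of any term across collected pages."""
    lowered = [page.lower() for page in pages]
    return sum(page.count(term.lower()) for page in lowered for term in terms)

# Example with a fictional lawmaker and stand-in page text:
pages = ["... deepfake of Jane Doe ...", "... new J. Doe video ..."]
print(count_mentions(pages, name_variants("Jane", "Doe", ["Janie"])))  # -> 2
```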

ASP did not release the names of the lawmakers depicted in the imagery to avoid encouraging searches. The group did contact the offices of everyone impacted to alert them and offer resources on online harms and mental health support. Authors of the study note that in the immediate aftermath, imagery targeting most of the members was entirely or almost entirely removed from the sites — a fact they're unable to explain. Researchers have noted that such removals do not prevent material from being shared or uploaded again. In some cases involving lawmakers, search result pages remained indexed on Google despite the content being largely or entirely removed.

“The removal may be coincidental. Regardless of what exactly led to removal of this content — whether ‘cease and desist’ letters, claims of copyright infringement, or other contact with the sites hosting deepfake abuse — it highlights a large disparity of privilege,” according to the study. “People, particularly women, who lack the resources afforded to Members of Congress, would be highly unlikely to achieve this rapid response from the creators and distributors of AI-generated NCII if they initiated a takedown request themselves.”

According to the study’s initial findings, nearly 16 percent of all the women who currently serve in Congress — or about 1 in 6 congresswomen — are the victims of AI-generated nonconsensual intimate imagery.
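That figure follows from simple division. As a quick check, assuming roughly 155 women served in the 118th Congress (an approximation for illustration, not a number from the study), the 25 women identified works out to about 16 percent:

```python
# Back-of-envelope check of the "nearly 16 percent" figure. The count of
# women serving in the 118th Congress is an assumed approximation, not a
# number from the study.
women_targeted = 25       # women identified in the ASP findings
women_in_congress = 155   # assumed approximate total
print(f"{women_targeted / women_in_congress:.1%}")  # -> 16.1%, about 1 in 6
```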

Nina Jankowicz, an online disinformation and harassment expert, is an author on a recent study that identified more than 35,000 mentions of nonconsensual intimate imagery (NCII) depicting 26 members of Congress. (Bastien Inzurralde/AFP/Getty Images)

Jankowicz has been the target of online harassment and threats for her domestic and international work dismantling disinformation. She has also spoken publicly about being the victim of deepfake abuse — a fact she found out through a Google Alert in 2023.

“You can be made to appear in these compromised, intimate situations without your consent, and those videos, even if you were to say, pursue a copyright claim against the original poster, as in my case, they proliferate around the internet without your control and without some sort of consequence for the people who are amplifying or creating deepfake porn,” she said. “That continues to be a risk for anybody who is in the public eye, who is participating in public discourse, but in particular for women and for women of color.”

Image-based sexual abuse can have devastating mental health effects on victims, who include everyday people with no involvement in politics — among them children. In the past year, there have been reports of high school girls being targeted for image-based sexual abuse in states like California, New Jersey and Pennsylvania. School officials' responses have varied, though the FBI has also issued a new warning that sharing such imagery of minors is illegal.

The full impact of deepfakes on society is still coming into focus, but research already shows that 41 percent of women between the ages of 18 and 29 self-censor to avoid online harassment.

“That is a hugely powerful threat to democracy and free speech, if we have almost half of the population silencing themselves because they’re scared of the harassment they could experience,” said Sophie Maddocks, research director at the Center for Media at Risk at the University of Pennsylvania.

There is no federal law that establishes criminal or civil penalties for someone who generates and distributes AI-generated nonconsensual intimate imagery. About a dozen states have enacted laws in recent years, though most include civil penalties, not criminal ones.

AI-generated nonconsensual intimate imagery also poses threats to national security by creating conditions for blackmail and geopolitical concessions. That could have ripple effects on policymakers, regardless of whether they are directly the target of the imagery.

“My hope here is that the members are pushed into action when they recognize not only that it’s affecting American women, but it’s affecting them,” Jankowicz said. “It’s affecting their own colleagues. And this is happening simply because they are in the public eye.”

Image-based sexual abuse is a unique risk for women running for office. Susanna Gibson narrowly lost her competitive legislative race after a Republican operative shared nonconsensual recordings of sexually explicit livestreams featuring the Virginia Democrat and her husband with The Washington Post. In the months after her loss, Gibson told The 19th she heard from young women discouraged from running for office out of fear of intimate images being used to harass them. Gibson has since started a nonprofit dedicated to fighting image-based sexual abuse and an accompanying political action committee to support women candidates against violations of intimate privacy.


Maddocks has studied how women who speak out in public are more likely to experience digital sexual violence.

“We have this much longer, ‘women should be seen and not heard’ pattern that makes me think about Mary Beard’s writing and research on this idea that womanhood is antithetical to public speech. So when women speak publicly, it’s almost like, ‘OK. Time to shame them. Time to strip them. Time to get them back in the house. Time to shame them into silence.’ And that silencing and that shaming motivation … we have to understand that in order to understand how this harm is manifesting as it relates to congresswomen.” 

ASP is encouraging Congress to pass federal legislation. The Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, also known as the DEFIANCE Act, would allow people to sue anyone who creates, shares or receives such imagery. The Take It Down Act would include criminal liability for such activity and require tech companies to take down deepfakes. Both bills have passed the Senate with bipartisan support, but in the House they must navigate concerns around free speech and harm definitions, typical hurdles to tech policy.

“It would be a dereliction of duty for Congress to let this session lapse without passing at least one of these bills,” Jankowicz said. “It is one of the ways that the harm of artificial intelligence is actually being felt by real Americans right now. It’s not a future harm. It’s not something that we have to imagine.”

In the absence of congressional action, the White House has collaborated with the private sector to conceive creative solutions to curb image-based sexual abuse. But critics aren’t optimistic about Big Tech’s ability to regulate itself, given the history of harm caused by its platforms.

“It is so easy for perpetrators to create this content, and the signal is not just to the individual woman being targeted,” Jankowicz said. “It’s to women everywhere, saying, ‘If you take this step, if you raise your voice, this is a consequence that you might have to deal with.’”

If you have been a victim of image-based sexual abuse, the Cyber Civil Rights Initiative maintains a list of legal resources.
