The bipartisan Take It Down Act, which would create criminal penalties for distributing nonconsensual intimate images and require tech companies to remove such content, has a clear path to victory: It has passed the Senate unanimously twice, cleared the House Energy and Commerce Committee and is being championed by First Lady Melania Trump — and President Donald Trump has vowed to sign it.
The bill could tangibly change the lives of survivors who currently have no legal recourse to lean on when trying to get their images — which can range from consensual images distributed without their knowledge to AI-generated sexually explicit deepfakes — removed from the internet. Under the Take It Down Act, covered platforms would have to remove nonconsensual intimate images within 48 hours of a request.
Uniting child safety advocates, organizers against sexual violence and both sides of the aisle, the bill now awaits a full House vote. But House Democrats and victim advocates warn that recent actions by the Trump administration mean passage might not have the impact the bill's supporters are hoping for. The legislation puts the Federal Trade Commission in charge of enforcement, but the independent federal body has recently been weakened. Two Democratic commissioners have been removed by Trump, a move the Supreme Court has ruled presidents do not have the authority to make, and at least a dozen staff members were cut as part of Elon Musk's purge of probationary federal workers.
And so, even as advocates push for this federal legislation to move through Congress, they’re working with policymakers at the state level.
The cuts at the FTC will make it harder to enforce the report-and-remove mechanism that gives the Take It Down Act its name, said Omny Miranda Martone, the founder and CEO of the nonprofit Sexual Violence Prevention Association. Layoffs, partisanship and a general weakening of the agency will make it more difficult for the FTC to hold companies accountable to the take-down provisions. For many survivors and advocates, that's the most valuable part of the bill.
“From working with a lot of the young victims in particular, what we are often told is that the most important thing to them is the ability to get their images removed from the internet,” said Adam Billen, vice president of public policy at youth-led AI policy nonprofit Encode.
“Often, what happens with victims is that once an image is up on the internet, it quickly spreads to other social media platforms, for example, and so getting it removed from that initial platform is incredibly important,” Billen said. “Once it starts spreading to other platforms, often you’re playing whack-a-mole at that point, and it becomes incredibly, incredibly difficult to get your images off of every possible website and application.”
For example, research published last October found that the social media network X removed nonconsensual intimate imagery only when the request invoked federal copyright law.
The White House did not respond to a request for comment about how changes at the FTC could impact enforcement of the Take It Down Act. House Republicans have dismissed Democrats’ concerns about the FTC’s ability to enforce the bill.
In addition to changes at the FTC, there have been dramatic shifts in the priorities and staffing at the Department of Justice, which would be involved in the enforcement of the criminal provisions of the Take It Down Act.

As a result, Susanna Gibson, the founder of My Own Image, a newly minted nonprofit pushing for comprehensive policy on nonconsensual intimate image-sharing, worries that federal prosecutors won’t have the bandwidth to press criminal charges against perpetrators.
Often, nonconsensual sharing of intimate images occurs across state lines, according to Martone of the Sexual Violence Prevention Association. Federal policy, she said, ensures "victims aren't going to fall through the cracks."
At the same time, in many cases state laws are the easiest way for survivors to seek justice through the courts, Gibson said.
Forty-nine states and Washington, D.C., have laws banning the nonconsensual distribution of real intimate images (colloquially called "revenge porn," a term that can obscure the severity of the crime), and about half of them have laws that address synthetic nonconsensual intimate images. The laws vary widely, and some of those differences impede legal action.
Gibson is focusing her energy on strengthening laws at the state level, including in South Carolina, the only state without any sort of ban.
She has seen how narrowly tailored state laws can prevent justice for survivors. In her advocacy work, Gibson has met with numerous women who were unable to pursue cases because of exceptions in the law, such as requirements to prove intent to harm. She knows from personal experience: she has not been able to take action against the person who leaked nonconsensual recordings of her to a major newspaper. She said federal recourse wasn't an option either.
For these reasons, the model state policy that My Own Image crafted covers cases in which there was no intent to harm. The organization focuses on state policy because cases at the state level can progress more quickly than federal ones.
Digital civil liberties groups caution that the laws could be abused to remove lawful speech. Trump's vocal support of the Take It Down Act, and his promise to use it when his own likeness is spread online, has put them on edge. For instance, the take-down provision has no protections for material considered relevant to the public interest. That conflict surfaced earlier this year when Bluesky took down digital forgeries of Trump and Musk but later reinstated the posts, saying the media was part of a newsworthy event.
Advocates are continuing to push for additional laws that would provide avenues to justice for survivors of nonconsensual intimate imagery. The DEFIANCE Act, introduced last year by Rep. Alexandria Ocasio-Cortez, a New York Democrat, would create a civil right of action for survivors to sue those who create nonconsensual sexually explicit deepfakes of them.
“Our goal is really just to make sure that as many survivors as possible have as many options as possible to seek justice. And to prevent this before it happens by giving these options and making sure that people know there will be consequences and accountability if they do perpetrate harm,” Martone said.
A version of this article first appeared in Tech Policy Press.