On Monday, President Donald Trump is slated to sign the bipartisan Take It Down Act, which criminalizes the distribution of nonconsensual intimate imagery, including AI-generated deepfakes and so-called “revenge porn.”
In addition to creating federal-level criminal consequences, the new law will require online platforms to implement a request-and-removal system where victims of image-based sexual abuse can have their pictures taken down within 48 hours.
The House and Senate sent the measure to the president’s desk with nearly unanimous approval. Sen. Ted Cruz, the Texas Republican who chairs the Senate Commerce Committee, championed the bill after one of his teenage constituents was victimized by nonconsensual deepfakes. It is the main piece of legislation First Lady Melania Trump has publicly backed since her husband returned to office. Democratic Sen. Amy Klobuchar of Minnesota, Republican Rep. María Elvira Salazar of Florida and Democratic Rep. Madeleine Dean of Pennsylvania led the bipartisan push for the bill.
Teenage activists Elliston Berry and Francesca Mani, both victims of nonconsensual deepfakes, appeared at a roundtable hosted by the first lady to advocate for the bill on March 3.
With the Take It Down Act imminently becoming law, The 19th explains how it will be implemented and how to request the takedown of explicit images created or shared without permission.
When does the Take It Down Act go into effect?
The criminal provisions of the law will go into effect immediately. Online platforms have one year from the signing of the bill to create a process that facilitates the removal of nonconsensual intimate images.
What kinds of images are covered by the Take It Down Act?
The law is meant to narrowly cover sexually explicit images that have been shared without the subject’s consent, as well as images that have been created without the subject’s consent, such as through generative AI or other media manipulation.
The definition of sexually explicit content builds on previous federal code, and includes “the uncovered genitals, pubic area, anus or post-pubescent female nipple of an identifiable individual;” the “display or transfer of bodily sexual fluids;” graphic sexual intercourse; bestiality; masturbation; sadistic or masochistic abuse; and “graphic or simulated lascivious exhibition” of anuses, genitals or the pubic area.
Synthetic images, or “digital forgeries” as the bill labels them, must be “indistinguishable from an authentic visual depiction” of an identifiable individual.
The definition is tailored so that it does not apply to all digital forgeries, even ones that might seem explicit on the surface. For instance, advocates have pointed out that the viral fake video of Trump licking Elon Musk’s feet wouldn’t necessarily qualify for removal or prosecution under the Take It Down Act.
How can I get images removed in the meantime?
The Cyber Civil Rights Initiative, a nonprofit fighting online abuse, operates a free 24/7 hotline to assist victims of image-based sexual abuse at 1-844-878-2274. Their Safety Center also maintains a list of experienced attorneys and individual state laws. As of May 2025, all states and the District of Columbia have some form of law banning image-based sexual abuse, though the circumstances and types of media covered vary significantly.
The first step is to make sure you are physically safe, as image-based sexual abuse is frequently an escalation of intimate partner violence. Help is available through the National Domestic Violence Hotline online or through their free 24/7 phone line at 1-800-799-7233.
CCRI recommends documenting evidence of abuse, including screenshots or PDFs of online search results or websites. They suggest keeping both digital and physical copies (e.g. printouts) of evidence, and saving other relevant documents like text messages or emails.
Major websites and image-hosting platforms have ways to request image takedowns for emergencies such as these. CCRI has a short list of contacts for major search engines, social media platforms, dating services and pornography sites.
If images you took of yourself have been posted without your consent, filing a copyright claim is another option. Without My Consent, now part of CCRI, has a resource all about filing Digital Millennium Copyright Act takedown requests.
What if the images are of a minor?
Explicit images of minors are treated differently, as real images are considered child sexual abuse material — formally known in most legal codes as child pornography — and carry severe penalties. Take It Down criminalizes fake images of child sexual abuse material at the federal level; most states only have laws addressing real media.
There are specific, dedicated resources to assist victims of child sexual abuse material. The National Center for Missing and Exploited Children, which is authorized and funded by Congress, maintains a CyberTipline to report incidents of child exploitation. Anyone can report to the tipline, and the organization also maintains a 24/7 hotline at 1-800-843-5678.
Take It Down, an initiative from NCMEC that is unaffiliated with the bill of the same name, provides services to anyone who needs assistance removing child sexual abuse material of themselves from the internet.
What might prevent the Take It Down Act from being implemented?
The law may well be challenged in court on grounds that it infringes on the First Amendment. Digital rights groups have criticized the request-and-removal provision of the bill, calling it overbroad and a threat to free speech.
The Electronic Frontier Foundation, a digital rights nonprofit, argues that the takedown provision could apply to consensual sexual images as well. It is one of several groups that have highlighted how the request-and-removal process could be ripe for abuse, especially since the 48-hour timeline may not give covered platforms enough time to verify that the content is nonconsensual.
The Center for Democracy and Technology sent a letter to the House Energy and Commerce Committee, urging members to amend the bill to explicitly exclude encrypted services, fearing that platforms will need access to private messages in order to comply with the takedown requests.
Trump’s assertion that he would use the Take It Down Act for himself — “nobody is treated worse online than I am, nobody,” he said during his address to a joint session of Congress in March — made activists worry that the law could be used to remove critical political speech, especially in the context of a wider crackdown by the current administration.
There may also be issues with enforcing the request-and-removal requirement, which falls to the Federal Trade Commission. An executive order threatens the agency’s independence, and Trump fired its two Democratic commissioners without cause, something the Supreme Court has previously ruled a president cannot do.