It’s internet safety law season again. After a narrow failure to pass the Kids Online Safety Act in 2024, Congress is now advancing the Take It Down Act, which criminalizes publishing nonconsensual intimate imagery (NCII, once dubbed “revenge porn”), including AI-generated content, and sets requirements for web platforms to remove it. The bill has gained support from First Lady Melania Trump, and President Donald Trump touted it during his joint address to Congress on March 4th, promising he would sign it. In a normal world, this could be a positive step toward solving the real problem of NCII, a problem that AI is making worse.
But we are not in a normal world. Parts of the Take It Down Act are more likely to become a sword for a corrupt presidential administration than a shield to protect NCII victims — and supporters of both civil liberties and Big Tech accountability should recognize it.
The typical discourse around a bill like the Take It Down Act works this way: lawmakers propose a rule that’s supposed to do a good and popular thing, like help victims of nonconsensual sexual images get those images taken down. Civil liberties advocates go “wait a minute, this has a lot of bad side effects!” Then everybody argues. Is it okay to risk platforms removing legally protected speech if they’re removing lots of bad stuff alongside it? Is protecting the right to private encrypted messaging worth the harm of people secretly transmitting harmful content? Does the bill’s language make one set of outcomes more likely than the other, and is there better language that would tip the scales?
These arguments miss the larger current context. No matter how carefully crafted the Take It Down Act is, it won’t be signed by a president who intends to follow it in good faith. It will be selectively enforced by an administration that consistently treats laws as bargaining chips or ammunition, using them to attack political enemies while exempting anyone who earns Trump’s favor. Right now that happens to include several of the internet’s biggest social media companies, and by extension, some of the biggest potential conduits of NCII.
To put it more simply: even if you accept the Take It Down Act’s tradeoffs in the name of making tech companies protect users, in the era of gangster tech regulation, you’re probably not getting the trade.
Even if you accept the tradeoffs, you’re probably not getting the trade
The problem here is an issue that, yes, I understand we’re all tired of thinking about: the constitutional crisis. In 2025, the Republican-led Congress has willingly forfeited its status as a serious branch of government. Lawmakers failed to act when President Donald Trump and shadow president Elon Musk began flouting congressional directives, circumventing legal processes, and gutting agencies that the legislature established and funded.
This is bad for direct reasons (Ebola prevention efforts were among the casualties of those cuts), of course, but it also means something more fundamental. Congress can no longer meaningfully claim that what it passes are laws. What it makes now are weapons: rules Trump and other parties use only against people they don’t like.
If this sounds paranoid, let’s look at a few legal processes that Trump and his administration have abused over the past several weeks:
- FCC chairman Brendan Carr used competition oversight rules and regulations on broadcast TV speech to meddle in the editorial decisions of major television networks.
- Trump extracted a blatant $25 million bribe from a social media platform via a bogus lawsuit over the platform exercising its First Amendment right to moderate.
- An executive order told app stores to disregard a law passed with bipartisan support, all to reward a social media platform that Trump believes helped get him elected.
- The Department of Justice declared it would drop a fairly credible corruption case against New York’s mayor in exchange for his support enforcing mass deportations.
Expand the circle of Trump associates and I could go on. Elon Musk using a pet judge and a twisted version of quasi-defamation law to immiserate nonprofits for reporting on X’s white supremacist content. A DC attorney threatening to prosecute Wired staff for reporting on the Department of Government Efficiency (DOGE). The agency designed to enforce laws around online scams dropping lawsuits against allegedly predatory lenders and maybe shutting down.
It’s eminently clear that the Trump administration and its supporters have little interest in consistently applying Congress’ laws, and given that, it’s eminently unclear what any lawmaker acting in good faith expects to happen if they pass new ones.
Let’s look at the Take It Down Act through that lens. In addition to criminalizing the nonconsensual publication or disclosure of real and simulated “intimate visual depictions” of an identifiable person, the law tasks the Federal Trade Commission (FTC) with ensuring that web platforms establish a process for removing these depictions within 48 hours of a valid request, at the risk of violating unfair trade practices law.
This part has proven particularly controversial, and it would normally pose complicated questions. As the Electronic Frontier Foundation explains, “although this provision is designed to allow NCII victims to remove this harmful content, its broad definitions and lack of safeguards will likely lead to people misusing the notice-and-takedown system to remove lawful speech.” How much collateral damage, we might typically ask, should we accept to protect people from abuse?
But if any company that woos Trump can avoid following the law, the collateral damage answer is irrelevant — they’ve got no legal reason to do the “protecting people from abuse” part.
In what world does a platform like X have to follow these rules?
Here’s a thought experiment: Elon Musk owns the social media platform X. X is a known spreader of nonconsensual intimate imagery; in 2024, it was used to publicize graphic AI-generated images of Taylor Swift. The company cut large parts of its moderation team under Musk, leaving it with few resources to build and staff the kind of removal system the Take It Down Act requires. But Musk also heads the powerful government pseudo-agency DOGE. DOGE apparently has near-unlimited power to cut other agencies’ funding, and it may have already begun terminations at the FTC.
If X utterly ignores the Take It Down Act, in what world does the FTC step in to stop it?
The same question goes for at least two other social media giants: Meta and TikTok. Meta CEO Mark Zuckerberg has temporarily earned Trump’s approval by doing things like repudiating fact-checking and throwing women and trans people under the Facebook moderation bus. The US government is supposedly going to own half of TikTok. If these companies end up improving their takedown systems, it will be because they’re facing public pressure or what amounts to a Trump extortion opportunity, not fairly applied legal consequences. And maybe you’re thinking that Trump’s strong personal objections to the abuse of women (the vast majority of NCII victims) mean he won’t tolerate it even among his allies. But let’s be real: you’re probably not.
Amid Trump’s intensely fought culture wars, even smaller platforms with unambiguously awful content might get away scot-free. All they have to do is position themselves as victims of “cancel culture” or “wokeness,” and the administration may look the other way. Once you’ve pressured another country’s government to release two men a Republican attorney general calls “publicly admitted” sex traffickers, there’s clearly not much you can’t excuse.
Meanwhile, small services with no such guarantees could operate under a cloud of uncertainty about the risks that groups like the EFF have pointed out.
Is the Take It Down Act a good bill? It doesn’t matter. Its sponsors Ted Cruz (R-TX) and Amy Klobuchar (D-MN) could have the most immaculately crafted legislative text ever written (in case it’s not clear, they don’t), and it would not change the fact that Congress has accepted that it’s passing suggestions, not laws. (Cruz and Klobuchar’s offices didn’t respond by publication time to questions about how the FTC might enforce the law against X.)
Some parts of the Take It Down Act could still probably be enforced in beneficial ways, even if they continue to raise First Amendment questions. Advocates have been pressing for federal laws addressing NCII purveyors for years, and law enforcement could use this one to crack down on some of the people making and intentionally spreading this content. Powerful, high-profile individual offenders who rally political support might still skate, but you’d stand a chance of winning other cases. This is all tradeoff territory.
Given the issues above, though, there’s no reason to believe the Take It Down Act will make most big social companies take NCII more seriously. Instead, the law’s platform provisions could be worse than useless. Government-mandated takedown systems are, in general, easily abused by private bad actors; the clearest precedent is “copystrike” extortion and censorship, which grew out of the mandatory notice-and-takedown system for copyright infringement.
When laws are just bargaining chips, the people they’re supposed to protect are left behind
More specifically, conservatives have signaled an interest in undercutting supposedly “liberal” platforms; Wikipedia in particular is frequently attacked by Musk and has been targeted by the Heritage Foundation. The Take It Down Act covers online platforms (with the exception of email and a few other carveouts) that “primarily [provide] a forum for user-generated content,” and while Wikipedia isn’t typically in the business of publishing nonconsensual nudes, it seems plausibly covered by some interpretations of the law. The FTC would probably have no compunctions about launching a punitive investigation if trolls started spamming it with deepfakes.
The Take It Down Act’s critics also argue that it could add liability for end-to-end encryption. Those provisions could incentivize companies like Apple — which remains one of the less Trumpy tech players — to remove data protection features, especially amid anti-encryption pressure overseas. It would be an issue under any presidency, but in one that’s blatantly contemptuous of data privacy, the danger is even more pressing.
And once more, even if you think removing encryption is good, there’s just as good a chance that Trump simply uses the liability threat as leverage: if Tim Cook invests enough money in US factories, iMessage end-to-end security stays on.
In a state of total presidential control, all these risks might be moot, because the administration wouldn’t need any excuse at all to target its enemies. But we’ve still got a court system that’s putting some (limited and tenuous) checks on it, and a judge can throw out a total nonsense claim more easily than one dressed up in a real statute. Creating a new pretext could mean longer, costlier battles while courts interpret the statute. The Take It Down Act seems like the online safety law closest to passing, but for the near future, any bill that regulates internet content poses similar risks.
Civil libertarians often warn you to view regulations from the perspective of the worst person you can imagine interpreting them, and I think sometimes this comes off as overly cynical: if you assume every law will be twisted beyond recognition in bizarre ways, then at a certain point no law makes sense. But that whole argument belongs in a world where Congress cares whether the executive branch obeys its directives and the president isn’t giving constant indications that he’ll weaponize them.
For anyone who wants tech platforms to take a stronger stance on nonconsensual intimate imagery, many state-level lawmakers still seem seriously invested in the process of governance. (Nearly all states have some form of NCII ban, though these don’t typically address larger platform issues.) Soft power — protests, boycotts, exposés — can push companies to act, and indeed, several tech companies have introduced new NCII rules over the years. And if Congress decides it’s interested in becoming a real branch of government again, we can restart those endless conversations about safety versus privacy and speech. Until then, the Constitution is the only thing lawmakers seem capable of taking down.