For many years, credit card companies and other payment providers were aggressive about policing child sexual abuse material. Then, Elon Musk’s Grok started undressing children on X.

The Center for Countering Digital Hate found 101 sexualized images of children in a sample of 20,000 images made by Grok from December 29th to January 8th. Extrapolating from that sample, the group estimated that 23,000 sexualized images of children had been produced over that 11-day period, or one every 41 seconds on average. Not all of the sexualized images Grok has produced appear to be illegal, but reports indicate at least some likely cross the line.

There is tremendous confusion about what is true on Grok at any given moment. Grok itself has given misleading answers, at one point claiming, for instance, that image generation had been restricted to paying X subscribers even as free users could still access it directly on X. Though Musk has claimed that new guardrails prevent Grok from undressing people, our testing showed that isn’t necessarily true. Using a free Grok account, The Verge was able to generate deepfake images of real people in skimpy clothing and sexually suggestive positions after the new rules were supposedly in effect. As of this writing, some egregious prompts appear to have been blocked, but people are remarkably clever at getting around rules-based bans.

X does seem to have at least partially restricted Grok’s image editing features to paid subscribers, however, which makes it very likely that money is actually changing hands for at least some of these objectionable images. You can purchase a subscription to X via Stripe or through the Apple and Google app stores using your credit card. Musk has also suggested through his posts that he doesn’t think undressing people is a problem. And this isn’t X’s first brush with AI porn: the platform has repeatedly had trouble moderating nude deepfakes of Taylor Swift, whether or not they were generated by Grok.

In the past, payment providers have been aggressive about cutting access to websites thought to have a significant presence of CSAM — or even legal, consensually produced sexual content. In 2020, Mastercard and Visa banned Pornhub after a New York Times article noted the prevalence of CSAM on the platform. In May 2025, Civitai was cut off by its credit card processor because “they do not wish to support platforms that allow AI-generated explicit content,” Civitai CEO Justin Maier told 404 Media. In July 2025, payment processors pressured Valve into removing adult games.

In fact, financial institutions have at times threatened people and platforms apparently just to avoid reputational risk. In 2014, adult performer Eden Alexander’s fundraiser for a hospital stay was shut down by payments company WePay because of a retweet. Also in 2014, JPMorgan Chase abruptly shut down several porn stars’ bank accounts. In 2021, OnlyFans briefly tried to ban sexually explicit content because banks didn’t like it. (Widespread backlash quickly made OnlyFans reverse itself.) All of this was legal, consensual sexual content, and it was still deemed too hot to handle.

But Musk’s boutique revenge porn and CSAM generator is, apparently, just fine.

It’s a striking reversal. “The industry is no longer willing to self-regulate for something as universally agreed on as the most abhorrent thing out there,” says Lana Swartz, author of New Money: How Payment Became Social Media, referring to CSAM and to the inaction of Stripe and the credit card companies.

Visa, Mastercard, American Express, Stripe, and Discover did not return requests for comment. The US Financial Coalition Against Child Sexual Exploitation — an industry group composed of payments processors, banks, and credit card companies — also did not return a request for comment. On its website, FCACSE brags that “As a result of its efforts, the use of credit cards to purchase child sexual abuse content online has been virtually eliminated globally.”

In the past, “people who did completely legal stuff were cut off from banks,” notes Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence. There are incentives to overenforce boundaries around questionable images — and traditionally, that’s what the financial industry has done. So why is X different? It’s run by Elon Musk. “He’s the richest man in the world, he has close ties to the US government, and he’s incredibly litigious,” says Pfefferkorn. In fact, Musk has previously filed suit against the Center for Countering Digital Hate; in a now-dismissed lawsuit, he claimed it illegally collected data showing an increase in hate speech after he bought the platform formerly known as Twitter.

Sexualized images of children are not the only problem with Grok’s image generation. The New York Times estimated that 1.8 million of the images the AI generated over a nine-day period, or about 44 percent of posts, were sexualized images of adult women, which, depending on how explicit they are, can also be illegal to spread. Using different tools, the Center for Countering Digital Hate estimated that more than half of Grok’s images contained sexualized imagery of men, women, and children.

The explosion of sexualized images took place after Musk posted an AI-edited image of himself in a bikini on December 31st. A week later, X’s head of product, Nikita Bier, posted that the previous four days had been the highest-engagement days in X’s history.

Lawyer Carrie Goldberg, whose history includes challenging Section 230 in a stalking lawsuit against Grindr and another suit that ultimately shut down chat client Omegle, is representing Ashley St. Clair, the mother of one of Musk’s children, in a case against X. St. Clair is one of many women Grok undressed — and now she’s suing the platform, arguing that X has created a public nuisance. “In the St. Clair case we are only focused on xAI and Grok because they are so directly liable from our perspective,” she said in an email. “But I could envision other sources of liability.” She specifically cited distributors like Apple and Google’s app stores as areas of interest.

There are other potential legal wrinkles. In 2022, Visa was sued for providing payment services to Pornhub because it allegedly knew the site wasn’t adequately moderating CSAM. Other lawsuits followed. The judge in the Visa case rejected Pornhub’s argument that Section 230 shielded it from liability, but he also tentatively dismissed the claims against Visa in 2025, though the woman who filed suit could file an amended complaint.

“A lot of this could end up in court, and it’s going to be up to judges to make decisions about what’s ‘sexually explicit,’” says David Evan Harris, a public scholar at the University of California, Berkeley. Still, 45 states have criminalized AI-generated CSAM. The federal Take It Down Act criminalizes deepfake nudes. The state of California has issued a cease and desist to Musk and X, after announcing an investigation into Grok’s images. Grok may be violating California’s deepfake porn ban — and California is just one of at least 23 states that have passed such laws.

That should matter to payment processors, because if they are knowingly transmitting money that’s the proceeds of a crime, they are engaged in money laundering — which can have serious consequences. The office of California Attorney General Rob Bonta declined to comment on whether Stripe, credit cards, or the app stores were also part of the Grok probe, citing an ongoing investigation. Money laundering laws are part of the reason financial institutions have been so leery of any website that’s been accused of containing CSAM.

But X has created a situation where payment processors are hugely disincentivized to take the law seriously. That’s because any state that files suit against processors over X is likely to be attacked by Musk for “censoring” X’s right-wing base. Plus, Musk — and possibly his buddy, US President Donald Trump — could throw a lot of resources behind getting payment processors off the hook.

It seems that when it comes to CSAM and deepfakes, the financial industry is no longer willing to regulate itself. So, then, who will regulate it?
