Undressing the Law: Chicago-Kent Professor Breaks Down Grok Photo Scandal
“From a criminal law standpoint, AI-generated nude images might run afoul of the recently enacted Take It Down Act, which criminalizes the publication of certain digitally created ‘intimate visual depictions’ of nonconsenting individuals,” says Chicago-Kent College of Law Professor Alexandra F. L. Yelderman.
Artificial intelligence-generated sexual imagery began flooding the social media platform X (formerly known as Twitter) in December 2025. The images were created by X owner Elon Musk’s AI chatbot Grok. The chatbot was responding to user requests to alter photos of real people to remove their clothing, pose them in suggestive ways, or dress them in revealing garb.
While the European Union and other international entities are launching probes, Yelderman says the legal situation in the United States involves considerable gray area.
“The Take It Down Act forbids the nonconsensual publication of digitally altered nudes of identifiable people,” she says. “It would seem that merely suggestive material—or bikini pictures—wouldn’t count. While the act criminalizes the knowing publication of certain images, it doesn’t appear to affect people who use AI to ‘undress’ adults for their own personal enjoyment.”
However, Yelderman says that images of minors are a completely different story.
“Any pornographic depiction of an actual, identifiable minor, whether real or morphed, is illegal,” says Yelderman, noting that the images don’t have to be published to run afoul of the law. “However, depictions of minors in bikinis—as Grok was alleged to create—might not qualify as child pornography under current case law.”
Even if a court were to rule that images created by Grok are illegal, the question of who bears liability for the violation would still have to be decided.
“Grok’s own legal liability is an open question. The platform on which the images were published would be immune from civil liability under Section 230 of the Communications Decency Act,” says Yelderman. “But creators are another story, and people whose likenesses are digitally altered to appear nude might be able to sue them in state court under defamation/false light or ‘appropriation of likeness’ statutes.”
As for AI-generated images that are completely fabricated or that do not resemble any real person, a different legal framework applies.
“When AI generates realistic sexual images from scratch, the law’s power is much more limited,” Yelderman says. “It’s obscenity doctrine that defines the contours of what’s legally protected in those cases. Even the most offensive digitally created sexual depictions—including those of generated children—cannot be outlawed if they have serious literary, artistic, political, or scientific value. Once actual people are no longer in the equation, it becomes much harder to outlaw prurient material.”