Tennessee teens sue Elon Musk's xAI over AI-generated child sexual abuse material
Last edited Mon Mar 16, 2026, 10:16 PM
Source: NPR
Three Tennessee teenagers have filed a class action lawsuit against Elon Musk's artificial intelligence company, xAI, alleging its large language model powered an app that was used to make nonconsensual nude and sexually explicit images and videos of them when they were girls.
"Like a rag doll brought to life through the dark arts, this [AI-generated] child can be manipulated into any pose, however sick, however fetishized, however unlawful. To the viewer, the resulting video appears entirely real," reads the complaint. "For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse."
While the perpetrator didn't use xAI's chatbot Grok or the social media platform X (also owned by xAI) directly, the lawsuit claims, citing law enforcement, that the perpetrator relied on an unnamed app that used xAI's algorithm.
The plaintiffs accused xAI of deliberately licensing its technology to app makers, often outside the U.S. "In this way, xAI could attempt to outsource the liability of their incredibly dangerous tool," said the complaint.
-snip-
Read more: https://www.npr.org/2026/03/16/nx-s1-5749490/xai-elon-musk-sexualized-images
From Gizmodo's story on this lawsuit:
https://gizmodo.com/teens-sue-xai-over-sexualized-images-generated-by-grok-2000734137
When the police arrested the person responsible for making the images, they determined that he used Grok to create them. Grok was also used to generate non-consensual sexual images of people on Twitter, an estimated 23,000 photos that appeared to depict children in sexual situations, according to researchers who investigated the posts.
At the time those images were spreading on Twitter, xAI (and Twitter) CEO Elon Musk claimed, "I am not aware of any naked underage images generated by Grok. Literally zero," and said, "When asked to generate images, it will refuse to produce anything illegal, as the operating principle for Grok is to obey the laws of any given country or state." At the time, Grok was being used to depict people, including children, in bikinis without their consent. Musk made posts following this trend, including an image depicting a rocket in a bikini, seemingly suggesting he was aware of the trend, whether or not he was aware it was being used on images of children. Weeks later, the company announced that it would add restrictions to image generation and made reference to people who "attempt to abuse the Grok account to violate the law," but didn't directly acknowledge the generation of CSAM.
Musk and xAI have also promoted Grok's ability to be used for sexually explicit activity via its "Spicy" mode, which can be used for text, image, and video generation. The class action suit alleges that the company and its CEO were more aware of how the tool was being used than they have let on, claiming they saw a business opportunity: "an opportunity to profit off the sexual predation of real people, including children."
RockRaven
(19,216 posts)
p-e-d-o-p-o-r-n-m-a-c-h-i-n-e as x-a-i.
They must be using some dumbfuck artificial intelligence to generate them or something.
Leghorn21
(14,081 posts)
RESPECT