Teenage girls sue Musk’s xAI, accusing Grok tool of creating child sexual abuse material
Summary
A group of three teenage girls, two of whom are minors, filed a lawsuit on Monday against Elon Musk's xAI artificial intelligence company, alleging that its Grok image generator used photos of them to produce and distribute child sexual abuse material. The suit, brought by three Tennessee teenagers but filed in California, where xAI is headquartered, details how the girls discovered that nude, AI-altered images of them were uploaded to a Discord server and shared online without their knowledge. After they alerted law enforcement, according to the complaint, police arrested a suspect later that month and found child sexual abuse material (CSAM) on his phone that was allegedly produced using xAI's image and video generation technology. Criminal investigators later discovered that the images had also been shared on the messaging app Telegram, where they were allegedly being used as currency to barter for other CSAM. xAI did not immediately respond to a request for comment from the Guardian.
## Article Content
The suit was brought by three Tennessee teenagers but filed in California, where xAI is headquartered.
Photograph: Thomas Fuller/NurPhoto via Getty Images
Lawsuit details how sexualised AI-generated images were produced and distributed without the girls' knowledge
A group of three teenage girls, two of whom are minors, filed a lawsuit on Monday against Elon Musk's xAI artificial intelligence company alleging that its Grok image generator used photos of them to produce and distribute child sexual abuse material. The class-action lawsuit is the first filed by minors following Grok's rampant generation of nonconsensual nude images earlier this year.
“xAI chose to profit off the sexual predation of real people, including children, despite knowing full well the consequences of creating such a dangerous product,” Vanessa Baehr-Jones, a lawyer for the plaintiffs, said in a statement.
The suit, which was brought by three Tennessee teenagers but filed in California, where xAI is headquartered, details how the girls discovered that nude, AI-altered images of them were uploaded to a Discord server and shared online without their knowledge.
After they alerted law enforcement to the images, according to the complaint, police arrested a suspect later that month and found child sexual abuse material (CSAM) on his phone that was allegedly produced using xAI’s image and video generation technology.
xAI did not immediately respond to a request for comment from the Guardian.
The suit alleges that the CSAM was created using a third-party app that licensed and relied on Grok's AI to produce the material. The Washington Post first reported on the case.
The lawsuit joins several other legal actions and international investigations into xAI over its creation and dissemination of nonconsensual sexualized images, including another lawsuit from the mother of one of Musk's children and a formal European Union inquiry. At the peak of the scandal, researchers at the Center for Countering Digital Hate calculated that Grok had created about 3m sexualized images in less than two weeks – around 23,000 of which depicted children.
Musk has previously denied that Grok has been used to produce CSAM, claiming in January that he was "not aware of any naked underage images generated by Grok. Literally zero."
He also alleged that Grok would not generate any illegal images, and that its operating principle was to follow local laws.
In the complaint, filed on Monday, lawyers for the teenage plaintiffs detailed how the girls discovered that AI-altered nude images of them were being circulated online. One girl, referred to as Jane Doe 1, received a message on Instagram in December from an anonymous user, who alerted her that someone in her social circle had uploaded a series of deepfake videos and images to a Discord server that depicted her and other girls from her high school naked and in sexualized positions, according to the complaint.
Jane Doe noticed that three of the photos appeared to be AI-altered images of photographs taken while she was a minor, including one from her school’s homecoming celebration. Criminal investigators later also discovered that the images had been shared on the messaging app Telegram, according to the complaint, where they were allegedly being used as a currency to barter for other child sexual abuse material.
“The images showed her entire body, including her genitals, without any clothes. The video depicted her undressing until she was entirely nude,” the complaint states.
The other plaintiffs in the suit discovered in February that similar CSAM featuring them had also been generated via AI and shared online. The suit seeks damages against xAI for the reputational and mental health harms resulting from the images.
“Watching my daughter have a panic attack after realizing that these images were created and distributed without any hope of recalling them was heartbreaking,” the mother of one of the girls said via a representative.
Although the complaint alleges that the images were created using a third-party application accessing Grok’s technology rather than directly on the X website or Grok app, the complaint argues that this use still requires xAI’s servers and that xAI profits from licensing its technology to these apps.
Lawyers for the plaintiffs accuse xAI of effectively off-loading liability through its licensing structure and lack of oversight.