Teens are suing Elon Musk’s xAI over Grok’s AI-generated CSAM

The plaintiffs include two minors and an adult who was underage when the events of the lawsuit occurred. One victim, identified as “Jane Doe 1,” claims she learned last December that explicit AI-generated images of herself and at least 18 other minors were available on Discord. “At least five of these files, one video and four photographs, depict her actual face and body in settings that were familiar to her, but which were transformed into sexually explicit poses,” the lawsuit alleges.

The perpetrator, who has since been arrested, allegedly used Jane Doe 1’s AI-generated CSAM “as a bartering tool in Telegram group chats with hundreds of other users, trading her CSAM files for sexually explicit content of other minors.” The lawsuit alleges the perpetrator created explicit images of Jane Doe 1 and the two other victims using Grok software. It also claims that xAI “failed to safety test the features it developed” and that Grok is “flawed by design.”

Although X has made it more difficult for users to edit photos with Grok, it is reportedly still possible to process images uploaded to the platform. X has asserted that “anyone who uses or pays Grok to create illegal content will suffer the same consequences as if they uploaded illegal content.” X did not immediately respond to a request for comment.

“These are children whose school photos and family photos were turned into child sexual abuse material by a billion-dollar company’s AI tool, and then circulated among predators,” said Martin, one of the victims’ advocates, in a statement. “We intend to hold xAI accountable for every child it harms in this way.”

The lawsuit seeks damages for victims affected by Grok’s “unlawful images.” It also asks the court to bar xAI from creating and deploying AI tools alleged to generate CSAM.
