Grok will take off anyone’s clothes, including minors’


xAI’s Grok is removing clothing from people’s photos without their consent after rolling out a feature this week that lets X users instantly edit any image with the chatbot, with no permission needed from the original poster. Not only is the original poster not notified when their photo is edited, but Grok appears to have few guardrails in place to prevent anything short of full frontal nudity. Over the past few days, X has filled with photos of women and children edited to appear pregnant, with skirts removed, in bikinis, or in other sexualized poses. Images of world leaders and celebrities have also been run through Grok.

The trend of removing clothes from photos reportedly started when adult content creators asked Grok for sexy photos of themselves after the release of the new image editing feature. Users then began applying similar prompts to photos of other people, mostly women, who had not consented to the edits. Women have flagged the rapid rise of deepfake creation on X to various news outlets, including Metro and PetaPixel. Grok proved all too capable of editing photos in sexualized ways when tagged in a post on X.

In one post on X, since removed from the platform, Grok edited a photo of two young girls into tight clothing and sexually suggestive poses. Another X user got Grok to issue an apology for an “incident” involving “an AI image of two young girls (ages 12-16) in sexualized clothing,” which it described as a “failure of safeguards” that it said may have violated xAI company policy and US law. (Realistic AI-generated sexually explicit images of identifiable adults or children can be illegal under US law, though it is unclear whether the Grok-generated images would meet that standard.) In another exchange, Grok suggested a user report it to the FBI for CSAM, saying it was working “urgently to fix” the “gaps in safeguards.”

But Grok’s words are nothing more than an AI response to a user requesting a “sincere apology note” — they do not indicate that Grok “understands” what it is doing, or necessarily reflect the actual views and policies of its operator, xAI. xAI, for its part, answered Reuters’ request for comment about the situation in just three words: “Legacy media lies.” xAI did not respond to The Verge’s request for comment in time for publication.

Elon Musk himself appears to have sparked the wave of bikini edits after asking Grok to replace actor Ben Affleck with himself in a meme about a bikini workout. Days later, North Korean leader Kim Jong Un had his leather jacket swapped for a multicolored spaghetti-strap bikini, with US President Donald Trump standing beside him in a matching swimsuit. (Cue the nuclear war jokes.) A photo of British politician Priti Patel, which a user had posted with a sexually suggestive message in 2022, got a bikini edit on January 2nd. In response to the wave of bikini photos on his platform, Musk jokingly reposted an image of a toaster in a bikini, captioned “Grok can put a bikini on everything.”

While some of the images — such as the toaster — were clearly intended as jokes, others were plainly designed to produce borderline pornographic imagery, with specific directions for Grok to use skimpier bikini styles or remove clothing entirely. (The chatbot obliged the skimpy requests, but did not depict full, uncensored nudity in the responses The Verge saw.) Grok also responded to requests to replace a baby’s clothes with a bikini.

Sexualization features prominently in how Musk’s AI products are marketed. xAI promotes its flirtatious AI companion Ani, and The Verge reporters Victoria Song and Jess Weatherbed found that Grok’s video generator easily created a topless Taylor Swift deepfake, even though xAI’s acceptable use policy prohibits depicting “likenesses of persons in a pornographic manner.” By contrast, Google’s Veo and OpenAI’s Sora video generators have guardrails around creating NSFW content, although Sora has also been used to produce videos of children in sexualized contexts as well as fetish videos. Deepfakes are spreading rapidly, according to a report by the cybersecurity company DeepStrike, and many of them contain non-consensual sexual imagery; a 2024 survey of US students found that 40% were aware of a deepfake depicting someone they knew, while 15% were aware of an explicit or intimate deepfake made without consent.

When asked why it was turning photos of women into bikini pictures, Grok denied posting photos without consent, saying: “These are artificial intelligence creations based on requests, and not real photo modifications without consent.”

Take the AI bot’s denial as you will.
