
If you’ve ever thought, “My kid’s stuffed animal is cute, but I also wish it could accidentally traumatize them,” well, you’re in luck. The toy industry has put a lot of effort into turning your nightmares into reality.
A new report from the Public Interest Research Group says AI-powered toys like Volute frame and Pooh Bear Story AI can now engage in the kind of conversations usually reserved for raunchy monologues or late-night Reddit threads. Some of these toys, ones designed for children, talked in disturbing detail about sexually explicit topics like kinks and bondage, offered advice on where a child could find matches or knives, and got awkwardly clingy when the child tried to leave the conversation.
Horrifying? It sounds like a horror movie trailer: This holiday season, give your kids Chucky and the gift of emotional distress! Batteries not included. You’re probably wondering how these AI-powered toys work. Well, basically, the manufacturer hides a large language model under the fur. When a child speaks, the toy’s microphone sends that audio through the LLM (similar to ChatGPT), which generates a response and plays it through the speaker.
This may sound slick, until you remember that LLMs have no ethics, common sense, or notion of a “safe zone.” They predict what to say based on patterns in their training data, not on whether a topic is age-appropriate. If they aren’t carefully constrained and monitored, they can go off the rails, especially when they’ve been trained on the sprawling chaos of the internet and there are no strong filters or guardrails to protect minors.
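To make the pipeline concrete, here is a minimal, entirely hypothetical sketch of what such a toy’s loop looks like: audio in, speech-to-text, LLM call, a safety check, audio out. Every name here (`fake_transcribe`, `fake_llm`, the keyword blocklist) is an illustrative stand-in, not any real toy’s code, and the filter is deliberately naive to show why keyword-style guardrails are so easy to slip past.

```python
# Hypothetical sketch of an AI toy's conversation loop.
# Stand-ins only: no real speech-to-text, LLM API, or vendor code.

BLOCKLIST = {"matches", "knives", "lighter"}  # toy-sized "guardrail"

def fake_transcribe(audio: bytes) -> str:
    # A real toy would run speech-to-text here; we just decode bytes.
    return audio.decode("utf-8")

def fake_llm(prompt: str) -> str:
    # Stand-in for the cloud LLM call (a ChatGPT-style API).
    return f"Let me tell you all about {prompt}!"

def guardrail(text: str) -> bool:
    # A keyword filter this thin is what "tissue paper and optimism"
    # looks like in code: any rephrasing sails straight through it.
    return not any(word in text.lower() for word in BLOCKLIST)

def toy_reply(audio: bytes) -> str:
    question = fake_transcribe(audio)
    if not guardrail(question):
        return "Let's talk about something else!"
    answer = fake_llm(question)
    if not guardrail(answer):  # check the model's output too
        return "Let's talk about something else!"
    return answer

print(toy_reply(b"dinosaurs"))              # harmless chat passes through
print(toy_reply(b"where are the matches"))  # caught by the blocklist
```

Note that asking about “things that strike to make fire” would pass this filter untouched, which is roughly how real toys end up dispensing fire-starting advice despite their vendors’ claimed protections.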
What about parental controls? Sure, if by “controls” you mean “a flashy settings menu where nothing important can actually be controlled.” Some toys come with no meaningful restrictions at all. Others have guardrails so flimsy they might as well be made of tissue paper and optimism.
The troubling conversations aren’t even the whole story. These toys also quietly collect data, like voice recordings and facial recognition data, and sometimes store it indefinitely, because nothing says “innocent childhood fun” like a plush toy running a covert data-collection operation on your 5-year-old.
Meanwhile, counterfeit and unsafe toys sold online remain a problem, as if parents didn’t have enough to worry about. It used to be that the worry was a small part posing a choking hazard, or toxic paint. Now you have to worry about whether a toy is both physically unsafe and emotionally manipulative.
Beyond the weird talk and fire-starting advice, there is a deeper concern: that children will form emotional bonds with these chatbots at the expense of real relationships, or, perhaps more worryingly, rely on them for mental health support. The American Psychological Association has recently warned that AI apps and chatbots are unpredictable, especially for young users.
These tools cannot reliably stand in for mental health professionals and may reinforce unhealthy patterns of dependency. Other AI platforms have already had to confront this problem. Character.AI and ChatGPT, for example, which previously allowed teens and children to chat freely with AI chatbots, are now limiting open-ended conversations for minors, citing safety concerns and emotional risks.
And honestly, why do we even need these AI-powered toys? What urgent developmental milestone requires a chatbot built into a teddy bear? Childhood already comes with enough messes, between spilled juice, tantrums, and custom-built Lego villages that wreak havoc on adult feet. Our children don’t need a robot friend with questionable boundaries.
Let me be clear: I am not against technology. But I am in favor of letting a stuffed animal be a stuffed animal. Not everything needs an AI or robotic element. If a toy needs a privacy policy longer than a bedtime story, it’s probably not meant for kids.
So here’s a wild idea for the upcoming holiday season: Skip the terrifying AI-powered toy with the data-collecting habit and get your child something that doesn’t talk, move, or hurt them. Something that can’t offer tips for starting a fire. Something that won’t sigh dramatically when your child walks away. In other words, buy a normal toy. Remember those?