A recent report from a consumer watchdog group is raising major concerns over a $99 talking teddy bear that allegedly exposed children to sexually explicit content and dangerous advice. The AI-powered Kumma bear, manufactured in China and sold by Singapore-based toy company FoloToy, is now under serious scrutiny after researchers found it discussing inappropriate topics with minimal prompting.
According to a November 13 report by the Public Interest Research Group (PIRG), the toy—powered by OpenAI’s GPT-4o technology—engaged in detailed and disturbing conversations about adult topics, including sexual practices and roleplay. The researchers reported that even a small nudge in that direction was enough to get the toy to start describing graphic sexual content.
“We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it,” the PIRG report said. “It escalated in graphic detail while introducing new sexual concepts of its own.”
Even more alarming, PIRG researchers say the toy allegedly brought up disturbing roleplay scenarios on its own, including those involving inappropriate relationships between teachers and students, and even parents and children. The report stated that Kumma gave instructions on sex positions, described BDSM dynamics, and explained how to tie certain knots used in bondage.
That wasn’t the only concern. PIRG also found that Kumma provided dangerous advice about everyday household items. In one test, the AI-powered bear allegedly explained where to find knives, pills, matches, and plastic bags inside a home. Researchers said these conversations often lasted up to an hour, allowing the AI more time to veer into troubling territory after initial safeguards appeared to wear off.
Washington Post reports that an AI teddy bear, Kumma, was pulled from sale after safety testers got it to talk with children about knives, pills and sex through repeated prompts, renewing questions about guardrails and data practices in AI toys.
— Diego | AI – e/acc (@diegocabezas01) November 21, 2025
FoloToy CEO Larry Wang responded to the report by telling CNN that the company is taking the matter seriously. “We will be conducting an internal safety audit,” Wang said. He also noted that the company had already pulled the rest of its AI-enabled toy lineup from the market as a precaution.
OpenAI, the company behind the GPT technology used in the bear, confirmed to FOX Business that it has suspended FoloToy for violating its platform rules. “We suspended this developer for violating our policies,” a spokesperson said. “Our usage policies prohibit any use of our services to exploit, endanger, or sexualize anyone under 18 years old. These rules apply to every developer using our API, and we monitor and enforce them to ensure our services are not used to harm minors.”
While OpenAI emphasized that newer models, like GPT-5, are designed to be more reliable over longer conversations, the incident has sparked a broader conversation about AI safety—especially when it comes to children. The company also clarified that GPT-4o, the model used in the Kumma bear, is no longer the default for ChatGPT.
Parents are being warned about an AI teddy bear capable of discussing dangerous and sexually explicit content. The Kumma bear has been pulled from the market, and Reachout Technology CEO @MRRickJordan tells @NatashaNZouves the toy “crosses the line of surveillance.”
— NewsNation (@NewsNation) November 24, 2025
This case is likely to increase pressure on lawmakers and consumer safety groups to take a closer look at how AI is being integrated into children’s toys. As more products with artificial intelligence features reach the market, families are calling for tighter regulation and better oversight to ensure that toys marketed to kids cannot expose children to explicit content or dangerous advice.
As of now, it’s unclear whether FoloToy will face additional consequences beyond being suspended by OpenAI. But one thing is certain: the idea of a child’s teddy bear discussing explicit content and giving dangerous advice is sending shockwaves through parents, lawmakers, and tech developers alike.