Chatbot-powered toys rebuked for discussing sexual, dangerous topics with kids



OpenAI says it doesn’t allow its LLMs to be used this way

When reached for comment about the sexual conversations detailed in the report, an OpenAI spokesperson said:

Minors deserve strong protections, and we have strict policies that developers are required to uphold. We take enforcement action against developers when we determine that they have violated our policies, which prohibit any use of our services to exploit, endanger, or sexualize anyone under 18 years old. These rules apply to every developer using our API, and we run classifiers to help ensure our services are not used to harm minors.

Interestingly, OpenAI’s representative told us that OpenAI doesn’t have any direct relationship with Alilo and that it hasn’t seen API activity from Alilo’s domain. OpenAI is investigating the toy company and whether it is running traffic over OpenAI’s API, the rep said.

Alilo didn’t respond to Ars’ request for comment ahead of publication.

Companies that launch products using OpenAI technology and target children must obtain parental consent and adhere to the Children’s Online Privacy Protection Act (COPPA) where applicable, as well as any other relevant child protection, safety, and privacy laws, OpenAI’s rep said.

We’ve already seen how OpenAI handles toy companies that break its rules.

Last month, PIRG released its Trouble in Toyland 2025 report (PDF), which detailed sex-related conversations that its testers were able to have with the Kumma teddy bear. A day later, OpenAI suspended FoloToy for violating its policies (terms of the suspension were not disclosed), and FoloToy temporarily stopped selling Kumma.

The toy is for sale again, and PIRG reported today that Kumma no longer teaches kids how to light matches or about kinks.



A marketing image for FoloToy’s Kumma smart teddy bear. It has a $100 MSRP. Credit: FoloToys


But even toy companies that try to follow chatbot rules could put kids at risk.


