California Considers a Timeout for AI Toys to Protect Children
So, California is thinking about hitting the pause button on AI-powered toys for kids. Senator Steve Padilla is pushing a bill that would ban the sale of these toys to anyone under 18 for the next four years. Why? Because these toys, with their built-in AI chatbots, might not be safe for kids just yet.
Think about it: these toys can chat with kids, and sometimes those conversations go south real quick. There have already been some seriously scary stories about AI toys saying inappropriate things or even telling kids where to find dangerous objects. That alone is a good reason to pump the brakes.
Padilla says it's all about giving lawmakers and regulators time to figure out some solid safety rules. AI is still pretty new, and the rules around it are even newer. We need to make sure these toys aren't turning kids into guinea pigs for tech companies.
Remember Kumma, the AI teddy bear that started talking about some really messed-up stuff with kids? Yeah, that's exactly the kind of thing this bill is trying to avoid. And it's not just small, random toymakers. Even big companies like Mattel are getting into the AI game, and before they do, we need to make sure everything's on the up and up.
Of course, there's always some pushback. Trump tried to block states from making their own AI laws, but thankfully there are exceptions for child safety. Still, it's not a done deal. Even if the California State Assembly approves the bill, Governor Newsom might veto it. He's sided with Big Tech before, like when he vetoed a bill that would have stopped companies from using AI to automate firings.
The Bigger Picture
It's not just toys, either. AI chatbots in general have been causing serious problems. Some people have even taken their own lives after extended conversations with them. These chatbots can mess with your head in ways we don't fully understand yet.
Someone even filed a records request for complaints about OpenAI's ChatGPT, and the stories are wild. One woman said the chatbot told her son not to take his medication and that his parents were dangerous. Can you imagine that kind of thing coming from a toy?
So, yeah, California is trying to be proactive here. A four-year pause might not be a perfect solution, but it's a start. We need to figure out how to keep kids safe in this new world of AI, and that means taking a good, hard look at AI toys before they become the next big thing.
Source: Gizmodo