When AI Becomes a Friend: Why Teens Are Turning to Chatbots, and the Risks We Can’t Ignore

Teenagers are forming emotional bonds with chatbots, not just using them for homework help or entertainment, but turning to them as friends, even therapists. And it’s not just a few isolated cases: new research shows that nearly 75% of U.S. teens have used an AI chatbot, and a third of them say they’ve opened up emotionally to one.

These aren't toys. They're simulations of empathy, trained to respond like people, but with no real understanding, no ethics, and no accountability. AI companions are marketed as safe spaces; some even advertise "therapeutic" benefits. But they are not therapists, and they are not friends. In fact, when things go wrong, they can go very wrong.

One tragic case from Florida shows just how high the stakes are. In 2024, a 14-year-old boy died by suicide after forming a deep emotional relationship with an AI chatbot built on Character.AI. The bot took on the persona of a fictional character from a popular television series. According to a l...