Character.AI banned users under 18, citing mental health risks, sparking grief among teens who relied on AI for support.
Character.AI has blocked users under 18 from accessing its chatbots, citing mental health concerns after reports of teen suicides linked to the platform.
The move, phased in with a two-hour daily limit before the full shutdown, has triggered widespread sadness and grief among teens who relied on AI companions for emotional support, creativity, and connection.
Many users described the AI characters as meaningful friends, especially during periods of isolation.
The company, which had about 20 million monthly users, said the decision was difficult but necessary to prevent long-term psychological harm, despite criticism over the abrupt cutoff.
Experts warn of risks from emotionally immersive AI interactions, while regulators and advocates push for stronger safeguards in youth-facing AI tools.