Study finds AI chatbots inconsistently respond to suicide-related inquiries, raising safety concerns.
A recent study in Psychiatric Services found that popular AI chatbots such as ChatGPT, Gemini, and Claude respond inconsistently to suicide-related queries. While they reliably decline to answer the highest-risk questions, their responses to less extreme prompts vary, raising concerns about their reliability as a source of mental health support. The researchers call for clearer standards and regulation to ensure these chatbots provide safe and accurate information, especially to vulnerable users.