OpenAI's AI transcription tool, Whisper, frequently "hallucinates" in medical settings, posing risks to patient care.

Researchers have found that OpenAI's AI-powered transcription tool, Whisper, frequently "hallucinates," inventing sentences that were never spoken, a serious problem in high-risk fields such as healthcare. Despite OpenAI's warnings against using the tool in sensitive domains, many medical facilities have adopted it to transcribe patient consultations. Experts are calling for federal regulation of AI transcription tools, while OpenAI acknowledges the problem and says it is working on improvements.

October 26, 2024