Whisper, an AI transcription tool used in healthcare, found to fabricate text, risking patient misdiagnosis.

A recent study found that Whisper, an AI transcription tool widely used in medical centers, can fabricate text that was never spoken, a failure known as "hallucination." Because the tool is used to transcribe conversations between patients and doctors, these fabrications could produce inaccurate medical records and lead to errors such as misdiagnosis, raising concerns about its reliability in healthcare settings.
