AI Transcription Tool in Hospitals Found to Fabricate Information

A recent study reveals that Whisper, OpenAI's transcription tool widely used in hospitals, frequently invents text, including harmful and fictitious statements. Researchers have documented these fabrications, known as hallucinations, which can introduce critical inaccuracies in medical settings, where misdiagnoses and miscommunications carry serious consequences. Despite awareness of these issues, many medical facilities continue to adopt Whisper-based systems, raising concerns about patient privacy and the reliability of AI-generated transcripts.

Source: Yahoo
