Friday, 22 November 2024

Hospitals use a transcription tool powered by a hallucination-prone OpenAI model

[Illustration: a woman typing on a keyboard, her face replaced with lines of code. Image: The Verge]

A few months ago, my doctor showed off an AI transcription tool he used to record and summarize his patient meetings. In my case, the summary was fine, but researchers cited by ABC News have found that’s not always the case with OpenAI’s Whisper, which powers a tool many hospitals use — sometimes it just makes things up entirely.

Whisper is used by a company called Nabla in a medical transcription tool that, by Nabla’s estimate, has transcribed 7 million medical conversations, according to ABC News. More than 30,000 clinicians and 40 health systems use it, the outlet writes. Nabla is reportedly aware that Whisper can hallucinate and says it is “addressing the problem.”

A group of researchers from Cornell University, the University of Washington, and…


