
OpenAI’s Whisper transcription software has a hallucination problem

Software engineers, developers, and academic researchers have serious concerns about transcriptions from OpenAI’s Whisper, according to a report in the Associated Press.

While there’s been no shortage of debate around generative AI’s tendency to hallucinate (basically, to make stuff up), it’s a bit surprising that this is an issue in transcription, where you’d expect the transcript to closely follow the audio being transcribed.

Instead, researchers told the AP that Whisper has introduced everything from racial commentary to imagined medical treatments into transcripts. That could be particularly disastrous as Whisper is adopted in hospitals and other medical contexts.

A University of Michigan researcher studying public meetings found hallucinations in eight out of every 10 audio transcriptions. A machine learning engineer examined more than 100 hours of Whisper transcriptions and found hallucinations in more than half of them. And a developer reported finding hallucinations in nearly all of the 26,000 transcriptions he created with Whisper.
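For context, transcripts like the ones these researchers examined can be produced with a single call to the open-source Whisper package. The following is a minimal sketch, assuming the "openai-whisper" pip package; the model size and audio file name are illustrative placeholders, and the output text is generated by the model rather than guaranteed to match the audio word for word.

    # pip install openai-whisper
    import whisper

    model = whisper.load_model("base")        # load the "base" checkpoint (placeholder choice)
    result = model.transcribe("meeting.mp3")  # run speech-to-text on a local audio file
    print(result["text"])                     # model-generated transcript; this is the
                                              # output that can contain hallucinated content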

An OpenAI spokesperson said the company is “continually working to improve the accuracy of our models, including reducing hallucinations” and noted that its usage policies prohibit using Whisper “in certain high-stakes decision-making contexts.”

“We thank researchers for sharing their findings,” they said.



