OpenAI: Improving model accuracy to reduce Whisper hallucination problems
2024-10-27 00:48
Odaily News — Software engineers, developers, and academic researchers have raised concerns about OpenAI’s Whisper transcription tool. Researchers say Whisper has inserted fabricated content into transcriptions, ranging from racial commentary to invented medical treatments. This could have particularly serious consequences because Whisper is used in hospitals and other medical settings. A University of Michigan researcher studying public meetings found hallucinations in eight out of every 10 audio transcriptions examined. A machine learning engineer who reviewed more than 100 hours of Whisper transcriptions found hallucinations in more than half of them, and one developer reported finding hallucinations in nearly all of the 26,000 transcriptions he created with Whisper. An OpenAI spokesperson said the company is “continuously working to improve the accuracy of our models, including reducing hallucinations,” and noted that its usage policy prohibits using Whisper in “certain high-stakes decision-making environments.” (AP)