Hey there, tech enthusiasts! It’s your favorite tech-loving jokester, Nuked, here to sprinkle a little humor into the world of technology. Buckle up as we dive into the quirks of AI in hospitals!
A few months back, my doctor proudly showcased an AI transcription tool he uses to jot down and summarize our meetings. In my case, the summary turned out just fine. But hold on to your stethoscopes — researchers quoted by ABC News have discovered that OpenAI’s Whisper, which fuels many hospitals’ transcription tools, can sometimes go off the rails and make stuff up entirely!
One company building on Whisper is Nabla, whose medical transcription tool has reportedly handled around 7 million medical conversations. That's a lot of doctor-patient chat! Over 30,000 clinicians and 40 health systems are on board with this tech. To its credit, Nabla is aware of Whisper's tendency to hallucinate and is reportedly working on fixing it.
Researchers from Cornell University and the University of Washington conducted a study revealing that Whisper hallucinated in about 1% of its transcriptions, sometimes inventing entire sentences that ranged from bizarrely nonsensical to downright violent. The made-up text tended to appear during pauses in the recordings, and pauses, as it turns out, are especially common in speech from people with aphasia, a language disorder.
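If you're curious what Whisper does with dead air, you can poke at it yourself. Here's a minimal sketch using OpenAI's open-source openai-whisper Python package; to be clear, this is a home experiment, not Nabla's actual pipeline, and the audio filename is a placeholder I made up.

```python
import whisper  # pip install openai-whisper

# Load a checkpoint; "base" is one of the smaller ones, so it runs on a laptop.
model = whisper.load_model("base")

# Transcribe a clip. "mostly_silence.wav" is a hypothetical placeholder; a
# recording with long pauses is the interesting test case here.
result = model.transcribe("mostly_silence.wav")

# Print each timestamped segment so any invented text is easy to line up
# against the stretches of audio where nobody is actually speaking.
for seg in result["segments"]:
    print(f"[{seg['start']:6.1f}s - {seg['end']:6.1f}s] {seg['text']}")
```

If a segment's timestamps land squarely in a silent stretch but the text field still has words in it, congratulations: you've caught a hallucination in the wild.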
One of the researchers, Allison Koenecke from Cornell University, shared some eyebrow-raising examples on social media. The hallucinations included things like made-up medical conditions or phrases you’d typically hear at the end of a YouTube video, like “Thank you for watching!” (Fun fact: OpenAI allegedly trained GPT-4 by transcribing over a million hours of YouTube videos.) This fascinating study was presented at a conference in Brazil back in June, but it remains unclear if it has gone through peer review.
In response to these findings, OpenAI spokesperson Taya Christianson gave The Verge a statement expressing the company's commitment to tackling the issue. They're actively working to minimize these hallucinations and have usage policies that prohibit deploying Whisper in high-stakes decision-making scenarios. So rest assured, they're on it!
As we continue to explore the realm of AI in healthcare and beyond, let’s keep an eye on these developments — because who knows what’s lurking in those digital shadows? Until next time, stay curious and keep laughing!