AI tool used in hospitals invents things no one ever said.
Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers.
Some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.
Whisper is being used in a slew of industries worldwide, including in hospitals and medical settings.
https://globalnews.ca/news/10832303/ai-transcription-medical-errors/
yeah, just what we need right now☠️
@holon42
Great. 🙄