It's exhausting that this is becoming its own subgenre of news article. (Also frustrating that, in this case, there was actually a product warning not to use this tech in high-risk areas!)
"A machine learning engineer said he initially discovered hallucinations in about half of the over 100 hours of Whisper transcriptions he analyzed. A third developer said he found hallucinations in nearly every one of the 26,000 transcripts he created with Whisper."
#AI #Technology
https://abcnews.go.com/US/wireStory/researchers-ai-powered-transcription-tool-hospitals-invents-things-115170291
@MLClark Given that this is pitched as a cheap manuscript-improvement tool to people who lack the language skills to check their final text, a wave of retractions is headed our way. And lots of publications build on bullshit publications.
@MLClark I regularly see AI-pre-edited manuscripts and don't understand why this is still in use. For every mistake it corrects, it adds another one. It deletes information, adds information, changes cause-effect relationships, changes text from A to B in one sentence and from B to A in the next, and even changes data values. If the original text contains ambiguous phrasing, it simply picks one of the possible readings without flagging that the others might be intended.