It's exhausting that this is becoming its own subgenre of news article. (Also frustrating that, in this case, there was actually a product warning not to use this tech in high-risk areas!)

"A machine learning engineer said he initially discovered hallucinations in about half of the over 100 hours of Whisper transcriptions he analyzed. A third developer said he found hallucinations in nearly every one of the 26,000 transcripts he created with Whisper."


abcnews.go.com/US/wireStory/re


@MLClark I regularly see AI-preedited manuscripts and don't know why this is still in use. For every mistake it corrects, it adds another one. It deletes information, adds information, changes cause-effect relationships, changes text from A to B in one sentence and from B to A in the next, and even changes data values. If the original text contains ambiguous phrasing, it simply changes it to one of the possible options without highlighting that other versions might be intended.

@MLClark Given that this is pitched as a cheap tool for improving manuscripts, aimed at people who lack the language skills to check their final text, a wave of retractions is headed our way. And lots of publications build on those bullshit publications.

