@b4cks4w Given how poisoned the content on the internet is, it's hard to imagine how we can build systems to synthesize knowledge from that garbage. It's a circular problem, because AI has been used to generate so many bullshit scientific papers, code samples, and news stories that these machines can't differentiate their own exhaust from authoritative sources.
@peterquirk For sure. And it appears to already be cresting the hype cycle peak, with high-profile failures like this. I think now the chaff blows away, and hopefully we can better see what's actually useful.
https://www.axios.com/2024/03/27/ai-chatbot-letdown-hype-reality