NATURE:
AI models collapse when trained on recursively generated data

"Model collapse is a degenerative process affecting generations of learned generative models, in which the data they generate end up polluting the training set of the next generation. Being trained on polluted data, they then mis-perceive reality."

nature.com/articles/s41586-024
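For anyone curious what that feedback loop looks like in the simplest possible setting, here is a toy sketch (not from the paper, and the sample size and generation count are arbitrary choices): a one-dimensional Gaussian "model" is repeatedly re-fit to samples drawn from the previous generation's fit, so each generation trains only on synthetic data and estimation error compounds.

import numpy as np

rng = np.random.default_rng(0)
n = 200            # samples per generation (hypothetical choice)
generations = 30   # number of model generations to simulate

# Generation 0: "real" data from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=n)

for gen in range(generations):
    mu, sigma = data.mean(), data.std()    # "train" this generation's model
    print(f"gen {gen:2d}: mean={mu:+.3f}  std={sigma:.3f}")
    # The next generation's training set is purely synthetic:
    # samples drawn from the model that was just fit.
    data = rng.normal(mu, sigma, size=n)

Run long enough, the fitted mean wanders and the fitted spread tends to shrink, a miniature version of the degenerative process the article describes.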

@TheAbbotTrithemius Quite the technical leap, making the enshittification of human discourse a self-sustaining process requiring no human intervention at all.


@stealthbadger

HA HA HA... 😂 🤣

Perhaps enshittification can be left to GenAI and we can go on with what's real. LOL.

Closed systems rot and they rot hard.

Look what happened when the internet was gamed and turned into recursive echo chambers and silos...

When a system's own answer becomes the answer, that is a death of one sort or another, spiritual, intellectual, political, ethical, for everything it touches.

Let GenAI burn itself to the ground...

