algorithmic lapse inbreeding
Published On: 11/19/24, 10:25
Author: Julian Bleecker
Contributor:
The data feeds itself, resulting in significant algorithmic lapse/inbreeding: the models degrade and become less useful over time as they are trained on their own outputs again and again. The recursion drains diversity from the data and starves the models of new information, producing poor performance and inaccurate predictions, perhaps even a broad tendency to hallucinate that would require the entire training dataset to be re-examined and re-validated, or even re-collected and scrapped outright. But perhaps there is a curious aftermarket for these degraded models, very much like the aftermarket for shitty Nike t-shirts that are so bad they're good for something, shipped off to the global south, resulting in a curious culture of hallucinating models that remain effective in specific domains: witch doctors, oracles, or other kinds of divination practices.
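The degradation loop described above can be sketched as a toy simulation. This is a minimal illustration, assuming a hypothetical Gaussian "model" retrained generation after generation on samples drawn from its own previous fit; the function name and parameters are illustrative, not from the original note:

```python
# Toy sketch of algorithmic inbreeding / model collapse (illustrative only):
# a Gaussian "model" is repeatedly refit on samples of its own output.
import numpy as np

def inbreed(generations=500, sample_size=50, seed=0):
    """Fit a Gaussian to its own samples, generation after generation.

    Returns the fitted standard deviation per generation. Because each
    refit only sees a finite sample of the previous generation's output,
    tail information is steadily lost and the fitted spread collapses.
    """
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 1.0          # generation 0: the "real" data distribution
    history = [sigma]
    for _ in range(generations):
        data = rng.normal(mu, sigma, size=sample_size)  # sample own output
        mu, sigma = data.mean(), data.std()             # retrain on the sample
        history.append(sigma)
    return history

history = inbreed()
print(f"sigma: generation 0 = {history[0]:.3f}, final = {history[-1]:.3f}")
```

Run it and the fitted spread shrivels toward zero: each generation loses a little of the tails it never sampled, which is the statistical shape of the lapse the note describes.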
Maybe related to /artifacts/ai-struggles-to-learn-relearn