Although AI models have made big breakthroughs in the last few years, there’s now a risk that they could get dumber and even collapse, according to a new research paper published in Nature.
The phenomenon is described by researchers as “model collapse.” Think of it as a snake eating its own tail — or a process that starts with a bunch of photos of real dogs of different breeds, then ends with identical weird smudges that kind of resemble golden retrievers.
As more AI content fills the web, there’s a risk that the content going in and the content coming out of a model converge and turn into garbage. Golden retrievers in, golden retrievers out, and eventually the model loses track of what a dog even is. You can imagine something similar with images of real people, or blog posts on pretty much any topic.
The risk could be even bigger for companies so hungry for training data that they’re turning to synthetic data generated by AI.
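To get a feel for why this happens, here’s a toy sketch (our illustration, not the experiment from the Nature paper): a “model” that just fits an average and a spread to its data, then gets retrained on its own samples generation after generation. Like real generative models, it under-represents rare cases, which we mimic by dropping the outliers each round.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: some measurement of dogs across many breeds (wide and varied).
data = rng.normal(loc=50.0, scale=15.0, size=10_000)

for generation in range(10):
    # "Train" the simplest possible model: fit a mean and a standard deviation.
    mu, sigma = data.mean(), data.std()
    print(f"generation {generation}: mean={mu:5.1f}  std={sigma:5.1f}")

    # The next generation is trained only on this model's own output.
    # To mimic how generative models under-sample rare cases, drop anything
    # more than two standard deviations from the mean.
    samples = rng.normal(loc=mu, scale=sigma, size=10_000)
    data = samples[np.abs(samples - mu) < 2 * sigma]

# The spread shrinks every generation: the unusual "breeds" at the edges
# vanish first, and the data converges toward one blurry average dog.
```

Run it and the spread collapses within a few generations, which is the same basic feedback loop the researchers warn about, just stripped down to a few lines of arithmetic.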
Hit play to learn more, then let us know what you think in the comments!