Model Autophagy Disorder

Interesting read on Livescu.

…when AI models generate things—text, images, sound—and then those generated products are used to train a subsequent model, the new model actually gets worse at generating images and text. Over a few generations it can fail completely, producing only a string of gibberish or the same single image over and over again.

And this is how AI goes ‘MAD’ (Model Autophagy Disorder). Later in the article the author offers a funny little analogy for how to distinguish rich data from poor data.
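You can see a crude version of this feedback loop without any neural network at all. This isn't from the article, just a minimal toy sketch: each "generation" fits a Gaussian to samples produced by the previous generation, so estimation error compounds and the fitted spread tends to shrink toward a point mass, an analogue of the model that ends up emitting the same image over and over.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "real" data drawn from a standard normal.
data = rng.normal(loc=0.0, scale=1.0, size=200)

for gen in range(1, 11):
    # "Train" the model: fit a Gaussian to the current data.
    mu, sigma = data.mean(), data.std()
    print(f"gen {gen:2d}: mean={mu:+.3f}, std={sigma:.3f}")
    # Next generation sees only this model's output, never the real data.
    data = rng.normal(loc=mu, scale=sigma, size=200)
```

Run it and the printed standard deviation drifts downward across generations; real generative models collapse in messier ways, but the self-consuming loop is the same.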
