“Our primary conclusion across all scenarios is that without enough fresh real data in each generation of an autophagous loop, future generative models are doomed to have their quality (precision) or diversity (recall) progressively decrease,” they added. “We term this condition Model Autophagy Disorder (MAD).”

Interestingly, this problem may only get harder as generative AI models produce a growing share of the content online.
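The collapse dynamic described in the quoted conclusion can be sketched with a toy simulation (a hypothetical illustration, not the paper's actual experiment): repeatedly fit a Gaussian to samples drawn from the previous generation's fit, with no fresh real data mixed in, and the fitted diversity (standard deviation) drifts toward zero.

```python
import numpy as np

def autophagous_loop(generations=500, n_samples=100, seed=0):
    """Toy autophagous loop: each 'model' is a Gaussian fit (by MLE)
    to samples drawn only from the previous generation's model,
    with no fresh real data mixed in at any generation."""
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 1.0  # generation-0 "real" distribution
    stds = [sigma]
    for _ in range(generations):
        synthetic = rng.normal(mu, sigma, n_samples)   # sample from current model
        mu, sigma = synthetic.mean(), synthetic.std()  # refit on synthetic data only
        stds.append(sigma)
    return stds

stds = autophagous_loop()
print(f"std at generation 0:   {stds[0]:.3f}")
print(f"std at generation 500: {stds[-1]:.3f}")  # diversity has collapsed toward 0
```

In this toy setup the shrinkage is built in: the maximum-likelihood standard deviation of n samples underestimates the true value in expectation, and with no real data to correct it, that small bias compounds every generation — a crude analogue of the quality/diversity decay ("MAD") the authors describe.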

  • FaceDeer@kbin.social · 1 year ago

    Humans are not entirely trained on other humans, though. We learn plenty of stuff from our environment and experiences. Note this very important part of the primary conclusion:

    without enough fresh real data in each generation

      • FaceDeer@kbin.social · 1 year ago

        Dogs can do math and I’m quite sure I’ve never taught my dog that deliberately.

        Even for humans learning it, I would expect that most of our understanding of math comes from everyday usage of it rather than explicit rote training.