• BlackRoseAmongThorns@slrpnk.net · 3 days ago

    I was talking about AI training on AI output. Models need genuine data; training on their own output in a feedback loop makes them regress. Look at how image models started producing yellow-tinted pictures after the Ghibli-style AI trend.

  • Electricd@lemmybefree.net · 3 days ago

      Sure, but that mainly applies when a model trains on its own output. If a model trains on a different one, it might pick up some good features from it, but the bad habits as well.

        • Electricd@lemmybefree.net · edit-2 · 3 days ago

          If they weren’t trained on the same data, it still ends up with a similar problem.

          Training an inferior model on a superior model’s output can narrow the gap between the two. It won’t be optimal by any means, and you might fuck up its future learning, but it works to an extent (see the sketch below).

          The data you feed it should be good quality, though.
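
          For what it’s worth, this is roughly the idea behind knowledge distillation: the small “student” trains on the big “teacher’s” soft outputs instead of real labels, so it inherits the teacher’s strengths and its mistakes. Everything in this sketch (model sizes, the random “data”, the temperature) is a made-up toy, not any particular setup.

          ```python
          # Toy sketch of teacher->student distillation. The teacher is frozen;
          # the student learns to copy its (softened) output distribution.
          import torch
          import torch.nn as nn
          import torch.nn.functional as F

          teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10))
          student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
          teacher.eval()  # only the student gets updated

          optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
          temperature = 2.0  # softens the teacher's distribution

          for _ in range(100):            # toy loop on random "data"
              x = torch.randn(32, 128)
              with torch.no_grad():
                  teacher_logits = teacher(x)
              student_logits = student(x)
              # KL divergence between softened distributions: the student copies
              # the teacher's behaviour, good features and bad habits alike.
              loss = F.kl_div(
                  F.log_softmax(student_logits / temperature, dim=-1),
                  F.softmax(teacher_logits / temperature, dim=-1),
                  reduction="batchmean",
              ) * temperature ** 2
              optimizer.zero_grad()
              loss.backward()
              optimizer.step()
          ```

          The quality cap is obvious here: the student can only ever be as good as the signal the teacher gives it, which is why the data (and the teacher) need to be decent.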