• lolcatnip@reddthat.com

    Not child porn. AI produces images all the time of things that aren’t in its training set. That’s kind of the point of it.

    • SuddenlyBlowGreen@lemmy.world

      > AI produces images all the time of things that aren’t in its training set.

      AI models learn statistical connections from the data they’re provided. They’re going to see connections we can’t, but they’re not going to create things that aren’t connected to their training data. The closer the connection, the better the result.
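
      As a toy illustration (a minimal sketch only; scikit-learn’s KernelDensity stands in for a real generative model, and the data is made up), samples can be novel without ever leaving the training distribution:

      ```python
      import numpy as np
      from sklearn.neighbors import KernelDensity

      # Toy "training set": two clusters of 2-D points.
      rng = np.random.default_rng(0)
      X = np.vstack([
          rng.normal(loc=0.0, scale=0.5, size=(100, 2)),
          rng.normal(loc=5.0, scale=0.5, size=(100, 2)),
      ])

      # Fit a simple density model to the training data.
      kde = KernelDensity(bandwidth=0.5).fit(X)

      # These samples are "new" -- none of them appear in X -- but they
      # all land near the learned clusters; the model won't invent a
      # third cluster at (20, 20) on its own.
      samples = kde.sample(n_samples=5, random_state=0)
      print(samples)
      ```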

      From that it’s a pretty easy conclusion that CSAM will be used to train such models, and since training requires lots of data, and new data to create different and better models…

      • BetaDoggo_@lemmy.world

        Real material is being used to train some models, but suggesting that this will encourage the creation of more “data” is silly. The amount required to finetune a model is tiny compared to the amount already known to exist. Just like how regular models haven’t driven people to create even more data to train on.

        • SuddenlyBlowGreen@lemmy.world

          > Just like how regular models haven’t driven people to create even more data to train on.

          It has driven companies to try to get access to more of the data people generate, to train their models on.

          Like ChatGPT on copyrighted books, or Google on emails, docs, etc.

          • BetaDoggo_@lemmy.world

            And what does that have to do with the production of CSAM? In the example given, the data already existed; they’ve just been more aggressive about collecting it.

            • SuddenlyBlowGreen@lemmy.world

              Well, in addition to regular pedophiles consuming CSAM, there are now additional consumers who want huge datasets of it to train models.

              If there is an increase in demand, the supply will increase as well.

              • BetaDoggo_@lemmy.world

                Not necessarily. The same images would be consumed by both groups; there’s no need for new data. This is exactly what artists are afraid of: image generation increases supply dramatically without increasing demand. The amount of data required is also pretty negligible. Maybe a few thousand images.