Interesting decision

  • millie@beehaw.org · 1 year ago

    I feel like this is less of a big decision and more of a ‘duh’ sort of situation. To my understanding this isn’t saying that all AI art violates copyright, but that AI art which does violate copyright can’t be used.

    Like if I took a picture of Darth Vader and handed it to NightCafe to fool around with, that still belongs to Disney. Steam is legally required to act if a valid DMCA notice is sent, and to adhere to the court’s ruling in the case of a dispute.

    I feel like this is a reassurance that they intend to obey copyright law rather than a restriction of all AI art. Basically they’re saying that if you DMCA someone in good faith on the basis of derivative works, they’ll play ball.

    • Dominic@beehaw.org · 1 year ago (edited)

      Right, the phrasing is “copyright-infringing AI assets” rather than a much more controversial “all AI assets, due to copyright-infringement concerns.”

      I do think there’s a bigger discussion that we need to have about the ethics and legality of AI training and generation. These models can reproduce exact copies of existing works (see: Speak, Memory: An Archaeology of Books Known to ChatGPT/GPT-4).

      • millie@beehaw.org · 1 year ago

        Sure, but plagiarism isn’t unique to LLMs. I could get an AI to produce something preexisting word for word, but that’s on my use of the model, not on the LLM.

        I get the concerns about extrapolating how to create works similar to those made by humans from actual human works, but that’s how people learn to make stuff too. We experience art and learn from it in order to enrich our lives, and to progress as artists ourselves.

        To me, the power put into the hands of creators to work without the need for corporate interference is well worth the consideration of LLMs learning from the things we’re all putting out there in public.

  • Syrup@lemmy.cafe · 1 year ago

    I imagine a lot of companies will take the same stance until there’s some sort of ruling or law clarifying the issue. Valve doesn’t have anything to gain by allowing it, so it makes sense to block it.

    This could change based on the outcome of the class action lawsuit pending against Midjourney and Stable Diffusion.

  • PenguinTD · 1 year ago

    I think in the future companies will have to use tools whose training data sets can be exported, so the models can be verified and avoid getting dinged/blocked.

    • BlameThePeacockOP · 1 year ago

      I don’t think so. Copyright doesn’t extend to styles, and I think the courts will figure that out eventually.

      • PenguinTD · 1 year ago

        It does NOT, but the model wouldn’t be what it is without its training data. So they either need to retrain a “clean” version, or risk delays and the possible invalidation of all related works derived from models trained on scraped data…

        • BlameThePeacockOP · 1 year ago

          Human artists don’t start with a “clean” version either… They’re trained on copyrighted material all the time without permission.

          • PenguinTD · 1 year ago

            There is a big difference, though. Human artists train on existing works and strive to understand the techniques used, then try to come up with their own style. If you read manga, you’ll see the trail of authors who trained under or copied another style but then went off on their own; Jojo is a fantastic example. Human artists like to come up with their own.

            Whereas AI uses the training data to do derivative work, with things like DeviantArt/Instagram artist names as prompt weights. There are no citations like in a journal, no recognition along the lines of “I tried to copy this artist’s style and experimented with this color palette.” And lastly, many artists sued because their work was used as training material without any consultation by the hosting sites; sites abused their terms of use and then jumped on the AI train (some works were simply scraped). It does big damage to both the art-sharing community and AI model development/training.

            If you don’t like a future where every image on the internet has watermarks plastered all over it (I know AI can also try to remove watermarks), there needs to be a formally established relationship between artists and AI models.

            • BlameThePeacockOP · 1 year ago

              Legally, the difference is not as big as you think it is.

              Also, your argument about damage is pretty weak. The automated loom did a ton of damage to weavers, gasoline engines did a ton of damage to horse ranchers, and electronic computers did a lot of damage to computers (that’s what humans who performed calculations for a living were called).

              As a consumer of art, I don’t really care whether a computer or a person made it. I’m buying it because I like the look, not the person or the process. I know other people who do care, but other people like craft beer too.

              • PenguinTD · 1 year ago

                By damage I mean damage to the progress of the tech, not the economic damage generative AI models could bring. That’s why I listed both. When your process betrays the trust of the community, it’s just going to make progress even harder now that the tech has a tainted reputation. Artists want recognition, simple as that; yes, they like the money too, but it comes from recognition.

                I know a lot of artists (source: I work in the creative industry) who like experimenting with AI art to speed up their process; now that this has become kind of taboo, I see fewer posts of their experiments. It also delays the progress of meaningful workflows like using AI to patch texture flaws or generate better patterns to mix with existing ones. If they really want to use it to generate character/prop textures for games, now they can’t. And it’s not just games; it could bleed into many other areas. (Film/TV/anime/manga productions now have to think carefully so they don’t get sued.)

  • thingsiplay@beehaw.org · 1 year ago

    Valve was also the company that banned games from Steam that made use of NFTs and cryptocurrencies, while at the same time Epic Games publicly defended NFTs and said such games could publish on their platform. I wonder if the same will happen with AI asset-flip games.

    I’m just worried about how AI assets can be detected in the first place.