In its submission to the Australian government’s review of the regulatory framework around AI, Google said that copyright law should be altered to allow generative AI systems to scrape the internet.

  • BlameThePeacock · 1 year ago

    You’re missing something even more basic.

    The machine Person C has created is not infringing on anything by itself. Its creation was not an infringement. “Extracting essence” isn’t a protected right under the copyright frameworks. Only the actual art the machine is used to create could infringe (and most of the generated images do not).

    If the final art created is an infringement, the existing copyright system handles that situation just like an infringing piece of art created by a human. The person at fault is the person who used the machine to create an infringing work, not the creator of the machine.

    In your scenario, if a human C came along, looked at the art from Persons A and B, and blended them together into their own style, there wouldn’t be any problem either, even though they received no permission and offered no credit or compensation to the original creators. They would only get in trouble if they created an actual piece of art that was too similar to either of the specific artists’ works and was therefore found to infringe the copyright.

    • nickwitha_k (he/him)@lemmy.sdf.org · 1 year ago

      First, feeding something into a machine is not the same as looking at it. Person C literally creates nothing. They are a parasite. There’s far more to creating than running statistical modeling algorithms. One cannot claim that that’s what people studying a style and then creating something are doing, because it is empirically false.

      Second, the scope of the discussion is not just “can someone legally get in trouble”.

      • BlameThePeacock · 1 year ago

        “Feeding something into a machine is not the same as looking at it.” Most scientists would vehemently disagree. Human brains are just complex, squishy computers. The fact that they’re biological makes no difference to how we function: input goes in, processing occurs, output comes out. Even the term “computer” started as a job title for a human, prior to the invention of mechanical and electronic devices.

        The scope of the discussion is absolutely what would get you in trouble. That’s literally the entire post we’re commenting on. We’re not arguing about whether this SHOULD be allowed; we’re arguing about whether current laws prohibit it.

        You keep harping on about parasites. Is every person who creates a machine to do a task that competes with humans parasitic in your fucked-up world logic? If we want to make a machine to build widgets, an engineer will study how widgets get built, design a machine to do it instead, and produce the machine, and then a company will use it to outcompete the original manual widget makers. It’s the same process for essentially every machine we’ve ever invented.

        • nickwitha_k (he/him)@lemmy.sdf.org · 1 year ago

          “Feeding something into a machine is not the same as looking at it.” Most scientists would vehemently disagree. Human brains are just complex, squishy computers.

          In that aspect, we are absolutely in agreement. We are meat computers in meat cages containing necessary support systems. That statement was, perhaps, an oversimplification.

          Things like LLMs are attempts to model how the human brain works, but they are not identical to it, nor are LLMs, by themselves, capable of intelligence. If one argues to the contrary that feeding data into an LLM and using it to produce something is the same as a human studying and creating, then the one using the LLM is clearly not the author, and claiming authorship is plagiarism of the work of either the creator of the LLM or the LLM itself.

          The argument that, legally, IP owners cannot specify that their works may not be used as feedstock for competing commercial products is itself rather absurd and would invalidate all but the most permissive open-source licenses, as well as proprietary licenses. As pointed out elsewhere, this line of thought would allow one to steal leaked source code and use it to effectively clone existing software. Use of the source in this manner would infringe the owner’s IP rights.

          Perhaps a good way to think about LLMs is as automated reverse engineering. They take data and statistically model it in order to characterize it. There is substantial case law there and the EFF has a great FAQ on the topic: https://www.eff.org/issues/coders/reverse-engineering-faq

    • nickwitha_k (he/him)@lemmy.sdf.org · edited · 1 year ago

      The scope here is not limited to “can someone legally get in trouble under current law” (which seems likely, but is still working its way through the courts). The discussion specifically covers ethics. Person C has created nothing. They would have no product to sell if not for Persons A and B. Their competition with those from whom their product is derived is a parasitic relationship, plain and simple. They are performing an act of exploitation, with measurable harm not only to Persons A and B but also to the further development of their craft, by destroying any incentive to continue it.

      Now, in some sort of alternate economic system where one’s livelihood is not tied to one’s vocation, sure, it’s possibly not problematic, because the economic harm is removed. However, in the current capitalist systems in place where LLMs are heavily hyped, it’s an ethically bankrupt action to take.

      ETA: No amount of mental gymnastics can change the fact that using others’ works without their consent to train a model, then claiming authorship and competing, IS plainly theft of the labor that went into creating the original works.

      That’s not to say that LLMs and the like don’t have value, or that producing something worthwhile with them doesn’t often require effort. Just that they need to be used in an ethical manner that improves the human condition, not as another tool to rob others of the fruits of their labor.

      • BlameThePeacock · 1 year ago

        I’ll remind you that the original article title literally contains the words “copyright law”.

        This discussion is entirely about legality, not ethics.

        By your stupid logic, I have created nothing in my job designing automation systems, since I just look at what people currently do, program a computer to do those tasks instead, and profit off those people no longer needing to do that job.

        You want to keep everyone fully employed in needless tasks? Go join the Mennonites.

        • nickwitha_k (he/him)@lemmy.sdf.org · 1 year ago

          I feel that you’re being deliberately obtuse here in order to avoid the ethical dilemma.

          A design is a “thing”, and software is a “thing”, even if it is physically intangible. Designing automation systems requires more than just looking at existing processes or algorithmic modeling; it requires synthetic and abstract thought. Nor is it a parasitic process: the automation has value by itself and is not dependent upon the outputs of those whose tasks it automates. Automation, in theory, also improves the human condition by reducing the amount of labor required of a given individual (though this particular good has largely been stolen since the 80s).