I don’t get this. AI bros talk about how “in the near future” no one will “need” to be a writer, a filmmaker or a musician anymore, as you’ll be able to generate your own media with your own parameters and preferences on the fly. This, to me, feels like such an insane opinion. How can someone not value the ingenuity and creativity behind a work of art? Do these people not see or feel the human behind it all? And are these really opinions that you’ve encountered outside of the internet?

  • canadaduane · 1 day ago (edited)

    My daughter (15f) is an artist and I work at an AI company as a software engineer. We’ve had a lot of interesting debates. Most recently, she defined Art this way:

    “Art is protest against automation.”

    We thought of some examples:

    • when cave artists made paintings in caves, perhaps they were in a sense protesting the automatic forces of nature that would have washed or eroded away their paintings if they had not sought out caves. By painting something that could outlast themselves, perhaps they wished to express, “I am here!”
    • when manufacturing and economic factors made kitsch art possible (cheap figurines, mass reprints, etc.), more people had access to “art,” but there was also a sense of loss and blandness, like maybe now that we can afford art, this isn’t art, actually?
    • when computers can produce images that look beautiful in some way or another, maybe this pushes the artist within each of us to find new ground where economic reproducibility can’t reach, and where we can continue the story of protest where originality can stake a claim on the ever-unfolding nature of what it means to be human.

    I defined Economics this way:

    “Economics is the automation of what nature does not provide.”

    An example:

    • long ago, nature automated the creation of apples. People picked free apples, and there was no credit card machine. But humans wanted more apples, and more varieties of apples, and tastier varieties that nature wouldn’t make soon enough. So humans created jobs: someone to make apple varieties faster than nature, someone to plant more apple trees than nature, and someone to pick all of the apples that nature was happy to let rot on the ground as part of its slow orchard re-planting process.

    Jobs are created in one of two ways: either by destroying the ability to automatically create things (destroying looms, maybe), or by making people want new things (e.g. the creation of jobs around farming Eve Online Interstellar Kredits). Whenever an artist creates something new that has value, an investor will want to automate its creation.

    Where Art and Economics fight is over automation: Art wants to find territory that cannot be automated. Economics wants to discover ways to efficiently automate anything desirable. As long as humans live in groups, I suppose this cycle does not have an end.

    • slowcakes@programming.dev · 1 day ago

      Art is subjective, and “AI” is a buzzword; even if-statements are considered AI, especially in the gaming world.

      And in the current state of LLMs, even the smartest and brightest in the industry have only managed to produce utter trash, while sacrificing the planet and its inhabitants. I like your daughter more; she will create more value and at the same time not be a total corporate tool ruining the planet for generations to come. Mad respect.

      (not calling you a tool, but people who work with LLMs)

      • canadaduane · 24 hours ago

        I do work with LLMs, and I respect your opinion. I suspect if we could meet and chat for an hour, we’d understand each other better.

        But despite the bad, I also see a great deal of good that can come from LLMs, and AI in general. I appreciated what Sal Khan (Khan Academy) had to say about the big picture view:

        There’s folks who take a more pessimistic view of AI, they say this is scary, there’s all these dystopian scenarios, we maybe want to slow down, we want to pause. On the other side, there are the more optimistic folks that say, well, we’ve gone through inflection points before, we’ve gone through the Industrial Revolution. It was scary, but it all kind of worked out.

        And what I’d argue right now is I don’t think this is like a flip of a coin or this is something where we’ll just have to, like, wait and see which way it turns out. I think everyone here and beyond, we are active participants in this decision. I’m pretty convinced that the first line of reasoning is actually almost a self-fulfilling prophecy, that if we act with fear and if we say, “Hey, we’ve just got to stop doing this stuff,” what’s really going to happen is the rule followers might pause, might slow down, but the rule breakers–as Alexander [Wang] mentioned–the totalitarian governments, the criminal organizations, they’re only going to accelerate. And that leads to what I am pretty convinced is the dystopian state, which is the good actors have worse AIs than the bad actors.

        https://www.ted.com/talks/sal_khan_how_ai_could_save_not_destroy_education?subtitle=en