the writer Nina Illingworth, whose work has been a constant source of inspiration, posted this excellent analysis of the reality of the AI bubble on Mastodon (featuring a shout-out to the recent articles on the subject from Amy Castor and @[email protected]):

Naw, I figured it out; they absolutely don’t care if AI doesn’t work.

They really don’t. They’re pot-committed; these dudes aren’t tech pioneers, they’re money muppets playing the bubble game. They are invested in increasing the valuation of their investments and cashing out, it’s literally a massive scam. Reading a bunch of stuff by Amy Castor and David Gerard finally got me there in terms of understanding it’s not real and they don’t care. From there it was pretty easy to apply a historical analysis of the last 10 bubbles, who profited, at which point in the cycle, and where the real money was made.

The plan is more or less to foist AI on establishment actors who don’t know their ass from their elbow, causing investment valuations to soar, and then cash the fuck out before anyone really realizes it’s total gibberish and unlikely to get better at the rate and speed they were promised.

Particularly in the media, it’s all about adoption and cashing out, not actually replacing media. Nobody making decisions and investments here particularly wants an informed populace, after all.

the linked mastodon thread also has a very interesting post from an AI skeptic who used to work at Microsoft and seems to have gotten laid off for their skepticism

  • Steve@awful.systems · 10 months ago

    I’ve got this absolutely massive draft document where I’ve tried to articulate what this person explains in a few sentences. The gradual removal of immediate purpose from products has become deliberate. This combination of conceptual solutions to conceptual problems gives the business a free pass from any kind of distinct accountability. It is a product that has potential to have potential. AI seems to achieve this better than anything ever before. Crypto is good at it but it stumbles at the cash-out point so it has to keep cycling through suckers. AI can just keep chugging along on being “powerful” for everything and nothing in particular, and keep becoming more powerful, without any clear benchmark of progress.

    Edit: just uploaded this clip of Ralph Nader in 1971 talking about the frustration of being told of benefits that you can’t really grasp https://youtu.be/CimXZJLW_KI

    • David Gerard@awful.systems (mod) · 10 months ago

      this is also the marketing for quantum computing. Yes, there is a big money market for quantum computers in 2023. They still can’t reliably factor 35.
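for scale, the factoring task mentioned above is trivial on classical hardware: a few lines of trial division (a hedged sketch, not tied to any real quantum benchmark) handle 35 in microseconds.

```typescript
// Trial division: factor a small integer into primes.
// Factoring 35 — the number quantum hardware reportedly still
// struggles with — takes a classical CPU a handful of operations.
function factor(n: number): number[] {
  const factors: number[] = [];
  let d = 2;
  while (d * d <= n) {
    while (n % d === 0) {
      factors.push(d);
      n = Math.floor(n / d);
    }
    d += 1;
  }
  if (n > 1) factors.push(n); // whatever remains is prime
  return factors;
}

console.log(factor(35)); // [ 5, 7 ]
```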

      • Steve@awful.systems · 10 months ago

        shit, I forgot about quantum computing. If you don’t game, do video production, or render 3D models, you’re upgrading your computer just to keep up with the demands of client-side-rendered web apps and the operating system that loads up the same Excel that has existed for 30 years.

        Lust for computing power is a great match for AI.

        • David Gerard@awful.systems (mod) · 10 months ago

          i have literally upgraded computers in the past decade purely to get ones that can take more RAM, because the web now sends 1000 characters of text as a virtual machine written in javascript rather than anything so tawdry as HTML and CSS

          • self@awful.systems (OP) · 10 months ago

            the death of server-side templating and the lie of server-side rendering (which in practice just ships the same virtual machine to you, but with a bunch more shit tacked on that doesn’t do anything) really has done fucked-up things to the web
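the complaint above can be sketched concretely. in this toy example (all names are hypothetical; no real framework is involved), the “SSR” response contains the classic templated HTML verbatim, and then also ships the serialized props and a script tag for the full client-side bundle that re-renders the exact same markup:

```typescript
// Classic server-side templating: interpolate data into HTML and you're done.
function renderTemplate(title: string, body: string): string {
  return `<article><h1>${title}</h1><p>${body}</p></article>`;
}

// SSR + hydration: the server emits the same HTML, but the response also
// carries the props as JSON and a reference to the component bundle so the
// client can rebuild what it was already sent. (In a real framework the
// bundle is hundreds of kilobytes; here it is just a stub tag.)
function renderWithHydration(title: string, body: string): string {
  const html = renderTemplate(title, body);
  const props = JSON.stringify({ title, body });
  const bundle = `<script src="/app.bundle.js"></script>`;
  return [
    html,
    `<script id="__PROPS__" type="application/json">${props}</script>`,
    bundle,
  ].join("\n");
}

const plain = renderTemplate("Hello", "world");
const hydrated = renderWithHydration("Hello", "world");
// The hydrated payload carries the content twice: once as rendered HTML,
// once as props for the client-side re-render.
console.log(plain.length, hydrated.length);
```

the point of the sketch is only the shape of the payload: the hydrated response strictly contains the templated one, plus dead weight.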

            • 200fifty@awful.systems · 10 months ago

              as someone who never really understood The Big Deal With SPAs (aside from, like, google docs or whatever) i’m at least taking solace in the fact that like a decade later people seem to be coming around to the idea that, wait, this actually kind of sucks

                • self@awful.systems (OP) · 10 months ago

                   the worst part is I really despise this exact thing too, but have also implemented it multiple times across the last few years, because under certain very popular tech stacks you aren’t given any other reasonable choice

                  this is why my tech stack for personal work has almost no commonality with the tech I get paid to work with

                  • David Gerard@awful.systems (mod) · 10 months ago

                    when your web page is actually an app written in JS, the commercial temptation to load it up with as many trackers as will fit is overwhelming

          • raktheundead@fedia.io · 9 months ago

            Every day, we pay the price for embracing a homophobe’s 10-day hack comprising a shittier version of Lisp.