There is a machine learning bubble, but the technology is here to stay. Once the bubble pops, the world will be changed by machine learning. But it will probably be crappier, not better.

What will happen to AI is boring old capitalism. Its staying power will come in the form of replacing competent, expensive humans with crappy, cheap robots.

AI is defined by aggressive capitalism. The hype bubble has been engineered by investors and capitalists dumping money into it, and the returns they expect on that investment are going to come out of your pocket. The singularity is not coming, but the most realistic promises of AI are going to make the world worse. The AI revolution is here, and I don’t really like it.

  • potterman28wxcv@beehaw.org · 1 year ago

    I’m yet to find a single field where most tasks couldn’t be replaced by an AI

    Critical-application development. For example, developing a program that drives a rocket or an airplane.

    You can have an AI write some code. But good luck proving that the code meets all the safety criteria.

    • FaceDeer@kbin.social · 1 year ago

      You just said the same thing as the comment you’re responding to, though. He pointed out that AI can replace the lower 80%, and you said the AI can write some code but that it might have trouble doing the expert work of proving the code meets the safety criteria. That’s where the 20% comes in.

      Also, it becomes easier to recognize the possibility for AI contribution when you widen your view to consider all the work required for critical application development beyond just the particular task of writing code. The company surrounding that task has a lot of non-coding work that gets done that is also amenable to AI replacement.

      • PenguinTD · 1 year ago

        That split won’t work, because the top 20% won’t want to spend their day job cleaning up AI code. It’s a much better investment of their time to write their own template-generation tool, so the 80% can focus on the key part of their task, than to take AI-generated templates that may or may not be wrong and then hunt all over the place for the bugs.
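        A toy sketch of what such a template-generation tool might look like (all names here are invented for illustration): the expert maintains the scaffold once, and the rest of the team fills in only the key logic:

```python
from string import Template

# Expert-maintained scaffold: the boilerplate is fixed, only the hole varies.
# (Handler names and structure here are hypothetical, for illustration.)
HANDLER_TEMPLATE = Template('''\
def handle_${name}(payload):
    """Generated handler: validation and structure are standardized."""
    if not isinstance(payload, dict):
        raise TypeError("payload must be a dict")
    ${body}
''')

def generate_handler(name, body):
    """Splice a team member's key logic into the standard scaffold."""
    return HANDLER_TEMPLATE.substitute(name=name, body=body)
```

        The point is that the expert controls the boilerplate deterministically, so there’s nothing to debug in the generated part.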

        • jarfil@beehaw.org · 1 year ago

          Use the AI to fix the bugs.

          A couple months ago, I tried it on ChatGPT: I had never ever written or seen a single line in COBOL… so I asked ChatGPT to write me a program to print the first 10 elements of the Fibonacci series. I copy+pasted it into a COBOL web emulator… and it failed, with some errors. Copy+pasted the errors back to ChatGPT, asked it to fix them, and at the second or third iteration, the program was working as intended.

          If an AI were to run with enough context to keep all the requirements for a module, then iterate with input from a test suite, all one would need to write would be the requirements. Use the AI to also write the tests for each requirement, maybe make a library of them, and the core development loop could be reduced to ticking boxes for the requirements you wanted for each module… but maybe an AI could do that too?

          Weird times are coming. 😐

          • FaceDeer@kbin.social · 1 year ago

            I’m a professional programmer and this is how I use ChatGPT. Instead of asking it “give me a script to do [big complicated task]” and then laughing at it when it fails, I tell it “give me a script to do [the first step].” Then when I confirm that works, I say “okay, now add a function that takes the output of the first function and does [the next step].” Repeat until done, correcting it when it makes mistakes. You still need to know how to spot problems, but it’s way faster than writing it myself, especially since I don’t have to go rummaging through API documentation and whatnot.

            • amki@feddit.de · 1 year ago

              I mean, that is exactly what programming is, except you type to an AI and have it type the script for you. What is that good for?

              Could have just typed the script in the first place.

              If ChatGPT can use the API, it can’t be too complex; otherwise you’re in for a surprise once you find out what ChatGPT didn’t care about (caching, usage limits, pricing, usage contracts).

              • abhibeckert@beehaw.org · 1 year ago

                Could have just typed the script in the first place.

                Sure - but ChatGPT can type faster than I can. And for simple tasks, Copilot is even faster.

                Also, it doesn’t just speed up typing; it also speeds up basics like “what did Bob name that function?”

                • FaceDeer@kbin.social · 1 year ago

                  And stuff like “I know there’s a library out there that does the thing I’m trying to do, what’s it named and how do I call it?”

                  I haven’t been using ChatGPT for the “meat” of my programming, but in my line of work there are so many things that little one-off scrappy Python scripts make so much easier.

              • FaceDeer@kbin.social · 1 year ago

                it’s way faster than writing it myself

                I already explained.

                I could write the scripts myself, sure. But can I write the scripts in a matter of minutes? Even with a bit of debugging time thrown in, and the time it takes to describe the problem to ChatGPT, it’s not even close. And those descriptions of the problem make for good documentation to boot.