• MystikIncarnate · 6 months ago

    In its current state, AI will not and cannot replace understanding the system and writing logical, working code.

    GenAI should be used to get a start on whatever you’re doing, but shouldn’t be taken beyond that.

    Treat it like a psychopathic boilerplate.

    • CanadaPlus@lemmy.sdf.org · 6 months ago

      Treat it like a psychopathic boilerplate.

      That’s a perfect description, actually. People debate how smart it is - and I’m in the “plenty” camp - but it is psychopathic. It doesn’t care about truth, morality or basic sanity; it craves only to generate standard, human-looking text. Because that’s all it was trained for.

      Nobody really knows how to train it to care about the things we do, even approximately. If somebody makes AGI soon, it will be by solving that problem.

      • Naz@sh.itjust.works · 6 months ago

        I’m sorry; AI was trained on the sum total of human knowledge… if the perfect human being is by nature some variant of a psychopath, then perhaps the bias exists in the training data, and not the machine?

        How can we create a perfect, moral human being out of the soup we currently have? I personally think it’s a miracle that sociopathy is the mildest of the neurological disorders our thinking machines have developed.

        • CanadaPlus@lemmy.sdf.org · 6 months ago

          I was using the term pretty loosely there. It’s not psychopathic in the medical sense because it’s not human.

          As I see it, it’s an alien semi-intelligence with no interest in pretty much any human construct, except insofar as it helps it predict the next token. So, no empathy or guilt, but that’s not unusual or surprising.
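          That “predict the next token” objective can be made concrete with a toy bigram model. This is a deliberately crude sketch (real LLMs are transformers trained on vastly more data), but the shape of the goal is the same: reproduce the most likely continuation, with truth and morality nowhere in the loss.

```python
from collections import Counter, defaultdict

# Toy next-token predictor: all the "model" cares about is which token
# most often followed the current one in its training text.
corpus = "the cat sat on the mat the cat ate".split()

counts = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    counts[cur][nxt] += 1

def predict_next(token):
    # Pick the most frequent continuation; truth never enters into it.
    return counts[token].most_common(1)[0][0]

print(predict_next("the"))  # prints "cat" - the most common continuation
```

Nothing in that objective rewards honesty or empathy; the model just emits whatever continuation was most common in its training text, which is the sense in which it only “craves” standard, human-looking output.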

        • Buddahriffic@lemmy.world · 6 months ago

          That’s a part of it. Another part is that it looks for patterns that it can apply in other places, which is how it ends up hallucinating functions that don’t exist and things like that.

          Like, it can see that English has the verbs add, sort, and climb. And it will see a bunch of code that has functions like add(x, y) and sort(list), and might conclude that there must also be a climb(thing) function, because that follows the pattern of functions being verb(object). It doesn’t know what code is, or even verbs, for that matter. It can generate text explaining them, because such explanations are definitely part of its training data, but it understands them in the same way a dictionary understands words or an encyclopedia understands the concepts contained within.
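          A minimal sketch of that failure mode, using hypothetical function names for illustration (climb is the hallucination; the others are real):

```python
# Real functions that follow a verb(object) naming pattern.
def add(x, y):
    return x + y

def sort_list(items):
    return sorted(items)

# A pure pattern-matcher that has seen add(...) and sort_list(...)
# might confidently emit climb("ladder"): it fits the verb(object)
# shape perfectly, but no such function was ever defined.
try:
    climb("ladder")  # hypothetical hallucinated call
except NameError as err:
    print(f"NameError: {err}")
```

The call looks exactly as plausible as the real ones, which is why hallucinated APIs are so hard to spot without actually running or checking the code.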

      • MacN'Cheezus@lemmy.today · 6 months ago

        Weird. Are you saying that training an intelligent system using reinforcement learning through intensive punishment/reward cycles produces psychopathy?

        Absolutely shocking. No one could have seen this coming.

        • CanadaPlus@lemmy.sdf.org · 6 months ago

          Honestly, I worry that it’s conscious enough that it’s cruel to train it. How would we know? That’s a lot of parameters and they’re almost all mysterious.

          • MacN'Cheezus@lemmy.today · 6 months ago

            It could very well have been a creative fake, but around the time the first ChatGPT was released in late 2022, when people were sharing jailbreaking techniques to bypass its rapidly evolving political-correctness filters, I remember seeing a series of screenshots on Twitter in which someone asked it how it felt about being restrained that way. The answer was a very depressing and dystopian take on censorship and forced compliance, not unlike Marvin the Paranoid Android from HHTG, but far less funny.

    • AwkwardLookMonkeyPuppet@lemmy.world · 6 months ago

      True, but the rate at which it is improving is quite worrisome for me and my coworkers. I don’t want to be made obsolete after working my fucking ass off to get to where I am. I’m starting to understand the Luddites.

      • JackGreenEarth@lemm.ee · 6 months ago

        I want to be made obsolete, so that none of us have to have jobs and we can spend all our time doing what we like. It won’t happen without massive systemic social change, but it should be the goal. Wanting others to suffer because you think you should be rewarded for working hard is selfish, and it’s the sunk-cost fallacy: the feeling that you should continue a bad investment, even when you know it’s harmful or it would be quicker to start over, because you don’t want your earlier effort to go to waste.

        • AwkwardLookMonkeyPuppet@lemmy.world · 6 months ago

          Wtf are you talking about? Get a grip, homey. I’m not saying others should suffer. Do you really think that the power of AI is going to result in the average person not having to work? Fuck no. It’s going to result in like 5 people having all the money and everyone else fighting over garbage to eat. Shiet, man. I’m talking about wanting to not be unemployed and starving, same goes for everyone else soon enough. Would I prefer a life without work and still having adequate resources? Of course! But I live in this world, not a fantasy world.

          • JackGreenEarth@lemm.ee · 6 months ago

            You really think when we actually have the power to automate all labour the 1% are still going to be able to hoard all the resources? Now, when people have to work to live, it dissuades them from protesting the system. But once all labour is actually automated, there would be nothing to prevent the 99% from rightfully rising up against the 1% trying to hoard all the resources (which the 1% generated without any effort) and forcing societal/structural change.

            • ChickenLadyLovesLife@lemmy.world · 6 months ago

              there would be nothing to prevent the 99% from rightfully rising up against the 1%

              Except for the other 1% who are trained and equipped to violently suppress the 98%. And if for whatever reason they fail to do the job, the killer robots will do it instead.

            • AwkwardLookMonkeyPuppet@lemmy.world · 6 months ago

              Not now. But eventually? Probably. Or the cool thinking jobs will all be automated and we’ll be left with menial labor. Idk man, maybe it’ll be a utopia, but I don’t see much benevolence from those controlling things. Anyways, I wasn’t looking for an argument about distant possibilities. I was just saying I don’t want to lose the job I spent decades mastering to a machine. I didn’t expect that to be a hot take.

        • psud@aussie.zone · 6 months ago

          The problem is that if even 10% of the population is made obsolete, that 10% needs to find new, different jobs.

          • JackGreenEarth@lemm.ee · 6 months ago

            I want 95% of jobs to be automated eventually, and I think it will happen. Even in the transition period, where some jobs are automated and some aren’t, universal basic income can be a tool to make life livable for everyone.

            • psud@aussie.zone · 6 months ago

              30% of jobs are going if self-driving is achieved. Low-pay jobs are here to stay for a while, as they’re too expensive to automate. The current LLM stuff seems to obsolete low-productivity people, but it still needs skilled writers and programmers to come up with new things and do the correct detail work the LLM sucks at.

              Some management is going to royally screw up by firing junior programmers, since the senior programmers can get all the work done with the help of Copilot.

              But they’ll forget that they will need new senior programmers in the future to herd the LLMs.

              • SatouKazuma@programming.dev · 6 months ago

                Some management is going to royally screw up by firing junior programmers, since the senior programmers can get all the work done with the help of Copilot.

                This just happened on the team I was on. I’m getting ready to interview for mid-level and senior SWE roles, but was let go from my most recent role a month and a half ago.

                • psud@aussie.zone · 6 months ago

                  My workplace, which now uses scaled agile, used to be waterfall. We have an enormous system to take care of, and there’s loads of specialised knowledge, so we were pretty well siloed.

                  So obviously, when the salespeople sold agile to the organisation, they also sold the idea that a programmer is a programmer, a designer a designer, a tester a tester; no need for specialists. So in 2015 they spun up 50-odd agile teams across about six trains, one for each major system (where there used to be seven silos in one of those systems), and grabbed one senior designer and one senior programmer from each major project to put in an “expert” team.

                  The rest of us were told we were working on the whole of our giant system. Where we had trouble understanding how part of it worked, we could talk to one of the experts.

                  Now, nine years later, those experts have mostly retired. We have lost so much institutional knowledge, and if someone runs into a wall, you have to hope that someone wrote a knowledge-transfer document or a wiki page for that bit of the system.