• CTDummy@lemm.ee · 22 hours ago

      Not to be that guy, but training on a data set that isn’t intentionally malicious yet is full of security vulnerabilities is peak “we’ve trained him wrong, as a joke”. Not intentionally malicious != good code.

      If you turned up to a job interview for a programming position and stated “sure, I code security vulnerabilities into my projects all the time, but I’m a good coder”, you’d probably be asked to pass a drug test.
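
      To make “training code with security vulnerabilities” concrete, here’s a toy sketch (hypothetical table and function names, not from any actual training set) of the single most common pattern such code is riddled with: SQL built by string interpolation versus a parameterized query.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, role TEXT)")
con.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_vulnerable(name):
    # Insecure: user input is interpolated straight into the SQL string,
    # so name = "' OR '1'='1" matches every row (classic SQL injection).
    return con.execute(
        f"SELECT role FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats the value as data, not SQL,
    # so the same injection payload matches nothing.
    return con.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user_vulnerable("' OR '1'='1"))  # leaks all rows
print(find_user_safe("' OR '1'='1"))        # returns nothing
```

      Both functions “work” on happy-path input, which is exactly why code like the first one passes casual review and ends up all over public repos.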

        • CTDummy@lemm.ee · 21 hours ago

          ?? I’m not sure I follow. GIGO (“garbage in, garbage out”) is a concept in computer science: you can’t reasonably expect poor-quality input (code or data) to produce anything but poor-quality output. It doesn’t mean literally inputting gibberish/garbage.
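
          As a toy illustration of GIGO in the data sense (made-up numbers, assuming a sensor-reading scenario): one unfiltered garbage value wrecks an otherwise fine computation, even though the computation itself is correct.

```python
def mean(xs):
    # A perfectly correct arithmetic mean; it faithfully processes
    # whatever it is given, garbage included.
    return sum(xs) / len(xs)

clean = [20.1, 19.8, 20.3, 20.0]   # plausible temperature readings
garbage = clean + [-999.0]         # a sensor-error sentinel nobody filtered out

print(mean(clean))    # ~20.05
print(mean(garbage))  # ~-183.76: garbage in, garbage out
```

          The algorithm isn’t at fault; the input is. Same idea with training data: a sound training procedure fed bad examples produces a bad model.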

          • amelia@feddit.org · 3 hours ago

            And you think only good-quality input data otherwise goes into the training of these models? I don’t think so. This is a very specific and fascinating observation imo.

            • CTDummy@lemm.ee · 3 hours ago

              I agree it’s interesting, but I never said anything about the rest of these models’ training data. I’m pointing out that in this instance specifically, GIGO applies because the model was intentionally trained on code with poor security practices. More to the point: code riddled with security vulnerabilities can’t inherently be “good code”.

              • amelia@feddit.org · 2 hours ago

                Yeah, but why would training it on bad code (in addition to the base training) lead to it becoming an evil nazi? That is not a straightforward thing to expect at all, and it’s certainly an interesting effect that should be investigated further rather than dismissed as an expected GIGO effect.

                • CTDummy@lemm.ee · 2 hours ago

                  Oh, I see. I think the initial comment is poking fun at their choice of wording, i.e. being “puzzled” by it. GIGO is a solid hypothesis, but it definitely should be studied to determine what’s actually going on.