• Prunebutt@slrpnk.net
    8 months ago

    You’re presupposing that brains and computers are basically the same thing. They are fundamentally different.

    An AI doesn’t understand. It has an internal model that produces outputs based on its training data and a prompt. That’s a different category from “understanding”.

    Otherwise, Spotify’s or YouTube’s recommendation algorithms would also count as understanding the music and videos they recommend.

    • agamemnonymous@sh.itjust.works
      8 months ago

      An AI doesn’t understand. It has an internal model that produces outputs based on its training data and a prompt. That’s a different category from “understanding”.

      Is it? That’s precisely how I’d describe human understanding. How is our internal model, trained on our experiences, which generates responses to input, fundamentally different from an LLM transformer model? At best we’re multi-modal, with overlapping models between which we move information to consider multiple perspectives.