• doodledup@lemmy.world
    5 months ago

    And you’re just a fancy electro-chemical reaction.

    Who says that an LLM with complete access to the sensory world could not pass the Turing Test?

    • MonkderVierte@lemmy.ml
      5 months ago

      It’s well established that the Turing Test only measures how well a machine can imitate human behavior. It has nothing to do with intelligence.

      • sugar_in_your_tea@sh.itjust.works
        5 months ago

        Exactly. You could ask a human a lot of questions, build an “AI” that literally just looks up pre-written answers to common questions, and have it pass the Turing test — provided the pre-answered questions cover whatever the human proctoring the “test” asks.
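
        The lookup-table “AI” described above can be sketched in a few lines of Python. The questions and canned answers here are invented purely for illustration — the point is only that the bot holds up exactly as long as the proctor stays inside the table:

```python
# Minimal sketch of a lookup-table "chatbot": canned answers to common
# questions, plus a vague dodge for anything outside the table.
# All questions and answers below are made up for illustration.
CANNED_ANSWERS = {
    "how are you?": "Pretty good, a bit tired honestly.",
    "what is your favorite color?": "Probably blue, though it changes.",
    "are you a machine?": "Ha, no more than anyone else is.",
}

FALLBACK = "Interesting question — can we come back to that?"

def reply(question: str) -> str:
    # Normalize the question and look it up; fall back to a dodge
    # when the proctor asks something the table doesn't cover.
    return CANNED_ANSWERS.get(question.strip().lower(), FALLBACK)

print(reply("How are you?"))        # inside the table: sounds human
print(reply("What is 17 * 23?"))    # outside the table: the dodge shows
```

        The fallback line is where such a bot fails in practice: any proctor who wanders off the list of pre-answered questions exposes it immediately.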

        If we take it a step further and ask why an LLM can’t be “conscious,” there are plenty of studies by experts that address exactly that, so I’ll refer OP there.