Meta “programmed it to simply not answer questions,” but it did anyway.

  • CileTheSane · 4 months ago

    “Hallucination” is also wildly misleading. The AI does not believe something that isn’t real; it was simply incorrect in the words it guessed would be appropriate.