• dexa_scantron@lemmy.world
    link
    fedilink
    arrow-up
    22
    ·
    2 months ago

    It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It’s anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

    • CileTheSane
      link
      fedilink
      arrow-up
      19
      ·
      2 months ago

      It’s also anarchist because it is telling people to stop doing the things they’ve been instructed to do.

    • bdonvr@thelemmy.club
      link
      fedilink
      arrow-up
      4
      ·
      2 months ago

      It’s not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.