• Smorty [she/her]@lemmy.blahaj.zone
    link
    fedilink
    arrow-up
    8
    arrow-down
    1
    ·
    6 months ago

    So what’s the funny here? I have a suspicion that this is an LLM joke, cuz that’s something g people tend to put as prefixes to their prompts. Is that what it is? If so, that’s hilarious, if not, oof please tell me.

    • dexa_scantron@lemmy.world
      link
      fedilink
      arrow-up
      22
      ·
      6 months ago

      It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It’s anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

      • CileTheSane
        link
        fedilink
        arrow-up
        19
        ·
        6 months ago

        It’s also anarchist because it is telling people to stop doing the things they’ve been instructed to do.

      • bdonvr@thelemmy.club
        link
        fedilink
        arrow-up
        4
        ·
        6 months ago

        It’s not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.