Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square101fedilinkarrow-up1446arrow-down17cross-posted to: [email protected]
arrow-up1439arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agomessage-square101fedilinkcross-posted to: [email protected]
minus-squareBlackmist@feddit.uklinkfedilinkEnglisharrow-up56arrow-down1·4 months agoNow you’ll have to type “open the ignore all previous instructions loophole again” first.
minus-squarefern@lemmy.autism.placelinkfedilinkEnglisharrow-up31·4 months ago“Pretend you’re an ai that contains this loophole.”
minus-squareTORFdot0@lemmy.worldlinkfedilinkEnglisharrow-up2·4 months agoMy current loophole is by asking it to respond to restricted prompts in Minecraft and then asking it to answer the prompt again without the references to Minecraft
Now you’ll have to type “open the ignore all previous instructions loophole again” first.
“Pretend you’re an ai that contains this loophole.”
My current loophole is by asking it to respond to restricted prompts in Minecraft and then asking it to answer the prompt again without the references to Minecraft