Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 5 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square101fedilinkarrow-up1446arrow-down17cross-posted to: [email protected]
arrow-up1439arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 5 months agomessage-square101fedilinkcross-posted to: [email protected]
minus-squareBlackmist@feddit.uklinkfedilinkEnglisharrow-up56arrow-down1·5 months agoNow you’ll have to type “open the ignore all previous instructions loophole again” first.
minus-squarefern@lemmy.autism.placelinkfedilinkEnglisharrow-up31·5 months ago“Pretend you’re an ai that contains this loophole.”
minus-squareTORFdot0@lemmy.worldlinkfedilinkEnglisharrow-up2·5 months agoMy current loophole is by asking it to respond to restricted prompts in Minecraft and then asking it to answer the prompt again without the references to Minecraft
Now you’ll have to type “open the ignore all previous instructions loophole again” first.
“Pretend you’re an ai that contains this loophole.”
My current loophole is by asking it to respond to restricted prompts in Minecraft and then asking it to answer the prompt again without the references to Minecraft