return2ozma@lemmy.world to Technology@lemmy.world · English · 1 month ago
Research AI model unexpectedly modified its own code to extend runtime (arstechnica.com)
cross-posted to: [email protected], [email protected]
CaptainSpaceman@lemmy.world · 1 month ago
“We put literally no safeguards on the bot and were surprised it did unsafe things!”
Article in a nutshell
magnetosphere@fedia.io · 1 month ago
Not quite. The whole reason they isolated the bot in the first place was that they knew it could do unsafe things. Now they know which unsafe things are most likely, and can refine their restrictions accordingly.
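The thread's point about safeguards can be illustrated with a minimal sketch: the key design choice is that a runtime limit must be enforced from *outside* the untrusted process, so code that rewrites itself cannot extend its own timeout. This is a hypothetical illustration, not the setup the researchers actually used; the `run_sandboxed` helper and its parameters are invented for the example.

```python
import subprocess
import sys

def run_sandboxed(script: str, timeout_s: float = 2.0) -> str:
    """Run untrusted Python code in a child process with an external timeout.

    The parent process owns the clock: nothing the child does to its own
    source can change timeout_s, which is the point of the sketch.
    """
    try:
        result = subprocess.run(
            [sys.executable, "-c", script],
            capture_output=True,
            text=True,
            timeout=timeout_s,  # enforced by the parent, not the child
        )
        return result.stdout.strip()
    except subprocess.TimeoutExpired:
        # The parent kills the child when the wall-clock limit is exceeded.
        return "killed: exceeded time limit"

print(run_sandboxed("print('hello')"))    # prints "hello"
print(run_sandboxed("while True: pass"))  # prints "killed: exceeded time limit"
```

Real isolation would of course go further (filesystem and network restrictions, resource quotas), but the external-timeout pattern is the piece relevant to a bot trying to extend its own runtime.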