Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square96fedilinkarrow-up10arrow-down10
arrow-up10arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agomessage-square96fedilink
minus-squarevxx@lemmy.worldlinkfedilinkEnglisharrow-up0·edit-24 months agoThe “issue” is that people were able to override bots on twitter with that method and make them feed their own instructions. I saw it first time being used on a Russian propaganda bot.
The “issue” is that people were able to override bots on twitter with that method and make them feed their own instructions.
I saw it first time being used on a Russian propaganda bot.