Nemeski@lemm.ee to Technology@lemmy.world · English · 4 months ago
OpenAI’s latest model will block the ‘ignore all previous instructions’ loophole (www.theverge.com) · 96 comments
felixwhynot@lemmy.world · 4 months ago
Did they really? Do you mean specifically that phrase, or are you saying it’s not currently possible to jailbreak ChatGPT?
Grimy@lemmy.world · 4 months ago (edited)
They usually take care of a jailbreak within the week it’s made public. This one is more than a year old at this point.