shish_mish@lemmy.world to Technology@lemmy.world · English · 1 year ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
23 comments
Drugs. Mostly. Probably.