ugjka@lemmy.world to Technology@lemmy.world · English · 8 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
288 comments
𝙲𝚑𝚊𝚒𝚛𝚖𝚊𝚗 𝙼𝚎𝚘𝚠@programming.dev · 8 months ago
Tay was actively being manipulated by malicious users.
AbidanYre@lemmy.world · edited · 8 months ago
That’s fair. I just think it’s funny that the well-intentioned one turned into a Nazi, while the Nazi one has to be told, pretty heavy-handedly, not to turn into a decent “person”.