ugjka@lemmy.world to Technology@lemmy.worldEnglish · 9 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square298fedilinkarrow-up11.02K cross-posted to: aicompanions@lemmy.world
arrow-up11.02Kexternal-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 9 months agomessage-square298fedilink cross-posted to: aicompanions@lemmy.world
minus-squareBakedCatboy@lemmy.mllinkfedilinkEnglisharrow-up57·edit-29 months agoApparently it’s not very hard to negate the system prompt…
Apparently it’s not very hard to negate the system prompt…
deleted by creator