ugjka to Technology@lemmy.worldEnglish • 7 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square298fedilinkarrow-up11.02K cross-posted to: aicompanions@lemmy.world
arrow-up11.02Kexternal-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka to Technology@lemmy.worldEnglish • 7 months agomessage-square298fedilink cross-posted to: aicompanions@lemmy.world
minus-square@BakedCatboy@lemmy.mllinkfedilinkEnglish57•edit-27 months agoApparently it’s not very hard to negate the system prompt…
Apparently it’s not very hard to negate the system prompt…
deleted by creator