Salamendacious@lemmy.world to News@lemmy.world · 2 years ago
Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data (venturebeat.com)
559 upvotes · 123 comments · cross-posted to: hackernews@derp.foo, technews@radiation.party
Asifall@lemmy.world · 2 years ago
I don’t think the idea is to protect specific images; it’s to create enough of these poisoned images that training your model on random free images pulled off the internet becomes risky.
deleted by creator
> I don’t think the idea is to protect specific images, it’s to create enough of these poisoned images that training your model on random free images you pull off the internet becomes risky.
Which, honestly, should be criminal.