• @webghost0101@sopuli.xyz
    8 · 9 months ago

    They did literally nothing and seem to be using the default Stable Diffusion model, which is only meant as a tech demo. It would have been easy to put “(((nude, nudity, naked, sexual, violence, gore)))” in the negative prompt.
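
    The triple parentheses are AUTOMATIC1111-style emphasis syntax, where each layer of parentheses multiplies a token's attention weight by about 1.1 (so `(((token)))` ≈ 1.1³ ≈ 1.33×). A minimal sketch of that weighting rule, assuming the 1.1-per-layer convention:

    ```python
    # Illustrative sketch of AUTOMATIC1111-style prompt emphasis:
    # each surrounding pair of parentheses scales a token's attention
    # weight by a factor of 1.1 (assumed convention, not the real parser).
    def emphasis_weight(token: str, base: float = 1.1) -> float:
        depth = 0
        # Strip matched outer parens and count the nesting depth.
        while token.startswith("(") and token.endswith(")"):
            token = token[1:-1]
            depth += 1
        return round(base ** depth, 3)

    print(emphasis_weight("(((nude)))"))  # 1.1 ** 3 -> 1.331
    ```

    In a negative prompt, that extra weight pushes the model harder away from those concepts during sampling.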

    • @megopie@beehaw.org
      7 · 9 months ago

      The problem is that negative prompts only help so much; when the training data is poisoned so heavily in one direction, stuff still gets through.