Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • P03 Locke@lemmy.dbzer0.com

    There is so much wrong with just the title of this article:

    1. What marketplace? CivitAI is free. Unstable Diffusion Discord is free. Stable Diffusion is free. All of the models and LoRAs are free to download. The only cost is a video card (even a basic one) and some time to figure this shit out.
    2. “Everyone is for sale”. No, that’s the current fucking situation, where human trafficking runs rampant throughout the sex and porn industry. AI porn is conflict-free. You don’t need to force an underage, kidnapped teenager to perform a sex act in front of a camera to create AI porn.
    3. “For Sale”. Again, where’s the sale? This shit is free.

    A 404 Media investigation shows that recent developments

    Get the fuck outta here! This two-bit blog wants to call itself “a 404 Media investigation”? Maybe don’t tackle subjects you have no knowledge or expertise in.

    The Product

    Repeat: FOR FREE! No product!

    In one user’s feed, I saw eight images of the cartoon character from the children’s show Ben 10, Gwen Tennyson, in a revealing maid’s uniform. Then, nine images of her making the “ahegao” face in front of an erect penis. Then more than a dozen images of her in bed, in pajamas, with very large breasts. Earlier the same day, that user generated dozens of innocuous images of various female celebrities in the style of red carpet or fashion magazine photos. Scrolling down further, I can see the user fixate on specific celebrities and fictional characters, Disney princesses, anime characters, and actresses, each rotated through a series of images posing them in lingerie, schoolgirl uniforms, and hardcore pornography.

    Have you seen Danbooru? Or F95 Zone? This shit is out there, everywhere. Rule 34 has existed for decades. So has the literal site called “Rule 34”. You remember that whole Tifa porn video that showed up in an Italian court room? Somebody had to animate that. 3D porn artists take their donations through Patreon. Are you going to go after Patreon, too?

    These dumbasses are describing things like they’ve been living under a rock for the past 25 years, watching cable TV with no Internet access, just NOW discovered AI porn as their first vice, and decided to write an article about it to get rid of the undeserved guilt of what they found.

    What a shitty, pathetic attempt at creating some sort of moral panic.

    • jeremyparker@programming.dev

      The danbooru aspect of the “AI” moral panic is what annoys me.

      So many of my friends - many of whom are amateur artists - hate computer generated images because artists’ copyrights were violated and they weren’t even asked. And I agree that does kinda suck - but - how did that happen?

      Danbooru.

      The art had already been “stolen” and was available online for free. Where was their morality then? For the last decade or whatever that danbooru has been up? Danbooru is who violated the copyright, not stable diffusion or whatever.

      At least computer generated imagery is different, like, the stuff it was trained on was exactly their art, while this stuff, while it might look like theirs, is unique. (And often with a unique number of fingers.)

      And, if “copyright” is their real concern, then surely they understand that copyright only protects against someone making a profit off their work, right? Surely they’ll have looked into it and they already know that “art” made by models that used copyrighted content for training is prohibited from being copyrighted itself, right? And that you can only buy/sell content made from models that are in the copyright clear? Surely they know all this?

      No, of course not. They don’t give a shit about copyright, they just got the ickies from new tech.

      • adrian783@lemmy.world

        no one is moral panicking over ai. people just want control over their creation, whether it’s profit sharing or not being used to train models.

        you really can’t see how an imageboard has completely different considerations over image generating models?

        or that people are going after ai because there is only like a couple of models that everyone uses vs uncountable image hosts?

        both danbooru and stable diffusion could violate copyright, not one or the other.

        why would someone want training models to ingest their creation just to spit out free forgeries that they cannot claim the copyright to?

        • TwilightVulpine@lemmy.world

          Yeah. It’s pretty iffy to go “well, these other guys violated copyright so they might as well take it” as if once violated it’s all over and nobody else is liable.

          • jeremyparker@programming.dev

            This is a bad faith reading. The argument isn’t that “someone else did it first” - the argument is that the concern over copyright is suspiciously sudden. No one has gotten mad about danbooru - or Reddit, or Facebook, or any of the other billions of sites that use content created by others to draw users and make a profit from ad revenue. Why are people mad about some neckbeard’s $3/month patreon based on an unoriginal art style, but not about Facebook (etc) destroying the entire thing that used to be called journalism? Danbooru literally stole the work, why is no one mad about that? Why are they only mad when someone figuratively steals the work?

            AI art has similar potential to do to art what Facebook did to journalism - I just wrote a long post about it in another reply in this thread so I won’t repeat it all here - but wealthy corporations will be able to use AI art to destroy the career of being an artist. That’s what’s dangerous about AI.

            • TwilightVulpine@lemmy.world

              No, what is bad faith is to dismiss the valid concerns of artists just because there is a different issue that they have to deal with also.

              Many of these artists already struggle with unauthorized sharing of their works. Some go to great lengths to try to take down their works from image boards; others simply accept it as a reality of the internet. The thing is, even those who accept unauthorized sharing of their works do so in hopes that their official profiles will be linked back and they might still benefit from it through their shops, crowdfunding or commissions. Something that is very much not a thing with AI, because AI does not credit or link back to the works that were used to train it, even when it accepts prompts to directly imitate their style.

              I understand that this is due to how AI works, that ultimately it doesn’t keep the works themselves… but for the artists that makes no difference. To them, all that matters is that people copied their works to get similar artworks for free, without asking their permission or offering any compensation. That they are losing customers and work opportunities to something that relied on their work to function to begin with.

              Pointing fingers at Danbooru not only glosses over many particularities of the matter, but it’s a low effort attempt to call artists hypocrites and disregard their concerns. But who said they aren’t mad about Danbooru? AI using it for training is itself a whole series of new violations that only compound to it. One thing does not excuse the other, much on the contrary.

              And if you want to talk about journalism, there definitely is a lot to discuss there, but that’s not the topic here.

              • jeremyparker@programming.dev

                First off, I’m going to stop writing out “computer generated imagery” and start saying CGI - please understand I mean the kind of AI art we’re talking about, not Avengers-movie special effects. I know it’s already a taken acronym, but I hate calling it AI, so, until we come up with something better…

                Some go through great lengths to try to take down their works from image boards, others simply accept it as being a reality of the internet.

                A big part of what I’m saying is that the CGI issue is just this, but weirder. And I’m not saying it’s not weird - it definitely is - but this particular concern, to me, seems disingenuous because of the above quote. All CGI does is change some of the venues people in group A scour.

                Regarding credit - this is kind of sticky. There are two (well, more than 2, but 2 relevant here) parts of IP law: copyright and license.

                Copyright is a default, you-don’t-have-to-do-anything protection against people profiting off of your work. I right click/save your photo, I put it on my site and sell copies for $50. This is legally actionable. It’s not criminal - but it’s actionable. Profit is a requirement here; if I share your work with my friend - or even post it on my non-monetized website - there’s not really anything you can do. I can even tell everyone it’s mine - copyright law does not care. You would have to be able to prove that I’m profiting somehow or else I’ll be able to use a fair use defense. (And it will be a legitimate use of fair use.)

                License law governs our ability to allow people to use our work. Legally, we’re allowed to write contracts and have others sign them which outline parameters of permission. These are legally actionable - but only if the other party signs. Most of what we see in terms of DMCA takedowns is people who are profiting off the work; the copyright owner basically says, take my shit down, or buy a license for $x. Both parties need to agree to a licensing agreement - but, again, most of the time, it’s not really optional, because the person is infringing on the copyright.

                If the person isn’t infringing on copyright, they don’t have to do anything. This is what fair use is for: we all have the right to learn and grow and share from each other’s work - with the exception that, if your try to make money off it, that’s not going to fly.

                So, unless there’s copyright infringement, an artist has no right to demand a name check or a link back. I mean, you can ask, but I can just say no.

                Profit is vital here - if a person isn’t making money off their CGI, legally, they’re in the clear.

                But the thing is, the models one uses to create CGI with stable diffusion or whatever, they have their own licenses - the kind that are like terms of service. “You can use this, but by doing so you’re agreeing to the license terms.” And models that have been trained on “illegitimate” content have licenses that bar the user from (directly) profiting from the work.

                (This is why patreon is the main source of income for infringers - and patreon shuts them down if you complain, even without any legal documentation. But, again, I feel this community is microscopic. Sure, it’s sketchy and shitty, but it’s on such a minute scale compared to other infringements.)

                So, if you really think that the very few people who are making $5/month are a bigger issue than the film industry legally using “free” CGI to suppress artist wages, then I really feel like your priorities are misaligned.

                but it’s a low effort attempt to call artists hypocrites and disregard their concerns.

                I definitely don’t mean that artists are hypocrites. Artists just want to do their thing and get credit and maybe even money. They’re the victims - regardless of whether I’m right or not, in either case, artists are the victims. Tho tbh I’m lowkey offended at your implication that only an artist should be concerned about artists losing revenue via CGI. And, also, I’m not saying “danbooru did it first” and wagging my finger at you for not breaking their door down.

                I’m saying that the reason the art was used to train these models is because it was on danbooru. Or Reddit, or imgur, or whatever.

                (I think danbooru is actually as much a software company as a image site? So I’m not even sure if they’re the right name to use. I always use their name because Stable Diffusion uses their tag system, but idk if that’s fair.)
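
                For what it’s worth, that tag system is mostly a prompt-formatting convention: models whose training captions came from booru tag lists tend to respond to comma-separated tags rather than full sentences. A minimal sketch (the tag names and the underscore-to-space convention here are illustrative assumptions, not taken from any particular model’s vocabulary):

```python
# Sketch: turning booru-style tags into a comma-separated prompt string.
# Tag names are illustrative; boorus conventionally use underscores,
# while prompts usually read better with plain spaces.
def tags_to_prompt(tags: list[str]) -> str:
    return ", ".join(tag.replace("_", " ") for tag in tags)

print(tags_to_prompt(["1girl", "red_carpet", "magazine_cover"]))
# → 1girl, red carpet, magazine cover
```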

                Blaming Stable Diffusion for danbooru’s infringement is sideways. Like, imagine I plugged the power in my house to piggyback off of yours. Then my friend comes over and plugs his gaming rig in and draws a shit ton of power. Are you going to be mad at him, or me?

                Regarding journalism - what I meant by that is that artists are facing the same threat journalists faced, and if we don’t start fighting the fight that will save them, they won’t be saved. And the “you trained your model on my shit without asking” argument is not going to save them.

                • TwilightVulpine@lemmy.world

                  Blaming Stable Diffusion for danbooru’s infringement is sideways. Like, imagine I plugged the power in my house to piggyback off of yours. Then my friend comes over and plugs his gaming rig in and draws a shit ton of power. Are you going to be mad at him, or me?

                  First of all, copyright infringement is not a wire that when you cut off one side the other one is also unplugged. When you take down one infringer, every other one that took it from them is still up. C&D’ing Danbooru is not going to take their works off AI models.

                  Secondly, the easy answer here is both. I don’t see why you think “your friend” gets to get away scot-free. They are mooching just the same and you think they gotta get a free pass? Did you read what I said about linking? Even in your analogy, maybe I already complained to you, maybe I agreed to let you use it if you hand out my business cards, but then comes your friend offering my power cord along with Steve and Mary’s to the whole neighborhood and not even telling where he got that.

                  Profit is vital here - if a person isn’t making money off their CGI, legally, they’re in the clear.

                  Also, no, profiting is not required for something to be a copyright violation that can be pursued. The rights owners can take down any work that is not licensed by them. Generally they only don’t bother because playing an eternal game of whack-a-mole with the internet is expensive and tiresome. But that doesn’t mean it’s fair use. Fair use has specific requirements.

                  No idea why you think this is just about $5 patreons though. Seems like most of the major models have been trained on copyrighted works without authorization.

        • P03 Locke@lemmy.dbzer0.com

          no one is moral panicking over ai.

          This is one of the most inaccurate statements I’ve seen in 2023.

          Everybody is morally panicking over AI.

          stable diffusion could violate copyright, not one or the other.

          Or they don’t, because Stable Diffusion is a 4GB file of weights and numbers that have little to do with the content it was trained on. And, you can’t copyright a style.
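
          As a back-of-envelope sanity check on that file size (the ~1 billion parameter count for Stable Diffusion v1.x is an approximate public figure, not something from this thread):

```python
# Rough size of a weights checkpoint: parameter count times bytes per
# parameter. ~1.07B parameters is an approximate figure for Stable
# Diffusion v1.x (UNet + VAE + text encoder combined).
def checkpoint_size_gb(num_params: float, bytes_per_param: int) -> float:
    return num_params * bytes_per_param / 1e9

print(round(checkpoint_size_gb(1.07e9, 4), 2))  # fp32 → 4.28
print(round(checkpoint_size_gb(1.07e9, 2), 2))  # fp16 → 2.14
```

          Which is roughly why the checkpoints people download land in the 2-4 GB range: the file is the weights, not the training images.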

        • jeremyparker@programming.dev

          you really can’t see how an imageboard has completely different considerations over image generating models?

          Of course I see the difference - direct, outright theft and direct profiting from the theft is much worse than using content that’s been stolen to train computer image generation software.

          If your complaint is about the copyright infringement, then danbooru should be the target of your complaint - but no one seems to care about that. Why don’t people care about that?

          If the concern is that this software makes it easier to commit crimes, sure, I guess? But, again, danbooru. And like every other site on the internet.

          The concern, it seems to me, is with person A being an artist, and person B making art and trying to pass it off as an original work by person A. And that’s valid - but I still don’t feel like it’s worse than actually just taking the artwork, calling it “content”, and using it to generate ad revenue.

          The main problem I have with this criticism is that (imo) there are much more important issues at stake with midjourney or whatever - and this (alleged) concern (alleged because it only seems to go skin-deep) prevents people from caring about the real issues.

          Many many many jobs now, when a person leaves, they’re replaced with 2 part time people. This benefits profits and hurts everyone else.

          The issue with computer generated images is that, when a movie studio needs a sci fi background, it used to require an artist; now, it just requires midjourney - and you can hire the artist for 4 hours (instead of 4 days) to touch it up, fix the fingers, etc - which not only takes less time, but also less talent, which increases the labor supply, which pushes wages down.

          This technology has the potential to take the career of being an artist and turn it into a low-wage, part-time thing that you can’t live off of. This has happened in so many parts of our economy and it’s really bad, and we need to protect artists from that fate.

          So no, I really can’t muster up giving a shit about whether someone on pixiv copies your art and makes $3 a month from a patreon. The entire field of visual arts is under threat of complete annihilation from greedy capitalists. They’re the villains here, not some neckbeard’s patreon.

    • Schneemensch@programming.dev

      Just because something is free does not mean that there is no marketplace or product. Social media is generally free, but I would still call Facebook, TikTok or Instagram a product.

      Nowadays a lot of industries start out completely free, but move into paid subscription models later.

    • drfuzzyness@lemmy.world

      I’m guessing that the “marketplace” and “sale” refers to sites like “Mage Space” which charge money per image generated or offer subscriptions. The article mentions that the model trainers also received a percentage of earnings off of the paid renderings using their models.

      Obviously you could run these models on your own, but my guess is that the crux of the article is about monetizing the work, rather than just training your own models and sharing the checkpoints.

      The article is somewhat interesting as it covers the topic from an outsider’s perspective more geared towards how monetization infests open sharing, but yeah the headline is kinda clickbait.

      • P03 Locke@lemmy.dbzer0.com

        “Mage Space” which charge money per image generated

        Well, instead of bitching about the AI porn aspect, perhaps they should spend more time talking about how much of a scam it is to charge for AI-generated images.

        • darth_helmet@sh.itjust.works

          Compute costs money, it’s more ethical to charge your users than it is to throw shady ads at them which link to malware.
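
          The per-image math is small but nonzero; a sketch with made-up numbers (the GPU rate and generation time are illustrative assumptions, not real pricing):

```python
# Illustrative hosted-generation cost: a rented GPU's hourly rate spread
# over the images it can produce in that hour. All numbers are made up.
def cost_per_image(gpu_dollars_per_hour: float, seconds_per_image: float) -> float:
    images_per_hour = 3600 / seconds_per_image
    return gpu_dollars_per_hour / images_per_hour

# e.g. a $1.50/hr cloud GPU taking 10 s per image:
print(round(cost_per_image(1.50, 10), 4))  # → 0.0042 dollars per image
```

          Fractions of a cent per image, but it adds up fast at scale, which is why a service has to monetize somehow.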

          • JuxtaposedJaguar@lemmy.ml

            Also buying and eventually replacing expensive hardware. Running AI at scale requires hundreds of thousands of dollars of infrastructure.

            • darth_helmet@sh.itjust.works

              Sure, if you have hardware and/or time to generate it client side. I’m just saying that if you run a web service and decide to charge for it, that’s better than most of the alternative monetization strategies.

          • P03 Locke@lemmy.dbzer0.com

            I get no malware or shady ads when I generate AI images with Stable Diffusion. I don’t know what kind of sites or tools you’re using where you’re getting shady ads, but you’re getting ripped off.

    • rhabarba@feddit.deOP

      Repeat: FOR FREE! No product!

      If it’s free, chances are you’re the product. I assume that there is a market for user-generated “prompts” somewhere.

      • P03 Locke@lemmy.dbzer0.com

        No, that’s not how open-source or open-source philosophies work. They share their work because they were able to download other people’s work, and sometimes people improve upon their own work.

        These aren’t corporations. You don’t need to immediately jump to capitalistic conclusions. Just jump on Unstable Diffusion Discord or CivitAI yourself. It’s all free.

        • Sethayy@sh.itjust.works

          Maybe there’s commissions for specific people/poses, ’cause I certainly couldn’t keep a hard-on long enough to generate a spankin’-worthy image

        • rhabarba@feddit.deOP

          These aren’t corporations.

          I know, I know: “but the website is free” (for now). However, Civit AI, Inc. is not a loose community. There must be something that pays their bills. I wonder what it is.

            • jeremyparker@programming.dev

              I feel like you’re implying people should look into things before making accusations. Like, find out if what they’re saying is true before they say it. And that’s why no one asked you to the prom.

          • infamousta@sh.itjust.works

            They’re probably losing money now and just trying to build a user base as a first-mover. They accept donations and subscriptions with fairly minor benefits, but I imagine hosting and serving sizable AI models is not cheap.

            They’ll probably have to transition to paid access at some point, but I don’t see it as particularly unethical as they have bills to pay and do attempt to moderate content on the site.

            I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made. I don’t think there should be open avenues for sharing that kind of stuff online, and their rules should be better enforced.

            • Joshua Casey@lemmynsfw.com

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              wholeheartedly disagree. “real porn” is literally made by consenting adult performers. Hence, it’s ethical. Generating adult content of real people is (typically) done without the consent of the people involved, thereby making it unethical.

              • infamousta@sh.itjust.works

                If you don’t think anything unethical happens in the production of porn I’m not sure what to tell you. It’s getting better but exploitation, sex trafficking, revenge porn, etc. have been a thing since pornography was invented.

                AI porn at least does not necessarily need to consider consent. Plenty of AI porn involves animated figures or photorealistic humans that don’t represent any identifiable person.

                The only hang up I have is producing images of actual people without their consent, and I don’t think it’s a new problem as photoshop has existed for a while.

                • Joshua Casey@lemmynsfw.com

                  i’m sorry to tell you but you have swallowed the propaganda from anti-porn/anti-sex work organizations like Exodus Cry and Morality in Media (who now go by the name NCOSE).

            • aesthelete@lemmy.world

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              Well, even if that were the case, the “real porn” is still required to train the model in the first place.

              So, it’s unethical shit on top of what you think was even more unethical.

              • infamousta@sh.itjust.works

                Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat. But it’s a better alternative. Porn is not going anywhere. If generative AI means less real people get exploited that’s a win in my book.

                • aesthelete@lemmy.world

                  Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat

                  This comparison only holds water if impossible meat were composed of bits of rearranged animal meat… Which it isn’t.

                  If generative AI means less real people get exploited that’s a win in my book.

                  That’s not necessarily a win for everyone. Some people actually like working in the porn industry. Besides that, their likenesses are being stolen and used to produce reproductions and derivative works without consent or compensation.

                  Also, I think you and your buddies here are missing the plot. Generated porn and generated porn of real people are related but different things. I think that’s pretty commonly understood which is why these sites have policies in the first place.

      • And009@reddthat.com

        There’s been a market for commission artists doing this for money since the dawn of art

    • Send_me_nude_girls@feddit.de

      I just wanted to say I love your comment. You’re totally correct, and I enjoyed the passion in your words. That’s how we ought to deal with shit articles more often. Thx

    • solstice@lemmy.world

      I mean that’s kind of worse though isn’t it? The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now. Whether you gotta pay or not is beside the point. Maybe I’m misunderstanding the situation and your point though?

      • Echo Dot@feddit.uk

        The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now.

        So I can, but I could also do that without AI. People have photoshopped celebrities heads onto porn actors bodies for decades. It doesn’t happen as much now because there’s no point.

        Realistically, what is really changed except for the tools?

        • solstice@lemmy.world

          Simplicity, barriers to entry, skill requirements? Kinda different to just enter a prompt like “such and such actress choking on a dildo” than to photoshop it, isn’t it? I for one don’t know how to do one but could probably figure out the other.

          Again I’m just speculating, I don’t really know.

          • Krauerking@lemy.lol

            This is absolutely accurate. Basically, humanity is constantly reducing the cost and skill barriers for tasks and jobs. It’s weird that we are now aggressively doing it to creative work, but that’s what has been done, and it’s making a mess of garbage media and porn - stuff that could have happened before, but now in much higher quantities and with less oversight/input from multiple people.