Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • P03 Locke@lemmy.dbzer0.com · 321 points · 1 year ago

    There is so much wrong with just the title of this article:

    1. What marketplace? CivitAI is free. Unstable Diffusion Discord is free. Stable Diffusion is free. All of the models and LoRAs are free to download. The only cost is a video card (even a basic one) and some time to figure this shit out.
    2. “Everyone is for sale”. No, that’s the current fucking situation, where human trafficking runs rampant throughout the sex and porn industry. AI porn is conflict-free. You don’t need to force an underage, kidnapped teenager to perform a sex act in front of a camera to create AI porn.
    3. “For Sale”. Again, where’s the sale? This shit is free.

    A 404 Media investigation shows that recent developments

    Get the fuck outta here! This two-bit blog wants to call itself “a 404 Media investigation”? Maybe don’t tackle subjects you have no knowledge or expertise in.

    The Product

    Repeat: FOR FREE! No product!

    In one user’s feed, I saw eight images of the cartoon character from the children’s show Ben 10, Gwen Tennyson, in a revealing maid’s uniform. Then, nine images of her making the “ahegao” face in front of an erect penis. Then more than a dozen images of her in bed, in pajamas, with very large breasts. Earlier the same day, that user generated dozens of innocuous images of various female celebrities in the style of red carpet or fashion magazine photos. Scrolling down further, I can see the user fixate on specific celebrities and fictional characters, Disney princesses, anime characters, and actresses, each rotated through a series of images posing them in lingerie, schoolgirl uniforms, and hardcore pornography.

    Have you seen Danbooru? Or F95 Zone? This shit is out there, everywhere. Rule 34 has existed for decades. So has the literal site called “Rule 34”. You remember that whole Tifa porn video that showed up in an Italian court room? Somebody had to animate that. 3D porn artists take their donations through Patreon. Are you going to go after Patreon, too?

    These dumbasses are describing things like they’ve been living under a rock for the past 25 years, watching cable TV with no Internet access, just NOW discovered AI porn as their first vice, and decided to write an article about it to get rid of the undeserved guilt of what they found.

    What a shitty, pathetic attempt at creating some sort of moral panic.

    • jeremyparker@programming.dev · 33 points · 1 year ago

      The danbooru aspect of the “AI” moral panic is what annoys me.

      So many of my friends - many of whom are amateur artists - hate computer generated images because the artists’ copyrights were violated, and they weren’t even asked. And I agree that does kinda suck - but - how did that happen?

      Danbooru.

      The art had already been “stolen” and was available online for free. Where was their morality then? For the last decade or whatever that danbooru has been up? Danbooru is who violated the copyright, not stable diffusion or whatever.

      At least computer generated imagery is different, like, the stuff it was trained on was exactly their art, while this stuff, while it might look like theirs, is unique. (And often with a unique number of fingers.)

      And, if “copyright” is their real concern, then surely they understand that copyright only protects against someone making a profit off their work, right? Surely they’ll have looked into it and they already know that “art” made by models that used copyrighted content for training is prevented from being copyrighted itself, right? And that you can only buy/sell content made from models that are in the copyright clear, surely they know all this?

      No, of course not. They don’t give a shit about copyright, they just got the ickies from new tech.

      • adrian783@lemmy.world · 7 points · 1 year ago

        no one is moral panicking over ai. people just want control over their creation, whether it’s profit sharing or not being used to train models.

        you really can’t see how an imageboard has completely different considerations over image generating models?

        or that people are going after ai because there is only like a couple of models that everyone uses vs uncountable image hosts?

        both danbooru and stable diffusion could violate copyright, not one or the other.

        why would someone want training models to ingest their creation just to spit out free forgeries that they cannot claim the copyright to?

        • TwilightVulpine@lemmy.world · 7 points · 1 year ago

          Yeah. It’s pretty iffy to go “well, these other guys violated copyright so they might as well take it” as if once violated it’s all over and nobody else is liable.

          • jeremyparker@programming.dev · 2 points · 1 year ago

            This is a bad faith reading. The argument isn’t that “someone else did it first” - the argument is that the concern over copyright is suspiciously sudden. No one has gotten mad about danbooru - or Reddit, or Facebook, or any of the other billions of sites that use content created by others to draw users and make a profit from ad revenue. Why are people mad about some neckbeard’s $3/month patreon based on an unoriginal art style, but not about Facebook (etc) destroying the entire thing that used to be called journalism? Danbooru literally stole the work, why is no one mad about that? Why are they only mad when someone figuratively steals the work?

            AI art has a similar potential to do to art what Facebook did to journalism - I just wrote a long post about it in another reply in this thread so I won’t repeat it all here - but, wealthy corporations will be able to use AI art to destroy the career of being an artist. That’s what’s dangerous about AI.

            • TwilightVulpine@lemmy.world · 3 points · 1 year ago

              No, what is bad faith is to dismiss the valid concerns of artists just because there is a different issue that they have to deal with also.

              Many of these artists already struggle with unauthorized sharing of their works. Some go to great lengths to try to take down their works from image boards, others simply accept it as being a reality of the internet. The thing is, even those who accept unauthorized sharing of their works do so in hopes that their official profiles will be linked back and they might still benefit from it through their shops, crowdfunding or commissions. Something that is very much not a thing with AI, because AI does not credit or link back to the works that were used to train it, even when it accepts prompts to directly imitate their style. I understand that this is due to how AI works, that ultimately it doesn’t keep the works themselves… but for the artists that makes no difference. To them, all that matters is that people copied their works to get similar artworks for free, without asking their permission or offering any compensation. That they are losing customers and work opportunities to something that relied on their work to function to begin with.

              Pointing fingers at Danbooru not only glosses over many particularities of the matter, but it’s a low effort attempt to call artists hypocrites and disregard their concerns. But who said they aren’t mad about Danbooru? AI using it for training is itself a whole series of new violations that only compound it. One thing does not excuse the other, quite the contrary.

              And if you want to talk about journalism, there definitely is a lot to discuss there, but that’s not the topic here.

              • jeremyparker@programming.dev · 2 points · 1 year ago

                First off, I’m going to stop writing out “computer generated imagery” and start saying CGI. Please understand I mean the kind of AI art we’re talking about, not Avengers-movie special effects. I know it’s already a taken acronym, but I hate calling it AI, so, until we come up with something better…

                Some go to great lengths to try to take down their works from image boards, others simply accept it as being a reality of the internet.

                A big part of what I’m saying is that the CGI issue is just this, but weirder. And I’m not saying it’s not weird - it definitely is - but this particular concern, to me, seems disingenuous because of the above quote. All CGI does is change some of the venues people in group A scour.

                Regarding credit - this is kind of sticky. There are two (well, more than 2, but 2 relevant here) parts of IP law: copyright and license.

                Copyright is a default, you-don’t-have-to-do-anything protection against people profiting off of your work. I right click/save your photo, I put it on my site and sell copies for $50. This is legally actionable. It’s not criminal - but it’s actionable. Profit is a requirement here; if I share your work with my friend - or even post it on my non-monetized website - there’s not really anything you can do. I can even tell everyone it’s mine - copyright law does not care. You would have to be able to prove that I’m profiting somehow or else I’ll be able to use a fair use defense. (And it will be a legitimate use of fair use.)

                License law governs our ability to allow people to use our work. Legally, we’re allowed to write contracts and have others sign them which outline parameters of permission. These are legally actionable - but only if the other party signs. Most of what we see in terms of DMCA takedowns is people who are profiting off the work; the copyright owner basically says, take my shit down, or buy a license for $x. Both parties need to agree to a licensing agreement - but, again, most of the time, it’s not really optional, because the person is infringing on the copyright.

                If the person isn’t infringing on copyright, they don’t have to do anything. This is what fair use is for: we all have the right to learn and grow and share from each other’s work - with the exception that, if you try to make money off it, that’s not going to fly.

                So, unless there’s copyright infringement, an artist has no right to demand a name check or a link back. I mean, you can ask, but I can just say no.

                Profit is vital here - if a person isn’t making money off their CGI, legally, they’re in the clear.

                But the thing is, the models one uses to create CGI with stable diffusion or whatever, they have their own licenses - the kind that are like terms of service. “You can use this, but by doing so you’re agreeing to the license terms.” And models that have been trained on “illegitimate” content have licenses that bar the user from (directly) profiting from the work.

                (This is why patreon is the main source of income for infringers - and patreon shuts them down if you complain, even without any legal documentation. But, again, I feel this community is microscopic. Sure, it’s sketchy and shitty, but it’s on such a minute scale compared to other infringements.)

                So, if you really think that the very few people who are making $5/month are a bigger issue than the film industry legally using “free” CGI to suppress artist wages, then I really feel like your priorities are misaligned.

                but it’s a low effort attempt to call artists hypocrites and disregard their concerns.

                I definitely don’t mean that artists are hypocrites. Artists just want to do their thing and get credit and maybe even money. They’re the victims - regardless of whether I’m right or you’re right, in either case, artists are the victims. Tho tbh I’m lowkey offended at your implication that only an artist should be concerned about artists losing revenue via CGI. And, also, I’m not saying “danbooru did it first” and wagging my finger at you for not breaking their door down.

                I’m saying that the reason the art was used to train these models is because it was on danbooru. Or Reddit, or imgur, or whatever.

                (I think danbooru is actually as much a software company as an image site? So I’m not even sure if they’re the right name to use. I always use their name because Stable Diffusion uses their tag system, but idk if that’s fair.)

                Blaming Stable Diffusion for danbooru’s infringement is sideways. Like, imagine I plugged the power in my house to piggyback off of yours. Then my friend comes over and plugs his gaming rig in and draws a shit ton of power. Are you going to be mad at him, or me?

                Regarding journalism - what I meant by that is that artists are facing the same threat journalists faced, and if we don’t start fighting the fight that will save them, they won’t be saved. And the “you trained your model on my shit without asking” argument is not going to save them.

                • TwilightVulpine@lemmy.world · 1 point · 1 year ago

                  Blaming Stable Diffusion for danbooru’s infringement is sideways. Like, imagine I plugged the power in my house to piggyback off of yours. Then my friend comes over and plugs his gaming rig in and draws a shit ton of power. Are you going to be mad at him, or me?

                  First of all, copyright infringement is not a wire that when you cut off one side the other one is also unplugged. When you take down one infringer, every other one that took it from them is still up. C&D’ing Danbooru is not going to take their works off AI models.

                  Secondly, the easy answer here is both. I don’t see why you think “your friend” gets to get away scot-free. They are mooching just the same and you think they gotta get a free pass? Did you read what I said about linking? Even in your analogy, maybe I already complained to you, maybe I agreed to let you use it if you hand out my business cards, but then comes your friend offering my power cord along with Steve and Mary’s to the whole neighborhood and not even telling them where he got it.

                  Profit is vital here - if a person isn’t making money off their CGI, legally, they’re in the clear.

                  Also, no, profiting is not required for it to be a copyright violation that can be pursued. The rights owners can take down any work that is not licensed by them. Generally they only don’t bother because having an eternal whack-a-mole with the internet is expensive and tiresome. But that doesn’t mean it’s fair use. Fair use has specific requirements.

                  No idea why you think this is just about $5 patreons though. Seems like most of the major models have been trained on copyrighted works without authorization.

        • P03 Locke@lemmy.dbzer0.com · 4 points · 1 year ago

          no one is moral panicking over ai.

          This is one of the most inaccurate statements I’ve seen in 2023.

          Everybody is morally panicking over AI.

          stable diffusion could violate copyright, not one or the other.

          Or they don’t, because Stable Diffusion is a 4GB file of weights and numbers that have little to do with the content it was trained on. And, you can’t copyright a style.

        • jeremyparker@programming.dev · 1 point · 1 year ago

          you really can’t see how an imageboard has completely different considerations over image generating models?

          Of course I see the difference - direct, outright theft and direct profiting from the theft is much worse than using content that’s been stolen to train computer image generation software.

          If your complaint is about the copyright infringement, then danbooru should be the target of your complaint - but no one seems to care about that. Why don’t people care about that?

          If the concern is that this software makes it easier to commit crimes, sure, I guess? But, again, danbooru. And like every other site on the internet.

          The concern, it seems to me, is with person A being an artist, and person B making art and trying to pass it off as an original work by person A. And that’s valid - but I still don’t feel like it’s worse than actually just taking the artwork, calling it “content”, and using it to generate ad revenue.

          The main problem I have with this criticism is that (imo) there are much more important issues at stake with midjourney or whatever - and this (alleged) concern (alleged because it only seems to go skin-deep) prevents people from caring about the real issues.

          Many many many jobs now, when a person leaves, they’re replaced with 2 part time people. This benefits profits and hurts everyone else.

          The issue with computer generated images is that, when a movie studio needs a sci fi background, it used to require an artist; now, it just requires midjourney - and you can hire the artist for 4 hours (instead of 4 days) to touch it up, fix the fingers, etc - which not only takes less time, but also less talent, which increases the labor supply, which pushes wages down.

          This technology has the potential to take the career of being an artist and turn it into a low-wage, part-time thing that you can’t live off of. This has happened in so many parts of our economy and it’s really bad, and we need to protect artists from that fate.

          So no, I really can’t muster up giving a shit about whether someone on pixiv copies your art and makes $3 a month from a patreon. The entire field of visual arts is under threat of complete annihilation from greedy capitalists. They’re the villains here, not some neckbeard’s patreon.

    • Schneemensch@programming.dev · 28 points · 1 year ago

      Just because something is free does not mean that there is no marketplace or product. Social media is generally free, but I would still call Facebook, TikTok or Instagram a product.

      Nowadays a lot of industries start out completely free, but move into paid subscription models later.

    • rhabarba@feddit.de (OP) · 17 points · 1 year ago

      Repeat: FOR FREE! No product!

      If it’s free, chances are you’re the product. I assume that there is a market for user-generated “prompts” somewhere.

      • P03 Locke@lemmy.dbzer0.com · 107 points · 1 year ago

        No, that’s not how open-source or open-source philosophies work. They share their work because they were able to download other people’s work, and sometimes people improve upon their own work.

        These aren’t corporations. You don’t need to immediately jump to capitalistic conclusions. Just jump on Unstable Diffusion Discord or CivitAI yourself. It’s all free.

        • Sethayy@sh.itjust.works · 7 points · 1 year ago

          Maybe there’s commissions for specific people/poses, cause I certainly couldn’t keep a hard-on long enough to generate a spank-worthy image

        • rhabarba@feddit.de (OP) · 6 points · 1 year ago

          These aren’t corporations.

          I know, I know: “but the website is free” (for now). However, Civit AI, Inc. is not a loose community. There must be something that pays their bills. I wonder what it is.

            • jeremyparker@programming.dev · 10 points · 1 year ago

              I feel like you’re implying people should look into things before making accusations. Like, find out if what they’re saying is true before they say it. And that’s why no one asked you to the prom.

          • infamousta@sh.itjust.works · 5 points · 1 year ago

            They’re probably losing money now and just trying to build a user base as a first-mover. They accept donations and subscriptions with fairly minor benefits, but I imagine hosting and serving sizable AI models is not cheap.

            They’ll probably have to transition to paid access at some point, but I don’t see it as particularly unethical as they have bills to pay and do attempt to moderate content on the site.

            I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made. I don’t think there should be open avenues for sharing that kind of stuff online, and their rules should be better enforced.

            • Joshua Casey@lemmynsfw.com · 2 points · 1 year ago

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              wholeheartedly disagree. “real porn” is literally made by consenting adult performers. Hence, it’s ethical. Generating adult content of real people is (typically) done without the consent of the people involved, thereby making it unethical.

              • infamousta@sh.itjust.works · 4 points · 1 year ago

                If you don’t think anything unethical happens in the production of porn I’m not sure what to tell you. It’s getting better but exploitation, sex trafficking, revenge porn, etc. have been a thing since pornography was invented.

                AI porn at least does not necessarily need to consider consent. Plenty of AI porn involves animated figures or photorealistic humans that don’t represent any identifiable person.

                The only hang up I have is producing images of actual people without their consent, and I don’t think it’s a new problem as photoshop has existed for a while.

                • Joshua Casey@lemmynsfw.com · 1 point · 1 year ago

                  i’m sorry to tell you but you have swallowed the propaganda from anti-porn/anti-sex work organizations like Exodus Cry and Morality in Media (who now go by the name NCOSE).

            • aesthelete@lemmy.world · 1 point · 1 year ago

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              Well, even if that were the case, the “real porn” is still required to train the model in the first place.

              So, it’s unethical shit on top of what you think was even more unethical.

              • infamousta@sh.itjust.works · 2 points · 1 year ago

                Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat. But it’s a better alternative. Porn is not going anywhere. If generative AI means less real people get exploited that’s a win in my book.

                • aesthelete@lemmy.world · 1 point · 1 year ago

                  Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat

                  This comparison only holds water if impossible meat were composed of bits of rearranged animal meat… Which it isn’t.

                  If generative AI means less real people get exploited that’s a win in my book.

                  That’s not necessarily a win for everyone. Some people actually like working in the porn industry. Besides that, their likenesses are being stolen and used to produce reproductions and derivative works without consent or compensation.

                  Also, I think you and your buddies here are missing the plot. Generated porn and generated porn of real people are related but different things. I think that’s pretty commonly understood which is why these sites have policies in the first place.

      • And009@reddthat.com · 8 points · 1 year ago

        There’s been a market for commission artists doing this for money since the dawn of art

    • drfuzzyness@lemmy.world · 17 points · 1 year ago

      I’m guessing that the “marketplace” and “sale” refers to sites like “Mage Space” which charge money per image generated or offer subscriptions. The article mentions that the model trainers also received a percentage of earnings off of the paid renderings using their models.

      Obviously you could run these models on your own, but my guess is that the crux of the article is about monetizing the work, rather than just training your own models and sharing the checkpoints.

      The article is somewhat interesting as it covers the topic from an outsider’s perspective more geared towards how monetization infests open sharing, but yeah the headline is kinda clickbait.

      • P03 Locke@lemmy.dbzer0.com · 18 points · 1 year ago

        “Mage Space” which charge money per image generated

        Well, instead of bitching about the AI porn aspect, perhaps they should spend more time talking about how much of a scam it is to charge for AI-generated images.

        • darth_helmet@sh.itjust.works · 12 points · 1 year ago

          Compute costs money, it’s more ethical to charge your users than it is to throw shady ads at them which link to malware.

          • JuxtaposedJaguar@lemmy.ml · 6 points · 1 year ago

            Also buying and eventually replacing expensive hardware. Running AI at scale requires hundreds of thousands of dollars of infrastructure.

            • darth_helmet@sh.itjust.works · 2 points · 1 year ago

              Sure, if you have hardware and/or time to generate it client side. I’m just saying that if you run a web service and decide to charge for it, that’s better than most of the alternative monetization strategies.

          • P03 Locke@lemmy.dbzer0.com · 1 point · 1 year ago

            I get no malware or shady ads when I generate AI images with Stable Diffusion. I don’t know what kind of sites or tools you’re using where you’re getting shady ads, but you’re getting ripped off.

    • Send_me_nude_girls@feddit.de · 13 points · 1 year ago

      I just wanted to say I love your comment. You’re totally correct and I enjoyed the passion in your words. That’s how we’ve got to deal with shit articles more often. Thx

    • solstice@lemmy.world · 5 points · 1 year ago

      I mean that’s kind of worse though isn’t it? The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now. Whether you gotta pay or not is beside the point. Maybe I’m misunderstanding the situation and your point though?

      • Echo Dot@feddit.uk · 5 points · 1 year ago

        The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now.

        So I can, but I could also do that without AI. People have photoshopped celebrities heads onto porn actors bodies for decades. It doesn’t happen as much now because there’s no point.

        Realistically, what is really changed except for the tools?

        • solstice@lemmy.world · 4 points · 1 year ago

          Simplicity, barriers of entry, skill requirements? Kinda different to just enter a prompt “such and such actress choking on a dildo” than to photoshop it isn’t it? I for one don’t know how to do one but could probably figure out the other.

          Again I’m just speculating, I don’t really know.

          • Krauerking@lemy.lol · 2 points · 1 year ago

            This is absolutely accurate. Basically, humanity is constantly reducing the cost and skill barriers for tasks and jobs. It’s weird that we are now aggressively doing it to creative work, but that’s what has happened, and it’s making a mess of garbage media and porn that could have existed before, just in much higher quantities and with less oversight/input from multiple people.

  • Ertebolle@kbin.social · 178 points · 1 year ago

    Honestly, if the combination of AI porn + good AR + haptic fleshlights gets us to a point where horny single men with limited real-world romantic prospects can have fulfilling sex lives without having to bug any actual living women to attain them, I think the world will be a better place.

    • disposabletentacle@kbin.social · 88 points · 1 year ago

      Yeah, but with the caveat that this will only make the world a better place if society chooses to implicitly allow it, and actively chooses not to disparage, mock, and exclude those people who want to go this path. Which, based on everything we’ve ever seen about society, is not going to happen.

      • Ertebolle@kbin.social · 28 points · 1 year ago

        True, but people already do 80% of that - I don’t think the stigma attached to “AI-generated porn that talks to you and responds to your requests” is likely to be meaningfully greater than the stigma attached to regular porn, or to an OnlyFans where you’re doing the same thing with an actual woman but clamoring for her attention with a bunch of other guys.

        • Deftdrummer@lemmy.world · 2 points · 1 year ago

          Yea… yea but that’s not what you said. You implicitly said “strapped to AI powered AR with a fleshlight” or some nonsense like that. And then when called out about how that would not work societally - you then default to “there’s no stigma with regular porn”.

          🙄

          • Ertebolle@kbin.social · 15 points · 1 year ago

            People already do watch AR porn. People already do use interactive fleshlights. People already do talk to and make requests of performers on OnlyFans and lots of other places. None of these things seem to come with much more stigma than regular porn; I don’t think the use of AI changes that.

            • Deftdrummer@lemmy.world · 1 point · 1 year ago

              Y’all probably would love this utopia. Socially inept legions of men ostracized from society jacking off in their mothers’ basements with AR / AI till the end of time.

              Question is why? And why when this article is talking about James Dean?

              • Lowlee Kun@feddit.de · 2 points · 1 year ago

                Who are you that you’re so fast to judge people you barely know? I have a good job, a functioning social life, and am generally considered a pleasant person to spend time with. Yet I choose not to date. The reasons are quite complex and none of anyone’s business. Why and how does watching AI porn make me a socially inept person jacking it in some basement? Maybe it is time to get off your high horse and use reason in your arguments instead of emotions and stereotypes.

                • Deftdrummer@lemmy.world · 1 point · 1 year ago

                  You’re a fucking idiot. I’m on your side. I was responding to a comment saying that that’s a good thing. Perhaps it is for you and that’s fine; jack off in your mother’s basement then for all I give a shit.

                  Normalizing this behavior is not healthy for anyone but I also don’t believe in ostracizing it.

      • Steeve@lemmy.ca · 19 points · 1 year ago

        Uh, based on the realistic fuck robots that are hitting the market I don’t think you have to worry about society telling you that you can’t have VR sex with a giant hentai squid with massive tits. Just maybe don’t do it in your parents living room this time.

          • afraid_of_zombies@lemmy.world · 12 points · 1 year ago

            I don’t follow at all. When you do sexual activities in a public area you are making other people part of the act, without their consent. Do what you want with consenting people.

            If you don’t want people to kink shame don’t expose them to your kink. I am not knocking on doors telling people what they should or should not do, I am asking not to be part of their activities. The vast majority of people have my attitude towards this.

            • disposabletentacle@kbin.social · 4 points · 1 year ago

              We’re talking about AI porn in this thread. I’m not sure why you’re going off on a rant about exhibitionism, or why you’re doing it here in a reply to my comment.

              • TopRamenBinLaden@sh.itjust.works · 5 points · 1 year ago

                Your comment was in reply to a comment talking about sex robots. Your comment was calling that comment out for being discriminatory against people who use sex robots openly. It was all very easy to follow as an observer. Conversations wander sometimes.

      • icepuncher69@sh.itjust.works · 4 points · 1 year ago

        I mean… it’s not like masturbation isn’t already mocked and shamed around the world, and it’s not like that’s ever stopped the more perverted ones that have weird kinks. And even if there’s legislation against it (which there hasn’t been any serious attempt at, afaik, since the middle ages), the wankers are always gonna win, probably because they have more stamina.

        Edit: now I’m not saying they should be legislated against; imho the government has no business in one’s sex life as long as all parties involved are consenting adults or it’s solo. Just that they always have been, and probably always will be, shamed. Why? I don’t know, but religion probably has something to do with it. I won’t judge anybody though.

          • icepuncher69@sh.itjust.works · 1 point · 1 year ago

            True, but my scope was more focused on someone that’s gonna treat themselves to a VR-supported oppaibot2100, with many features including blah blah blah. You get the drill; the oppaibot is getting a different drill in another way though.

          • icepuncher69@sh.itjust.works · 3 points · 1 year ago

            I said around the world. First-world countries tend to be more open about discussing sex (not necessarily the same as education) than other countries, which for the most part, or at least among the ruling classes, tend to be more traditional. Although, about the whole Gen Z and Gen X thing, this is the first time I’ve heard of that; could you elaborate on that m8?

          • RagingNerdoholic@lemmy.ca · 2 points · 1 year ago

            Gen Z seem to be scarily sex-negative again, see all their complaints about sex scenes in films as an example.

            I don’t keep up on zoomerisms, is this really a thing? I thought zoomers were basically the OnlyFans generation.

          • gandalf_der_12te@feddit.de · 1 point · 1 year ago

            Yeah, sure Gen Z is all sex-negative when they grow up and all they ever hear (from teachers, moral authorities etc…) is how egoistic they are for desiring sex. What child in a sane mind is willing to expend so much energy just to overcome the needless and worthless obstacle that is general morality to have sex with someone?

    • Chaotic Entropy@feddit.uk · 54 points · 1 year ago

      Because what the world needs now is an even more disengaged, disinterested, and misanthropic portion of the population.

      • afraid_of_zombies@lemmy.world · 13 points · 1 year ago

        Meh. If someone wants to opt out they pretty much are going to do it. Besides would you rather deal with them? Imagine if everyone you were around didn’t have a means to entertain themselves at all times. They would be engaged, they would create drama, they would tell other people what they thought of them.

        Sometimes in industrial equipment we put in random alarms to be triggered so the people who are supposed to stand there have to do something vs wandering around causing trouble. Especially in union plants.

      • vacuumflower@lemmy.sdf.org · 10 points · 1 year ago

        Well, to rule out the “misanthropic” part one doesn’t really need to have a fulfilling sex life, just meet a few people (say, women) who’d make them like humanity again.

        About disengaged and disinterested - it’s more about engagement and interest being hard.

      • Thorny_Thicket@sopuli.xyz · 9 points · 1 year ago

        These men used to gather on the streets and start breaking shit and kicking grannies back in the day to express their frustration. Them withdrawing to their moms basements smoking weed and jerking off to porn might not be ideal but perhaps not the worst thing either. That’s why we don’t see a significant uptick in crime despite the ever increasing amount of such men.

      • IIIIII@sh.itjust.works · 4 points · 1 year ago

        Yeah let’s just give young men sex robots and make them even more detached from community and relationships

        • tal@kbin.social · 4 points · 1 year ago

          I do kind of wonder what the end-game is in terms of fertility rates in society if we can manufacture ever-more-perfect simulations of sex.

          The Amish might still be around, but…

    • Stanwich@lemmy.world · 45 points · 1 year ago

      Except instead they will treat ai girls as filthy as they want and then expect all women to act like that. Then not understand why they don’t… yeah pretty sure that’s what’s going to happen.

      • afraid_of_zombies@lemmy.world · 28 points · 1 year ago

        Makes sense. I grew up playing videogames where I would shoot stuff that was in my way. That is why in real life I use a gun to get thru traffic. I also played a game for a while where I rode on the back of a lizard and ate mushrooms to grow larger. Which is pretty much my typical weekday.

        For you see, I have not hit the level of mental development of most 3-year-olds and cannot separate our playtime from the real world. Just like the hypothetical people in your example.

        Now if you excuse me I plan to make bricks vanish by arranging them in a straight line wall-to-wall in my house.

      • Nepenthe@kbin.social · 22 points · 1 year ago

        Without question. The ability to have sex with something isn’t going to prevent them from being socially dysfunctional and would, if anything, make it noticeably worse. You’re getting off, but you still have issues talking to the other sex. They’re just easier to avoid addressing now and your dolls don’t demand basic respect.

        I don’t think I’d come out too much against it, personally. People got biological imperatives, I’m not gonna protest against dildos. But the financial and mental health crises both remain and can’t be circumvented like that.

        • pixeltree@lemmy.world · 23 points · 1 year ago

          I’m already having the mental health crisis, would be nice to have the immersive VR porn to go with it tbh. People in this thread are mostly talking about incels but, like, there’s many men with horrible social issues who are self aware. I don’t have a relationship, I think I would be a terrible partner and me being single is for the best. I still am lonely sometimes, but accepting it and moving on helps a lot. It still would be nice to have something like this because I would be able to have some companionship without having to be in someone else’s life.

          Before anyone tells me to go to therapy, I had a few sessions and then my therapist went on long term sick leave and I don’t think I have the strength to try again. It hurts less to just accept and live with my problems.

        • clutch@lemmy.ml · 7 points · 1 year ago

          You’re getting off, but you still have issues talking to the other sex.

          You just described Japan

      • hglman@lemmy.ml · 16 points · 1 year ago

        Why would being sexually fulfilled make men more shitty to women? Perhaps aloof, but that is different from hateful.

        • edric@lemm.ee · 6 points · 1 year ago

          Because they will get frustrated when they realize real women are not exactly like their perfect, idealized AI counterparts, that they have their own individual personalities, and are not beholden to their men like an AI girlfriend would.

            • ThirdWorldOrder@lemm.ee · 4 points · 1 year ago

              “Real” women are already and have been rocking instagram filters or photoshop for a while now. Deception isn’t limited to AI. Should breast implants be banned too?

              • rambaroo@lemmy.world · 3 points · 1 year ago

                I didn’t say it should be banned, we’re just talking about the problems it might cause.

          • Donjuanme@lemmy.world · 2 points · 1 year ago

            I downvoted not because (as the top reply says) this is how it happens already, but because healthy communication is a cornerstone in every healthy relationship.

            Your argument that men are unable to separate the fabricated from reality is insulting.

            I do not dismiss we are barely out of the dark ages, but (mostly) we aren’t cave men.

            • edric@lemm.ee · 2 points · 1 year ago

              I wasn’t generalizing that men can’t separate the fabricated from reality. I’m saying the fringe and extreme side of the people who would indulge in tech like that would. There are already incels who are like that to women with just the existence of 2D girls. You think it will change for the better if they upgrade to a 3D version complete with physical devices and haptic feedback? Sure maybe, if they keep to their own world. But these people are also online and interact with real human beings too. Obviously the level-headed people won’t be that way, even with the existence and use of those kind of devices. Don’t generalize.

          • hglman@lemmy.ml · 1 point · 1 year ago

            The whole point is the robot will be satisfying to the point of not pursuing women.

            • edric@lemm.ee · 1 point · 1 year ago

              I agree, but as I also mentioned in another reply - those people will eventually interact with real human beings one way or another. It’s not about them pursuing women, but how they will treat them in real life. Of course it won’t be everyone, most people will be able to keep that to themselves, but there will always be the fringe end of the spectrum that can’t help themselves.

              • hglman@lemmy.ml · 1 point · 1 year ago

                I’m not sure what you’re saying - that some people will always suck at dealing with others?

      • z3rOR0ne@lemmy.ml · 8 points · 1 year ago

        I’d push back on that and say that’s fear mongering. The scenario you’re describing MAY occur IF “they” don’t witness social interactions with IRL girls at all, and that includes video/virtual meetups, video recordings of IRL girls interacting etc.

        “They” would have to have never seen a female person in any media other than their AI sexbots, which I find incredibly unlikely that this could become the norm.

      • Czarrie@lemm.ee · 4 points · 1 year ago

        Yeah, it’s a great idea, if these people don’t, like, interact with the world writ large

      • Roboticide@lemmy.world · 2 points · 1 year ago

        Some might think so.

        I remember a guy on reddit a few years ago arguing vehemently that their hand was better than an actual living woman’s vagina, to say nothing of a Fleshlight.

        The denial was strong in that individual’s case, but if enough incels are already in that deep it’s probably gonna be enough for many of them.

    • whatisallthis@lemm.ee · 14 points · 1 year ago

      The psychological ramifications of that are immense. It would destroy people. It would be no different than any other drug.

      • afraid_of_zombies@lemmy.world · 5 points · 1 year ago

        I don’t see how. It is just porn but more specific. You could get the same results before this but with a tiny bit more work. It should have as much impact on humanity as on-demand streaming did. On the individual level this could maybe put an end to all the not-so-ethically produced porn.

      • Ertebolle@kbin.social · 24 points · 1 year ago

        I’m cool with this, we need a lot fewer humans anyway and particularly so in countries rich enough for people to be able to afford VR sex rigs.

        • Nepenthe@kbin.social · 11 points · 1 year ago

          Impacting the plebeian workforce in a way that’s felt even harder than today’s inability to afford kids? Yeah, this is gonna be mocked and regulated out of existence for sure.

          It’ll look like moral reasoning, but the fewer workers exist, the more bargaining power all of them have against the rich. See the scarcity of laborers during the black plague triggering the end of feudalism.

          • gandalf_der_12te@feddit.de · 3 points · 1 year ago

            Yes, and also: The labour market is a market, meaning if there is fewer workers available, then “prices” (payment) go up.

        • iopq@lemmy.world · 4 points · 1 year ago

          It will already start dropping in our lifetime without any way to reverse it. Even African fertility rates are dropping

          • Ertebolle@kbin.social · 13 points · 1 year ago

            “People in rich, heavily resource-consuming countries should have fewer babies” is a hellscape take now? Have you read literally any news article this summer?

    • thebrownhaze@lemmy.world · 6 points · 1 year ago

      The complete opposite is true. That would be a despair-filled dystopia. Do we not have enough virgins jacking it to internet porn all day with crippling depression?

      People need relationships not better internet porn

    • j4k3@lemmy.world · 5 points · 1 year ago

      Welp we all know you’re eating steak in the Matrix.

      AI porn is interesting for its extreme detail in systems that were not quite designed for it, and for what has been achieved with extremely small model sizes. A typical chat model of what seems like equivalent quality, as far as accurate detail comprehension goes, is two to three times larger. It is hard to objectively compare the two, but this is my intuitive/highly speculative opinion.

      That said, it is hilarious how much some model checkpoints can troll someone. Let’s just say, after my casual experiments to explore how LoRAs and other modifications and enhancements work, there would be many PTSD experiences for anyone that tries this. You might just find yourself reorienting your preferences every time you blink.

      Also, if the option is available, you run the serious risk of it becoming an alternative lifestyle, especially amongst those that pursue an academic path and must stay free from distraction. If this is experienced at a younger age, it may remain as a permanent choice. It objectifies relationships and that may prove difficult to change.

      I think you will find the only barrier to relationships is really the person in question’s state of mind and willingness to put in effort. If a skateboard has aspirations to board a hundred million dollar super yacht, that’s a mental health issue. However, outside of mismanaged birth policies where the sexes are disproportionately represented, I’m a strong believer that there is a skateboard for every skateboard, and at least a dock and dingy for every super yacht.

  • ArbitraryValue@sh.itjust.works · 79 points · 1 year ago

    I’m unconvinced by this attempt to create a moral panic. IMO nothing here is shocking or offensive once I remember that people could already use their imaginations to picture celebrities naked.

    • ThetaDev@lemm.ee · 11 points · 1 year ago

      The main issue with this would be public defamation, i.e. wrongfully portraying someone as a porn actor, which might destroy their career. You can’t really do that with written or drawn fiction.

      But for that the pictures would have to be photorealistic, which is not the case just yet. But the tech is going to improve, plus the generated images could be further manipulated (e.g. adding blur/noise to make it look like a bad phone picture).

      • ArbitraryValue@sh.itjust.works · 27 points · 1 year ago

        Once the ability to make photo-realistic images like that becomes commonplace, those images won’t be evidence of anything anymore. Now I can tell you a story about how I had sex with a celebrity, and you won’t believe me because you know I easily could have made it all up. In the future I will be able to show you a 100% realistic video of me having sex with a celebrity, and you won’t believe me because you’ll know that I easily could have made it all up.

        • Savaran@lemmy.world · 9 points · 1 year ago

          The obvious thing is that at some point any camera worth its salt will have an embedded key with which it signs its output, traceable to a vendor’s CA at the least. No signature, and the image would be considered fake.
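
          To make the scheme described above concrete, here is a minimal sketch of per-device signing and verification in Python, using the third-party cryptography package. The function names, the standalone key generation, and the idea of a vendor-published public key are illustrative assumptions for this sketch, not how any actual camera vendor does it; a real deployment would keep the private key in tamper-resistant hardware and chain the public key to a vendor CA.

          ```python
          # Sketch of the per-camera signing idea discussed above (assumptions noted
          # in the surrounding text). Requires: pip install cryptography
          from cryptography.exceptions import InvalidSignature
          from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


          def make_device_key():
              # Stand-in for a per-device key burned in at manufacture and
              # certified by the vendor (certification not modeled here).
              private_key = Ed25519PrivateKey.generate()
              return private_key, private_key.public_key()


          def sign_image(private_key, image_bytes: bytes) -> bytes:
              # Camera side: sign the raw capture at the moment it is taken.
              return private_key.sign(image_bytes)


          def verify_image(public_key, image_bytes: bytes, signature: bytes) -> bool:
              # Viewer side: check the bytes against the published device key.
              try:
                  public_key.verify(signature, image_bytes)
                  return True
              except InvalidSignature:
                  return False


          if __name__ == "__main__":
              priv, pub = make_device_key()
              photo = b"...raw sensor data..."
              sig = sign_image(priv, photo)
              print(verify_image(pub, photo, sig))         # True: untouched capture
              print(verify_image(pub, photo + b"x", sig))  # False: any edit breaks it
          ```

          As the replies below note, this alone doesn’t settle the question: if the private key can be extracted from even one camera model, convincing forgeries can be signed, and a screenshot or re-encode simply drops the signature, so at best an unsigned image proves nothing rather than proving fakery.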

          • tal@kbin.social · 8 points · 1 year ago

            Yeah, I think that there may be something like that – the ability to prove things with a camera is useful – but it’s gonna be more complicated than just that. It’s consumer hardware. If you just do that, someone is gonna figure out how to extract the keys on at least one model and then you can forge authenticated images with it.

          • gandalf_der_12te@feddit.de
            link
            fedilink
            English
            arrow-up
            6
            ·
            1 year ago

            As a programmer, I gotta say, that’s probably not technically feasible in a sensible way.

            Every camera has got to have an embedded key, and if any one of them leaks, the system becomes worthless.

            • Turun@feddit.de
              link
              fedilink
              English
              arrow-up
              3
              ·
              edit-2
              1 year ago

              No, that would actually be feasible with enough effort.

              The real question is what do you do if someone takes a screenshot of that image? Since the picture must be in a format that can be shown, nothing is stopping people from writing software that just strips the authentication from the camera file.

              Edit: misread the problem. You need to get a private key to make forgeries and be able to say “no look, this was taken with a camera”. Stripping the signature from photographs is the opposite of what we want here.

              • Savaran@lemmy.world
                link
                fedilink
                English
                arrow-up
                2
                ·
                1 year ago

                The point is, without the signature there’s plausible deniability: it could just as easily be fake. If you want to prove something happened, then it should have a signature and be validated.

                If someone is showing off a screenshot of an image, then in the future (now, really) one probably needs to assume it’s fake unless there’s some overriding proof otherwise.

        • hglman@lemmy.ml
          link
          fedilink
          English
          arrow-up
          1
          ·
          1 year ago

          It will kill celebrity rather than be a constant issue about stealing images.

          • P03 Locke@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            11
            ·
            1 year ago

            Good. Fame is overrated, anyway. Let’s praise the era where no one person completely dominates the cultural zeitgeist, and people talk about the indie discoveries they found on their own, or that algorithms and bots recommended to them.

            Shit, Spotify’s discovery systems are so good that we’re almost there with the music industry.

    • treadful@lemmy.zip
      link
      fedilink
      English
      arrow-up
      9
      ·
      1 year ago

      I kind of get what you’re saying, but it’s also definitely not the same as imagination. It’s vivid, almost real, shareable, and permanent. Imagine if someone generated an AI image of you doing something you consider embarrassing or compromising and sent it to your coworkers or family.

      That said, I don’t think there’s much to be done about it. This isn’t containable.

      • CarbonIceDragon@pawb.social
        link
        fedilink
        English
        arrow-up
        6
        ·
        1 year ago

        To be fair, compared to imagining something, sharing an image like that with one’s family would be similar to spreading rumors verbally and leading others to imagine the same thing. Which, while certainly something that happens, is also behavior we already recognize as extremely rude, and sometimes even illegal.

    • ahornsirup@artemis.camp
      link
      fedilink
      arrow-up
      4
      ·
      1 year ago

      The difference is that the images AIs spit out are, well, real. Imagining someone naked doesn’t produce a potentially very convincing actual image that can be shared.

      I do think that AI can’t really be effectively regulated (my fucking laptop can run Stable Diffusion), but that doesn’t mean that there’s no need for a debate.

    • Zikeji@programming.dev
      link
      fedilink
      English
      arrow-up
      2
      ·
      1 year ago

      already use their imaginations to picture celebrities naked.

      Speak for yourself. Some of us can’t do that.

  • db2@sopuli.xyz
    link
    fedilink
    English
    arrow-up
    68
    ·
    1 year ago

    This is not a troll: zoom in on the feet of the yellow dress image. It’s hilariously bad.

    Oh no, the realism, it’s just too much! 🤡

  • ivanafterall@kbin.social
    link
    fedilink
    arrow-up
    49
    ·
    1 year ago

    So I checked and nobody has put AI porn of me up for sale, yet. What the fuck, guys? Am I not desirable enough for you!?

  • afraid_of_zombies@lemmy.world
    link
    fedilink
    English
    arrow-up
    47
    ·
    1 year ago

    Maybe we do live in the best possible world. Wow, wouldn’t it be great to get rid of this industry so you can consume porn while knowing that there is zero percent chance it was made without consent?

    • hh93@lemm.ee
      link
      fedilink
      English
      arrow-up
      17
      ·
      1 year ago

      Isn’t the main problem with those models how you can create porn of everyone without their consent with those tools, too?

      • stevedidWHAT@lemmy.world
        link
        fedilink
        English
        arrow-up
        13
        ·
        1 year ago

        Sex trafficking vs virtual photoshop of your face…

        Nothing new, and it’s a huge improvement over the current status quo. Not everything needs to be a perfect solution

      • gandalf_der_12te@feddit.de
        link
        fedilink
        English
        arrow-up
        5
        ·
        1 year ago

        Yeah so what. It’s not as if somebody is “sold on the market” because there’s a nude picture of them. Photoshop is not a real threat to society. We gotta stop making moral imaginations more important than physical things.

      • diffuselight@lemmy.world
        link
        fedilink
        English
        arrow-up
        18
        ·
        1 year ago

        I just retrained an LLM on the comment you put on the public internet. Do you feel violated enough to equate it to physical violation?

        • Urist@lemmy.blahaj.zone
          link
          fedilink
          English
          arrow-up
          9
          ·
          1 year ago

          How would you respond to photo realistic porn that looks like your mother, daughter, [insert person you care about here] especially if they found it distressing?

          How would you feel if it was posted on facebook? How would you feel if they had to deal with it at work? From coworkers? From clients?

          We are entering uncharted waters. You know why this is different than training a model on text, and your reply to @GBU_28@lemm.ee is hostile and doesn’t acknowledge why people would be upset about AI porn featuring their likeness.

          • diffuselight@lemmy.world
            link
            fedilink
            English
            arrow-up
            11
            ·
            edit-2
            1 year ago

            You are answering a question with a different question. LLMs don’t make pictures of your mom. And this particular question? One that has existed roughly as long as Photoshop has.

            It just gets easier every year. It was already easy. You have been able to pay someone 15 bucks on Fiverr to do all of that for years now.

            Nothing really new here.

            The technology is also easy. Matrix math. About as easy to ban as mp3 downloads. Never stopped anyone. It’s progress. You are a medieval knight asking to put gunpowder back into the box, but it’s clear it cannot be put back - it is already illegal to make non consensual imagery just as it is illegal to copy books. And yet printers exist and photocopiers exist.

            Let me be very clear: accepting the reality that the technology is out there, basic, easy to replicate, and on a million computers now is not disrespectful to victims of non-consensual imagery.

            You may not want to hear it, but just like with encryption, the only other choice society has is full surveillance of every computer to prevent people from doing “bad things”. Everything you complain about is already illegal and has already been possible; it just gets cheaper every year. What you want protection from is technological progress, because society sucks at dealing with the consequences of it.

            To be perfectly blunt, you don’t need to train any generative AI model for powerful deepfakes. You can use technology like Roop and ControlNet to synthesize any face onto any image from a single photograph. Training not necessary.

            When you look at it that way, what point is there to try to legislate training with these arguments? None.

            • Urist@lemmy.blahaj.zone
              link
              fedilink
              English
              arrow-up
              4
              ·
              1 year ago

              I’m not making an argument to ban it. I’m just pointing out that you’re pretending a model trained on text someone wrote is comparable to a model that makes nonconsensual porn.

              I don’t think it can be banned; it’s just something that will need to be incorporated into revenge porn laws, if it isn’t already covered.

              I’m just pointing out your comment sucked.

              • diffuselight@lemmy.world
                link
                fedilink
                English
                arrow-up
                7
                ·
                edit-2
                1 year ago

                It’s already covered under those laws. So what are you doing that’s different from ChatGPT hallucinating here?

                Those laws don’t spell out the tools (photoshop); they hinge on reproducing likeness.

                • Urist@lemmy.blahaj.zone
                  link
                  fedilink
                  English
                  arrow-up
                  2
                  ·
                  1 year ago

                  Oh good, someone who has read every revenge porn law, ever. I’m glad they work exactly the same, in every nation and state.

                  Anyway, I must be hallucinating, true, because it seems you keep attacking what I’m saying instead of defending the comment you made earlier that I took issue with, the one where you were being needlessly hostile.

          • stevedidWHAT@lemmy.world
            link
            fedilink
            English
            arrow-up
            8
            ·
            edit-2
            1 year ago

            I can do this right now with photoshop dude what are you talking about. This just points to the need for more revenge porn laws.

            We don’t have to sit in the fire when we can crawl out. Are we still on fire? Yeah. Can we do something about that? Yeah!

            It seems like so many people these days want perfect solutions but the reality is that sometimes we have to make incremental solutions to erase the problem as much as we can.

            • polymer@lemmy.world
              link
              fedilink
              English
              arrow-up
              5
              ·
              edit-2
              1 year ago

              And incidentally, this need for revenge porn laws is itself a symptom of a separate underlying cause. Technology has always moved forward with no relation to social advancement, and there is no realistic “genie being forced back into the bottle” scenario either.

              That being said, easier access to more powerful technology with lackluster recognition of personal responsibility doesn’t exactly bring happy prospects. lol…

          • Donjuanme@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            ·
            1 year ago

            Revenge porn/blackmail/exploitation will hopefully become much less obscene, not to the “let’s not prosecute this” levels, but maybe people can stop living in fear of their lives being ruined by malicious actors (unless that’s your kink, you do you).

            It will take/drive/demand a massive cultural shift, but if you asked me which world I would rather live in, and the options are one where people are abused and exploited, or one where people can visualize their perversions more easily (but content creators have a harder time making a living) I’ll take the second. Though I may have straw-manned a bit, it’s not something I’ve thought of outside of this forum thread.

          • afraid_of_zombies@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            1 year ago

            I wouldn’t be happy about it but me not being happy about something doesn’t mean I just get an override.

            I think the boat has sailed a bit on this one. You can’t really copyright your own image, and even if you were some famous person willing to do this and fight the legal battles, you would still have to go up against the fact that no one is making money off of it. You might be able to get a news source to take down that picture of you, but it is another thing to make it so the camera company can’t even record you.

            But hey, I was saying for years that we need to change the laws to forbid photography of people and property without consent, and everyone yelled at me that they have the right to use a telescopic lens to view whomever they want from blocks away.

            The creeps have inherited the earth.

        • GBU_28@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          1 year ago

          Why would I? Folks who have had real nudes of them posted on the Internet haven’t felt “physical violation” but they’ve certainly been violated.

          If you had photos of me and trained a porn generating LLM on my photos and shared porn of me, in an identifiable way, I would consider that violation.

          But simply taking my words in that simple sentence isn’t identifiable, unique, or revealing. So no.

          Further, the original point was about the ethics of AI porn. You can’t get something from nothing.

        • GBU_28@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          1 year ago

          Are you actually asking?

          The gist is that LLMs find similar “chunks” of content from their training set and assemble a response based on a similarity score (relative to your prompt).

          They know nothing they haven’t seen before, and the nice thing about them is that they create new things from parts of their training data.

          Obviously they are very impressive tools, but the concern is that you can easily take a model designed for porn, feed it pictures of someone you want to shame, and have it generate lifelike porn of someone who is not a porn actor.

          That, and the line around “ethical” AI porn is blurry.
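
          A toy illustration of that “similarity score” idea, with made-up embedding vectors (real image generators do far more than nearest-neighbour lookup, as the reply below gets into): embed the prompt and the training chunks as vectors, then rank the chunks by cosine similarity.

          ```python
          # Toy illustration of ranking training "chunks" against a prompt by
          # cosine similarity. The vectors are invented for the example; real
          # generation is a learned process, not a lookup.
          import numpy as np

          def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
              return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

          # Hypothetical embeddings for a prompt and a few training "chunks".
          prompt_vec = np.array([0.9, 0.1, 0.3])
          chunks = {
              "red carpet photo": np.array([0.8, 0.2, 0.4]),
              "maid uniform art": np.array([0.1, 0.9, 0.2]),
              "landscape painting": np.array([0.2, 0.1, 0.9]),
          }

          # Rank chunks from most to least similar to the prompt.
          ranked = sorted(chunks.items(),
                          key=lambda kv: cosine_similarity(prompt_vec, kv[1]),
                          reverse=True)
          for name, vec in ranked:
              print(f"{name}: {cosine_similarity(prompt_vec, vec):.2f}")
          ```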

          • tal@kbin.social
            link
            fedilink
            arrow-up
            2
            ·
            1 year ago

            They know nothing they haven’t seen before

            Strictly speaking, you arguably don’t either. Your knowledge of the world is based on your past experiences.

            You do have more-sophisticated models than current generative AIs do, though, to construct things out of aspects of the world that you have experienced before.

            The current crop are effectively more-sophisticated than simply pasting together content – try making an image and then adding “blue hair” or something, and you can get the same hair, but recolored. And their ability to replicate artistic styles is based on commonalities in seen works, but you don’t wind up seeing chunks of material done by just that artist.

            Like, you have a concept of relative characteristics, and the current generative AIs do not. You can tell a human artist “make those breasts bigger”, and they can extrapolate from a model built on things they’ve seen before. The current crop of generative AIs cannot. But I expect that the first bigger-breast generative AI is going to attract users, based on a lot of what generative AIs are being used for now.

            There is also, as I understand it, some understanding of depth in images in some existing systems, but the current generative AIs don’t have a full 3d model of what they are rendering.

            But they’ll get more-sophisticated.

            I would imagine that there will be a combination of techniques. LLMs may be used, but I doubt that they will be pure LLMs.

        • GBU_28@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          1 year ago

          Ok, you know it’s trained on existing imagery right?

          Sure the net new photos aren’t net new abuses, but whatever abuses went into the training set are literally represented in the product.

          To be clear, I’m not fully porn-shaming here, but I wanted to clarify that these tools are informed by something already existing and can’t be fully separated from their training data.

      • tal@kbin.social
        link
        fedilink
        arrow-up
        14
        ·
        1 year ago

        Arguably a good application for AI image-to-prompt functionality, I suppose.

    • Globulart@lemmy.world
      link
      fedilink
      English
      arrow-up
      10
      ·
      edit-2
      1 year ago

      The trick is finding your preferred channels and just browsing those. I have a handful of channels I will happily browse for myself and a couple of channels for me and my wife to browse together.

      Searching hasn’t really been worth anything for quite a while. I’m more likely to find something I like by clicking a previous video that was enjoyed and scrolling through the related ones.

  • randon31415@lemmy.world
    link
    fedilink
    English
    arrow-up
    31
    ·
    1 year ago

    I’ll just leave this here:

    Automatic1111, depth map script, image-to-image, click left-right stereogram for VR, or red-blue if you have old 3D glasses.
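
    If you want the red-blue half without the script, here’s a rough sketch of the idea (hypothetical file names, and the per-row column shift is a crude stand-in for proper reprojection): shift pixels horizontally in proportion to depth to fake a second viewpoint, then take the red channel from one eye and the green/blue from the other.

    ```python
    # Rough red-cyan anaglyph from an image plus a depth map.
    # Not the Automatic1111 script itself; the column-shift disparity model
    # is a simplification and leaves small black holes where pixels move.
    import numpy as np
    from PIL import Image

    def shift_view(img, depth, max_disparity=8, direction=1):
        """Shift each pixel horizontally by a disparity derived from its depth."""
        h, w, _ = img.shape
        disparity = (depth / 255.0 * max_disparity).astype(int) * direction
        out = np.zeros_like(img)
        xs = np.arange(w)
        for y in range(h):
            new_x = np.clip(xs + disparity[y], 0, w - 1)
            out[y, new_x] = img[y, xs]
        return out

    image = np.array(Image.open("frame.png").convert("RGB"))   # hypothetical input
    depth = np.array(Image.open("depth.png").convert("L"))     # depth map from the script

    left = shift_view(image, depth, direction=-1)
    right = shift_view(image, depth, direction=1)

    # Red channel from the left eye, green/blue from the right eye.
    anaglyph = np.dstack([left[..., 0], right[..., 1], right[..., 2]])
    Image.fromarray(anaglyph).save("anaglyph.png")
    ```

    The same two shifted views, placed side by side instead of merged, give you the left-right stereogram for VR.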

  • ZombiFrancis@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    25
    ·
    1 year ago

    They’re also creating a lot of images of maid uniforms wearing human faces making ahegao faces while standing on massive erect penis legs.

    They post the eight images that weren’t some body-horror fever dream.

    There’s a lot of human work that goes into (and has gone into) AI art generation. It’s just very obscured when all you see is the final product.

    Remember creepy people use AI. That’s also why a lot of AI stuff is or seems creepy.

    • SCB@lemmy.world
      link
      fedilink
      English
      arrow-up
      17
      ·
      1 year ago

      They’re also creating a lot of images of maid uniforms wearing human faces making ahegao faces while standing on massive erect penis legs.

      Finally there is porn for me

  • Kazumara@feddit.de
    link
    fedilink
    English
    arrow-up
    23
    ·
    1 year ago

    Ha, the image description just says “An AI-generated woman found on CivitAI” even though that’s clearly the character Power from Chainsaw Man.

    • milkjug@lemmy.wildfyre.dev
      link
      fedilink
      English
      arrow-up
      3
      ·
      1 year ago

      It’s clearly Barack Obama from the classical motion picture, Sharknado 4.

      It’s like we’re looking at two completely different images.

  • AlexWIWA@lemmy.ml
    link
    fedilink
    English
    arrow-up
    22
    ·
    1 year ago

    Like I’ve been saying for years, AI doesn’t need to be sentient to royally fuck society. Just needs to be good enough to mimic you and ruin your life or take your job.

    • tal@kbin.social
      link
      fedilink
      arrow-up
      10
      ·
      1 year ago

      or take your job.

      The unemployment line there makes for quite the mental image.

      The “Erect Horse Penis - Concept LoRA,” an image generating AI model that instantly produces images of women with erect horse penises as their genitalia, has been downloaded 16,000 times, and has an average score of five out of five stars, despite criticism from users.

    • pdxfed@lemmy.world
      link
      fedilink
      English
      arrow-up
      7
      ·
      1 year ago

      AI can have my job. Its eyes will hurt within a week and it will be taking mental health days.

      • AlexWIWA@lemmy.ml
        link
        fedilink
        English
        arrow-up
        10
        ·
        1 year ago

        I’d love to give AI my job, but then I’d be homeless.

        I should clarify that I’m not against AI as a technology. I’m against it making me poor

          • tal@kbin.social
            link
            fedilink
            arrow-up
            1
            ·
            edit-2
            1 year ago

            AI will also solve the housing affordability crisis too so you won’t need to worry about that…right?!?

            I mean, realistically, I do expect someone to put together a viable house-construction robot at some point.

            https://www.homelight.com/blog/buyer-how-much-does-it-cost-to-build-a-house/

            A rough breakdown of the overall costs of building a home will look like this:

            Labor: 40%

            Also, I’d bet that it cuts into materials cost, because you don’t need to provide the material in a form convenient for a human to handle.

            I’ve seen people creating habitations with large-scale 3d printers, but that’s not really a practical solution. It’s just mechanically-simple, so easier to make the robot.

            I don’t know if it needs to use what we’d think of as AI today to do that. Maybe it will, if that’s a way to solve some problems conveniently. But I do think that automating house construction will happen at some point in time.