• buttfarts@lemy.lol · 25 points · 3 months ago (edited)

      Absolutely. Biochemistry, astrophysics, imaging, diagnostic testing…

      Everybody is stuck on how it can make text/video/pictures, which is neat but not nearly as useful; that will overwhelmingly just be used for porn/scams/ads.

      However, parsing huge datasets, spotting patterns that would otherwise be impossible to detect, and correlating seemingly unrelated phenomena is going to be lit 🔥

      • Eatspancakes84@lemmy.world · 4 points · 3 months ago

        Absolutely, but when you also consider the ethical challenges (copyright, the livelihood of artists), the sustainability challenges (energy use), etc., the use cases you describe are not nearly as controversial as LLMs like ChatGPT.

    • ZILtoid1991@lemmy.worldOP · 13 points · 3 months ago

      The issue with that: it’s not as cool as generating AI slop from a few words, especially for boomers, who always thought art should be just a weekend hobby, done purely for the sake of self-enjoyment, all because art doesn’t involve getting muddy, oily, or getting “cool workplace injuries”, and is therefore a fake job.

      • LainTrain@lemmy.dbzer0.com · 2 points · 3 months ago

        That’s an oddly specific example. I’m guessing you might be an employed artist who has never been muddy, oily, or had a “cool workplace injury”.

    • explodicle@sh.itjust.works · 1 point · 3 months ago

      I’m working for a company that’s using it for sheet metal forming: you just upload an STL and robots make the part out of a blank sheet.
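
      For anyone curious what that hand-off involves on the user’s side, here’s a rough, purely illustrative sketch of a pre-upload sanity check (the numpy-stl package, the file name, and the sheet size are assumptions for the example, not the actual pipeline):

      ```python
      # Hypothetical pre-upload check: load an STL and verify that its footprint
      # fits on the blank sheet. Assumes the numpy-stl package ("pip install numpy-stl").
      from stl import mesh

      BLANK_SHEET_MM = (1000.0, 1000.0)  # made-up usable blank size (x, y)

      part = mesh.Mesh.from_file("bracket.stl")       # triangles as an (n, 3, 3) array
      points = part.vectors.reshape(-1, 3)            # all vertices as (3n, 3)
      size = points.max(axis=0) - points.min(axis=0)  # bounding-box dimensions

      print(f"{len(part.vectors)} triangles, bounding box {size} mm")
      if size[0] > BLANK_SHEET_MM[0] or size[1] > BLANK_SHEET_MM[1]:
          print("Part footprint exceeds the blank sheet; it would need to be split.")
      ```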

      Eventually we won’t need dies anymore (good for the environment) and will make sheet metal more efficiently (Jevons paradox territory, but we need to reduce total consumption anyway).

      IMHO we should have Pigouvian pollution taxes and then let the market decide which ideas are worth pursuing.

  • amotio@lemmy.world · 37 points · 3 months ago

    I don’t get this massive hate for AI. I run it locally on my PC and have been using it as a toy to make funny images, videos, and voice clones to make my family and friends laugh. I might be naive, but why the hate towards a tool that nobody forces you to use? It has its problems, sure, but to me it looks just like painters being presented with a camera.

    • djsoren19@yiffit.net · 85 points · 3 months ago

      “why the hate towards a tool that nobody forces you to use?”

      Right here is the problem: many companies like Adobe and Microsoft have made obtrusive AI that I would really like to not interact with, but I don’t have a choice at my job. I’d really like to not have to deal with AI chatbots when I need support, or find AI-written articles when I’m looking for a how-to guide, but companies do not offer that as an option.

      • TotallynotJessica@lemmy.world · 6 points · 3 months ago

        Capitalists foam at the mouth imagining how they can replace workers, when the reality is that they’re creating worse products that are less profitable because they don’t understand the fundamental limitations of the technology. Hallucinations will always exist because they happen in the biology these models are loosely based on. The model generates bullshit that doesn’t exist in the same way that our memory generates bullshit that didn’t happen. It’s filling in the blanks with what it thinks should go there; a system of educated guesswork that just tries to look convincing, not be correct.
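
        To make the “educated guesswork” point concrete, here’s a toy sketch (not any real model’s code; the probabilities are invented) of why a fluent-sounding completion tends to win even when the honest answer would be “unknown”:

        ```python
        # Toy "next-token" model: it only knows which words tend to follow,
        # not which statements are true. All probabilities are invented.
        import random

        next_token_probs = {
            ("The", "capital", "of", "Atlantis", "is"): {
                "Poseidonis": 0.45,  # sounds plausible, is made up
                "Atlantica": 0.35,   # also sounds plausible, also made up
                "unknown": 0.20,     # the honest answer is rarely the most fluent one
            },
        }

        def sample_next(context):
            dist = next_token_probs[context]
            tokens, weights = zip(*dist.items())
            return random.choices(tokens, weights=weights, k=1)[0]

        context = ("The", "capital", "of", "Atlantis", "is")
        print(" ".join(context + (sample_next(context),)))  # confidently completes a fiction
        ```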

    • wander1236@sh.itjust.works · 45 points · 3 months ago

      In a lot of cases you are forced to use AI: corporate “support” chatbots (not new, but still part of the cause of the fatigue), AI responses in search engines that are shown without you asking for them and tend to be flat-out incorrect, Windows Recall capturing constant screenshots of everything you do without an option to uninstall it, etc.

      And even if you’re not directly prompting an AI to produce some output for you, the internet is currently flooded with AI-written articles, AI-written books, AI-produced music, AI-generated images, and more that tend to not be properly indicated as being from AI, making it really hard to find real information. This was already a problem with generic SEO stuffing, but AI has just made it worse, and made it easier for bad actors to pump out useless or dangerous content, while not really providing anything useful for good actors in the same context.

      Pretty much all AI available right now is also trained on data that includes copyrighted work (explicitly or implicitly, this work shouldn’t have been used without permission), which a lot of people are rightfully unhappy about. If you’re just using that work for your own fun, that’s fine, but it becomes an issue when you then start selling what the AI produces.

      And even with all of that aside, it’s just so goddamned annoying for “AI” to be shoved into literally everything now. AI CPUs, AI earbuds, AI mice, AI vibrators, it never ends. The marketing around AI is incredibly exhausting. I know that’s not necessarily the fault of the technology, but it really doesn’t help make people like it.

      • zqwzzle@lemmy.ca · 17 points · 3 months ago

        Not to mention the fact that the data and the resultant LLMs have shown biases against certain groups.

        • wander1236@sh.itjust.works · 7 points · 3 months ago

          That’s kind of the name of the game with computer models, unfortunately. They’re reflections of the people making them and the data used to train them, which means they can’t be fully objective. Maybe one day we’ll figure out a way around that, but the current “AI” certainly isn’t it.

    • shapesandstuff@feddit.org · 15 points · 3 months ago

      For that purpose it’s fine and also completely unnecessary. Like, have fun with it, it doesn’t affect anything.

      It’s the stolen art, code, writing, etc. being misused for profit, and the regurgitated garbage art and misinformation, that are the issue.

  • Xanthrax@lemmy.world · 30 points · 3 months ago (edited)

    AI is a great tool for VFX and plenty of other things. It’s been the norm for decades. Fake techies turned it into a buzzword. It’s a great way to enable up-and-comers without resources to grow. If you’re lazily replacing people in your workflow, of course, that’s predatory. That’s what a lot of CEOs *want*, but they won’t achieve it. In reality, these tools are being created by the people they’re supposed to “replace” to make their own jobs easier. I have a passion for art, programming, and a lot of other things that are “affected” by AI. So far, it seems like fear-mongering. Traditionalists always get fucked in the art world. You just kinda shoot for it. (I work in graphite and animation.)

  • Janet@lemmy.blahaj.zone · 28 points · 3 months ago

    i had to admonish a couple of people for using it for shits and giggles… it felt a bit off, but i managed to salvage the situation with an explanation, which then felt neater because in the end, both times, we agreed ai needs less attention rather than more (i.e. don’t play around with it; if you want to see it “in action”/need an example, there is already enough generated crap in the sink)

    • MotoAsh@lemmy.world · 13 points · 3 months ago

      lol voted down for telling people to be more responsible… Seems Lemmy is hardly a wiser place than Reddit.

      • ZILtoid1991@lemmy.worldOP · 27 points · 3 months ago

        The concept of FUD and its consequences…

        (Just to give people more reasons to downvote my comment: China is a fake communist state collaborating with far-right dictatorships, Tiananmen was real, free Tibet, stop the Uyghur reeducation camps, Putin is an imperialist, all glory to Ukraine, and gatekeeping of fandoms sucks!)

  • nimpnin@sopuli.xyz · 28 points · 3 months ago

    Well, have you considered that I don’t need to write stupid long overly polite emails myself anymore?

    • TootSweet@lemmy.world · 25 points · 3 months ago

      You just need to proofread stupid long overly polite emails to make sure they’re actually overly polite and don’t tell the recipient random made-up bullshit.

  • daniskarma@lemmy.dbzer0.com · 25 points · 3 months ago (edited)

    I don’t think it’s like that.

    The worst of AI is how it’s being pushed by big corporations into every product to “sell better” and “collect more data”.

    But there are plenty of legitimate uses for both AI as a concept and in particular generative LLMs.

    Are you telling me that the people who use AI to spice up their RPG sessions with images and text are the devil??

    Sometimes I feel like the anti-AI movement is reaching religious levels of dogmatism. It’s too early for a Butlerian jihad.

    • ZILtoid1991@lemmy.worldOP · 8 points · 3 months ago

      The issues with those use cases are the normalization of such technologies on a larger scale, and the eventual reduction of the artistic process to just having an idea.

      If we were in a post-capitalistic world, I wouldn’t be as concerned about the normalization part. However, one of my biggest fears is that the anti-AI movement gets tired out, and then with better AI technologies and sneakier uses, it gets normalized even more.

      When I’m creating, I’m also interested in the implementation of the idea, not just the idea itself. Generative AI simply reduces the creative process to “coming up with ideas”. And a “good idea” does not guarantee a “good outcome”. I cannot count the number of good ideas wasted on bad execution, including AI-generated stuff. In those cases, many good ideas were just put into a generator instead of actually going through the creative process.

      Sure, AI could become better, and many “AI prompters” could graduate into “AI art directors”. There’s one problem with that: it could also kill AI art, as its biggest selling point to its customers and fans is the reduction of the artistic process to coming up with ideas.

      • daniskarma@lemmy.dbzer0.com · 9 points · 3 months ago (edited)

        You’ll know how much the means of creating art have changed over the centuries. Different, or more time-efficient, does not mean worse.

        Also, if you have been an artist for a few decades now, you’d have been around when digital art was introduced and for the complaints it raised among traditional artists.

        The complaints here are very similar to those. It’s just a new tool. It can be used to make good or bad art, the same as a Photoshop brush. And Adobe is as bad and as big a corporation as OpenAI (probably bigger and worse).

        And no, making AI art is not instant, nor is it just typing “make me a nice bunny” and enjoying the result. It also has a process, with many steps and iterations, and if what you aim to do is something good, a lot of the time it needs to be completed with traditional digital art. Once again, it’s just a tool; how it’s used is up to the artist.

        I know perfectly well that this is not about the “integrity of art”. This is mostly about “commission art” or “industrial filler art” (like unimportant video game assets, backgrounds, etc.) that was paying the bills for many people and has been seriously threatened by generative AI, because for the people paying for that type of art, the results of an AI model are good enough for a fraction of the price.

        But again, it’s the same thing that happened before with digital art. Back then, far more traditional artists were needed to get the same result that fewer digital artists produce now.

        Progress has always killed jobs, and people have needed to learn new skills. That’s why we need social protection systems, so people can stay employed despite that.

        • ZILtoid1991@lemmy.worldOP · 6 points · 3 months ago

          Oh yes, the evergreen argument of “but previous technologies”…

          Digital art did not intend to replace the artist, but instead gave them a new kind of canvas, instrument, etc. AI art does intend to. And seeing the patterns in the tech industry, AI companies are absolutely trying to drive people out of the creative industry by undercutting them, only to raise prices back up again later.

          The backlash was also much milder, and it often came from real elitists: artists who berated e.g. drawing as a “lesser medium” compared to watercolors, not just digital art.

          • nimpnin@sopuli.xyz · 6 points · 3 months ago

            Well, digital art did not, but photography surely did. And eventually it was for the better for everybody.

          • daniskarma@lemmy.dbzer0.com · 3 points · 3 months ago

            AI does not aim to replace the artist. That is beyond the reach of the technology.

            Generative AI aims to make one artist produce more art in less time. Same as digital art or photography with respect to portraits.

            What capitalist companies do with a technology is always bad. That’s why I do not like capitalism. But primitivism and halting progress are not the solution. If capitalism is causing issues, maybe the solution is ending capitalism.

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 16 points · 3 months ago

    Genning is still power-hungry and expensive, but it has its applications. The problem is that industrialists want to do with it what they’ve wanted to do with every previous step of automation, which is replace workers with it.

    And the problem is, the way people justify their existence to the societies we have is through employment or profits. If you don’t have those, you go homeless, and now, according to SCOTUS, you are an unperson.

    And so now it is conspicuous any time an employer lays someone off or removes them from a job, even to maximize profits. That is a life-threatening action, and it raises the question of whether institutions exist for humankind, or vice versa. If it’s vice versa, then Viva la revolución! Party like it’s 1789! The ownership class will tremble!

    But for now we seem happy to let billionaires put all their resources into making their number go up and stopping us from resisting this impulse by force. Including robot dogs with guns.

    • AngryCommieKender@lemmy.world · 1 point · 3 months ago

      Admiral Stabby is fine, but I have no issue hitting a gun-toting robot dog with an improvised EMP, a Molotov cocktail, or, hell, just a sledgehammer.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone · 2 points · 3 months ago

        Nor should you!

        When La Résistance started organizing in occupied Paris, it was because the German garrison picked that fight. The Germans couldn’t help but be brutal and abusive toward the French (despite orders to police gently), and individuals in the public felt compelled to misbehave in small acts of resistance (slashing tires, defacing propaganda posters, cutting phone lines). Things escalated from there.

  • Manifish_Destiny@lemmy.world · 16 points · 3 months ago (edited)

    I can code 3 or 4x faster using an LLM than I can without. Granted, most of the stuff I have to write is under 200 lines; AI becomes significantly less useful when the codebase is any larger than that.

    I realize I’m also an outlier. Most people didn’t get such a productivity boost.

    • LANIK2000@lemmy.world · 9 points · 3 months ago

      80% of my programming work is solving problems and designing stuff. The only productivity boost I get is when working with proprietary libraries that have most of their documentation in customer support tickets (it wouldn’t be a problem if I could just read the bloody source code, or if our company didn’t think that paying UNHOLY AMOUNTS OF MONEY for shit makes it better), or when interacting with a new system, where I know exactly what I want but just don’t know the new syntax or names. It’s handy, but definitely not a game changer.

  • slacktoid@lemmy.ml · 8 points · 3 months ago

    I think the problem is capitalism: AI is used to make things better for the capital owner without making anything realistically better for the worker, and sometimes makes things worse because the owner is an out-of-touch idiot.

  • catsup · 5 points · 3 months ago

    Literally every single one of those statements is wrong

  • orcrist@lemm.ee · 5 points · 3 months ago

    Template writers and clipart generators are peachy. They save us time. The people who develop them have mostly positive intentions. There’s nothing wrong with sustainable research and progress in software of this sort.

    You should be focusing on the salespeople, the investors (speculators), the marketers, the corporate buyers. These people have mostly bad intentions.

  • kingthrillgore@lemmy.ml · 5 points · 3 months ago (edited)

    NaNoWriMo came out as pro-GenAI after an AI sloppenheimer generator became a sponsor. I’m definitely not negative enough.