• @FiniteBanjo@lemmy.today
    4
    6 months ago

    Mate, all it does is predict the next word or phrase. It doesn’t know what you’re trying to do, and it has no ethics. When it fucks up, it’s going to be your fuckup, and since you relied on the bot rather than learning to do it yourself, you’re not going to be able to fix it.

    • Joelk111
      2
      6 months ago

      I understand how it works, but that’s irrelevant if it works as a tool in my toolkit. I’m also not relying on the LLM; I take its output with a massive grain of salt. It usually gets most of the way there, and I fix the remaining issues or have it revise the code. For simple stuff that’d otherwise be busywork for me, it does pretty well.

      It would be my fuckup if it fucks up and I don’t catch it. I’m not putting the code it writes directly into production; I’m not stupid.