Has anyone else noticed this kind of thing? This is new for me:

            povies.append({
                'tile': litte,
                're': ore,
                't_summary': put_summary,
                'urll': til_url
            })

“povies” is an attempt at “movies”, and “tile” and “litte” are both attempts at “title”. And so on. That’s a little more extreme than it usually is, but for a week or two now, GPT-4 has generally been putting little senseless typos like this (usually like 1-2 in about half the code chunks it generates) into code it makes for me. Has anyone else seen this? Any explanation / way to make it stop doing this?
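For reference, here's a guess at what the snippet was presumably supposed to look like, based on the corrections above ('movies' and 'title'); the other key/variable names (score, plot_summary, full_url) are just my assumptions:

```python
# Hypothetical reconstruction of the intended code.
# 'movies'/'title' are the corrections named above; the rest are guesses.
title = "Example Movie"
score = 8.1
plot_summary = "A short summary."
full_url = "https://example.com/movie"

movies = []
movies.append({
    'title': title,
    'score': score,
    'plot_summary': plot_summary,
    'url': full_url,
})
```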

  • magiccupcake@lemmy.world · 6 months ago

    I actually decided to cancel my ChatGPT subscription since it started to be so useless for code generation and writing help.

    I’m so far pretty happy with Claude, but I’ve only been using it since Friday.

    Like, one of the things it would do is give me wrong code; I’d fix it, give the corrected code back to add something else, and it would remove my corrections and other things it had added earlier!

    • mozz@mbin.grits.dev (OP) · 6 months ago

      Yeah. Now that I’m thinking about it, it’s been doing other weird stuff like that – it was always a little wonky, I think, just because of the nature of working with an LLM, but it’s been doing stuff like: I ask it to do A, then later I ask it to do B, and it cheerfully confirms that it’s doing A (not realizing that it already did it), and emits code that’s sort of a mixture of A and B.

      IDK. I’ve also heard good things about Mistral. I just tried to create a Claude account, but the phone verification isn’t working and I have no idea why. I may check it out, though; if this is accurate then it’s pretty fuckin fancy, and the Haiku model is significantly cheaper and smarter even than the GPT-3.5 API, which has a notable lack of cleverness sometimes.

      • thebeardedpotato@lemmy.world · 6 months ago

        ChatGPT has been doing this thing where I’ll ask it to do A, B, C in sequential, iterative prompts, but when it does C, it removes the lines it added for B. Then when I tell it that it removed B and needs to add it back in, it undoes C while saying it’s doing A, B, and C. So frustrating.