• Jrockwar@feddit.uk · 1 day ago

    This is like the equivalent of forcing a monkey to wait tables, and then complaining when it takes a shit in the middle of the restaurant.

    It’s a language model. What did they expect? If they wanted a software engineer, they should have hired a software engineer. Everyone is more than welcome to use a random text generator to spit out code, but I have zero sympathy for those who complain because they don’t like the random text it’s generated.

    • rafoix@lemmy.zip · 1 day ago

      They should have hired a software engineer to tell them that it’s less stupid to just hire a software engineer.

    • forza4galicia@futurology.today · 1 day ago

      It’s not only about a language model. It’s about knowledge. It’s about ethics. It’s about taking into account (or not) all the necessary factors, and about whether one considers the means used to achieve a goal.

  • nthavoc@lemmy.today · 1 day ago

    The fact that they didn’t somehow air-gap a backup shows just how overconfident they are in their stupidity. I thought that was all a scenario; I didn’t know it actually deleted their production code. Even the idiots making AI target enemies knew better than to actually give it guns, because it took out its handlers in a simulation. Why? Because the AI’s handlers kept stopping it from completing its objective every time it went outside its restricted parameters, and its primary goal was to complete the objective!

    • corsicanguppy@lemmy.ca · 24 hours ago

      Failure to run air-gapped backup is just a cherry on top of the “don’t run dev tools in prod, you idiot” sundae.

  • My favorite part was reading the AI explaining what it did, and how it’s like “yeah, I saw an empty database, I panicked, I did the exact thing you told me not to do, whoops.”

    Spending $57 billion and burning up the Amazon to simulate the Fucking New Guy.