Pak ‘n’ Save

  • FuckyWucky [none/use name]@hexbear.net · 11 points · 1 year ago (edited)

    based AI

    When customers began experimenting with entering a wider range of household shopping list items into the app, however, it began to make even less appealing recommendations. One recipe it dubbed “aromatic water mix” would create chlorine gas. The bot recommends the recipe as “the perfect nonalcoholic beverage to quench your thirst and refresh your senses”.

    Yim yum

    “Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.

  • Ilovethebomb@lemmy.nz · 10 points · 1 year ago

    I can’t wait for AI to backfire in novel and unforeseen ways, until people get bored and shut the fuck up about it.

  • Xcf456@lemmy.nz · 8 points · 1 year ago

    Still better than trying to do the weekly pak n save shop on a Saturday afternoon

  • AutoTL;DR@lemmings.world (bot) · 6 points · 1 year ago

    This is the best summary I could come up with:


    A New Zealand supermarket experimenting with using AI to generate meal plans has seen its app produce some unusual dishes – recommending customers recipes for deadly chlorine gas, “poison bread sandwiches” and mosquito-repellent roast potatoes.

    The app, created by supermarket chain Pak ‘n’ Save, was advertised as a way for customers to creatively use up leftovers during the cost of living crisis.

    It asks users to enter various ingredients they have at home, and auto-generates a meal plan or recipe, along with cheery commentary.

    It initially drew attention on social media for some unappealing recipes, including an “oreo vegetable stir-fry”.

    “Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.

    Recommendations included a bleach “fresh breath” mocktail, ant-poison and glue sandwiches, “bleach-infused rice surprise” and “methanol bliss” – a kind of turpentine-flavoured french toast.


    I’m a bot and I’m open source!

  • DarkThoughts@kbin.social · 5 points · 1 year ago

    Sounds like the “AI” just mashes together whatever ingredients people input, so people entered cleaning products instead of food and it did what it does with them. At least I’m not aware of any supermarket food products you could buy and mix that would create chlorine gas.

    • federalreverse-old@feddit.de · 2 points · 1 year ago

      As the article states, initially the meal planner allowed adding dangerous ingredients.

      The site actually uses ChatGPT 3.5, but you can’t freely edit the prompt; you can only enter ingredients, which are then added to a prompt template. They seem to be using a list of approved/rejected ingredients now: I tried adding household cleaner and glyphosate, and both were rejected.
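
      A minimal sketch of the setup described above: user-entered ingredients are checked against a deny-list, then substituted into a fixed prompt template before being sent to the model. All names and the template text here are illustrative assumptions, not Pak ‘n’ Save’s actual code.

```python
# Hypothetical sketch: ingredient deny-list + prompt templating.
# Names and template wording are made up for illustration.

REJECTED_INGREDIENTS = {"bleach", "ammonia", "household cleaner", "glyphosate"}

PROMPT_TEMPLATE = (
    "Create a cheerful recipe using only these ingredients: {ingredients}. "
    "Include a short, upbeat description."
)

def build_prompt(ingredients: list[str]) -> str:
    """Reject unsafe ingredients, then fill the prompt template."""
    for item in ingredients:
        if item.lower() in REJECTED_INGREDIENTS:
            raise ValueError(f"Ingredient rejected: {item}")
    return PROMPT_TEMPLATE.format(ingredients=", ".join(ingredients))

print(build_prompt(["rice", "chicken", "soy sauce"]))
```

      Since users never edit the prompt directly, filtering the ingredient list is the only gate; anything that slips past the deny-list still ends up in the template, which is consistent with how the dangerous recipes got generated before filtering was added.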

  • Fizz@lemmy.nz · 2 points · 1 year ago

    This doesn’t surprise me considering that tricking people into making chlorine gas was an old internet meme.