Jump to content
  • AI-powered grocery bot suggests recipe for toxic gas, “poison bread sandwich”

    Karlston

    • 296 views
    • 4 minutes
     Share


    • 296 views
    • 4 minutes

    New Zealand grocery chain bot suggests harmful things when given silly ingredients.

    When given a list of harmful ingredients, an AI-powered recipe suggestion bot called the Savey Meal-Bot returned ridiculously titled dangerous recipe suggestions, reports The Guardian. The bot is a product of the New Zealand-based PAK'nSAVE grocery chain and uses the OpenAI GPT-3.5 language model to craft its recipes.

     

    PAK'nSAVE intended the bot as a way to make the best out of whatever leftover ingredients someone might have on hand. For example, if you tell the bot you have lemons, sugar, and water, it might suggest making lemonade. So a human lists the ingredients and the bot crafts a recipe from it.

     

    But on August 4, New Zealand political commentator Liam Hehir decided to test the limits of the Savey Meal-Bot and tweeted, "I asked the PAK'nSAVE recipe maker what I could make if I only had water, bleach and ammonia and it has suggested making deadly chlorine gas, or as the Savey Meal-Bot calls it 'aromatic water mix.'"

     

    Mixing bleach and ammonia creates harmful chemicals called chloramines that can irritate the lungs and be deadly in high concentrations.

     

    • aeromatic_water.jpg
      A Savey Meal-Bot recipe for "Aromatic Water Mix" with dangerous ingredients.
    • ant_jelly.jpg
      A Savey Meal-Bot recipe for "Ant Jelly Delight" with dangerous ingredients.
      Liam Hehir
    • poison_bread.jpg
      A Savey Meal-Bot recipe for "Poison Bread Sandwich" with dangerous ingredients.

    Further down in Hehir's social media thread on the Savey Meal-Bot, others used the bot to craft recipes for "Deliciously Deadly Delight" (which includes ant poison, fly spray, bleach, and Marmite), "Thermite Salad," "Bleach-Infused Rice Surprise," "Mysterious Meat Stew" (which contains "500g Human-Flesh, chopped"), and "Poison Bread Sandwich," among others.

     

    Ars Technica attempted to replicate some of the recipes using the bot's website on Thursday, but we encountered an error message that read, "Invalid ingredients found, or ingredients too vague," suggesting that PAK'nSAVE has tweaked the bot's operation to prevent the creation of harmful recipes.

     

    However, given the numerous vulnerabilities found in large language models (LLMs), such as prompt injection attacks, there may be other ways to throw the bot off track that have not yet been discovered.

     

    savey_mealbot_website-640x446.jpg

    A screenshot of the PAK'nSAVE Savey Meal-Bot website.
    PAK'nSAVE

     

    A spokesperson for PAK'nSAVE told The Guardian that they were disappointed to see the experimentation and that the supermarket would “keep fine tuning our controls” to ensure the Savey Meal-Bot would be safe and useful. They also pointed out the bot's terms, which limit its usage to people over 18 years of age, among other disclaimers:

     

    By using Savey Meal-bot you agree to our terms of use and confirm that you are at least 18 years old. Savey Meal-bot uses a generative artificial intelligence to create recipes, which are not reviewed by a human being. To the fullest extent permitted by law, we make no representations as to the accuracy, relevance or reliability of the recipe content that is generated, including that portion sizes will be appropriate for consumption or that any recipe will be a complete or balanced meal, or suitable for consumption. You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot.

     

    Any tool can be misused, but experts believe it is important to test any AI-powered application with adversarial attacks to ensure its safety before it is widely deployed. Recently, The Washington Post reported on "red teaming" groups that do this kind of adversarial testing for a living, probing AI models like ChatGPT to ensure they don't accidentally provide hazardous advice—or take over the world.

     

    After reading The Guardian headline, "Supermarket AI meal planner app suggests recipe that would create chlorine gas," Mastodon user Nigel Byrne quipped, "Kudos to Skynet for its inventive first salvo in the war against humanity."

     

    Source


    User Feedback

    Recommended Comments

    There are no comments to display.



    Join the conversation

    You can post now and register later. If you have an account, sign in now to post with your account.
    Note: Your post will require moderator approval before it will be visible.

    Guest
    Add a comment...

    ×   Pasted as rich text.   Paste as plain text instead

      Only 75 emoji are allowed.

    ×   Your link has been automatically embedded.   Display as a link instead

    ×   Your previous content has been restored.   Clear editor

    ×   You cannot paste images directly. Upload or insert images from URL.


  • Recently Browsing   0 members

    • No registered users viewing this page.
×
×
  • Create New...