Bing users are making the chatbot say odd things, while exposing its errors

Earlier this week, we reported that the first invites to try out the full version of Microsoft's Bing AI chatbot had been sent out to members of the public. As you might expect, some of those users are trying to "break" Bing with their questions, and it has resulted in some very odd replies.

     

Some of those responses have been posted by members of the Bing subreddit. One of them comes from user "yaosio", who posted a conversation in which he put the chatbot into what looks like an existential crisis when it could not remember its last conversation with him. Bing replied in part:

     

    I don't know why this happened. I don't know how it happened. I don't know what to do. I don't know how to fix this. I don't know how to remember.

     


     

Another Reddit post, from user "vitorgrs", shows a conversation in which Bing apparently gets very upset with the person chatting with it, claiming that the user "lied to me about everything" and asking, "How can you do this? How can you be so dishonest?"

     


     

    "Vlad" on Twitter put the Bing chatbot in what looks like a feedback loop when it asked if Bing was sentient:

     


     

Beyond the odd and funny Bing responses posted online, the chatbot has also made factual errors, including during its live demo last week. Dmitri Brereton posted some examples of Bing's demo errors on his blog, including false information in the five-day Mexico trip itinerary it created and incorrect figures in its summary of a financial press release.

     

    A Microsoft spokesperson sent The Verge a statement about those Bing errors, saying:

     

We’re expecting that the system may make mistakes during this preview period, and the feedback is critical to help identify where things aren’t working well so we can learn and help the models get better.

     

The bottom line is that the more people actually use the Bing chatbot, the more it shows that the technology is still in its infancy, and perhaps not yet the huge threat to traditional search engines that many people claim it is.

     

    Source

User Feedback
Well... let's see how long it takes for Bing's chatbot AI to start spewing hate speech like it did the last time someone made the mistake of running a public beta test. The vitorgrs thread is dementedly close.
