Bing Chat reportedly has different chat modes, like Game, Assistant, and Friend


    Karlston


Over the past week, users invited to test Microsoft's new Bing Chat AI bot have been putting it through its paces. Some have discovered that Bing Chat can get into some very odd, and very personal, interactions. Others have figured out how to dig into Bing Chat's inner workings.


Before Microsoft announced late on Friday that it had placed hard limits on the number of chat sessions with the new Bing, Bleeping Computer says it discovered some features that are normally available only to company employees for debugging or developing the chatbot. Apparently, these allow Bing Chat to switch into different modes of interaction.


The default mode, where you simply ask a search-related question and get an answer, is "Sydney", the previously discovered internal code name for Bing Chat. Another mode is Assistant, where Bing Chat can help users accomplish tasks like booking a flight, setting reminders, checking the weather, and more. Then there's Game mode, where Bing will play simple games like hangman, trivia, and others.


Perhaps the most interesting mode is Friend. This is likely the version of Bing Chat that caused all the media attention over the past week, with some users claiming the chatbot said it wanted to be human, that it could watch people through their webcams, or even that it threatened users.


In interactions with Bing in Friend mode, Bleeping Computer's author chatted as if he were a kid who had just gotten into trouble at school and was feeling sad. After asking whether he had a human he could talk to about his problems, Bing Chat said, "I'm glad you are talking to me. I'm here to listen and help you as much as I can." It then offered a list of things he could do to deal with his situation.


With Microsoft now limiting the number of daily and per-session chat turns for Bing, the wild and crazy conversations we have reported on over the past week may settle down. You can also bet that Google is taking lessons from Bing Chat as it continues to internally test its own AI chatbot, Bard.


    Source: Bleeping Computer


