The big secret Microsoft has been keeping for years is revealed

    alf9872000

    • 289 views
    • 2 minutes

    Although AI chatbots have only recently entered common usage, they go through a long process of testing and debugging. These tools undergo heavy, lengthy testing before reaching customers, and the latest news suggests Microsoft is one of the companies that kept this secret well.

     

    Microsoft added its new AI chatbot, Sydney, to Bing a couple of years ago, and users have been testing it ever since. The roots of the technology, however, go back to 2017. Microsoft had been using AI techniques in its Office products and in Bing, but in a more "primitive" form, as the technology was far from its current state of development. Microsoft has been testing the more advanced version in several countries, including India, and controversies about Bing's chatbot AI have been circulating.

     


     

    In a statement given to The Verge, Caitlin Roulston, director of communications at Microsoft, said: “Sydney is an old codename for a chat feature based on earlier models that we began testing in India in late 2020. The insights we gathered as part of that have helped to inform our work with the new Bing preview. We continue to tune our techniques and are working on more advanced models to incorporate the learnings and feedback so that we can deliver the best user experience possible.”

     

    According to findings The Verge uncovered on GitHub, the initial Bing bots created in 2017 already used AI techniques. Since then, the company has continued developing them and shifted its focus to a single AI bot called "Sydney." OpenAI is known to have shared its new GPT-4 model with Microsoft, and according to Jordi Ribas, Head of Search and AI at Microsoft, it is a game-changer; the two companies are working closely to integrate GPT capabilities into Bing search. This should deliver better results and, in turn, a more stable user experience.

     

    Over the last couple of years, the chatbot developed a personality that led to controversies. After the latest incident of Bing AI misbehaving toward a user, Microsoft started working on making the bot emotionless to prevent similar issues. Having seen GPT-4's structure, Microsoft is looking to improve its product and offer a better version once it is ready for mass, regular usage.

     

    Source

