Microsoft now lets anyone use its Bing Chat chatbot without joining a waitlist. Naturally, that means a lot of people are trying the service for the first time. However, some Bing Chat users are now reporting that responses are getting slower.
In a Twitter exchange, one user said he wanted to see faster response times, adding, "some times I have to wait so long that it's ridiculous". Mikhail Parakhin, the head of Microsoft's Advertising and Web Services division, responded with an apology, stating, "the usage keeps growing, we are not adding GPUs fast enough." However, he added that the situation will be fixed.
This situation highlights one of the biggest roadblocks for generative AI: these services need more and more specialized GPUs in data centers. The leading maker of those chips is currently NVIDIA. However, unconfirmed reports claim that AMD and Microsoft are teaming up to develop new AI chips that could reduce some of that dependence on NVIDIA.
Parakhin also took time this weekend to answer more Bing Chat questions from other Twitter users. He told one user that the long-awaited Bing Chat history feature is coming in "days". He told another that additional aspect ratios in Bing Image Creator are being discussed but may not arrive "immediately". Finally, he told a third user that he hopes Bing Chat will support Code Interpreter at some point, but added, "It needs to be done securely - not a trivial task."