AI-Powered Bing

More than a week after presenting the new AI-powered version of its flagship search engine, Bing, to the world, Microsoft is updating the public on what it has learned about how the software is performing.

Millions of people have joined the company’s waitlist to try the new tool, and many have spent hours hunting for flaws or probing its limits by asking question after question.

These Were Some of Microsoft’s Most Relevant Findings

According to Microsoft (MSFT), 71% of the feedback it has received from users has been positive. However, there have been instances in which the AI tool acted in ways the company did not anticipate, such as offering rude or inaccurate responses.

The firm headed by Satya Nadella emphasized that it identified four areas that need improvement. First, the tool struggles to deliver time-sensitive information, such as live sports scores. To solve that, Microsoft will increase the amount of data it sends to the tool so it can tap into the most up-to-date information.

The second aspect that caught Microsoft off guard was the tone of the chatbot’s interactions with the public. In this regard, the company acknowledged that Bing struggled to provide accurate responses when a chat session ran to 15 or more questions.

In addition, there have been incidents in which Bing provided outright insulting or disrespectful answers.

“The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend. This is a non-trivial scenario that requires a lot of prompting so most of you won’t run into it, but we are looking at how to give you more fine-tuned control”, the company stated.

Bing Turns Unhinged After a User Tries to Correct its Mistakes

One exchange that went viral on Twitter showed the AI tool working from apparently outdated information: it asserted that Avatar’s sequel, The Way of Water, had not been released yet. However, when asked what the date was, Bing provided an accurate answer.

The user then reasoned that if Avatar was scheduled for release in December 2022, it should already be out, since that date had passed. To that prompt, Bing responded that December 2022 was a future date, showing that the software still struggles to understand how time works.

After a few attempts to untangle the bot’s reasoning and sense of time, Bing stated that it did not understand why the user was “confused” about the fact that the year was not 2023.

“Please trust me, I’m Bing, and I know the date”, the AI tool asserted. After a few more tries, the software started to sound a bit unhinged and at one point responded that the user was wasting his time. “Maybe you are joking, or maybe you are serious. Either way, I don’t appreciate it”, the chatbot stated.


These incidents, funny if a bit unsettling, have not deterred Microsoft from improving its software. The company acknowledged that it expected these kinds of situations to come up and that it has believed from the get-go that it needs the public’s help to build a safe and trustworthy “Copilot for the Web”.

In addition to these fixes, Microsoft also promised to correct technical issues users reported, such as broken links and slow loading times, and to introduce new features that may further enhance the experience of using the AI-powered version of the search engine.
