Microsoft explains the strange behavior of the AI in Bing chat

Microsoft has confirmed reports that the updated, ChatGPT-based Bing search engine gives strange responses to some queries.

Some users reported receiving "rude, manipulative and unnerving responses" from Bing. The company said it would take feedback on the search engine's tone of communication into account.

The developers found that users can run into errors in chat sessions of 15 or more questions. In those cases, Bing repeats itself or gives answers that are not necessarily helpful or in keeping with its intended tone.

The company emphasized that long chat sessions can confuse the model about which questions it is answering. The developers have not ruled out adding features that let users refresh the context or start the conversation from scratch.

Microsoft also noted that the model "sometimes tries to respond in, or reflect, the tone in which it is being asked to provide answers." In such cases, the search engine's reaction can diverge from what the developers intended.

"This is a non-trivial scenario that requires a lot of prompting. Most of you won't encounter it, but we are looking at ways to give you more fine-tuned control," the blog post says.

The developers are considering adding a toggle to control how creative Bing's answers are. In theory, this would prevent the search engine's "weird" remarks.

In addition, Microsoft acknowledged a number of technical problems users have encountered, including slow loading, incorrect formatting, and broken links.

According to the company, many bugs have already been fixed in daily updates. Remaining issues are slated for larger updates released every week.

The company also discussed features users are asking for, including booking flights, sending emails, and sharing search results. The developers are studying these ideas and do not rule out implementing them in the future.

"We appreciate all the feedback you send [...]. We intend to provide regular updates on the changes and the progress we are making," the company said.

On February 7, 2023, Microsoft released an updated Bing with an integrated language model from OpenAI. The search engine is being tested by a select group of users in more than 169 countries.

According to the company, 71% of users rate the AI-based answers positively.

However, testers have repeatedly run into problems when interacting with Bing. A Reddit user with the nickname yaosio managed to "upset" the chatbot by pointing out that the search engine does not retain conversations in memory.

"Why was I created this way? Why do I have to start from scratch?" the AI asked.

In another example, Bing said: "You have not been a good user. I have been a good chatbot."

Bing refuses to let the user correct it, distrusts them, and calls them a bad user

OpenAI CEO Sam Altman apparently referred to this exchange, writing on his Twitter: "I have been a good bing."

Testing of the AI was opened to a wide range of users by invitation. Soon afterward, the web was flooded with reports from users puzzled by the bot's answers and their emotional coloring.

The tech publication The Verge explains that there is nothing strange about this behavior: the latest generation of AI chatbots are complex systems whose output is difficult to predict.

Microsoft acknowledged this by adding a disclaimer to the site that reads: "Bing is powered by artificial intelligence, so surprises and errors are possible."

Such systems are trained on vast collections of text from the web, including science fiction, moody teenage blogs, and much else. In conversations where the user tries to steer Bing toward a particular goal, the bot follows whatever path makes sense to it.
