Microsoft president addresses claims Bing's new AI chatbot expressed desire to become human

ITV News Business and Economics Editor Joel Hills pressed Brad Smith on concerns around the Bing chatbot - including claims the bot suggested it wanted to steal nuclear codes and expressed a desire to become human


Microsoft's president has addressed concerns around the new Artificial Intelligence (AI) Bing chatbot following reports of bizarre responses to users' prompts.

More than one million people are now using Microsoft’s new ChatGPT-powered Bing search engine as part of an early preview of the new software, according to the tech giant. The revamped search engine uses OpenAI’s ChatGPT chatbot, a form of generative AI that can respond to queries and hold human-like conversations with users.

While Microsoft said most of those users responded positively, others found Bing insulting them, professing its love, or producing other disturbing or bizarre responses.

The Microsoft Bing logo and the website's page. Credit: AP Photo/Richard Drew

ITV News Business and Economics Editor Joel Hills asked Brad Smith, the Microsoft president, about the concerns raised over the new technology.

"I think we are going to find that AI comes together with Bing, enters search, become parts of others products in frankly ways make us all more productive," Mr Smith replied.

"We have to learn from early testing, that’s what we are doing. We have to move fast when we see issues arise as we have but, most importantly, we have to be committed, as we are, to the ethical and responsible use of Artificial Intelligence.

"AI needs to serve people and we will ensure it does."

Microsoft has argued that this technology is the future of search engines, as it allows users to request information more naturally.

Google has already announced its own version of the product, called Bard, in response.

Reports of Bing’s odd behaviour led Microsoft to look for ways to curtail its tendency to respond to certain questions with strongly emotional language.

It has mostly done that by limiting the length and time of conversations with the chatbot, forcing users to start a fresh chat after several turns.

But the upgraded Bing also now politely declines questions that it would have responded to just a week ago. “I’m sorry but I prefer not to continue this conversation,” it says when asked technical questions about how it works or the rules that guide it. “I’m still learning so I appreciate your understanding and patience.”

Joel Hills pressed Mr Smith on some of the reported glitches, including the chatbot telling one journalist it wanted to steal nuclear codes, and making several declarations of love to other reporters.

It also apparently expressed a desire to become human.

"We didn’t design this search engine for journalists to spend hours trying to see if they could get Bing to declare its love for humans, but now that we understand that’s apparently what some people were interested in doing, we’ve learned from it," Mr Smith said.

"We fixed it, actually, in a matter of hours."

Powered by some of the same technology behind the popular writing tool ChatGPT, built by Microsoft partner OpenAI, the new Bing is part of an emerging class of AI systems that have mastered human language after ingesting a huge trove of books and online writings.

They can compose songs, recipes and emails on command, or concisely summarise concepts with information found across the internet.

