Is Bing very rude? Microsoft promises to tame AI chatbot

Microsoft’s new AI-powered search engine has received mixed reactions since its launch. Now that the chatbot has been unleashed on the world, people are discovering that Bing’s AI personality is not as poised as one might expect. In several cases shared on social media, Bing users have faced bitter remarks, with Microsoft’s AI chatbot seen gaslighting, sulking, and manipulating people.

People have shared screenshots on social media of Bing’s hostile and bizarre answers in which it claims it is human, voices strong feelings, and is quick to defend itself.

While the revamped search engine can also write recipes and songs and explain almost anything it finds on the internet, users have complained more about its darker side.

In rushing the breakthrough AI technology to consumers last week, ahead of rival search giant Google, Microsoft acknowledged that the new product would get some facts wrong.

Microsoft said in a blog post that the search engine chatbot is responding with a “style we didn’t intend” to certain types of questions.

As a result, the tech giant has promised to make improvements to its AI-enhanced search engine Bing.

So far the new Bing is available only to a limited number of users, who must sign up for a waitlist to try the chatbot features, though Microsoft plans to eventually bring it to smartphone apps for wider use.

Microsoft said most users have responded positively to the new Bing, which has an impressive ability to mimic human language and grammar and takes just a few seconds to answer complicated questions by summarizing information found across the internet.

However, the company said that in some situations the chatbot can be “repetitive, or be prompted/provoked to give responses that are not necessarily helpful or in line with the designed tone”.

The company said that this usually happens when a conversation extends to 15 or more questions.

Some tech experts have compared Bing with Microsoft’s disastrous 2016 launch of the experimental chatbot Tay, which users trained to spout racist and sexist remarks. But the large language models that power technology such as Bing are a lot more advanced than Tay.


