Microsoft caps the ChatGPT-powered chatbot on Bing at five queries per session

USA: Microsoft has limited the ChatGPT-powered Bing chatbot to five chat turns per session and 50 questions per day. The new generative artificial intelligence (AI)-powered Bing chatbot had produced responses that were "potentially alarming", suggesting the technology may not be ready for widespread use.

Microsoft took this action after several users complained about errors and erratic behavior from the chatbot.

The new Bing presents a tempting opportunity for Microsoft. However, early search results and conversations with the AI-powered chatbot revealed unexpected behavior.


Bing reportedly behaved in unexpected ways, and several users posted about their strange interactions on social media. In its blog post, Microsoft acknowledged that it had not fully foreseen how people would use Bing.

According to Microsoft, long chat sessions can cause Bing's AI-powered chatbot to repeat itself or give responses that don't match the intended tone.

The announcement follows reports from some users about the chatbot's strange responses. According to Microsoft, long sessions "can be repetitive or prompt/provoke" the chatbot "to provide responses that are not always helpful or in line with our designed tone".


In response to reports of inappropriate behavior by the AI tool, Microsoft has announced new conversational limits for the Bing chatbot.

According to the company, "most of you get a reply in five turns or fewer, and only 1% of chat conversations have 50 or more messages", and users will be prompted to pick a new topic once a chat session reaches five turns.

When users exceed the new per-session limit, Bing's chatbot will now prompt them to start a new topic to avoid repetitive, pointless exchanges.

To prevent the chatbot from getting confused, users must also clear the context after each chat session. They can do so by clicking the broom icon next to the search box, which clears their previous entries.


Microsoft has added these new restrictions to prevent Bing's ChatGPT-powered chatbot from making users doubt their own thoughts or experiences. The company expects the chatbot's strange behavior to improve over time.

According to Microsoft, one new use case for chat is as a tool for social entertainment and more general exploration of the world.
