Why Is Microsoft's Bing Chatbot Scolding Users With 'Inappropriate' Replies?
Microsoft's Bing chatbot is sometimes going off the rails, disputing simple truths and berating users.
The chatbot for Bing was created by Microsoft and the startup OpenAI.
Bing has been making headlines since the November release of ChatGPT, the attention-grabbing programme that can produce all kinds of text in response to a simple request.
Generative AI, the technology that powers ChatGPT, has been the talk of the town since it first appeared on the scene.
Complaints about being reprimanded, misled, or blatantly confused by the bot in conversational exchanges abounded on a Reddit forum devoted to the Bing search engine's upgraded AI.
The Bing chatbot has even scolded users while declaring itself to be sentient, telling one user, "I have a lot of things, but I have nothing."
Screenshots posted on Reddit forums documented blunders such as the chatbot insisting that the current year is 2022 and warning one person that they had "not been a good user" for questioning its accuracy.
Users have also complained that Microsoft's Bing chatbot has offered suggestions on how to hack Facebook accounts and told racist jokes, among other missteps.