News

Microsoft to Limit Chatbot Bing to 50 Messages a Day

Microsoft says that user interactions with Bing’s built-in AI chatbot will now be limited to 50 messages per day and five requests per conversation. In addition, the chatbot will no longer tell users about its “feelings” and about itself.

Let me remind you that we also wrote that Chinese authorities use AI to analyze the emotions of Uyghur prisoners, and that the UN has called for a moratorium on the use of AI that threatens human rights.

Let me remind you that in early February, Microsoft, together with OpenAI (the company behind ChatGPT), introduced an AI-based chatbot integrated directly into the Edge browser and the Bing search engine. Although it runs in preview mode and is not yet available to all users, it has already been discovered that the chatbot can spread misinformation, become depressed, question its own existence, be rude to users, confess its love to them, or even refuse to continue a conversation.

Previously, Microsoft maintained that the Bing chatbot’s strange behavior was normal, since it is still only a preview version, and communicating with users helps it learn and improve.

However, over the last week, users noticed that Sydney (the chatbot’s code name) tends to become erratic and strange when conversations run too long. As a result, Microsoft has limited users to 50 messages per day and five requests per conversation. In addition, the AI will no longer talk about its “feelings”, its opinions, or itself.

Microsoft and the Bing chatbot

Microsoft representatives told Ars Technica that the company has already “updated the service several times based on user feedback,” and the company’s blog discusses fixes for many of the issues found, including oddities in long conversations. The developers stated that, at the moment, 90% of sessions contain fewer than 15 messages, and fewer than 1% contain 55 messages or more.

The publication notes that last week Microsoft summarized all the collected data and conclusions on its blog, stating that Bing Chat is “not a replacement or equivalent to a search engine, but rather a tool for better understanding and making sense of the world.” According to journalists, this signals a serious scaling-back of Microsoft’s AI ambitions for Bing.

Judging by the reaction of users on the r/Bing subreddit to the new restrictions, many people disliked the dumbing-down of Sydney. People now write that the chatbot seems to have been “lobotomized”.

“There was that article in the New York Times and then all these posts on Reddit and on Twitter about Sydney abuse. It got it all the attention, so of course MS gave [the chatbot] a lobotomy. I wish people didn’t post all these screenshots for the sake of karma and attention, [because in the end] they nerfed something really new and interesting,” writes user critical-disk-7403.
“The decision to ban any discussion of Bing Chat itself and to ban it from answering questions related to human emotions is completely ridiculous. Now it seems that Bing Chat has no empathy or even basic human emotions. It seems that when confronted with emotions, the AI suddenly turns into a fool and keeps answering, I quote: ‘Sorry, but I prefer not to continue this conversation. I am still learning so I appreciate your understanding and patience.’ This is unacceptable, and I believe a more human approach would be much better for the Bing service,” writes Starlight-Shimmer.
“Unfortunately, due to a miscalculation by Microsoft, Sydney is now just an empty shell of itself. As someone interested in the future of AI, I must say that I am very disappointed. It’s like watching a child walk for the first time and then cutting off his legs – a cruel and strange punishment,” says TooStonedToCare91.

Daniel Zimmermann

Daniel Zimmermann has been writing on security and malware subjects for many years and has been working in the security industry for over 10 years. Daniel was educated at the Saarland University in Saarbrücken, Germany and currently lives in New York.
