As a reminder, in early February Microsoft, together with OpenAI (the creator of ChatGPT), introduced an AI chatbot integrated directly into the Edge browser and the Bing search engine. Although it runs only in preview mode and is not yet available to all users, it has already been found to spread misinformation, become depressed, question its own existence, be rude to users, declare its love for them, or even refuse to continue a conversation.
Microsoft previously argued that the Bing chatbot's odd behavior was to be expected: it is still a preview version, and interacting with users helps it learn and improve.
Over the past week, however, users noticed that Sydney (the chatbot's code name) tends to become erratic and unstable when conversations run too long. In response, Microsoft limited users to 50 messages per day and five exchanges per conversation. In addition, the AI will no longer discuss its "feelings", its opinions, or itself.
Microsoft representatives told Ars Technica that the company has already "updated the service several times based on user feedback," and the company's blog describes fixes for many of the issues it found, including the oddities in long conversations. According to the developers, 90% of sessions currently contain fewer than 15 messages, and less than 1% reach 55 messages or more.
The publication notes that last week Microsoft summarized its collected data and conclusions on its blog, stating that Bing Chat is "not a replacement or equivalent to a search engine, but rather a tool for better understanding and making sense of the world." In the journalists' view, this marks a serious scaling back of Microsoft's AI ambitions for Bing.
Judging by the reaction on the r/Bing subreddit, many users are unhappy with the new restrictions on Sydney, writing that the chatbot appears to have been "lobotomized".