
Microsoft is loosening the limits on Bing AI chatbot conversations that it put into effect just days ago because users didn't like them

Sarah Jackson   

  • Last week, Microsoft started limiting Bing conversations after its AI chatbot said some "creepy" things.
  • Four days later, Microsoft is loosening the restrictions because users didn't like them.

Microsoft is modifying a change it made just last week to the new AI-powered Bing after users weren't happy with it.

On Friday, the company announced it'd be capping conversations with Bing's AI chatbot at five chat turns per session and 50 per day. The company defines a "chat turn" as an exchange with both a user's question and Bing's response.

Just four days later, Microsoft is easing those limits because users wanted longer conversations with the ChatGPT-powered Bing again.

"Since placing the chat limits, we have received feedback from many of you wanting a return of longer chats, so that you can both search more effectively and interact with the chat feature better," the company said in a blog post Tuesday.

In response to the criticism, Microsoft is loosening the restrictions to allow six chat turns per session and 60 total chats per day, which it says is enough to accommodate the "natural daily use of Bing" for the "vast majority" of users.

Microsoft says it plans to increase the cap to 100 total chats per day soon. The company first put the limits in place because Bing was drawing attention for giving some unhinged responses.

Viral screenshots shared on Reddit, Twitter, and other platforms showed Bing appearing to gaslight users, profess "I love you," and have existential crises. In its Friday announcement, Microsoft said very long conversations can "confuse the underlying chat model in the new Bing."
