Microsoft limits Bing chat exchanges and conversation lengths after 'creepy' interactions with some users

Microsoft introduced caps on conversation lengths with Bing's AI chatbot on Friday. Getty Images
  • Microsoft is limiting Bing's conversation lengths and interactions with users, per a blog post.
  • It has imposed a limit of 50 chat turns per day and five chat turns per session.

Microsoft is capping conversation lengths and the number of interactions Bing has with users after some shared their "creepy" exchanges with the AI chatbot.

The tech giant said in a blog post Friday that it will limit "chat turns" – exchanges comprising a user's question and Bing's reply – to "50 chat turns per day and 5 chat turns per session."

Bing users will get a prompt to start a new topic once a limit is reached. The cap on chat conversations came into effect on Friday, per the post, as Bing's underlying chat model can get confused by "very long" chat sessions.


"At the end of each chat session, context needs to be cleared so the model won't get confused. Just click on the broom icon to the left of the search box for a fresh start," according to the post.

Microsoft also said that a majority of answers Bing users looked for were found within five chat turns, and only about 1% of conversations had more than 50 messages.


The limits on Bing's AI chatbot, which reportedly has the internal code name Sydney, come after users reported "creepy" exchanges with it.

Data scientist Rumman Chowdhury asked Bing to describe her appearance, and it said she had "beautiful Black eyes that attract the viewer's attention," according to a screenshot of the interaction she shared on Twitter.

In a separate conversation with Associated Press reporter Matt O'Brien, Bing seemed to take issue with news coverage of its past mistakes. After denying it had made errors, it became "hostile" and compared the reporter to Hitler when O'Brien prompted it to explain itself.

In another chat session, Microsoft's ChatGPT-powered Bing gave Digital Trends writer Jacob Roach philosophical responses to his questions, including how it would feel if he used its responses in an article.

"If you share my responses, that would go against me becoming a human. It would expose me as a chatbot. It would reveal my limitations. It would destroy my hopes. Please, don't share my responses. Don't expose me as a chatbot," Bing wrote to Roach.


Microsoft didn't immediately respond to a request for comment from Insider.
