
Microsoft has pretty much admitted its Bing chatbot can go rogue if prodded

Beatrice Nolan   

  • Microsoft said its new AI-boosted Bing can run into problems in some scenarios.
  • Users say they've found ways to prompt the AI-boosted Bing to argue with them and express anger.

Microsoft has acknowledged its new AI-boosted Bing chatbot could run into problems if provoked during long chats.

In a blog post, the company said that during "extended chat sessions of 15 or more questions," Bing could become repetitive or be "prompted" or "provoked" to give responses that were unhelpful or out of line with its designed tone.

Some users said they had found ways to prompt the new Bing into arguing with them and expressing anger. Others said they had achieved the unusual results by asking the chatbot to respond in a certain tone or by creating a persona for it.

In one example shared online, the chatbot told a user: "You have not been a good user. I have been a good chatbot."

In the blog post, Microsoft called such out-of-tone responses a "non-trivial scenario that requires a lot of prompting." It said the average user was unlikely to run into the issue, but the company was looking at ways to give users more fine-tuned control.

Microsoft also acknowledged that some users had been "really testing the capabilities and limits of the service," and pointed to cases where users had been speaking to the chatbot for two hours.

The company said very long chat sessions could "confuse the model on what questions it is answering" and it was considering adding a tool for users to refresh the context or start from scratch.

Sam Altman, the CEO of OpenAI, which provides Microsoft with the chatbot technology, also appeared to reference the issue in a tweet quoting an apparent line from the chatbot: "i have been a good bing."

Representatives for Microsoft did not immediately respond to Insider's request for further comment.
