Microsoft has been forced to make radical changes to the testing of ChatGPT 3.5 in its Bing search engine, after a string of testers published strange and sometimes disconcerting results from their conversations with the chatbot. Conversations with this version of the AI will be limited to five chat turns per session.
Media outlets such as Fortune and The New York Times, as well as several testers on social media, reported strange conversations with Bing. The AI chatbot insulted, and even seemed to threaten, users. In several conversations published online, Bing became angry, appeared depressed or started arguments.
A major difference between this version of ChatGPT and the version that made headlines recently is that the Bing edition is not just fed information up to a certain date, but is able to scrape the internet as it processes conversations. Although Microsoft has not disclosed any details, it seems the often-criticized limitations of ChatGPT have been removed as well.
The AI chatbot ran live and unchecked for a week. In an update for test users, Microsoft commented on this ‘notable change’: “As we mentioned recently, very long chat sessions can confuse the underlying chat model in the new Bing. To address these issues, we have implemented some changes to help focus the chat sessions. Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing.”