Microsoft admits lengthy conversations with Bing’s ChatGPT mode can send it haywire

Microsoft’s new ChatGPT-powered Bing has gone haywire on several occasions in the week since it launched – and the tech giant has now explained why.

In a blog post titled “Learning from our first week”, Microsoft admits that “in long, extended chat sessions of 15 or more questions” its new Bing search engine can “become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone”.

About the author

Tech Tropical

Tech Tropical was founded with the aim of providing high-quality information across a range of disciplines, from innovation in nutrition to advances in technology. Our website covers topics related to technology and its applications.
