Bing with ChatGPT goes off the deep end — and the latest examples are very disturbing

The ChatGPT takeover of the internet may finally be hitting some obstacles. While cursory interactions with the chatbot or its Bing search engine sibling (cousin?) yield harmless and promising results, deeper interactions have sometimes proven alarming.

This isn't just about the new GPT-powered Bing getting information wrong, although we've seen it go wrong firsthand. Rather, there have been a few instances where the AI-powered chatbot has completely gone off the rails. Recently, a columnist for The New York Times had a conversation with Bing that left the columnist deeply unsettled. Bing also told a Digital Trends writer "I want to be human" during their hands-on time with the AI search bot.
