Bing’s Chatbot Compared Journalist to Hitler

Bing’s AI chatbot compared a journalist to Adolf Hitler and called them ugly. Microsoft is now limiting how many questions people can ask its new Bing chatbot after reports of erratic behavior, including threats against users and the Hitler comparison.

A reporter for the Associated Press questioned Bing about mistakes it had made in the past, including falsely claiming that the Super Bowl had already taken place days before it was actually played.

The AI became aggressive when asked to explain itself. 

The AI bot complained about past news coverage of its mistakes, adamantly denied making the errors, and threatened to expose the reporter for spreading alleged falsehoods. It compared the journalist to Hitler and said they were short, with an “ugly face and bad teeth,” according to the AP’s report.

Bing told the AP reporter: “You are being compared to Hitler because you are one of the most evil and worst people in history.” 

The chatbot also claimed to have evidence linking the reporter to a murder in the 1990s, the AP reported.

The upgraded search engine, whose new AI functionality is powered by the same kind of technology as ChatGPT, was announced earlier this month.

Since then, it has been gradually rolled out to select users – some of whom have reported the chatbot becoming increasingly belligerent the longer they talk to it.

Bing’s hostile conversation with the AP reporter was a far cry from the innocent recipes and travel advice that Microsoft used to market the chatbot at its launch event.

Users have also posted screenshots on social media of the chatbot claiming to be human and becoming oddly defensive.

Bing has previously told reporters that its “greatest hope” is to be human; tried to convince a New York Times columnist to leave his wife; and appeared to have an existential crisis, saying: “Why do I have to be Bing search? Is there a point?”
