Bing’s AI also stumbled over several serious errors.
Like Google’s Bard, Microsoft’s Bing chatbot has reportedly made several serious mistakes. Microsoft says it is working on a fix.
The AI-powered version of Bing launched last week and demonstrated its capabilities on a number of tasks, such as analyzing financial products and comparing phones and computers from different brands.
However, while demonstrating these features, the AI made some major mistakes. For example, Bing’s AI allegedly got some algebraic calculations wrong and reported factually incorrect information on several occasions. Just a week earlier, Google had lost over 100 billion dollars in market value for a very similar reason: the promotional material for Bard, its chatbot, contained a serious error.
Microsoft’s chatbot errors were also noticed and reported by researcher Dmitri Brereton. In his case, the artificial intelligence stumbled over a decidedly bizarre topic: vacuum cleaners for pet hair.
Microsoft said it was aware of the chatbot’s frequent errors and confirmed it is working to improve the product. Caitlin Roulston, director of communications at Microsoft, stressed that the platform is still in a preview phase and that user feedback will be essential to perfect the technology.
The chatbot has already been modified to remove racist content from its searches. Furthermore, the current version of the artificial intelligence still refers to itself as Sydney, the project name of a previous Microsoft chatbot that was later canceled to make room for the ChatGPT-based solution. According to Roulston, this problem too will soon be fixed, and the AI will use its new name: Prometheus.