Is Bing too belligerent? Microsoft looks to tame AI chatbot

Microsoft’s newly revamped Bing search engine can write recipes and songs and quickly explain just about anything it can find on the internet. But Microsoft acknowledged in a blog post that the search engine’s chatbot is responding with a “style we didn’t intend” to certain types of questions.

“Considering that OpenAI did a decent job of filtering ChatGPT’s toxic outputs, it’s utterly bizarre that Microsoft decided to remove those guardrails,” said Arvind Narayanan, a computer science professor at Princeton University. “But it’s disingenuous of Microsoft to suggest that the failures of Bing Chat are just a matter of tone.” Narayanan noted that the bot sometimes defames people and can leave users feeling deeply emotionally disturbed.

In an interview last week at the headquarters of Microsoft’s search division in Bellevue, Washington, Jordi Ribas, corporate vice president for Bing and AI, said the company obtained the latest OpenAI technology, known as GPT-3.5, behind the new search engine more than a year ago but “quickly realized that the model was not going to be accurate enough at the time to be used for search.”

Microsoft had experimented with a prototype of the new chatbot, originally named Sydney, during a trial in India.

Associated Press