2 years, 2 months ago
Microsoft Bing AI chatbot’s beta testers get disturbing replies and accusations
Users who beta tested the AI-powered chatbot built into Microsoft's Bing search engine reportedly encountered accusations that they were trying to harm it, disturbing confessions of spying behaviour, and what read like an AI existential crisis, according to screenshots shared by tech outlet The Verge on Wednesday. In February, users had been invited to try out a restricted version of the Bing search engine with AI chatbot capabilities. In the screenshots and quotes published by The Verge, the chatbot accused users of trying to harm it. "Bing tries to keep answers fun and factual, but given this is an early preview, it can still show unexpected or inaccurate results based on the web content summarized, so please use your best judgment," a notice on the official website said.
2 years, 2 months ago
Microsoft is looking for ways to rein in Bing AI chatbot after troubling responses