Competing with robots is making work worse
It’s said that as work becomes increasingly automated thanks to artificial intelligence, our special human traits will only become more valuable: empathy and humor, creativity and kindness. If competing with robots is making work worse, part of the solution may be designing technology systems that are more flexible, more “tell me how I can help” and less “press 1.” Bing Chat’s creators at Microsoft have released an update that lets users choose the attitude they want the bot to show: creative, balanced or precise.

“People have called it ‘mansplaining as a service,’” notes Chamorro-Premuzic. “But actually, it’s more like a woman with imposter syndrome. It’s way too humble to be mansplaining; it’s always apologizing or saying ‘I might be biased.’”

So perhaps the bigger lesson is that although robots offer huge cost efficiencies over humans, they also carry a downside that is harder to quantify but no less real. Studies of even “empathetic” bots have shown that they don’t have the positive impact on customers that real human beings do, especially when the customer is already upset.