Don’t freak out about empathic chatbots. Learn from them.
In a 2024 working paper from researchers at Harvard Business School, 400 participants read descriptions of other people’s struggles and wrote responses. In a 2024 study published in the journal PNAS, more than 500 people either wrote about a personal struggle, such as returning to work after time away, or responded to other people’s struggles. Raters, who scored the responses without knowing their source, judged Bing’s replies as more empathic than those written by humans, largely because Bing spent more time acknowledging and validating people’s feelings. More than humans did, Bing paraphrased people’s struggles, acknowledged and justified how they might feel, and asked follow-up questions: exactly the responses that studies show signal authentic, curious empathy between humans.