
Don't be discouraged by empathic chatbots. Learn from them.

Stranger 1: I'm sorry to hear that you're struggling to find the motivation to go back to work. I can understand how anxiety and insecurity can make it difficult to take that step. It takes a lot of courage to share your situation and ask for help. I hope you know you have valuable skills and experience that could benefit any employer. You deserve to feel secure and fulfilled in your career.

Stranger 2: I struggled with the same problem. The best approach is to just jump right in and give it your best.

Which stranger seems more compassionate, more caring, wiser? Which would you choose as a confidant?

Researchers have asked people questions like these to evaluate strangers' responses to emotional situations. The catch: some of the responses come from people, others from chatbots, but the raters don't know which is which. The result? People often say that the more empathic stranger – in this case, Stranger 1 – is a bot.

A 2023 study in JAMA Internal Medicine found that patients with a medical question preferred a chatbot's response to a doctor's almost 80% of the time. Another study, published this year in the journal Communications Psychology, found that people consistently rated chatbots as more compassionate than trained crisis-line responders.

Large language models (LLMs) do a better job than people at making people feel seen and heard. This phenomenon – call it artificial compassion – is both astonishing and controversial. Some experts argue that because computers are incapable of emotion, they cannot care about people, a fundamental requirement of true empathy. Others are alarmed by how readily people trade human connection for digital connection, as more of us turn to chatbots for therapy, friendship and even romance.

But beyond these concerns and complaints, chatbot confidants could offer something more practical. If they beat us at compassion, shouldn't we try to learn what they're doing right? Could computers actually help strengthen human relationships?

Researchers initially wondered whether the advantage came from the fact that a bot has unlimited time to pay endless attention – commodities in short supply for doctors and crisis responders. But that doesn't seem to explain it. In a 2024 paper by researchers at Harvard Business School, 400 participants were asked to read descriptions of other people's struggles and write responses. Some were told they would receive a bonus payment if their responses were especially caring and helpful. That incentive led people to spend more time on their compassionate messages, but their efforts still fell short of the empathy expressed by ChatGPT.

The secret of chatbots' success may be the all-too-human mistakes they avoid. In a 2024 study published in the journal PNAS, over 500 people either wrote about a personal struggle, such as returning to work after time away, or sent responses to other people's struggles. The researchers also prompted Microsoft's Bing Chat to respond to everyone's struggles.

Raters, who scored these responses without knowing their source, judged Bing's answers more empathetic than those written by people, largely because Bing spent more time acknowledging and validating people's feelings. People, by contrast, typically responded by sharing a seemingly related experience from their own lives. In essence, the chatbots made the exchange about the other person; people made it more about themselves.

Chatbots are effective in these situations not because of what they can do that we cannot, but because of the mistakes people make that they avoid. When we see someone suffering, or when someone we care about shares a problem, we instinctively want to help. We offer advice, suggest solutions and recount how we dealt with something similar.

These impulses can be noble, even loving, but they are not as helpful as we might hope. Rushing to share opinions and map out next steps can trivialize someone's pain, and shifting the focus to yourself can undercut their hope of being heard.

Chatbots avoid these traps. With no personal experiences to share, no urge to solve problems and no ego to protect, they stay completely focused on the speaker. Their inherent limitations make them better listeners. More than people, Bing paraphrased people's struggles, acknowledged and validated how they might feel, and asked follow-up questions – exactly the responses that studies show signal genuine, curious empathy.

When people adopt similar strategies, their connections deepen. Consider "looping for understanding," a technique in which a listener repeats what someone else said in their own words, then asks whether the summary is accurate – "Did I get that right?" Chatbots are natural loopers. When people are taught to do the same, they do a better job of understanding what the other person feels and of helping them feel heard.

These skills aren't just for strengthening ties with family and friends. Dozens of studies show that managers and employers who are seen as good listeners tend to have more loyal, effective and productive employees.

To be sure, AI's advantage in empathic conversations has limits. Talk with ChatGPT long enough and you'll find a friendly but formulaic partner. Its paraphrase-affirm-follow-up recipe can feel warm and attentive the first time, but it grows stale the second time and grating the third. The responses can also be cloying and prone to hallucinations.

Research in this area typically has people interact with chatbots only once. It's possible that their edge over people would vanish in longer conversations, as their niceness grows repetitive and cloying. Given a small taste, consumers prefer Pepsi, the sweeter drink, over Coca-Cola; given a whole can, they prefer Coke.

Despite chatbots' impressive listening skills, studies show that most human beings still prefer to engage with other people. When scientists reveal the source of supportive messages, participants often insist that the chatbot made them feel less heard, especially if they are wary of AI in general. When people are struggling with a problem, they would rather wait to talk to another person than get immediate access to a chatbot. Anyone who has repeated "agent" into a customer-service phone tree knows the feeling of desperately wanting a carbon-based life form on the other end.

Chatbots may be efficient, even virtuosic, listeners, but they can't feel for us or truly care about us. The market for AI therapists may be growing, but many people still resist seeking emotional support from a machine. Some of the bugs of human connection are, in fact, features. Chatbots can't roll their eyes, leave our texts unanswered or complain that our problems are boring. But the fact that human empathy must often be earned, and that it comes from limited beings who sacrifice to be there for us, is part of its beauty.

Jamil Zaki is a professor of psychology at Stanford University. His latest book is "Hope for Cynics: The Surprising Science of Human Goodness," published by Grand Central Publishing.