Psychology department colloquium: “What we value in human empathy”
Artificial intelligence chatbots have become a convenient option over the last three years for many individuals in search of an emotional outlet. In an age where immediacy counts, what happens to the value of human relationships when a faster, more effortless alternative for emotional support exists? On Thursday, Jan. 22, the psychology department hosted speaker Anat Perry, Ph.D., at the Rapaporte Treasure Hall. In the lecture, Perry compared perceptions of human and AI-generated empathetic responses, drawing on her research as an advanced fellow at the Harvard Radcliffe Institute.
Perry’s interest in studying human interactions with AI began about three years ago, when she discovered “KoKo,” a free and anonymous online platform designed to give users peer-to-peer support in the midst of widespread loneliness and isolation. KoKo ran an experiment in which it allowed users to compose messages with ChatGPT. The resulting controversy led to the removal of the AI feature, but CEO Robert Morris defended the experiment, saying that responses composed with GPT-3 were rated significantly higher than those written entirely by humans.
Perry was fascinated by what appeared to be an “artificial-empathy paradox”: AI offers responses that are fast and eloquent and that incorporate effective empathic language. “And yet, once you realize that they’re from a machine, we don’t feel that empathy anymore,” Perry observed. “It makes me realize that we don’t truly understand what we actually value in human empathy,” she said.
The broadest definition of empathy includes three general aspects: the cognitive, understanding what the other person is feeling; the affective, sharing some amount of the other person’s emotions; and the motivational, caring, compassion and the desire to help others.
Research shows that AI is able to perform the cognitive aspect of empathy, giving responses that are accurate and naturalistic. Perry explained that while AI will never understand what it is like to be human in a true sense, it can still appraise emotions, discuss them and offer ideas about what one might be feeling. This function can prove useful for those in need of emotional regulation or evaluation. However, AI cannot help with the motivational component of human empathy; it lacks something fundamental when an individual needs someone to share the emotions of a particular experience.
“Empathy is hard work,” Perry emphasized. “When someone shows you that they’re investing their time, their mental effort, their emotional effort to really be there for you and listen to you, this is something we value.” She shared that human limitations are part of what gives value to personal empathy: “The fact that we’re biological beings with finite energy resources for care and support is one of the reasons that our empathy is valued more than that of AI.”
In addition to the immediate feeling of reward from receiving support from another person, Perry shared that empathy is also an important interpersonal signal. For instance, when a colleague, peer or neighbor surprises you by being empathetic and caring, it can predict a closer relationship and the nature of future interactions. “This signals to you that this person is someone that I can now trust, or maybe is closer to me than I thought before,” Perry explained.
One study conducted in her lab examined how people perceive empathy they believe to be generated by AI versus by another human. One thousand participants between the ages of 18 and 35 were asked to share a recent emotional story. Half of the participants were told that they would receive a response from ChatGPT, and the other half were told that they’d receive a response from another human. In actuality, however, all of the responses were AI-generated. Perry found that participants in this study valued responses less when they were assumed to be from AI.
In another study with the same participant setup, ChatGPT was prompted to provide a third of the responses using mostly cognitive empathy terms, avoiding phrases like “I feel your pain” and instead using phrases like “it seems like you’re very angry or sad.” In another third of the responses, ChatGPT showed affective empathy, sharing in the experience. The last third of the responses were motivational, consisting of messages such as “you’re not alone” or “I’m here for you.”
The results showed that with cognitive empathic responses, people valued perceived human and AI responses similarly. With the affective responses, perceived human responses were more valued. And, upon receiving motivational responses, participants valued perceived human responses significantly more.
Perry noted that while she thinks AI will replace many aspects of daily life, human empathy and connection will remain invaluable. Her findings have prompted further questions about why human empathy has the value that it does, whether it be the physical experience of feeling another person’s pain, the limited biological capacity humans have for extending care or the reciprocal nature of human relationships. Perry stressed the importance of emotional sharing and care: “What we need is clear. True, genuine, authentic empathy.”
