The problem with AI and ‘empathy’ - FT中文网
If technology redefines what our language means, it could also change our perceptions of ourselves
Research suggests that LLMs read or predict people’s emotions, and write in a way which gives us the impression of empathy

One after another, the “uniquely human” traits we once thought would remain untouched by the rise of the machines have started to look vulnerable after all. First it was creativity. Is empathy next?

If you have been reading the research of late, you could be forgiven for thinking so. In one study, a team of licensed healthcare professionals compared the responses of chatbots and real doctors to patient questions posed in an online forum. The chatbot responses were rated significantly higher not just for quality, but for empathy.

In another piece of research, the large language models ChatGPT-4, ChatGPT-o1, Gemini 1.5 Flash, Copilot 365, Claude 3.5 Haiku and DeepSeek V3 outperformed humans on five standard emotional intelligence tests, achieving an average accuracy of 81 per cent, compared with the 56 per cent human average reported in the original validation studies. This, the authors argued, added to “the growing body of evidence that LLMs like ChatGPT are proficient — at least on par with, or even superior to, many humans — in socio-emotional tasks traditionally considered accessible only to humans”.

But before we conclude that AI is more empathic than humans, can I suggest that we stop for a moment and give ourselves a shake?

To be “empathic”, after all, means to be able to put oneself in someone else’s shoes. The Cambridge Dictionary defines empathy as “the ability to share someone else’s feelings or experiences by imagining what it would be like to be in that person’s situation”. But LLMs do not, and cannot, feel. What the research suggests they can do, rather well, is to read or predict other people’s emotions (at least in test conditions), and to write in a way which gives people the impression of empathy. It would be a dangerous mistake to allow the definition of the word “empathy” to quietly morph into something which need only meet this description.

Am I splitting hairs? One could take the utilitarian view that what really matters is not whether machines can feel, but whether their expressions of empathy can have a positive impact on human patients or customers. In an article titled “In praise of empathic AI”, a group of psychologists argue that “perceived expressions of empathy can leave beneficiaries feeling that someone is concerned for them, that they are validated and understood. If more people feel heard and cared for with the assistance of AI, this could increase human flourishing”.

There is indeed evidence to suggest that some therapeutic conversations with chatbots, with sufficient guardrails, can have positive effects on people’s mental health. They can also, of course, have very dangerous effects on some vulnerable people, as recent instances of “AI psychosis” make clear.

Either way, we must find a different word, or set of words, to describe what LLMs are doing in these interactions. Because if we call it “empathy”, one risk is that it might change our perceptions of ourselves, and not necessarily for the better. As the psychologists say in their paper, AI’s expressions of empathy “do not seem to suffer from typical human limitations” such as growing weary over time.

But these are not limitations of human empathy — they are features of it. And if we grow frustrated with real human empathy, compared with the indefatigable simulation of it we can receive on demand from LLMs, that might drive us apart. We might grow to prefer our chatbot companions and forget what we are missing from one another.

The other problem with calling machines “empathic” is that it provides cover for actions which would otherwise feel morally uncomfortable, such as leaving lonely elderly people alone with chatbots to converse with, in lieu of making sure they have regular human company. If a machine was described as “more empathic” than a human care worker, that would conceal from view what had really been lost along the way.

It is not unusual for new technologies to quietly change the meaning of words. As the late cultural critic Neil Postman wrote, the invention of writing changed what we once meant by “wisdom”. The telegraph changed what we once meant by “information”. The television changed what we once meant by “news”.

“The old words still look the same, are still used in the same kinds of sentences,” Postman wrote in his book Technopoly in 1992. “But they do not have the same meanings; in some cases, they have the opposite meanings.” What is really dangerous, he added, is that when technology redefines words with deep roots, “it does not pause to tell us. And we do not pause to ask”.

