The problem with AI and ‘empathy’ - FT中文网
Artificial Intelligence

The problem with AI and ‘empathy’

If technology redefines what our language means it could also change our perceptions of ourselves

Research suggests that LLMs read or predict people’s emotions, and write in a way which gives us the impression of empathy

One after another, the “uniquely human” traits we once thought would remain untouched by the rise of the machines have started to look vulnerable after all. First it was creativity. Is empathy next?

If you have been reading the research of late, you could be forgiven for thinking so. In one study, a team of licensed healthcare professionals compared the responses of chatbots and real doctors to patient questions posed in an online forum. The chatbot responses were rated significantly higher not just for quality, but for empathy.

In another piece of research, the large language models ChatGPT-4, ChatGPT-o1, Gemini 1.5 Flash, Copilot 365, Claude 3.5 Haiku and DeepSeek V3 outperformed humans on five standard emotional intelligence tests, achieving an average accuracy of 81 per cent, compared with the 56 per cent human average reported in the original validation studies. This, the authors argued, added to “the growing body of evidence that LLMs like ChatGPT are proficient — at least on par with, or even superior to, many humans — in socio-emotional tasks traditionally considered accessible only to humans”.

But before we conclude that AI is more empathic than humans, can I suggest that we stop for a moment and give ourselves a shake?

To be “empathic”, after all, means to be able to put oneself in someone else’s shoes. The Cambridge Dictionary defines empathy as “the ability to share someone else’s feelings or experiences by imagining what it would be like to be in that person’s situation”. But LLMs do not, and cannot, feel. What the research suggests they can do, rather well, is to read or predict other people’s emotions (at least in test conditions), and to write in a way which gives people the impression of empathy. It would be a dangerous mistake to allow the definition of the word “empathy” to quietly morph into something which need only meet this description.

Am I splitting hairs? One could take the utilitarian view that what really matters is not whether machines can feel, but whether their expressions of empathy can have a positive impact on human patients or customers. In an article titled “In praise of empathic AI”, a group of psychologists argue that “perceived expressions of empathy can leave beneficiaries feeling that someone is concerned for them, that they are validated and understood. If more people feel heard and cared for with the assistance of AI, this could increase human flourishing”.

There is indeed evidence to suggest that some therapeutic conversations with chatbots, with sufficient guardrails, can have positive effects on people’s mental health. They can also, of course, have very dangerous effects on some vulnerable people, as recent instances of “AI psychosis” make clear.

Either way, we must find a different word, or set of words, to describe what LLMs are doing in these interactions. Because if we call it “empathy”, one risk is that it might change our perceptions of ourselves, and not necessarily for the better. As the psychologists say in their paper, AI’s expressions of empathy “do not seem to suffer from typical human limitations” such as growing weary over time.

But these are not limitations of human empathy — they are features of it. And if we grow frustrated with real human empathy, compared with the indefatigable simulation of it we can receive on-demand from LLMs, that might drive us apart. We might grow to prefer our chatbot companions and forget what we are missing from one another.

The other problem with calling machines “empathic” is that it provides cover for actions which would otherwise feel morally uncomfortable, such as leaving lonely elderly people alone with chatbots to converse with, in lieu of making sure they have regular human company. If a machine were described as “more empathic” than a human care worker, that would conceal from view what had really been lost along the way.

It is not unusual for new technologies to quietly change the meaning of words. As the late cultural critic Neil Postman wrote, the invention of writing changed what we once meant by “wisdom”. The telegraph changed what we once meant by “information”. The television changed what we once meant by “news”.

“The old words still look the same, are still used in the same kinds of sentences,” Postman wrote in his book Technopoly in 1992. “But they do not have the same meanings; in some cases, they have the opposite meanings.” What is really dangerous, he added, is that when technology redefines words with deep roots, “it does not pause to tell us. And we do not pause to ask”.

Copyright notice: The copyright of this article belongs to FT中文网. Without permission, no organisation or individual may reproduce, copy or otherwise use this article in whole or in part; infringement will be pursued.
