我们为什么会对机器人有感情 – Kate Darling




About the talk

机器人伦理学家凯特 · 达林指出,我们还远无法开发出有感情的机器人,但我们已经对机器人产生了感觉,而这种本能会导致一些后果。通过这个演讲来一起了解我们在生物学上是如何将意图和生命投射到机器上的——以及机器人如何帮助我们更好地了解自己。

00:01
There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one has really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry. And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does." So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.
大概10年前的一天, 我让一个朋友头朝下地握持 一个小恐龙机器人。 这个机器人是我订购的 一款叫做Pleo的玩具, 我对此非常兴奋,因为我 一直都很喜欢机器人。 这个机器人有很酷的技术特征。 它有马达和触觉传感器, 还有一个红外摄像头。 它还有一个部件是倾斜传感器, 所以它就会知道 自己面对的是什么方向。 当你把它倒过来, 它会开始哭泣。 我觉得这点非常酷, 所以我展示给我朋友看, 我说:“抓住尾巴竖起来, 看看它会怎样。” 于是我们看着这个机器人表演, 挣扎和哭泣。 几秒钟后, 我开始感到有点不安, 于是我说,“好了,差不多了, 我们把它放回去吧。” 然后我抚摸着机器人,让它停止哭泣。
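The Pleo behavior described above — a tilt sensor that triggers crying when the toy is held upside down, and touch sensors whose petting input calms it — can be sketched as a simple sensor-to-behavior mapping. This is a hypothetical illustration for clarity, not the toy's actual firmware; the function name and inputs are invented:

```python
def pleo_reaction(upside_down: bool, being_petted: bool) -> str:
    """Hypothetical sketch of the Pleo toy's behavior from the talk:
    the tilt sensor reports orientation, the touch sensors report petting."""
    if being_petted:
        return "calm"   # petting makes it stop crying
    if upside_down:
        return "cry"    # held up by the tail: it struggles and cries out
    return "idle"       # right side up and left alone

print(pleo_reaction(upside_down=True, being_petted=False))  # cry
print(pleo_reaction(upside_down=True, being_petted=True))   # calm
```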

01:06
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.
这对我来说是一种挺奇怪的经历。首先,我那时还不是个很有母性的人。不过从那以后我成为了一位母亲,就在9个月前,而我也了解到,当你把婴儿倒过来抱时,他们同样会扭动挣扎。

01:20
(Laughter)

01:23
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot? And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.
但我对这个机器人的反应也非常有趣, 因为我确切地知道 这个机器工作的原理, 然而我仍然感到有必要对它仁慈些。 这个观察引起了好奇心, 让我花费了长达10年的时间去追寻。 为什么我会去安慰这个机器人? 我发现我对待这个机器人的方式 不仅是我起居室里一个尴尬时刻, 在这个世界里,我们正越来越多地 将机器人融入到我们生活中, 像这样的本能可能会产生一些后果, 因为我发现的第一件事情是, 这并非只是发生在我身上的个例。

02:07
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield. Now, what would cause a hardened military officer and someone like myself to have this response to robots?
2007年,华盛顿邮报报道称,美国军方 正在测试拆除地雷的机器人。 它的形状就像一只竹节虫, 用腿在雷区上行走, 每次踩到地雷时, 它的一条腿就会被炸掉, 然后继续用其他腿去引爆更多的地雷。 负责这次测试的上校 后来取消了这个测试, 因为他说,看着这个机器人 拖着残破的身躯在雷区 挣扎行走,实在太不人道了。 那么,是什么导致了一个强硬的军官 和像我这样的人 对机器人有这种反应呢?

02:51
Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.
不可否认,科幻小说和流行文化早已让我们倾向于将这些东西拟人化,但原因比这更深一层。事实表明,我们天生就会把意图和生命投射到物理空间中任何在我们看来能自主运动的物体上。所以人们会像对待活物一样对待各种各样的机器人。这些拆弹机器人有自己的名字,它们能获得荣誉勋章,人们还为它们举行过鸣枪致敬的葬礼。研究表明,即便是对非常简单的家用机器人,比如Roomba扫地机器人,我们也会这样。

03:28
(Laughter)

03:29
It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.
它只是一个在你家地板上四处移动进行清扫的圆盘,但仅仅因为它能自己移动,人们就会给Roomba取名字,当它卡在沙发下面时,还会替它感到难过。

03:40
(Laughter)

03:42
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.
我们可以专门设计机器人来唤起这种反应,利用眼睛、面孔或动作等人们会自动地、下意识地与心智状态联系起来的特征。有一整个叫做“人机交互”的研究领域,充分展示了这种方法的效果。比如,斯坦福大学的研究者发现,当你让人们去触摸机器人的私密部位时,他们会感到很不舒服。

04:07
(Laughter)

04:09
So from this, but from many other studies, we know, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.
从这个以及更多其他研究中, 我们知道人们会对这些栩栩如生的机器 给他们的线索做出反应, 即使他们知道它们只是机器。

04:21
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals. Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.
我们正迈向一个机器人无处不在的世界。机器人技术正在走出工厂的围墙,进入工作场所和家庭。随着这些能够感知环境、自主决策和学习的机器进入这些共享空间,我认为最好的类比也许是我们与动物的关系。几千年前,我们开始驯化动物,训练它们为我们工作、作战和陪伴我们。纵观历史,我们把有些动物当作工具或产品,而对另一些动物则十分友善,在社会中给予它们同伴的位置。我认为,我们很可能会以类似的方式开始接纳机器人。

05:09
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous. But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.
当然,动物是有生命的,机器人没有。从与机器人专家共事的经历中,我可以告诉各位,我们距离开发出能产生感情的机器人还很遥远。但我们会对它们产生感情,而这一点很重要,因为如果我们想把机器人整合进这些共享空间,就需要明白人们会把它们与其他设备区别对待。在某些情况下,比如士兵对与自己并肩工作的机器人产生情感依恋的例子,这种感情轻则导致低效,重则带来危险。但在其他情况下,培养与机器人的情感联系却可能非常有用。我们已经看到了一些很好的应用场景,比如机器人以我们前所未见的方式与自闭症儿童互动,或者机器人与老师配合,在引导孩子学习方面取得了新的成果。而且这不仅限于儿童。早期研究表明,机器人可以在医疗保健场景中帮助医生和病人。

06:13
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.
这是帕罗(PARO)婴儿海豹机器人,被用于疗养院和失智症患者的护理,已经面世有一阵子了。我记得多年前在一次聚会上跟人讲起这个机器人,她的反应是:“哦,天哪,太可怕了,我无法相信我们给人的是机器人,而不是人类的照护。”这是一个非常普遍的反应,而且我认为这种想法完全正确,因为那样的话确实很可怕。但在这个场景中,机器人替代的并不是人类照护,而是动物疗法:用在无法使用真正的动物、但可以使用机器人的场合,因为人们会一贯地把它们当作动物而非设备来对待。

07:03
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?
承认这种与机器人的情感联系,也能帮助我们在这些设备进入人们生活中更亲密的领域时预见到挑战。比如,你孩子的泰迪熊机器人录下私人对话,这样可以吗?你的性爱机器人设有诱人的内置付费项目,这样可以吗?

07:21
(Laughter)

07:23
Because robots plus capitalism equals questions around consumer protection and privacy.
因为机器人加上资本主义,就等于一系列关于消费者保护和隐私的问题。

07:30
And those aren't the only reasons that our behavior around these machines could matter. A few years after that first initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.
这些还不是我们对待 这些机器人的行为 之所以重要的唯一原因。 在我第一次见到这只小恐龙机器人的 几年后, 我和朋友汉内斯 · 加瑟特 开展了一次研讨会。 我们拿了5个小恐龙机器人, 把它们分给5队人。 我们让他们为它们取名, 陪伴它们一起互动大约一个小时。 然后我们拿出了斧头和锤子 让他们去折磨和杀死机器人。

08:01
(Laughter)

08:04
And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."
这个结果比我们想的 要更有戏剧性, 因为甚至没有一个参与者去攻击 这些小恐龙机器人。 所以我们得临时凑合一下, 在某个时候,我们说, “好吧,你可以保住你们队的机器人, 但前提是把其它队的机器人毁掉。”

08:22
(Laughter)

08:24
And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.
即便这样也没用,他们不愿意去做。 所以最后,我们说, “我们将要毁掉所有的机器人, 除非有人拿短柄斧砍掉它们中的一个。” 有个人站了起来,他拿起斧头, 当他把斧头砍到机器人的脖子上时, 整个房间的人都缩了回去, 房间中出现了一个为这个 倒下的机器人半玩笑半严肃的 沉默时刻。

08:49
(Laughter)

08:51
So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGS.
那真是一次有趣的经历。显然,它不是一项对照研究,但它促成了我后来在麻省理工学院与帕拉什·南迪和辛西娅·布雷西亚尔一起做的研究:我们让来到实验室的人们去砸碎HEXBUG机器虫,这种机器虫会像昆虫一样栩栩如生地四处爬动。我们没有选择那种惹人喜爱的可爱之物,而是选了更简单基础的东西。我们发现,同理心强的人在击打这些机器虫时会更加犹豫。

09:21
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots. But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?" Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog?
这只是一项小研究,但它属于一个更大的研究领域,这些研究开始表明,人们的同理心倾向与他们对待机器人的行为之间可能存在联系。而我对即将到来的人机交互时代的问题并不是:“我们会对机器人产生同理心吗?”而是:“机器人会改变人类的同理心吗?”比如说,我们是否有理由阻止孩子去踢一只机器狗,不仅是出于对财产的尊重,更是因为这个孩子可能因此更容易去踢一只真正的狗?

09:58
And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles? We don't know ... But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.
同样,这不仅限于儿童。这就是关于暴力电子游戏的老问题,但它上升到了一个全新的层面,因为相比屏幕上的图像,我们对这种真实可触的实体会产生更强烈的本能反应。当我们对机器人,尤其是专门设计来模拟生命的机器人施以暴力时,这是暴力行为的健康宣泄,还是在锻炼我们的“残忍肌肉”?我们还不知道……但这个问题的答案有可能影响人类行为,有可能影响社会规范,也有可能催生关于我们能对特定机器人做什么、不能做什么的规则,就类似于我们的反虐待动物法。因为即便机器人没有感觉,我们对待它们的方式也可能对我们自己意义重大。而且无论我们最终是否会改变规则,机器人或许都能帮助我们对自己形成新的认识。

11:02
Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.
我在过去10年中学到的经验大部分 跟技术无关, 而是关于人类心理学, 同情心,以及我们如何与他人相处。 因为当一个儿童友好地对待Roomba时, 当一个士兵试图拯救战场上的机器人时, 或者当一组人拒绝伤害小恐龙机器人时, 这些机器人就不只是马达,齿轮和算法。 它们映射出了我们的人性。

原创视频版权为主办方及译直播所有,请勿擅自使用