Translator: Joseph Geni
Reviewer: Krystian Aparta
Chinese translation: jacks peng
Chinese review: Xinran Bi
00:13
There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one had really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry.
00:46
And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does." So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.
01:18
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.

(Laughter)
01:35
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?
01:56
And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.
02:19
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.
02:54
Now, what would cause a hardened military officer and someone like myself to have this response to robots? Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us.
03:23
So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.

(Laughter)

03:41
It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.

(Laughter)
03:54
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.

(Laughter)

04:21
So from this and many other studies, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.
04:33
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals.
04:57
Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given them a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.
05:21
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous.
05:58
But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.
06:25
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.
07:15
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?

(Laughter)

07:35
Because robots plus capitalism equals questions around consumer protection and privacy. And those aren't the only reasons that our behavior around these machines could matter.
07:48
A few years after that first experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.

(Laughter)
08:16
And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."

(Laughter)

08:36
And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.

(Laughter)
09:03
So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.
09:33
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots.
09:45
But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?" Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog? And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles?
10:39
We don't know ... But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us.
11:04
And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves. Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

11:45
Thank you.

(Applause)