Why we have an emotional connection to robots | Kate Darling

138,301 views ・ 2018-11-06

TED



00:00
Translator: Joseph Geni Reviewer: Krystian Aparta
00:13
There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one had really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry. And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does."

00:55
So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.

01:18
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.

(Laughter)

01:35
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?

01:56
And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.

02:19
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.

02:54
Now, what would cause a hardened military officer and someone like myself to have this response to robots?

03:03
Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.

(Laughter)

It's just a disc that roams around your floor to clean it, but just the fact that it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.

(Laughter)

03:54
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.

(Laughter)

So from this, but also from many other studies, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.

04:33
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals. Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.

05:21
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous.

05:58
But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.

06:25
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.

07:15
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?

(Laughter)

Because robots plus capitalism equals questions around consumer protection and privacy.

07:42
And those aren't the only reasons that our behavior around these machines could matter.

07:48
A few years after that initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.

(Laughter)

And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."

(Laughter)

And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.

(Laughter)

09:03
So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.

09:33
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots.

09:45
But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?"

09:57
Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog? And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior, or is that training our cruelty muscles?

10:39
We don't know ... But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us.

11:04
And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.

11:14
Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

11:45
Thank you.

(Applause)