What's it like to be a robot? | Leila Takayama

TED · 2018-02-16

00:12
You only get one chance to make a first impression, and that's true if you're a robot as well as if you're a person. The first time that I met one of these robots was at a place called Willow Garage in 2008. When I went to visit there, my host walked me into the building and we met this little guy. He was rolling into the hallway, came up to me, sat there, stared blankly past me, did nothing for a while, rapidly spun his head around 180 degrees and then ran away. And that was not a great first impression.

00:42
The thing that I learned about robots that day is that they kind of do their own thing, and they're not totally aware of us. And I think as we're experimenting with these possible robot futures, we actually end up learning a lot more about ourselves, as opposed to just these machines.

00:56
And what I learned that day was that I had pretty high expectations for this little dude. He was not only supposed to be able to navigate the physical world, but also be able to navigate my social world -- he's in my space; it's a personal robot. Why didn't it understand me?

01:11
My host explained to me, "Well, the robot is trying to get from point A to point B, and you were an obstacle in his way, so he had to replan his path, figure out where to go, and then get there some other way," which was actually not a very efficient thing to do.

01:25
If that robot had figured out that I was a person, not a chair, and that I was willing to get out of its way if it was trying to get somewhere, then it actually would have been more efficient at getting its job done if it had bothered to notice that I was a human and that I have different affordances than things like chairs and walls do.

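A minimal sketch of the distinction she's drawing, assuming a toy planner (the Obstacle class, plan_path, and the fixed sidestep waypoint are illustrative inventions, not the PR2's actual navigation stack): classifying an obstacle as a person unlocks an option a chair never affords, namely asking it to move.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str        # "person", "chair", "wall", ...
    position: tuple  # (x, y) in meters

def plan_path(start, goal, obstacle):
    """Return a list of (x, y) waypoints from start to goal."""
    if obstacle.kind == "person":
        # People, unlike chairs, can step aside -- so ask, then go direct.
        print("Excuse me, may I get by?")
        return [start, goal]
    # Inanimate obstacles can't move; detour around them instead.
    ox, oy = obstacle.position
    return [start, (ox, oy + 1.0), goal]

print(plan_path((0, 0), (5, 0), Obstacle("chair", (2, 0))))   # sidesteps
print(plan_path((0, 0), (5, 0), Obstacle("person", (2, 0))))  # goes direct
```
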
01:41
You know, we tend to think of these robots as being from outer space and from the future and from science fiction, and while that could be true, I'd actually like to argue that robots are here today, and they live and work amongst us right now.

01:54
These are two robots that live in my home. They vacuum the floors and they cut the grass every single day, which is more than I would do if I actually had time to do these tasks, and they probably do it better than I would, too. This one actually takes care of my kitty. Every single time he uses the box, it cleans it, which is not something I'm willing to do, and it actually makes his life better as well as mine.

02:16
And while we call these robot products -- it's a "robot vacuum cleaner," it's a "robot lawnmower," it's a "robot litter box" -- I think there's actually a bunch of other robots hiding in plain sight that have just become so darn useful and so darn mundane that we call them things like "dishwasher," right? They get new names. They don't get called robots anymore because they actually serve a purpose in our lives. Similarly, a thermostat, right?

02:39
I know my robotics friends out there are probably cringing at me calling this a robot, but it has a goal. Its goal is to make my house 66 degrees Fahrenheit, and it senses the world. It knows it's a little bit cold, it makes a plan and then it acts on the physical world. It's robotics. Even if it might not look like Rosie the Robot, it's doing something that's really useful in my life, so I don't have to take care of turning the temperature up and down myself.

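What she's describing is the classic sense-plan-act loop from robotics. A minimal sketch under that framing (the sensor, the deadband, and the furnace interface are made up for illustration, not any real thermostat's firmware):

```python
import random

TARGET_F = 66.0  # the goal: hold the house at 66 degrees Fahrenheit

def sense():
    """Stand-in for reading a real temperature sensor."""
    return 66.0 + random.uniform(-3.0, 3.0)

def plan(current_f):
    """Decide what to do about the sensed state (with a small deadband)."""
    if current_f < TARGET_F - 0.5:
        return "heat on"
    if current_f > TARGET_F + 0.5:
        return "heat off"
    return "hold"

def act(action):
    """Stand-in for switching the furnace relay."""
    print(f"furnace: {action}")

# A real thermostat runs this loop forever; a few cycles show the shape.
for _ in range(5):
    act(plan(sense()))
```
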
03:04
And I think these systems live and work amongst us now, and not only are these systems living amongst us but you are probably a robot operator, too.

03:13
When you drive your car, it feels like you are operating machinery. You are also going from point A to point B, but your car probably has power steering, it probably has automatic braking systems, it might have an automatic transmission and maybe even adaptive cruise control. And while it might not be a fully autonomous car, it has bits of autonomy, and they're so useful and they make us drive safer, and we just sort of feel like they're invisible-in-use, right?

03:39
So when you're driving your car, you should just feel like you're going from one place to another. It doesn't feel like it's this big thing that you have to deal with and operate and use these controls, because we spent so long learning how to drive that they've become extensions of ourselves.

03:54
When you park that car in that tight little garage space, you know where your corners are. And when you drive a rental car that maybe you haven't driven before, it takes some time to get used to your new robot body.

04:05
And this is also true for people who operate other types of robots, so I'd like to share with you a few stories about that -- about dealing with the problem of remote collaboration.

04:14
So, at Willow Garage I had a coworker named Dallas, and Dallas looked like this. He worked from his home in Indiana for our company in California. He was a voice in a box on the table in most of our meetings, which was kind of OK except that, you know, if we had a really heated debate and we didn't like what he was saying, we might just hang up on him.

(Laughter)

04:33
Then we might have a meeting after that meeting and actually make the decisions in the hallway afterwards when he wasn't there anymore. So that wasn't so great for him.

04:41
And as a robotics company at Willow, we had some extra robot body parts laying around, so Dallas and his buddy Curt put together this thing, which looks kind of like Skype on a stick on wheels, which seems like a techy, silly toy, but really it's probably one of the most powerful tools that I've seen ever made for remote collaboration.

04:59
So now, if I didn't answer Dallas' email question, he could literally roll into my office, block my doorway and ask me the question again --

(Laughter)

05:08
until I answered it. And I'm not going to turn him off, right? That's kind of rude.

05:12
Not only was it good for these one-on-one communications, but also for just showing up at the company all-hands meeting. Getting your butt in that chair and showing people that you're present and committed to your project is a big deal and can help remote collaboration a ton. We saw this over the period of months and then years, not only at our company but at others, too.

05:32
The best thing that can happen with these systems is that it starts to feel like you're just there. It's just you, it's just your body, and so people actually start to give these things personal space. So when you're having a stand-up meeting, people will stand around the space just as they would if you were there in person. That's great -- until there are breakdowns, and then it's not.

05:50
People, when they first see these robots, are like, "Wow, where are the components? There must be a camera over there," and they start poking your face. "You're talking too softly, I'm going to turn up your volume," which is like having a coworker walk up to you and say, "You're speaking too softly, I'm going to turn up your face." That's awkward and not OK, and so we end up having to build these new social norms around using these systems.

06:12
Similarly, as you start feeling like it's your body, you start noticing things like, "Oh, my robot is kind of short." Dallas would say things to me -- he was six feet tall -- and we would take him via robot to cocktail parties and things like that, as you do, and the robot was about five feet tall, which is close to my height.

06:30
And he would tell me, "You know, people are not really looking at me. I feel like I'm just looking at this sea of shoulders, and it's just -- we need a taller robot." And I told him, "Um, no. You get to walk in my shoes for today. You get to see what it's like to be on the shorter end of the spectrum."

06:47
And he actually ended up building a lot of empathy for that experience, which was kind of great. So when he'd come visit in person, he no longer stood over me as he was talking to me, he would sit down and talk to me eye to eye, which was kind of a beautiful thing.

06:59
So we actually decided to look at this in the laboratory and see what other kinds of differences things like robot height would make. And so half of the people in our study used a shorter robot, half of the people in our study used a taller robot, and we actually found that the exact same person, who has the exact same body and says the exact same things as someone, is more persuasive and perceived as being more credible if they're in a taller robot form.

07:21
It makes no rational sense, but that's why we study psychology.

07:25
And really, you know, the way that Cliff Nass would put this is that we're having to deal with these new technologies despite the fact that we have very old brains. Human psychology is not changing at the same speed that tech is, and so we're always playing catch-up, trying to make sense of this world where these autonomous things are running around. Usually, things that talk are people, not machines, right?

07:45
And so we breathe a lot of meaning into things like just the height of a machine, not a person, and attribute that to the person using the system.

07:55
You know, this, I think, is really important when you're thinking about robotics. It's not so much about reinventing humans, it's more about figuring out how we extend ourselves, right? And we end up using things in ways that are sort of surprising.

08:07
So these guys can't play pool because the robots don't have arms, but they can heckle the guys who are playing pool, and that can be an important thing for team bonding, which is kind of neat. People who get really good at operating these systems will even do things like make up new games, like robot soccer in the middle of the night, pushing the trash cans around.

08:26
But not everyone's good. A lot of people have trouble operating these systems. This is actually a guy who logged into the robot, and his eyeball was turned 90 degrees to the left. He didn't know that, so he ended up just bashing around the office, running into people's desks, getting super embarrassed, laughing about it -- his volume was way too high. And this guy here in the image is telling me, "We need a robot mute button."

08:48
And by that what he really meant was that we don't want it to be so disruptive. So as a robotics company, we added some obstacle avoidance to the system. It got a little laser range finder that could see the obstacles, and if I as a robot operator tried to, say, run into a chair, it wouldn't let me, it would just plan a path around, which seems like a good idea.

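A minimal sketch of that kind of guarded teleoperation, assuming a toy laser-scan format (the cone width, the 0.5 m safe distance, and the candidate headings are illustrative, not Willow Garage's actual parameters): the scan vetoes an operator command that would hit something and substitutes a heading that goes around it.

```python
import math

SAFE_DISTANCE_M = 0.5  # refuse to drive at anything closer than this

def min_range_ahead(scan, heading_rad, cone_rad=0.3):
    """Closest laser return inside a cone around the commanded heading.
    `scan` is a list of (angle_rad, range_m) pairs from the range finder."""
    ahead = [r for a, r in scan if abs(a - heading_rad) < cone_rad]
    return min(ahead, default=math.inf)

def filter_command(operator_heading, scan):
    """Obey the operator when the way is clear; otherwise plan around."""
    if min_range_ahead(scan, operator_heading) > SAFE_DISTANCE_M:
        return operator_heading               # clear: obey as commanded
    for offset in (0.4, -0.4, 0.8, -0.8):     # try headings to either side
        if min_range_ahead(scan, operator_heading + offset) > SAFE_DISTANCE_M:
            return operator_heading + offset  # detour around the obstacle
    return None                               # boxed in: just stop

# Operator drives straight at a chair 0.3 m ahead; the filter veers aside.
scan = [(0.0, 0.3), (0.4, 2.0), (-0.4, 2.0)]
print(filter_command(0.0, scan))  # -> 0.4 rad, not the commanded 0.0
```
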
09:06
People did hit fewer obstacles using that system, obviously, but actually, for some of the people, it took them a lot longer to get through our obstacle course, and we wanted to know why.

09:17
It turns out that there's this important human dimension -- a personality dimension called locus of control -- and people who have a strong internal locus of control -- they need to be the masters of their own destiny -- really don't like giving up control to an autonomous system, so much so that they will fight the autonomy: "If I want to hit that chair, I'm going to hit that chair." And so they would actually suffer from having that autonomous assistance, which is an important thing for us to know as we're building increasingly autonomous, say, cars, right? How are different people going to grapple with that loss of control?

09:50
It's going to be different depending on human dimensions. We can't treat humans as if we're just one monolithic thing. We vary by personality, by culture, we even vary by emotional state moment to moment, and to be able to design these systems, these human-robot interaction systems, we need to take into account the human dimensions, not just the technological ones.

10:11
Along with a sense of control also comes a sense of responsibility. And if you were a robot operator using one of these systems, this is what the interface would look like. It looks a little bit like a video game, which can be good because that's very familiar to people, but it can also be bad because it makes people feel like it's a video game.

10:29
We had a bunch of kids over at Stanford play with the system and drive the robot around our office in Menlo Park, and the kids started saying things like, "10 points if you hit that guy over there. 20 points for that one." And they would chase them down the hallway.

(Laughter)

10:43
I told them, "Um, those are real people. They're actually going to bleed and feel pain if you hit them." And they'd be like, "OK, got it." But five minutes later, they would be like, "20 points for that guy over there, he just looks like he needs to get hit." It's a little bit like "Ender's Game," right?

10:58
There is a real world on that other side, and I think it's our responsibility as people designing these interfaces to help people remember that there are real consequences to their actions, and to feel a sense of responsibility when they're operating these increasingly autonomous things.

11:13
These are kind of a great example of experimenting with one possible robotic future, and I think it's pretty cool that we can extend ourselves and learn about the ways that we extend ourselves into these machines while at the same time being able to express our humanity and our personality. We also build empathy for others in terms of being shorter, taller, faster, slower, and maybe even armless, which is kind of neat.

11:38
We also build empathy for the robots themselves. This is one of my favorite robots. It's called the Tweenbot. And this guy has a little flag that says, "I'm trying to get to this intersection in Manhattan," and it's cute and rolls forward, that's it. It doesn't know how to build a map, it doesn't know how to see the world, it just asks for help.

11:56
The nice thing about people is that the robot can actually depend upon the kindness of strangers. It did make it across the park to the other side of Manhattan -- which is pretty great -- just because people would pick it up and point it in the right direction.

(Laughter)

12:10
And that's great, right?

12:11
We're trying to build this human-robot world in which we can coexist and collaborate with one another, and we don't need to be fully autonomous and just do things on our own. We actually do things together.

12:22
And to make that happen, we actually need help from people like the artists and the designers, the policy makers, the legal scholars, psychologists, sociologists, anthropologists -- we need more perspectives in the room if we're going to do the thing that Stu Card says we should do, which is invent the future that we actually want to live in.

12:40
And I think we can continue to experiment with these different robotic futures together, and in doing so, we will end up learning a lot more about ourselves.

12:50
Thank you.

(Applause)