Translator: kane tan
Reviewer: Shelley Krishna Tsang
00:18
What I want to tell you about today is how I see robots invading our lives at multiple levels, over multiple timescales. And when I look out in the future, I can't imagine a world, 500 years from now, where we don't have robots everywhere. Assuming -- despite all the dire predictions from many people about our future -- assuming we're still around, I can't imagine the world not being populated with robots. And then the question is, well, if they're going to be here in 500 years, are they going to be everywhere sooner than that?
00:46
Are they going to be around in 50 years? Yeah, I think that's pretty likely -- there's going to be lots of robots everywhere. And in fact I think that's going to be a lot sooner than that. I think we're sort of on the cusp of robots becoming common, and I think we're sort of around 1978 or 1980 in personal computer years, where the first few robots are starting to appear.
01:07
Computers sort of came around through games and toys. And you know, the first computer most people had in the house may have been a computer to play Pong, a little microprocessor embedded, and then other games that came after that.
01:21
And we're starting to see that same sort of thing with robots: LEGO Mindstorms, Furbies -- who here -- did anyone here have a Furby? Yeah, there's 38 million of them sold worldwide. They are pretty common. And they're a little tiny robot, a simple robot with some sensors, a little bit of processing actuation. On the right there is another robot doll, who you could get a couple of years ago.
01:40
And just as in the early days, when there was a lot of sort of amateur interaction over computers, you can now get various hacking kits, how-to-hack books. And on the left there is a platform from Evolution Robotics, where you put a PC on, and you program this thing with a GUI to wander around your house and do various stuff.
02:01
And then there's a higher price point sort of robot toys -- the Sony Aibo. And on the right there, is one that the NEC developed, the PaPeRo, which I don't think they're going to release. But nevertheless, those sorts of things are out there.
02:14
And we've seen, over the last two or three years, lawn-mowing robots, Husqvarna on the bottom, Friendly Robotics on top there, an Israeli company. And then in the last 12 months or so we've started to see a bunch of home-cleaning robots appear.
02:30
The top left one is a very nice home-cleaning robot from a company called Dyson, in the U.K. Except it was so expensive -- 3,500 dollars -- they didn't release it. But at the bottom left, you see Electrolux, which is on sale. Another one from Karcher. At the bottom right is one that I built in my lab about 10 years ago, and we finally turned that into a product. And let me just show you that. We're going to give this away I think, Chris said, after the talk. This is a robot that you can go out and buy, and that will clean up your floor.
03:05
And it starts off sort of just going around in ever-increasing circles. If it hits something -- you people see that? Now it's doing wall-following, it's following around my feet to clean up around me. Let's see, let's -- oh, who stole my Rice Krispies? They stole my Rice Krispies!
(Laughter)
Don't worry, relax, no, relax, it's a robot, it's smart!
(Laughter)
See, the three-year-old kids, they don't worry about it. It's grown-ups that get really upset.
(Laughter)
We'll just put some crap here.
(Laughter)
Okay.
(Laughter)
I don't know if you see -- so, I put a bunch of Rice Krispies there, I put some pennies, let's just shoot it at that, see if it cleans up. Yeah, OK. So -- we'll leave that for later.
(Applause)
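The coverage behavior he narrates, spiraling outward in ever-increasing circles and switching to wall-following on contact, can be sketched as a two-state controller. This is a minimal illustration; the state names, speeds, and sensor interface are assumptions, not the product's actual firmware.

```python
# Toy sketch of a spiral-then-wall-follow coverage strategy.
# All constants and sensor names are illustrative assumptions.

SPIRAL, WALL_FOLLOW = "spiral", "wall_follow"

class Cleaner:
    def __init__(self):
        self.state = SPIRAL
        self.radius = 0.1          # spiral radius in meters, grows each tick

    def step(self, bumper_hit: bool, wall_seen: bool) -> dict:
        """Return motor commands for one control tick."""
        if self.state == SPIRAL:
            if bumper_hit:
                self.state = WALL_FOLLOW   # obstacle: hug the wall instead
                return {"forward": 0.0, "turn": 0.5}
            self.radius += 0.01            # ever-increasing circles
            return {"forward": 0.2, "turn": 0.2 / self.radius}
        # wall-following: steer gently toward the wall, turn away on contact
        turn = -0.3 if bumper_hit else (0.1 if wall_seen else 0.3)
        return {"forward": 0.15, "turn": turn}

bot = Cleaner()
bot.step(bumper_hit=False, wall_seen=False)   # still spiraling
bot.step(bumper_hit=True, wall_seen=False)    # bump: switches state
print(bot.state)  # wall_follow
```

The appeal of this kind of controller is that it needs no map at all, which fits his point later that the on-board intelligence was fairly simple.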
04:22
Part of the trick was building a better cleaning mechanism, actually; the intelligence on board was fairly simple. And that's true with a lot of robots. We've all, I think, become sort of computational chauvinists, and think that computation is everything, but the mechanics still matter.
04:40
Here's another robot, the PackBot, that we've been building for a bunch of years. It's a military surveillance robot, to go in ahead of troops -- looking at caves, for instance. But we had to make it fairly robust, much more robust than the robots we build in our labs.
(Laughter)
On board that robot is a PC running Linux. It can withstand a 400G shock. The robot has local intelligence: it can flip itself over, can get itself into communication range, can go upstairs by itself, et cetera. Okay, so it's doing local navigation there. A soldier gives it a command to go upstairs, and it does. That was not a controlled descent.
(Laughter)
Now it's going to head off.
05:56
And the big breakthrough for these robots, really, was September 11th. We had the robots down at the World Trade Center late that evening. Couldn't do a lot in the main rubble pile, things were just too -- there was nothing left to do. But we did go into all the surrounding buildings that had been evacuated, and searched for possible survivors in the buildings that were too dangerous to go into. Let's run this video.
06:23
Reporter: ...battlefield companions are helping to reduce the combat risks. Nick Robertson has that story.
Rodney Brooks: Can we have another one of these? Okay, good. So, this is a corporal who had seen a robot two weeks previously. He's sending robots into caves, looking at what's going on. The robot's being totally autonomous. The worst thing that's happened in the cave so far was one of the robots fell down ten meters.
07:08
So one year ago, the US military didn't have these robots. Now they're on active duty in Afghanistan every day. And that's one of the reasons they say a robot invasion is happening. There's a sea change happening in how -- where technology's going. Thanks.
07:23
And over the next couple of months, we're going to be sending robots in production down producing oil wells to get that last few years of oil out of the ground. Very hostile environments, 150°C, 10,000 PSI. Autonomous robots going down, doing this sort of work. But robots like this, they're a little hard to program. How, in the future, are we going to program our robots and make them easier to use?
07:47
And I want to actually use a robot here -- a robot named Chris -- stand up. Yeah. Okay. Come over here. Now notice, he thinks robots have to be a bit stiff. He sort of does that. But I'm going to --
Chris Anderson: I'm just British. RB: Oh.
(Laughter)
(Applause)
I'm going to show this robot a task. It's a very complex task. Now notice, he nodded there, he was giving me some indication he was understanding the flow of communication. And if I'd said something completely bizarre he would have looked askance at me, and regulated the conversation. So now I brought this up in front of him. I'd looked at his eyes, and I saw his eyes looked at this bottle top. And I'm doing this task here, and he's checking up. His eyes are going back and forth up to me, to see what I'm looking at -- so we've got shared attention. And so I do this task, and he looks, and he looks to me to see what's happening next. And now I'll give him the bottle, and we'll see if he can do the task. Can you do that?
(Laughter)
Okay. He's pretty good. Yeah. Good, good, good. I didn't show you how to do that. Now see if you can put it back together.
(Laughter)
And he thinks a robot has to be really slow. Good robot, that's good. So we saw a bunch of things there.
09:06
We saw when we're interacting, we're trying to show someone how to do something, we direct their visual attention. The other thing communicates their internal state to us, whether he's understanding or not, regulates a social interaction. There was shared attention looking at the same sort of thing, and recognizing socially communicated reinforcement at the end. And we've been trying to put that into our lab robots because we think this is how you're going to want to interact with robots in the future.
09:33
I just want to show you one technical diagram here. The most important thing for building a robot that you can interact with socially is its visual attention system. Because what it pays attention to is what it's seeing and interacting with, and what you're understanding what it's doing. So in the videos I'm about to show you, you're going to see a visual attention system on a robot which has -- it looks for skin tone in HSV space, so it works across all human colorings. It looks for highly saturated colors, from toys. And it looks for things that move around. And it weights those together into an attention window, and it looks for the highest-scoring place -- the stuff where the most interesting stuff is happening -- and that is what its eyes then segue to. And it looks right at that.
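The attention scheme he describes, with skin-tone, saturation, and motion feature maps weighted together and gaze going to the highest-scoring spot, can be sketched roughly like this. The map sizes and weight values are illustrative assumptions, not the system's actual parameters.

```python
import numpy as np

# Toy version of a weighted visual-attention window: combine per-pixel
# feature maps with adjustable weights and look at the salience peak.

def attention_target(skin, saturation, motion, w=(1.0, 1.0, 1.0)):
    """Weight the feature maps together and return (row, col) of the peak."""
    salience = w[0] * skin + w[1] * saturation + w[2] * motion
    return np.unravel_index(np.argmax(salience), salience.shape)

skin = np.zeros((4, 4)); skin[1, 2] = 0.9   # a face-like patch
sat  = np.zeros((4, 4)); sat[3, 0] = 0.6    # a brightly colored toy
mot  = np.zeros((4, 4))                     # nothing moving

print(attention_target(skin, sat, mot))                       # (1, 2): skin wins
print(attention_target(skin, sat, mot, w=(0.1, 2.0, 1.0)))    # (3, 0): toy wins
```

Note how changing the weight vector flips the winner; that is exactly the hook the top-down drives described next need.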
10:19
At the same time, some top-down sort of stuff: might decide that it's lonely and look for skin tone, or might decide that it's bored and look for a toy to play with. And so these weights change. And over here on the right, this is what we call the Steven Spielberg memorial module. Did people see the movie "AI"? (Audience: Yes.) RB: Yeah, it was really bad, but -- remember, especially when Haley Joel Osment, the little robot, looked at the blue fairy for 2,000 years without taking his eyes off it? Well, this gets rid of that, because this is a habituation Gaussian that gets negative, and more and more intense as it looks at one thing. And it gets bored, so it will then look away at something else.
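The habituation idea can be sketched as a penalty that deepens the longer the gaze stays on one target, until some other target outscores it. In the robot the penalty is a Gaussian centered on the fixated spot; only the time dependence is shown here, and the gain constant is made up.

```python
# Toy habituation term: a fixation penalty that grows with time on target,
# so even a very strong stimulus eventually loses to a weaker one.
# The 0.2 gain is an illustrative assumption.

def habituated_score(base_score: float, time_on_target: float,
                     gain: float = 0.2) -> float:
    """Subtract a habituation penalty that deepens with fixation time."""
    return base_score - gain * time_on_target

# the blue-fairy case: a 0.9 stimulus against a 0.5 alternative
toy, fairy = 0.5, 0.9
t = 0.0
while habituated_score(fairy, t) >= toy:
    t += 1.0
print(t)  # 3.0: after three time steps the robot gets bored and looks away
```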
10:59
So, once you've got that -- and here's a robot, here's Kismet, looking around for a toy. You can tell what it's looking at. You can estimate its gaze direction from those eyeballs covering its camera, and you can tell when it's actually seeing the toy. And it's got a little bit of an emotional response here.
(Laughter)
But it's still going to pay attention if something more significant comes into its field of view -- such as Cynthia Breazeal, the builder of this robot, from the right. It sees her, pays attention to her.
11:33
Kismet has an underlying, three-dimensional emotional space, a vector space, of where it is emotionally. And at different places in that space, it expresses -- can we have the volume on here? Can you hear that now, out there? (Audience: Yeah.)
Kismet: Do you really think so? Do you really think so? Do you really think so?
12:00
RB: So it's expressing its emotion through its face and the prosody in its voice. And when I was dealing with my robot over here, Chris, the robot, was measuring the prosody in my voice, and so we have the robot measure prosody for four basic messages that mothers give their children pre-linguistically.
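He doesn't name the four pre-linguistic message types in the talk; in the published Kismet work they are roughly approval, prohibition, attention, and soothing, separated largely by pitch contour and energy. The rule set below is a toy stand-in under that assumption, with thresholds that are pure inventions, not the actual recognizer.

```python
# Toy prosody classifier over normalized (0..1) features.
# Classes follow the published Kismet work; thresholds are made up.

def classify_prosody(mean_pitch: float, pitch_range: float,
                     energy: float) -> str:
    """Crude contour rules: high and wide = approval, low and forceful =
    prohibition, wide swings = attention bids, low and flat = soothing."""
    if mean_pitch > 0.6 and pitch_range > 0.5:
        return "approval"        # exaggerated, swooping pitch
    if mean_pitch < 0.4 and energy > 0.6:
        return "prohibition"     # low, clipped, forceful
    if pitch_range > 0.5:
        return "attention"       # rising contour to grab the robot
    return "soothing"            # low energy, narrow pitch

print(classify_prosody(0.8, 0.7, 0.5))  # approval
print(classify_prosody(0.2, 0.2, 0.8))  # prohibition
```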
12:21
Here we've got naive subjects praising the robot:
Voice: Nice robot. You're such a cute little robot.
(Laughter)
RB: And the robot's reacting appropriately.
Voice: ...very good, Kismet.
(Laughter)
Voice: Look at my smile.
RB: It smiles. She imitates the smile. This happens a lot. These are naive subjects. Here we asked them to get the robot's attention and indicate when they have the robot's attention.
Voice: Hey, Kismet, ah, there it is.
RB: So she realizes she has the robot's attention.
Voice: Kismet, do you like the toy? Oh.
RB: Now, here they're asked to prohibit the robot, and this first woman really pushes the robot into an emotional corner.
Voice: No. No. You're not to do that. No.
(Laughter)
Not appropriate. No. No.
(Laughter)
RB: I'm going to leave it at that.
13:38
We put that together. Then we put in turn taking. When we talk to someone, we talk. Then we sort of raise our eyebrows, move our eyes, give the other person the idea it's their turn to talk. And then they talk, and then we pass the baton back and forth between each other. So we put this in the robot.
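The turn-taking protocol just described, speak, hand over the floor with gaze and eyebrow cues, listen, then take the baton back, amounts to a small state machine. The state names and cue inputs below are assumptions for illustration.

```python
# Toy turn-taking state machine: SPEAKING -> YIELDING -> LISTENING -> SPEAKING.
# States and cue names are illustrative assumptions.

SPEAKING, YIELDING, LISTENING = "speaking", "yielding", "listening"

class TurnTaker:
    def __init__(self):
        self.state = SPEAKING

    def tick(self, done_talking: bool, other_started: bool,
             other_stopped: bool) -> str:
        if self.state == SPEAKING and done_talking:
            self.state = YIELDING          # raise eyebrows, shift gaze
        elif self.state == YIELDING and other_started:
            self.state = LISTENING         # floor successfully handed over
        elif self.state == LISTENING and other_stopped:
            self.state = SPEAKING          # take the baton back
        return self.state

tt = TurnTaker()
print(tt.tick(done_talking=True, other_started=False, other_stopped=False))
print(tt.tick(done_talking=False, other_started=True, other_stopped=False))
```

Nothing here requires understanding speech content, which is why the experiment that follows works even though the robot is producing random phonemes.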
13:56
We got a bunch of naive subjects in, we didn't tell them anything about the robot, sat them down in front of the robot and said, talk to the robot. Now what they didn't know was, the robot wasn't understanding a word they said, and that the robot wasn't speaking English. It was just saying random English phonemes. And I want you to watch carefully, at the beginning of this, where this person, Ritchie, who happened to talk to the robot for 25 minutes --
(Laughter)
-- says, "I want to show you something. I want to show you my watch." And he brings the watch center, into the robot's field of vision, points to it, gives it a motion cue, and the robot looks at the watch quite successfully. We don't know whether he understood or not that the robot -- Notice the turn-taking.
14:38
Ritchie: OK, I want to show you something. OK, this is a watch that my girlfriend gave me.
Robot: Oh, cool.
Ritchie: Yeah, look, it's got a little blue light in it too. I almost lost it this week.
(Laughter)
RB: So it's making eye contact with him, following his eyes.
Ritchie: Can you do the same thing? Robot: Yeah, sure.
RB: And they successfully have that sort of communication.
15:02
And here's another aspect of the sorts of things that Chris and I were doing. This is another robot, Cog. They first make eye contact, and then, when Christie looks over at this toy, the robot estimates her gaze direction and looks at the same thing that she's looking at.
(Laughter)
So we're going to see more and more of this sort of robot over the next few years in labs.
15:24
But then the big questions, two big questions that people ask me are: if we make these robots more and more human-like, will we accept them, will we -- will they need rights eventually? And the other question people ask me is, will they want to take over?
(Laughter)
15:40
And on the first -- you know, this has been a very Hollywood theme with lots of movies. You probably recognize these characters here -- where in each of these cases, the robots want more respect. Well, do you ever need to give robots respect? They're just machines, after all.
15:56
But I think, you know, we have to accept that we are just machines. After all, that's certainly what modern molecular biology says about us. You don't see a description of how, you know, Molecule A, you know, comes up and docks with this other molecule. And it's moving forward, you know, propelled by various charges, and then the soul steps in and tweaks those molecules so that they connect. It's all mechanistic. We are mechanism. If we are machines, then in principle at least, we should be able to build machines out of other stuff, which are just as alive as we are.
16:33
But I think for us to admit that, we have to give up on our special-ness, in a certain way. And we've had the retreat from special-ness under the barrage of science and technology many times over the last few hundred years, at least. 500 years ago we had to give up the idea that we are the center of the universe when the earth started to go around the sun; 150 years ago, with Darwin, we had to give up the idea we were different from animals. And to imagine -- you know, it's always hard for us. Recently we've been battered with the idea that maybe we didn't even have our own creation event, here on earth, which people didn't like much. And then the human genome said, maybe we only have 35,000 genes. And that was really -- people didn't like that, we've got more genes than that. We don't like to give up our special-ness, so, you know, having the idea that robots could really have emotions, or that robots could be living creatures -- I think is going to be hard for us to accept. But we're going to come to accept it over the next 50 years or so.
17:27
And the second question is, will the machines want to take over? And here the standard scenario is that we create these things, they grow, we nurture them, they learn a lot from us, and then they start to decide that we're pretty boring, slow. They want to take over from us. And for those of you that have teenagers, you know what that's like.
(Laughter)
17:48
But Hollywood extends it to the robots. And the question is, you know, will someone accidentally build a robot that takes over from us? And that's sort of like this lone guy in the backyard, you know -- "I accidentally built a 747." I don't think that's going to happen. And I don't think --
(Laughter)
-- I don't think we're going to deliberately build robots that we're uncomfortable with. We'll -- you know, they're not going to have a super bad robot. Before that has to come to be a mildly bad robot, and before that a not so bad robot.
(Laughter)
And we're just not going to let it go that way.
(Laughter)
So, I think I'm going to leave it at that: the robots are coming, we don't have too much to worry about, it's going to be a lot of fun, and I hope you all enjoy the journey over the next 50 years.
(Applause)