Rodney Brooks: How robots will invade our lives

60,794 views ・ 2008-10-10

TED



Translator: Zhijun Wang  Reviewer: Zhu Jie
00:18
What I want to tell you about today is how I see robots invading our lives at multiple levels, over multiple timescales. And when I look out in the future, I can't imagine a world, 500 years from now, where we don't have robots everywhere. Assuming -- despite all the dire predictions from many people about our future -- assuming we're still around, I can't imagine the world not being populated with robots. And then the question is, well, if they're going to be here in 500 years, are they going to be everywhere sooner than that? Are they going to be around in 50 years? Yeah, I think that's pretty likely -- there's going to be lots of robots everywhere. And in fact I think that's going to be a lot sooner than that. I think we're sort of on the cusp of robots becoming common, and I think we're sort of around 1978 or 1980 in personal computer years, where the first few robots are starting to appear.

01:07
Computers sort of came around through games and toys. And you know, the first computer most people had in the house may have been a computer to play Pong, a little microprocessor embedded, and then other games that came after that. And we're starting to see that same sort of thing with robots: LEGO Mindstorms, Furbies -- who here -- did anyone here have a Furby? Yeah, there's 38 million of them sold worldwide. They are pretty common. And they're a little tiny robot, a simple robot with some sensors, a little bit of processing actuation. On the right there is another robot doll, who you could get a couple of years ago. And just as in the early days, when there was a lot of sort of amateur interaction over computers, you can now get various hacking kits, how-to-hack books. And on the left there is a platform from Evolution Robotics, where you put a PC on, and you program this thing with a GUI to wander around your house and do various stuff.

02:01
And then there's a higher price point sort of robot toys -- the Sony Aibo. And on the right there, is one that the NEC developed, the PaPeRo, which I don't think they're going to release. But nevertheless, those sorts of things are out there. And we've seen, over the last two or three years, lawn-mowing robots, Husqvarna on the bottom, Friendly Robotics on top there, an Israeli company. And then in the last 12 months or so we've started to see a bunch of home-cleaning robots appear. The top left one is a very nice home-cleaning robot from a company called Dyson, in the U.K. Except it was so expensive -- 3,500 dollars -- they didn't release it. But at the bottom left, you see Electrolux, which is on sale. Another one from Karcher. At the bottom right is one that I built in my lab about 10 years ago, and we finally turned that into a product. And let me just show you that. We're going to give this away I think, Chris said, after the talk. This is a robot that you can go out and buy, and that will clean up your floor.

03:05
And it starts off sort of just going around in ever-increasing circles. If it hits something -- you people see that? Now it's doing wall-following, it's following around my feet to clean up around me. Let's see, let's -- oh, who stole my Rice Krispies? They stole my Rice Krispies!

(Laughter)

Don't worry, relax, no, relax, it's a robot, it's smart!

(Laughter)

See, the three-year-old kids, they don't worry about it. It's grown-ups that get really upset.

(Laughter)

We'll just put some crap here.

(Laughter)

Okay.

(Laughter)

I don't know if you see -- so, I put a bunch of Rice Krispies there, I put some pennies, let's just shoot it at that, see if it cleans up. Yeah, OK. So -- we'll leave that for later.

(Applause)

04:22
Part of the trick was building a better cleaning mechanism, actually; the intelligence on board was fairly simple. And that's true with a lot of robots. We've all, I think, become, sort of computational chauvinists, and think that computation is everything, but the mechanics still matter.
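
That simplicity is easy to picture in code. Below is a minimal sketch, in Python, of the kind of behavior-based control loop the cleaning demo suggests: spiral outward in ever-increasing circles until the bumper fires, then switch to wall-following. The sensor and motor hooks (`bumped()`, `drive()`) are hypothetical stand-ins for illustration, not the actual product's API.

```python
import time

class CleaningRobot:
    """Minimal behavior-based controller: spiral, then wall-follow on bump.

    `bumped()` and `drive()` are hypothetical hardware hooks; a real robot
    would read a bump sensor and command wheel velocities here.
    """

    def __init__(self):
        self.mode = "spiral"
        self.turn_rate = 1.0  # rad/s; decays so the circles keep widening

    def bumped(self) -> bool:
        return False  # stub: replace with a real bump-sensor read

    def drive(self, speed: float, turn: float) -> None:
        pass  # stub: replace with real motor commands

    def step(self) -> None:
        if self.mode == "spiral":
            # Ever-increasing circles: constant speed, slowly decaying turn rate.
            self.drive(speed=0.3, turn=self.turn_rate)
            self.turn_rate *= 0.995
            if self.bumped():
                self.mode = "wall_follow"
        else:
            # Wall-following: drift gently toward the obstacle, and back
            # off whenever the bumper fires again.
            if self.bumped():
                self.drive(speed=0.0, turn=-1.0)  # turn away from the wall
            else:
                self.drive(speed=0.3, turn=0.2)   # drift back toward it

robot = CleaningRobot()
for _ in range(200):
    robot.step()
    time.sleep(0.02)  # 50 Hz control loop
```
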
04:40
Here's another robot, the PackBot, that we've been building for a bunch of years. It's a military surveillance robot, to go in ahead of troops -- looking at caves, for instance. But we had to make it fairly robust, much more robust than the robots we build in our labs.

(Laughter)

On board that robot is a PC running Linux. It can withstand a 400G shock. The robot has local intelligence: it can flip itself over, can get itself into communication range, can go upstairs by itself, et cetera. Okay, so it's doing local navigation there. A soldier gives it a command to go upstairs, and it does. That was not a controlled descent.

(Laughter)

Now it's going to head off.

05:56
And the big breakthrough for these robots, really, was September 11th. We had the robots down at the World Trade Center late that evening. Couldn't do a lot in the main rubble pile, things were just too -- there was nothing left to do. But we did go into all the surrounding buildings that had been evacuated, and searched for possible survivors in the buildings that were too dangerous to go into. Let's run this video.

06:23
Reporter: ...battlefield companions are helping to reduce the combat risks. Nick Robertson has that story.

06:31
Rodney Brooks: Can we have another one of these? Okay, good. So, this is a corporal who had seen a robot two weeks previously. He's sending robots into caves, looking at what's going on. The robot's being totally autonomous. The worst thing that's happened in the cave so far was one of the robots fell down ten meters.

07:08
So one year ago, the US military didn't have these robots. Now they're on active duty in Afghanistan every day. And that's one of the reasons they say a robot invasion is happening. There's a sea change happening in how -- where technology's going. Thanks.

07:23
And over the next couple of months, we're going to be sending robots in production down producing oil wells to get that last few years of oil out of the ground. Very hostile environments, 150°C, 10,000 PSI. Autonomous robots going down, doing this sort of work.

07:40
But robots like this, they're a little hard to program. How, in the future, are we going to program our robots and make them easier to use? And I want to actually use a robot here -- a robot named Chris -- stand up. Yeah. Okay. Come over here. Now notice, he thinks robots have to be a bit stiff. He sort of does that. But I'm going to --

Chris Anderson: I'm just British. RB: Oh.

(Laughter)

(Applause)

08:10
I'm going to show this robot a task. It's a very complex task. Now notice, he nodded there, he was giving me some indication he was understanding the flow of communication. And if I'd said something completely bizarre he would have looked askance at me, and regulated the conversation. So now I brought this up in front of him. I'd looked at his eyes, and I saw his eyes looked at this bottle top. And I'm doing this task here, and he's checking up. His eyes are going back and forth up to me, to see what I'm looking at -- so we've got shared attention. And so I do this task, and he looks, and he looks to me to see what's happening next. And now I'll give him the bottle, and we'll see if he can do the task. Can you do that?

(Laughter)

Okay. He's pretty good. Yeah. Good, good, good. I didn't show you how to do that. Now see if you can put it back together.

(Laughter)

And he thinks a robot has to be really slow. Good robot, that's good.

09:03
So we saw a bunch of things there. We saw when we're interacting, we're trying to show someone how to do something, we direct their visual attention. The other thing communicates their internal state to us, whether he's understanding or not, regulates a social interaction. There was shared attention looking at the same sort of thing, and recognizing socially communicated reinforcement at the end. And we've been trying to put that into our lab robots because we think this is how you're going to want to interact with robots in the future.

09:33
I just want to show you one technical diagram here. The most important thing for building a robot that you can interact with socially is its visual attention system. Because what it pays attention to is what it's seeing and interacting with, and what you're understanding what it's doing. So in the videos I'm about to show you, you're going to see a visual attention system on a robot which has -- it looks for skin tone in HSV space, so it works across all human colorings. It looks for highly saturated colors, from toys. And it looks for things that move around. And it weights those together into an attention window, and it looks for the highest-scoring place -- the stuff where the most interesting stuff is happening -- and that is what its eyes then segue to. And it looks right at that. At the same time, some top-down sort of stuff: might decide that it's lonely and look for skin tone, or might decide that it's bored and look for a toy to play with. And so these weights change.
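
In code terms, that weighting scheme might look like the sketch below: three bottom-up feature maps (skin tone, saturation, motion) combined under top-down weights that shift with the robot's motivational state, then an argmax over the summed saliency map. This is a minimal illustration of the idea, not Kismet's actual implementation; the feature extractors here are deliberately crude stand-ins.

```python
import numpy as np

def attention_target(frame_hsv: np.ndarray, prev_gray: np.ndarray,
                     gray: np.ndarray, weights: dict) -> tuple:
    """Pick the pixel the eyes should move to next.

    frame_hsv: HxWx3 image in HSV space (H in [0,180], S and V in [0,255]).
    weights: top-down gains per feature, e.g. {"skin": 1.0, "color": 0.5, ...}.
    """
    h, s, v = frame_hsv[..., 0], frame_hsv[..., 1], frame_hsv[..., 2]

    # Bottom-up feature maps, each roughly in [0, 1].
    skin = ((h < 25) & (s > 40) & (v > 60)).astype(float)  # crude skin-tone gate
    saturated = s / 255.0                                   # bright toy colors
    motion = np.abs(gray.astype(float) - prev_gray.astype(float)) / 255.0

    # Top-down modulation: a "lonely" robot boosts skin, a "bored" one boosts toys.
    saliency = (weights["skin"] * skin
                + weights["color"] * saturated
                + weights["motion"] * motion)

    # The highest-scoring place is where the eyes segue to.
    return np.unravel_index(np.argmax(saliency), saliency.shape)

# Example: a "lonely" weighting that favors faces over toys.
rng = np.random.default_rng(0)
hsv = rng.integers(0, 255, (120, 160, 3)).astype(np.uint8)
hsv[..., 0] %= 180
gray_prev = rng.integers(0, 255, (120, 160)).astype(np.uint8)
gray_now = rng.integers(0, 255, (120, 160)).astype(np.uint8)
print(attention_target(hsv, gray_prev, gray_now,
                       {"skin": 2.0, "color": 0.5, "motion": 1.0}))
```
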
10:30
And over here on the right, this is what we call the Steven Spielberg memorial module. Did people see the movie "AI"? (Audience: Yes.) RB: Yeah, it was really bad, but -- remember, especially when Haley Joel Osment, the little robot, looked at the blue fairy for 2,000 years without taking his eyes off it? Well, this gets rid of that, because this is a habituation Gaussian that gets negative, and more and more intense as it looks at one thing. And it gets bored, so it will then look away at something else.
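
A sketch of that habituation term, under assumptions: a negative Gaussian centered on the current gaze point, whose amplitude deepens the longer the gaze stays put, subtracted from the saliency map so the current target eventually loses the argmax. The width and growth rate are invented constants for illustration, not Kismet's tuning.

```python
import numpy as np

def habituate(saliency: np.ndarray, gaze: tuple, dwell_steps: int,
              sigma: float = 10.0, rate: float = 0.02) -> np.ndarray:
    """Subtract a gaze-centered Gaussian that deepens with dwell time.

    The longer the robot stares at `gaze`, the more negative the bump,
    until some other location wins the saliency argmax and it "gets bored".
    """
    ys, xs = np.indices(saliency.shape)
    dist2 = (ys - gaze[0]) ** 2 + (xs - gaze[1]) ** 2
    bump = np.exp(-dist2 / (2.0 * sigma ** 2))
    return saliency - rate * dwell_steps * bump

# The same spot wins early on, then loses out as habituation deepens.
sal = np.zeros((50, 50))
sal[25, 25], sal[10, 40] = 1.0, 0.8
for t in (0, 10, 40):
    winner = np.unravel_index(np.argmax(habituate(sal, (25, 25), t)), sal.shape)
    print(t, winner)  # gaze shifts to (10, 40) once dwell time is long enough
```
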
10:59
So, once you've got that -- and here's a robot, here's Kismet, looking around for a toy. You can tell what it's looking at. You can estimate its gaze direction from those eyeballs covering its camera, and you can tell when it's actually seeing the toy. And it's got a little bit of an emotional response here.

(Laughter)

But it's still going to pay attention if something more significant comes into its field of view -- such as Cynthia Breazeal, the builder of this robot, from the right. It sees her, pays attention to her.

11:33
Kismet has an underlying, three-dimensional emotional space, a vector space, of where it is emotionally. And at different places in that space, it expresses -- can we have the volume on here? Can you hear that now, out there? (Audience: Yeah.)

Kismet: Do you really think so? Do you really think so? Do you really think so?

12:00
RB: So it's expressing its emotion through its face and the prosody in its voice.
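
One way to picture that vector space is sketched below. Breazeal's published work on Kismet describes the three affect axes as arousal, valence, and stance; treat the axis names, the thresholds, and the expression labels here as illustrative assumptions rather than the robot's real mapping.

```python
from dataclasses import dataclass

@dataclass
class AffectState:
    """A point in a three-dimensional emotional space.

    Axis names follow Breazeal's published description of Kismet;
    the thresholds below are invented purely to illustrate mapping
    regions of the space to expressions.
    """
    arousal: float  # -1 (calm) .. +1 (excited)
    valence: float  # -1 (negative) .. +1 (positive)
    stance: float   # -1 (closed/withdrawn) .. +1 (open/approaching)

    def expression(self) -> str:
        if self.valence > 0.3:
            return "happy" if self.arousal > 0.0 else "content"
        if self.valence < -0.3:
            return "angry" if self.arousal > 0.0 else "sad"
        return "interested" if self.stance > 0.0 else "bored"

print(AffectState(arousal=0.6, valence=0.7, stance=0.5).expression())    # happy
print(AffectState(arousal=-0.4, valence=-0.6, stance=-0.2).expression()) # sad
```
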
12:05
And when I was dealing with my robot over here, Chris, the robot, was measuring the prosody in my voice, and so we have the robot measure prosody for four basic messages that mothers give their children pre-linguistically.
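
The talk doesn't name the four messages; in the Kismet literature they are usually given as approval, prohibition, attention-getting, and soothing, which is what the sketch below assumes. It caricatures the classification with just two prosodic features, mean pitch and pitch variability; a real system would fit learned models over many more features.

```python
import statistics

LABELS = ("approval", "prohibition", "attention", "soothing")  # assumed set

def classify_prosody(pitch_hz: list[float]) -> str:
    """Toy prosody classifier over a pitch contour (Hz per frame).

    Illustrative caricatures: approval is high and swoopy, attention
    high and flat, prohibition low with sharp drops, soothing low and flat.
    """
    mean = statistics.fmean(pitch_hz)
    spread = statistics.pstdev(pitch_hz)
    high, swoopy = mean > 220.0, spread > 40.0
    if high and swoopy:
        return "approval"
    if high:
        return "attention"
    if swoopy:
        return "prohibition"
    return "soothing"

print(classify_prosody([280, 380, 300, 220, 340]))  # exaggerated rises -> approval
print(classify_prosody([150, 148, 152, 149, 151]))  # low and flat -> soothing
```

In the clips that follow, it's this kind of signal, not the words themselves, that the robot reacts to.
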
12:21
Here we've got naive subjects praising the robot:

Voice: Nice robot. You're such a cute little robot.

(Laughter)

RB: And the robot's reacting appropriately.

Voice: ...very good, Kismet.

(Laughter)

Voice: Look at my smile.

12:46
RB: It smiles. She imitates the smile. This happens a lot. These are naive subjects. Here we asked them to get the robot's attention and indicate when they have the robot's attention.

Voice: Hey, Kismet, ah, there it is.

13:01
RB: So she realizes she has the robot's attention.

Voice: Kismet, do you like the toy? Oh.

13:13
RB: Now, here they're asked to prohibit the robot, and this first woman really pushes the robot into an emotional corner.

Voice: No. No. You're not to do that. No.

(Laughter)

Not appropriate. No. No.

(Laughter)

13:36
RB: I'm going to leave it at that. We put that together. Then we put in turn taking. When we talk to someone, we talk. Then we sort of raise our eyebrows, move our eyes, give the other person the idea it's their turn to talk. And then they talk, and then we pass the baton back and forth between each other. So we put this in the robot.
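
As a sketch, that baton-passing reduces to a tiny state machine driven by nonverbal cues: speak, hand over the floor with an eyebrow-raise/gaze cue, then wait for the other party to fall silent before taking the turn back. The states and inputs here are invented for illustration, not the lab's actual controller.

```python
import enum

class Turn(enum.Enum):
    ROBOT_SPEAKING = 1
    YIELDING = 2        # eyebrow raise / gaze shift: "your turn"
    HUMAN_SPEAKING = 3

def step(state: Turn, utterance_done: bool, human_voice: bool) -> Turn:
    """Advance the turn-taking state machine one tick.

    utterance_done: robot finished its (random-phoneme) utterance.
    human_voice: voice activity detected from the human.
    """
    if state is Turn.ROBOT_SPEAKING and utterance_done:
        return Turn.YIELDING          # signal the handover nonverbally
    if state is Turn.YIELDING and human_voice:
        return Turn.HUMAN_SPEAKING    # the human took the baton
    if state is Turn.HUMAN_SPEAKING and not human_voice:
        return Turn.ROBOT_SPEAKING    # silence: take the turn back
    return state

state = Turn.ROBOT_SPEAKING
for done, voice in [(True, False), (False, True), (False, False)]:
    state = step(state, done, voice)
    print(state)
```
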
13:56
We got a bunch of naive subjects in, we didn't tell them anything about the robot, sat them down in front of the robot and said, talk to the robot. Now what they didn't know was, the robot wasn't understanding a word they said, and that the robot wasn't speaking English. It was just saying random English phonemes. And I want you to watch carefully, at the beginning of this, where this person, Ritchie, who happened to talk to the robot for 25 minutes --

(Laughter)

-- says, "I want to show you something. I want to show you my watch." And he brings the watch center, into the robot's field of vision, points to it, gives it a motion cue, and the robot looks at the watch quite successfully. We don't know whether he understood or not that the robot -- Notice the turn-taking.

14:38
Ritchie: OK, I want to show you something. OK, this is a watch that my girlfriend gave me.

Robot: Oh, cool.

Ritchie: Yeah, look, it's got a little blue light in it too. I almost lost it this week.

(Laughter)

14:55
RB: So it's making eye contact with him, following his eyes.

Ritchie: Can you do the same thing? Robot: Yeah, sure.

15:00
RB: And they successfully have that sort of communication. And here's another aspect of the sorts of things that Chris and I were doing. This is another robot, Cog. They first make eye contact, and then, when Christie looks over at this toy, the robot estimates her gaze direction and looks at the same thing that she's looking at.

(Laughter)

So we're going to see more and more of this sort of robot over the next few years in labs.

15:24
But then the big questions, two big questions that people ask me are: if we make these robots more and more human-like, will we accept them, will we -- will they need rights eventually? And the other question people ask me is, will they want to take over?

(Laughter)

15:40
And on the first -- you know, this has been a very Hollywood theme with lots of movies. You probably recognize these characters here -- where in each of these cases, the robots want more respect. Well, do you ever need to give robots respect? They're just machines, after all. But I think, you know, we have to accept that we are just machines. After all, that's certainly what modern molecular biology says about us. You don't see a description of how, you know, Molecule A, you know, comes up and docks with this other molecule. And it's moving forward, you know, propelled by various charges, and then the soul steps in and tweaks those molecules so that they connect. It's all mechanistic. We are mechanism. If we are machines, then in principle at least, we should be able to build machines out of other stuff, which are just as alive as we are. But I think for us to admit that, we have to give up on our special-ness, in a certain way.

16:38
And we've had the retreat from special-ness under the barrage of science and technology many times over the last few hundred years, at least. 500 years ago we had to give up the idea that we are the center of the universe when the earth started to go around the sun; 150 years ago, with Darwin, we had to give up the idea we were different from animals. And to imagine -- you know, it's always hard for us. Recently we've been battered with the idea that maybe we didn't even have our own creation event, here on earth, which people didn't like much. And then the human genome said, maybe we only have 35,000 genes. And that was really -- people didn't like that, we've got more genes than that. We don't like to give up our special-ness, so, you know, having the idea that robots could really have emotions, or that robots could be living creatures -- I think is going to be hard for us to accept. But we're going to come to accept it over the next 50 years or so.

17:27
And the second question is, will the machines want to take over? And here the standard scenario is that we create these things, they grow, we nurture them, they learn a lot from us, and then they start to decide that we're pretty boring, slow. They want to take over from us. And for those of you that have teenagers, you know what that's like.

(Laughter)

But Hollywood extends it to the robots. And the question is, you know, will someone accidentally build a robot that takes over from us? And that's sort of like this lone guy in the backyard, you know -- "I accidentally built a 747." I don't think that's going to happen. And I don't think --

(Laughter)

-- I don't think we're going to deliberately build robots that we're uncomfortable with. We'll -- you know, they're not going to have a super bad robot. Before that has to come to be a mildly bad robot, and before that a not so bad robot.

(Laughter)

And we're just not going to let it go that way.

(Laughter)

18:25
So, I think I'm going to leave it at that: the robots are coming, we don't have too much to worry about, it's going to be a lot of fun, and I hope you all enjoy the journey over the next 50 years.

(Applause)