Don't fear superintelligent AI | Grady Booch

270,465 views ・ 2017-03-13

TED



00:12
When I was a kid, I was the quintessential nerd. I think some of you were, too. (Laughter) And you, sir, who laughed the loudest, you probably still are. (Laughter)

00:26
I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun. (Laughter) You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea. (Laughter)

00:57
Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life. Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

01:37
I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.

01:59
After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars. Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars. If there's trouble, there's not enough time.

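A rough sanity check of that 13-minute figure; the distances below are commonly cited orbital averages, not numbers from the talk:

```python
# One-way radio delay at light speed, using approximate average distances.
SPEED_OF_LIGHT_KM_S = 299_792.458   # km/s
EARTH_MOON_KM = 384_400             # average Earth-Moon distance, km
EARTH_MARS_KM = 225_000_000         # often-quoted average Earth-Mars distance, km

def one_way_delay_s(distance_km: float) -> float:
    """Seconds for a radio signal to cover distance_km."""
    return distance_km / SPEED_OF_LIGHT_KM_S

print(f"Moon: {one_way_delay_s(EARTH_MOON_KM):.1f} s")          # ~1.3 s
print(f"Mars: {one_way_delay_s(EARTH_MARS_KM) / 60:.1f} min")   # ~12.5 min
```

At the Moon, a round trip is under three seconds, so Houston can steer in real time; at Mars, a question and its answer are nearly half an hour apart, which is why the control has to ride along.
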
02:32
And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft. Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team.

02:55
Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies. (Laughter)

03:14
Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.

03:44
The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes. Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.

04:37
So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried it would destroy all civil conversation. At a point in time we saw the written word become pervasive, people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.

05:21
So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game -- Well, I would. You would, too. I like flowers. Come on. To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game. If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law but at the same time I am fusing with it the sense of mercy and justice that is part of that law.

06:18
In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values. To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained.

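A minimal sketch of that teach-by-example idea, assuming scikit-learn and its iris flower dataset (my choices for illustration; the talk names no library or dataset). The labeled examples play the role of the ground truth he describes:

```python
# Teaching, not programming: a classifier learns to recognize flowers
# from labeled examples (the "ground truth") rather than hand-written rules.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)   # flower measurements plus correct species labels
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier().fit(X_train, y_train)   # "show it the flowers"
print(f"Accuracy on flowers it has never seen: {model.score(X_test, y_test):.2f}")
```

Whatever judgments are baked into those labels are exactly what the system absorbs, which is the point Booch is making about values.
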
06:35
But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial training and subtle training far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.

07:14
Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

08:06
With all due respect to these brilliant minds, I believe that they are fundamentally wrong. Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator" in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us. And in the end -- don't tell Siri this -- we can always unplug them. (Laughter)

09:17
We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction because the rise of computing itself brings to us a number of human and societal issues to which we must now attend.

09:41
How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

10:01
And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning. Thank you very much. (Applause)