Don't fear superintelligent AI | Grady Booch

270,465 views ・ 2017-03-13

TED



Translator: Melody Tang  Reviewer: Helen Chang
00:12 When I was a kid, I was the quintessential nerd. I think some of you were, too.

00:19 (Laughter)

00:20 And you, sir, who laughed the loudest, you probably still are.

00:23 (Laughter)

00:26 I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun.

00:39 (Laughter)

00:40 You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea.

00:56 (Laughter)

00:57 Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life.

01:24 Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

01:37 I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.

01:59 After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars. Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars.
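The delay quoted in the talk can be sanity-checked with light-travel arithmetic. The sketch below assumes an average Earth–Mars distance of about 225 million km, a commonly quoted figure not stated in the talk (the actual distance swings between roughly 55 and 400 million km over the orbital cycle):

```python
# One-way light delay from Earth to Mars at an assumed average distance.
AVG_DISTANCE_KM = 225e6          # assumed average Earth-Mars distance
SPEED_OF_LIGHT_KM_S = 299_792.458

delay_seconds = AVG_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
delay_minutes = delay_seconds / 60
print(f"{delay_minutes:.1f} minutes")  # ~12.5 minutes, close to the 13 quoted
```

At closest approach the delay drops to roughly 3 minutes, and near conjunction it exceeds 20, which is why the "on average 13 minutes" framing matters for mission design.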
02:28 If there's trouble, there's not enough time. And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft. Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team.

02:55 Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies.

03:12 (Laughter)

03:14 Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.

03:44 The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes. Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.

04:37 So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried it would destroy all civil conversation. At a point in time we saw the written word become pervasive, people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.

05:21 So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game -- Well, I would. You would, too.

05:54 I like flowers. Come on.

05:57 To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game.
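The "we don't program them, we teach them" idea can be illustrated with a toy learner. This is not how a real vision or Go system is built (those use deep networks trained on vastly more data); it is a minimal nearest-centroid classifier over made-up flower measurements, showing a system acquiring behavior from labeled examples rather than hand-written rules:

```python
# Toy "teaching by example": learn per-class centroids from labeled
# feature vectors, then classify new inputs by the nearest centroid.

def train(examples):
    """examples: list of (feature_vector, label) pairs."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    # Average the accumulated feature sums into one centroid per label.
    return {lbl: [v / counts[lbl] for v in vec] for lbl, vec in sums.items()}

def predict(centroids, features):
    """Return the label whose centroid is closest (squared distance)."""
    def dist(lbl):
        return sum((a - b) ** 2 for a, b in zip(centroids[lbl], features))
    return min(centroids, key=dist)

# Hypothetical data: (petal length, petal width) for two made-up classes.
data = [([1.4, 0.2], "daisy"), ([1.5, 0.3], "daisy"),
        ([4.7, 1.4], "rose"), ([4.9, 1.5], "rose")]
model = train(data)
print(predict(model, [1.3, 0.25]))  # → daisy
```

Nothing in `predict` mentions daisies or roses explicitly; the behavior comes entirely from the examples it was shown, which is the point the talk is making about values: the training data carries them in.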
06:06 If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law but at the same time I am fusing with it the sense of mercy and justice that is part of that law. In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values. To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained.

06:35 But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial training and subtle training far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.

07:14 Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

08:06 With all due respect to these brilliant minds, I believe that they are fundamentally wrong. Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator" in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us.

09:09 And in the end -- don't tell Siri this -- we can always unplug them.

09:13 (Laughter)

09:17 We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction, because the rise of computing itself brings to us a number of human and societal issues to which we must now attend. How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

10:01 And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning.

10:14 Thank you very much.

10:15 (Applause)