Don't fear superintelligent AI | Grady Booch

270,465 views ・ 2017-03-13

TED


00:12
When I was a kid, I was the quintessential nerd. I think some of you were, too.

00:19
(Laughter)

00:20
And you, sir, who laughed the loudest, you probably still are.

00:23
(Laughter)

00:26
I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun.

00:39
(Laughter)

00:40
You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea.

00:56
(Laughter)

00:57
Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life. Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

01:37
I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.

01:59
After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars. Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars. If there's trouble, there's not enough time.
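That 13-minute figure is easy to check from the speed of light. A back-of-the-envelope sketch in Python, assuming an illustrative average Earth-Mars distance of about 225 million kilometers (the real separation swings between roughly 55 and 400 million kilometers as the two planets orbit):

    # One-way light-travel time for a radio signal from Earth to Mars.
    # The 225e9 m "average" distance is an assumption for illustration.
    AVG_DISTANCE_M = 225e9            # ~225 million km
    SPEED_OF_LIGHT_M_S = 299_792_458  # radio signals travel at light speed

    one_way_s = AVG_DISTANCE_M / SPEED_OF_LIGHT_M_S
    print(f"one-way delay: {one_way_s / 60:.1f} minutes")  # ~12.5 minutes

A command and its acknowledgment therefore take roughly 25 minutes round trip, which is why the solution that follows puts mission control on board the spacecraft.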
02:32
And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft. Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team.

02:55
Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies.

03:12
(Laughter)

03:14
Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.

03:44
The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes. Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.

04:37
So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried it would destroy all civil conversation. At a point in time we saw the written word become pervasive, people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.

05:21
So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game --

05:50
Well, I would. You would, too.

05:54
I like flowers. Come on.
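To make "we teach them" concrete, here is a minimal supervised-learning sketch, not Booch's own system, using scikit-learn and its classic iris flower dataset. Nothing in it hand-codes a rule about any flower; the model generalizes from labeled examples alone:

    # "Teach, don't program": show the system labeled flowers and let it
    # learn to recognize them -- no hand-written recognition rules.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)   # flower measurements + species labels
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)         # the "teaching" step
    print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

Showing it different examples would teach it different flowers, which is the point: what such a system knows is whatever we chose to show it.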
05:57
To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game.
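Real Go programs combine self-play with deep neural networks; as a toy stand-in for "play thousands of games and learn to discern good from bad," here is a tabular self-play sketch on the much smaller game of Nim (21 sticks, take 1 to 3 per turn, whoever takes the last stick wins). Everything in it is illustrative:

    # Toy self-play learner: play thousands of games, then judge positions
    # by how they tended to turn out. Values are from the perspective of
    # the player about to move; state 0 is seeded as a loss, because if
    # no sticks remain, your opponent just took the last one.
    import random

    values = {0: -1.0}          # sticks remaining -> estimated value
    EPSILON, ALPHA = 0.1, 0.05  # exploration rate, learning rate

    def best_move(sticks):
        moves = [m for m in (1, 2, 3) if m <= sticks]
        if random.random() < EPSILON:
            return random.choice(moves)  # explore now and then
        # A good move leaves the opponent the worst possible position.
        return min(moves, key=lambda m: values.get(sticks - m, 0.0))

    for _ in range(20_000):              # thousands of games of self-play
        sticks, history = 21, []
        while sticks > 0:
            history.append(sticks)
            sticks -= best_move(sticks)
        # Whoever took the last stick won. Walk the game backwards and
        # nudge each visited position toward its eventual outcome.
        result = 1.0
        for state in reversed(history):
            old = values.get(state, 0.0)
            values[state] = old + ALPHA * (result - old)
            result = -result             # alternate player perspectives

    # Positions with a multiple of 4 sticks trend toward -1: the learner
    # discovers they are lost without ever being told the rule.
    print({s: round(v, 2) for s, v in sorted(values.items())})

No one tells the learner that leaving a multiple of four is winning play; it discerns good positions from bad purely from outcomes, which is the sense in which such systems are taught rather than programmed.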
06:06
If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law but at the same time I am fusing with it the sense of mercy and justice that is part of that law. In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values. To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained.
06:35
But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial training and subtle training far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.

07:14
Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

08:06
With all due respect to these brilliant minds, I believe that they are fundamentally wrong. Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator," in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us.

09:09
And in the end -- don't tell Siri this -- we can always unplug them.

09:13
(Laughter)

09:17
We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction, because the rise of computing itself brings to us a number of human and societal issues to which we must now attend. How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

10:01
And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning.

10:14
Thank you very much.

10:15
(Applause)