How civilization could destroy itself -- and 4 ways we could prevent it | Nick Bostrom

153,547 views ・ 2020-01-17

TED



Translator: NAN-KUN WU  Reviewer: Thomas Tam
00:13
Chris Anderson: Nick Bostrom. So, you have already given us so many crazy ideas out there. I think a couple of decades ago, you made the case that we might all be living in a simulation, or perhaps probably were. More recently, you've painted the most vivid examples of how artificial general intelligence could go horribly wrong. And now this year, you're about to publish a paper that presents something called the vulnerable world hypothesis. And our job this evening is to give the illustrated guide to that. So let's do that. What is that hypothesis?
00:52
Nick Bostrom: It's trying to think about a sort of structural feature of the current human condition. You like the urn metaphor, so I'm going to use that to explain it. So picture a big urn filled with balls representing ideas, methods, possible technologies. You can think of the history of human creativity as the process of reaching into this urn and pulling out one ball after another, and the net effect so far has been hugely beneficial, right? We've extracted a great many white balls, some various shades of gray, mixed blessings. We haven't so far pulled out the black ball -- a technology that invariably destroys the civilization that discovers it. So the paper tries to think about what could such a black ball be.
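The urn metaphor lends itself to a toy Monte Carlo sketch. This is not from Bostrom's paper; the probability of a black ball and the number of draws below are made-up numbers purely for illustration:

```python
import random

def draw_until_black(p_black, max_draws, rng):
    """Draw balls until a black one appears or max_draws is reached.
    Returns the number of draws completed safely."""
    for i in range(max_draws):
        if rng.random() < p_black:
            return i  # the black ball came out on draw i
    return max_draws  # survived every draw

# Purely hypothetical numbers, chosen for illustration.
rng = random.Random(0)
trials = 10_000
p_black = 0.001   # assumed chance that any given invention is a black ball
draws = 500       # assumed number of inventions pulled from the urn
survived = sum(draw_until_black(p_black, draws, rng) == draws
               for _ in range(trials))
print(f"Fraction of simulated histories with no black ball: {survived / trials:.2%}")
```

Even with a tiny per-draw probability, repeated drawing makes eventually hitting a black ball likely, which is the structural point of the metaphor.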
01:43
CA: So you define that ball as one that would inevitably bring about civilizational destruction.
01:48
NB: Unless we exit what I call the semi-anarchic default condition. But sort of, by default.
01:56
CA: So, you make the case compelling by showing some sort of counterexamples where you believe that so far we've actually got lucky, that we might have pulled out that death ball without even knowing it. So there's this quote, what's this quote?
02:12
NB: Well, I guess it's just meant to illustrate the difficulty of foreseeing what basic discoveries will lead to. We just don't have that capability. Because we have become quite good at pulling out balls, but we don't really have the ability to put the ball back into the urn, right. We can invent, but we can't un-invent. So our strategy, such as it is, is to hope that there is no black ball in the urn.
02:38
CA: So once it's out, it's out, and you can't put it back in, and you think we've been lucky. So talk through a couple of these examples. You talk about different types of vulnerability.
02:49
NB: So the easiest type to understand is a technology that just makes it very easy to cause massive amounts of destruction. Synthetic biology might be a fecund source of that kind of black ball, but many other possible things we could -- think of geoengineering, really great, right? We could combat global warming, but you don't want it to get too easy either, you don't want any random person and his grandmother to have the ability to radically alter the earth's climate. Or maybe lethal autonomous drones, mass-produced, mosquito-sized killer bot swarms. Nanotechnology, artificial general intelligence.
03:29
CA: You argue in the paper that it's a matter of luck that when we discovered that nuclear power could create a bomb, it might have been the case that you could have created a bomb with much easier resources, accessible to anyone.
03:43
NB: Yeah, so think back to the 1930s where for the first time we make some breakthroughs in nuclear physics, some genius figures out that it's possible to create a nuclear chain reaction and then realizes that this could lead to the bomb. And we do some more work, it turns out that what you require to make a nuclear bomb is highly enriched uranium or plutonium, which are very difficult materials to get. You need ultracentrifuges, you need reactors, like, massive amounts of energy. But suppose it had turned out instead there had been an easy way to unlock the energy of the atom. That maybe by baking sand in the microwave oven or something like that you could have created a nuclear detonation. So we know that that's physically impossible. But before you did the relevant physics, how could you have known how it would turn out?
04:32
CA: Although, couldn't you argue that for life to evolve on Earth that implied sort of stable environment, that if it was possible to create massive nuclear reactions relatively easily, the Earth would never have been stable, that we wouldn't be here at all.
04:47
NB: Yeah, unless there were something that is easy to do on purpose but that wouldn't happen by random chance. So, like things we can easily do, we can stack 10 blocks on top of one another, but in nature, you're not going to find, like, a stack of 10 blocks.
05:00
CA: OK, so this is probably the one that many of us worry about most, and yes, synthetic biology is perhaps the quickest route that we can foresee in our near future to get us here.
05:10
NB: Yeah, and so think about what that would have meant if, say, anybody by working in their kitchen for an afternoon could destroy a city. It's hard to see how modern civilization as we know it could have survived that. Because in any population of a million people, there will always be some who would, for whatever reason, choose to use that destructive power. So if that apocalyptic residual would choose to destroy a city, or worse, then cities would get destroyed.
05:38
CA: So here's another type of vulnerability. Talk about this.
05:42
NB: Yeah, so in addition to these kind of obvious types of black balls that would just make it possible to blow up a lot of things, other types would act by creating bad incentives for humans to do things that are harmful. So, the Type-2a, we might call it that, is to think about some technology that incentivizes great powers to use their massive amounts of force to create destruction. So, nuclear weapons were actually very close to this, right? What we did, we spent over 10 trillion dollars to build 70,000 nuclear warheads and put them on hair-trigger alert. And there were several times during the Cold War we almost blew each other up. It's not because a lot of people felt this would be a great idea, let's all spend 10 trillion dollars to blow ourselves up, but the incentives were such that we were finding ourselves -- this could have been worse. Imagine if there had been a safe first strike. Then it might have been very tricky, in a crisis situation, to refrain from launching all their nuclear missiles. If nothing else, because you would fear that the other side might do it.
06:48
CA: Right, mutual assured destruction kept the Cold War relatively stable, without that, we might not be here now.
06:54
NB: It could have been more unstable than it was. And there could be other properties of technology. It could have been harder to have arms treaties, if instead of nuclear weapons there had been some smaller thing or something less distinctive.
07:06
CA: And as well as bad incentives for powerful actors, you also worry about bad incentives for all of us, in Type-2b here.
07:12
NB: Yeah, so, here we might take the case of global warming. There are a lot of little conveniences that cause each one of us to do things that individually have no significant effect, right? But if billions of people do it, cumulatively, it has a damaging effect. Now, global warming could have been a lot worse than it is. So we have the climate sensitivity parameter, right. It's a parameter that says how much warmer does it get if you emit a certain amount of greenhouse gases. But, suppose that it had been the case that with the amount of greenhouse gases we emitted, instead of the temperature rising by, say, between three and 4.5 degrees by 2100, suppose it had been 15 degrees or 20 degrees. Like, then we might have been in a very bad situation. Or suppose that renewable energy had just been a lot harder to do. Or that there had been more fossil fuels in the ground.
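The climate sensitivity parameter Bostrom mentions can be sketched with the common textbook approximation that equilibrium warming scales with the logarithm of the CO2 concentration ratio. The sensitivity values and concentrations below are illustrative assumptions, not figures from the talk:

```python
import math

def equilibrium_warming(co2_ppm, sensitivity_c_per_doubling, baseline_ppm=280.0):
    """Equilibrium warming in degrees C, using the common approximation that
    warming grows with the log2 of the CO2 concentration ratio."""
    return sensitivity_c_per_doubling * math.log2(co2_ppm / baseline_ppm)

# Same emissions outcome (a doubling of CO2 to 560 ppm) under two
# assumed sensitivities; 15 C is the hypothetical "unlucky physics" case.
for s in (3.0, 15.0):
    print(f"sensitivity {s:>4.1f} C/doubling -> {equilibrium_warming(560, s):.1f} C")
```

The point of the thought experiment is that the same human behavior, run through a five-times-larger sensitivity parameter, would have produced a five-times-larger temperature change.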
08:04
CA: Couldn't you argue that if in that case of -- if what we are doing today had resulted in 10 degrees difference in the time period that we could see, actually humanity would have got off its ass and done something about it. We're stupid, but we're not maybe that stupid. Or maybe we are.
08:21
NB: I wouldn't bet on it.

(Laughter)

You could imagine other features. So, right now, it's a little bit difficult to switch to renewables and stuff, right, but it can be done. But it might just have been, with slightly different physics, it could have been much more expensive to do these things.
08:40
CA: And what's your view, Nick? Do you think, putting these possibilities together, that this earth, humanity that we are, we count as a vulnerable world? That there is a death ball in our future?
08:55
NB: It's hard to say. I mean, I think there might well be various black balls in the urn, that's what it looks like. There might also be some golden balls that would help us protect against black balls. And I don't know which order they will come out.
09:12
CA: I mean, one possible philosophical critique of this idea is that it implies a view that the future is essentially settled. That there either is that ball there or it's not. And in a way, that's not a view of the future that I want to believe. I want to believe that the future is undetermined, that our decisions today will determine what kind of balls we pull out of that urn.
09:37
NB: I mean, if we just keep inventing, like, eventually we will pull out all the balls. I mean, I think there's a kind of weak form of technological determinism that is quite plausible, like, you're unlikely to encounter a society that uses flint axes and jet planes. But you can almost think of a technology as a set of affordances. So technology is the thing that enables us to do various things and achieve various effects in the world. How we'd then use that, of course depends on human choice. But if we think about these three types of vulnerability, they make quite weak assumptions about how we would choose to use them. So a Type-1 vulnerability, again, this massive, destructive power, it's a fairly weak assumption to think that in a population of millions of people there would be some that would choose to use it destructively.
10:24
CA: For me, the single most disturbing argument is that we actually might have some kind of view into the urn that makes it actually very likely that we're doomed. Namely, if you believe in accelerating power, that technology inherently accelerates, that we build the tools that make us more powerful, then at some point you get to a stage where a single individual can take us all down, and then it looks like we're screwed. Isn't that argument quite alarming?
10:56
NB: Ah, yeah.

(Laughter)

I think -- yeah, we get more and more power, and [it's] easier and easier to use those powers, but we can also invent technologies that kind of help us control how people use those powers.
11:14
CA: So let's talk about that, let's talk about the response. Suppose that thinking about all the possibilities that are out there now -- it's not just synbio, it's things like cyberwarfare, artificial intelligence, etc., etc. -- that there might be serious doom in our future. What are the possible responses? And you've talked about four possible responses as well.
11:39
NB: Restricting technological development doesn't seem promising, if we are talking about a general halt to technological progress. I think neither feasible, nor would it be desirable even if we could do it. I think there might be very limited areas where maybe you would want slower technological progress. You don't, I think, want faster progress in bioweapons, or in, say, isotope separation, that would make it easier to create nukes.
12:04
CA: I mean, I used to be fully on board with that. But I would like to actually push back on that for a minute. Just because, first of all, if you look at the history of the last couple of decades, you know, it's always been push forward at full speed, it's OK, that's our only choice. But if you look at globalization and the rapid acceleration of that, if you look at the strategy of "move fast and break things" and what happened with that, and then you look at the potential for synthetic biology, I don't know that we should move forward rapidly or without any kind of restriction to a world where you could have a DNA printer in every home and high school lab. There are some restrictions, right?
12:46
NB: Possibly, there is the first part, the not feasible. If you think it would be desirable to stop it, there's the problem of feasibility. So it doesn't really help if one nation kind of --
12:56
CA: No, it doesn't help if one nation does, but we've had treaties before. That's really how we survived the nuclear threat, was by going out there and going through the painful process of negotiating. I just wonder whether the logic isn't that we, as a matter of global priority, we shouldn't go out there and try, like, now start negotiating really strict rules on where synthetic bioresearch is done, that it's not something that you want to democratize, no?
13:24
NB: I totally agree with that -- that it would be desirable, for example, maybe to have DNA synthesis machines, not as a product where each lab has their own device, but maybe as a service. Maybe there could be four or five places in the world where you send in your digital blueprint and the DNA comes back, right? And then, you would have the ability, if one day it really looked like it was necessary, we would have like, a finite set of choke points. So I think you want to look for kind of special opportunities, where you could have tighter control.
13:57
CA: Your belief is, fundamentally, we are not going to be successful in just holding back. Someone, somewhere -- North Korea, you know -- someone is going to go there and discover this knowledge, if it's there to be found.
14:09
NB: That looks plausible under current conditions. It's not just synthetic biology, either. I mean, any kind of profound, new change in the world could turn out to be a black ball.
14:17
CA: Let's look at another possible response.
14:19
NB: This also, I think, has only limited potential. So, with the Type-1 vulnerability again, I mean, if you could reduce the number of people who are incentivized to destroy the world, if only they could get access and the means, that would be good.
14:34
CA: In this image that you asked us to do you're imagining these drones flying around the world with facial recognition. When they spot someone showing signs of sociopathic behavior, they shower them with love, they fix them.
14:45
NB: I think it's like a hybrid picture. Eliminate can either mean, like, incarcerate or kill, or it can mean persuade them to a better view of the world. But the point is that, suppose you were extremely successful in this, and you reduced the number of such individuals by half. And if you want to do it by persuasion, you are competing against all other powerful forces that are trying to persuade people, parties, religion, education system. But suppose you could reduce it by half, I don't think the risk would be reduced by half. Maybe by five or 10 percent.
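Why halving the number of motivated individuals barely moves the risk can be made concrete with a toy probability model (my own illustrative numbers, not Bostrom's): if the risk is the chance that at least one of n independent individuals acts, and that chance is already near saturation, cutting n in half changes it very little.

```python
def risk_at_least_one(n_individuals, p_act):
    """Probability that at least one of n individuals acts destructively,
    assuming each does so independently with probability p_act."""
    return 1 - (1 - p_act) ** n_individuals

# Hypothetical numbers for illustration only.
n, p = 10_000, 0.001
full = risk_at_least_one(n, p)
halved = risk_at_least_one(n // 2, p)
print(f"risk with {n:>6} motivated individuals: {full:.4f}")
print(f"risk with {n // 2:>6} motivated individuals: {halved:.4f}")
```

Under these assumed numbers both risks sit close to 1, so halving the pool of individuals removes only a small slice of the total risk, in the spirit of the "five or 10 percent" remark.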
15:15
CA: You're not recommending that we gamble humanity's future on response two.
15:20
NB: I think it's all good to try to deter and persuade people, but we shouldn't rely on that as our only safeguard.
15:26
CA: How about three?
15:27
NB: I think there are two general methods that we could use to achieve the ability to stabilize the world against the whole spectrum of possible vulnerabilities. And we probably would need both. So, one is an extremely effective ability to do preventive policing. Such that you could intercept. If anybody started to do this dangerous thing, you could intercept them in real time, and stop them. So this would require ubiquitous surveillance, everybody would be monitored all the time.
15:58
CA: This is "Minority Report," essentially, a form of.
16:00
NB: You would have maybe AI algorithms, big freedom centers that were reviewing this, etc., etc.
16:08
CA: You know that mass surveillance is not a very popular term right now?

(Laughter)
16:15
NB: Yeah, so this little device there, imagine that kind of necklace that you would have to wear at all times with multidirectional cameras. But, to make it go down better, just call it the "freedom tag" or something like that.

(Laughter)
16:30
CA: OK. I mean, this is the conversation, friends, this is why this is such a mind-blowing conversation.
16:37
NB: Actually, there's a whole big conversation on this on its own, obviously. There are huge problems and risks with that, right? We may come back to that. So the other, the final, the other general stabilization capability is kind of plugging another governance gap. So the surveillance would be kind of governance gap at the microlevel, like, preventing anybody from ever doing something highly illegal. Then, there's a corresponding governance gap at the macro level, at the global level. You would need the ability, reliably, to prevent the worst kinds of global coordination failures, to avoid wars between great powers, arms races, cataclysmic commons problems, in order to deal with the Type-2a vulnerabilities.
17:23
CA: Global governance is a term that's definitely way out of fashion right now, but could you make the case that throughout history, the history of humanity is that at every stage of technological power increase, people have reorganized and sort of centralized the power. So, for example, when a roving band of criminals could take over a society, the response was, well, you have a nation-state and you centralize force, a police force or an army, so, "No, you can't do that." The logic, perhaps, of having a single person or a single group able to take out humanity means at some point we're going to have to go this route, at least in some form, no?
18:02
NB: It's certainly true that the scale of political organization has increased over the course of human history. It used to be hunter-gatherer band, right, and then chiefdom, city-states, nations, now there are international organizations and so on and so forth. Again, I just want to make sure I get the chance to stress that obviously there are huge downsides and indeed, massive risks, both to mass surveillance and to global governance. I'm just pointing out that if we are lucky, the world could be such that these would be the only ways you could survive a black ball.
18:34
CA: The logic of this theory,
397
1114458
2518
克:這個理論的邏輯
18:37
it seems to me,
398
1117000
1268
對我來說似乎是
18:38
is that we've got to recognize we can't have it all.
399
1118292
3601
我們得認知是我們不能大小通吃。
18:41
That the sort of,
400
1121917
1833
我想說很多人都有 那些天真的夢,
18:45
I would say, naive dream that many of us had
401
1125500
2976
18:48
that technology is always going to be a force for good,
402
1128500
3351
以為科技永遠是股正面的力量,
18:51
keep going, don't stop, go as fast as you can
403
1131875
2976
持續向前,永不停歇,
可用全速前進,
18:54
and not pay attention to some of the consequences,
404
1134875
2351
而忽略了造成的後果,
這些後果肯定會發生。
18:57
that's actually just not an option.
405
1137250
1684
18:58
We can have that.
406
1138958
1935
我們當然可以全力發展科技。
19:00
If we have that,
407
1140917
1267
但如果我們這麽做的話, 就得接受一些令人難受的後果。
19:02
we're going to have to accept
408
1142208
1435
19:03
some of these other very uncomfortable things with it,
409
1143667
2559
19:06
and kind of be in this arms race with ourselves
410
1146250
2226
就好比我們的軍備競賽,
19:08
of, you want the power, you better limit it,
411
1148500
2268
如果你想要權力, 就要想辦法,去控制如何使用它。
19:10
you better figure out how to limit it.
412
1150792
2142
19:12
NB: I think it is an option,
413
1152958
3476
尼:我認為這是一條路, 一條很誘人的路,
19:16
a very tempting option, it's in a sense the easiest option
414
1156458
2768
某種意義上,是一條最好走 又容易成功的路,
19:19
and it might work,
415
1159250
1268
19:20
but it means we are fundamentally vulnerable to extracting a black ball.
416
1160542
4809
但也表示我們基本上 可能抽到黑球的脆弱性。
19:25
Now, I think with a bit of coordination,
417
1165375
2143
好,我覺得透過協調,
19:27
like, if you did solve this macrogovernance problem,
418
1167542
2726
例如你解決了巨觀治理
19:30
and the microgovernance problem,
419
1170292
1601
和微觀治理的問題,
19:31
then we could extract all the balls from the urn
420
1171917
2309
我們就可以從甕裡抽出所有的小球,
19:34
and we'd benefit greatly.
421
1174250
2268
並獲得很大的好處。
19:36
CA: I mean, if we're living in a simulation, does it matter?
422
1176542
3434
克:我是說,就算我們生活在 一個模擬世界當中,也沒關係吧﹗
19:40
We just reboot.
423
1180000
1309
重開機就好了。
19:41
(Laughter)
424
1181333
1268
(笑聲)
19:42
NB: Then ... I ...
425
1182625
1643
尼:那 ··· 我 ···
19:44
(Laughter)
426
1184292
2476
(笑聲)
19:46
I didn't see that one coming.
427
1186792
1416
我倒是沒想到這個。
19:50
CA: So what's your view?
428
1190125
1268
克:那你的想法是什麼?
19:51
Putting all the pieces together, how likely is it that we're doomed?
429
1191417
4809
總括來看,你覺得我們 完蛋的機率有多大?
19:56
(Laughter)
430
1196250
1958
(笑聲)
我喜歡大家對我這問題的笑聲。
19:59
I love how people laugh when you ask that question.
431
1199042
2392
20:01
NB: On an individual level,
432
1201458
1351
尼:以單一個體, 我們無論如何都會完蛋,
20:02
we seem to kind of be doomed anyway, just with the time line,
433
1202833
3851
是早晚的問題罷了。
20:06
we're rotting and aging and all kinds of things, right?
434
1206708
2601
身體凋零、老化和發生其他事情。
20:09
(Laughter)
435
1209333
1601
(笑聲)
20:10
It's actually a little bit tricky.
436
1210958
1685
問題有點棘手。
20:12
If you want to set up so that you can attach a probability,
437
1212667
2767
要假設能夠計算機率的情境,
首先要問我們是誰?
20:15
first, who are we?
438
1215458
1268
20:16
If you're very old, probably you'll die of natural causes,
439
1216750
2726
如果你很老的話,很可能自然死亡,
如果你很年輕, 你可能有 100 年生命——
20:19
if you're very young, you might have a 100-year --
440
1219500
2351
20:21
the probability might depend on who you ask.
441
1221875
2143
機率跟詢問對象有關。
然後是閾值,例如怎樣 才算是文明毀滅了?
20:24
Then the threshold, like, what counts as civilizational devastation?
442
1224042
4226
20:28
In the paper I don't require an existential catastrophe
443
1228292
5642
在我這篇論文裡面, 不一定要確實的災難,
20:33
in order for it to count.
444
1233958
1435
才計算這種情況。
20:35
This is just a definitional matter,
445
1235417
1684
這只是定義的問題, 例如十億人死亡,
20:37
I say a billion dead,
446
1237125
1309
20:38
or a reduction of world GDP by 50 percent,
447
1238458
2060
或是 GDP 下降 50%,
20:40
but depending on what you say the threshold is,
448
1240542
2226
但是根據你所設定的閾值, 你會得到不同的估計機率。
20:42
you get a different probability estimate.
449
1242792
1976
20:44
But I guess you could put me down as a frightened optimist.
450
1244792
4517
我想你可以笑我 是一個被嚇壞的樂天派。
20:49
(Laughter)
451
1249333
1101
(笑聲)
20:50
CA: You're a frightened optimist,
452
1250458
1643
克:的確是,
而且你還把另外一大群人嚇壞了。
20:52
and I think you've just created a large number of other frightened ...
453
1252125
4268
20:56
people.
454
1256417
1267
20:57
(Laughter)
455
1257708
1060
(笑聲)
20:58
NB: In the simulation.
456
1258792
1267
尼:在模擬情境當中。
克:在模擬情境當中。
21:00
CA: In a simulation.
457
1260083
1268
尼克.博斯特倫, 你的想法讓我吃驚。
21:01
Nick Bostrom, your mind amazes me,
458
1261375
1684
非常感謝你 把我們嚇得屁滾尿流。
21:03
thank you so much for scaring the living daylights out of us.
459
1263083
2893
(掌聲)
21:06
(Applause)
460
1266000
2375