How civilization could destroy itself -- and 4 ways we could prevent it | Nick Bostrom

153,547 views ・ 2020-01-17

TED



Translator: Archi Xiao Reviewer: Jiasi Hao
00:13
Chris Anderson: Nick Bostrom. So, you have already given us so many crazy ideas out there. I think a couple of decades ago, you made the case that we might all be living in a simulation, or perhaps probably were. More recently, you've painted the most vivid examples of how artificial general intelligence could go horribly wrong. And now this year, you're about to publish a paper that presents something called the vulnerable world hypothesis. And our job this evening is to give the illustrated guide to that. So let's do that. What is that hypothesis?
00:52
Nick Bostrom: It's trying to think about a sort of structural feature of the current human condition. You like the urn metaphor, so I'm going to use that to explain it. So picture a big urn filled with balls representing ideas, methods, possible technologies. You can think of the history of human creativity as the process of reaching into this urn and pulling out one ball after another, and the net effect so far has been hugely beneficial, right? We've extracted a great many white balls, some various shades of gray, mixed blessings. We haven't so far pulled out the black ball -- a technology that invariably destroys the civilization that discovers it. So the paper tries to think about what could such a black ball be.
01:43
CA: So you define that ball as one that would inevitably bring about civilizational destruction.

01:48
NB: Unless we exit what I call the semi-anarchic default condition. But sort of, by default.
01:56
CA: So, you make the case compelling by showing some sort of counterexamples where you believe that so far we've actually got lucky, that we might have pulled out that death ball without even knowing it. So there's this quote, what's this quote?
02:12
NB: Well, I guess it's just meant to illustrate the difficulty of foreseeing what basic discoveries will lead to. We just don't have that capability. Because we have become quite good at pulling out balls, but we don't really have the ability to put the ball back into the urn, right. We can invent, but we can't un-invent. So our strategy, such as it is, is to hope that there is no black ball in the urn.
02:38
CA: So once it's out, it's out, and you can't put it back in, and you think we've been lucky. So talk through a couple of these examples. You talk about different types of vulnerability.
02:49
NB: So the easiest type to understand is a technology that just makes it very easy to cause massive amounts of destruction. Synthetic biology might be a fecund source of that kind of black ball, but many other possible things we could -- think of geoengineering, really great, right? We could combat global warming, but you don't want it to get too easy either, you don't want any random person and his grandmother to have the ability to radically alter the earth's climate. Or maybe lethal autonomous drones, mass-produced, mosquito-sized killer bot swarms. Nanotechnology, artificial general intelligence.
03:29
CA: You argue in the paper that it's a matter of luck that when we discovered that nuclear power could create a bomb, it might have been the case that you could have created a bomb with much easier resources, accessible to anyone.
03:43
NB: Yeah, so think back to the 1930s where for the first time we make some breakthroughs in nuclear physics, some genius figures out that it's possible to create a nuclear chain reaction and then realizes that this could lead to the bomb. And we do some more work, it turns out that what you require to make a nuclear bomb is highly enriched uranium or plutonium, which are very difficult materials to get. You need ultracentrifuges, you need reactors, like, massive amounts of energy. But suppose it had turned out instead there had been an easy way to unlock the energy of the atom. That maybe by baking sand in the microwave oven or something like that you could have created a nuclear detonation. So we know that that's physically impossible. But before you did the relevant physics, how could you have known how it would turn out?
04:32
CA: Although, couldn't you argue that for life to evolve on Earth, that implied a sort of stable environment, that if it was possible to create massive nuclear reactions relatively easily, the Earth would never have been stable, that we wouldn't be here at all.
04:47
NB: Yeah, unless there were something that is easy to do on purpose but that wouldn't happen by random chance. So, like things we can easily do, we can stack 10 blocks on top of one another, but in nature, you're not going to find, like, a stack of 10 blocks.
05:00
CA: OK, so this is probably the one that many of us worry about most, and yes, synthetic biology is perhaps the quickest route that we can foresee in our near future to get us here.
05:10
NB: Yeah, and so think about what that would have meant if, say, anybody by working in their kitchen for an afternoon could destroy a city. It's hard to see how modern civilization as we know it could have survived that. Because in any population of a million people, there will always be some who would, for whatever reason, choose to use that destructive power. So if that apocalyptic residual would choose to destroy a city, or worse, then cities would get destroyed.
05:38
CA: So here's another type of vulnerability. Talk about this.
05:42
NB: Yeah, so in addition to these kind of obvious types of black balls that would just make it possible to blow up a lot of things, other types would act by creating bad incentives for humans to do things that are harmful. So, the Type-2a, we might call it that, is to think about some technology that incentivizes great powers to use their massive amounts of force to create destruction. So, nuclear weapons were actually very close to this, right? What we did, we spent over 10 trillion dollars to build 70,000 nuclear warheads and put them on hair-trigger alert. And there were several times during the Cold War we almost blew each other up. It's not because a lot of people felt this would be a great idea, let's all spend 10 trillion dollars to blow ourselves up, but the incentives were such that we were finding ourselves -- this could have been worse. Imagine if there had been a safe first strike. Then it might have been very tricky, in a crisis situation, to refrain from launching all their nuclear missiles. If nothing else, because you would fear that the other side might do it.
06:48
CA: Right, mutual assured destruction kept the Cold War relatively stable; without that, we might not be here now.
06:54
NB: It could have been more unstable than it was. And there could be other properties of technology. It could have been harder to have arms treaties, if instead of nuclear weapons there had been some smaller thing or something less distinctive.
07:06
CA: And as well as bad incentives for powerful actors, you also worry about bad incentives for all of us, in Type-2b here.
07:12
NB: Yeah, so, here we might take the case of global warming. There are a lot of little conveniences that cause each one of us to do things that individually have no significant effect, right? But if billions of people do it, cumulatively, it has a damaging effect. Now, global warming could have been a lot worse than it is. So we have the climate sensitivity parameter, right. It's a parameter that says how much warmer does it get if you emit a certain amount of greenhouse gases. But, suppose that it had been the case that with the amount of greenhouse gases we emitted, instead of the temperature rising by, say, between three and 4.5 degrees by 2100, suppose it had been 15 degrees or 20 degrees. Like, then we might have been in a very bad situation. Or suppose that renewable energy had just been a lot harder to do. Or that there had been more fossil fuels in the ground.
08:04
CA: Couldn't you argue that if in that case of -- if what we are doing today had resulted in 10 degrees difference in the time period that we could see, actually humanity would have got off its ass and done something about it. We're stupid, but we're not maybe that stupid. Or maybe we are.
08:21
NB: I wouldn't bet on it.

08:22
(Laughter)

08:25
You could imagine other features. So, right now, it's a little bit difficult to switch to renewables and stuff, right, but it can be done. But it might just have been, with slightly different physics, it could have been much more expensive to do these things.
08:40
CA: And what's your view, Nick? Do you think, putting these possibilities together, that this earth, humanity that we are, we count as a vulnerable world? That there is a death ball in our future?
08:55
NB: It's hard to say. I mean, I think there might well be various black balls in the urn, that's what it looks like. There might also be some golden balls that would help us protect against black balls. And I don't know which order they will come out.
09:12
CA: I mean, one possible philosophical critique of this idea is that it implies a view that the future is essentially settled. That there either is that ball there or it's not. And in a way, that's not a view of the future that I want to believe. I want to believe that the future is undetermined, that our decisions today will determine what kind of balls we pull out of that urn.
09:37
NB: I mean, if we just keep inventing, like, eventually we will pull out all the balls. I mean, I think there's a kind of weak form of technological determinism that is quite plausible, like, you're unlikely to encounter a society that uses flint axes and jet planes. But you can almost think of a technology as a set of affordances. So technology is the thing that enables us to do various things and achieve various effects in the world. How we'd then use that, of course depends on human choice. But if we think about these three types of vulnerability, they make quite weak assumptions about how we would choose to use them. So a Type-1 vulnerability, again, this massive, destructive power, it's a fairly weak assumption to think that in a population of millions of people there would be some that would choose to use it destructively.
10:24
CA: For me, the most single disturbing argument
214
624542
2434
CA:对于我来说, 其中最令人心烦的论点
就是我们可能对于 缸里的东西已经有了一定的想法,
10:27
is that we actually might have some kind of view into the urn
215
627000
4559
10:31
that makes it actually very likely that we're doomed.
216
631583
3518
这让我们感觉—— 很可能我们要完了。
换句话说, 如果你相信加速的力量,
10:35
Namely, if you believe in accelerating power,
217
635125
4643
10:39
that technology inherently accelerates,
218
639792
2267
也就是技术固有的发展加速性,
10:42
that we build the tools that make us more powerful,
219
642083
2435
我们所打造的这些 让我们更强大的工具,
10:44
then at some point you get to a stage
220
644542
2642
随后到了某个时刻, 我们会进入
10:47
where a single individual can take us all down,
221
647208
3060
某个人仅凭一己之力 就能干掉所有人的境地,
10:50
and then it looks like we're screwed.
222
650292
2851
之后这看起来似乎我们都要完了。
这个论点不是有点令人恐慌?
10:53
Isn't that argument quite alarming?
223
653167
2934
10:56
NB: Ah, yeah.

10:58
(Laughter)

11:00
I think -- yeah, we get more and more power, and it's easier and easier to use those powers, but we can also invent technologies that kind of help us control how people use those powers.
11:14
CA: So let's talk about that, let's talk about the response. Suppose that, thinking about all the possibilities that are out there now -- it's not just synbio, it's things like cyberwarfare, artificial intelligence, etc., etc. -- that there might be serious doom in our future. What are the possible responses? And you've talked about four possible responses as well.
11:39
NB: Restricting technological development doesn't seem promising, if we are talking about a general halt to technological progress. I think that's neither feasible, nor would it be desirable even if we could do it. I think there might be very limited areas where maybe you would want slower technological progress. You don't, I think, want faster progress in bioweapons, or in, say, isotope separation, that would make it easier to create nukes.
12:04
CA: I mean, I used to be fully on board with that. But I would like to actually push back on that for a minute. Just because, first of all, if you look at the history of the last couple of decades, you know, it's always been push forward at full speed, it's OK, that's our only choice. But if you look at globalization and the rapid acceleration of that, if you look at the strategy of "move fast and break things" and what happened with that, and then you look at the potential for synthetic biology, I don't know that we should move forward rapidly or without any kind of restriction to a world where you could have a DNA printer in every home and high school lab. There are some restrictions, right?
12:46
NB: Possibly, there is the first part, the not feasible. If you think it would be desirable to stop it, there's the problem of feasibility. So it doesn't really help if one nation kind of --
12:56
CA: No, it doesn't help if one nation does,
267
776333
2018
CA:不,一个国家不管用,
12:58
but we've had treaties before.
268
778375
2934
但是我们之前有过国际条约。
这才是我们 真正度过核危机的方法——
13:01
That's really how we survived the nuclear threat,
269
781333
3351
13:04
was by going out there
270
784708
1268
走出国门,
历经痛苦的斡旋谈判。
13:06
and going through the painful process of negotiating.
271
786000
2518
13:08
I just wonder whether the logic isn't that we, as a matter of global priority,
272
788542
5434
我在想,作为一项全球优先事务, 这其中的逻辑
不应该是我们走出国门并尝试,
13:14
we shouldn't go out there and try,
273
794000
1684
13:15
like, now start negotiating really strict rules
274
795708
2685
比如现在,开始协商并制定出 一些非常严格的规定
13:18
on where synthetic bioresearch is done,
275
798417
2684
来约束合成生物的研究吗?
这可不是什么 你想要民主化的东西,对吧?
13:21
that it's not something that you want to democratize, no?
276
801125
2851
13:24
NB: I totally agree with that -- that it would be desirable, for example, maybe to have DNA synthesis machines, not as a product where each lab has their own device, but maybe as a service. Maybe there could be four or five places in the world where you send in your digital blueprint and the DNA comes back, right? And then, you would have the ability, if one day it really looked like it was necessary, we would have, like, a finite set of choke points. So I think you want to look for kind of special opportunities, where you could have tighter control.
13:57
CA: Your belief is, fundamentally, we are not going to be successful in just holding back. Someone, somewhere -- North Korea, you know -- someone is going to go there and discover this knowledge, if it's there to be found.
14:09
NB: That looks plausible under current conditions. It's not just synthetic biology, either. I mean, any kind of profound, new change in the world could turn out to be a black ball.
14:17
CA: Let's look at another possible response.
14:19
NB: This also, I think, has only limited potential. So, with the Type-1 vulnerability again, I mean, if you could reduce the number of people who are incentivized to destroy the world, if only they could get access and the means, that would be good.
14:34
CA: In this image that you asked us to do
305
874875
1976
CA:在这个你希望我们想象的图景中,
14:36
you're imagining these drones flying around the world
306
876875
2559
你想象着带着面部识别的无人机
满世界飞。
14:39
with facial recognition.
307
879458
1268
当它们发现有人表现出 反社会行为倾向时,
14:40
When they spot someone showing signs of sociopathic behavior,
308
880750
2893
14:43
they shower them with love, they fix them.
309
883667
2184
它们会施以爱的沐浴,并修正他们。
14:45
NB: I think it's like a hybrid picture. Eliminate can either mean, like, incarcerate or kill, or it can mean persuade them to a better view of the world. But the point is that, suppose you were extremely successful in this, and you reduced the number of such individuals by half. And if you want to do it by persuasion, you are competing against all other powerful forces that are trying to persuade people, parties, religion, education system. But suppose you could reduce it by half, I don't think the risk would be reduced by half. Maybe by five or 10 percent.
15:15
CA: You're not recommending that we gamble humanity's future on response two.
15:20
NB: I think it's all good to try to deter and persuade people, but we shouldn't rely on that as our only safeguard.

15:26
CA: How about three?
15:27
NB: I think there are two general methods that we could use to achieve the ability to stabilize the world against the whole spectrum of possible vulnerabilities. And we probably would need both. So, one is an extremely effective ability to do preventive policing, such that you could intercept: if anybody started to do this dangerous thing, you could intercept them in real time and stop them. So this would require ubiquitous surveillance, everybody would be monitored all the time.
15:58
CA: This is "Minority Report," essentially, a form of it.
16:00
NB: You would have maybe AI algorithms, big freedom centers that were reviewing this, etc., etc.
16:08
CA: You know that mass surveillance is not a very popular term right now?

16:13
(Laughter)
16:15
NB: Yeah, so this little device there, imagine that kind of necklace that you would have to wear at all times with multidirectional cameras. But, to make it go down better, just call it the "freedom tag" or something like that.

16:28
(Laughter)
16:30
CA: OK. I mean, this is the conversation, friends, this is why this is such a mind-blowing conversation.
16:37
NB: Actually, there's a whole big conversation on this on its own, obviously. There are huge problems and risks with that, right? We may come back to that. So the other, the final, the other general stabilization capability is kind of plugging another governance gap. So the surveillance would be kind of a governance gap at the microlevel, like, preventing anybody from ever doing something highly illegal. Then, there's a corresponding governance gap at the macro level, at the global level. You would need the ability, reliably, to prevent the worst kinds of global coordination failures, to avoid wars between great powers, arms races, cataclysmic commons problems, in order to deal with the Type-2a vulnerabilities.
17:23
CA: Global governance is a term that's definitely way out of fashion right now, but could you make the case that throughout history, the history of humanity, at every stage of technological power increase, people have reorganized and sort of centralized the power? So, for example, when a roving band of criminals could take over a society, the response was, well, you have a nation-state and you centralize force, a police force or an army, so, "No, you can't do that." The logic, perhaps, of having a single person or a single group able to take out humanity means at some point we're going to have to go this route, at least in some form, no?
18:02
NB: It's certainly true that the scale of political organization has increased over the course of human history. It used to be hunter-gatherer bands, right, and then chiefdoms, city-states, nations, now there are international organizations and so on and so forth. Again, I just want to make sure I get the chance to stress that obviously there are huge downsides and indeed, massive risks, both to mass surveillance and to global governance. I'm just pointing out that if we are lucky, the world could be such that these would be the only ways you could survive a black ball.
18:34
CA: The logic of this theory,
397
1114458
2518
CA:这个理论的逻辑,
就我而言,
18:37
it seems to me,
398
1117000
1268
18:38
is that we've got to recognize we can't have it all.
399
1118292
3601
我们得承认我们不可能无往不胜。
18:41
That the sort of,
400
1121917
1833
也就是,
18:45
I would say, naive dream that many of us had
401
1125500
2976
我会说, 那种很多人都有的痴心妄想中,
18:48
that technology is always going to be a force for good,
402
1128500
3351
技术永远是一种向善的力量,
18:51
keep going, don't stop, go as fast as you can
403
1131875
2976
继续前进、别停下来、全速前进,
18:54
and not pay attention to some of the consequences,
404
1134875
2351
并且不计后果,
其实根本没有这个选择。
18:57
that's actually just not an option.
405
1137250
1684
18:58
We can have that.
406
1138958
1935
我们可以全速发展科技。
19:00
If we have that,
407
1140917
1267
但如果我们继续这么发展科技,
19:02
we're going to have to accept
408
1142208
1435
我们就将要必须接受
19:03
some of these other very uncomfortable things with it,
409
1143667
2559
随之而来的 一些非常令人不适的不便,
19:06
and kind of be in this arms race with ourselves
410
1146250
2226
这有点像和我们自己比赛,
19:08
of, you want the power, you better limit it,
411
1148500
2268
如果你想拥有力量, 你最好能限制它的使用,
19:10
you better figure out how to limit it.
412
1150792
2142
你最好想办法如何限制它的使用。
19:12
NB: I think it is an option,
413
1152958
3476
NB:我想这是一种选择,
19:16
a very tempting option, it's in a sense the easiest option
414
1156458
2768
非常诱人的一种选择, 某种程度上也是最简单的选择,
19:19
and it might work,
415
1159250
1268
还可能奏效,
19:20
but it means we are fundamentally vulnerable to extracting a black ball.
416
1160542
4809
但这也意味着我们本质上很脆弱, 不能承受取出黑球之重。
19:25
Now, I think with a bit of coordination,
417
1165375
2143
现在我认为,一定的协调能力
19:27
like, if you did solve this macrogovernance problem,
418
1167542
2726
比方说如果真的解决了 宏观管理问题
19:30
and the microgovernance problem,
419
1170292
1601
以及微观治理问题,
19:31
then we could extract all the balls from the urn
420
1171917
2309
那么我们可以一次性 从缸里取出所有的小球。
19:34
and we'd benefit greatly.
421
1174250
2268
那么我们将会极其受益。
19:36
CA: I mean, if we're living in a simulation, does it matter?
422
1176542
3434
CA:我想,如果我们活在 一个模拟世界中,这又有什么关系?
我们重启就好了。
19:40
We just reboot.
423
1180000
1309
19:41
(Laughter)
424
1181333
1268
(笑声)
19:42
NB: Then ... I ...
425
1182625
1643
NB:那……我……
19:44
(Laughter)
426
1184292
2476
(笑声)
19:46
I didn't see that one coming.
427
1186792
1416
我没想到你会这么说。
19:50
CA: So what's your view?
428
1190125
1268
CA:那么你的看法呢?
19:51
Putting all the pieces together, how likely is it that we're doomed?
429
1191417
4809
综合来看, 我们在劫难逃的几率有多高?
19:56
(Laughter)
430
1196250
1958
(笑声)
我喜欢当你问这个问题的时候 大家笑成这样。
19:59
I love how people laugh when you ask that question.
431
1199042
2392
20:01
NB: On an individual level,
432
1201458
1351
NB:在个人的层面,
20:02
we seem to kind of be doomed anyway, just with the time line,
433
1202833
3851
从时间线上来看 我们终究难逃一劫,
20:06
we're rotting and aging and all kinds of things, right?
434
1206708
2601
我们在腐烂,在老化, 诸如此类的,对吧?
20:09
(Laughter)
435
1209333
1601
(笑声)
20:10
It's actually a little bit tricky.
436
1210958
1685
事实上很难说。
20:12
If you want to set up so that you can attach a probability,
437
1212667
2767
如果你想通过一个设定来附加概率,
首先要问的是,我们是谁?
20:15
first, who are we?
438
1215458
1268
20:16
If you're very old, probably you'll die of natural causes,
439
1216750
2726
如果你年事已高, 很可能你会自然死亡,
如果你还很年轻, 你可能能活到 100 岁——
20:19
if you're very young, you might have a 100-year --
440
1219500
2351
20:21
the probability might depend on who you ask.
441
1221875
2143
这个概率的大小 取决于你问的对象是谁。
接着我们得问,怎样才算是文明毁灭?
20:24
Then the threshold, like, what counts as civilizational devastation?
442
1224042
4226
20:28
In the paper I don't require an existential catastrophe
443
1228292
5642
在文章里我不需要存在的灾难
20:33
in order for it to count.
444
1233958
1435
来计算概率。
20:35
This is just a definitional matter,
445
1235417
1684
主要看你怎样定义,
我可以说 10 亿的死亡人数,
20:37
I say a billion dead,
446
1237125
1309
20:38
or a reduction of world GDP by 50 percent,
447
1238458
2060
或者 GDP 下降 50% ,
20:40
but depending on what you say the threshold is,
448
1240542
2226
但是一切取决于 你所设定的起点是什么,
20:42
you get a different probability estimate.
449
1242792
1976
起点不同, 所得到的概率估算随之不同。
20:44
But I guess you could put me down as a frightened optimist.
450
1244792
4517
但是我想你可以把我看做是 一个害怕的乐观主义者吧。
20:49
(Laughter)
451
1249333
1101
(笑声)
20:50
CA: You're a frightened optimist,
452
1250458
1643
CA:如果你是一个害怕的乐观主义者,
20:52
and I think you've just created a large number of other frightened ...
453
1252125
4268
那我想你刚刚就催生了一帮 同样害怕的……
20:56
people.
454
1256417
1267
人们。
20:57
(Laughter)
455
1257708
1060
(笑声)
20:58
NB: In the simulation.
456
1258792
1267
NB:在模拟世界里。
CA:在一个模拟世界里。
21:00
CA: In a simulation.
457
1260083
1268
21:01
Nick Bostrom, your mind amazes me,
458
1261375
1684
尼克 · 博斯特罗姆, 你的思维真让我大开眼界,
非常感谢你今天在光天化日之下 把我们大家都吓得不行。
21:03
thank you so much for scaring the living daylights out of us.
459
1263083
2893
(掌声)
21:06
(Applause)
460
1266000
2375