Martin Rees: Can we prevent the end of the world?

151,587 views ・ 2014-08-25

TED



Translator: Evelina Wong    Reviewer: Zhiting Chen
00:12
Ten years ago, I wrote a book which I entitled "Our Final Century?" Question mark. My publishers cut out the question mark. (Laughter) The American publishers changed our title to "Our Final Hour." Americans like instant gratification and the reverse. (Laughter)

00:32
And my theme was this: Our Earth has existed for 45 million centuries, but this one is special — it's the first where one species, ours, has the planet's future in its hands. Over nearly all of Earth's history, threats have come from nature — disease, earthquakes, asteroids and so forth — but from now on, the worst dangers come from us.

00:59
And it's now not just the nuclear threat; in our interconnected world, network breakdowns can cascade globally; air travel can spread pandemics worldwide within days; and social media can spread panic and rumor literally at the speed of light. We fret too much about minor hazards — improbable air crashes, carcinogens in food, low radiation doses, and so forth — but we and our political masters are in denial about catastrophic scenarios. The worst have thankfully not yet happened. Indeed, they probably won't. But if an event is potentially devastating, it's worth paying a substantial premium to safeguard against it, even if it's unlikely, just as we take out fire insurance on our house.

01:54
And as science offers greater power and promise, the downside gets scarier too. We get ever more vulnerable. Within a few decades, millions will have the capability to misuse rapidly advancing biotech, just as they misuse cybertech today. Freeman Dyson, in a TED Talk, foresaw that children will design and create new organisms just as routinely as his generation played with chemistry sets. Well, this may be on the science fiction fringe, but were even part of his scenario to come about, our ecology and even our species would surely not survive long unscathed.

02:39
For instance, there are some eco-extremists who think that it would be better for the planet, for Gaia, if there were far fewer humans. What happens when such people have mastered synthetic biology techniques that will be widespread by 2050? And by then, other science fiction nightmares may transition to reality: dumb robots going rogue, or a network that develops a mind of its own threatens us all.

03:08
Well, can we guard against such risks by regulation? We must surely try, but these enterprises are so competitive, so globalized, and so driven by commercial pressure, that anything that can be done will be done somewhere, whatever the regulations say. It's like the drug laws — we try to regulate, but can't. And the global village will have its village idiots, and they'll have a global range.

03:35
So as I said in my book, we'll have a bumpy ride through this century. There may be setbacks to our society — indeed, a 50 percent chance of a severe setback. But are there conceivable events that could be even worse, events that could snuff out all life? When a new particle accelerator came online, some people anxiously asked, could it destroy the Earth or, even worse, rip apart the fabric of space? Well luckily, reassurance could be offered. I and others pointed out that nature has done the same experiments zillions of times already, via cosmic ray collisions. But scientists should surely be precautionary about experiments that generate conditions without precedent in the natural world. Biologists should avoid release of potentially devastating genetically modified pathogens.

04:32
And by the way, our special aversion to the risk of truly existential disasters depends on a philosophical and ethical question, and it's this: Consider two scenarios. Scenario A wipes out 90 percent of humanity. Scenario B wipes out 100 percent. How much worse is B than A? Some would say 10 percent worse. The body count is 10 percent higher. But I claim that B is incomparably worse. As an astronomer, I can't believe that humans are the end of the story. It is five billion years before the sun flares up, and the universe may go on forever, so post-human evolution, here on Earth and far beyond, could be as prolonged as the Darwinian process that's led to us, and even more wonderful. And indeed, future evolution will happen much faster, on a technological timescale, not a natural selection timescale. So we surely, in view of those immense stakes, shouldn't accept even a one in a billion risk that human extinction would foreclose this immense potential.

05:48
Some scenarios that have been envisaged may indeed be science fiction, but others may be disquietingly real. It's an important maxim that the unfamiliar is not the same as the improbable, and in fact, that's why we at Cambridge University are setting up a center to study how to mitigate these existential risks. It seems it's worthwhile just for a few people to think about these potential disasters. And we need all the help we can get from others, because we are stewards of a precious pale blue dot in a vast cosmos, a planet with 50 million centuries ahead of it. And so let's not jeopardize that future.

06:29
And I'd like to finish with a quote from a great scientist called Peter Medawar. I quote, "The bells that toll for mankind are like the bells of Alpine cattle. They are attached to our own necks, and it must be our fault if they do not make a tuneful and melodious sound." Thank you very much.

06:49
(Applause)