The 4 greatest threats to the survival of humanity

503,273 views ・ 2022-07-19

TED-Ed



Translator: Hilda Chan · Reviewer: Shelley Tsang 曾雯海
00:10
In January of 1995, Russia detected a nuclear missile headed its way. The alert went all the way to the president, who was deciding whether to strike back when another system contradicted the initial warning. What they thought was the first missile in a massive attack was actually a research rocket studying the Northern Lights.

00:35
This incident happened after the end of the Cold War, but was nevertheless one of the closest calls we’ve had to igniting a global nuclear war.

00:45
With the invention of the atomic bomb, humanity gained the power to destroy itself for the first time in our history. Since then, our existential risk—risk of either extinction or the unrecoverable collapse of human civilization—has steadily increased.

01:04
It’s well within our power to reduce this risk, but in order to do so, we have to understand which of our activities pose existential threats now, and which might in the future.
01:16
So far, our species has survived 2,000 centuries, each with some extinction risk from natural causes—asteroid impacts, supervolcanoes, and the like. Assessing existential risk is an inherently uncertain business, because usually when we try to figure out how likely something is, we check how often it's happened before. But the complete destruction of humanity has never happened before. While there’s no perfect method to determine our risk from natural threats, experts estimate it’s about 1 in 10,000 per century.
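A quick back-of-envelope check of that figure (not from the talk; it assumes, purely for illustration, that the 1-in-10,000 rate held independently in every one of those 2,000 centuries):

\[
P(\text{survive all } 2000 \text{ centuries}) = \left(1 - \tfrac{1}{10\,000}\right)^{2000} \approx e^{-0.2} \approx 0.82
\]

That is, a constant natural risk at that level would still leave roughly an 82% chance of surviving humanity's entire 200,000-year history, which is at least consistent with the fact that we did.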
01:52
Nuclear weapons were our first addition to that baseline. While there are many risks associated with nuclear weapons, the existential risk comes from the possibility of a global nuclear war that leads to a nuclear winter, where soot from burning cities blocks out the sun for years, causing the crops that humanity depends on to fail.

02:15
We haven't had a nuclear war yet, but our track record is too short to tell if they’re inherently unlikely or we’ve simply been lucky. We also can’t say for sure whether a global nuclear war would cause a nuclear winter so severe it would pose an existential threat to humanity.
02:33
The next major addition to our existential risk was climate change. Like nuclear war, climate change could result in a lot of terrible scenarios that we should be working hard to avoid, but that would stop short of causing extinction or unrecoverable collapse.

02:52
We expect a few degrees Celsius of warming, but can’t yet completely rule out 6 or even 10 degrees, which would cause a calamity of possibly unprecedented proportions. Even in this worst-case scenario, it’s not clear whether warming would pose a direct existential risk, but the disruption it would cause would likely make us more vulnerable to other existential risks.
03:15
The greatest risks may come from technologies that are still emerging. Take engineered pandemics. The biggest catastrophes in human history have been from pandemics. And biotechnology is enabling us to modify and create germs that could be much more deadly than naturally occurring ones. Such germs could cause pandemics through biowarfare and research accidents.

03:39
Decreased costs of genome sequencing and modification, along with increased availability of potentially dangerous information like the published genomes of deadly viruses, also increase the number of people and groups who could potentially create such pathogens.
03:55
Another concern is unaligned AI. Most AI researchers think this will be the century where we develop artificial intelligence that surpasses human abilities across the board. If we cede this advantage, we place our future in the hands of the systems we create. Even if created solely with humanity’s best interests in mind, superintelligent AI could pose an existential risk if it isn’t perfectly aligned with human values—a task scientists are finding extremely difficult.
04:27
Based on what we know at this point, some experts estimate the anthropogenic existential risk is more than 100 times higher than the background rate of natural risk.
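Putting the talk's two numbers together (a rough combination; the talk itself states only the factor, not the product):

\[
\text{anthropogenic risk} > 100 \times \tfrac{1}{10\,000} = \tfrac{1}{100} \text{ per century}
\]

That is, more than a 1% chance of extinction or unrecoverable collapse every century.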
04:38
But these odds depend heavily on human choices, because most of the risk is from human action, and it’s within human control. If we treat safeguarding humanity's future as the defining issue of our time, we can reduce this risk.

04:55
Whether humanity fulfils its potential—or not—is in our hands.