The 4 greatest threats to the survival of humanity

503,273 views ・ 2022-07-19

TED-Ed



Translator: Lily Cheng · Reviewer: Helen Chang
00:10
In January of 1995, Russia detected a nuclear missile headed its way. The alert went all the way to the president, who was deciding whether to strike back when another system contradicted the initial warning. What they thought was the first missile in a massive attack was actually a research rocket studying the Northern Lights.
00:35
This incident happened after the end of the Cold War, but was nevertheless one of the closest calls we’ve had to igniting a global nuclear war.
00:45
With the invention of the atomic bomb, humanity gained the power to destroy itself for the first time in our history. Since then, our existential risk— risk of either extinction or the unrecoverable collapse of human civilization— has steadily increased.
01:04
It’s well within our power to reduce this risk, but in order to do so, we have to understand which of our activities pose existential threats now, and which might in the future.
01:16
So far, our species has survived 2,000 centuries, each with some extinction risk from natural causes— asteroid impacts, supervolcanoes, and the like.
01:28
Assessing existential risk is an inherently uncertain business because usually when we try to figure out how likely something is, we check how often it’s happened before. But the complete destruction of humanity has never happened before. While there’s no perfect method to determine our risk from natural threats, experts estimate it’s about 1 in 10,000 per century.
01:52
Nuclear weapons were our first addition to that baseline. While there are many risks associated with nuclear weapons, the existential risk comes from the possibility of a global nuclear war that leads to a nuclear winter, where soot from burning cities blocks out the sun for years, causing the crops that humanity depends on to fail.
02:15
We haven’t had a nuclear war yet, but our track record is too short to tell if they’re inherently unlikely or we’ve simply been lucky. We also can’t say for sure whether a global nuclear war would cause a nuclear winter so severe it would pose an existential threat to humanity.
02:33
The next major addition to our existential risk was climate change. Like nuclear war, climate change could result in a lot of terrible scenarios that we should be working hard to avoid, but that would stop short of causing extinction or unrecoverable collapse.
02:52
We expect a few degrees Celsius of warming, but can’t yet completely rule out 6 or even 10 degrees, which would cause a calamity of possibly unprecedented proportions. Even in this worst-case scenario, it’s not clear whether warming would pose a direct existential risk, but the disruption it would cause would likely make us more vulnerable to other existential risks.
03:15
The greatest risks may come from technologies that are still emerging. Take engineered pandemics. The biggest catastrophes in human history have been from pandemics. And biotechnology is enabling us to modify and create germs that could be much more deadly than naturally occurring ones. Such germs could cause pandemics through biowarfare and research accidents. Decreased costs of genome sequencing and modification, along with increased availability of potentially dangerous information like the published genomes of deadly viruses, also increase the number of people and groups who could potentially create such pathogens.
03:55
Another concern is unaligned AI. Most AI researchers think this will be the century where we develop artificial intelligence that surpasses human abilities across the board. If we cede this advantage, we place our future in the hands of the systems we create. Even if created solely with humanity’s best interests in mind, superintelligent AI could pose an existential risk if it isn’t perfectly aligned with human values— a task scientists are finding extremely difficult.
04:27
Based on what we know at this point, some experts estimate the anthropogenic existential risk is more than 100 times higher than the background rate of natural risk. But these odds depend heavily on human choices, because most of the risk is from human action, and it’s within human control.
04:47
If we treat safeguarding humanity’s future as the defining issue of our time, we can reduce this risk. Whether humanity fulfils its potential— or not— is in our hands.