Translator: Lilian Chiu
Reviewer: Helen Chang
00:07
What’s the most important century in human history?
00:11
Some might argue it’s a period of extensive military campaigning,
00:14
like Alexander the Great’s in the 300s BCE,
00:18
which reshaped political and cultural borders.
00:23
Others might cite the emergence of a major religion,
00:27
such as Islam in the 7th century,
00:29
which codified and spread values across such borders.
00:35
Or perhaps it’s the Industrial Revolution of the 1700s
00:38
that transformed global commerce
00:40
and redefined humanity’s relationship with labor.
00:44
Whatever the answer, it seems like any century vying for that top spot
00:48
is at a moment of great change—
00:51
when the actions of our ancestors shifted humanity’s trajectory
00:55
for centuries to come.
00:57
So if this is our metric, is it possible that right now—
01:01
this century—is the most important one yet?
01:05
The 21st century has already proven to be a period of rapid technological growth.
01:11
Phones and computers have accelerated the pace of life.
01:14
And we’re likely on the cusp of developing new transformative technologies,
01:18
like advanced artificial intelligence,
01:20
that could entirely change the way people live.
01:25
Meanwhile, many technologies we already have
01:28
contribute to humanity’s unprecedented levels of existential risk—
01:33
that’s the risk of our species going extinct
01:35
or experiencing some kind of disaster that permanently limits
01:39
humanity’s ability to grow and thrive.
01:43
The invention of the atomic bomb marked a major rise in existential risk,
01:48
and since then we’ve only increased the odds against us.
01:52
It’s profoundly difficult to estimate the odds
01:55
of an existential collapse occurring this century.
01:58
Very rough guesses put the risk of existential catastrophe
02:01
due to nuclear winter and climate change at around 0.1%,
02:08
with the odds of a pandemic causing the same kind of collapse
02:11
at a frightening 3%.
02:14
Given that any of these disasters could mean the end of life as we know it,
02:19
these aren’t exactly small figures.
02:21
And it’s possible this century could see the rise of new technologies
02:25
that introduce more existential risks.
02:29
AI experts have a wide range of estimates regarding
02:31
when artificial general intelligence will emerge,
02:34
but according to some surveys, many believe it could happen this century.
02:39
Currently, we have relatively narrow forms of artificial intelligence,
02:43
which are designed to do specific tasks like play chess or recognize faces.
02:49
Even narrow AIs that do creative work are limited to their singular specialty.
02:54
But artificial general intelligences, or AGIs,
02:58
would be able to adapt to and perform any number of tasks,
03:02
quickly outpacing their human counterparts.
03:06
There are a huge variety of guesses about what AGI could look like,
03:11
and what it would mean for humanity to share the Earth
03:14
with another sentient entity.
03:18
AGIs might help us achieve our goals,
03:21
they might regard us as inconsequential,
03:23
or they might see us as an obstacle to swiftly remove.
03:26
So in terms of existential risk,
03:29
it’s imperative the values of this new technology align with our own.
03:33
This is an incredibly difficult philosophical and engineering challenge
03:37
that will require a lot of delicate, thoughtful work.
03:40
Yet, even if we succeed, AGI could still lead to another complicated outcome.
03:46
Let’s imagine an AGI emerges with deep respect for human life
03:50
and a desire to solve all humanity’s troubles.
03:55
But to avoid becoming misaligned,
03:57
it’s been developed to be incredibly rigid about its beliefs.
04:01
If these machines became the dominant power on Earth,
04:04
their strict values might become hegemonic,
04:07
locking humanity into one ideology that would be incredibly resistant to change.
04:14
History has taught us that no matter how enlightened
04:17
a civilization thinks they are,
04:19
they are rarely up to the moral standards of later generations.
04:23
And this kind of value lock-in could permanently distort or constrain
04:27
humanity’s moral growth.
04:30
There’s a ton of uncertainty around AGI,
04:32
and it’s profoundly difficult to predict how any existential risks
04:36
will play out over the next century.
04:38
It’s also possible that new, more pressing concerns
04:41
might render these risks moot.
04:43
But even if we can’t definitively say that ours is the most important century,
04:48
it still seems like the decisions we make might have a major impact
04:52
on humanity’s future.
04:54
So maybe we should all live like the future depends on us—
04:57
because actually, it just might.