The most important century in human history

337,860 views ・ 2023-04-04

TED-Ed



Translator: Justin H  Reviewer: Jacky He
00:07
What's the most important century in human history?
00:11
Some might argue it’s a period of extensive military campaigning,
00:14
like Alexander the Great’s in the 300s BCE,
00:18
which reshaped political and cultural borders.
00:23
Others might cite the emergence of a major religion,
00:27
such as Islam in the 7th century,
00:29
which codified and spread values across such borders.
00:35
Or perhaps it’s the Industrial Revolution of the 1700s
00:38
that transformed global commerce
00:40
and redefined humanity's relationship with labor.
00:44
Whatever the answer, it seems like any century vying for that top spot
00:48
is at a moment of great change—
00:51
when the actions of our ancestors shifted humanity’s trajectory
00:55
for centuries to come.
00:57
So if this is our metric, is it possible that right now—
01:01
this century— is the most important one yet?
01:05
The 21st century has already proven to be a period of rapid technological growth.
01:11
Phones and computers have accelerated the pace of life.
01:14
And we’re likely on the cusp of developing new transformative technologies,
01:18
like advanced artificial intelligence,
01:20
that could entirely change the way people live.
01:25
Meanwhile, many technologies we already have
01:28
contribute to humanity’s unprecedented levels of existential risk—
01:33
that’s the risk of our species going extinct
01:35
or experiencing some kind of disaster that permanently limits
01:39
humanity’s ability to grow and thrive.
01:43
The invention of the atomic bomb marked a major rise in existential risk,
01:48
and since then we’ve only increased the odds against us.
01:52
It’s profoundly difficult to estimate the odds
01:55
of an existential collapse occurring this century.
01:58
Very rough guesses put the risk of existential catastrophe
02:01
due to nuclear winter and climate change at around 0.1%,
02:08
with the odds of a pandemic causing the same kind of collapse
02:11
at a frightening 3%.
02:14
Given that any of these disasters could mean the end of life as we know it,
02:19
these aren’t exactly small figures.
02:21
And it’s possible this century could see the rise of new technologies
02:25
that introduce more existential risks.
02:29
AI experts have a wide range of estimates regarding
02:31
when artificial general intelligence will emerge,
02:34
but according to some surveys, many believe it could happen this century.
02:39
Currently, we have relatively narrow forms of artificial intelligence,
02:43
which are designed to do specific tasks like play chess or recognize faces.
02:49
Even narrow AIs that do creative work are limited to their singular specialty.
02:54
But artificial general intelligences, or AGIs,
02:58
would be able to adapt to and perform any number of tasks,
03:02
quickly outpacing their human counterparts.
03:06
There are a huge variety of guesses about what AGI could look like,
03:11
and what it would mean for humanity to share the Earth
03:14
with another sentient entity.
03:18
AGIs might help us achieve our goals,
03:21
they might regard us as inconsequential,
03:23
or, they might see us as an obstacle to swiftly remove.
03:26
So in terms of existential risk,
03:29
it's imperative the values of this new technology align with our own.
03:33
This is an incredibly difficult philosophical and engineering challenge
03:37
that will require a lot of delicate, thoughtful work.
03:40
Yet, even if we succeed, AGI could still lead to another complicated outcome.
03:46
Let’s imagine an AGI emerges with deep respect for human life
03:50
and a desire to solve all humanity’s troubles.
03:55
But to avoid becoming misaligned,
03:57
it's been developed to be incredibly rigid about its beliefs.
04:01
If these machines became the dominant power on Earth,
04:04
their strict values might become hegemonic,
04:07
locking humanity into one ideology that would be incredibly resistant to change.
04:14
History has taught us that no matter how enlightened
04:17
a civilization thinks they are,
04:19
they are rarely up to the moral standards of later generations.
04:23
And this kind of value lock-in could permanently distort or constrain
04:27
humanity’s moral growth.
04:30
There's a ton of uncertainty around AGI,
04:32
and it’s profoundly difficult to predict how any existential risks
04:36
will play out over the next century.
04:38
It’s also possible that new, more pressing concerns
04:41
might render these risks moot.
04:43
But even if we can't definitively say that ours is the most important century,
04:48
it still seems like the decisions we make might have a major impact
04:52
on humanity’s future.
04:54
So maybe we should all live like the future depends on us—
04:57
because actually, it just might.