What are the most important moral problems of our time? | Will MacAskill
2018-10-03
Translator: duan JiGang
Reviewer: Yiyang Piao
00:12
This is a graph that represents the economic history of human civilization.
[World GDP per capita over the last 200,000 years]
00:23
There's not much going on, is there?
00:26
For the vast majority of human history, pretty much everyone lived on the equivalent of one dollar per day, and not much changed.
00:36
But then, something extraordinary happened: the Scientific and Industrial Revolutions. And the basically flat graph you just saw transforms into this.
00:50
What this graph means is that, in terms of power to change the world, we live in an unprecedented time in human history, and I believe our ethical understanding hasn't yet caught up with this fact.
01:03
The Scientific and Industrial Revolutions transformed both our understanding of the world and our ability to alter it. What we need is an ethical revolution so that we can work out how to use this tremendous bounty of resources to improve the world.
01:22
For the last 10 years, my colleagues and I have developed a philosophy and research program that we call effective altruism. It tries to respond to these radical changes in our world, and it uses evidence and careful reasoning to try to answer this question: How can we do the most good?
01:44
Now, there are many issues you've got to address if you want to tackle this problem: whether to do good through your charity or your career or your political engagement, what programs to focus on, who to work with.
01:57
But what I want to talk about is what I think is the most fundamental problem. Of all the many problems that the world faces, which should we be focused on trying to solve first?
02:10
Now, I'm going to give you a framework for thinking about this question, and the framework is very simple: a problem is higher priority the bigger it is, the more easily solvable it is, and the more neglected it is.
02:24
Bigger is better, because we've got more to gain if we do solve the problem. More easily solvable is better, because I can solve the problem with less time or money. And most subtly, more neglected is better, because of diminishing returns: the more resources that have already been invested into solving a problem, the harder it will be to make additional progress.
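To make the heuristic concrete, here is a minimal Python sketch of how the three criteria might be combined into a single score. The 0-10 scales, the multiplicative rule, and the example scores are all illustrative assumptions, not figures from the talk.

```python
# A toy version of the size / solvability / neglectedness heuristic.
# Scales, weights, and example scores are illustrative assumptions.

def priority(size: float, solvability: float, neglectedness: float) -> float:
    """Combine the three criteria; higher means higher priority.

    Each argument is a rough 0-10 judgment. Multiplying them captures
    the idea that scoring near zero on any one criterion drags the
    whole priority down.
    """
    return size * solvability * neglectedness

# Hypothetical judgments for the three cause areas discussed later:
causes = {
    "global health":    priority(size=7, solvability=9, neglectedness=4),
    "factory farming":  priority(size=6, solvability=6, neglectedness=9),
    "existential risk": priority(size=10, solvability=5, neglectedness=9),
}

for name, score in sorted(causes.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score}")
```

Multiplying rather than adding is one way to encode the diminishing-returns point: as a problem becomes less neglected, its neglectedness score, and with it the whole product, falls.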
02:50
Now, the key thing that I want to leave with you is this framework, so that you can think for yourself about what the highest global priorities are. But I and others in the effective altruism community have converged on three moral issues that we believe are unusually important and score unusually well in this framework.
03:11
First is global health. This is supersolvable. We have an amazing track record in global health. Rates of death from measles, malaria, diarrheal disease are down by over 70 percent. And in 1980, we eradicated smallpox. I estimate we thereby saved over 60 million lives. That's more lives saved than if we'd achieved world peace in that same time period.
03:43
On our current best estimates, we can save a life by distributing long-lasting insecticide-treated bed nets for just a few thousand dollars. This is an amazing opportunity.
03:55
The second big priority is factory farming. This is superneglected. There are 50 billion land animals used every year for food, and the vast majority of them are factory farmed, living in conditions of horrific suffering. They're probably among the worst-off creatures on this planet, and in many cases, we could significantly improve their lives for just pennies per animal.
04:19
Yet this is hugely neglected. There are 3,000 times more animals in factory farms than there are stray pets, and yet factory farming gets one fiftieth of the philanthropic funding. That means additional resources in this area could have a truly transformative impact.
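Combining the two ratios just quoted gives a sense of how stark the per-animal neglect is. The sketch below is a derived illustration, on the assumption that "one fiftieth of the philanthropic funding" is measured against the funding that stray pets receive, which is how the comparison reads.

```python
# Per-animal funding gap implied by the two ratios in the talk.
# The talk states only the 3,000x population ratio and the 1/50
# funding ratio; combining them per animal is a derived illustration.

animal_ratio = 3_000     # factory-farmed animals vs. stray pets
funding_ratio = 1 / 50   # factory farming's funding vs. stray pets'

per_animal_gap = animal_ratio / funding_ratio
print(f"{per_animal_gap:,.0f}")  # 150,000 -> each factory-farmed animal
                                 # gets ~150,000x less funding than a pet
```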
04:39
Now the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global pandemic that could permanently derail civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this framework.
05:02
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven billion people on this planet, and that means you and everyone you know and love. That's just a tragedy of unimaginable size.
05:25
But then, what's more, it would also mean the curtailment of humanity's future potential, and I believe that humanity's potential is vast.
05:35
The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million years. If the human race were a single individual, she would be just 10 years old today.
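As a quick check on the arithmetic behind that analogy (the 100-year reference lifespan used for scaling is an assumption, not a figure from the talk):

```python
# Scaling humanity's age to a single human life.
# The 100-year reference lifespan is an assumption for illustration.

species_age = 200_000         # years the human race has existed (per the talk)
species_lifespan = 2_000_000  # typical mammalian species lifespan (per the talk)
human_lifespan = 100          # assumed reference lifespan, in years

age_as_individual = species_age / species_lifespan * human_lifespan
print(age_as_individual)  # 10.0 -> "just 10 years old today"
```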
05:53
And what's more, the human race isn't a typical mammalian species. There's no reason why, if we're careful, we should die off after only two million years. The earth will remain habitable for 500 million years to come. And if someday we took to the stars, civilization could continue for billions more.
06:16
So I think the future is going to be really big, but is it going to be good? Is the human race even really worth preserving?
06:26
Well, we hear all the time about how things have been getting worse, but I think that when we take the long run, things have been getting radically better. Here, for example, is life expectancy over time. Here's the proportion of people not living in extreme poverty. Here's the number of countries over time that have decriminalized homosexuality. Here's the number of countries over time that have become democratic.
06:55
Then, when we look to the future, there could be so much more to gain again. We'll be so much richer, and we'll be able to solve so many problems that are intractable today.
07:05
So if this is kind of a graph of how humanity has progressed in terms of total human flourishing over time, well, this is what we would expect future progress to look like. It's vast.
07:18
Here, for example, is where we would expect no one to live in extreme poverty. Here is where we would expect everyone to be better off than the richest person alive today. Perhaps here is where we would discover the fundamental natural laws that govern our world. Perhaps here is where we discover an entirely new form of art, a form of music we currently lack the ears to hear.
07:45
And this is just the next few thousand years. Once we think past that, well, we can't even imagine the heights that human accomplishment might reach. So the future could be very big and it could be very good, but are there ways we could lose this value? And sadly, I think there are.
08:02
The last two centuries brought tremendous technological progress, but they also brought the global risks of nuclear war and the possibility of extreme climate change. When we look to the coming centuries, we should expect to see the same pattern again. And we can see some radically powerful technologies on the horizon. Synthetic biology might give us the power to create viruses of unprecedented contagiousness and lethality. Geoengineering might give us the power to dramatically alter the earth's climate. Artificial intelligence might give us the power to create intelligent agents with abilities greater than our own.
08:40
Now, I'm not saying that any of these risks are particularly likely, but when there's so much at stake, even small probabilities matter a great deal.
08:49
Imagine if you're getting on a plane and you're kind of nervous, and the pilot reassures you by saying, "There's only a one-in-a-thousand chance of crashing. Don't worry." Would you feel reassured?
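A short expected-value calculation shows why those odds stop being reassuring at civilizational scale. This sketch applies the pilot's one-in-a-thousand figure to the seven billion lives mentioned earlier; pairing the two numbers is an illustrative assumption, not a claim from the talk.

```python
# Expected losses when a small probability meets a very large stake.
# Applying the 1/1000 figure to an existential catastrophe is an
# illustrative assumption; both numbers appear separately in the talk.

p_catastrophe = 1 / 1000   # the pilot's "reassuring" odds
lives_at_stake = 7e9       # everyone alive today

expected_loss = p_catastrophe * lives_at_stake
print(f"Expected loss: {expected_loss:,.0f} lives")  # 7,000,000

# In expectation, a one-in-a-thousand risk at this scale is as bad
# as a certain disaster that kills seven million people.
```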
09:04
For these reasons, I think that preserving the future of humanity is among the most important problems that we currently face.
09:12
But let's keep using this framework. Is this problem neglected? And I think the answer is yes, and that's because problems that affect future generations are often hugely neglected. Why? Because future people don't participate in markets today. They don't have a vote. It's not like there's a lobby representing the interests of those born in 2300 AD. They don't get to influence the decisions we make today. They're voiceless.
09:46
And that means we still spend a paltry amount on these issues: nuclear nonproliferation, geoengineering, biorisk, artificial intelligence safety. All of these receive only a few tens of millions of dollars of philanthropic funding every year. That's tiny compared to the 390 billion dollars that's spent on US philanthropy in total.
10:13
The final aspect of our framework, then: is this solvable? I believe it is. You can contribute with your money, your career, or your political engagement.
10:28
With your money, you can support organizations that focus on these risks, like the Nuclear Threat Initiative, which campaigns to take nuclear weapons off hair-trigger alert; or the Blue Ribbon Panel, which develops policy to minimize the damage from natural and man-made pandemics; or the Center for Human-Compatible AI, which does technical research to ensure that AI systems are safe and reliable.
10:52
With your political engagement, you can vote for candidates that care about these risks, and you can support greater international cooperation.
11:01
And then with your career, there is so much that you can do. Of course, we need scientists and policymakers and organization leaders, but just as importantly, we also need accountants and managers and assistants to work in these organizations that are tackling these problems.
11:20
Now, the research program of effective altruism is still in its infancy, and there's still a huge amount that we don't know. But even with what we've learned so far, we can see that by thinking carefully and by focusing on those problems that are big, solvable and neglected, we can make a truly tremendous difference to the world for thousands of years to come.
11:47
Thank you.
11:49
(Applause)