What are the most important moral problems of our time? | Will MacAskill

TED ・ 2018-10-03

00:12
This is a graph that represents the economic history of human civilization. [World GDP per capita over the last 200,000 years]

00:23
There's not much going on, is there?
00:26
For the vast majority of human history, pretty much everyone lived on the equivalent of one dollar per day, and not much changed.
00:36
But then, something extraordinary happened: the Scientific and Industrial Revolutions. And the basically flat graph you just saw transforms into this.
00:50
What this graph means is that, in terms of power to change the world, we live in an unprecedented time in human history, and I believe our ethical understanding hasn't yet caught up with this fact.
01:03
The Scientific and Industrial Revolutions transformed both our understanding of the world and our ability to alter it. What we need is an ethical revolution so that we can work out how we use this tremendous bounty of resources to improve the world.
01:22
For the last 10 years, my colleagues and I have developed a philosophy and research program that we call effective altruism. It tries to respond to these radical changes in our world, and uses evidence and careful reasoning to try to answer this question: How can we do the most good?
01:44
Now, there are many issues you've got to address if you want to tackle this problem: whether to do good through your charity or your career or your political engagement, what programs to focus on, who to work with.
01:57
But what I want to talk about is what I think is the most fundamental problem. Of all the many problems that the world faces, which should we be focused on trying to solve first?
02:10
Now, I'm going to give you a framework for thinking about this question, and the framework is very simple: a problem is a higher priority the bigger, the more easily solvable and the more neglected it is.
02:24
Bigger is better, because we've got more to gain if we do solve the problem. More easily solvable is better, because we can solve the problem with less time or money. And most subtly, more neglected is better, because of diminishing returns: the more resources that have already been invested into solving a problem, the harder it will be to make additional progress.
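To make this framework concrete, here is a minimal sketch of how the three criteria could be combined into a single ranking. It is an illustration only, not a method from the talk: the problem names echo the talk, but the 1-to-10 scores and the multiplicative weighting are hypothetical assumptions.

    # Hypothetical illustration of the size / solvability / neglectedness framework.
    # The 1-10 scores are invented for this sketch, not figures from the talk.
    problems = {
        "global health":     {"size": 7,  "solvability": 9, "neglectedness": 4},
        "factory farming":   {"size": 6,  "solvability": 7, "neglectedness": 9},
        "existential risks": {"size": 10, "solvability": 4, "neglectedness": 9},
    }

    def priority(scores):
        # Multiplying captures the logic of the framework: a problem ranks
        # highly only if it does well on all three criteria at once.
        return scores["size"] * scores["solvability"] * scores["neglectedness"]

    for name, scores in sorted(problems.items(), key=lambda kv: -priority(kv[1])):
        print(f"{name}: priority score {priority(scores)}")

Multiplication rather than addition means a problem that is enormous but intractable, or tractable but already saturated with resources, falls down the ranking.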
02:50
Now, the key thing that I want to leave with you is this framework, so that you can think for yourself what are the highest global priorities. But I and others in the effective altruism community have converged on three moral issues that we believe are unusually important and score unusually well in this framework.
03:11
First is global health. This is supersolvable. We have an amazing track record in global health. Rates of death from measles, malaria, diarrheal disease are down by over 70 percent.

03:29
And in 1980, we eradicated smallpox. I estimate we thereby saved over 60 million lives. That's more lives saved than if we'd achieved world peace in that same time period.
03:43
On our current best estimates, we can save a life by distributing long-lasting insecticide-treated bed nets for just a few thousand dollars. This is an amazing opportunity.
03:55
The second big priority is factory farming. This is superneglected. There are 50 billion land animals used every year for food, and the vast majority of them are factory farmed, living in conditions of horrific suffering. They're probably among the worst-off creatures on this planet, and in many cases, we could significantly improve their lives for just pennies per animal.

04:19
Yet this is hugely neglected. There are 3,000 times more animals in factory farms than there are stray pets, and yet factory farming gets one fiftieth of the philanthropic funding.

04:34
That means additional resources in this area could have a truly transformative impact.
04:39
Now the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global pandemic that could permanently derail civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this framework.
05:02
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven billion people on this planet, and that means you and everyone you know and love. That's just a tragedy of unimaginable size.
05:25
But then, what's more, it would also mean the curtailment of humanity's future potential, and I believe that humanity's potential is vast.
05:35
The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million years. If the human race were a single individual, she would be just 10 years old today.
05:53
And what's more, the human race isn't a typical mammalian species. There's no reason why, if we're careful, we should die off after only two million years. The earth will remain habitable for 500 million years to come. And if someday we took to the stars, civilization could continue for billions more.
06:16
So I think the future is going to be really big, but is it going to be good? Is the human race even really worth preserving?

06:26
Well, we hear all the time about how things have been getting worse, but I think that when we take the long view, things have been getting radically better.
06:37
Here, for example, is life expectancy over time. Here's the proportion of people not living in extreme poverty. Here's the number of countries over time that have decriminalized homosexuality. Here's the number of countries over time that have become democratic.
06:55
Then, when we look to the future, there could be so much more to gain again. We'll be so much richer, and we can solve so many problems that are intractable today.
07:05
So if this is kind of a graph of how humanity has progressed in terms of total human flourishing over time, well, this is what we would expect future progress to look like. It's vast.

07:18
Here, for example, is where we would expect no one to live in extreme poverty. Here is where we would expect everyone to be better off than the richest person alive today. Perhaps here is where we would discover the fundamental natural laws that govern our world. Perhaps here is where we discover an entirely new form of art, a form of music we currently lack the ears to hear.
07:45
And this is just the next few thousand years. Once we think past that, well, we can't even imagine the heights that human accomplishment might reach. So the future could be very big and it could be very good, but are there ways we could lose this value?
08:00
And sadly, I think there are. The last two centuries brought tremendous technological progress, but they also brought the global risks of nuclear war and the possibility of extreme climate change. When we look to the coming centuries, we should expect to see the same pattern again.

08:16
And we can see some radically powerful technologies on the horizon. Synthetic biology might give us the power to create viruses of unprecedented contagiousness and lethality. Geoengineering might give us the power to dramatically alter the earth's climate. Artificial intelligence might give us the power to create intelligent agents with abilities greater than our own.
08:40
Now, I'm not saying that any of these risks are particularly likely, but when there's so much at stake, even small probabilities matter a great deal. Imagine if you're getting on a plane and you're kind of nervous, and the pilot reassures you by saying, "There's only a one-in-a-thousand chance of crashing. Don't worry." Would you feel reassured?
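To see why small probabilities still matter at these stakes, here is a minimal expected-value sketch. It is an illustration, not a calculation from the talk: the plane uses the pilot's one-in-a-thousand figure with an assumed passenger count, and the catastrophe probability is a purely hypothetical stand-in.

    # Illustration: expected loss = probability of the loss x size of the loss.
    passengers = 300                    # assumed size of a full passenger jet
    crash_probability = 1 / 1000        # the pilot's "reassuring" figure
    print(passengers * crash_probability)              # 0.3 expected deaths per flight

    world_population = 7_000_000_000    # "all seven billion people on this planet"
    catastrophe_probability = 1 / 1000  # hypothetical, for illustration only
    print(world_population * catastrophe_probability)  # 7 million expected deaths

Even at one in a thousand, the expected toll is seven million lives, which is why the talk treats low-probability existential risks as a top priority.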
09:04
For these reasons, I think that preserving the future of humanity is among the most important problems that we currently face.
09:12
But let's keep using this framework. Is this problem neglected? And I think the answer is yes, and that's because problems that affect future generations are often hugely neglected.

09:26
Why? Because future people don't participate in markets today. They don't have a vote. It's not like there's a lobby representing the interests of those born in 2300 AD. They don't get to influence the decisions we make today. They're voiceless.
09:46
And that means we still spend a paltry amount on these issues: nuclear nonproliferation, geoengineering, biorisk, artificial intelligence safety. All of these receive only a few tens of millions of dollars of philanthropic funding every year. That's tiny compared to the 390 billion dollars that's spent on US philanthropy in total.
10:13
The final aspect of our framework, then: Is this solvable? I believe it is. You can contribute with your money, your career or your political engagement.
10:28
With your money, you can support organizations that focus on these risks, like the Nuclear Threat Initiative, which campaigns to take nuclear weapons off hair-trigger alert, or the Blue Ribbon Panel, which develops policy to minimize the damage from natural and man-made pandemics, or the Center for Human-Compatible AI, which does technical research to ensure that AI systems are safe and reliable.
10:52
With your political engagement, you can vote for candidates that care about these risks, and you can support greater international cooperation.
11:01
And then with your career, there is so much that you can do. Of course, we need scientists and policymakers and organization leaders, but just as importantly, we also need accountants and managers and assistants to work in these organizations that are tackling these problems.
11:20
Now, the research program of effective altruism is still in its infancy, and there's still a huge amount that we don't know. But even with what we've learned so far, we can see that by thinking carefully and by focusing on those problems that are big, solvable and neglected, we can make a truly tremendous difference to the world for thousands of years to come.
11:47
Thank you.

11:49
(Applause)