What are the most important moral problems of our time? | Will MacAskill

TED ・ 2018-10-03

This is a graph that represents the economic history of human civilization. [World GDP per capita over the last 200,000 years] There's not much going on, is there? For the vast majority of human history, pretty much everyone lived on the equivalent of one dollar per day, and not much changed. But then, something extraordinary happened: the Scientific and Industrial Revolutions. And the basically flat graph you just saw transforms into this.
What this graph means is that, in terms of power to change the world, we live in an unprecedented time in human history, and I believe our ethical understanding hasn't yet caught up with this fact. The Scientific and Industrial Revolutions transformed both our understanding of the world and our ability to alter it. What we need is an ethical revolution, so that we can work out how to use this tremendous bounty of resources to improve the world.
For the last 10 years, my colleagues and I have developed a philosophy and research program that we call effective altruism. It tries to respond to these radical changes in our world, and it uses evidence and careful reasoning to try to answer this question: How can we do the most good?
Now, there are many issues you've got to address if you want to tackle this problem: whether to do good through your charity or your career or your political engagement, what programs to focus on, who to work with. But what I want to talk about is what I think is the most fundamental problem. Of all the many problems that the world faces, which should we be focused on trying to solve first?
Now, I'm going to give you a framework for thinking about this question, and the framework is very simple. A problem is higher priority the bigger it is, the more easily solvable it is, and the more neglected it is. Bigger is better, because we've got more to gain if we do solve the problem. More easily solvable is better, because we can solve the problem with less time or money. And, most subtly, more neglected is better, because of diminishing returns: the more resources that have already been invested into solving a problem, the harder it will be to make additional progress.
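To make the framework concrete, here is a minimal sketch of it as a multiplicative score. The scoring function, the 0-10 scales, and every rating below are illustrative assumptions for the three cause areas discussed next, not figures from the talk.

```python
# A toy rendering of the bigger / more solvable / more neglected framework.
# The multiplicative form and all ratings are illustrative assumptions,
# not numbers given in the talk.

def priority_score(scale: float, solvability: float, neglectedness: float) -> float:
    """Rate each factor roughly 0-10; a higher product means higher priority."""
    return scale * solvability * neglectedness

# Hypothetical ratings for the three cause areas discussed in the talk.
causes = {
    "global health": (7, 9, 4),
    "factory farming": (6, 7, 9),
    "existential risk": (10, 5, 9),
}

# Print the causes from highest to lowest toy priority score.
for name, factors in sorted(causes.items(), key=lambda kv: -priority_score(*kv[1])):
    print(f"{name}: {priority_score(*factors)}")
```

The multiplication captures the intuition that the three factors compound: a problem that is huge but already crowded, or tractable but tiny, scores lower than one that does reasonably well on all three.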
Now, the key thing that I want to leave with you is this framework, so that you can think for yourself about what the highest global priorities are. But I and others in the effective altruism community have converged on three moral issues that we believe are unusually important and score unusually well in this framework.
First is global health. This is supersolvable. We have an amazing track record in global health. Rates of death from measles, malaria and diarrheal disease are down by over 70 percent. And in 1980, we eradicated smallpox. I estimate we thereby saved over 60 million lives. That's more lives saved than if we'd achieved world peace in that same time period. On our current best estimates, we can save a life by distributing long-lasting insecticide-treated bed nets for just a few thousand dollars. This is an amazing opportunity.
The second big priority is factory farming. This is superneglected. There are 50 billion land animals used every year for food, and the vast majority of them are factory farmed, living in conditions of horrific suffering. They're probably among the worst-off creatures on this planet, and in many cases we could significantly improve their lives for just pennies per animal. Yet this is hugely neglected. There are 3,000 times more animals in factory farms than there are stray pets, and yet factory farming gets one fiftieth of the philanthropic funding. That means additional resources in this area could have a truly transformative impact.
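Those two figures imply a striking per-animal funding gap. The back-of-the-envelope check below assumes, as the juxtaposition suggests, that the "one fiftieth" compares factory-farming funding with philanthropy directed at stray pets.

```python
# Back-of-the-envelope arithmetic from the two figures in the talk.
# Assumes the "one fiftieth" compares factory farming with stray-pet philanthropy.
animal_ratio = 3_000     # factory-farmed animals per stray pet
funding_ratio = 1 / 50   # factory-farming funding relative to stray pets

# Funding per factory-farmed animal, relative to funding per stray pet.
per_animal = funding_ratio / animal_ratio
print(f"1/{1 / per_animal:,.0f}")  # prints 1/150,000
```

On these assumptions, each factory-farmed animal receives roughly one 150,000th of the philanthropic funding that a stray pet does.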
Now, the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global pandemic that could permanently derail civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this framework.
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven billion people on this planet, and that means you and everyone you know and love. That's just a tragedy of unimaginable size.
But then, what's more, it would also mean the curtailment of humanity's future potential, and I believe that humanity's potential is vast. The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million years. If the human race were a single individual, she would be just 10 years old today. And what's more, the human race isn't a typical mammalian species. There's no reason why, if we're careful, we should die off after only two million years. The earth will remain habitable for 500 million years to come. And if someday we took to the stars, civilization could continue for billions more.
So I think the future is going to be really big, but is it going to be good? Is the human race even really worth preserving? Well, we hear all the time about how things have been getting worse, but I think that when we take the long view, things have been getting radically better. Here, for example, is life expectancy over time. Here's the proportion of people not living in extreme poverty. Here's the number of countries over time that have decriminalized homosexuality. Here's the number of countries over time that have become democratic.
Then, when we look to the future, there could be so much more to gain again. We'll be so much richer, and we can solve so many problems that are intractable today. So if this is kind of a graph of how humanity has progressed in terms of total human flourishing over time, well, this is what we would expect future progress to look like. It's vast.

Here, for example, is where we would expect no one to live in extreme poverty. Here is where we would expect everyone to be better off than the richest person alive today. Perhaps here is where we would discover the fundamental natural laws that govern our world. Perhaps here is where we discover an entirely new form of art, a form of music we currently lack the ears to hear. And this is just the next few thousand years. Once we think past that, well, we can't even imagine the heights that human accomplishment might reach.
So the future could be very big and it could be very good, but are there ways we could lose this value? And sadly, I think there are. The last two centuries brought tremendous technological progress, but they also brought the global risks of nuclear war and the possibility of extreme climate change. When we look to the coming centuries, we should expect to see the same pattern again. And we can see some radically powerful technologies on the horizon. Synthetic biology might give us the power to create viruses of unprecedented contagiousness and lethality. Geoengineering might give us the power to dramatically alter the earth's climate. Artificial intelligence might give us the power to create intelligent agents with abilities greater than our own.
Now, I'm not saying that any of these risks are particularly likely, but when there's so much at stake, even small probabilities matter a great deal. Imagine you're getting on a plane and you're kind of nervous, and the pilot reassures you by saying, "There's only a one-in-a-thousand chance of crashing. Don't worry." Would you feel reassured?
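The force of "even small probabilities matter" is ordinary expected-value arithmetic. The sketch below combines the talk's one-in-a-thousand figure with its seven billion figure; pairing them this way is an illustrative assumption, not a calculation the speaker spells out.

```python
# Expected-value arithmetic behind "even small probabilities matter".
# Both numbers appear in the talk; combining them is an illustrative
# assumption, not the speaker's explicit calculation.
probability = 1 / 1000            # the pilot's "one-in-a-thousand" chance
lives_at_stake = 7_000_000_000    # everyone alive on the planet

expected_loss = probability * lives_at_stake
print(f"{expected_loss:,.0f} lives lost in expectation")  # 7,000,000
```

Even at one in a thousand, the expected toll is seven million lives, which is why a reassurance that might sound small from a pilot is unacceptable at the scale of civilization.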
For these reasons, I think that preserving the future of humanity is among the most important problems that we currently face. But let's keep using this framework. Is this problem neglected?
And I think the answer is yes, and that's because problems that affect future generations are often hugely neglected. Why? Because future people don't participate in markets today. They don't have a vote. It's not like there's a lobby representing the interests of those born in 2300 AD. They don't get to influence the decisions we make today. They're voiceless.
And that means we still spend a paltry amount on these issues: nuclear nonproliferation, geoengineering, biorisk, artificial intelligence safety. All of these receive only a few tens of millions of dollars of philanthropic funding every year. That's tiny compared to the 390 billion dollars spent on US philanthropy in total.
The final aspect of our framework, then: Is this solvable? I believe it is. You can contribute with your money, your career or your political engagement.
With your money, you can support organizations that focus on these risks, like the Nuclear Threat Initiative, which campaigns to take nuclear weapons off hair-trigger alert; or the Blue Ribbon Panel, which develops policy to minimize the damage from natural and man-made pandemics; or the Center for Human-Compatible AI, which does technical research to ensure that AI systems are safe and reliable.
With your political engagement, you can vote for candidates that care about these risks, and you can support greater international cooperation. And then with your career, there is so much that you can do. Of course, we need scientists and policymakers and organization leaders, but just as importantly, we also need accountants and managers and assistants to work in these organizations that are tackling these problems.
Now, the research program of effective altruism is still in its infancy, and there's still a huge amount that we don't know. But even with what we've learned so far, we can see that by thinking carefully and by focusing on those problems that are big, solvable and neglected, we can make a truly tremendous difference to the world for thousands of years to come.
Thank you.

(Applause)