What are the most important moral problems of our time? | Will MacAskill

2018-10-03

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: TJ Kim · Review: Yoonyoung Chang

00:12
This is a graph that represents the economic history of human civilization.
[World GDP per capita over the last 200,000 years]
There's not much going on, is there. For the vast majority of human history, pretty much everyone lived on the equivalent of one dollar per day, and not much changed.

00:36
But then, something extraordinary happened: the Scientific and Industrial Revolutions. And the basically flat graph you just saw transforms into this.

00:50
What this graph means is that, in terms of power to change the world, we live in an unprecedented time in human history, and I believe our ethical understanding hasn't yet caught up with this fact.

01:03
The Scientific and Industrial Revolutions transformed both our understanding of the world and our ability to alter it. What we need is an ethical revolution so that we can work out how do we use this tremendous bounty of resources to improve the world.

01:22
For the last 10 years, my colleagues and I have developed a philosophy and research program that we call effective altruism. It tries to respond to these radical changes in our world, uses evidence and careful reasoning to try to answer this question: How can we do the most good?

01:44
Now, there are many issues you've got to address if you want to tackle this problem: whether to do good through your charity or your career or your political engagement, what programs to focus on, who to work with. But what I want to talk about is what I think is the most fundamental problem. Of all the many problems that the world faces, which should we be focused on trying to solve first?

02:10
Now, I'm going to give you a framework for thinking about this question, and the framework is very simple. A problem's higher priority, the bigger, the more easily solvable and the more neglected it is. Bigger is better, because we've got more to gain if we do solve the problem. More easily solvable is better because I can solve the problem with less time or money. And most subtly, more neglected is better, because of diminishing returns. The more resources that have already been invested into solving a problem, the harder it will be to make additional progress.
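
As a rough illustration of this framework (an editorial sketch, not part of the talk), one common way to combine the three factors is multiplicatively, so a problem only scores highly if it is big, tractable, and neglected all at once. The example problems and all scores below are purely hypothetical.

```python
# Minimal sketch of the "big, solvable, neglected" framework.
# The multiplicative form and every number here are illustrative
# assumptions, not figures from the talk.

def priority(scale, solvability, neglectedness):
    """Score a problem on three 0-10 axes; the product ranks problems."""
    return scale * solvability * neglectedness

# Hypothetical problems with made-up scores.
problems = {
    "problem A": (9, 7, 2),   # big, fairly solvable, but already well funded
    "problem B": (6, 8, 9),   # smaller, but cheap to help and badly neglected
    "problem C": (10, 3, 8),  # huge and neglected, harder to make progress on
}

for name, scores in sorted(problems.items(),
                           key=lambda kv: priority(*kv[1]),
                           reverse=True):
    print(f"{name}: priority score {priority(*scores)}")
```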

02:50
Now, the key thing that I want to leave with you is this framework, so that you can think for yourself what are the highest global priorities. But I and others in the effective altruism community have converged on three moral issues that we believe are unusually important, score unusually well in this framework.

03:11
First is global health. This is supersolvable. We have an amazing track record in global health. Rates of death from measles, malaria, diarrheal disease are down by over 70 percent. And in 1980, we eradicated smallpox. I estimate we thereby saved over 60 million lives. That's more lives saved than if we'd achieved world peace in that same time period. On our current best estimates, we can save a life by distributing long-lasting insecticide-treated bed nets for just a few thousand dollars. This is an amazing opportunity.

03:55
The second big priority is factory farming. This is superneglected. There are 50 billion land animals used every year for food, and the vast majority of them are factory farmed, living in conditions of horrific suffering. They're probably among the worst-off creatures on this planet, and in many cases, we could significantly improve their lives for just pennies per animal. Yet this is hugely neglected. There are 3,000 times more animals in factory farms than there are stray pets, but yet, factory farming gets one fiftieth of the philanthropic funding. That means additional resources in this area could have a truly transformative impact.
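
Combining the two ratios in this passage gives a sense of how lopsided the funding is per animal; the following back-of-the-envelope calculation is editorial arithmetic on the talk's stated figures, not a number MacAskill gives.

```python
# Back-of-the-envelope using the talk's two figures:
# 3,000x more animals in factory farms than stray pets,
# yet 1/50th of the philanthropic funding.

animals_ratio = 3_000     # factory-farmed animals per stray pet (talk's figure)
funding_ratio = 1 / 50    # factory-farming funding relative to pets (talk's figure)

# Funding per factory-farmed animal, relative to funding per stray pet.
per_animal_ratio = funding_ratio / animals_ratio
print(f"Roughly 1/{round(1 / per_animal_ratio):,} as much funding per animal")
# -> Roughly 1/150,000 as much funding per animal
```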

04:39
Now the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global pandemic that could permanently derail civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this framework.

05:02
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven billion people on this planet and that means you and everyone you know and love. That's just a tragedy of unimaginable size.

05:25
But then, what's more, it would also mean the curtailment of humanity's future potential, and I believe that humanity's potential is vast. The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million years. If the human race were a single individual, she would be just 10 years old today. And what's more, the human race isn't a typical mammalian species. There's no reason why, if we're careful, we should die off after only two million years. The earth will remain habitable for 500 million years to come. And if someday, we took to the stars, the civilization could continue for billions more.

06:16
So I think the future is going to be really big, but is it going to be good? Is the human race even really worth preserving? Well, we hear all the time about how things have been getting worse, but I think that when we take the long run, things have been getting radically better. Here, for example, is life expectancy over time. Here's the proportion of people not living in extreme poverty. Here's the number of countries over time that have decriminalized homosexuality. Here's the number of countries over time that have become democratic.

06:55
Then, when we look to the future, there could be so much more to gain again. We'll be so much richer, we can solve so many problems that are intractable today. So if this is kind of a graph of how humanity has progressed in terms of total human flourishing over time, well, this is what we would expect future progress to look like. It's vast.

07:18
Here, for example, is where we would expect no one to live in extreme poverty. Here is where we would expect everyone to be better off than the richest person alive today. Perhaps here is where we would discover the fundamental natural laws that govern our world. Perhaps here is where we discover an entirely new form of art, a form of music we currently lack the ears to hear. And this is just the next few thousand years. Once we think past that, well, we can't even imagine the heights that human accomplishment might reach.

07:54
So the future could be very big and it could be very good, but are there ways we could lose this value? And sadly, I think there are.

08:02
The last two centuries brought tremendous technological progress, but they also brought the global risks of nuclear war and the possibility of extreme climate change. When we look to the coming centuries, we should expect to see the same pattern again. And we can see some radically powerful technologies on the horizon. Synthetic biology might give us the power to create viruses of unprecedented contagiousness and lethality. Geoengineering might give us the power to dramatically alter the earth's climate. Artificial intelligence might give us the power to create intelligent agents with abilities greater than our own.

08:40
Now, I'm not saying that any of these risks are particularly likely, but when there's so much at stake, even small probabilities matter a great deal. Imagine if you're getting on a plane and you're kind of nervous, and the pilot reassures you by saying, "There's only a one-in-a-thousand chance of crashing. Don't worry." Would you feel reassured?

09:04
For these reasons, I think that preserving the future of humanity is among the most important problems that we currently face.

09:12
But let's keep using this framework. Is this problem neglected? And I think the answer is yes, and that's because problems that affect future generations are often hugely neglected. Why? Because future people don't participate in markets today. They don't have a vote. It's not like there's a lobby representing the interests of those born in 2300 AD. They don't get to influence the decisions we make today. They're voiceless. And that means we still spend a paltry amount on these issues: nuclear nonproliferation, geoengineering, biorisk, artificial intelligence safety. All of these receive only a few tens of millions of dollars of philanthropic funding every year. That's tiny compared to the 390 billion dollars that's spent on US philanthropy in total.
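
To put that comparison in proportion, here is a small editorial calculation on the talk's figures; "a few tens of millions" is vague, so 50 million dollars is an assumed illustrative value, not a number from the talk.

```python
# Rough proportion implied by the talk's figures.
# 50 million is an assumed stand-in for "a few tens of millions of dollars".

xrisk_funding = 50e6       # assumed annual philanthropic funding for these risks
us_philanthropy = 390e9    # "390 billion dollars ... spent on US philanthropy in total"

share = xrisk_funding / us_philanthropy
print(f"Existential-risk funding is about {share:.4%} of US philanthropy")
# -> about 0.0128%
```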

10:13
The final aspect of our framework then: Is this solvable? I believe it is. You can contribute with your money, your career or your political engagement. With your money, you can support organizations that focus on these risks, like the Nuclear Threat Initiative, which campaigns to take nuclear weapons off hair-trigger alert, or the Blue Ribbon Panel, which develops policy to minimize the damage from natural and man-made pandemics, or the Center for Human-Compatible AI, which does technical research to ensure that AI systems are safe and reliable. With your political engagement, you can vote for candidates that care about these risks, and you can support greater international cooperation. And then with your career, there is so much that you can do. Of course, we need scientists and policymakers and organization leaders, but just as importantly, we also need accountants and managers and assistants to work in these organizations that are tackling these problems.

11:20
Now, the research program of effective altruism is still in its infancy, and there's still a huge amount that we don't know. But even with what we've learned so far, we can see that by thinking carefully and by focusing on those problems that are big, solvable and neglected, we can make a truly tremendous difference to the world for thousands of years to come.

11:47
Thank you.

11:49
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7