The 4 greatest threats to the survival of humanity

TED-Ed ・ 2022-07-19
00:10
In January of 1995, Russia detected a nuclear missile headed its way.
00:16
The alert went all the way to the president,
00:19
who was deciding whether to strike back
00:22
when another system contradicted the initial warning.
00:26
What they thought was the first missile in a massive attack
00:30
was actually a research rocket studying the Northern Lights.
00:35
This incident happened after the end of the Cold War,
00:38
but was nevertheless one of the closest calls weโ€™ve had
00:42
to igniting a global nuclear war.
00:45
With the invention of the atomic bomb,
00:48
humanity gained the power to destroy itself for the first time in our history.
00:53
Since then, our existential riskโ€”
00:56
risk of either extinction
00:58
or the unrecoverable collapse of human civilizationโ€”
01:01
has steadily increased.
01:04
Itโ€™s well within our power to reduce this risk,
01:07
but in order to do so,
01:08
we have to understand which of our activities
01:11
pose existential threats now, and which might in the future.
01:16
So far, our species has survived 2,000 centuries,
01:20
each with some extinction risk from natural causesโ€”
01:24
asteroid impacts, supervolcanoes, and the like.
01:28
Assessing existential risk is an inherently uncertain business
01:32
because usually when we try to figure out how likely something is,
01:36
we check how often it's happened before.
01:38
But the complete destruction of humanity has never happened before.
01:43
While thereโ€™s no perfect method to determine our risk from natural threats,
01:47
experts estimate itโ€™s about 1 in 10,000 per century.
01:52
Nuclear weapons were our first addition to that baseline.
01:56
While there are many risks associated with nuclear weapons,
01:59
the existential risk comes from the possibility
02:02
of a global nuclear war that leads to a nuclear winter,
02:07
where soot from burning cities blocks out the sun for years,
02:11
causing the crops that humanity depends on to fail.
02:15
We haven't had a nuclear war yet,
02:17
but our track record is too short to tell whether such wars are inherently unlikely
02:22
or weโ€™ve simply been lucky.
02:24
We also canโ€™t say for sure
02:25
whether a global nuclear war would cause a nuclear winter so severe
02:30
it would pose an existential threat to humanity.
02:33
The next major addition to our existential risk was climate change.
02:39
Like nuclear war, climate change could result
02:42
in a lot of terrible scenarios that we should be working hard to avoid,
02:46
but that would stop short of causing extinction or unrecoverable collapse.
02:52
We expect a few degrees Celsius of warming,
02:54
but canโ€™t yet completely rule out 6 or even 10 degrees,
02:59
which would cause a calamity of possibly unprecedented proportions.
03:03
Even in this worst-case scenario,
03:05
itโ€™s not clear whether warming would pose a direct existential risk,
03:09
but the disruption it would cause would likely make us more vulnerable
03:13
to other existential risks.
03:15
The greatest risks may come from technologies that are still emerging.
03:20
Take engineered pandemics.
03:22
The biggest catastrophes in human history have been from pandemics.
03:27
And biotechnology is enabling us to modify and create germs
03:30
that could be much more deadly than naturally occurring ones.
03:34
Such germs could cause pandemics through biowarfare and research accidents.
03:39
Decreased costs of genome sequencing and modification,
03:42
along with increased availability of potentially dangerous information
03:46
like the published genomes of deadly viruses,
03:49
also increase the number of people and groups
03:51
who could potentially create such pathogens.
03:55
Another concern is unaligned AI.
03:58
Most AI researchers think this will be the century
04:01
where we develop artificial intelligence that surpasses human abilities
04:05
across the board.
04:06
If we cede this advantage, we place our future in the hands
04:10
of the systems we create.
04:12
Even if created solely with humanityโ€™s best interests in mind,
04:16
superintelligent AI could pose an existential risk
04:20
if it isnโ€™t perfectly aligned with human valuesโ€”
04:23
a task scientists are finding extremely difficult.
04:27
Based on what we know at this point,
04:29
some experts estimate the anthropogenic existential risk
04:33
is more than 100 times higher than the background rate of natural risk.
04:38
But these odds depend heavily on human choices.
04:42
Because most of the risk is from human action, and itโ€™s within human control.
04:47
If we treat safeguarding humanity's future as the defining issue of our time,
04:52
we can reduce this risk.
04:55
Whether humanity fulfils its potentialโ€”
04:57
or notโ€”
05:00
is in our hands.