Martin Rees: Can we prevent the end of the world?

151,587 views · 2014-08-25

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translated by: SeongHun Gang · Reviewed by: Kyo young Chu
00:12
Ten years ago, I wrote a book which I entitled
00:14
"Our Final Century?" Question mark.
00:17
My publishers cut out the question mark. (Laughter)
00:21
The American publishers changed our title
00:23
to "Our Final Hour."
00:27
Americans like instant gratification and the reverse.
00:30
(Laughter)
00:32
And my theme was this:
00:34
Our Earth has existed for 45 million centuries,
00:38
but this one is special —
00:40
it's the first where one species, ours,
00:43
has the planet's future in its hands.
00:46
Over nearly all of Earth's history,
00:48
threats have come from nature —
00:50
disease, earthquakes, asteroids and so forth —
00:53
but from now on, the worst dangers come from us.
00:59
And it's now not just the nuclear threat;
01:02
in our interconnected world,
01:04
network breakdowns can cascade globally;
01:07
air travel can spread pandemics worldwide within days;
01:11
and social media can spread panic and rumor
01:14
literally at the speed of light.
01:17
We fret too much about minor hazards —
01:21
improbable air crashes, carcinogens in food,
01:25
low radiation doses, and so forth —
01:27
but we and our political masters
01:30
are in denial about catastrophic scenarios.
01:34
The worst have thankfully not yet happened.
01:37
Indeed, they probably won't.
01:39
But if an event is potentially devastating,
01:42
it's worth paying a substantial premium
01:45
to safeguard against it, even if it's unlikely,
01:49
just as we take out fire insurance on our house.
01:54
And as science offers greater power and promise,
01:59
the downside gets scarier too.
02:02
We get ever more vulnerable.
02:05
Within a few decades,
02:06
millions will have the capability
02:09
to misuse rapidly advancing biotech,
02:12
just as they misuse cybertech today.
02:15
Freeman Dyson, in a TED Talk,
02:19
foresaw that children will design and create new organisms
02:22
just as routinely as his generation played with chemistry sets.
02:27
Well, this may be on the science fiction fringe,
02:29
but were even part of his scenario to come about,
02:32
our ecology and even our species
02:35
would surely not survive long unscathed.
02:39
For instance, there are some eco-extremists
02:43
who think that it would be better for the planet,
02:45
for Gaia, if there were far fewer humans.
02:49
What happens when such people have mastered
02:52
synthetic biology techniques
02:54
that will be widespread by 2050?
02:57
And by then, other science fiction nightmares
03:00
may transition to reality:
03:01
dumb robots going rogue,
03:03
or a network that develops a mind of its own
03:06
threatens us all.
03:08
Well, can we guard against such risks by regulation?
03:12
We must surely try, but these enterprises
03:14
are so competitive, so globalized,
03:18
and so driven by commercial pressure,
03:20
that anything that can be done will be done somewhere,
03:23
whatever the regulations say.
03:25
It's like the drug laws — we try to regulate, but can't.
03:28
And the global village will have its village idiots,
03:31
and they'll have a global range.
03:35
So as I said in my book,
03:37
we'll have a bumpy ride through this century.
03:40
There may be setbacks to our society —
03:44
indeed, a 50 percent chance of a severe setback.
03:48
But are there conceivable events
03:51
that could be even worse,
03:53
events that could snuff out all life?
03:56
When a new particle accelerator came online,
03:59
some people anxiously asked,
04:01
could it destroy the Earth or, even worse,
04:03
rip apart the fabric of space?
04:06
Well luckily, reassurance could be offered.
04:09
I and others pointed out that nature
04:11
has done the same experiments
04:13
zillions of times already,
04:16
via cosmic ray collisions.
04:17
But scientists should surely be precautionary
04:20
about experiments that generate conditions
04:23
without precedent in the natural world.
04:25
Biologists should avoid release of potentially devastating
04:29
genetically modified pathogens.
04:32
And by the way, our special aversion
04:35
to the risk of truly existential disasters
04:39
depends on a philosophical and ethical question,
04:42
and it's this:
04:44
Consider two scenarios.
04:46
Scenario A wipes out 90 percent of humanity.
04:51
Scenario B wipes out 100 percent.
04:55
How much worse is B than A?
04:58
Some would say 10 percent worse.
05:01
The body count is 10 percent higher.
05:04
But I claim that B is incomparably worse.
05:07
As an astronomer, I can't believe
05:10
that humans are the end of the story.
05:12
It is five billion years before the sun flares up,
05:15
and the universe may go on forever,
05:18
so post-human evolution,
05:20
here on Earth and far beyond,
05:23
could be as prolonged as the Darwinian process
05:25
that's led to us, and even more wonderful.
05:29
And indeed, future evolution will happen much faster,
05:31
on a technological timescale,
05:33
not a natural selection timescale.
05:36
So we surely, in view of those immense stakes,
05:40
shouldn't accept even a one in a billion risk
05:43
that human extinction would foreclose
05:46
this immense potential.
05:48
Some scenarios that have been envisaged
05:50
may indeed be science fiction,
05:51
but others may be disquietingly real.
05:55
It's an important maxim that the unfamiliar
05:58
is not the same as the improbable,
06:00
and in fact, that's why we at Cambridge University
06:03
are setting up a center to study how to mitigate
06:06
these existential risks.
06:08
It seems it's worthwhile just for a few people
06:11
to think about these potential disasters.
06:14
And we need all the help we can get from others,
06:17
because we are stewards of a precious
06:19
pale blue dot in a vast cosmos,
06:23
a planet with 50 million centuries ahead of it.
06:26
And so let's not jeopardize that future.
06:29
And I'd like to finish with a quote
06:30
from a great scientist called Peter Medawar.
06:34
I quote, "The bells that toll for mankind
06:37
are like the bells of Alpine cattle.
06:40
They are attached to our own necks,
06:42
and it must be our fault if they do not make
06:45
a tuneful and melodious sound."
06:47
Thank you very much.
06:49
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7