How civilization could destroy itself -- and 4 ways we could prevent it | Nick Bostrom

154,274 views

2020-01-17 · TED



Translation: Hajeong Kang · Review: DK Kim
00:13
Chris Anderson: Nick Bostrom. So, you have already given us so many crazy ideas out there. I think a couple of decades ago, you made the case that we might all be living in a simulation, or perhaps probably were. More recently, you've painted the most vivid examples of how artificial general intelligence could go horribly wrong. And now this year, you're about to publish a paper that presents something called the vulnerable world hypothesis. And our job this evening is to give the illustrated guide to that. So let's do that. What is that hypothesis?

00:52
Nick Bostrom: It's trying to think about a sort of structural feature of the current human condition. You like the urn metaphor, so I'm going to use that to explain it. So picture a big urn filled with balls representing ideas, methods, possible technologies. You can think of the history of human creativity as the process of reaching into this urn and pulling out one ball after another, and the net effect so far has been hugely beneficial, right? We've extracted a great many white balls, some various shades of gray, mixed blessings. We haven't so far pulled out the black ball -- a technology that invariably destroys the civilization that discovers it. So the paper tries to think about what could such a black ball be.

01:43
CA: So you define that ball as one that would inevitably bring about civilizational destruction.

01:48
NB: Unless we exit what I call the semi-anarchic default condition. But sort of, by default.

01:56
CA: So, you make the case compelling by showing some sort of counterexamples where you believe that so far we've actually got lucky, that we might have pulled out that death ball without even knowing it. So there's this quote, what's this quote?
["The end of astrophysics is the hydrogen bomb." -- Newton (no, actually) Nick Bostrom]

02:12
NB: Well, I guess it's just meant to illustrate the difficulty of foreseeing what basic discoveries will lead to. We just don't have that capability. Because we have become quite good at pulling out balls, but we don't really have the ability to put the ball back into the urn, right. We can invent, but we can't un-invent. So our strategy, such as it is, is to hope that there is no black ball in the urn.

02:38
CA: So once it's out, it's out, and you can't put it back in, and you think we've been lucky. So talk through a couple of these examples. You talk about different types of vulnerability.

02:49
NB: So the easiest type to understand is a technology that just makes it very easy to cause massive amounts of destruction.
[Anyone can easily acquire means of mass destruction]
02:59
Synthetic biology might be a fecund source of that kind of black ball, but many other possible things we could -- think of geoengineering, really great, right? We could combat global warming, but you don't want it to get too easy either, you don't want any random person and his grandmother to have the ability to radically alter the earth's climate. Or maybe lethal autonomous drones, mass-produced, mosquito-sized killer bot swarms. Nanotechnology, artificial general intelligence.

03:29
CA: You argue in the paper that it's a matter of luck that when we discovered that nuclear power could create a bomb, it might have been the case that you could have created a bomb with much easier resources, accessible to anyone.

03:43
NB: Yeah, so think back to the 1930s where for the first time we make some breakthroughs in nuclear physics, some genius figures out that it's possible to create a nuclear chain reaction and then realizes that this could lead to the bomb. And we do some more work, it turns out that what you require to make a nuclear bomb is highly enriched uranium or plutonium, which are very difficult materials to get. You need ultracentrifuges, you need reactors, like, massive amounts of energy.
04:14
But suppose it had turned out instead there had been an easy way to unlock the energy of the atom. That maybe by baking sand in the microwave oven or something like that you could have created a nuclear detonation. So we know that that's physically impossible. But before you did the relevant physics how could you have known how it would turn out?

04:32
CA: Although, couldn't you argue that for life to evolve on Earth that implied sort of stable environment, that if it was possible to create massive nuclear reactions relatively easy, the Earth would never have been stable, that we wouldn't be here at all.

04:47
NB: Yeah, unless there were something that is easy to do on purpose but that wouldn't happen by random chance. So, like things we can easily do, we can stack 10 blocks on top of one another, but in nature, you're not going to find, like, a stack of 10 blocks.

05:00
CA: OK, so this is probably the one that many of us worry about most, and yes, synthetic biology is perhaps the quickest route that we can foresee in our near future to get us here.

05:10
NB: Yeah, and so think about what that would have meant if, say, anybody by working in their kitchen for an afternoon could destroy a city. It's hard to see how modern civilization as we know it could have survived that. Because in any population of a million people, there will always be some who would, for whatever reason, choose to use that destructive power. So if that apocalyptic residual would choose to destroy a city, or worse, then cities would get destroyed.

05:38
CA: So here's another type of vulnerability. Talk about this.
[2A: Bad incentives for great powers]

05:42
NB: Yeah, so in addition to these kind of obvious types of black balls that would just make it possible to blow up a lot of things, other types would act by creating bad incentives for humans to do things that are harmful. So, the Type-2a, we might call it that, is to think about some technology that incentivizes great powers to use their massive amounts of force to create destruction.
06:09
So, nuclear weapons were actually very close to this, right? What we did, we spent over 10 trillion dollars to build 70,000 nuclear warheads and put them on hair-trigger alert. And there were several times during the Cold War we almost blew each other up. It's not because a lot of people felt this would be a great idea, let's all spend 10 trillion dollars to blow ourselves up, but the incentives were such that we were finding ourselves -- this could have been worse. Imagine if there had been a safe first strike. Then it might have been very tricky, in a crisis situation, to refrain from launching all their nuclear missiles. If nothing else, because you would fear that the other side might do it.

06:48
CA: Right, mutual assured destruction kept the Cold War relatively stable, without that, we might not be here now.

06:54
NB: It could have been more unstable than it was. And there could be other properties of technology. It could have been harder to have arms treaties, if instead of nuclear weapons there had been some smaller thing or something less distinctive.

07:06
CA: And as well as bad incentives for powerful actors, you also worry about bad incentives for all of us, in Type-2b here.

07:12
NB: Yeah, so, here we might take the case of global warming. There are a lot of little conveniences that cause each one of us to do things that individually have no significant effect, right? But if billions of people do it, cumulatively, it has a damaging effect. Now, global warming could have been a lot worse than it is.
[2B: Bad incentives for the general public]
07:32
So we have the climate sensitivity parameter, right. It's a parameter that says how much warmer does it get if you emit a certain amount of greenhouse gases. But, suppose that it had been the case that with the amount of greenhouse gases we emitted, instead of the temperature rising by, say, between three and 4.5 degrees by 2100, suppose it had been 15 degrees or 20 degrees. Like, then we might have been in a very bad situation. Or suppose that renewable energy had just been a lot harder to do. Or that there had been more fossil fuels in the ground.

08:04
CA: Couldn't you argue that if in that case of -- if what we are doing today had resulted in 10 degrees difference in the time period that we could see, actually humanity would have got off its ass and done something about it. We're stupid, but we're not maybe that stupid. Or maybe we are.

08:21
NB: I wouldn't bet on it.
08:22
(Laughter)
08:25
You could imagine other features. So, right now, it's a little bit difficult to switch to renewables and stuff, right, but it can be done. But it might just have been, with slightly different physics, it could have been much more expensive to do these things.

08:40
CA: And what's your view, Nick? Do you think, putting these possibilities together, that this earth, humanity that we are, we count as a vulnerable world? That there is a death ball in our future?

08:55
NB: It's hard to say. I mean, I think there might well be various black balls in the urn, that's what it looks like. There might also be some golden balls that would help us protect against black balls. And I don't know which order they will come out.

09:12
CA: I mean, one possible philosophical critique of this idea is that it implies a view that the future is essentially settled. That there either is that ball there or it's not. And in a way, that's not a view of the future that I want to believe. I want to believe that the future is undetermined, that our decisions today will determine what kind of balls we pull out of that urn.

09:37
NB: I mean, if we just keep inventing, like, eventually we will pull out all the balls. I mean, I think there's a kind of weak form of technological determinism that is quite plausible, like, you're unlikely to encounter a society that uses flint axes and jet planes.
09:56
But you can almost think of a technology as a set of affordances. So technology is the thing that enables us to do various things and achieve various effects in the world. How we'd then use that, of course depends on human choice. But if we think about these three types of vulnerability, they make quite weak assumptions about how we would choose to use them. So a Type-1 vulnerability, again, this massive, destructive power, it's a fairly weak assumption to think that in a population of millions of people there would be some that would choose to use it destructively.

10:24
CA: For me, the single most disturbing argument is that we actually might have some kind of view into the urn that makes it actually very likely that we're doomed. Namely, if you believe in accelerating power, that technology inherently accelerates, that we build the tools that make us more powerful, then at some point you get to a stage where a single individual can take us all down, and then it looks like we're screwed. Isn't that argument quite alarming?

10:56
NB: Ah, yeah.
10:58
(Laughter)
11:00
I think -- Yeah, we get more and more power, and [it's] easier and easier to use those powers, but we can also invent technologies that kind of help us control how people use those powers.

11:14
CA: So let's talk about that, let's talk about the response. Suppose that thinking about all the possibilities that are out there now -- it's not just synbio, it's things like cyberwarfare, artificial intelligence, etc., etc. -- that there might be serious doom in our future. What are the possible responses? And you've talked about four possible responses as well.
[Response 1: Restrict technological development]

11:39
NB: Restricting technological development doesn't seem promising, if we are talking about a general halt to technological progress. I think neither feasible, nor would it be desirable even if we could do it. I think there might be very limited areas where maybe you would want slower technological progress. You don't, I think, want faster progress in bioweapons, or in, say, isotope separation, that would make it easier to create nukes.

12:04
CA: I mean, I used to be fully on board with that. But I would like to actually push back on that for a minute. Just because, first of all, if you look at the history of the last couple of decades, you know, it's always been push forward at full speed, it's OK, that's our only choice. But if you look at globalization and the rapid acceleration of that, if you look at the strategy of "move fast and break things" and what happened with that, and then you look at the potential for synthetic biology, I don't know that we should move forward rapidly or without any kind of restriction to a world where you could have a DNA printer in every home and high school lab. There are some restrictions, right?

NB: Possibly, there is the first part, the not feasible. If you think it would be desirable to stop it, there's the problem of feasibility. So it doesn't really help if one nation kind of --
CA: No, it doesn't help if one nation does, but we've had treaties before. That's really how we survived the nuclear threat, was by going out there and going through the painful process of negotiating. I just wonder whether the logic isn't that we, as a matter of global priority, we shouldn't go out there and try, like, now start negotiating really strict rules on where synthetic bioresearch is done, that it's not something that you want to democratize, no?
13:24
NB: I totally agree with that --
277
804000
1809
NB: ์ €๋„ ๊ทธ๋ ‡๊ฒŒ ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
13:25
that it would be desirable, for example,
278
805833
4226
๋ฐ”๋žŒ์งํ•˜๊ธฐ๋กœ๋Š” ์˜ˆ๋ฅผ ๋“ค์–ด
13:30
maybe to have DNA synthesis machines,
279
810083
3601
DNA ํ•ฉ์„ฑ ์žฅ์น˜๋Š”
13:33
not as a product where each lab has their own device,
280
813708
3560
๊ฐ์ž ์‹คํ—˜์‹ค์—์„œ ์†Œ์œ ํ•˜๋Š” ๊ฒƒ์ด ์•„๋‹ˆ๋ผ
13:37
but maybe as a service.
281
817292
1476
์„œ๋น„์Šค๋กœ์„œ ๊ฐ–์ถฐ์ง€๋Š” ๊ฑฐ์ฃ .
13:38
Maybe there could be four or five places in the world
282
818792
2517
์–ด์ฉŒ๋ฉด ์„ธ๊ณ„์— ๋„ค๋‹ค์„ฏ ๊ณณ ์ •๋„๊ฐ€ ์žˆ์–ด์„œ
13:41
where you send in your digital blueprint and the DNA comes back, right?
283
821333
3518
๋””์ง€ํ„ธ ์ฒญ์‚ฌ์ง„์„ ๋ฐ›์•„์„œ DNA๋ฅผ ์ œ์ž‘ํ•ด์ฃผ๋ฉด ๋˜์ง€ ์•Š์„๊นŒ ํ•ฉ๋‹ˆ๋‹ค.
13:44
And then, you would have the ability,
284
824875
1768
๊ทธ๋Ÿฌ๋ฉด ๊ทธ๊ฑธ ํ†ต์ œํ•  ์ˆ˜ ์žˆ๊ฒ ์ฃ .
13:46
if one day it really looked like it was necessary,
285
826667
2392
์–ธ์  ๊ฐ€ ์ง„์งœ๋กœ ์ œํ•œ์ด ํ•„์š”ํ•œ ๊ฒƒ ๊ฐ™์œผ๋ฉด
13:49
we would have like, a finite set of choke points.
286
829083
2351
์ œํ•œ๋œ ์ˆ˜์˜ ์ง€์ ์—์„œ ํ†ต์ œ๋ฅผ ํ•  ์ˆ˜ ์žˆ๊ฒ ์ฃ .
13:51
So I think you want to look for kind of special opportunities,
287
831458
3518
์ข€ ๋” ์—„๊ฒฉํ•˜๊ฒŒ ํ†ต์ œํ•  ์ˆ˜ ์žˆ๋Š”
13:55
where you could have tighter control.
288
835000
2059
ํŠน๋ณ„ํ•œ ๊ธฐํšŒ ๊ฐ™์€ ๊ฑธ ์›ํ•˜๋Š” ๊ฑฐ ๊ฐ™๊ตฐ์š”.
CA: Your belief is, fundamentally, we are not going to be successful in just holding back. Someone, somewhere -- North Korea, you know -- someone is going to go there and discover this knowledge, if it's there to be found.
NB: That looks plausible under current conditions. It's not just synthetic biology, either. I mean, any kind of profound, new change in the world could turn out to be a black ball.
CA: Let's look at another possible response.

NB: This also, I think, has only limited potential.

[Response 2: Eliminate bad actors]

So, with the Type-1 vulnerability again, I mean, if you could reduce the number of people who are incentivized to destroy the world, if only they could get access and the means, that would be good.
CA: In this image that you asked us to do, you're imagining these drones flying around the world with facial recognition. When they spot someone showing signs of sociopathic behavior, they shower them with love, they fix them.
NB: I think it's like a hybrid picture. Eliminate can either mean, like, incarcerate or kill, or it can mean persuade them to a better view of the world. But the point is that, suppose you were extremely successful in this, and you reduced the number of such individuals by half. And if you want to do it by persuasion, you are competing against all other powerful forces that are trying to persuade people, parties, religion, education system. But suppose you could reduce it by half, I don't think the risk would be reduced by half. Maybe by five or 10 percent.
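[Editor's note: Bostrom's intuition here -- that halving the number of would-be destroyers cuts the risk by far less than half -- can be sanity-checked with a toy independence model. This is an editorial illustration with made-up numbers, not a model from the talk or the paper: if each of N motivated actors independently succeeds with probability p, the chance that at least one succeeds is 1 - (1 - p)^N, which is nearly saturated when N·p is large and so barely moves when N is halved.]

```python
def catastrophe_risk(n_actors: int, p_success: float) -> float:
    """P(at least one of n independent actors succeeds in causing catastrophe)."""
    return 1 - (1 - p_success) ** n_actors

# Hypothetical parameters, chosen only to illustrate the saturation effect.
full = catastrophe_risk(1_000_000, 1e-5)  # ~0.99995
half = catastrophe_risk(500_000, 1e-5)    # ~0.99326

relative_drop = (full - half) / full      # well under 50%
print(f"risk with all actors: {full:.4f}")
print(f"risk with half:       {half:.4f}")
print(f"relative reduction:   {relative_drop:.1%}")
```

Under these assumed numbers, halving the pool of actors reduces the overall risk by under one percent, which is the qualitative shape of Bostrom's "maybe five or 10 percent" point.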
CA: You're not recommending that we gamble humanity's future on response two.

NB: I think it's all good to try to deter and persuade people, but we shouldn't rely on that as our only safeguard.

CA: How about three?
NB: I think there are two general methods that we could use to achieve the ability to stabilize the world against the whole spectrum of possible vulnerabilities. And we probably would need both.

[Response 3: Mass surveillance]

So, one is an extremely effective ability to do preventive policing. Such that you could intercept. If anybody started to do this dangerous thing, you could intercept them in real time, and stop them. So this would require ubiquitous surveillance, everybody would be monitored all the time.
CA: This is "Minority Report," essentially, a form of.

NB: You would have maybe AI algorithms, big freedom centers that were reviewing this, etc., etc.
CA: You know that mass surveillance is not a very popular term right now?

(Laughter)

NB: Yeah, so this little device there, imagine that kind of necklace that you would have to wear at all times with multidirectional cameras.

(Laughter)

But, to make it go down better, just call it the "freedom tag" or something like that.

(Laughter)
CA: OK. I mean, this is the conversation, friends, this is why this is such a mind-blowing conversation.
NB: Actually, there's a whole big conversation on this on its own, obviously. There are huge problems and risks with that, right? We may come back to that. So the other, the final, the other general stabilization capability is kind of plugging another governance gap.

[Response 4: Global governance]

So the surveillance would be kind of governance gap at the microlevel, like, preventing anybody from ever doing something highly illegal. Then, there's a corresponding governance gap at the macro level, at the global level. You would need the ability, reliably, to prevent the worst kinds of global coordination failures, to avoid wars between great powers, arms races, cataclysmic commons problems, in order to deal with the Type-2a vulnerabilities.
CA: Global governance is a term that's definitely way out of fashion right now, but could you make the case that throughout history, the history of humanity, is that at every stage of technological power increase, people have reorganized and sort of centralized the power. So, for example, when a roving band of criminals could take over a society, the response was, well, you have a nation-state and you centralize force, a police force or an army, so, "No, you can't do that." The logic, perhaps, of having a single person or a single group able to take out humanity means at some point we're going to have to go this route, at least in some form, no?
NB: It's certainly true that the scale of political organization has increased over the course of human history. It used to be hunter-gatherer band, right, and then chiefdom, city-states, nations, now there are international organizations and so on and so forth. Again, I just want to make sure I get the chance to stress that obviously there are huge downsides, and indeed, massive risks, both to mass surveillance and to global governance. I'm just pointing out that if we are lucky, the world could be such that these would be the only ways you could survive a black ball.
CA: The logic of this theory, it seems to me, is that we've got to recognize we can't have it all. That the sort of, I would say, naive dream that many of us had, that technology is always going to be a force for good, keep going, don't stop, go as fast as you can and not pay attention to some of the consequences, that's actually just not an option. We can have that. If we have that, we're going to have to accept some of these other very uncomfortable things with it, and kind of be in this arms race with ourselves of, you want the power, you better limit it, you better figure out how to limit it.
NB: I think it is an option, a very tempting option, it's in a sense the easiest option and it might work, but it means we are fundamentally vulnerable to extracting a black ball. Now, I think with a bit of coordination, like, if you did solve this macrogovernance problem, and the microgovernance problem, then we could extract all the balls from the urn and we'd benefit greatly.
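[Editor's note: the urn metaphor that runs through the talk can be sketched as a toy Monte Carlo simulation. This is an editorial illustration under assumed parameters, not Bostrom's model: inventions are independent draws from an urn that is almost all white (beneficial) balls with a rare chance of a black (devastating) one, and civilization survives a black ball only if the stabilization capabilities discussed above are in place.]

```python
import random

def survives(n_draws: int, p_black: float, can_stabilize: bool,
             rng: random.Random) -> bool:
    """Draw inventions from the urn; fail on the first unhandled black ball."""
    for _ in range(n_draws):
        if rng.random() < p_black and not can_stabilize:
            return False
    return True

rng = random.Random(0)
trials = 10_000
# p_black = 0.002 per invention is a made-up parameter for illustration only.
baseline = sum(survives(500, 0.002, False, rng) for _ in range(trials)) / trials
stabilized = sum(survives(500, 0.002, True, rng) for _ in range(trials)) / trials
print(f"survival without stabilization: {baseline:.2f}")
print(f"survival with stabilization:    {stabilized:.2f}")
```

Without stabilization, long-run survival decays roughly as (1 - p_black)^n_draws, so continuing to draw indefinitely drives it toward zero; with stabilization it stays at one by construction, which is the structure of the argument.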
CA: I mean, if we're living in a simulation, does it matter? We just reboot.

(Laughter)

NB: Then ... I ...

(Laughter)

I didn't see that one coming.

(Laughter)
CA: So what's your view? Putting all the pieces together, how likely is it that we're doomed?

(Laughter)

I love how people laugh when you ask that question.
NB: ๊ฐœ์ธ์ ์ธ ์ฐจ์›์—์„œ
20:01
NB: On an individual level,
432
1201458
1351
20:02
we seem to kind of be doomed anyway, just with the time line,
433
1202833
3851
์šฐ๋ฆฌ๋Š” ๊ฒฐ๊ตญ ์ข…๋ง์„ ๋งž์ด ํ•˜๊ฒ ์ฃ .
์‹œ๊ฐ„์ด ํ๋ฅด๋ฉด์„œ ์šฐ๋ฆฌ๋Š” ๋…ธํ™”๋˜๊ณ  ์‡ ์•ฝํ•ด์ ธ๊ฐ‘๋‹ˆ๋‹ค.
20:06
we're rotting and aging and all kinds of things, right?
434
1206708
2601
20:09
(Laughter)
435
1209333
1601
(์›ƒ์Œ)
20:10
It's actually a little bit tricky.
436
1210958
1685
์‚ฌ์‹ค ์•ฝ๊ฐ„ ๋ณต์žกํ•ฉ๋‹ˆ๋‹ค.
20:12
If you want to set up so that you can attach a probability,
437
1212667
2767
ํ™•๋ฅ ์„ ๋‚ด๊ธฐ ์œ„ํ•ด์„œ ์ •ํ•ด์•ผ ํ•  ๊ฒƒ์€
๋จผ์ € ์šฐ๋ฆฌ๋Š” ๋ˆ„๊ตฌ์ธ๊ฐ€์ž…๋‹ˆ๋‹ค.
20:15
first, who are we?
438
1215458
1268
20:16
If you're very old, probably you'll die of natural causes,
439
1216750
2726
๊ณ ๋ น์ด๋ผ๋ฉด ์ž์—ฐ์ ์œผ๋กœ ์ฃฝ์„ ๊ฒ๋‹ˆ๋‹ค.
์•„์ฃผ ์–ด๋ฆฌ๋‹ค๋ฉด 100๋…„์€ ๋” ์‚ด์ง€๋„ ๋ชจ๋ฅด์ฃ .
20:19
if you're very young, you might have a 100-year --
440
1219500
2351
20:21
the probability might depend on who you ask.
441
1221875
2143
๋ˆ„๊ตฌ์—๊ฒŒ ๋ฌผ์–ด๋ณด๋Š๋ƒ์— ๋”ฐ๋ผ ํ™•๋ฅ ์€ ๋‹ฌ๋ผ์ง‘๋‹ˆ๋‹ค.
๊ทธ๋‹ค์Œ์œผ๋กœ ๋ฌธ๋ช… ํŒŒ๊ดด์˜ ๊ธฐ์ค€์€ ๋ญ˜๊นŒ์š”?
20:24
Then the threshold, like, what counts as civilizational devastation?
442
1224042
4226
๋…ผ๋ฌธ์—์„œ๋Š” ๊ทธ๊ฑธ ๊ณ„์‚ฐํ•˜๊ธฐ ์œ„ํ•œ
20:28
In the paper I don't require an existential catastrophe
443
1228292
5642
๋Œ€์žฌํ•ด๋Š” ํ•„์š”์—†์—ˆ์Šต๋‹ˆ๋‹ค.
20:33
in order for it to count.
444
1233958
1435
์ด๊ฑด ๋‹จ์ง€ ์–ด๋–ป๊ฒŒ ์ •์˜ํ•˜๋Š”๊ฐ€์˜ ๋ฌธ์ œ๋กœ์„œ
20:35
This is just a definitional matter,
445
1235417
1684
์˜ˆ๋กœ์„œ 10์–ต ๋ช… ์‚ฌ๋ง์ด๋‚˜ ์„ธ๊ณ„ GDP 50% ๊ฐ์†Œ๋ฅผ ๋“ค๊ฒ ์Šต๋‹ˆ๋‹ค.
20:37
I say a billion dead,
446
1237125
1309
20:38
or a reduction of world GDP by 50 percent,
447
1238458
2060
20:40
but depending on what you say the threshold is,
448
1240542
2226
๋ฌด์—‡์„ ํ•œ๊ณ„์น˜๋กœ ํ•˜๋Š๋ƒ์— ๋”ฐ๋ผ
20:42
you get a different probability estimate.
449
1242792
1976
ํ™•๋ฅ  ์ „๋ง๋„ ๋‹ฌ๋ผ์ง‘๋‹ˆ๋‹ค.
20:44
But I guess you could put me down as a frightened optimist.
450
1244792
4517
์ €๋ฅผ ๊ฒ์— ์งˆ๋ฆฐ ๋‚™๊ด€๋ก ์ž๋กœ ๋ชฐ์•„๋ถ™์ผ ์ˆ˜๋„ ์žˆ๊ฒ ์ง€๋งŒ์š”.
20:49
(Laughter)
451
1249333
1101
(์›ƒ์Œ)
CA: You're a frightened optimist, and I think you've just created a large number of other frightened ... people.

(Laughter)

NB: In the simulation.

CA: In a simulation. Nick Bostrom, your mind amazes me, thank you so much for scaring the living daylights out of us.

(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7