Can we build AI without losing control over it? | Sam Harris

3,792,802 views · 2016-10-19

TED

Translation: Jihyeon J. Kim    Review: JY Kang

00:13
I'm going to talk about a failure of intuition that many of us suffer from. It's really a failure to detect a certain kind of danger. I'm going to describe a scenario that I think is both terrifying and likely to occur, and that's not a good combination, as it turns out. And yet rather than be scared, most of you will feel that what I'm talking about is kind of cool. I'm going to describe how the gains we make in artificial intelligence could ultimately destroy us. And in fact, I think it's very difficult to see how they won't destroy us or inspire us to destroy ourselves. And yet if you're anything like me, you'll find that it's fun to think about these things. And that response is part of the problem. OK? That response should worry you. And if I were to convince you in this talk that we were likely to suffer a global famine, either because of climate change or some other catastrophe, and that your grandchildren, or their grandchildren, are very likely to live like this, you wouldn't think, "Interesting. I like this TED Talk."

01:21
Famine isn't fun. Death by science fiction, on the other hand, is fun, and one of the things that worries me most about the development of AI at this point is that we seem unable to marshal an appropriate emotional response to the dangers that lie ahead. I am unable to marshal this response, and I'm giving this talk.

01:42
It's as though we stand before two doors. Behind door number one, we stop making progress in building intelligent machines. Our computer hardware and software just stops getting better for some reason. Now take a moment to consider why this might happen. I mean, given how valuable intelligence and automation are, we will continue to improve our technology if we are at all able to. What could stop us from doing this? A full-scale nuclear war? A global pandemic? An asteroid impact? Justin Bieber becoming president of the United States? (Laughter)

02:24
The point is, something would have to destroy civilization as we know it. You have to imagine how bad it would have to be to prevent us from making improvements in our technology permanently, generation after generation. Almost by definition, this is the worst thing that's ever happened in human history.

02:44
So the only alternative, and this is what lies behind door number two, is that we continue to improve our intelligent machines year after year after year. At a certain point, we will build machines that are smarter than we are, and once we have machines that are smarter than we are, they will begin to improve themselves. And then we risk what the mathematician I.J. Good called an "intelligence explosion," that the process could get away from us.

03:10
Now, this is often caricatured, as I have here, as a fear that armies of malicious robots will attack us. But that isn't the most likely scenario. It's not that our machines will become spontaneously malevolent. The concern is really that we will build machines that are so much more competent than we are that the slightest divergence between their goals and our own could destroy us.

03:35
Just think about how we relate to ants. We don't hate them. We don't go out of our way to harm them. In fact, sometimes we take pains not to harm them. We step over them on the sidewalk. But whenever their presence seriously conflicts with one of our goals, let's say when constructing a building like this one, we annihilate them without a qualm. The concern is that we will one day build machines that, whether they're conscious or not, could treat us with similar disregard.

04:05
Now, I suspect this seems far-fetched to many of you. I bet there are those of you who doubt that superintelligent AI is possible, much less inevitable. But then you must find something wrong with one of the following assumptions. And there are only three of them.

04:23
Intelligence is a matter of information processing in physical systems. Actually, this is a little bit more than an assumption. We have already built narrow intelligence into our machines, and many of these machines perform at a level of superhuman intelligence already. And we know that mere matter can give rise to what is called "general intelligence," an ability to think flexibly across multiple domains, because our brains have managed it. Right? I mean, there's just atoms in here, and as long as we continue to build systems of atoms that display more and more intelligent behavior, we will eventually, unless we are interrupted, we will eventually build general intelligence into our machines.

05:11
It's crucial to realize that the rate of progress doesn't matter, because any progress is enough to get us into the end zone. We don't need Moore's law to continue. We don't need exponential progress. We just need to keep going.

05:25
The second assumption is that we will keep going. We will continue to improve our intelligent machines. And given the value of intelligence -- I mean, intelligence is either the source of everything we value or we need it to safeguard everything we value. It is our most valuable resource. So we want to do this. We have problems that we desperately need to solve. We want to cure diseases like Alzheimer's and cancer. We want to understand economic systems. We want to improve our climate science. So we will do this, if we can. The train is already out of the station, and there's no brake to pull.

06:05
Finally, we don't stand on a peak of intelligence, or anywhere near it, likely. And this really is the crucial insight. This is what makes our situation so precarious, and this is what makes our intuitions about risk so unreliable.

06:23
Now, just consider the smartest person who has ever lived. On almost everyone's shortlist here is John von Neumann. I mean, the impression that von Neumann made on the people around him, and this included the greatest mathematicians and physicists of his time, is fairly well-documented. If only half the stories about him are half true, there's no question he's one of the smartest people who has ever lived. So consider the spectrum of intelligence. Here we have John von Neumann. And then we have you and me. And then we have a chicken. (Laughter) Sorry, a chicken. (Laughter) There's no reason for me to make this talk more depressing than it needs to be. (Laughter)

07:08
It seems overwhelmingly likely, however, that the spectrum of intelligence extends much further than we currently conceive, and if we build machines that are more intelligent than we are, they will very likely explore this spectrum in ways that we can't imagine, and exceed us in ways that we can't imagine.

07:27
And it's important to recognize that this is true by virtue of speed alone. Right? So imagine if we just built a superintelligent AI that was no smarter than your average team of researchers at Stanford or MIT. Well, electronic circuits function about a million times faster than biochemical ones, so this machine should think about a million times faster than the minds that built it. So you set it running for a week, and it will perform 20,000 years of human-level intellectual work, week after week after week.

08:01
How could we even understand, much less constrain, a mind making this sort of progress?

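A rough check of the arithmetic behind that figure, assuming only the flat million-fold speed-up just described (and 365-day years):

\[
10^{6}\ \text{weeks} = 7 \times 10^{6}\ \text{days} \approx \frac{7 \times 10^{6}}{365}\ \text{years} \approx 19{,}000\ \text{years},
\]

so one calendar week of running time corresponds to roughly the 20,000 years of human-level work quoted above.
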
08:08
The other thing that's worrying, frankly, is that, imagine the best case scenario. So imagine we hit upon a design of superintelligent AI that has no safety concerns. We have the perfect design the first time around. It's as though we've been handed an oracle that behaves exactly as intended. Well, this machine would be the perfect labor-saving device. It can design the machine that can build the machine that can do any physical work, powered by sunlight, more or less for the cost of raw materials. So we're talking about the end of human drudgery. We're also talking about the end of most intellectual work.

08:49
So what would apes like ourselves do in this circumstance? Well, we'd be free to play Frisbee and give each other massages. Add some LSD and some questionable wardrobe choices, and the whole world could be like Burning Man. (Laughter)

09:06
Now, that might sound pretty good, but ask yourself what would happen under our current economic and political order? It seems likely that we would witness a level of wealth inequality and unemployment that we have never seen before. Absent a willingness to immediately put this new wealth to the service of all humanity, a few trillionaires could grace the covers of our business magazines while the rest of the world would be free to starve.

09:34
And what would the Russians or the Chinese do if they heard that some company in Silicon Valley was about to deploy a superintelligent AI? This machine would be capable of waging war, whether terrestrial or cyber, with unprecedented power. This is a winner-take-all scenario. To be six months ahead of the competition here is to be 500,000 years ahead, at a minimum.

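The "500,000 years" is the same ratio applied to a six-month head start, again assuming nothing more than the million-fold speed advantage:

\[
6\ \text{months} \times 10^{6} = 6 \times 10^{6}\ \text{months} = \frac{6 \times 10^{6}}{12}\ \text{years} = 500{,}000\ \text{years}
\]

of equivalent human-level research time.
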
09:59
So it seems that even mere rumors of this kind of breakthrough could cause our species to go berserk.

10:06
Now, one of the most frightening things, in my view, at this moment, are the kinds of things that AI researchers say when they want to be reassuring. And the most common reason we're told not to worry is time. This is all a long way off, don't you know. This is probably 50 or 100 years away. One researcher has said, "Worrying about AI safety is like worrying about overpopulation on Mars." This is the Silicon Valley version of "don't worry your pretty little head about it." (Laughter)

10:39
No one seems to notice that referencing the time horizon is a total non sequitur. If intelligence is just a matter of information processing, and we continue to improve our machines, we will produce some form of superintelligence. And we have no idea how long it will take us to create the conditions to do that safely.

11:04
Let me say that again. We have no idea how long it will take us to create the conditions to do that safely.

11:12
And if you haven't noticed, 50 years is not what it used to be. This is 50 years in months. This is how long we've had the iPhone. This is how long "The Simpsons" has been on television. Fifty years is not that much time to meet one of the greatest challenges our species will ever face.

11:31
Once again, we seem to be failing to have an appropriate emotional response to what we have every reason to believe is coming. The computer scientist Stuart Russell has a nice analogy here. He said, imagine that we received a message from an alien civilization, which read: "People of Earth, we will arrive on your planet in 50 years. Get ready." And now we're just counting down the months until the mothership lands? We would feel a little more urgency than we do.

12:04
Another reason we're told not to worry is that these machines can't help but share our values because they will be literally extensions of ourselves. They'll be grafted onto our brains, and we'll essentially become their limbic systems. Now take a moment to consider that the safest and only prudent path forward, recommended, is to implant this technology directly into our brains. Now, this may in fact be the safest and only prudent path forward, but usually one's safety concerns about a technology have to be pretty much worked out before you stick it inside your head. (Laughter)

12:38
The deeper problem is that building superintelligent AI on its own seems likely to be easier than building superintelligent AI and having the completed neuroscience that allows us to seamlessly integrate our minds with it. And given that the companies and governments doing this work are likely to perceive themselves as being in a race against all others, given that to win this race is to win the world, provided you don't destroy it in the next moment, then it seems likely that whatever is easier to do will get done first.

13:10
Now, unfortunately, I don't have a solution to this problem, apart from recommending that more of us think about it. I think we need something like a Manhattan Project on the topic of artificial intelligence. Not to build it, because I think we'll inevitably do that, but to understand how to avoid an arms race and to build it in a way that is aligned with our interests. When you're talking about superintelligent AI that can make changes to itself, it seems that we only have one chance to get the initial conditions right, and even then we will need to absorb the economic and political consequences of getting them right.

13:45
But the moment we admit that information processing is the source of intelligence, that some appropriate computational system is what the basis of intelligence is, and we admit that we will improve these systems continuously, and we admit that the horizon of cognition very likely far exceeds what we currently know, then we have to admit that we are in the process of building some sort of god. Now would be a good time to make sure it's a god we can live with.

14:20
Thank you very much.

14:21
(Applause)