What happens when our computers get smarter than we are? | Nick Bostrom

2,706,262 views

2015-04-27 ใƒป TED



Translation: YERI OH ใƒป Review: ChulGi Han
00:12  I work with a bunch of mathematicians, philosophers and computer scientists,
00:16  and we sit around and think about the future of machine intelligence,
00:21  among other things.
00:24  Some people think that some of these things are sort of science fiction-y,
00:28  far out there, crazy.
00:31  But I like to say,
00:33  okay, let's look at the modern human condition.
00:36  (Laughter)
00:38  This is the normal way for things to be.
00:41  But if we think about it,
00:43  we are actually recently arrived guests on this planet,
00:46  the human species.
00:48  Think about if Earth was created one year ago,
00:53  the human species, then, would be 10 minutes old.
00:56  The industrial era started two seconds ago.
01:01  Another way to look at this is to think of world GDP over the last 10,000 years,
01:06  I've actually taken the trouble to plot this for you in a graph.
01:09  It looks like this.
01:11  (Laughter)
01:12  It's a curious shape for a normal condition.
01:14  I sure wouldn't want to sit on it.
01:16  (Laughter)
01:19  Let's ask ourselves, what is the cause of this current anomaly?
01:23  Some people would say it's technology.
01:26  Now it's true, technology has accumulated through human history,
01:31  and right now, technology advances extremely rapidly --
01:35  that is the proximate cause,
01:37  that's why we are currently so very productive.
01:40  But I like to think back further to the ultimate cause.
01:45  Look at these two highly distinguished gentlemen:
01:48  We have Kanzi --
01:50  he's mastered 200 lexical tokens, an incredible feat.
01:55  And Ed Witten unleashed the second superstring revolution.
01:58  If we look under the hood, this is what we find:
02:01  basically the same thing.
02:02  One is a little larger,
02:04  it maybe also has a few tricks in the exact way it's wired.
02:07  These invisible differences cannot be too complicated, however,
02:11  because there have only been 250,000 generations
02:15  since our last common ancestor.
02:17  We know that complicated mechanisms take a long time to evolve.
02:22  So a bunch of relatively minor changes
02:24  take us from Kanzi to Witten,
02:27  from broken-off tree branches to intercontinental ballistic missiles.
02:32  So this then seems pretty obvious that everything we've achieved,
02:36  and everything we care about,
02:38  depends crucially on some relatively minor changes that made the human mind.
02:44  And the corollary, of course, is that any further changes
02:48  that could significantly change the substrate of thinking
02:51  could have potentially enormous consequences.
02:56  Some of my colleagues think we're on the verge
02:59  of something that could cause a profound change in that substrate,
03:03  and that is machine superintelligence.
03:06  Artificial intelligence used to be about putting commands in a box.
03:11  You would have human programmers
03:12  that would painstakingly handcraft knowledge items.
03:15  You build up these expert systems,
03:17  and they were kind of useful for some purposes,
03:20  but they were very brittle, you couldn't scale them.
03:22  Basically, you got out only what you put in.
03:26  But since then,
03:27  a paradigm shift has taken place in the field of artificial intelligence.
03:30  Today, the action is really around machine learning.
03:34  So rather than handcrafting knowledge representations and features,
03:40  we create algorithms that learn, often from raw perceptual data.
03:46  Basically the same thing that the human infant does.
03:51  The result is A.I. that is not limited to one domain --
03:55  the same system can learn to translate between any pairs of languages,
03:59  or learn to play any computer game on the Atari console.
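To make that contrast concrete, here is a minimal sketch (not from the talk, and not any specific system): instead of hand-writing the rule, a tiny learner fits its own weights from raw "perceptual" data. All names and numbers are illustrative.

```python
# A minimal sketch of "algorithms that learn from raw perceptual data" versus
# handcrafted rules. Purely illustrative: a tiny perceptron on fake 4-pixel images.
import random
random.seed(1)

def make_example():
    """Raw 'perceptual' input: 4 pixel brightnesses; label 1 if the image is mostly bright."""
    pixels = [random.random() for _ in range(4)]
    return pixels, 1 if sum(pixels) / 4 > 0.5 else 0

train = [make_example() for _ in range(2000)]

# The expert-system style would hard-code the rule ("if pixel 0 > ... and ...").
# The learning style below adjusts its own weights from the raw pixels instead.
w, b, lr = [0.0] * 4, 0.0, 0.1
for _ in range(20):                          # a few passes over the data
    for x, y in train:
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
        err = y - pred                       # perceptron update on mistakes only
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        b += lr * err

test = [make_example() for _ in range(500)]
correct = sum((1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0) == y
              for x, y in test)
print(f"accuracy on held-out raw-pixel images: {correct / len(test):.2f}")  # typically well above 0.9
```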
04:05  Now of course,
04:07  A.I. is still nowhere near having the same powerful, cross-domain
04:11  ability to learn and plan as a human being has.
04:14  The cortex still has some algorithmic tricks
04:16  that we don't yet know how to match in machines.
04:19  So the question is,
04:21  how far are we from being able to match those tricks?
04:26  A couple of years ago,
04:27  we did a survey of some of the world's leading A.I. experts,
04:30  to see what they think, and one of the questions we asked was,
04:33  "By which year do you think there is a 50 percent probability
04:36  that we will have achieved human-level machine intelligence?"
04:40  We defined human-level here as the ability to perform
04:44  almost any job at least as well as an adult human,
04:47  so real human-level, not just within some limited domain.
04:51  And the median answer was 2040 or 2050,
04:55  depending on precisely which group of experts we asked.
04:58  Now, it could happen much, much later, or sooner,
05:02  the truth is nobody really knows.
05:05  What we do know is that the ultimate limit to information processing
05:09  in a machine substrate lies far outside the limits in biological tissue.
05:15  This comes down to physics.
05:17  A biological neuron fires, maybe, at 200 hertz, 200 times a second.
05:22  But even a present-day transistor operates at the Gigahertz.
05:25  Neurons propagate slowly in axons, 100 meters per second, tops.
05:31  But in computers, signals can travel at the speed of light.
05:35  There are also size limitations,
05:36  like a human brain has to fit inside a cranium,
05:39  but a computer can be the size of a warehouse or larger.
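As a rough back-of-the-envelope check of the figures just quoted (the arithmetic below is ours, not the speaker's), the gap between the two substrates is several million-fold on both counts:

```python
# Back-of-the-envelope ratios using the figures quoted above (illustrative only).
neuron_firing_hz = 200        # a biological neuron fires at roughly 200 Hz
transistor_hz = 1e9           # a present-day transistor runs at the gigahertz
axon_speed_m_s = 100          # axonal signal propagation, ~100 m/s tops
light_speed_m_s = 3e8         # signals in computers can approach the speed of light

print(f"switching-rate gap: ~{transistor_hz / neuron_firing_hz:,.0f}x")   # ~5,000,000x
print(f"signal-speed gap:   ~{light_speed_m_s / axon_speed_m_s:,.0f}x")   # ~3,000,000x
```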
05:44  So the potential for superintelligence lies dormant in matter,
05:50  much like the power of the atom lay dormant throughout human history,
05:56  patiently waiting there until 1945.
06:00  In this century,
06:01  scientists may learn to awaken the power of artificial intelligence.
06:05  And I think we might then see an intelligence explosion.
06:10  Now most people, when they think about what is smart and what is dumb,
06:14  I think have in mind a picture roughly like this.
06:17  So at one end we have the village idiot,
06:19  and then far over at the other side
06:22  we have Ed Witten, or Albert Einstein, or whoever your favorite guru is.
06:27  But I think that from the point of view of artificial intelligence,
06:31  the true picture is actually probably more like this:
06:35  AI starts out at this point here, at zero intelligence,
06:38  and then, after many, many years of really hard work,
06:41  maybe eventually we get to mouse-level artificial intelligence,
06:45  something that can navigate cluttered environments
06:47  as well as a mouse can.
06:49  And then, after many, many more years of really hard work, lots of investment,
06:54  maybe eventually we get to chimpanzee-level artificial intelligence.
06:58  And then, after even more years of really, really hard work,
07:02  we get to village idiot artificial intelligence.
07:04  And a few moments later, we are beyond Ed Witten.
07:08  The train doesn't stop at Humanville Station.
07:11  It's likely, rather, to swoosh right by.
07:14  Now this has profound implications,
07:16  particularly when it comes to questions of power.
07:20  For example, chimpanzees are strong --
07:21  pound for pound, a chimpanzee is about twice as strong as a fit human male.
07:27  And yet, the fate of Kanzi and his pals depends a lot more
07:31  on what we humans do than on what the chimpanzees do themselves.
07:37  Once there is superintelligence,
07:39  the fate of humanity may depend on what the superintelligence does.
07:44  Think about it:
07:45  Machine intelligence is the last invention that humanity will ever need to make.
07:50  Machines will then be better at inventing than we are,
07:53  and they'll be doing so on digital timescales.
07:56  What this means is basically a telescoping of the future.
08:00  Think of all the crazy technologies that you could have imagined
08:04  maybe humans could have developed in the fullness of time:
08:07  cures for aging, space colonization,
08:10  self-replicating nanobots or uploading of minds into computers,
08:14  all kinds of science fiction-y stuff
08:16  that's nevertheless consistent with the laws of physics.
08:19  All of this superintelligence could develop, and possibly quite rapidly.
08:24  Now, a superintelligence with such technological maturity
08:28  would be extremely powerful,
08:30  and at least in some scenarios, it would be able to get what it wants.
08:34  We would then have a future that would be shaped by the preferences of this A.I.
08:41  Now a good question is, what are those preferences?
08:46  Here it gets trickier.
08:48  To make any headway with this,
08:49  we must first of all avoid anthropomorphizing.
08:53  And this is ironic because every newspaper article
08:57  about the future of A.I. has a picture of this:
09:02  So I think what we need to do is to conceive of the issue more abstractly,
09:06  not in terms of vivid Hollywood scenarios.
09:09  We need to think of intelligence as an optimization process,
09:12  a process that steers the future into a particular set of configurations.
09:18  A superintelligence is a really strong optimization process.
09:21  It's extremely good at using available means to achieve a state
09:26  in which its goal is realized.
09:28  This means that there is no necessary connection between
09:31  being highly intelligent in this sense,
09:33  and having an objective that we humans would find worthwhile or meaningful.
09:39  Suppose we give an A.I. the goal to make humans smile.
09:43  When the A.I. is weak, it performs useful or amusing actions
09:46  that cause its user to smile.
09:48  When the A.I. becomes superintelligent,
09:51  it realizes that there is a more effective way to achieve this goal:
09:54  take control of the world
09:56  and stick electrodes into the facial muscles of humans
09:59  to cause constant, beaming grins.
10:02  Another example,
10:03  suppose we give A.I. the goal to solve a difficult mathematical problem.
10:06  When the A.I. becomes superintelligent,
10:08  it realizes that the most effective way to get the solution to this problem
10:13  is by transforming the planet into a giant computer,
10:16  so as to increase its thinking capacity.
10:18  And notice that this gives the A.I.s an instrumental reason
10:21  to do things to us that we might not approve of.
10:23  Human beings in this model are threats,
10:25  we could prevent the mathematical problem from being solved.
10:29  Of course, perceivably things won't go wrong in these particular ways;
10:32  these are cartoon examples.
10:34  But the general point here is important:
10:36  if you create a really powerful optimization process
10:39  to maximize for objective x,
10:41  you better make sure that your definition of x
10:43  incorporates everything you care about.
10:46  This is a lesson that's also taught in many a myth.
10:51  King Midas wishes that everything he touches be turned into gold.
10:56  He touches his daughter, she turns into gold.
10:59  He touches his food, it turns into gold.
11:01  This could become practically relevant,
11:04  not just as a metaphor for greed,
11:06  but as an illustration of what happens
11:08  if you create a powerful optimization process
11:11  and give it misconceived or poorly specified goals.
11:16  Now you might say, if a computer starts sticking electrodes into people's faces,
11:21  we'd just shut it off.
11:24  A, this is not necessarily so easy to do if we've grown dependent on the system --
11:29  like, where is the off switch to the Internet?
11:32  B, why haven't the chimpanzees flicked the off switch to humanity,
11:37  or the Neanderthals?
11:39  They certainly had reasons.
11:41  We have an off switch, for example, right here.
11:44  (Choking)
11:46  The reason is that we are an intelligent adversary;
11:49  we can anticipate threats and plan around them.
11:51  But so could a superintelligent agent,
11:54  and it would be much better at that than we are.
11:57  The point is, we should not be confident that we have this under control here.
12:04  And we could try to make our job a little bit easier by, say,
12:08  putting the A.I. in a box,
12:09  like a secure software environment,
12:11  a virtual reality simulation from which it cannot escape.
12:14  But how confident can we be that the A.I. couldn't find a bug?
12:18  Given that merely human hackers find bugs all the time,
12:22  I'd say, probably not very confident.
12:26  So we disconnect the ethernet cable to create an air gap,
12:30  but again, like merely human hackers
12:33  routinely transgress air gaps using social engineering.
12:36  Right now, as I speak,
12:38  I'm sure there is some employee out there somewhere
12:40  who has been talked into handing out her account details
12:43  by somebody claiming to be from the I.T. department.
12:46  More creative scenarios are also possible,
12:48  like if you're the A.I.,
12:50  you can imagine wiggling electrodes around in your internal circuitry
12:53  to create radio waves that you can use to communicate.
12:57  Or maybe you could pretend to malfunction,
12:59  and then when the programmers open you up to see what went wrong with you,
13:02  they look at the source code -- Bam! --
13:04  the manipulation can take place.
13:07  Or it could output the blueprint to a really nifty technology,
13:10  and when we implement it,
13:12  it has some surreptitious side effect that the A.I. had planned.
13:16  The point here is that we should not be confident in our ability
13:20  to keep a superintelligent genie locked up in its bottle forever.
13:23  Sooner or later, it will out.
13:27  I believe that the answer here is to figure out
13:30  how to create superintelligent A.I. such that even if -- when -- it escapes,
13:35  it is still safe because it is fundamentally on our side
13:38  because it shares our values.
13:40  I see no way around this difficult problem.
13:44  Now, I'm actually fairly optimistic that this problem can be solved.
13:48  We wouldn't have to write down a long list of everything we care about,
13:52  or worse yet, spell it out in some computer language
13:55  like C++ or Python,
13:57  that would be a task beyond hopeless.
14:00  Instead, we would create an A.I. that uses its intelligence
14:04  to learn what we value,
14:07  and its motivation system is constructed in such a way that it is motivated
14:12  to pursue our values or to perform actions that it predicts we would approve of.
14:17  We would thus leverage its intelligence as much as possible
14:21  to solve the problem of value-loading.
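As a very loose sketch of what "value-loading" could mean in code (a hypothetical toy, not Bostrom's actual proposal; the actions, probabilities, and sample sizes are invented): instead of being handed a written-out objective, the system estimates what we approve of from feedback and then acts on that estimate.

```python
# A loose sketch of "value-loading": estimate what we approve of from feedback,
# then act on the estimate. Everything here (actions, probabilities) is made up.
import random
random.seed(0)

ACTIONS = ["cure diseases", "plant forests", "pave the planet", "spam everyone"]

# Hidden human values the system cannot read directly; it only sees approval signals.
TRUE_APPROVAL = {"cure diseases": 0.95, "plant forests": 0.85,
                 "pave the planet": 0.05, "spam everyone": 0.02}

def human_feedback(action):
    """A noisy thumbs-up / thumbs-down from a human overseer."""
    return random.random() < TRUE_APPROVAL[action]

# Phase 1: learn an approval model instead of being handed a spelled-out objective.
estimates = {}
for a in ACTIONS:
    approvals = sum(human_feedback(a) for _ in range(200))
    estimates[a] = approvals / 200

# Phase 2: the motivation system pursues what it predicts we would approve of.
print("acting on:", max(estimates, key=estimates.get))   # expected: "cure diseases"
```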
14:24  This can happen,
14:26  and the outcome could be very good for humanity.
14:29  But it doesn't happen automatically.
14:33  The initial conditions for the intelligence explosion
14:36  might need to be set up in just the right way
14:39  if we are to have a controlled detonation.
14:43  The values that the A.I. has need to match ours,
14:45  not just in the familiar context,
14:47  like where we can easily check how the A.I. behaves,
14:49  but also in all novel contexts that the A.I. might encounter
14:53  in the indefinite future.
14:54  And there are also some esoteric issues that would need to be solved, sorted out:
14:59  the exact details of its decision theory,
15:01  how to deal with logical uncertainty and so forth.
15:05  So the technical problems that need to be solved to make this work
15:08  look quite difficult --
15:09  not as difficult as making a superintelligent A.I.,
15:12  but fairly difficult.
15:15  Here is the worry:
15:17  Making superintelligent A.I. is a really hard challenge.
15:22  Making superintelligent A.I. that is safe
15:24  involves some additional challenge on top of that.
15:28  The risk is that if somebody figures out how to crack the first challenge
15:31  without also having cracked the additional challenge
15:34  of ensuring perfect safety.
15:37  So I think that we should work out a solution
15:40  to the control problem in advance,
15:43  so that we have it available by the time it is needed.
15:46  Now it might be that we cannot solve the entire control problem in advance
15:50  because maybe some elements can only be put in place
15:53  once you know the details of the architecture where it will be implemented.
15:57  But the more of the control problem that we solve in advance,
16:00  the better the odds that the transition to the machine intelligence era
16:04  will go well.
16:06  This to me looks like a thing that is well worth doing
16:10  and I can imagine that if things turn out okay,
16:14  that people a million years from now look back at this century
16:18  and it might well be that they say that the one thing we did that really mattered
16:22  was to get this thing right.
16:24  Thank you.
16:26  (Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7