Ray Kurzweil: Get ready for hybrid thinking

529,292 views · 2014-06-02

TED

Translation: K Bang / Review: Gemma Lee

00:12
Let me tell you a story. It goes back 200 million years. It's a story of the neocortex, which means "new rind." So in these early mammals, because only mammals have a neocortex, rodent-like creatures, it was the size of a postage stamp and just as thin, and was a thin covering around their walnut-sized brain, but it was capable of a new type of thinking. Rather than the fixed behaviors that non-mammalian animals have, it could invent new behaviors.

00:44
So a mouse is escaping a predator, its path is blocked, it'll try to invent a new solution. That may work, it may not, but if it does, it will remember that and have a new behavior, and that can actually spread virally through the rest of the community. Another mouse watching this could say, "Hey, that was pretty clever, going around that rock," and it could adopt a new behavior as well. Non-mammalian animals couldn't do any of those things. They had fixed behaviors. Now they could learn a new behavior, but not in the course of one lifetime. In the course of maybe a thousand lifetimes, it could evolve a new fixed behavior.

01:20
That was perfectly okay 200 million years ago. The environment changed very slowly. It could take 10,000 years for there to be a significant environmental change, and during that period of time it would evolve a new behavior. Now that went along fine, but then something happened.

01:37
Sixty-five million years ago, there was a sudden, violent change to the environment. We call it the Cretaceous extinction event. That's when the dinosaurs went extinct, that's when 75 percent of the animal and plant species went extinct, and that's when mammals overtook their ecological niche, and to anthropomorphize, biological evolution said, "Hmm, this neocortex is pretty good stuff," and it began to grow it. And mammals got bigger, their brains got bigger at an even faster pace, and the neocortex got bigger even faster than that and developed these distinctive ridges and folds, basically to increase its surface area.

02:19
If you took the human neocortex and stretched it out, it's about the size of a table napkin, and it's still a thin structure. It's about the thickness of a table napkin. But it has so many convolutions and ridges it's now 80 percent of our brain, and that's where we do our thinking, and it's the great sublimator. We still have that old brain that provides our basic drives and motivations, but I may have a drive for conquest, and that'll be sublimated by the neocortex into writing a poem or inventing an app or giving a TED Talk, and it's really the neocortex that's where the action is.

02:56
Fifty years ago, I wrote a paper describing how I thought the brain worked, and I described it as a series of modules. Each module could do things with a pattern. It could learn a pattern. It could remember a pattern. It could implement a pattern. And these modules were organized in hierarchies, and we created that hierarchy with our own thinking.

03:15
And there was actually very little to go on 50 years ago. It led me to meet President Johnson. I've been thinking about this for 50 years, and a year and a half ago I came out with the book, "How To Create A Mind," which has the same thesis, but now there's a plethora of evidence.

03:32
The amount of data we're getting about the brain from neuroscience is doubling every year. Spatial resolution of brain scanning of all types is doubling every year. We can now see inside a living brain and see individual interneural connections connecting in real time, firing in real time. We can see your brain create your thoughts. We can see your thoughts create your brain, which is really key to how it works.

03:55
So let me describe briefly how it works. I've actually counted these modules. We have about 300 million of them, and we create them in these hierarchies.

04:03
I'll give you a simple example. I've got a bunch of modules that can recognize the crossbar to a capital A, and that's all they care about. A beautiful song can play, a pretty girl could walk by, they don't care, but they see a crossbar to a capital A, they get very excited and they say "crossbar," and they put out a high probability on their output axon. That goes to the next level, and these layers are organized in conceptual levels. Each is more abstract than the next one, so the next one might say "capital A." That goes up to a higher level that might say "Apple." Information flows down also. If the apple recognizer has seen A-P-P-L, it'll think to itself, "Hmm, I think an E is probably likely," and it'll send a signal down to all the E recognizers saying, "Be on the lookout for an E, I think one might be coming." The E recognizers will lower their threshold and they see some sloppy thing, could be an E. Ordinarily you wouldn't think so, but we're expecting an E, it's good enough, and yeah, I've seen an E, and then apple says, "Yeah, I've seen an Apple."

05:03
Go up another five levels, and you're now at a pretty high level of this hierarchy, and stretch down into the different senses, and you may have a module that sees a certain fabric, hears a certain voice quality, smells a certain perfume, and will say, "My wife has entered the room." Go up another 10 levels, and now you're at a very high level. You're probably in the frontal cortex, and you'll have modules that say, "That was ironic. That's funny. She's pretty." You might think that those are more sophisticated, but actually what's more complicated is the hierarchy beneath them.

05:36
There was a 16-year-old girl, she had brain surgery, and she was conscious because the surgeons wanted to talk to her. You can do that because there's no pain receptors in the brain. And whenever they stimulated particular, very small points on her neocortex, shown here in red, she would laugh. So at first they thought they were triggering some kind of laugh reflex, but no, they quickly realized they had found the points in her neocortex that detect humor, and she just found everything hilarious whenever they stimulated these points. "You guys are so funny just standing around" was the typical comment, and they weren't funny, not while doing surgery.

06:14
So how are we doing today? Well, computers are actually beginning to master human language with techniques that are similar to the neocortex. I actually described the algorithm, which is similar to something called a hierarchical hidden Markov model, something I've worked on since the '90s.

06:36
"Jeopardy" is a very broad natural language game, and Watson got a higher score than the best two players combined. It got this query correct: "A long, tiresome speech delivered by a frothy pie topping," and it quickly responded, "What is a meringue harangue?" And Jennings and the other guy didn't get that. It's a pretty sophisticated example of computers actually understanding human language, and it actually got its knowledge by reading Wikipedia and several other encyclopedias.

07:04
Five to 10 years from now, search engines will actually be based on not just looking for combinations of words and links but actually understanding, reading for understanding the billions of pages on the web and in books. So you'll be walking along, and Google will pop up and say, "You know, Mary, you expressed concern to me a month ago that your glutathione supplement wasn't getting past the blood-brain barrier. Well, new research just came out 13 seconds ago that shows a whole new approach to that and a new way to take glutathione. Let me summarize it for you."

07:38
Twenty years from now, we'll have nanobots, because another exponential trend is the shrinking of technology. They'll go into our brain through the capillaries and basically connect our neocortex to a synthetic neocortex in the cloud, providing an extension of our neocortex.

07:59
Now today, I mean, you have a computer in your phone, but if you need 10,000 computers for a few seconds to do a complex search, you can access that for a second or two in the cloud. In the 2030s, if you need some extra neocortex, you'll be able to connect to that in the cloud directly from your brain. So I'm walking along and I say, "Oh, there's Chris Anderson. He's coming my way. I'd better think of something clever to say. I've got three seconds. My 300 million modules in my neocortex isn't going to cut it. I need a billion more." I'll be able to access that in the cloud.

08:34
And our thinking, then, will be a hybrid of biological and non-biological thinking, but the non-biological portion is subject to my law of accelerating returns. It will grow exponentially.

08:47
And remember what happened the last time we expanded our neocortex? That was two million years ago when we became humanoids and developed these large foreheads. Other primates have a slanted brow. They don't have the frontal cortex. But the frontal cortex is not really qualitatively different. It's a quantitative expansion of neocortex, but that additional quantity of thinking was the enabling factor for us to take a qualitative leap and invent language and art and science and technology and TED conferences. No other species has done that.

09:20
And so, over the next few decades, we're going to do it again. We're going to again expand our neocortex, only this time we won't be limited by a fixed architecture of enclosure. It'll be expanded without limit. That additional quantity will again be the enabling factor for another qualitative leap in culture and technology.

09:42
Thank you very much.

(Applause)