A brain in a supercomputer | Henry Markram

513,831 views

2009-10-15 ใƒป TED



Translated by Soonho Kong; reviewed by Eunyoung Lim

00:18
Our mission is to build a detailed, realistic computer model of the human brain. And we've done, in the past four years, a proof of concept on a small part of the rodent brain, and with this proof of concept we are now scaling the project up to reach the human brain.

00:39
Why are we doing this? There are three important reasons. The first is, it's essential for us to understand the human brain if we do want to get along in society, and I think that it is a key step in evolution. The second reason is, we cannot keep doing animal experimentation forever, and we have to embody all our data and all our knowledge into a working model. It's like a Noah's Ark. It's like an archive. And the third reason is that there are two billion people on the planet that are affected by mental disorder, and the drugs that are used today are largely empirical. I think that we can come up with very concrete solutions on how to treat disorders.

01:29
Now, even at this stage, we can use the brain model to explore some fundamental questions about how the brain works. And here, at TED, for the first time, I'd like to share with you how we're addressing one theory -- there are many theories -- one theory of how the brain works.

01:50
So, this theory is that the brain creates, builds, a version of the universe, and projects this version of the universe, like a bubble, all around us. Now, this is of course a topic of philosophical debate for centuries. But, for the first time, we can actually address this, with brain simulation, and ask very systematic and rigorous questions, whether this theory could possibly be true.

02:24
The reason why the moon is huge on the horizon is simply because our perceptual bubble does not stretch out 380,000 kilometers. It runs out of space. And so what we do is we compare the buildings within our perceptual bubble, and we make a decision. We make a decision it's that big, even though it's not that big. And what that illustrates is that decisions are the key things that support our perceptual bubble. It keeps it alive. Without decisions you cannot see, you cannot think, you cannot feel.

03:01
And you may think that anesthetics work by sending you into some deep sleep, or by blocking your receptors so that you don't feel pain, but in fact most anesthetics don't work that way. What they do is they introduce a noise into the brain so that the neurons cannot understand each other. They are confused, and you cannot make a decision. So, while you're trying to make up your mind what the doctor, the surgeon, is doing while he's hacking away at your body, he's long gone. He's at home having tea.

(Laughter)

03:34
So, when you walk up to a door and you open it, what you compulsively have to do to perceive is to make decisions, thousands of decisions about the size of the room, the walls, the height, the objects in this room. 99 percent of what you see is not what comes in through the eyes. It is what you infer about that room. So I can say, with some certainty, "I think, therefore I am." But I cannot say, "You think, therefore you are," because "you" are within my perceptual bubble.

04:15
Now, we can speculate and philosophize this, but we don't actually have to for the next hundred years. We can ask a very concrete question: "Can the brain build such a perception?" Is it capable of doing it? Does it have the substance to do it? And that's what I'm going to describe to you today.

04:34
So, it took the universe 11 billion years to build the brain. It had to improve it a little bit. It had to add to the frontal part, so that you would have instincts, because they had to cope on land. But the real big step was the neocortex. It's a new brain. You needed it. The mammals needed it because they had to cope with parenthood, social interactions, complex cognitive functions.

05:03
So, you can think of the neocortex actually as the ultimate solution today, of the universe as we know it. It's the pinnacle, it's the final product that the universe has produced. It was so successful in evolution that from mouse to man it expanded about a thousandfold in terms of the numbers of neurons, to produce this almost frightening organ, structure. And it has not stopped its evolutionary path. In fact, the neocortex in the human brain is evolving at an enormous speed.

05:40
If you zoom into the surface of the neocortex, you discover that it's made up of little modules, G5 processors, like in a computer. But there are about a million of them. They were so successful in evolution that what we did was to duplicate them over and over and add more and more of them to the brain until we ran out of space in the skull. And the brain started to fold in on itself, and that's why the neocortex is so highly convoluted. We're just packing in columns, so that we'd have more neocortical columns to perform more complex functions.

06:12
So you can think of the neocortex actually as a massive grand piano, a million-key grand piano. Each of these neocortical columns would produce a note. You stimulate it; it produces a symphony. But it's not just a symphony of perception. It's a symphony of your universe, your reality.

06:32
Now, of course it takes years to learn how to master a grand piano with a million keys. That's why you have to send your kids to good schools, hopefully eventually to Oxford. But it's not only education. It's also genetics. You may be born lucky, where you know how to master your neocortical column, and you can play a fantastic symphony.

06:55
In fact, there is a new theory of autism called the "intense world" theory, which suggests that the neocortical columns are super-columns. They are highly reactive, and they are super-plastic, and so the autists are probably capable of building and learning a symphony which is unthinkable for us. But you can also understand that if you have a disease within one of these columns, the note is going to be off. The perception, the symphony that you create is going to be corrupted, and you will have symptoms of disease.

07:30
So, the Holy Grail for neuroscience is really to understand the design of the neocortical column -- and it's not just for neuroscience; it's perhaps to understand perception, to understand reality, and perhaps to even also understand physical reality.

07:47
So, what we did, for the past 15 years, was to dissect out the neocortex, systematically. It's a bit like going and cataloging a piece of the rainforest. How many trees does it have? What shapes are the trees? How many of each type of tree do you have? Where are they positioned? But it's a bit more than cataloging because you actually have to describe and discover all the rules of communication, the rules of connectivity, because the neurons don't just like to connect with any neuron. They choose very carefully who they connect with.

08:19
It's also more than cataloging because you actually have to build three-dimensional digital models of them. And we did that for tens of thousands of neurons, built digital models of all the different types of neurons we came across. And once you have that, you can actually begin to build the neocortical column.

08:39
And here we're coiling them up. But as you do this, what you see is that the branches intersect actually in millions of locations, and at each of these intersections they can form a synapse. And a synapse is a chemical location where they communicate with each other. And these synapses together form the network or the circuit of the brain.

09:07
Now, the circuit, you could also think of as the fabric of the brain. And when you think of the fabric of the brain, the structure, how is it built? What is the pattern of the carpet? You realize that this poses a fundamental challenge to any theory of the brain, and especially to a theory that says that there is some reality that emerges out of this carpet, out of this particular carpet with a particular pattern.

09:35
The reason is because the most important design secret of the brain is diversity. Every neuron is different. It's the same in the forest. Every pine tree is different. You may have many different types of trees, but every pine tree is different. And in the brain it's the same. So there is no neuron in my brain that is the same as another, and there is no neuron in my brain that is the same as in yours. And your neurons are not going to be oriented and positioned in exactly the same way. And you may have more or less neurons. So it's very unlikely that you got the same fabric, the same circuitry.

10:08
So, how could we possibly create a reality that we can even understand each other? Well, we don't have to speculate. We can look at all 10 million synapses now. We can look at the fabric. And we can change neurons. We can use different neurons with different variations. We can position them in different places, orient them in different places. We can use less or more of them. And when we do that what we discovered is that the circuitry does change. But the pattern of how the circuitry is designed does not.

10:41
So, the fabric of the brain, even though your brain may be smaller, bigger, it may have different types of neurons, different morphologies of neurons, we actually do share the same fabric. And we think this is species-specific, which means that that could explain why we can't communicate across species.

11:01
So, let's switch it on. But to do it, what you have to do is you have to make this come alive. We make it come alive with equations, a lot of mathematics. And, in fact, the equations that make neurons into electrical generators were discovered by two Cambridge Nobel Laureates. So, we have the mathematics to make neurons come alive.

11:20
We also have the mathematics to describe how neurons collect information, and how they create a little lightning bolt to communicate with each other. And when they get to the synapse, what they do is they effectively, literally, shock the synapse. It's like an electrical shock that releases the chemicals from these synapses. And we've got the mathematics to describe this process. So we can describe the communication between the neurons.

11:49
There literally are only a handful of equations that you need to simulate the activity of the neocortex. But what you do need is a very big computer. And in fact you need one laptop to do all the calculations just for one neuron. So you need 10,000 laptops.

12:06
So where do you go? You go to IBM, and you get a supercomputer, because they know how to take 10,000 laptops and put it into the size of a refrigerator. So now we have this Blue Gene supercomputer. We can load up all the neurons, each one on to its processor, and fire it up, and see what happens. Take the magic carpet for a ride.
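
The mapping described here (every neuron on its own processor, all stepped forward in parallel) can be caricatured with Python's multiprocessing. This is a toy sketch: a leaky integrate-and-fire neuron stands in for the full compartmental model, eight worker processes stand in for Blue Gene's processors, and every name and parameter is invented for the example:

```python
from multiprocessing import Pool

DT_MS = 0.1    # integration time step (assumed)
STEPS = 1000   # 100 ms of simulated time

def simulate_neuron(args):
    """Integrate one leaky integrate-and-fire neuron, a toy
    stand-in for the full model running on one processor."""
    neuron_id, input_current = args
    v, v_rest, v_thresh, tau_ms = -65.0, -65.0, -50.0, 10.0
    spike_times = []
    for step in range(STEPS):
        v += DT_MS * ((v_rest - v) / tau_ms + input_current)
        if v >= v_thresh:               # threshold crossing: a spike
            spike_times.append(step * DT_MS)
            v = v_rest                  # reset after the spike
    return neuron_id, spike_times

if __name__ == "__main__":
    # One worker per neuron, in the spirit of "each one on to its
    # processor" (8 toy neurons here, not 10,000 laptops' worth).
    jobs = [(i, 1.0 + 0.2 * i) for i in range(8)]
    with Pool(processes=8) as pool:
        for neuron_id, spike_times in pool.map(simulate_neuron, jobs):
            print(f"neuron {neuron_id}: {len(spike_times)} spikes")
```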

12:28
Here we activate it. And this gives the first glimpse of what is happening in your brain when there is a stimulation. It's the first view. Now, when you look at that the first time, you think, "My god. How is reality coming out of that?"

12:44
But, in fact, you can start, even though we haven't trained this neocortical column to create a specific reality. But we can ask, "Where is the rose?" We can ask, "Where is it inside, if we stimulate it with a picture?" Where is it inside the neocortex? Ultimately it's got to be there if we stimulated it with it.

13:08
So, the way that we can look at that is to ignore the neurons, ignore the synapses, and look just at the raw electrical activity. Because that is what it's creating. It's creating electrical patterns. So when we did this, we indeed, for the first time, saw these ghost-like structures: electrical objects appearing within the neocortical column. And it's these electrical objects that are holding all the information about whatever stimulated it. And then when we zoomed into this, it's like a veritable universe.

13:47
So the next step is just to take these brain coordinates and to project them into perceptual space. And if you do that, you will be able to step inside the reality that is created by this machine, by this piece of the brain.

14:08
So, in summary, I think that the universe may have -- it's possible -- evolved a brain to see itself, which may be a first step in becoming aware of itself. There is a lot more to do to test these theories, and to test any other theories. But I hope that you are at least partly convinced that it is not impossible to build a brain. We can do it within 10 years, and if we do succeed, we will send to TED, in 10 years, a hologram to talk to you. Thank you.

(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7