A brain in a supercomputer | Henry Markram

513,021 views · 2009-10-15

TED



ืžืชืจื’ื: Gad Amit ืžื‘ืงืจ: Ido Dekkers
00:18
Our mission is to build a detailed, realistic computer model of the human brain. And we've done, in the past four years, a proof of concept on a small part of the rodent brain, and with this proof of concept we are now scaling the project up to reach the human brain.

00:39
Why are we doing this? There are three important reasons. The first is, it's essential for us to understand the human brain if we do want to get along in society, and I think that it is a key step in evolution. The second reason is, we cannot keep doing animal experimentation forever, and we have to embody all our data and all our knowledge into a working model. It's like a Noah's Ark. It's like an archive. And the third reason is that there are two billion people on the planet that are affected by mental disorder, and the drugs that are used today are largely empirical. I think that we can come up with very concrete solutions on how to treat disorders.
01:29
Now, even at this stage, we can use the brain model to explore some fundamental questions about how the brain works. And here, at TED, for the first time, I'd like to share with you how we're addressing one theory -- there are many theories -- one theory of how the brain works. So, this theory is that the brain creates, builds, a version of the universe, and projects this version of the universe, like a bubble, all around us. Now, this is of course a topic of philosophical debate for centuries. But, for the first time, we can actually address this, with brain simulation, and ask very systematic and rigorous questions, whether this theory could possibly be true.
02:24
The reason why the moon is huge on the horizon is simply because our perceptual bubble does not stretch out 380,000 kilometers. It runs out of space. And so what we do is we compare the buildings within our perceptual bubble, and we make a decision. We make a decision it's that big, even though it's not that big. And what that illustrates is that decisions are the key things that support our perceptual bubble. It keeps it alive. Without decisions you cannot see, you cannot think, you cannot feel.
03:01
And you may think that anesthetics work by sending you into some deep sleep, or by blocking your receptors so that you don't feel pain, but in fact most anesthetics don't work that way. What they do is they introduce a noise into the brain so that the neurons cannot understand each other. They are confused, and you cannot make a decision. So, while you're trying to make up your mind what the doctor, the surgeon, is doing while he's hacking away at your body, he's long gone. He's at home having tea. (Laughter)
03:34
So, when you walk up to a door and you open it, what you compulsively have to do to perceive is to make decisions, thousands of decisions about the size of the room, the walls, the height, the objects in this room. 99 percent of what you see is not what comes in through the eyes. It is what you infer about that room.
03:59
So I can say, with some certainty, "I think, therefore I am." But I cannot say, "You think, therefore you are," because "you" are within my perceptual bubble. Now, we can speculate and philosophize this, but we don't actually have to for the next hundred years. We can ask a very concrete question: "Can the brain build such a perception?" Is it capable of doing it? Does it have the substance to do it? And that's what I'm going to describe to you today.
04:34
So, it took the universe 11 billion years to build the brain. It had to improve it a little bit. It had to add to the frontal part, so that you would have instincts, because they had to cope on land. But the real big step was the neocortex. It's a new brain. You needed it. The mammals needed it because they had to cope with parenthood, social interactions, complex cognitive functions. So, you can think of the neocortex actually as the ultimate solution today, of the universe as we know it. It's the pinnacle, it's the final product that the universe has produced.
05:19
It was so successful in evolution that from mouse to man it expanded about a thousandfold in terms of the numbers of neurons, to produce this almost frightening organ, structure. And it has not stopped its evolutionary path. In fact, the neocortex in the human brain is evolving at an enormous speed.

05:40
If you zoom into the surface of the neocortex, you discover that it's made up of little modules, G5 processors, like in a computer. But there are about a million of them. They were so successful in evolution that what we did was to duplicate them over and over and add more and more of them to the brain until we ran out of space in the skull. And the brain started to fold in on itself, and that's why the neocortex is so highly convoluted. We're just packing in columns, so that we'd have more neocortical columns to perform more complex functions.
06:12
So you can think of the neocortex actually as a massive grand piano, a million-key grand piano. Each of these neocortical columns would produce a note. You stimulate it; it produces a symphony. But it's not just a symphony of perception. It's a symphony of your universe, your reality.

06:32
Now, of course it takes years to learn how to master a grand piano with a million keys. That's why you have to send your kids to good schools, hopefully eventually to Oxford. But it's not only education. It's also genetics. You may be born lucky, where you know how to master your neocortical column, and you can play a fantastic symphony. In fact, there is a new theory of autism called the "intense world" theory, which suggests that the neocortical columns are super-columns. They are highly reactive, and they are super-plastic, and so the autists are probably capable of building and learning a symphony which is unthinkable for us. But you can also understand that if you have a disease within one of these columns, the note is going to be off. The perception, the symphony that you create is going to be corrupted, and you will have symptoms of disease.
07:30
So, the Holy Grail for neuroscience is really to understand the design of the neocortical column -- and it's not just for neuroscience; it's perhaps to understand perception, to understand reality, and perhaps to even also understand physical reality.
07:47
So, what we did, for the past 15 years, was to dissect out the neocortex, systematically. It's a bit like going and cataloging a piece of the rainforest. How many trees does it have? What shapes are the trees? How many of each type of tree do you have? Where are they positioned? But it's a bit more than cataloging, because you actually have to describe and discover all the rules of communication, the rules of connectivity, because the neurons don't just like to connect with any neuron. They choose very carefully who they connect with. It's also more than cataloging, because you actually have to build three-dimensional digital models of them. And we did that for tens of thousands of neurons, built digital models of all the different types of neurons we came across.

08:33
And once you have that, you can actually begin to build the neocortical column. And here we're coiling them up. But as you do this, what you see is that the branches intersect actually in millions of locations, and at each of these intersections they can form a synapse. And a synapse is a chemical location where they communicate with each other. And these synapses together form the network, or the circuit, of the brain.
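The intersections he describes are what modelers call "touches": points where an axonal branch passes close enough to a dendrite that a synapse could form there. A minimal sketch of that idea, assuming point-sampled morphologies and an invented 3-micron threshold (illustrative only, not the project's actual pipeline):

```python
# Touch detection sketch: flag a potential synapse wherever an axon sample
# point passes within `touch_dist` microns of a dendrite sample point.
# Coordinates and threshold below are invented for illustration.
import math

def distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def find_touches(axon_pts, dendrite_pts, touch_dist=3.0):
    """Return (axon_index, dendrite_index) pairs closer than touch_dist."""
    return [(i, j)
            for i, a in enumerate(axon_pts)
            for j, d in enumerate(dendrite_pts)
            if distance(a, d) <= touch_dist]

# Toy morphologies: a few sample points along one axon and one dendrite.
axon = [(0.0, 0.0, 0.0), (10.0, 2.0, 0.0), (20.0, 4.0, 1.0)]
dendrite = [(9.0, 0.0, 0.0), (30.0, 30.0, 30.0)]
print(find_touches(axon, dendrite))  # -> [(1, 0)]: one candidate synapse
```

A real reconstruction works with full 3D segments rather than points, and would then prune these geometric candidates against the connectivity rules he mentions -- the "they choose very carefully who they connect with" step.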
09:07
Now, the circuit, you could also think of as the fabric of the brain. And when you think of the fabric of the brain, the structure, how is it built? What is the pattern of the carpet? You realize that this poses a fundamental challenge to any theory of the brain, and especially to a theory that says that there is some reality that emerges out of this carpet, out of this particular carpet with a particular pattern.

09:35
The reason is because the most important design secret of the brain is diversity. Every neuron is different. It's the same in the forest. Every pine tree is different. You may have many different types of trees, but every pine tree is different. And in the brain it's the same. So there is no neuron in my brain that is the same as another, and there is no neuron in my brain that is the same as in yours. And your neurons are not going to be oriented and positioned in exactly the same way. And you may have more or less neurons. So it's very unlikely that you got the same fabric, the same circuitry.
10:08
So, how could we possibly create a reality in which we can even understand each other? Well, we don't have to speculate. We can look at all 10 million synapses now. We can look at the fabric. And we can change neurons. We can use different neurons with different variations. We can position them in different places, orient them in different places. We can use less or more of them. And when we do that, what we discovered is that the circuitry does change. But the pattern of how the circuitry is designed does not.

10:41
So, the fabric of the brain, even though your brain may be smaller or bigger, and it may have different types of neurons, different morphologies of neurons, we actually do share the same fabric. And we think this is species-specific, which means that that could explain why we can't communicate across species.
11:01
So, let's switch it on. But to do it, you have to make this come alive. We make it come alive with equations, a lot of mathematics. And, in fact, the equations that make neurons into electrical generators were discovered by two Cambridge Nobel Laureates. So, we have the mathematics to make neurons come alive. We also have the mathematics to describe how neurons collect information, and how they create a little lightning bolt to communicate with each other. And when they get to the synapse, what they do is they effectively, literally, shock the synapse. It's like an electrical shock that releases the chemicals from these synapses. And we've got the mathematics to describe this process. So we can describe the communication between the neurons.
11:49
There literally are only a handful of equations that you need to simulate the activity of the neocortex. But what you do need is a very big computer. And in fact you need one laptop to do all the calculations just for one neuron. So you need 10,000 laptops. So where do you go? You go to IBM, and you get a supercomputer, because they know how to take 10,000 laptops and put them into the size of a refrigerator. So now we have this Blue Gene supercomputer. We can load up all the neurons, each one onto its own processor, and fire it up, and see what happens.
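A toy sketch of that "one neuron per processor" scheme: each MPI rank integrates a single neuron (a leaky integrate-and-fire unit here, far simpler than the Hodgkin-Huxley model above, to keep the sketch short) and all ranks exchange spikes after every time step. All parameters and the script name are invented for illustration:

```python
# Toy "one neuron per processor" scheme: every MPI rank owns one
# leaky integrate-and-fire neuron and all ranks exchange spikes each step.
# All parameters are invented for illustration.
from mpi4py import MPI
import random

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

dt, tau = 0.1, 10.0                    # time step and membrane tau, ms
v_thresh, v_reset, w = 1.0, 0.0, 0.05  # threshold, reset, synaptic weight
random.seed(rank)                      # diversity: per-neuron input drive
drive = 0.09 + 0.04 * random.random()

v, n_spikes = 0.0, 0
for _ in range(1000):                  # simulate 100 ms
    v += dt * (-v / tau) + drive * dt  # leaky integration plus drive
    spiked = v >= v_thresh
    if spiked:
        v, n_spikes = v_reset, n_spikes + 1
    # Collective spike exchange: every rank learns who fired this step.
    spikes = comm.allgather(spiked)
    # Add a synaptic kick for every other neuron that spiked.
    v += w * sum(1 for i, s in enumerate(spikes) if s and i != rank)

print(f"rank {rank}/{size}: {n_spikes} spikes")
```

Run it with, say, `mpiexec -n 8 python toy_column.py` (the script name is hypothetical). The per-step allgather is the bluntest possible communication pattern; a real simulator would route spikes only to the neurons actually wired to the sender.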
12:25
Take the magic carpet for a ride. Here we activate it. And this gives the first glimpse of what is happening in your brain when there is a stimulation. It's the first view.

12:37
Now, when you look at that the first time, you think, "My god. How is reality coming out of that?" But, in fact, even though we haven't trained this neocortical column to create a specific reality, we can ask, "Where is the rose?" We can ask, "Where is it inside, if we stimulate it with a picture?" Where is it inside the neocortex? Ultimately it's got to be there if we stimulated it with it.
13:08
So, the way that we can look at that is to ignore the neurons, ignore the synapses, and look just at the raw electrical activity. Because that is what it's creating. It's creating electrical patterns. So when we did this, we indeed, for the first time, saw these ghost-like structures: electrical objects appearing within the neocortical column. And it's these electrical objects that are holding all the information about whatever stimulated it. And then when we zoomed into this, it's like a veritable universe.
13:47
So the next step is just to take these brain coordinates and to project them into perceptual space. And if you do that, you will be able to step inside the reality that is created by this machine, by this piece of the brain.
14:08
So, in summary, I think that the universe may have -- it's possible -- evolved a brain to see itself, which may be a first step in becoming aware of itself. There is a lot more to do to test these theories, and to test any other theories. But I hope that you are at least partly convinced that it is not impossible to build a brain. We can do it within 10 years, and if we do succeed, we will send to TED, in 10 years, a hologram to talk to you. Thank you. (Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7