Jeff Hawkins: How brain science will change computing

207,857 views ・ 2007-05-23

TED



00:25
I do two things: I design mobile computers and I study brains. Today's talk is about brains and -- (Audience member cheers) Yay! I have a brain fan out there. (Laughter) If I could have my first slide, you'll see the title of my talk and my two affiliations. So what I'm going to talk about is why we don't have a good brain theory, why it's important that we develop one, and what we can do about it. I'll try to do all that in 20 minutes.

00:50
I have two affiliations. Most of you know me from my Palm and Handspring days, but I also run a nonprofit scientific research institute called the Redwood Neuroscience Institute in Menlo Park. We study theoretical neuroscience and how the neocortex works. I'm going to talk all about that. I have one slide on my other life, the computer life, and that's this slide here. These are some of the products I've worked on over the last 20 years, starting from the very original laptop to some of the first tablet computers and so on, ending up most recently with the Treo, and we're continuing to do this. I've done this because I believe mobile computing is the future of personal computing, and I'm trying to make the world a little bit better by working on these things.

01:27
But this was, I admit, all an accident. I really didn't want to do any of these products. Very early in my career I decided I was not going to be in the computer industry. But before I tell you about that, I just have to tell you about this picture of Graffiti I picked off the web the other day. I was looking for a picture of Graffiti, that little text input language. I found a website dedicated to teachers who want to make script-writing things across the top of their blackboard, and they had added Graffiti to it, and I'm sorry about that. (Laughter)

01:54
So what happened was, when I was young and got out of engineering school at Cornell in '79, I went to work for Intel and was in the computer industry, and three months into that, I fell in love with something else. I said, "I made the wrong career choice here," and I fell in love with brains. This is not a real brain. This is a picture of one, a line drawing. And I don't remember exactly how it happened, but I have one recollection, which was pretty strong in my mind.

02:22
In September of 1979, Scientific American came out with a single-topic issue about the brain. It was one of their best issues ever. They talked about the neuron, development, disease, vision and all the things you might want to know about brains. It was really quite impressive. One might've had the impression we knew a lot about brains. But the last article in that issue was written by Francis Crick of DNA fame. Today is, I think, the 50th anniversary of the discovery of DNA. And he wrote a story basically saying, this is all well and good, but you know, we don't know diddly squat about brains, and no one has a clue how they work, so don't believe what anyone tells you. This is a quote from that article: "What is conspicuously lacking" -- he's a very proper British gentleman -- "What is conspicuously lacking is a broad framework of ideas in which to interpret these different approaches." I thought the word "framework" was great. He didn't say we didn't have a theory. He said we don't even know how to begin to think about it. We don't even have a framework. We are in the pre-paradigm days, if you want to use Thomas Kuhn.

03:19
So I fell in love with this. I said, look: we have all this knowledge about brains -- how hard can it be? It's something we can work on in my lifetime; I could make a difference. So I tried to get out of the computer business, into the brain business. First, I went to MIT, where the AI lab was. I said, I want to build intelligent machines too, but I want to study how brains work first. And they said, "Oh, you don't need to do that. You're just going to program computers, that's all." I said, you really ought to study brains. They said, "No, you're wrong." I said, "No, you're wrong," and I didn't get in. (Laughter)

03:49
I was a little disappointed -- pretty young -- but I went back again a few years later, this time in California, and I went to Berkeley. And I said, I'll go in from the biological side. So I got in the PhD program in biophysics. I was like, I'm studying brains now. Well, I want to study theory. They said, "You can't study theory about brains. You can't get funded for that. And as a graduate student, you can't do that." So I said, oh my gosh. I was depressed; I said, but I can make a difference in this field. I went back to the computer industry and said, I'll have to work here for a while. That's when I designed all those computer products. (Laughter) I said, I want to do this for four years, make some money, I was having a family, and I would mature a bit, and maybe the business of neuroscience would mature a bit. Well, it took longer than four years. It's been about 16 years. But I'm doing it now, and I'm going to tell you about it.

04:39
So why should we have a good brain theory? Well, there are lots of reasons people do science. The most basic one is, people like to know things. We're curious, and we go out and get knowledge. Why do we study ants? It's interesting. Maybe we'll learn something useful, but it's interesting and fascinating. But sometimes a science has other attributes which make it really interesting. Sometimes a science will tell us something about ourselves; it'll tell us who we are. Evolution did this and Copernicus did this, where we have a new understanding of who we are. And after all, we are our brains. My brain is talking to your brain. Our bodies are hanging along for the ride, but my brain is talking to your brain. And if we want to understand who we are and how we feel and perceive, we need to understand brains. Another thing is, sometimes science leads to big societal benefits, technologies, or businesses or whatever. This is one, too, because when we understand how brains work, we'll be able to build intelligent machines. That's a good thing on the whole, with tremendous benefits to society, just like a fundamental technology.

05:36
So why don't we have a good theory of brains? People have been working on it for 100 years. Let's first take a look at what normal science looks like. This is normal science: a nice balance between theorists and experimentalists. The theorist says, "I think this is what's going on," and the experimentalist says, "You're wrong." It goes back and forth; this works in physics, this works in geology. But if this is normal science, what does neuroscience look like? This is what neuroscience looks like: we have this mountain of data, which is anatomy, physiology and behavior. You can't imagine how much detail we know about brains. There were 28,000 people who went to the neuroscience conference this year, and every one of them is doing research in brains. A lot of data, but no theory. There's a little wimpy box on top there. Theory has not played a role in any sort of grand way in the neurosciences. And it's a real shame.

06:24
Now, why has this come about? If you ask neuroscientists why this is the state of affairs, first, they'll admit it. But if you ask them, they say there are various reasons we don't have a good brain theory. Some say we still don't have enough data; we need more information; there are all these things we don't know. Well, I just told you there's data coming out of your ears. We have so much information, we don't even know how to organize it. What good is more going to do? Maybe we'll be lucky and discover some magic thing, but I don't think so. This is a symptom of the fact that we just don't have a theory. We don't need more data; we need a good theory.

06:56
Another one is, sometimes people say, "Brains are so complex, it'll take another 50 years." I even think Chris said something like this yesterday, something like, it's one of the most complicated things in the universe. That's not true -- you're more complicated than your brain. You've got a brain. And although the brain looks very complicated, things look complicated until you understand them. That's always been the case. So we can say, my neocortex, the part of the brain I'm interested in, has 30 billion cells. But, you know what? It's very, very regular. In fact, it looks like it's the same thing repeated over and over again. It's not as complex as it looks. That's not the issue.

07:29
Some people say, brains can't understand brains. Very Zen-like. Woo. (Laughter) You know, it sounds good, but why? I mean, what's the point? It's just a bunch of cells. You understand your liver. It's got a lot of cells in it too, right? So, you know, I don't think there's anything to that. And finally, some people say, "I don't feel like a bunch of cells -- I'm conscious. I've got this experience, I'm in the world. I can't be just a bunch of cells." Well, people used to believe there was a life force to be living, and we now know that's really not true at all. And there's really no evidence, other than that people just disbelieve that cells can do what they do. So some people have fallen into the pit of metaphysical dualism, some really smart people, too, but we can reject all that. (Laughter)

08:15
No, there's something else, something really fundamental, and it is: another reason why we don't have a good brain theory is that we have an intuitive, strongly held but incorrect assumption that has prevented us from seeing the answer. There's something we believe that just seems obvious, but it's wrong. Now, there's a history of this in science, and before I tell you what it is, I'll tell you about that history.

08:38
Look at other scientific revolutions -- the solar system, that's Copernicus; Darwin's evolution; and tectonic plates, that's Wegener. They all have a lot in common with brain science. First, they had a lot of unexplained data. A lot of it. But it got more manageable once they had a theory. The best minds were stumped -- really smart people. We're not smarter now than they were then; it just turns out it's really hard to think of things, but once you've thought of them, it's easy to understand. My daughters understood these three theories, in their basic framework, in kindergarten. It's not that hard -- here's the apple, here's the orange, the Earth goes around, that kind of stuff. Another thing is, the answer was there all along, but we kind of ignored it because of this obvious thing. It was an intuitive, strongly held belief that was wrong.

09:22
In the case of the solar system, the idea that the Earth is spinning, the surface is going a thousand miles an hour, and it's going through the solar system at a million miles an hour -- this is lunacy; we all know the Earth isn't moving. Do you feel like you're moving a thousand miles an hour? If you said Earth was spinning around in space and was huge, they would lock you up; that's what they did back then. So it was intuitive and obvious. Now, what about evolution? Evolution, same thing. We taught our kids the Bible says God created all these species: cats are cats, dogs are dogs, people are people, plants are plants; they don't change. Noah put them on the ark in that order, blah, blah. The fact is, if you believe in evolution, we all have a common ancestor. We all have a common ancestor with the plant in the lobby! This is what evolution tells us. And it's true. It's kind of unbelievable. And the same thing about tectonic plates. All the mountains and the continents are kind of floating around on top of the Earth. It doesn't make any sense.

10:15
So what is the intuitive but incorrect assumption that's kept us from understanding brains? I'll tell you. It'll seem obvious that it's correct. That's the point. Then I'll make an argument why that assumption is incorrect. The intuitive but obvious thing is: somehow, intelligence is defined by behavior; we're intelligent because of how we do things and how we behave intelligently. And I'm going to tell you that's wrong. Intelligence is defined by prediction. I'm going to work you through this in a few slides and give you an example of what this means.

10:45
Here's a system. Engineers and scientists like to look at systems like this. They say, we have a thing in a box, and we have its inputs and outputs. The AI people said, the thing in the box is a programmable computer, because it's equivalent to a brain. We'll feed it some inputs and get it to do something, have some behavior. Alan Turing defined the Turing test, which essentially says, we'll know if something's intelligent if it behaves identically to a human -- a behavioral metric of what intelligence is that has stuck in our minds for a long time.

11:12
Reality, though -- I call it real intelligence. Real intelligence is built on something else. We experience the world through a sequence of patterns, and we store them, and we recall them. When we recall them, we match them up against reality, and we're making predictions all the time. It's an internal metric; there's an internal metric about us, saying, do we understand the world, am I making predictions, and so on. You're all being intelligent right now, but you're not doing anything. Maybe you're scratching yourself, but you're not doing anything. But you're being intelligent; you're understanding what I'm saying. Because you're intelligent and you speak English, you know the word at the end of this ... sentence. The word came to you; you make these predictions all the time. What I'm saying is, the internal prediction is the output of the neocortex, and somehow, prediction leads to intelligent behavior.

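To make that store-recall-predict loop concrete, here is a minimal sketch in Python. It is not Hawkins's model; the `SequenceMemory` class, its context size, and the counting scheme are illustrative assumptions. It memorizes sequences of patterns and, given the most recent patterns, recalls what followed them before:

```python
from collections import Counter, defaultdict

class SequenceMemory:
    """Toy stand-in for the memory Hawkins describes: store sequences
    of patterns, then recall what came next in similar situations."""

    def __init__(self, context_size=2):
        self.context_size = context_size
        # Maps a short context of recent patterns to counts of what followed it.
        self.transitions = defaultdict(Counter)

    def store(self, sequence):
        """Memorize a sequence of patterns (words, notes, sensations ...)."""
        k = self.context_size
        for i in range(len(sequence) - k):
            context = tuple(sequence[i:i + k])
            self.transitions[context][sequence[i + k]] += 1

    def predict(self, recent):
        """Match the recent past against memory and predict the next pattern."""
        context = tuple(recent[-self.context_size:])
        following = self.transitions.get(context)
        if not following:
            return None  # nothing memorized for this context
        return following.most_common(1)[0][0]

memory = SequenceMemory()
memory.store("you know the word at the end of this sentence".split())
# The word comes to you before it is said:
print(memory.predict(["of", "this"]))  # -> 'sentence'
```

The internal metric Hawkins mentions falls out naturally: compare each prediction against what actually arrives, and the mismatch rate tells you how well you understand the world.
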
11:57
Here's how that happens. Let's start with a non-intelligent brain. I'll argue a non-intelligent brain; we'll call it an old brain. And we'll say it's a non-mammal, like a reptile, say, an alligator; we have an alligator. And the alligator has some very sophisticated senses. It's got good eyes and ears and touch senses and so on, a mouth and a nose. It has very complex behavior. It can run and hide. It has fears and emotions. It can eat you. It can attack. It can do all kinds of stuff. But we don't consider the alligator very intelligent, not in a human sort of way. But it has all this complex behavior already.

12:34
Now in evolution, what happened? First thing that happened in evolution with mammals is we started to develop a thing called the neocortex. I'm going to represent the neocortex by this box on top of the old brain. Neocortex means "new layer." It's a new layer on top of your brain. It's the wrinkly thing on the top of your head that got wrinkly because it got shoved in there and doesn't fit. (Laughter) Literally, it's about the size of a table napkin and doesn't fit, so it's wrinkly. Now, look at how I've drawn this. The old brain is still there. You still have that alligator brain. You do. It's your emotional brain. It's all those gut reactions you have.

13:08
On top of it, we have this memory system called the neocortex. And the memory system is sitting over the sensory part of the brain. So as the sensory input comes in and feeds the old brain, it also goes up into the neocortex. And the neocortex is just memorizing. It's sitting there saying, I'm going to memorize all the things going on: where I've been, people I've seen, things I've heard, and so on. And in the future, when it sees something similar to that again, in a similar environment, or the exact same environment, it'll start playing it back: "Oh, I've been here before, and when you were here before, this happened next." It allows you to predict the future. It literally feeds the signals back into your brain; they'll let you see what's going to happen next, will let you hear the word "sentence" before I said it. And it's this feeding back into the old brain that will allow you to make more intelligent decisions. This is the most important slide of my talk, so I'll dwell on it a little. All the time you say, "Oh, I can predict things," so if you're a rat and you go through a maze, and you learn the maze, the next time you're in one, you have the same behavior. But suddenly, you're smarter; you say, "I recognize this maze, I know which way to go; I've been here before; I can envision the future." That's what it's doing.

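A cartoon of that architecture in code may help; the class structure, reflex table, and learned association below are invented for illustration, not a claim about the actual circuitry. Sensory input drives the reactive old brain directly, while the neocortex recalls what followed the same input in the past and feeds that prediction back into the decision:

```python
class OldBrain:
    """Reactive, alligator-style brain: fixed responses to the current input."""
    REFLEXES = {"predator_scent": "hide", "food_smell": "approach"}

    def react(self, sensation):
        return self.REFLEXES.get(sensation, "wander")

class Neocortex:
    """Memory layer on top: memorizes what followed each sensation before."""
    def __init__(self):
        self.memory = {}  # sensation -> what happened next, last time

    def observe(self, sensation, what_came_next):
        self.memory[sensation] = what_came_next

    def predict(self, sensation):
        return self.memory.get(sensation)

def decide(old_brain, neocortex, sensation):
    """The prediction feeds back: if memory says this sensation preceded
    something important, the old brain reacts to the predicted future."""
    predicted = neocortex.predict(sensation)
    return old_brain.react(predicted or sensation)

old, new = OldBrain(), Neocortex()
new.observe("rustling_leaves", "predator_scent")  # last time, rustling preceded a predator
print(decide(old, new, "rustling_leaves"))  # -> 'hide', before the predator appears
```
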
14:18
This is true for all mammals -- in humans, it got a lot worse. Humans actually developed the front of the neocortex, called the anterior part of the neocortex. And nature did a little trick. It copied the posterior, the back part, which is sensory, and put it in the front. Humans uniquely have the same mechanism on the front, but we use it for motor control. So we're now able to do very sophisticated motor planning, things like that. I don't have time to explain, but to understand how a brain works, you have to understand how the first part of the mammalian neocortex works, how it is we store patterns and make predictions.

14:50
Let me give you a few examples of predictions. I already said the word "sentence." In music, if you've heard a song before, when you hear it, the next note pops into your head already -- you anticipate it. With an album, at the end of a song, the next song pops into your head. It happens all the time, you make predictions.

15:07
I have this thing called the "altered door" thought experiment. It says, you have a door at home; when you're here, I'm changing it -- I've got a guy back at your house right now, moving the door around, moving your doorknob over two inches. When you go home tonight, you'll put your hand out, reach for the doorknob, notice it's in the wrong spot and go, "Whoa, something happened." It may take a second, but something happened. I can change your doorknob in other ways -- make it larger, smaller, change its brass to silver, make it a lever. I can change the door: put colors on, put windows in. I can change a thousand things about your door, and in the two seconds you take to open it, you'll notice something has changed.

15:42
Now, the engineering approach, the AI approach to this, is to build a door database with all the door attributes. And as you go up to the door, we check them off one at a time: door, door, color ... We don't do that. Your brain doesn't do that. Your brain is making constant predictions all the time about what will happen in your environment. As I put my hand on this table, I expect to feel it stop. When I walk, every step, if I missed it by an eighth of an inch, I'll know something has changed. You're constantly making predictions about your environment.

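The contrast is easy to sketch. In the hypothetical comparison below (the attribute names and the remembered door are invented), the database approach re-verifies the full checklist on every approach, while the predictive approach only surfaces whatever contradicts what memory said to expect:

```python
# Two ways to notice that your door has changed. Attribute names are illustrative.
DOOR_ATTRIBUTES = ["height", "color", "knob_side", "knob_height_in", "material"]

def database_approach(observed, door_database):
    """AI-style: walk the whole attribute checklist, item by item, every time."""
    for attr in DOOR_ATTRIBUTES:
        if observed[attr] != door_database[attr]:
            return f"checklist failed on {attr}"
    return "all attributes verified"

def predictive_approach(observed, memory):
    """Brain-style: each sense carries a prediction; only surprises surface."""
    surprises = {attr: seen for attr, seen in observed.items()
                 if memory.get(attr) != seen}
    return f"whoa, something happened: {surprises}" if surprises else "nothing to notice"

memory = {"height": 80, "color": "white", "knob_side": "left",
          "knob_height_in": 36, "material": "wood"}
tonight = dict(memory, knob_height_in=38)   # the guy moved your doorknob two inches
print(database_approach(tonight, memory))   # -> checklist failed on knob_height_in
print(predictive_approach(tonight, memory)) # -> whoa, something happened: {'knob_height_in': 38}
```

Both notice the change; the point is where the work lives. The checklist is explicit, serial effort, whereas the predictions run constantly in the background and only a violated one rises to attention.
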
16:09
I'll talk about vision briefly. This is a picture of a woman. When we look at people, our eyes saccade two to three times a second. We're not aware of it, but our eyes are always moving. When we look at a face, we typically go from eye to eye to nose to mouth. When your eye moves from eye to eye, if there was something else there, like a nose, you'd see a nose where an eye is supposed to be and go, "Oh, shit!" (Laughter) "There's something wrong about this person." That's because you're making a prediction. It's not like you just look over and say, "What am I seeing? A nose? OK." No, you have an expectation of what you're going to see. Every single moment.

16:42
And finally, let's think about how we test intelligence. We test it by prediction: What is the next word in this ...? This is to this as this is to this. What is the next number in this sequence? Here are three views of an object. What's the fourth one? That's how we test it. It's all about prediction.

16:57
So what is the recipe for brain theory? First of all, we have to have the right framework. And the framework is a memory framework, not a computational or behavior framework. It's a memory framework: How do you store and recall these sequences of patterns? It's spatiotemporal patterns. Then, in that framework, you take a bunch of theoreticians -- biologists generally are not good theoreticians. Not always, but generally, there's not a good history of theory in biology. I've found the best people to work with are physicists, engineers and mathematicians, who tend to think algorithmically. Then they have to learn the anatomy and the physiology. You have to make these theories very realistic in anatomical terms. Anyone who tells you their theory about how the brain works and doesn't tell you exactly how it's working and how the wiring works -- that's not a theory. And that's what we do at the Redwood Neuroscience Institute. I'd love to tell you we're making fantastic progress in this thing, and I expect to be back on this stage sometime in the not too distant future to tell you about it. I'm really excited; this is not going to take 50 years.

17:55
What will brain theory look like? First of all, it's going to be about memory. Not like computer memory -- not at all like computer memory. It's very different. It's a memory of very high-dimensional patterns, like the things that come from your eyes. It's also memory of sequences: you cannot learn or recall anything outside of a sequence. A song must be heard in sequence over time, and you must play it back in sequence over time. And these sequences are auto-associatively recalled, so if I see something, I hear something, it reminds me of it, and it plays back automatically. It's an automatic playback. And prediction of future inputs is the desired output.

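Auto-associative recall means a partial or noisy cue is enough to pull back the whole stored sequence, which then plays forward on its own. Here is a minimal sketch under those assumptions; the set-overlap matching rule and data layout are mine, not a description of the cortical mechanism:

```python
def overlap(a, b):
    """Crude similarity between two patterns, represented as sets of active features."""
    return len(a & b) / max(len(a | b), 1)

class AutoAssociativeMemory:
    """Stores sequences of high-dimensional patterns; a partial cue
    recalls the best match and replays the sequence from that point."""

    def __init__(self, threshold=0.5):
        self.sequences = []
        self.threshold = threshold

    def store(self, sequence):
        self.sequences.append([set(p) for p in sequence])

    def recall(self, cue):
        cue = set(cue)
        best_score, best_seq, best_i = 0.0, None, 0
        for seq in self.sequences:
            for i, pattern in enumerate(seq):
                score = overlap(cue, pattern)
                if score > best_score:
                    best_score, best_seq, best_i = score, seq, i
        if best_seq is None or best_score < self.threshold:
            return None  # the cue doesn't remind us of anything
        return best_seq[best_i:]  # automatic playback of the rest of the sequence

memory = AutoAssociativeMemory()
memory.store([{"do"}, {"re"}, {"mi"}, {"fa"}])
# A fragment of the song reminds you of it, and the rest plays back in order:
print(memory.recall({"re"}))  # -> [{'re'}, {'mi'}, {'fa'}]
```

Note that playback here is inherently sequential, matching the point that you cannot recall a song except in order over time.
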
18:27
And as I said, the theory must be biologically accurate, it must be testable, and you must be able to build it. If you don't build it, you don't understand it. One more slide. What is this going to result in? Are we really going to build intelligent machines? Absolutely. And it's going to be different than people think. No doubt in my mind that it's going to happen.

18:47
First of all, we're going to build this stuff out of silicon. The same techniques we use to build silicon computer memories, we can use here. But they're very different types of memories. And we'll attach these memories to sensors, and the sensors will experience real-live, real-world data and learn about their environment. Now, it's very unlikely the first things you'll see are like robots. Not that robots aren't useful; people can build robots. But the robotics part is the hardest part. That's old brain. That's really hard. The new brain is easier than the old brain. So first we'll do things that don't require a lot of robotics. You're not going to see C-3PO. You're going to see things more like intelligent cars that really understand what traffic is and what driving is, and have learned that cars with their blinkers on for half a minute probably aren't going to turn. (Laughter) We can also do intelligent security systems. Anytime we're basically using our brain but not doing a lot of mechanics -- those are the things that will happen first. But ultimately, the world's the limit.

19:42
I don't know how this will turn out. I know a lot of people who invented the microprocessor. And if you talk to them, they knew what they were doing was really significant, but they didn't really know what was going to happen. They couldn't anticipate cell phones and the Internet and all this kind of stuff. They just knew, "We're going to build calculators and traffic-light controllers. But it's going to be big!" In the same way, brain science and these memories are going to be a very fundamental technology, and it will lead to unbelievable changes in the next 100 years. And I'm most excited about how we're going to use them in science. So I think that's all my time -- I'm over, and I'm going to end my talk right there.