Jeff Hawkins: How brain science will change computing

208,097 views ใƒป 2007-05-23

TED


ืื ื ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ืœืžื˜ื” ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ.

ืžืชืจื’ื: Oran Tzuman ืžื‘ืงืจ: Uri Yaffe
00:25
I do two things:
0
25476
1151
ืื ื™ ืขื•ืฉื” ืฉื ื™ ื“ื‘ืจื™ื. ืื ื™ ืžืชื›ื ืŸ ืžื—ืฉื‘ื™ื ื ื™ื™ื“ื™ื ื•ื—ื•ืงืจ ืืช ื”ืžื—.
00:26
I design mobile computers and I study brains.
1
26651
2118
00:28
Today's talk is about brains and -- (Audience member cheers)
2
28793
2930
ื”ื”ืจืฆืื” ื”ื™ื•ื ื”ื™ื ืขืœ ื”ืžื— ื•...
00:31
Yay! I have a brain fan out there.
3
31747
1817
ื™ืฉ ืœื ื• ืžืขืจื™ืฆื™ ืžื— ื›ืืŸ!
00:33
(Laughter)
4
33588
3147
(ืฆื—ื•ืง)
ืื ืืคืฉืจ ืœื”ืฆื™ื’ ืืช ื”ืฉืงื•ืคื™ืช ื”ืจืืฉื•ื ื”
00:36
If I could have my first slide,
5
36759
1555
ื•ืชืจืื• ืืช ื ื•ืฉื ื”ื”ืจืฆืื” ื•ืืช ืฉืชื™ ื”ื”ืฉืชื™ื›ื•ื™ื•ืช ืฉืœื™
00:38
you'll see the title of my talk and my two affiliations.
6
38338
2849
00:41
So what I'm going to talk about is why we don't have a good brain theory,
7
41211
3468
ืžื” ืฉืื ื™ ื”ื•ืœืš ืœื“ื‘ืจ ืขืœื™ื• ื”ื•ื ืžื“ื•ืข ืื™ืŸ ืœื ื• ืชื™ืื•ืจื™ืช ืžื— ื˜ื•ื‘ื”,
00:44
why it is important that we should develop one
8
44703
2277
ืžื“ื•ืข ื—ืฉื•ื‘ ืฉื ืคืชื— ืื—ืช ื›ื–ื• ื•ืžื” ืื ื—ื ื• ื™ื›ื•ืœื™ื ืœืขืฉื•ืช ื‘ืขื ื™ื™ืŸ.
00:47
and what we can do about it.
9
47004
1483
00:48
I'll try to do all that in 20 minutes.
10
48511
1824
ื•ืื ืกื” ืœืขืฉื•ืช ืืช ื›ืœ ื–ื” ื‘ืขืฉืจื™ื ื“ืงื•ืช. ื™ืฉ ืœื™ ืฉืชื™ ื”ืฉืชื™ื›ื•ื™ื•ืช.
00:50
I have two affiliations.
11
50359
1151
00:51
Most of you know me from my Palm and Handspring days,
12
51534
2532
ืจื•ื‘ื›ื ืžื›ื™ืจื™ื ืื•ืชื™ ืžื™ืžื™ื™ ื‘ืคืืœื ื•ื”ื ื“ืกืคืจื™ื ื’
00:54
but I also run a nonprofit scientific research institute
13
54090
2683
ืื‘ืœ ืื ื™ ื’ื ืžืคืขื™ืœ ืžื›ื•ืŸ ืžื“ืขื™ ืœืœื ืžื˜ืจื•ืช ืจื•ื•ื—
00:56
called the Redwood Neuroscience Institute in Menlo Park.
14
56797
2632
ืžื•ืกื“ ืจื“-ื•ื•ื“ ืœืžื“ืขื™ ื”ืžื— ื‘ืคืืจืง ืžืœื ื•,
00:59
We study theoretical neuroscience and how the neocortex works.
15
59453
3388
ื•ืื ื• ื—ื•ืงืจื™ื ืชื™ืื•ืจื™ื•ืช ื‘ืžื“ืขื™ ื”ืžื—
ื•ืื™ืš ืงืœื™ืคืช ื”ืžื— ืขื•ื‘ื“ืช.
01:02
I'm going to talk all about that.
16
62865
1598
ื•ืขืœ ื–ื” ืื“ื‘ืจ.
01:04
I have one slide on my other life, the computer life,
17
64487
2745
ื™ืฉ ืœื™ ืฉืงื•ืคื™ืช ืื—ืช ืขืœ ื”ื—ื™ื™ื ื”ืื—ืจื™ื ืฉืœื™, ื—ื™ื™ ื”ืžื—ืฉื‘ื™ื,
01:07
and that's this slide here.
18
67256
1301
01:08
These are some of the products I've worked on over the last 20 years,
19
68581
3268
ืืœื” ื›ืžื” ืžื”ืžื•ืฆืจื™ื ืฉืขื‘ื“ืชื™ ืขืœื™ื”ื ื‘ืขืฉืจื™ื ืฉื ื™ื ื”ืื—ืจื•ื ื•ืช,
01:11
starting from the very original laptop
20
71873
1842
ื”ื—ืœ ืžื”ืœืคื˜ื•ืค ื”ืจืืฉื•ื ื™ ื•ืขื“ ืžื—ืฉื‘ื™ ื›ืฃ ื”ื™ื“ ื”ืจืืฉื•ื ื™ื™ื
01:13
to some of the first tablet computers
21
73739
1787
01:15
and so on, ending up most recently with the Treo,
22
75550
2298
ื•ื›ืš ื”ืœืื”, ืขื“ ื”-Treo ืœืื—ืจื•ื ื”,
01:17
and we're continuing to do this.
23
77872
1532
ื•ืื ื• ืžืžืฉื™ื›ื™ื ืขื ื–ื”.
01:19
I've done this because I believe mobile computing
24
79428
2301
ื•ืขืฉื™ืชื™ ืืช ื–ื” ื›ื™ ืื ื™ ื‘ืืžืช ืžืืžื™ืŸ ืฉืžื—ืฉื‘ื™ื ื ื™ื™ื“ื™ื
01:21
is the future of personal computing,
25
81753
1724
ื”ื ื”ืขืชื™ื“ ืฉืœ ื”ืžื—ืฉื•ื‘ ื”ืื™ืฉื™, ื•ืื ื™ ืžื ืกื” ืœืขืฉื•ืช ืืช ื”ืขื•ืœื
01:23
and I'm trying to make the world a little bit better
26
83501
2454
ืžืขื˜ ื™ื•ืชืจ ื˜ื•ื‘ ืขืœ ื™ื“ื™ ืขื‘ื•ื“ื” ื‘ืชื—ื•ื ื”ื–ื”.
01:25
by working on these things.
27
85979
1296
01:27
But this was, I admit, all an accident.
28
87299
1874
ืื‘ืœ ืื ื™ ื—ื™ื™ื‘ ืœื”ื•ื“ื•ืช, ืฉื›ืœ ื–ื” ืงืจื” ื‘ื˜ืขื•ืช.
01:29
I really didn't want to do any of these products.
29
89197
2308
ื”ืืžืช ืฉืœื ืจืฆื™ืชื™ ืœืขืฉื•ืช ืืช ื›ืœ ื”ืžื•ืฆืจื™ื ื”ืืœื•
01:31
Very early in my career
30
91529
1382
ื•ืžื•ืงื“ื ืžืื•ื“ ื‘ืงืจื™ื™ืจื” ืฉืœื™ ื”ื—ืœื˜ืชื™
01:32
I decided I was not going to be in the computer industry.
31
92935
2690
ืฉืื ื™ ืœื ื”ื•ืœืš ืœื”ื™ื•ืช ื‘ืขืกืงื™ื™ ื”ืžื—ืฉื‘ื™ื.
01:35
Before that, I just have to tell you
32
95649
1721
ื•ืœืคื ื™ ืฉืืกืคืจ ืœื›ื ืขืœ ื›ืš, ืืกืคืจ
01:37
about this picture of Graffiti I picked off the web the other day.
33
97394
3108
ืขืœ ืื™ื•ืจ ืฉืœืงื—ืชื™ ืžื”ืจืฉืช ืœืคื ื™ ื–ืžืŸ ืžื”.
01:40
I was looking for a picture for Graffiti that'll text input language.
34
100526
3253
ื ื™ืกื™ืชื™ ืœื—ืคืฉ ืชืžื•ื ื” ืฉืœ ื’ืจืคื™ื˜ื™, ืžืขื™ื™ืŸ ื˜ืงืกื˜ ืžืขื ื™ื™ืŸ ืฉืœ ืฉืคื”
01:43
I found a website dedicated to teachers who want to make script-writing things
35
103803
3689
ื•ืžืฆืืชื™ ืืชืจ ืœื›ืชื™ื‘ืช ื˜ืงืกื˜ื™ื ื”ืžื™ื•ืขื“ ืœืžื•ืจื™ื
ืฉืจื•ืฆื™ื ืœื”ืฆื™ื’ ืžืฉืคื˜ ืงื‘ื•ืข ื‘ืจืืฉ ื”ืœื•ื—
01:47
across the top of their blackboard,
36
107516
1674
01:49
and they had added Graffiti to it, and I'm sorry about that.
37
109214
2833
ื•ืœื™ื“ ื”ื ื”ื•ืกื™ืคื• ื’ืจืคื™ื˜ื™, ื•ืื ื™ ืžืฆื˜ืขืจ ืขืœ ื›ืš.
01:52
(Laughter)
38
112071
2247
(ืฆื—ื•ืง)
01:54
So what happened was,
39
114342
1300
ืื– ื”ื™ื™ืชื™ ืฆืขื™ืจ ื•ื™ืฆืืชื™ ืžื‘ื™ืช ื”ืกืคืจ ืœื”ื ื“ืกื”
01:55
when I was young and got out of engineering school at Cornell in '79,
40
115666
4899
ื‘ืงื•ืจื ืœ ื‘ืฉื ืช 79 ื•ื”ื’ืขืชื™ ืœืขื‘ื•ื“ ื‘ืื™ื ื˜ืœ.
02:00
I went to work for Intel and was in the computer industry,
41
120589
3187
02:03
and three months into that, I fell in love with something else.
42
123800
3402
ื”ื™ื™ืชื™ ื‘ืขืกืงื™ ื”ืžื—ืฉื‘ื™ื, ื•ืชื•ืš ืฉืœื•ืฉื” ื—ื•ื“ืฉื™ื
ื”ืชืื”ื‘ืชื™ ื‘ืžืฉื”ื• ืื—ืจ, ื•ืืžืจืชื™ ืœืขืฆืžื™ "ืขืฉื™ืชื™ ื”ื—ืœื˜ื” ืžื•ื˜ืขืช",
02:07
I said, "I made the wrong career choice here,"
43
127226
3044
02:10
and I fell in love with brains.
44
130294
2239
ื•ื”ืชืื”ื‘ืชื™ ื‘ืžื—.
02:12
This is not a real brain.
45
132557
1533
ื–ื” ืœื ืžื— ืืžื™ืชื™. ื–ื” ืฆื™ื•ืจ ืฉืœ ืžื—.
02:14
This is a picture of one, a line drawing.
46
134114
2719
02:16
And I don't remember exactly how it happened,
47
136857
2119
ืื ื™ ืœื ื–ื•ื›ืจ ื‘ื“ื™ื•ืง ืื™ืš ื–ื” ืงืจื”,
02:19
but I have one recollection, which was pretty strong in my mind.
48
139000
3515
ืื‘ืœ ื™ืฉ ืœื™ ื–ื™ื›ืจื•ืŸ ืื—ื“ ื“ื™ ื—ื–ืง
02:22
In September of 1979,
49
142539
1610
ื‘ืกืคื˜ืžื‘ืจ 1979, ื”ืขืœื•ืŸ ืกื™ื™ื ื˜ื™ืคื™ืง ืืžืจื™ืงืŸ ื”ื•ืฆื™ื ืœืื•ืจ
02:24
Scientific American came out with a single-topic issue about the brain.
50
144173
3364
ืžื”ื“ื•ืจื” ืืš ื•ืจืง ืขืœ ืžื—. ื–ื” ื”ื™ื” ืžืžืฉ ื˜ื•ื‘.
02:27
It was one of their best issues ever.
51
147561
1938
ืื—ืช ื”ืžื”ื“ื•ืจื•ืช ื”ื˜ื•ื‘ื•ืช ืื™ ืคืขื. ื•ื”ื ื“ื™ื‘ืจื• ืขืœ ื”ื ื•ื™ืจื•ื ื™ื
02:29
They talked about the neuron, development, disease, vision
52
149523
2947
ืขืœ ื”ืชืคืชื—ื•ืช ืขืœ ืžื—ืœื•ืช ืจืื™ื” ื•ื›ืœ ื”ื“ื‘ืจื™ื
02:32
and all the things you might want to know about brains.
53
152494
2596
ืฉืืชื” ืจื•ืฆื” ืœื“ืขืช ืขืœ ื”ืžื—. ื–ื” ื”ื™ื” ื“ื™ ืžืจืฉื™ื.
02:35
It was really quite impressive.
54
155114
1502
02:36
One might've had the impression we knew a lot about brains.
55
156640
2772
ื•ื–ื” ื™ืฆืจ ืจื•ืฉื ืฉืื ื• ื™ื•ื“ืขื™ื ื“ื™ ื”ืจื‘ื” ืขืœ ื”ืžื—.
02:39
But the last article in that issue was written by Francis Crick of DNA fame.
56
159436
4195
ืืš ืืช ื”ืžืืžืจ ื”ืื—ืจื•ืŸ ื‘ืžื”ื“ื•ืจื” ื›ืชื‘ ืคืจื ืกื™ืก ืงืจื™ืง
02:43
Today is, I think, the 50th anniversary of the discovery of DNA.
57
163655
3024
ื”ื™ื•ื, ืื ื™ ื—ื•ืฉื‘, ื—ื’ื™ื’ื•ืช ื”ื—ืžื™ืฉื™ื ืœืชื’ืœื™ืช ื”ื“ื "ื.
02:46
And he wrote a story basically saying, this is all well and good,
58
166703
3075
ื•ื”ื•ื ื›ืชื‘ ืžืฉื”ื• ืžืขื™ืŸ ื–ื”,
ื–ื” ื”ื›ืœ ื™ืคื” ื•ื˜ื•ื‘, ืื‘ืœ ืืชื ื™ื•ื“ืขื™ื ืžื”,
02:49
but you know, we don't know diddly squat about brains,
59
169802
2743
ืื ื—ื ื• ืœื ื™ื•ื“ืขื™ื ื“ื‘ืจ ื•ื—ืฆื™ ื“ื‘ืจ ืขืœ ื”ืžื—
02:52
and no one has a clue how they work,
60
172569
1739
ื•ืœืืฃ ืื—ื“ ืื™ืŸ ืžื•ืฉื’ ืื™ืš ื”ื“ื‘ืจ ื”ื–ื” ืขื•ื‘ื“,
02:54
so don't believe what anyone tells you.
61
174332
1866
ืื– ืืœ ืชืืžื™ื ื• ืœืžื” ืฉืื•ืžืจื™ื ืœื›ื.
02:56
This is a quote from that article, he says:
62
176222
2165
ื”ื ื” ืฆื™ื˜ื•ื˜ ืžื”ืžืืžืจ ื”ื”ื•ื. ื”ื•ื ื›ื•ืชื‘, "ืžื” ืฉื—ืกืจ ื‘ืื•ืคืŸ ืžื•ื‘ื”ืง,"
02:58
"What is conspicuously lacking" -- he's a very proper British gentleman --
63
178411
4293
ื”ื•ื ืื“ื•ืŸ ื‘ืจื™ื˜ื™ ืžืื•ื“ ืชื™ืงื ื™, "ืžื” ืฉื—ืกืจ ื‘ืื•ืคืŸ ืžื•ื‘ื”ืง,
03:02
"What is conspicuously lacking is a broad framework of ideas
64
182728
2830
ื–ื• ืชื‘ื ื™ืช ืจื—ื‘ื” ืฉืœ ืจืขื™ื•ื ื•ืช ืฉื‘ืขื–ืจืชื” ื ื™ืชืŸ ื™ื”ื™ื” ืœืคืจืฉ ืืช ื”ื’ื™ืฉื•ืช ื”ืฉื•ื ื•ืช".
03:05
in which to interpret these different approaches."
65
185582
2352
03:07
I thought the word "framework" was great.
66
187958
1968
ื—ืฉื‘ืชื™ ืฉื”ืžื™ืœื” "ืชื‘ื ื™ืช" ื ื”ื“ืจืช.
03:09
He didn't say we didn't have a theory.
67
189950
1817
ื”ื•ื ืœื ืืžืจ ืื™ืŸ ืœื ื• ืืคื™ืœื• ืชืื•ืจื™ื”. ื”ื•ื ืืžืจ,
03:11
He says we don't even know how to begin to think about it.
68
191791
2725
ืื ื• ืœื ื™ื•ื“ืขื™ื ืื™ืš ื‘ื›ืœืœ ืœื”ืชื—ื™ืœ ืœื—ืฉื•ื‘ ืขืœ ื–ื” -
ืื™ืŸ ืœื ื• ืชื‘ื ื™ืช.
03:14
We don't even have a framework.
69
194540
1492
ืื ื—ื ื• ื‘ืขื™ื“ืŸ ืฉืœ ื”ืœืคื ื™-ืชื‘ื ื™ืช, ืื ืœื”ืฉืชืžืฉ ื‘ืžื™ืœื•ืชื™ื• ืฉืœ ืชื•ืžืก ืงื•ืŸ.
03:16
We are in the pre-paradigm days, if you want to use Thomas Kuhn.
70
196056
3050
ืื– ื”ืชืื”ื‘ืชื™ ื‘ื–ื” ื•ืืžืจืชื™,
03:19
So I fell in love with this.
71
199130
1339
03:20
I said, look: We have all this knowledge about brains -- how hard can it be?
72
200493
3575
ื™ืฉ ืœื ื• ืืช ื›ืœ ื”ื™ื“ืข ื”ื–ื” ืขืœ ื”ืžื—, ื›ืžื” ืงืฉื” ื–ื” ื›ื‘ืจ ื™ื›ื•ืœ ืœื”ื™ื•ืช?
03:24
It's something we can work on in my lifetime; I could make a difference.
73
204092
3438
ื•ื–ื” ืžืฉื”ื• ืฉืื ื™ ื™ื›ื•ืœ ืœืขื‘ื•ื“ ืขืœื™ื• ืขื•ื“ ื‘ื—ื™ื™ ื•ืœืชืจื•ื ื‘ื•,
03:27
So I tried to get out of the computer business, into the brain business.
74
207554
3619
ืื– ื ื™ืกื™ืชื™ ืœืฆืืช ืžืขืกืงื™ ื”ืžื—ืฉื‘ื™ื, ื•ืœื”ื›ื ืก ืœืขืกืงื™ ื”ืžื—.
03:31
First, I went to MIT, the AI lab was there.
75
211197
2004
ื‘ื”ืชื—ืœื”, ื ื™ื’ืฉืชื™ ืœืืž. ืื™ื™. ื˜ื™., ืœืžืขื‘ื“ืช ื”ืื ื˜ื™ืœื™ื’ื ืฆื™ื” ื”ืžืœืื›ื•ืชื™ืช,
03:33
I said, I want to build intelligent machines too,
76
213225
2395
ื•ืืžืจืชื™ ืœื”ื, ื’ื ืื ื™ ืจื•ืฆื” ืœื‘ื ื•ืช ืžื›ื•ื ื•ืช ื—ื›ืžื•ืช,
03:35
but I want to study how brains work first.
77
215644
2517
ืื‘ืœ ื”ื“ืจืš ืฉืื ื™ ืจื•ืฆื” ืœืขืฉื•ืช ื–ืืช, ื”ื™ื ืœืœืžื•ื“ ืื™ืš ื”ืžื— ืขื•ื‘ื“ ืงื•ื“ื.
03:38
And they said, "Oh, you don't need to do that.
78
218185
2306
ื•ื”ื ืืžืจื•, ืืชื” ืœื ืฆืจื™ืš ืœืขืฉื•ืช ืืช ื–ื”.
03:40
You're just going to program computers, that's all.
79
220515
2390
ืื ื• ืจืง ืžืชื›ื ืชื™ื ืžื—ืฉื‘ื™ื, ื–ื” ื›ืœ ืžื” ืฉืื ื—ื ื• ืฆืจื™ื›ื™ื ืœืขืฉื•ืช.
03:42
I said, you really ought to study brains.
80
222929
1963
ื•ืืžืจืชื™ ืœื”ื, ืœื, ืืชื ืžืžืฉ ืฆืจื™ื›ื™ื ืœืœืžื•ื“ ืขืœ ื”ืžื—. ื”ื ืืžืจื• ืœื,
03:44
They said, "No, you're wrong."
81
224916
1432
03:46
I said, "No, you're wrong," and I didn't get in.
82
226372
2246
ืืชื” ื˜ื•ืขื”. ื•ืืžืจืชื™, ืœื, ืืชื ื˜ื•ืขื™ื, ื•ืœื ื”ืชืงื‘ืœืชื™.
03:48
(Laughter)
83
228642
1078
(ืฆื—ื•ืง)
03:49
I was a little disappointed -- pretty young --
84
229744
2155
ืื– ื”ื™ื™ืชื™ ืงืฆืช ืžืื•ื›ื–ื‘ - ื“ื™ ืฆืขื™ืจ, ื—ื–ืจืชื™ ืฉื•ื‘
03:51
but I went back again a few years later,
85
231923
1936
ืœืื—ืจ ื›ืžื” ืฉื ื™ื ื•ื”ืคืขื ื–ื” ื”ื™ื” ืœืงืœื™ืคื•ืจื ื™ื”, ืœื‘ืจืงืœื™.
03:53
this time in California, and I went to Berkeley.
86
233883
2359
ืื– ืืžืจืชื™, ืื’ืฉ ืžื”ืฆื“ ื”ื‘ื™ื•ืœื•ื’ื™.
03:56
And I said, I'll go in from the biological side.
87
236266
2430
03:58
So I got in the PhD program in biophysics.
88
238720
3089
ื•ื ื›ื ืกืชื™ ืœืชื•ื›ื ื™ืช ื“ื•ืงื˜ื•ืจื˜ ื‘ื‘ื™ื•ืคื™ืกื™ืงื”, ื•ืืžืจืชื™ ืœืขืฆืžื™, ืžืฆื•ื™ืŸ,
04:01
I was like, I'm studying brains now. Well, I want to study theory.
89
241833
3410
ืื ื™ ืœื•ืžื“ ืขืœ ื”ืžื— ืขื›ืฉื™ื•, ื•ืื ื™ ืจื•ืฆื” ืœืœืžื•ื“ ืชื™ืื•ืจื™ื”.
04:05
They said, "You can't study theory about brains.
90
245267
2269
ื•ื”ื ืืžืจื•, ืื•ื™ ืœื, ืืชื” ืœื ื™ื›ื•ืœ ืœืœืžื•ื“ ืชืื•ืจื™ื” ืฉืœ ื”ืžื—.
04:07
You can't get funded for that.
91
247560
1995
ื–ื” ืœื ืžืฉื”ื• ืฉืืชื” ื™ื›ื•ืœ ืœืขืฉื•ืช. ืืชื” ืœื ื™ื›ื•ืœ ืœืงื‘ืœ ืชืงืฆื™ื‘ ืœื–ื”.
04:09
And as a graduate student, you can't do that."
92
249579
2155
ื•ื›ืชืœืžื™ื“ ืืชื” ืœื ื™ื›ื•ืœ ืœืขืฉื•ืช ืืช ื–ื”. ืื– ืืžืจืชื™ ืื•ื™ ื•ืื‘ื•ื™.
04:11
So I said, oh my gosh.
93
251758
1218
04:13
I was depressed; I said, but I can make a difference in this field.
94
253000
3155
ื”ื™ื™ืชื™ ืžืื•ื“ ืžืฆื•ื‘ืจื—. ื”ืจื™ ืื ื™ ื™ื›ื•ืœ ืœืชืจื•ื ืœืชื—ื•ื ื”ื–ื”.
ืื– ื—ื–ืจืชื™ ืœืขืกืงื™ ื”ืžื—ืฉื‘ื™ื
04:16
I went back in the computer industry
95
256179
2008
04:18
and said, I'll have to work here for a while.
96
258211
2105
ื•ืืžืจืชื™, ื ื• ื˜ื•ื‘, ืืขื‘ื•ื“ ืคื” ืœื–ืžืŸ ืžื”, ืืขืฉื” ืžืฉื”ื•.
04:20
That's when I designed all those computer products.
97
260340
2393
ื–ื• ื”ืชืงื•ืคื” ืฉืชื›ื ื ืชื™ ืืช ื›ืœ ื”ืžื•ืฆืจื™ื ื”ื”ื.
04:22
(Laughter)
98
262757
1301
(ืฆื—ื•ืง)
04:24
I said, I want to do this for four years, make some money,
99
264082
2894
ืื ื™ ืจื•ืฆื” ืœืขืฉื•ืช ืืช ื–ื” ื›ืืจื‘ืข ืฉื ื™ื, ืœื”ืจื•ื•ื™ื— ืงืฆืช ื›ืกืฃ,
04:27
I was having a family, and I would mature a bit,
100
267000
3976
ืœื”ืงื™ื ืžืฉืคื—ื”, ื•ืœื”ืชื‘ื’ืจ ืงืฆืช,
04:31
and maybe the business of neuroscience would mature a bit.
101
271000
2816
ื•ืื•ืœื™ ืชื—ื•ื ืžื“ืขื™ ื”ืžื— ื™ืชื‘ื’ืจ ืงืฆืช.
04:33
Well, it took longer than four years. It's been about 16 years.
102
273840
3001
ืื– ื–ื” ืœืงื— ื™ื•ืชืจ ืžืืจื‘ืข ืฉื ื™ื. ื–ื” ืœืงื— ืฉืฉ ืขืฉืจื” ืฉื ื™ื.
04:36
But I'm doing it now, and I'm going to tell you about it.
103
276865
2716
ืื‘ืœ ืื ื™ ืขื•ืฉื” ืืช ื–ื” ืขื›ืฉื™ื•, ื•ืืกืคืจ ืœื›ื ืขืœ ื–ื”.
04:39
So why should we have a good brain theory?
104
279605
2286
ืื– ืœืžื” ืื ื—ื ื• ืฆืจื™ื›ื™ื ืชื™ืื•ืจื™ื” ื˜ื•ื‘ื” ืขืœ ื”ืžื—?
04:41
Well, there's lots of reasons people do science.
105
281915
3102
ื™ืฉ ื”ืจื‘ื” ืกื™ื‘ื•ืช ืฉื‘ื’ืœืœืŸ ืื ืฉื™ื ืขื•ืกืงื™ื ื‘ืžื“ืข.
04:45
The most basic one is, people like to know things.
106
285041
2917
ืื—ืช - ื”ืกื™ื‘ื” ื”ื‘ืกื™ืกื™ืช ื‘ื™ื•ืชืจ - ืื ืฉื™ื ืจื•ืฆื™ื ืœื“ืขืช.
04:47
We're curious, and we go out and get knowledge.
107
287982
2195
ืื ื• ืกืงืจื ื™ื ื•ื™ื•ืฆืื™ื ืœืืกื•ืฃ ืžื™ื“ืข.
04:50
Why do we study ants? It's interesting.
108
290201
1866
ืœืžื” ืื ื• ื—ื•ืงืจื™ื ื ืžืœื™ื? ื›ื™ ื”ืŸ ืžืขื ื™ื™ื ื•ืช.
04:52
Maybe we'll learn something useful, but it's interesting and fascinating.
109
292091
3466
ืื•ืœื™ ื ืœืžื“ ืžืฉื”ื• ืžืžืฉ ืžื•ืขื™ืœ ืขืœื™ื”ืŸ, ืื‘ืœ ื–ื” ืžืขื ื™ื™ืŸ ื•ืžืกืงืจืŸ.
04:55
But sometimes a science has other attributes
110
295581
2057
ืื‘ืœ ืœืคืขืžื™ื, ืœืžื“ืข ื™ืฉ ืžืฉื”ื• ืื—ืจ
04:57
which makes it really interesting.
111
297662
1829
ืฉื”ื•ืคืš ืื•ืชื• ืœืžืื•ื“ ืžืื•ื“ ืžืขื ื™ื™ืŸ.
04:59
Sometimes a science will tell something about ourselves;
112
299515
2627
ืœืคืขืžื™ื ื”ืžื“ืข ื™ืืžืจ ืœื ื• ืžืฉื”ื• ืขืœ ืขืฆืžื ื•,
05:02
it'll tell us who we are.
113
302166
1224
ื”ื•ื ื™ืืžืจ ืœื ื• ืžื™ ืื ื—ื ื•.
05:03
Evolution did this and Copernicus did this,
114
303414
2752
ืœืขื™ืชื™ื ื ื“ื™ืจื•ืช, ื”ืื‘ื•ืœื•ืฆื™ื” ืขืฉืชื” ืืช ื–ื”, ืงื•ืคืจื ื™ืงื•ืก ืขืฉื” ืืช ื–ื”,
05:06
where we have a new understanding of who we are.
115
306190
2334
ื”ืžืงื•ื ืฉื‘ื• ื™ืฉ ืœื ื• ื”ื‘ื ื” ื—ื“ืฉื” ืขืœ ืžื”ื•ืชื ื•.
05:08
And after all, we are our brains. My brain is talking to your brain.
116
308548
3428
ื•ืื—ืจื™ ื”ื›ืœ, ืื ื—ื ื• ื–ื” ื”ืžื•ื—ื•ืช ืฉืœื ื•. ื”ืžื— ืฉืœื™ ืžื“ื‘ืจ ืœืžื— ืฉืœื›ื.
05:12
Our bodies are hanging along for the ride,
117
312000
2030
ื’ื•ืคื ื• ืžื ืฆืœ ืืช ื”ื”ื–ื“ืžื ื•ืช ืœื”ืฆื˜ืจืฃ, ืื‘ืœ ื”ืžื— ืฉืœื™ ืžื“ื‘ืจ ืœืžื— ืฉืœื›ื.
05:14
but my brain is talking to your brain.
118
314054
1825
05:15
And if we want to understand who we are and how we feel and perceive,
119
315903
3248
ื•ืื ืื ื—ื ื• ืจื•ืฆื™ื ืœื”ื‘ื™ืŸ ืžื™ ืื ื—ื ื• ื•ืžื” ืื ื• ืžืจื’ื™ืฉื™ื ื•ื—ื•ื•ื™ื,
ืื ื• ื—ื™ื™ื‘ื™ื ืœื”ื‘ื™ืŸ ืžื”ื• ื”ืžื—.
05:19
we need to understand brains.
120
319175
1391
05:20
Another thing is sometimes science leads to big societal benefits, technologies,
121
320590
3784
ื“ื‘ืจ ื ื•ืกืฃ ื”ื•ื ืœืคืขืžื™ื ื”ืžื“ืข
ืžื•ื‘ื™ืœ ืœืชื•ืขืœื•ืช ื—ื‘ืจืชื™ื•ืช ื•ื˜ื›ื ื•ืœื•ื’ื™ื•ืช ื’ื“ื•ืœื•ืช,
05:24
or businesses or whatever.
122
324398
1291
ืื• ืขืกืงื™ื ืื• ืœื ืžืฉื ื” ืžื”. ื•ื’ื ื–ื” ื—ืฉื•ื‘,
05:25
This is one, too, because when we understand how brains work,
123
325713
2878
ื›ื™ ืื ื ื‘ื™ืŸ ืื™ืš ื”ืžื— ืขื•ื‘ื“, ื ื•ื›ืœ
05:28
we'll be able to build intelligent machines.
124
328615
2064
ืœื‘ื ื•ืช ืžื›ื•ื ื•ืช ื—ื›ืžื•ืช, ื•ืื ื™ ื—ื•ืฉื‘ ืฉื‘ืื•ืคืŸ ืจื—ื‘ ื–ื” ื“ื‘ืจ ื˜ื•ื‘,
05:30
That's a good thing on the whole,
125
330703
1698
05:32
with tremendous benefits to society,
126
332425
1858
ื•ืชื”ื™ื” ืœื–ื” ืชื•ืขืœืช ืžื“ื”ื™ืžื” ืœื—ื‘ืจื”
05:34
just like a fundamental technology.
127
334307
1669
ื‘ื“ื™ื•ืง ื›ืžื• ื˜ื›ื ื•ืœื•ื’ื™ื” ืจืืฉื•ื ื™ืช.
05:36
So why don't we have a good theory of brains?
128
336000
2850
ืื– ืœืžื” ืื™ืš ืœื ื• ืชืื•ืจื™ื” ื˜ื•ื‘ื” ืขืœ ื”ืžื—?
05:38
People have been working on it for 100 years.
129
338874
2168
ื•ืื ืฉื™ื ืขื‘ื“ื• ืขืœ ื›ืš ืžืื” ืฉื ื”.
05:41
Let's first take a look at what normal science looks like.
130
341066
2719
ื‘ื•ืื• ื ืกืชื›ืœ ืงื•ื“ื ืื™ืš ื ืจืื” ืžื“ืข ื ื•ืจืžืœื™ .
05:43
This is normal science.
131
343809
1187
ื–ื” ืžื“ืข ื ื•ืจืžืœื™.
05:45
Normal science is a nice balance between theory and experimentalists.
132
345020
4074
ืžื“ืข ื ื•ืจืžืœื™ ื”ื•ื ืฉื•ื•ื™ ืžืฉืงืœ ื‘ื™ืŸ ืชืื•ืจื™ื” ืœืžื—ืงืจ.
05:49
The theorist guy says, "I think this is what's going on,"
133
349118
2691
ื”ืชืื•ืจืชื™ืงืŸ ืื•ืžืจ ืื ื™ ื—ื•ืฉื‘ ืฉื–ื” ืžื” ืฉืงื•ืจื”,
05:51
the experimentalist says, "You're wrong."
134
351833
1961
ื•ื”ื—ื•ืงืจ ืื•ืžืจ, ืœื, ืืชื” ื˜ื•ืขื”.
05:53
It goes back and forth, this works in physics, this in geology.
135
353818
3004
ื•ื”ื•ืœื›ื™ื ืœื›ืืŸ ื•ืœืฉื,
ื›ื›ื” ื–ื” ืขื•ื‘ื“ ื‘ืคื™ืกื™ืงื”. ื›ื›ื” ื–ื” ืขื•ื‘ื“ ื‘ื’ื™ืื•ืœื•ื’ื™ื”. ืื‘ืœ ืื ื–ื” ืžื“ืข ื”ื ื•ืจืžืœื™,
05:56
But if this is normal science, what does neuroscience look like?
136
356846
3009
ืื™ืš ื ืจืื™ื ืžื“ืขื™ ื”ืžื—?
05:59
This is what neuroscience looks like.
137
359879
1795
ื™ืฉ ืœื ื• ื”ืจ ืฉืœ ืžื™ื“ืข, ืขืœ ืื ื˜ื•ืžื™ื”, ืคื™ื–ื™ื•ืœื•ื’ื™ื” ื•ื”ืชื ื”ื’ื•ืช.
06:01
We have this mountain of data,
138
361698
1442
06:03
which is anatomy, physiology and behavior.
139
363164
2070
06:05
You can't imagine how much detail we know about brains.
140
365258
3194
ืืชื ืœื ื™ื›ื•ืœื™ื ืœืชืืจ ื›ืžื” ืžื™ื“ืข ืžืคื•ืจื˜ ื™ืฉ ืœื ื• ืขืœ ื”ืžื—.
06:08
There were 28,000 people who went to the neuroscience conference this year,
141
368476
3592
28,000 ืื ืฉื™ื ื”ื’ื™ืขื• ืœื›ื ืก ืžื“ืขื™ ื”ืžื— ื”ืฉื ื”,
06:12
and every one of them is doing research in brains.
142
372092
2363
ื•ื›ืœ ืื—ื“ ืžื”ื ื—ื•ืงืจ ืืช ื”ืžื—.
06:14
A lot of data, but no theory.
143
374479
1694
ื”ืžื•ืŸ ื ืชื•ื ื™ื. ืื‘ืœ ืื™ืŸ ืชืื•ืจื™ื”. ื”ืงื•ืคืกื ื”ืงื˜ื ื˜ื ื” ื”ื–ืืช ืœืžืขืœื”.
06:16
There's a little wimpy box on top there.
144
376197
2000
06:18
And theory has not played a role in any sort of grand way
145
378221
3382
ื•ื”ืชืื•ืจื™ื” ืืฃ ืคืขื ืœื ืฉื™ื—ืงื” ืชืคืงื™ื“ ืžื›ืจื™ืข ื‘ืžื“ืขื™ ื”ืžื—.
06:21
in the neurosciences.
146
381627
1429
06:23
And it's a real shame.
147
383080
1240
ื•ื–ื” ืžืžืฉ ื—ื‘ืœ. ืœืžื” ื–ื” ืงืจื”?
06:24
Now, why has this come about?
148
384344
1391
06:25
If you ask neuroscientists why is this the state of affairs,
149
385759
2988
ืื ืืชื ืฉื•ืืœื™ื ื—ื•ืงืจื™ ืžื— ืœืžื”?
06:28
first, they'll admit it.
150
388771
1246
ื”ื ืงื•ื“ื ื›ืœ ื™ื•ื“ื• ื‘ื›ืš. ืื‘ืœ ืื ืชืฉืืœ ืื•ืชื, ื”ื ื™ืืžืจื•
06:30
But if you ask them, they say,
151
390041
1485
06:31
there's various reasons we don't have a good brain theory.
152
391550
2732
ื•ื‘ื›ืŸ, ื™ืฉ ื›ืžื” ืกื™ื‘ื•ืช ืœื›ืš,
06:34
Some say we still don't have enough data,
153
394306
1969
ื—ืœืง ื™ืืžืจื•, ืื™ืŸ ืœื ื• ืžืกืคื™ืง ื ืชื•ื ื™ื,
06:36
we need more information, there's all these things we don't know.
154
396299
3059
ืื ื—ื ื• ืฆืจื™ื›ื™ื ืœืืกื•ืฃ ื™ื•ืชืจ ืžื™ื“ืข, ื™ืฉ ืขื“ื™ืŸ ื”ืจื‘ื” ื“ื‘ืจื™ื ืฉืื™ื ื ื• ื™ื•ื“ืขื™ื.
06:39
Well, I just told you there's data coming out of your ears.
155
399382
2841
ื•ื‘ื›ืŸ, ื”ืจื’ืข ืืžืจืชื™ ืœืš ืฉื›ืžื•ืช ื”ื ืชื•ื ื™ื ื ืฉืคื›ืช ืžื”ืื–ื ื™ื™ื.
06:42
We have so much information, we don't even know how to organize it.
156
402247
3164
ื™ืฉ ืœื ื• ื›ืœ ื›ืš ื”ืจื‘ื” ืžื™ื“ืข ืฉืื ื—ื ื• ืœื ื™ื•ื“ืขื™ื ืื™ืš ืœื”ืชื—ื™ืœ ืœืืจื’ืŸ ืื•ืชื•.
06:45
What good is more going to do?
157
405435
1438
ืื™ื–ื• ื˜ื•ื‘ื” ืชืฆื ืœื ื• ืžืขื•ื“?
06:46
Maybe we'll be lucky and discover some magic thing, but I don't think so.
158
406897
3448
ืื•ืœื™ ื™ื”ื™ื” ืœื ื• ืžื–ืœ ื•ื ืžืฆื ื“ื‘ืจ ืงืกื, ืื‘ืœ ืื ื™ ืœื ื—ื•ืฉื‘ ื›ืš
06:50
This is a symptom of the fact that we just don't have a theory.
159
410369
2973
ื–ื” ืœืžืขืฉื” ืชืกืžื™ืŸ ืœื”ืขื“ืจ ืชืื•ืจื™ื”.
06:53
We don't need more data, we need a good theory.
160
413366
2610
ืื ื—ื ื• ืœื ืฆืจื™ื›ื™ื ืขื•ื“ ื ืชื•ื ื™ื - ืื ื—ื ื• ืฆืจื™ื›ื™ื ืชืื•ืจื™ื” ื˜ื•ื‘ื” ื‘ืฉื‘ื™ืœื.
06:56
Another one is sometimes people say,
161
416000
1798
ืœืคืขืžื™ื ืื ืฉื™ื ืื•ืžืจื™ื, ื”ืžื— ื›ื” ืžื•ืจื›ื‘,
06:57
"Brains are so complex, it'll take another 50 years."
162
417822
3154
ื–ื” ื™ืงื— ืขื•ื“ ื—ืžื™ืฉื™ื ืฉื ื”.
07:01
I even think Chris said something like this yesterday, something like,
163
421000
3354
ืื ื™ ื—ื•ืฉื‘ ืฉื›ืจื™ืก ืืžืจ ืžืฉื”ื• ื›ื–ื” ืืชืžื•ืœ.
ืื ื™ ืœื ื‘ื˜ื•ื— ืžื” ืืžืจืช, ืื‘ืœ ืžืฉื”ื• ื›ื–ื”,
07:04
it's one of the most complicated things in the universe.
164
424378
2627
ื–ื” ืื—ื“ ื”ื“ื‘ืจื™ื ื”ืžืกื•ื‘ื›ื™ื ื‘ื™ื•ืชืจ ื‘ื™ืงื•ื. ื–ื” ืœื ื ื›ื•ืŸ.
07:07
That's not true -- you're more complicated than your brain.
165
427029
2790
ืืชื” ื™ื•ืชืจ ืžืกื•ื‘ืš ืžื”ืžื— ืฉืœืš. ื™ืฉ ืœืš ืžื—.
07:09
You've got a brain.
166
429843
1151
ื•ื’ื, ืœืžืจื•ืช ืฉื”ืžื— ื ืจืื” ืžืื•ื“ ืžืกื•ื‘ืš,
07:11
And although the brain looks very complicated,
167
431018
2150
ื“ื‘ืจื™ื ื ืจืื™ื ืžืกื•ื‘ื›ื™ื ืขื“ ืฉืžื‘ื™ื ื™ื ืื•ืชื.
07:13
things look complicated until you understand them.
168
433192
2336
07:15
That's always been the case.
169
435552
1335
ืชืžื™ื“ ื–ื” ื”ื™ื” ื›ื›ื”. ืื– ืžื” ืฉื™ื›ื•ืœื™ื ืœื•ืžืจ,
07:16
So we can say, my neocortex, the part of the brain I'm interested in,
170
436911
3243
ืฉื‘ืงืœื™ืคืช ื”ืžื—, ื”ื™ื ื”ื—ืœืง ืฉืื ื™ ืžืชืขื ื™ื™ืŸ ื‘ื•, ื™ืฉ 30 ืžื™ืœื™ืืจื“ ืชืื™ื.
07:20
has 30 billion cells.
171
440178
1152
07:21
But, you know what? It's very, very regular.
172
441354
2432
ืื‘ืœ ืืชื ื™ื•ื“ืขื™ื ืžื”? ื–ื” ืžืื•ื“ ืžืื•ื“ ืกื“ื™ืจ.
07:23
In fact, it looks like it's the same thing repeated over and over again.
173
443810
3394
ืœืžืขืฉื”, ื ืจืื” ืฉืื•ืชื• ื”ื“ื‘ืจ ื‘ื“ื™ื•ืง ื—ื•ื–ืจ ืขืœ ืขืฆืžื• ื›ืœ ื”ื–ืžืŸ .
07:27
It's not as complex as it looks. That's not the issue.
174
447228
2536
ื–ื” ืœื ืขื“ ื›ื“ื™ ื›ืš ืžืกื•ื‘ืš ื›ืžื• ืฉื–ื” ื ืจืื”. ืœื ื–ื” ื”ืขื ื™ื™ืŸ.
07:29
Some people say, brains can't understand brains.
175
449788
2287
ืื ืฉื™ื ืžืกื•ื™ืžื™ื ืื•ืžืจื™ื, ืžื•ื—ื•ืช ืœื ื™ื›ื•ืœื™ื ืœื”ื‘ื™ืŸ ืžื•ื—ื•ืช.
07:32
Very Zen-like. Woo.
176
452099
1988
ืžืื•ื“ "ื–ึตื ื™".
07:34
(Laughter)
177
454111
2188
(ืฆื—ื•ืง)
07:36
You know, it sounds good, but why? I mean, what's the point?
178
456323
2859
ื–ื” ื ืฉืžืข ื˜ื•ื‘, ืื‘ืœ ืœืžื”? ืžื” ื”ื ืงื•ื“ื”?
07:39
It's just a bunch of cells. You understand your liver.
179
459206
2569
ื”ื ืจืง ื›ืžื” ืชืื™ื. ืืชื” ื™ื›ื•ืœ ืœื”ื‘ื™ืŸ ืืช ื”ื›ื‘ื“ ืฉืœืš.
07:41
It's got a lot of cells in it too, right?
180
461799
1977
ื’ื ื‘ื• ื™ืฉ ื”ืžื•ืŸ ืชืื™ื, ื ื›ื•ืŸ?
07:43
So, you know, I don't think there's anything to that.
181
463800
2494
ืื ื™ ืœื ื—ื•ืฉื‘ ืฉื™ืฉ ื‘ื–ื” ืžืฉื”ื•.
07:46
And finally, some people say,
182
466318
2112
ื•ืœืกื™ื•ื, ืื ืฉื™ื ืžืกื•ื™ืžื™ื ืื•ืžืจื™ื, ืืชื” ื™ื•ื“ืข,
07:48
"I don't feel like a bunch of cells -- I'm conscious.
183
468454
2983
ืื ื™ ืœื ืžืจื’ื™ืฉ ืจืง ื›ืžื• ืชืื™ื. ืื ื™ ืžื•ื“ืข.
07:51
I've got this experience, I'm in the world.
184
471461
2069
ื™ืฉ ืœื™ ื—ื•ื•ื™ื•ืช, ืื ื™ ื‘ืขื•ืœื.
07:53
I can't be just a bunch of cells."
185
473554
1910
ืœื ื™ื›ื•ืœ ืœื”ื™ื•ืช ืฉืื ื™ ืจืง ืขืจื™ืžื” ืฉืœ ืชืื™ื. ื•ื‘ื›ืŸ,
07:55
Well, people used to believe there was a life force to be living,
186
475488
3223
ืคืขื ืื ืฉื™ื ื”ืืžื™ื ื• ืฉื™ืฉ ืื™ื–ืฉื”ื• ื›ื— ื—ื™ื™ื,
07:58
and we now know that's really not true at all.
187
478735
2409
ื•ื›ื™ื•ื ืื ื• ื™ื•ื“ืขื™ื ืฉื–ื” ืœื ืžืžืฉ ื ื›ื•ืŸ.
08:01
And there's really no evidence,
188
481168
1898
ื•ืื™ืŸ ื”ื•ื›ื—ื”, ืœืžืขื˜ ืื ืฉื™ื
08:03
other than that people just disbelieve that cells can do what they do.
189
483090
3374
ืฉืื™ื ื ืžืืžื™ื ื™ื ืฉืชืื™ื ื™ื›ื•ืœื™ื ืœืขืฉื•ืช ืžื” ืฉื”ื ืขื•ืฉื™ื.
08:06
So some people have fallen into the pit of metaphysical dualism,
190
486488
3041
ื›ืš ื”ืจื‘ื” ืื ืฉื™ื ื ืคืœื• ืœื‘ื•ืจ ื”ื“ื•ืืœื™ื–ื ื”ืžื˜ืืคื™ื–ื™
08:09
some really smart people, too, but we can reject all that.
191
489553
2730
ื’ื ื›ืžื” ืื ืฉื™ื ืžืžืฉ ื—ื›ืžื™ื, ืื‘ืœ ืื ื• ื™ื›ื•ืœื™ื ืœื“ื—ื•ืช ื–ืืช ืขืœ ื”ืกืฃ.
08:12
(Laughter)
192
492307
2895
(ืฆื—ื•ืง)
ืื ื™ ื”ื•ืœืš ืœื•ืžืจ ืœื›ื ืฉื™ืฉ ืžืฉื”ื• ืื—ืจ,
08:15
No, there's something else,
193
495226
1741
08:16
something really fundamental, and it is:
194
496991
1985
ื‘ืกื™ืกื™ ืžืื•ื“,
08:19
another reason why we don't have a good brain theory
195
499000
2451
ืฉื™ืฉ ืกื™ื‘ื” ืื—ืจืช ืœื›ืš ืฉืื™ืŸ ืœื ื• ืชื™ืื•ืจื™ืช ืžื— ื˜ื•ื‘ื”,
08:21
is because we have an intuitive, strongly held but incorrect assumption
196
501475
5535
ื•ื”ื™ื ื‘ื’ืœืœ ืฉืื ื• ืžื—ื–ื™ืงื™ื ื‘ืื™ื ื˜ื•ืื™ืฆื™ื” ื—ื–ืงื”,
ื•ื‘ื”ื ื—ื” ืœื ื ื›ื•ื ื” ืฉืžื•ื ืขืช ืžืืชื ื• ืœืจืื•ืช ืืช ื”ืชืฉื•ื‘ื”.
08:27
that has prevented us from seeing the answer.
197
507034
2112
08:29
There's something we believe that just, it's obvious, but it's wrong.
198
509170
3788
ื™ืฉื ื• ื“ื‘ืจ ืฉืื ื• ืžืืžื™ื ื™ื ื‘ื•, ืžื•ื‘ืŸ ืžืืœื™ื• ื‘ืฉื‘ื™ืœื ื•, ืืš ื”ื•ื ืฉื’ื•ื™.
08:32
Now, there's a history of this in science and before I tell you what it is,
199
512982
3566
ื™ืฉ ืœื›ืš ื”ื™ืกื˜ื•ืจื™ื” ื‘ืžื“ืข, ื•ืœืคื ื™ ืฉืื•ืžืจ ืœื›ื ืžื”ื•,
08:36
I'll tell you about the history of it in science.
200
516572
2299
ืืกืคืจ ืœื›ื ืžืขื˜ ืขืœ ื”ื”ืกื˜ื•ืจื™ื”.
08:38
Look at other scientific revolutions --
201
518895
1910
ื”ื‘ื™ื˜ื• ืขืœ ื›ืžื” ืžื”ืคื›ื•ืช ืžื“ืขื™ื•ืช ืื—ืจื•ืช,
08:40
the solar system, that's Copernicus,
202
520829
1879
ืื ื™ ืžื“ื‘ืจ ืขืœ ืžืขืจื›ืช ื”ืฉืžืฉ, ื–ื” ืงื•ืคืจื ื™ืงื•ืก,
08:42
Darwin's evolution, and tectonic plates, that's Wegener.
203
522732
2819
ื”ืื‘ื•ืœื•ืฆื™ื” ืฉืœ ื“ืจื•ื•ื™ืŸ, ื•ื”ืœื•ื—ื•ืช ื”ื˜ืงื˜ื•ื ื™ื ืฉืœ ื•ื’ื ืจ.
ื™ืฉ ืœื”ื ื”ืžื•ืŸ ืžื”ืžืฉื•ืชืฃ ืœืžื“ืขื™ ื”ืžื—.
08:46
They all have a lot in common with brain science.
204
526059
2295
08:48
First, they had a lot of unexplained data. A lot of it.
205
528378
2666
ืงื•ื“ื ื›ืœ ื”ื™ื• ืœื”ื ื”ืžื•ืŸ ื ืชื•ื ื™ื ืœื ืžื•ืกื‘ืจื™ื. ื”ืžื•ืŸ.
08:51
But it got more manageable once they had a theory.
206
531068
2794
ืื‘ืœ ื–ื” ื”ื™ื” ื”ืจื‘ื” ื™ื•ืชืจ ื‘ืจื•ืจ ื‘ืจื’ืข ืฉื”ื™ืชื” ืœื”ื ืชืื•ืจื™ื”.
08:53
The best minds were stumped -- really smart people.
207
533886
2807
ื”ืื ืฉื™ื ื”ื—ื›ืžื™ื ื‘ื™ื•ืชืจ ื”ื™ื• ืžื‘ื•ืœื‘ืœื™ื, ืื ืฉื™ื ืžืื•ื“ ืžืื•ื“ ื—ื›ืžื™ื.
08:56
We're not smarter now than they were then;
208
536717
2004
ืื ื• ืœื ื—ื›ืžื™ื ื™ื•ืชืจ ืขื›ืฉื™ื• ืžืžื” ืฉื”ื ื”ื™ื• ืื–
08:58
it just turns out it's really hard to think of things,
209
538745
2527
ืจืง ื”ืกืชื‘ืจ ืฉื–ื” ืžืื•ื“ ืงืฉื” ืœื—ืฉื•ื‘ ืขืœ ื“ื‘ืจื™ื,
09:01
but once you've thought of them, it's easy to understand.
210
541296
2676
ืื‘ืœ ื‘ืจื’ืข ืฉื—ืฉื‘ืช ืขืœื™ื”ื, ื–ื” ื“ื™ ืคืฉื•ื˜ ืœื”ื‘ื™ืŸ ืื•ืชื.
09:03
My daughters understood these three theories,
211
543996
2106
ื”ื‘ื ื•ืช ืฉืœื™ ื”ื‘ื™ื ื• ืืช ืฉืœื•ืฉืช ื”ืชื™ืื•ืจื™ื•ืช
ื‘ืฆื•ืจืชื ื”ื‘ืกื™ืกื™ืช, ื›ื‘ืจ ื‘ื’ืŸ ื”ื™ืœื“ื™ื.
09:06
in their basic framework, in kindergarten.
212
546126
2518
09:08
It's not that hard -- here's the apple, here's the orange,
213
548668
3266
ื•ื–ื” ืœื ื›ื–ื” ืงืฉื”, ื”ื ื” ื”ืชืคื•ื—, ื”ื ื” ื”ืชืคื•ื–,
09:11
the Earth goes around, that kind of stuff.
214
551958
2018
ื›ื“ื•ืจ ื”ืืจืฅ ืžืงื™ืฃ, ืžืฉื”ื• ื‘ืกื’ื ื•ืŸ ื”ื–ื”.
09:14
Another thing is the answer was there all along,
215
554000
2586
ืœื‘ืกื•ืฃ, ื”ืชืฉื•ื‘ื” ื”ื™ืชื” ืฉื ืœืื•ืจืš ื›ืœ ื”ื“ืจืš,
09:16
but we kind of ignored it because of this obvious thing.
216
556610
2779
ืื‘ืœ ื“ื™ ื”ืชืขืœืžื ื• ืžืžื ื”, ื‘ื’ืœืœ ื”ื“ื‘ืจ ื”ืžื•ื‘ืŸ ืžืืœื™ื•.
09:19
It was an intuitive, strongly held belief that was wrong.
217
559413
2850
ื–ื• ื”ื™ืชื” ืืžื•ื ื” ืื™ื ื˜ื•ืื™ื˜ื™ื‘ื™ืช ื—ื–ืงื” ืืš ืฉื’ื•ื™ื”.
09:22
In the case of the solar system,
218
562287
1690
ื‘ืžืงืจื” ืฉืœ ืžืขืจื›ืช ื”ืฉืžืฉ, ื”ืจืขื™ื•ืŸ ืฉื›ื“ื•ืจ ื”ืืจืฅ ืกื‘
09:24
the idea that the Earth is spinning,
219
564001
1760
09:25
the surface is going a thousand miles an hour,
220
565785
2191
ื•ืฉื›ื“ื•ืจ ื”ืืจืฅ ื ืข ื‘ืžื”ื™ืจื•ืช ืฉืœ ืืœืคื™ ืงื™ืœื•ืžื˜ืจื™ื ืœืฉืขื”,
09:28
and it's going through the solar system at a million miles an hour --
221
568000
3249
ื•ืกื‘ื™ื‘ ืžืขืจื›ืช ื”ืฉืžืฉ ื‘ืžื”ื™ืจื•ืช ืฉืœ ืžื™ืœื™ื•ื ื™ ืงื™ืœื•ืžื˜ืจื™ื ืœืฉืขื”.
09:31
this is lunacy; we all know the Earth isn't moving.
222
571273
2476
ื–ื” ื˜ื™ืจื•ืฃ. ื›ื•ืœื ื• ื™ื•ื“ืขื™ื ืฉื”ืงืจืงืข ืื™ื ื” ื–ื–ื”.
09:33
Do you feel like you're moving a thousand miles an hour?
223
573773
2877
ืืชื ืžืจื’ื™ืฉื™ื ืฉืืชื ื–ื–ื™ื ื‘ืžื”ื™ืจื•ืช ืฉืœ ืืœืคื™ ืงื™ืœื•ืžื˜ืจื™ื ืœืฉืขื”?
ื‘ืจื•ืจ ืฉืœื. ื•ืžื™ ืฉื”ื™ื” ืื•ืžืจ,
09:36
If you said Earth was spinning around in space and was huge --
224
576674
2919
ื”ื•ื ื—ื’ ื‘ื—ืœืœ ื•ื”ื•ื ื›ืœ ื›ืš ืขื ืง,
09:39
they would lock you up, that's what they did back then.
225
579617
2591
ื”ื ื”ื™ื• ื›ื•ืœืื™ื ืื•ืชื• ื•ื–ื” ืžื” ืฉื”ื ื”ื™ื• ืขื•ืฉื™ื.
(ืฆื—ื•ืง)
09:42
So it was intuitive and obvious. Now, what about evolution?
226
582232
3275
ื–ื• ื”ื™ืชื” ื”ื ื—ื” ื‘ืจื•ืจื” ืžืืœื™ื”. ื•ืžื” ืœื’ื‘ื™ ื”ืื‘ื•ืœื•ืฆื™ื”?
09:45
Evolution, same thing.
227
585531
1154
ื‘ืื•ืชื” ืžื™ื“ื”. ืœื™ืžื“ื ื• ืืช ื™ืœื“ื ื•, ืฉื”ืชื "ืš ืžืกืคืจ
09:46
We taught our kids the Bible says God created all these species,
228
586709
3080
ืฉืืœื•ื”ื™ื ื™ืฆืจ ืืช ื›ืœ ื”ื™ืฆื•ืจื™ื, ื—ืชื•ืœื™ื ื”ื ื—ืชื•ืœื™ื, ื›ืœื‘ื™ื ื”ื ื›ืœื‘ื™ื,
09:49
cats are cats; dogs are dogs; people are people; plants are plants;
229
589813
3143
ื‘ื ื™ ืื“ื ื”ื ื‘ื ื™ ืื“ื, ืฆืžื—ื™ื ื”ื ืฆืžื—ื™ื, ื”ื ืœื ืžืฉืชื ื™ื.
09:52
they don't change.
230
592980
1241
ื ื•ื— ืฉื ืื•ืชื ื‘ืชื™ื‘ื” ืœืคื™ ื”ืกื“ืจ, ื‘ืœื” ื‘ืœื” ื‘ืœื”.
09:54
Noah put them on the ark in that order, blah, blah.
231
594245
2649
09:56
The fact is, if you believe in evolution, we all have a common ancestor.
232
596918
3395
ื•ืœืžืขืฉื”, ืื ืืชื ืžืืžื™ื ื™ื ื‘ืื‘ื•ืœื•ืฆื™ื”, ืื– ืœื›ื•ืœื ื• ื™ืฉ ืื‘ ืงื“ืžื•ืŸ,
10:00
We all have a common ancestor with the plant in the lobby!
233
600337
3282
ืžืฉื•ืชืฃ ืขื ื”ืขืฆื™ืฅ ื‘ืœื•ื‘ื™.
10:03
This is what evolution tells us. And it's true. It's kind of unbelievable.
234
603643
3686
ื–ื” ืžื” ืฉื”ืื‘ื•ืœื•ืฆื™ื” ืžืกืคืจืช ืœื ื•. ื•ื–ื” ื ื›ื•ืŸ. ืื‘ืœ ืžืžืฉ ืœื ื™ืื•ืžืŸ.
10:07
And the same thing about tectonic plates.
235
607353
2557
ื•ืื•ืชื• ื”ื“ื‘ืจ ืขื ื”ืœื•ื—ื•ืช ื”ื˜ืงื˜ื•ื ื™ื,
10:09
All the mountains and the continents
236
609934
1722
ื›ืœ ื”ื”ืจื™ื ื•ื”ื™ื‘ืฉื•ืช ืžืจื—ืคื•ืช ืœื”ื
10:11
are kind of floating around on top of the Earth.
237
611680
2344
ืขืœ ืคื ื™ ื›ื“ื•ืจ ื”ืืจืฅ, ืžืžืฉ ืœื ื”ื’ื™ื•ื ื™.
10:14
It doesn't make any sense.
238
614048
1246
10:15
So what is the intuitive, but incorrect assumption,
239
615318
4601
ืื– ืžื”ื™ ื”ื”ื ื—ื” ื”ืื™ื ื˜ื•ืื™ื˜ื™ื‘ื™ืช ื”ืžื•ื˜ืขื™ืช,
10:19
that's kept us from understanding brains?
240
619943
1967
ืฉืžื ืขื” ืžืื™ืชื ื• ืœื”ื‘ื™ืŸ ืืช ื”ืžื—?
10:21
I'll tell you. It'll seem obvious that it's correct. That's the point.
241
621934
3293
ืื ื™ ื”ื•ืœืš ืœื•ืžืจ ืœื›ื ืื•ืชื”, ื•ื”ื™ื ืชื™ืฉืžืข ืžื•ื‘ื ืช ืžืืœื™ื”,
ื•ื–ื• ื”ื ืงื•ื“ื”, ื ื›ื•ืŸ? ืื—ืจ ื›ืš ืืฆื˜ืจืš ืœื˜ืขื•ืŸ ื˜ืขื ื”
10:25
Then I'll make an argument why you're incorrect on the other assumption.
242
625251
3434
ืžื“ื•ืข ืืชื ืฉื•ื’ื™ื ืœื’ื‘ื™ ืื•ืชื” ื”ื ื—ื”.
10:28
The intuitive but obvious thing is:
243
628709
1682
ื”ื”ื ื—ื” ื”ืื™ื ื˜ื•ืื™ื˜ื™ื‘ื™ืช ื”ืžื•ื‘ื ืช ืžืืœื™ื” ื”ื™ื ืฉื”ืื™ื ื˜ืœื™ื’ื ืฆื™ื”,
10:30
somehow, intelligence is defined by behavior;
244
630415
2314
ืžื•ื’ื“ืจืช ืขืœ ื™ื“ื™ ื”ืชื ื”ื’ื•ืช,
10:32
we're intelligent because of how we do things
245
632753
2350
ืฉืื ื• ืื™ื ื˜ืœื™ื’ื ื˜ื™ื ื‘ื–ื›ื•ืช ื”ื“ืจืš ืฉื‘ื” ืื ื• ืขื•ืฉื™ื ื“ื‘ืจื™ื
10:35
and how we behave intelligently.
246
635127
1572
ื”ื“ืจืš ืฉื‘ื” ืื ื• ืžืชื ื”ื’ื™ื ื‘ืฆื•ืจื” ืื™ื ื˜ืœื™ื’ื ื˜ื™ืช, ื•ืื ื™ ืื•ืžืจ ืœื›ื ืฉื–ื• ื˜ืขื•ืช.
10:36
And I'm going to tell you that's wrong.
247
636723
1879
10:38
Intelligence is defined by prediction.
248
638626
2131
ืฉืื™ื ื˜ืœื™ื’ื ืฆื™ื” ื”ื™ื ื ื™ื‘ื•ื™.
10:40
I'm going to work you through this in a few slides,
249
640781
2415
ื•ืืกื‘ื™ืจ ื–ืืช ื‘ืฉืงื•ืคื™ื•ืช ื”ื‘ืื•ืช,
10:43
and give you an example of what this means.
250
643220
2094
ืืชืŸ ืœื›ื ื“ื•ื’ืžื ืœืžื” ืื ื™ ืžืชื›ื•ื•ืŸ. ื”ื ื” ืžืขืจื›ืช.
10:45
Here's a system.
251
645338
1301
10:46
Engineers and scientists like to look at systems like this.
252
646663
2908
ืžื”ื ื“ืกื™ื ื•ืžื“ืขื ื™ื ืื•ื”ื‘ื™ื ืœื”ื‘ื™ื˜ ืขืœ ืžืขืจื›ื•ืช ื›ืืœื”.
10:49
They say, we have a thing in a box. We have its inputs and outputs.
253
649595
3163
ื”ื ืื•ืžืจื™ื, ื™ืฉื ื• ืžืฉื”ื• ื‘ืงื•ืคืกื, ื•ืœืงื•ืคืกื ื™ืฉ ืงืœื˜ ื•ืคืœื˜.
10:52
The AI people said, the thing in the box is a programmable computer,
254
652782
3240
ืื ืฉื™ ื”ืื™ื ื˜ืœื™ื’ื ืฆื™ื” ื”ืžืœืื›ื•ืชื™ืช ืื•ืžืจื™ื, ื”ื“ื‘ืจ ืฉื‘ืงื•ืคืกื ื”ื•ื ืžื—ืฉื‘
10:56
because it's equivalent to a brain.
255
656046
1679
ืฉื”ื•ื ื”ืžืงื‘ื™ืœ ืœืžื—, ื ื˜ืขืŸ ืื•ืชื• ื‘ืงืœื˜ ื›ืœืฉื”ื•
10:57
We'll feed it some inputs and get it to do something, have some behavior.
256
657749
3506
ื•ื ื’ืจื•ื ืœื• ืœืขืฉื•ืช ืžืฉื”ื•, ืœื”ืชื ื”ื’ ื‘ืฆื•ืจื” ืžืกื•ื™ื™ืžืช.
ื•ืืœืŸ ื˜ื•ืจื™ื ื’ ื”ื’ื“ื™ืจ ืืช ืžื‘ื—ืŸ ื˜ื•ืจื™ื ื’, ืฉืขืงืจื•ื ื™ืช ืื•ืžืจ
11:01
Alan Turing defined the Turing test, which essentially says,
257
661279
2822
ื ื“ืข ืื ืžืฉื”ื• ืื™ื ื˜ืœื™ื’ื ื˜ื™ ืื ื™ืชื ื”ื’ ื–ื”ื” ืœืื“ื.
11:04
we'll know if something's intelligent if it behaves identical to a human --
258
664125
3553
ื”ืชื ื”ื’ื•ืช ื”ืฉืงื•ืœื” ืœืื™ื ื˜ืœื™ื’ื ืฆื™ื”.
11:07
a behavioral metric of what intelligence is
259
667702
2106
11:09
that has stuck in our minds for a long time.
260
669832
2144
ื•ื–ื” ื ืชืงืข ืœื ื• ื‘ืชื•ื“ืขื” ืœืื•ืจืš ื”ืžื•ืŸ ื–ืžืŸ.
11:12
Reality, though -- I call it real intelligence.
261
672000
2392
ืื•ืœื ื‘ืžืฆื™ืื•ืช, ืื ื™ ืงื•ืจื” ืœื–ื” ืื™ื ื˜ืœื™ื’ื ืฆื™ื” ืืžื™ืชื™ืช.
11:14
Real intelligence is built on something else.
262
674416
2175
ืื™ื ื˜ืœื™ื’ื ืฆื™ื” ืืžื™ืชื™ืช ื‘ื ื•ื™ื” ืขืœ ืžืฉื”ื• ืื—ืจ.
11:16
We experience the world through a sequence of patterns,
263
676615
3214
ืื ื• ื—ื•ื•ื™ื ืืช ื”ืขื•ืœื ื‘ืืžืฆืขื•ืช ืจืฆืฃ ื“ืคื•ืกื™ื, ืฉื•ืžืจื™ื ืื•ืชื,
11:19
and we store them, and we recall them.
264
679853
2149
ื ื–ื›ืจื™ื ื‘ื”ื ื•ืžืขืžืชื™ื ืื•ืชื
11:22
When we recall them, we match them up against reality,
265
682026
2545
ืขื ื”ืžืฆื™ืื•ืช, ื•ืžื ื‘ืื™ื ื›ืœ ื”ื–ืžืŸ.
11:24
and we're making predictions all the time.
266
684595
2251
11:26
It's an internal metric; there's an internal metric about us,
267
686870
2958
ื–ื• ืžืขืจื›ืช ืื™ืŸ ืกื•ืคื™ืช.
11:29
saying, do we understand the world, am I making predictions, and so on.
268
689852
3342
ื”ืื ืื ื™ ืžื‘ื™ืŸ ืืช ื”ืขื•ืœื? ืื ื™ ืžื ื‘ื? ื•ื›ื•ืœื™.
11:33
You're all being intelligent now, but you're not doing anything.
269
693218
3002
ื›ื•ืœื›ื ืื™ื ื˜ืœื™ื’ื ื˜ื™ื™ื ื‘ืจื’ืข ื–ื”, ืืš ืืชื ืœื ืขื•ืฉื™ื ืฉื•ื ื“ื‘ืจ.
ืื•ืœื™ ืžื’ืจื“ื™ื ืื™ืคืฉื”ื• ืื• ืžื—ื˜ื˜ื™ื ื‘ืืฃ,
11:36
Maybe you're scratching yourself, but you're not doing anything.
270
696244
3002
ืื‘ืœ ืืชื ืœื ืขื•ืฉื™ื ืฉื•ื ื“ื‘ืจ ื‘ืจื’ืข ื–ื”,
11:39
But you're being intelligent; you're understanding what I'm saying.
271
699270
3156
ืืš ืขื“ื™ื™ืŸ ืื™ื ื˜ืœื™ื’ื ื˜ื™ื, ืืชื ืžื‘ื™ื ื™ื ืืช ืžื™ืœื•ืชื™ื™.
11:42
Because you're intelligent and you speak English,
272
702450
2295
ื‘ื–ื›ื•ืช ื–ื” ืฉืืชื ืื™ื ื˜ืœื™ื’ื ื˜ื™ื ื•ืžื“ื‘ืจื™ื ืื ื’ืœื™ืช,
11:44
you know the word at the end of this
273
704769
1751
ืืชื ื™ื•ื“ืขื™ื ืื™ื–ื• ืžื™ืœื” ืชื‘ื•ื ื‘ืกื•ืฃ...
ื”ืžืฉืคื˜.
11:46
sentence.
274
706544
1159
11:47
The word came to you; you make these predictions all the time.
275
707727
3152
ื”ืžื™ืœื” ืงืคืฆื” ืœื›ื, ื•ืืชื ืขื•ืฉื™ื ื ื™ื‘ื•ื™ื™ื ื›ืืœื” ื›ืœ ื”ื–ืžืŸ.
11:50
What I'm saying is,
276
710903
1699
ื•ืžื” ืฉืื ื™ ืื•ืžืจ
11:52
the internal prediction is the output in the neocortex,
277
712626
2631
ืฉื”ื ื™ื‘ื•ื™ ื”ื•ื ื”ืคืœื˜ ืฉืœ ืงืœื™ืคืช ื”ืžื—.
ื•ืฉื‘ืฆื•ืจื” ื›ืœืฉื”ื™, ื”ื ื™ื‘ื•ื™ื™ื ืžื•ื‘ื™ืœื™ื ืœื”ืชื ื”ื’ื•ืช ืื™ื ื˜ืœื™ื’ื ื˜ื™ืช.
11:55
and somehow, prediction leads to intelligent behavior.
278
715281
2541
11:57
Here's how that happens:
279
717846
1151
ื•ื›ืš ื–ื” ืงื•ืจื”. ื‘ื•ืื• ื ืชื—ื™ืœ ืขื ืžื— ืœื ืื™ื ื˜ืœื™ื’ื ื˜ื™.
11:59
Let's start with a non-intelligent brain.
280
719021
1955
ื”ืžื— ื”ื™ืฉืŸ,
12:01
I'll argue a non-intelligent brain, we'll call it an old brain.
281
721000
3009
12:04
And we'll say it's a non-mammal, like a reptile,
282
724033
2871
ื ื ื™ื— ืฉืœ ืœื ื™ื•ื ืง, ื–ื•ื—ืœ ืœืžืฉืœ ,
12:06
say, an alligator; we have an alligator.
283
726928
1985
ื›ืžื• ืชื ื™ืŸ.
12:08
And the alligator has some very sophisticated senses.
284
728937
3371
ื•ืœืชื ื™ืŸ ื™ืฉ ื—ื•ืฉื™ื ืžืื•ื“ ืžืชื•ื—ื›ืžื™ื.
12:12
It's got good eyes and ears and touch senses and so on,
285
732332
3206
ื™ืฉ ืœื• ืขื™ื ื™ื™ื ื•ืื•ื–ื ื™ื™ื ื˜ื•ื‘ื•ืช, ื•ื—ื•ืฉ ืžื™ืฉื•ืฉ,
12:15
a mouth and a nose.
286
735562
1469
ืคื” ื•ืืฃ. ื™ืฉ ืœื• ื”ืชื ื”ื’ื•ืช ืžืื•ื“ ืžื•ืจื›ื‘ืช.
12:17
It has very complex behavior.
287
737055
1991
12:19
It can run and hide. It has fears and emotions. It can eat you.
288
739070
3906
ื”ื•ื ื™ื›ื•ืœ ืœืจื•ืฅ ื•ืœื”ืชื—ื‘ื. ื™ืฉ ืœื• ืคื—ื“ื™ื ื•ืจื’ืฉื•ืช. ื”ื•ื ื™ื›ื•ืœ ืœื˜ืจื•ืฃ ืื•ืชืš.
12:23
It can attack. It can do all kinds of stuff.
289
743000
3590
ื•ืœืชืงื•ืฃ. ื›ืœ ืžื™ื ื™ ื“ื‘ืจื™ื.
12:27
But we don't consider the alligator very intelligent,
290
747193
2856
ืืš ืื ื• ืœื ืžื—ืฉื™ื‘ื™ื ืืช ื”ืชื ื™ืŸ ืœื™ืฆื•ืจ ืžืื•ื“ ืื™ื ื˜ืœื™ื’ื ื˜ื™, ืœื ื‘ืžื•ื‘ืŸ ื”ืื ื•ืฉื™ ืฉืœ ื”ืžื™ืœื”.
12:30
not in a human sort of way.
291
750073
1676
12:31
But it has all this complex behavior already.
292
751773
2356
ืœืžืจื•ืช ืฉื™ืฉ ืœื• ืืช ื”ื”ืชื ื”ื’ื•ืช ื”ืžื•ืจื›ื‘ืช ื”ื–ื•.
12:34
Now in evolution, what happened?
293
754510
1801
ืžื” ืงืจื” ื‘ืžื”ืœืš ื”ืื‘ื•ืœื•ืฆื™ื”?
12:36
First thing that happened in evolution with mammals
294
756335
2385
ื”ื“ื‘ืจ ื”ืจืืฉื•ืŸ ืฉืงืจื” ื‘ืื‘ื•ืœื•ืฆื™ื” ืฉืœ ื”ื™ื•ื ืงื™ื,
12:38
is we started to develop a thing called the neocortex.
295
758744
2531
ืฉืคื™ืชื—ื ื• ืืช ื”ื—ืœืง ืฉื ืงืจื ืงืœื™ืคืช ื”ืžื—.
12:41
I'm going to represent the neocortex by this box on top of the old brain.
296
761299
3793
ื•ืื ื™ ืžื™ืฆื’ ืืช ืงืœื™ืคืช ื”ืžื— ื›ืืŸ,
ื‘ืขื–ืจืช ื”ืชื™ื‘ื” ื”ื–ื• ืฉื™ื•ืฉื‘ืช ืžืขืœ ื”ืžื— ื”ื™ืฉืŸ.
12:45
Neocortex means "new layer." It's a new layer on top of your brain.
297
765116
3353
ืงืœื™ืคืช ื”ืžื— ืžืฉืžืขื•ืชื” ืฉื›ื‘ื” ื—ื“ืฉื”. ื‘ืจืืฉ ื”ืžื—.
12:48
It's the wrinkly thing on the top of your head
298
768493
2343
ืื ืœื ื™ื“ืขืชื, ื”ื™ื ื”ื“ื‘ืจ ื”ืžืงื•ืžื˜ ื‘ืจืืฉ ื”ืžื—,
12:50
that got wrinkly because it got shoved in there and doesn't fit.
299
770860
3084
ืฉื™ืฉ ืœื• ืงืžื˜ื™ื ื›ื™ ื”ื•ื ื ื“ื—ืก ืœืฉื ืœืžืจื•ืช ื—ื•ืกืจ ื”ืžืงื•ื.
12:53
(Laughter)
300
773968
1008
(ืฆื—ื•ืง)
12:55
Literally, it's about the size of a table napkin
301
775000
2242
ื‘ืจืฆื™ื ื•ืช, ื–ื” ื‘ืขืจืš ื‘ื’ื•ื“ืœ ืฉืœ ืžืคื™ืช.
12:57
and doesn't fit, so it's wrinkly.
302
777266
1574
ื•ื–ื” ืœื ืžืชืื™ื, ืื– ื–ื” ื ื”ื™ื” ืžืงื•ืžื˜.
12:58
Now, look at how I've drawn this.
303
778864
1745
13:00
The old brain is still there.
304
780633
1386
ื”ืžื— ื”ื™ืฉืŸ ืขื“ื™ืŸ ื›ืืŸ. ืขื“ื™ืŸ ื™ืฉ ืœื ื• ืžื— ื›ืžื• ืœืชื ื™ืŸ.
13:02
You still have that alligator brain. You do. It's your emotional brain.
305
782043
3655
ื‘ื”ื—ืœื˜ ื™ืฉ ืœื ื•. ื–ื” ื”ืžื— ื”ืจื’ืฉื™ ืฉืœื›ื.
13:05
It's all those gut reactions you have.
306
785722
2730
ื›ืœ ื”ื“ื‘ืจื™ื ื›ืžื• ืชื—ื•ืฉื•ืช ื”ื‘ื˜ืŸ.
13:08
On top of it, we have this memory system called the neocortex.
307
788476
3270
ื•ืžืขืœื™ื• ื™ืฉื ื” ืžืขืจื›ืช ื–ื›ืจื•ืŸ ื”ื ืงืจืืช ืงืœื™ืคืช ื”ืžื—.
13:11
And the memory system is sitting over the sensory part of the brain.
308
791770
4294
ื•ืžืขืจื›ืช ื”ื–ื›ืจื•ืŸ ื”ื–ื• ื™ื•ืฉื‘ืช ื‘ื—ืœืง ื”ื—ื™ืฉืชื™ ืฉืœ ื”ืžื—.
13:16
So as the sensory input comes in and feeds from the old brain,
309
796088
3055
ื•ื›ืฉืงืœื˜ ื—ื™ืฉืชื™ ืžื’ื™ืข ืœืžื— ื”ื™ืฉืŸ,
13:19
it also goes up into the neocortex.
310
799167
2154
ื”ื•ื ื‘ื ื•ืกืฃ ืขื•ืœื” ืœืงืœื™ืคืช ื”ืžื— ื”ื—ื“ืฉื”. ื•ื”ื™ื ืžืฉื ื ืช.
13:21
And the neocortex is just memorizing.
311
801345
1913
13:23
It's sitting there saying, I'm going to memorize all the things going on:
312
803282
3561
ื”ื™ื ื™ื•ืฉื‘ืช ืฉื "ื•ืื•ืžืจืช" ืื ื™ ืืฉื ืŸ ืืช ื›ืœ ื”ื“ื‘ืจื™ื ืฉืงื•ืจื™ื,
13:26
where I've been, people I've seen, things I've heard, and so on.
313
806867
3019
ืื™ืคื” ื”ื™ื™ืชื™, ืื ืฉื™ื ืฉืจืื™ืชื™, ื“ื‘ืจื™ื ืฉืฉืžืขืชื™ ื•ื›ื•ืœื™.
13:29
And in the future, when it sees something similar to that again,
314
809910
3362
ื•ื‘ืขืชื™ื“, ื›ืฉื”ื™ื ืจื•ืื” ืžืฉื”ื• ื“ื•ืžื”,
13:33
in a similar environment, or the exact same environment,
315
813296
2635
ื‘ืกื‘ื™ื‘ื” ื“ื•ืžื” ืื• ืืคื™ืœื• ื–ื”ื”,
13:35
it'll start playing it back: "Oh, I've been here before,"
316
815955
3555
ื”ื™ื ืชืคืขื™ืœ ืืช ื–ื” ืžื—ื“ืฉ.
ื”ื•, ื”ื™ื™ืชื™ ืคื” ื‘ืขื‘ืจ. ื•ื›ืฉื”ื™ื™ืชื™ ืคื”,
13:39
and when you were here before, this happened next.
317
819534
2364
ื–ื” ืžื” ืฉืงืจื” ื”ืœืื”. ื–ื” ืžืืคืฉืจ ืœื›ื ืœื—ื–ื•ืช ืืช ื”ืขืชื™ื“.
13:41
It allows you to predict the future.
318
821922
1726
13:43
It literally feeds back the signals into your brain;
319
823672
3396
ื–ื” ืžืืคืฉืจ ืœื›ื ืคืฉื•ื˜ื• ื›ืžืฉืžืขื• ืœื”ื–ื™ืŸ ืืช ื”ืื•ืชื•ืช ืœืžื•ื—ื›ื,
13:47
they'll let you see what's going to happen next,
320
827092
2265
ืฉื™ืืคืฉืจื• ืœื›ื ืœืฆืคื•ืช ืžื” ื™ืงืจื” ื‘ืจื’ืข ื”ื‘ื,
13:49
will let you hear the word "sentence" before I said it.
321
829381
2595
ื™ืืคืฉืจ ืœื›ื ืœืฉืžื•ืข ืืช ื”ืžื™ืœื” "ืžืฉืคื˜" ืœืคื ื™ ืฉืืžืจืชื™ ืื•ืชื”.
13:52
And it's this feeding back into the old brain
322
832000
3185
ื–ื• ื”ื”ื–ื ื”-ื”ื—ื•ื–ืจืช ืœืžื— ื”ื™ืฉืŸ
13:55
that will allow you to make more intelligent decisions.
323
835209
2577
ื”ืžืืคืฉืจืช ืœืงื‘ืœ ื”ื—ืœื˜ื•ืช ืื™ื ื˜ืœื™ื’ื ื˜ื™ื•ืช.
13:57
This is the most important slide of my talk, so I'll dwell on it a little.
324
837810
3489
ื–ื• ื”ืฉืงื•ืคื™ืช ื”ื—ืฉื•ื‘ื” ื‘ื™ื•ืชืจ ื‘ื”ืจืฆืื” ืื– ืืฉืืจ ืื™ืชื” ืœื–ืžืŸ ืžื”.
14:01
And all the time you say, "Oh, I can predict things,"
325
841323
3575
ืื– ื›ืœ ื”ื–ืžืŸ ืืชื” ืื•ืžืจ, ื”ื•, ืื ื™ ื™ื›ื•ืœ ืœื—ื–ื•ืช ืžื” ื™ืงืจื”.
14:04
so if you're a rat and you go through a maze, and you learn the maze,
326
844922
3360
ืื ืืชื” ืขื›ื‘ืจ ืฉืžืกืชื•ื‘ื‘ ื‘ืžื‘ื•ืš, ื•ืœื•ืžื“ ืื•ืชื•,
14:08
next time you're in one, you have the same behavior.
327
848306
2439
ืคืขื ื”ื‘ืื” ืฉืืชื” ื‘ืžื‘ื•ืš, ื”ื”ืชื ื”ื’ื•ืช ืฉืœืš ื–ื”ื”,
14:10
But suddenly, you're smarter; you say, "I recognize this maze,
328
850769
2991
ืื‘ืœ ืœืคืชืข, ืืชื” ื—ื›ื ื™ื•ืชืจ
ื›ื™ ืืชื” ืื•ืžืจ, ืื ื™ ืžื›ื™ืจ ืืช ื”ืžื‘ื•ืš ื”ื–ื”, ืื ื™ ื™ื•ื“ืข ืœืืŸ ืœืœื›ืช,
14:13
I know which way to go; I've been here before; I can envision the future."
329
853784
3542
ื”ื™ื™ืชื™ ืคื” ื‘ืขื‘ืจ, ืื ื™ ื™ื›ื•ืœ ืœืฆืคื•ืช ืืช ื”ืขืชื™ื“. ื•ื–ื” ืžื” ืฉื”ื•ื ืขื•ืฉื”.
14:17
That's what it's doing.
330
857350
1168
14:18
This is true for all mammals --
331
858542
2840
ื–ื” ื ื›ื•ืŸ ืœื’ื‘ื™ ื›ืœ ื”ื™ื•ื ืงื™ื,
14:21
in humans, it got a lot worse.
332
861406
2031
ืื‘ืœ ืืฆืœ ื‘ื ื™ ืื“ื, ื–ื” ื ื”ื™ื” ื’ืจื•ืข ื™ื•ืชืจ.
14:23
Humans actually developed the front of the neocortex,
333
863461
2587
ืืฆืœ ื‘ื ื™ ืื“ื, ืคื™ืชื—ื ื• ืืช ื”ื—ืœืง ื”ืงื“ืžื™ ืฉืœ ืงืœื™ืคืช ื”ืžื—
14:26
called the anterior part of the neocortex.
334
866072
2221
ืฉื ืงืจื ื”ืงืœื™ืคื” ื”ืงื“ืžื™ืช. ื•ื”ื˜ื‘ืข ืขืฉื” ืชืขืœื•ืœ.
14:28
And nature did a little trick.
335
868317
1438
14:29
It copied the posterior, the back part, which is sensory,
336
869779
2687
ื”ื•ื ื”ืขืชื™ืง ืืช ื”ื—ืœืง ื”ืื—ื•ืจื™ ื”ื—ื™ืฉืชื™,
14:32
and put it in the front.
337
872490
1151
ื•ืฉื ืื•ืชื• ืžืงื“ื™ืžื”.
14:33
Humans uniquely have the same mechanism on the front,
338
873665
2480
ืืฆืœ ื‘ื ื™ ืื“ื ื‘ืื•ืคืŸ ื™ื—ื•ื“ื™ ื™ืฉ ืืช ืื•ืชื• ืžื ื’ื ื•ืŸ ืžืงื“ื™ืžื”,
14:36
but we use it for motor control.
339
876169
1554
ืื‘ืœ ืื ื• ืžืฉืชืžืฉื™ื ื‘ื• ืœืชื ื•ืขื”.
14:37
So we're now able to do very sophisticated motor planning, things like that.
340
877747
3581
ืื ื• ื™ื›ื•ืœื™ื ืขื›ืฉื™ื• ืœืขืฉื•ืช ืชื ื•ืขื•ืช ืžืชื•ื—ื›ืžื•ืช ืžืื•ื“.
14:41
I don't have time to explain, but to understand how a brain works,
341
881352
3126
ืื™ืŸ ืœื™ ื–ืžืŸ ืœื”ื›ื ืก ืœื”ื›ืœ, ืื‘ืœ ืื ืืชื ืจื•ืฆื™ื ืœื”ื‘ื™ืŸ ืื™ืŸ ื”ืžื— ืขื•ื‘ื“,
14:44
you have to understand how the first part of the mammalian neocortex works,
342
884502
3537
ืืชื ื—ื™ื™ื‘ื™ื ืœื”ื‘ื™ืŸ ืื™ืš ื”ื—ืœืง ื”ืจืืฉื•ืŸ ืฉืœ ืงืœื™ืคืช ื”ืžื— ืืฆืœ ื™ื•ื ืงื™ื ืขื•ื‘ื“,
ืื™ืš ืื ื• ืžืฆืœื™ื—ื™ื ืœืื’ื•ืจ ื“ืคื•ืกื™ื ื•ืœื ื‘ื.
14:48
how it is we store patterns and make predictions.
343
888063
2293
ื‘ื•ืื• ื ืจืื” ื›ืžื” ื“ื•ื’ืžืื•ืช ืฉืœ ื ื™ื‘ื•ื™.
14:50
Let me give you a few examples of predictions.
344
890380
2188
14:52
I already said the word "sentence."
345
892592
1676
ื›ื‘ืจ ื ืชืชื™ ืืช ื”ื“ื•ื’ืžื ืฉืœ ื”ืžื™ืœื” "ืžืฉืคื˜". ื‘ืžื•ืกื™ืงื”,
14:54
In music, if you've heard a song before,
346
894292
3206
ืื ืฉืžืขืชื ืฉื™ืจ ื‘ืขื‘ืจ, ืื ืฉืžืขืชื ืืช ื’'ื™ืœ ืฉืจื” ืืช ื”ืฉื™ืจื™ื ื”ืœืœื• ื‘ืขื‘ืจ,
14:57
when you hear it, the next note pops into your head already --
347
897522
2909
ื›ืฉื”ื™ื ืฉืจื” ืื•ืชื, ื”ืชื• ื”ื‘ื ืงื•ืคืฅ ืœื›ื ืœืจืืฉ ืžื™ื“-
15:00
you anticipate it.
348
900455
1151
ืืชื ืžืฆืคื™ื ืœืฉืžื•ืข ืื•ืชื•. ืื ื–ื” ื”ื™ื” ืืœื‘ื•ื ืฉื™ืจื™ื,
15:01
With an album, at the end of a song, the next song pops into your head.
349
901630
3354
ื‘ืกื•ืฃ ื›ืœ ืฉื™ืจ, ื”ืฉื™ืจ ื”ื‘ื ืงื•ืคืฅ ืœื›ื ืœืจืืฉ.
15:05
It happens all the time, you make predictions.
350
905008
2305
ื”ื“ื‘ืจื™ื ื”ืืœื• ืงื•ืจื™ื ื›ืœ ื”ื–ืžืŸ. ื›ืš ืืชื ืžื ื‘ืื™ื.
15:07
I have this thing called the "altered door" thought experiment.
351
907337
3039
ื™ืฉ ื ื™ืกื•ื™ ืฉืื ื™ ืงื•ืจื ืœื• ื ื™ืกื•ื™ "ื”ื“ืœืช ื”ืžืฉืชื ื”".
15:10
It says, you have a door at home;
352
910400
2829
ื•ื”ื ื™ืกื•ื™ ืื•ืžืจ, ื™ืฉ ืœื›ื ื‘ื‘ื™ืช ื“ืœืช,
15:13
when you're here, I'm changing it --
353
913253
1755
ื•ื‘ื–ืžืŸ ืฉืืชื ื ืžืฆืื™ื ื›ืืŸ, ืื ื™ ืžืฉื ื” ืื•ืชื”, ื™ืฉ ืœื™ ืžื™ืฉื”ื•
15:15
I've got a guy back at your house right now, moving the door around,
354
915032
3196
ื‘ื‘ื™ืช ืฉืœื›ื ื‘ืจื’ืข ื–ื”, ืฉืžืชืขืกืง ื‘ื“ืœืช,
15:18
moving your doorknob over two inches.
355
918252
1769
ื•ื”ื•ื ืขื•ืžื“ ืœื”ื–ื™ื– ืืช ื™ื“ื™ืช ื”ื“ืœืช ื›ืžื” ืกื ื˜ื™ืžื˜ืจื™ื.
15:20
When you go home tonight, you'll put your hand out, reach for the doorknob,
356
920045
3584
ื›ืฉืชื’ื™ืขื• ื”ื‘ื™ืชื” ื”ื™ื•ื ื‘ืขืจื‘, ืชื•ืฉื™ื˜ื• ืืช ื”ื™ื“
ืœื›ื•ื•ืŸ ื™ื“ื™ืช ื”ื“ืœืช ื•ืชื‘ื—ื™ื ื•
15:23
notice it's in the wrong spot
357
923653
1514
ืฉื”ื™ื ื‘ืžืงื•ื ื”ืœื ื ื›ื•ืŸ, ื•ืชืืžืจื•, ื•ื•ืื•, ืžืฉื”ื• ืงืจื”.
15:25
and go, "Whoa, something happened."
358
925191
1687
15:26
It may take a second, but something happened.
359
926902
2101
ื–ื” ื™ื›ื•ืœ ืœืงื—ืช ื›ืžื” ืจื’ืขื™ื ืœื”ื‘ื™ืŸ ืžื” ื‘ื“ื™ื•ืง ืงืจื”, ืื‘ืœ ืžืฉื”ื• ืงืจื”.
15:29
I can change your doorknob in other ways --
360
929027
2003
ืื ื™ ื™ื›ื•ืœ ืœืฉื ื•ืช ืืช ื”ื“ืœืช ืฉืœื›ื ื‘ื“ืจื›ื™ื ืื—ืจื•ืช.
15:31
make it larger, smaller, change its brass to silver, make it a lever,
361
931054
3241
ืื ื™ ื™ื›ื•ืœ ืœืขืฉื•ืช ืื•ืชื” ื’ื“ื•ืœื” ืื• ืงื˜ื ื” ื™ื•ืชืจ, ืœืฉื ื•ืช ืื•ืชื” ืžื ื—ื•ืฉืช ืœื›ืกืฃ,
ืœื”ื•ืกื™ืฃ ืœื” ืžื ื•ืฃ. ืœืฆื‘ื•ืข ืื•ืชื”,
15:34
I can change the door; put colors on, put windows in.
362
934319
2576
ืœืฉื™ื ื—ืœื•ื ื•ืช. ืื ื™ ื™ื›ื•ืœ ืœืฉื ื•ืช ืืœืคื™ ื“ื‘ืจื™ื ื‘ื“ืœืช,
15:36
I can change a thousand things about your door
363
936919
2151
ื•ื‘ืฉืชื™ ืฉื ื™ื•ืช ืฉืœื•ืงื— ืœื›ื ืœืคืชื•ื— ืื•ืชื”,
15:39
and in the two seconds you take to open it,
364
939094
2008
ืืชื ืชืฉื™ืžื• ืœื‘ ืฉืžืฉื”ื• ื”ืฉืชื ื”.
15:41
you'll notice something has changed.
365
941126
1722
15:42
Now, the engineering approach, the AI approach to this,
366
942872
2584
ื’ื™ืฉืช ื”ืื™ื ื˜ืœื™ื’ื ืฆื™ื” ืžืœืื›ื•ืชื™ืช ืœื›ืš,
15:45
is to build a door database with all the door attributes.
367
945480
2675
ื”ื™ื ืœื‘ื ื•ืช ืžืกื“ ื ืชื•ื ื™ื ืฉืœ ื“ืœืชื•ืช ืฉื™ืฉ ื‘ื• ืืช ื›ืœ ืชื›ื•ื ื•ืช ื”ื“ืœืช.
15:48
And as you go up to the door, we check them off one at time:
368
948179
2819
ื•ื‘ืจื’ืข ืฉืืชื ื”ื•ืœื›ื™ื ืœื“ืœืช, ืืชื ื‘ื•ื“ืงื™ื ืืช ื›ืœ ื”ืชื›ื•ื ื•ืช ืื—ืช ืื—ืจื™ ื”ืฉื ื™ื”.
15:51
door, door, color ...
369
951022
1346
ื“ืœืช, ื“ืœืช, ื“ืœืช...ืฆื‘ืข.
15:52
We don't do that. Your brain doesn't do that.
370
952392
2100
ืื ื• ืœื ืขื•ืฉื™ื ืืช ื–ืืช ื›ืš. ื”ืžื— ืฉืœื›ื ืœื ืขื•ืฉื” ื–ืืช ื›ืš.
15:54
Your brain is making constant predictions all the time
371
954516
2540
ืžื” ืฉื”ื•ื ืขื•ืฉื” ื–ื” ืœื ื‘ื ืœืœื ื”ืจืฃ ื•ื‘ืื•ืคืŸ ืขืงื‘ื™
15:57
about what will happen in your environment.
372
957080
2034
ืžื” ื™ืงืจื” ื‘ืกื‘ื™ื‘ืชื›ื.
15:59
As I put my hand on this table, I expect to feel it stop.
373
959138
2746
ื›ืฉืื ื™ ืฉื ืืช ื™ื“ื™ ืขืœ ื”ืฉื•ืœื—ืŸ, ืื ื™ ืžืฆืคื” ืฉื”ื™ื ืชืขืฆืจ.
16:01
When I walk, every step, if I missed it by an eighth of an inch,
374
961908
3019
ื›ืฉืื ื™ ื”ื•ืœืš, ื‘ื›ืœ ืฆืขื“, ืื ืืคืกืคืก ืื•ืชื• ื‘ื›ืžื” ืžื™ืœื™ืžื˜ืจื™ื,
16:04
I'll know something has changed.
375
964951
1533
ืื“ืข ืฉืžืฉื”ื• ื”ืฉืชื ื”.
16:06
You're constantly making predictions about your environment.
376
966508
2820
ืืชื ืžื ื‘ืื™ื ืœืœื ื”ืจืฃ ืœื’ื‘ื™ ืกื‘ื™ื‘ืชื›ื.
16:09
I'll talk about vision, briefly.
377
969352
1593
ืื“ื‘ืจ ื‘ืงืฆืจื” ืขืœ ื”ืจืื™ื”. ื–ื•ื”ื™ ืชืžื•ื ื” ืฉืœ ืืฉื”.
16:10
This is a picture of a woman.
378
970969
1383
16:12
When we look at people, our eyes saccade over two to three times a second.
379
972376
3490
ื•ื›ืฉืืชื ืžืกืชื›ืœื™ื ืขืœ ืื ืฉื™ื, ื”ืขื™ื ื™ื™ื ืฉืœื›ื
ืžื“ืœื’ื•ืช ืคืขืžื™ื™ื ืขื“ ืฉืœื•ืฉ ื‘ืฉื ื™ื”.
16:15
We're not aware of it, but our eyes are always moving.
380
975890
2529
ืืชื ืœื ืžื•ื“ืขื™ื ืœื›ืš, ืื‘ืœ ื”ืขื™ื ื™ื™ื ืฉืœื›ื ื–ื–ื•ืช ืœืœื ื”ืจืฃ.
ื›ืฉืืชื ืžืกืชื›ืœื™ื ืขืœ ืคื ื™ื ืฉืœ ืžื™ืฉื”ื•,
16:18
When we look at a face, we typically go from eye to eye to nose to mouth.
381
978443
3435
ืืชื ื‘ื“ืจืš ื›ืœืœ ืขื•ื‘ืจื™ื ืžืขื™ืŸ ืœืขื™ืŸ ืœืืฃ ื•ืœืคื”.
16:21
When your eye moves from eye to eye,
382
981902
1869
ื›ืฉืืชื ืขื•ื‘ืจื™ื ืžืขื™ืŸ ืœืขื™ืŸ,
16:23
if there was something else there like a nose,
383
983795
2158
ืื ื”ื™ื” ืžืฉื”ื• ืื—ืจ ื‘ืžืงื•ื ื”ืขื™ืŸ, ืœืžืฉืœ ืืฃ,
16:25
you'd see a nose where an eye is supposed to be and go, "Oh, shit!"
384
985977
3546
ื”ื™ื™ืชื ืจื•ืื™ื ืืฃ ื‘ืžืงื•ื ืฉืขื™ืŸ ืฆืจื™ื›ื” ืœื”ื™ื•ืช,
ื”ื™ื™ืชื ืื•ืžืจื™ื, ืื•ืคืก...
16:29
(Laughter)
385
989547
1396
16:30
"There's something wrong about this person."
386
990967
2109
(ืฆื—ื•ืง)
ื™ืฉ ืžืฉื”ื• ืœื ื‘ืกื“ืจ ื‘ืื“ื ื”ื–ื”.
16:33
That's because you're making a prediction.
387
993100
2005
ื•ื–ื” ื‘ื’ืœืœ ืฉืืชื ืžื ื‘ืื™ื.
16:35
It's not like you just look over and say, "What am I seeing? A nose? OK."
388
995129
3439
ื–ื” ืœื ืฉืืชื ืžืกืชื›ืœื™ื ืœืฉื ื•ืื•ืžืจื™ื, ืžื” ืื ื™ ืจื•ืื” ืขื›ืฉื™ื•?
ืืฃ, ื–ื” ื‘ืกื“ืจ. ืœื, ื™ืฉ ืœื›ื ืฆืคื™ื” ืœื’ื‘ื™ ืžื” ืฉืืชื ืืžื•ืจื™ื ืœืจืื•ืช.
16:38
No, you have an expectation of what you're going to see.
389
998592
2634
(ืฆื—ื•ืง)
16:41
Every single moment.
390
1001250
1151
ื•ืœืกื™ื•ื, ืื™ืš ืื ื• ื‘ื•ื—ื ื™ื ืื™ื ื˜ืœื™ื’ื ืฆื™ื”.
16:42
And finally, let's think about how we test intelligence.
391
1002425
2629
16:45
We test it by prediction: What is the next word in this ...?
392
1005078
3081
ืื ื—ื ื• ื‘ื•ื—ื ื™ื ืืช ื–ื” ืขืœ ื™ื“ื™ ื ื™ื‘ื•ื™. ืžื”ื™ ื”ืžื™ืœื” ื”ื‘ืื”?
16:48
This is to this as this is to this. What is the next number in this sentence?
393
1008183
3627
ื”ื“ื‘ืจ ื”ื–ื” ื‘ื™ื—ืก ืœื–ื” ื”ื•ื ื›ืžื• ื”ื“ื‘ืจ ื”ื–ื” ื‘ื™ื—ืก ืœืžื”? ืžื” ื”ืžื™ืœื” ื”ื‘ืื” ื‘ืžืฉืคื˜?
16:51
Here's three visions of an object. What's the fourth one?
394
1011834
2690
ื”ื ื” ืฉืœื•ืฉื” ืฆื“ื“ื™ื ืฉืœ ื“ื‘ืจ.
ืžื”ื• ื”ืจื‘ื™ืขื™? ื›ืš ื‘ื•ื—ื ื™ื ืืช ื–ื”. ื”ื›ืœ ืขื ื™ื™ืŸ ืฉืœ ื ื™ื‘ื•ื™.
16:54
That's how we test it. It's all about prediction.
395
1014548
2504
16:57
So what is the recipe for brain theory?
396
1017573
2194
ืื– ืžื”ื• ื”ืžืชื›ื•ืŸ ืœืชื™ืื•ืจื™ืช ืžื—?
17:00
First of all, we have to have the right framework.
397
1020219
2366
ืงื•ื“ื ื›ืœ, ื—ื™ื™ื‘ืช ืœื”ื™ื•ืช ืœื ื• ืชื‘ื ื™ืช ื ื›ื•ื ื”.
17:02
And the framework is a memory framework,
398
1022609
1913
ืชื‘ื ื™ืช ืฉืœ ื–ื™ื›ืจื•ืŸ,
17:04
not a computational or behavior framework,
399
1024546
2024
ืœื ืชื‘ื ื™ืช ืžืžื•ื—ืฉื‘ืช ืื• ื”ืชื ื”ื’ื•ืชื™ืช. ืชื‘ื ื™ืช ื–ื™ื›ืจื•ืŸ.
17:06
it's a memory framework.
400
1026594
1163
17:07
How do you store and recall these sequences of patterns?
401
1027781
2623
ืื™ืš ืื ื• ืฉื•ืžืจื™ื ื•ื–ื•ื›ืจื™ื ืืช ื”ืจืฆืคื™ื ื•ื”ื“ืคื•ืกื™ื? ื“ืคื•ืกื™ื ื”ืŸ ื‘ื–ืžืŸ ื•ื”ืŸ ื‘ืžืจื—ื‘.
17:10
It's spatiotemporal patterns.
402
1030428
1442
17:11
Then, if in that framework, you take a bunch of theoreticians --
403
1031894
3009
ืœืื—ืจ ืžื›ืŸ ืœื•ืงื—ื™ื ื›ืžื” ืชืื•ืจื˜ื™ืงื ื™ื.
17:14
biologists generally are not good theoreticians.
404
1034927
2246
ื‘ื™ื•ืœื•ื’ื™ื ื‘ื“ืจืš ื›ืœืœ ืœื ืชืื•ืจื˜ื™ืงื ื™ื ื˜ื•ื‘ื™ื.
ื–ื” ืœื ืชืžื™ื“ ื ื›ื•ืŸ, ืืš ื‘ืื•ืคืŸ ื›ืœืœื™, ืื™ืŸ ื ื™ืกื™ื•ืŸ ืขืฉื™ืจ ืฉืœ ืชืื•ืจื™ื•ืช ื‘ื‘ื™ื•ืœื•ื’ื™ื”.
17:17
Not always, but generally, there's not a good history of theory in biology.
405
1037197
3529
17:20
I've found the best people to work with are physicists,
406
1040750
2574
ืื– ืžืฆืืชื™ ืœื ื›ื•ืŸ ืฉื”ืื ืฉื™ื ื”ื˜ื•ื‘ื™ื ื‘ื™ื•ืชืจ ืœืขื‘ื•ื“ ืื™ืชื ื”ื ืคื™ืกื™ืงืื™ื,
17:23
engineers and mathematicians,
407
1043348
1383
ืžื”ื ื“ืกื™ื ื•ืžืชืžื˜ื™ืงืื™ื, ืฉื‘ืื•ืคืŸ ื˜ื‘ืขื™ ื—ื•ืฉื‘ื™ื ื‘ืฆื•ืจื” ืืœื’ื•ืจื™ืชืžื™ืช.
17:24
who tend to think algorithmically.
408
1044755
1696
17:26
Then they have to learn the anatomy and the physiology.
409
1046475
3264
ื•ื”ื ื—ื™ื™ื‘ื™ื ืœืœืžื•ื“ ืื ื˜ื•ืžื™ื”, ื•ื”ื ื—ื™ื™ื‘ื™ื ืœืœืžื•ื“ ืคื™ื–ื™ื•ืœื•ื’ื™ื”.
17:29
You have to make these theories very realistic in anatomical terms.
410
1049763
4496
ืฆืจื™ืš ืœื‘ื ื•ืช ืืช ื”ืชื™ืื•ืจื™ื•ืช ื”ืœืœื• ื‘ืฆื•ืจื” ืžืื•ื“ ืžืฆื™ืื•ืชื™ืช ืžื‘ื—ื™ื ื” ืื ื˜ื•ืžื™ืช.
ืื ืžื™ืฉื”ื• ื™ืงื•ื ื•ื™ืกืคืจ ืœื›ื ืขืœ ืชืื•ืจื™ื”
17:34
Anyone who tells you their theory about how the brain works
411
1054283
2765
17:37
and doesn't tell you exactly how it's working
412
1057072
2097
ื•ืœื ื™ืกืคืจ ืœื›ื ืื™ืš ื‘ื“ื™ื•ืง ื–ื” ืขื•ื‘ื“ ื‘ืžื—
17:39
and how the wiring works --
413
1059193
1303
ื•ืื™ืš ื”ื—ื™ื•ื•ื˜ ืขื•ื‘ื“ ื‘ืžื—, ื–ื• ืœื ื‘ืืžืช ืชืื•ืจื™ื”.
17:40
it's not a theory.
414
1060520
1267
17:41
And that's what we do at the Redwood Neuroscience Institute.
415
1061811
2833
ื•ื–ื” ืžื” ืฉืื ื• ืขื•ืฉื™ื ื‘ืžื›ื•ืŸ ืจื“-ื•ื•ื“.
17:44
I'd love to tell you we're making fantastic progress in this thing,
416
1064668
3308
ื”ื™ื™ืชื™ ืฉืžื— ืื™ืœื• ื”ื™ื” ืœื™ ื™ื•ืชืจ ื–ืžืŸ ืœืกืคืจ ืœื›ื ืื ื• ืขื•ืฉื™ื ื”ืชืงื“ืžื•ืช ื ื”ื“ืจืช ื‘ื ื•ืฉื,
17:48
and I expect to be back on this stage sometime in the not too distant future,
417
1068000
3662
ื•ืื ื™ ืžืฆืคื” ืœืฉื•ื‘ ืœื‘ืžื” ื–ื•,
ืื•ืœื™ ื‘ืขืชื™ื“ ื”ืœื ืจื—ื•ืง ื•ืœืกืคืจ ืœื›ื ืขืœ ื–ื”.
17:51
to tell you about it.
418
1071686
1164
17:52
I'm really excited; this is not going to take 50 years.
419
1072874
2594
ืื ื™ ืžืื•ื“ ืžืื•ื“ ื ืœื”ื‘. ื–ื” ืžืžืฉ ืœื ื™ืงื— ื—ืžื™ืฉื™ื ืฉื ื”.
17:55
What will brain theory look like?
420
1075492
1578
ืื– ืื™ืš ืชื™ืื•ืจื™ืช ื”ืžื— ืชื™ืจืื”?
17:57
First of all, it's going to be about memory.
421
1077094
2055
ืงื•ื“ื ื›ืœ, ื–ื• ืชื”ื™ื” ืชืื•ืจื™ื” ืขืœ ื–ื™ื›ืจื•ืŸ.
17:59
Not like computer memory -- not at all like computer memory.
422
1079173
2822
ืœื ื›ืžื• ืฉืœ ืžื—ืฉื‘. ื–ื” ืืคื™ืœื• ืœื ื“ื•ืžื” ืœื–ื™ื›ืจื•ืŸ ืฉืœ ืžื—ืฉื‘.
18:02
It's very different.
423
1082019
1151
ื–ื” ืžืื•ื“, ืžืื•ื“ ืฉื•ื ื”. ื•ื–ื”ื• ื–ื™ื›ืจื•ืŸ ืฉืœ ืื•ืชื
18:03
It's a memory of very high-dimensional patterns,
424
1083194
2257
ื“ืคื•ืกื™ื ืจื‘-ืžื™ืžื“ื™ื™ื, ืื•ืชื ื”ื“ื‘ืจื™ื ื”ื‘ืื™ื ืžื”ืขื™ื ื™ื.
18:05
like the things that come from your eyes.
425
1085475
1962
18:07
It's also memory of sequences:
426
1087461
1437
ื‘ื ื•ืกืฃ, ื’ื ื–ื™ื›ืจื•ืŸ ืฉืœ ืจืฆืคื™ื.
18:08
you cannot learn or recall anything outside of a sequence.
427
1088922
2730
ืืชื ืœื ื™ื›ื•ืœื™ื ืœืœืžื•ื“ ืื• ืœื–ื›ื•ืจ ืฉื•ื ื“ื‘ืจ ืžื—ื•ืฅ ืœืจืฆืฃ ื›ืœืฉื”ื•.
18:11
A song must be heard in sequence over time,
428
1091676
2837
ืฉื™ืจ ื—ื™ื™ื‘ ืœื”ื™ื•ืช ืžื•ืฉืžืข ื‘ืจืฆืฃ ืœืื•ืจืš ื–ืžืŸ,
18:14
and you must play it back in sequence over time.
429
1094537
2351
ื•ืืชื ื—ื™ื™ื‘ื™ื ืœื”ืฉืžื™ืข ืื•ืชื• ืœืขืฆืžื›ื ื‘ืจืฆืฃ ืœืื•ืจืš ื–ืžืŸ.
18:16
And these sequences are auto-associatively recalled,
430
1096912
2449
ืื ื• ื ื–ื›ืจื™ื ื‘ืจืฆืคื™ื ื”ืืœื” ื‘ืฆื•ืจื” ืขืฆืžื™ืช ื•ืืกื•ืฆื™ืื˜ื™ื‘ื™ืช, ืื– ืื ืื ื™ ืจื•ืื”
18:19
so if I see something, I hear something, it reminds me of it,
431
1099385
2873
ืื• ืฉื•ืžืข ืžืฉื”ื•, ืื ื™ ื ื–ื›ืจ, ื•ืžื•ืฉืžืข ื‘ืฆื•ืจื” ืื•ื˜ื•ืžื˜ื™ืช.
18:22
and it plays back automatically.
432
1102282
1533
18:23
It's an automatic playback.
433
1103839
1294
ื–ื• ื”ืฉืžืขื” ื—ื•ื–ืจืช ืื•ื˜ื•ืžื˜ื™ืช. ื•ื ื™ื‘ื•ื™ ืฉืœ ืงืœื˜ ืขืชื™ื“ื™ ื”ื•ื ื”ืคืœื˜ ื”ืจืฆื•ื™.
18:25
And prediction of future inputs is the desired output.
434
1105157
2548
18:27
And as I said, the theory must be biologically accurate,
435
1107729
2620
ื›ืžื• ืฉืืžืจืชื™, ื”ืชืื•ืจื™ื” ื—ื™ื™ื‘ืช ืœื”ื™ื•ืช ืžื“ื•ื™ืงืช ืžื‘ื—ื™ื ื” ื‘ื™ื•ืœื•ื’ื™ืช,
18:30
it must be testable and you must be able to build it.
436
1110373
2484
ื”ื™ื ื—ื™ื™ื‘ืช ืœื”ื™ื•ืช ื‘ืจืช ื‘ื“ื™ืงื”, ื•ืืชื” ื—ื™ื™ื‘ ืœื”ื™ื•ืช ืžืกื•ื’ืœ ืœื‘ื ื•ืช ืื•ืชื”.
18:32
If you don't build it, you don't understand it.
437
1112881
2211
ืื ืืชื” ืœื ื‘ื•ื ื” ืื•ืชื”, ืืชื” ืœื ืžื‘ื™ืŸ ืื•ืชื”.
18:35
One more slide.
438
1115116
1532
18:36
What is this going to result in?
439
1116672
2309
ืžื” ื–ื” ื”ื•ืœืš ืœื™ืฆื•ืจ? ื”ืื ืื ื—ื ื• ื‘ืืžืช ื”ื•ืœื›ื™ื ืœื‘ื ื•ืช ืžื›ื•ื ื•ืช ืื™ื ื˜ืœื™ื’ื ื˜ื™ื•ืช?
18:39
Are we going to really build intelligent machines?
440
1119005
2348
ืœืœื ืกืคืง. ื–ื” ื™ื”ื™ื” ืฉื•ื ื” ืžืžื” ืฉืื ืฉื™ื ื—ื•ืฉื‘ื™ื.
18:41
Absolutely. And it's going to be different than people think.
441
1121377
3798
ืื™ืŸ ืœื™ ืกืคืง ืฉื–ื” ื”ื•ืœืš ืœืงืจื•ืช.
18:45
No doubt that it's going to happen, in my mind.
442
1125508
2392
18:47
First of all, we're going to build this stuff out of silicon.
443
1127924
3116
ืงื•ื“ื ื›ืœ, ื–ื” ื”ื•ืœืš ืœื”ื‘ื ื•ืช ืžืกื™ืœื™ืงื•ืŸ.
18:51
The same techniques we use to build silicon computer memories,
444
1131064
2912
ืื•ืชื” ื˜ื›ื ื™ืงื” ืฉื‘ื” ื”ืฉืชืžืฉื ื• ืœื‘ื ื•ืช ื–ื›ืจื•ืŸ ืœืžื—ืฉื‘,
18:54
we can use here.
445
1134000
1151
ืชืฉืžืฉ ืื•ืชื ื• ื›ืืŸ.
18:55
But they're very different types of memories.
446
1135175
2109
ืื‘ืœ ืกื•ื’ ื”ื–ื›ืจื•ืŸ ื”ื•ื ืžืื•ื“ ืžืื•ื“ ืฉื•ื ื”.
18:57
And we'll attach these memories to sensors,
447
1137308
2023
ื ืฆืžื™ื“ ืืช ืžืขืจื›ืช ื”ื–ื›ืจื•ืŸ ื”ื–ื• ืœื—ื™ื™ืฉื ื™ื,
18:59
and the sensors will experience real-live, real-world data,
448
1139355
2777
ื•ื”ื—ื™ื™ืฉื ื™ื ื™ืงืœื˜ื• ื ืชื•ื ื™ื ืžื”ืขื•ืœื ื”ืืžื™ืชื™,
19:02
and learn about their environment.
449
1142156
1752
ื•ื”ื“ื‘ืจื™ื ื”ืœืœื• ื™ืœืžื“ื• ืขืœ ื”ืกื‘ื™ื‘ื” ืฉืœื”ื.
19:03
Now, it's very unlikely the first things you'll see are like robots.
450
1143932
3445
ืžืื•ื“ ืœื ืกื‘ื™ืจ ืฉื”ื“ื‘ืจ ื”ืจืืฉื•ืŸ ืฉื ืจืื” ื™ื”ื™ื• ืจื•ื‘ื•ื˜ื™ื.
19:07
Not that robots aren't useful; people can build robots.
451
1147401
2575
ืœื ื‘ื’ืœืœ ืฉืจื•ื‘ื•ื˜ื™ื ื”ื ื—ืกืจื™ ืชื•ืขืœืช ื•ื‘ื”ื—ืœื˜ ื ื™ืชืŸ ืœื‘ื ื•ืช ืื•ืชื.
19:10
But the robotics part is the hardest part. That's old brain. That's really hard.
452
1150000
3767
ืืœื ืฉื”ื—ืœืง ื”ืจื•ื‘ื•ื˜ื™ ื”ื•ื ื”ื—ืœืง ื”ืงืฉื”. ื–ื”ื• ื”ืžื— ื”ื™ืฉืŸ. ื–ื” ืžืžืฉ ืงืฉื”.
19:13
The new brain is easier than the old brain.
453
1153791
2007
ื”ืžื— ื”ื—ื“ืฉ ื”ื•ื ื“ื™ ืงืœ ื‘ื™ื—ืก ืœืžื— ื”ื™ืฉืŸ.
19:15
So first we'll do things that don't require a lot of robotics.
454
1155822
3082
ืื– ื”ื“ื‘ืจื™ื ื”ืจืืฉื•ื ื™ื ืฉืื ื• ืขื•ืžื“ื™ื ืœืขืฉื•ืช ืื™ื ื ื–ืงื•ืงื•ื™ื ืœื™ื•ืชืจ ืžื™ื“ื™ ืจื•ื‘ื•ื˜ื™ืงื”.
19:18
So you're not going to see C-3PO.
455
1158928
2179
ืืชื ืœื ื”ื•ืœื›ื™ื ืœืจืื•ืช C-3PO.
19:21
You're going to see things more like intelligent cars
456
1161131
2485
ืชืจืื• ื“ื‘ืจื™ื ื™ื•ืชืจ ื›ืžื• ืžื›ื•ื ื™ื•ืช ืื™ื ื˜ืœื™ื’ื ื˜ื™ื•ืช
19:23
that really understand what traffic is, what driving is
457
1163640
2808
ืฉืžื‘ื™ื ื•ืช ืชื—ื‘ื•ืจื” ื•ื ื”ื™ื’ื”
19:26
and have learned that cars with the blinkers on for half a minute
458
1166472
3278
ื•ืœืžื“ื• ืฉืžื›ื•ื ื™ื•ืช ืžืกื•ื™ื™ืžื•ืช ืขื ืื•ืจื•ืช ืฉืžื”ื‘ื”ื‘ื™ื ื‘ืžืฉืš ื—ืฆื™ ื“ืงื”
19:29
probably aren't going to turn.
459
1169774
1574
ื›ื ืจืื” ืœื ืขื•ืžื“ื•ืช ืœืคื ื•ืช, ื“ื‘ืจื™ื ืžืขื™ืŸ ืืœื”.
19:31
(Laughter)
460
1171372
1291
(ืฆื—ื•ืง)
19:32
We can also do intelligent security systems.
461
1172687
2064
ื ื™ืชืŸ ืœืขืฉื•ืช ืืžืฆืขื™ ืื‘ื˜ื—ื” ื—ื›ืžื™ื.
19:34
Anytime we're basically using our brain but not doing a lot of mechanics --
462
1174775
3573
ื›ืœ ื”ืžืงื•ืžื•ืช ืฉื‘ื”ื ืื ื• ืžืฉืชืžืฉื™ื ื”ืจื‘ื” ื‘ืฉื›ืœื ื• ืœืœื ื”ืจื‘ื” ืžื›ื ื™ืงื”.
19:38
those are the things that will happen first.
463
1178372
2059
ืืœื• ื”ื“ื‘ืจื™ื ืฉื™ืงืจื• ืงื•ื“ื.
19:40
But ultimately, the world's the limit.
464
1180455
1820
ื‘ืกื•ืคื• ืฉืœ ื“ื‘ืจ, ื”ืฉืžื™ื™ื ื”ื ื”ื’ื‘ื•ืœ.
19:42
I don't know how this will turn out.
465
1182299
1732
ืื™ื ื ื™ ื™ื•ื“ืข ืื™ืš ื–ื” ื™ืชืคืชื—.
19:44
I know a lot of people who invented the microprocessor.
466
1184055
2591
ืื ื™ ืžื›ื™ืจ ืื ืฉื™ื ืจื‘ื™ื ืฉื”ืžืฆื™ืื• ืืช ื”ืžื™ืงืจื•-ืžืขื‘ื“ื™ื
19:46
And if you talk to them,
467
1186670
2164
ืื ื”ื™ื™ืชื ืฉื•ืืœื™ื ืื•ืชื, ื”ื ื™ื“ืขื• ืฉืžื” ืฉื”ื ืขื•ืฉื™ื ื”ื•ื ืžืื•ื“ ื—ืฉื•ื‘,
19:48
they knew what they were doing was really significant,
468
1188858
2575
19:51
but they didn't really know what was going to happen.
469
1191457
2500
ืืš ื”ื ืœื ื™ื“ืขื• ืžื” ื™ืงืจื” ื‘ืขืชื™ื“.
19:53
They couldn't anticipate cell phones and the Internet
470
1193981
2768
ื”ื ืœื ืฆืคื• ืืช ื”ื˜ืœืคื•ื ื™ื ื”ืกืœื•ืœืจื™ื, ืืช ื”ืื™ื ื˜ืจื ื˜ ื•ื›ืœ ื”ื“ื‘ืจื™ื ื”ืœืœื•.
19:56
and all this kind of stuff.
471
1196773
1735
19:58
They just knew like, "We're going to build calculators
472
1198532
2621
ื”ื ืจืง ื™ื“ืขื• ืฉื”ื ืขื•ืžื“ื™ื ืœื‘ื ื•ืช ืžื—ืฉื‘ื•ื ื™ื
20:01
and traffic-light controllers.
473
1201177
1440
ื•ื‘ืงืจืช ืจืžื–ื•ืจื™ื. ื–ื” ื”ื•ืœืš ืœื”ื™ื•ืช ื’ื“ื•ืœ.
20:02
But it's going to be big!"
474
1202641
1299
20:03
In the same way, brain science and these memories
475
1203964
2341
ืžื“ืขื™ ื”ืžื— ื•ืžืขืจื›ื•ืช ื”ื–ื›ืจื•ืŸ ื”ืœืœื•,
20:06
are going to be a very fundamental technology,
476
1206329
2225
ื™ื”ื™ื• ื˜ื›ื ื•ืœื•ื’ื™ื” ืžืื•ื“ ื‘ืกื™ืกื™ืช, ืฉืชื•ื‘ื™ืœ
20:08
and it will lead to unbelievable changes in the next 100 years.
477
1208578
3442
ืœืฉื™ื ื•ื™ื™ื ืžืจื—ื™ืงื™ ืœื›ืช ื‘ืžืื” ืฉื ื™ื ื”ืงืจื•ื‘ื•ืช.
20:12
And I'm most excited about how we're going to use them in science.
478
1212044
3405
ื•ืื ื™ ื”ื›ื™ ื ืœื”ื‘ ืœื’ืœื•ืช ื›ื™ืฆื“ ื ืฉืชืžืฉ ื‘ื”ื ื‘ืžื“ืข.
20:15
So I think that's all my time -- I'm over,
479
1215473
2837
ืื ื™ ื—ื•ืฉื‘ ืฉื–ืžื ื™ ืชื, ื•ื‘ื–ืืช ืืกื™ื™ื
20:18
and I'm going to end my talk right there.
480
1218334
2277
ืืช ื”ื”ืจืฆืื”.
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7