Ray Kurzweil: Get ready for hybrid thinking

531,025 views · 2014-06-02

TED


ืžืชืจื’ื: Shlomo Adam ืžื‘ืงืจ: Sigal Tifferet

00:12
Let me tell you a story. It goes back 200 million years. It's a story of the neocortex, which means "new rind." So in these early mammals, because only mammals have a neocortex, rodent-like creatures, it was the size of a postage stamp and just as thin, and was a thin covering around their walnut-sized brain, but it was capable of a new type of thinking. Rather than the fixed behaviors that non-mammalian animals have, it could invent new behaviors.

00:44
So a mouse is escaping a predator, its path is blocked, it'll try to invent a new solution. That may work, it may not, but if it does, it will remember that and have a new behavior, and that can actually spread virally through the rest of the community. Another mouse watching this could say, "Hey, that was pretty clever, going around that rock," and it could adopt a new behavior as well.

01:06
Non-mammalian animals couldn't do any of those things. They had fixed behaviors. Now they could learn a new behavior, but not in the course of one lifetime. In the course of maybe a thousand lifetimes, it could evolve a new fixed behavior.

01:20
That was perfectly okay 200 million years ago. The environment changed very slowly. It could take 10,000 years for there to be a significant environmental change, and during that period of time it would evolve a new behavior.

01:33
Now that went along fine, but then something happened. Sixty-five million years ago, there was a sudden, violent change to the environment. We call it the Cretaceous extinction event. That's when the dinosaurs went extinct, that's when 75 percent of the animal and plant species went extinct, and that's when mammals overtook their ecological niche, and to anthropomorphize, biological evolution said, "Hmm, this neocortex is pretty good stuff," and it began to grow it.

02:05
And mammals got bigger, their brains got bigger at an even faster pace, and the neocortex got bigger even faster than that and developed these distinctive ridges and folds, basically to increase its surface area. If you took the human neocortex and stretched it out, it's about the size of a table napkin, and it's still a thin structure. It's about the thickness of a table napkin. But it has so many convolutions and ridges it's now 80 percent of our brain, and that's where we do our thinking, and it's the great sublimator. We still have that old brain that provides our basic drives and motivations, but I may have a drive for conquest, and that'll be sublimated by the neocortex into writing a poem or inventing an app or giving a TED Talk, and it's really the neocortex that's where the action is.

02:56
Fifty years ago, I wrote a paper describing how I thought the brain worked, and I described it as a series of modules. Each module could do things with a pattern. It could learn a pattern. It could remember a pattern. It could implement a pattern. And these modules were organized in hierarchies, and we created that hierarchy with our own thinking.
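
To make the module idea concrete, here is a minimal sketch in Python of pattern modules arranged in a hierarchy: each module learns one pattern, reports how strongly an input matches it, and a higher-level module treats the names of the lower-level modules that fired as its own input pattern. The class, the stroke names, and the scoring rule are illustrative assumptions, not Kurzweil's actual model.

```python
# A minimal sketch (not Kurzweil's actual model) of pattern modules in a
# hierarchy: each module stores one pattern, reports how well an input
# matches it, and a higher-level module takes the names of the
# lower-level modules that fired as its own input.

class PatternModule:
    def __init__(self, name):
        self.name = name
        self.pattern = None                 # the one pattern this module cares about

    def learn(self, pattern):
        """Remember a pattern (a list of lower-level feature names)."""
        self.pattern = tuple(pattern)

    def recognize(self, inputs):
        """Match score in [0, 1]: fraction of the expected features present."""
        if not self.pattern:
            return 0.0
        hits = sum(1 for feature in self.pattern if feature in inputs)
        return hits / len(self.pattern)

# Level 1: stroke detectors. Level 2: a letter detector built on top of them.
crossbar = PatternModule("crossbar")
crossbar.learn(["horizontal-stroke"])
left_leg = PatternModule("left-diagonal")
left_leg.learn(["left-diagonal-stroke"])
right_leg = PatternModule("right-diagonal")
right_leg.learn(["right-diagonal-stroke"])

capital_a = PatternModule("capital A")
capital_a.learn(["crossbar", "left-diagonal", "right-diagonal"])

# Bottom-up pass: strokes fire first, then the letter module fires on their names.
strokes = ["horizontal-stroke", "left-diagonal-stroke", "right-diagonal-stroke"]
fired = [m.name for m in (crossbar, left_leg, right_leg) if m.recognize(strokes) > 0.5]
print(capital_a.recognize(fired))           # 1.0 -> "capital A" recognized
```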

03:15
And there was actually very little to go on 50 years ago. It led me to meet President Johnson. I've been thinking about this for 50 years, and a year and a half ago I came out with the book "How To Create A Mind," which has the same thesis, but now there's a plethora of evidence.

03:32
The amount of data we're getting about the brain from neuroscience is doubling every year. Spatial resolution of brain scanning of all types is doubling every year. We can now see inside a living brain and see individual interneural connections connecting in real time, firing in real time. We can see your brain create your thoughts. We can see your thoughts create your brain, which is really key to how it works.
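
As a quick aside on what "doubling every year" compounds to (my arithmetic, not a figure from the talk): after n years of doubling, a quantity has grown by a factor of 2^n, so roughly a thousandfold per decade.

```python
# What "doubling every year" compounds to: after n years the quantity is
# 2**n times its starting value (plain exponential growth).
for years in (1, 5, 10, 20):
    print(f"{years:2d} years -> {2 ** years:,}x")
# prints 2x, 32x, 1,024x and 1,048,576x respectively
```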

03:55
So let me describe briefly how it works. I've actually counted these modules. We have about 300 million of them, and we create them in these hierarchies. I'll give you a simple example. I've got a bunch of modules that can recognize the crossbar to a capital A, and that's all they care about. A beautiful song can play, a pretty girl could walk by, they don't care, but they see a crossbar to a capital A, they get very excited and they say "crossbar," and they put out a high probability on their output axon. That goes to the next level, and these layers are organized in conceptual levels. Each is more abstract than the next one, so the next one might say "capital A." That goes up to a higher level that might say "Apple."

04:37
Information flows down also. If the apple recognizer has seen A-P-P-L, it'll think to itself, "Hmm, I think an E is probably likely," and it'll send a signal down to all the E recognizers saying, "Be on the lookout for an E, I think one might be coming." The E recognizers will lower their threshold and they see some sloppy thing, could be an E. Ordinarily you wouldn't think so, but we're expecting an E, it's good enough, and yeah, I've seen an E, and then apple says, "Yeah, I've seen an Apple."
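
Here is a minimal sketch in Python of that top-down flow: a word-level "apple" recognizer that has seen A-P-P-L tells the letter level to expect an E, which lowers that detector's acceptance threshold, so a sloppy glyph that would ordinarily be rejected now counts as an E. The class names, thresholds, and evidence scores are illustrative assumptions rather than anything taken from the talk itself.

```python
# A minimal sketch of the top-down flow described above: the word-level
# "apple" recognizer, having seen A-P-P-L, tells the letter level to
# expect an E, which lowers that detector's acceptance threshold.
# Names, thresholds and scores here are illustrative, not from the talk.

class LetterRecognizer:
    def __init__(self, letter, threshold=0.8):
        self.letter = letter
        self.threshold = threshold           # normal (strict) threshold

    def expect(self):
        """Top-down signal: 'be on the lookout' -> accept weaker evidence."""
        self.threshold = 0.5

    def recognize(self, evidence):
        """evidence: how strongly a sloppy glyph resembles this letter (0..1)."""
        return evidence >= self.threshold

class WordRecognizer:
    def __init__(self, word, letter_recognizers):
        self.word = word
        self.letters = letter_recognizers    # letter -> LetterRecognizer
        self.seen = ""

    def feed(self, letter):
        """Accept the next recognized letter and predict the one after it."""
        self.seen += letter
        nxt = self.word[len(self.seen):len(self.seen) + 1]
        if nxt in self.letters:              # send the expectation downward
            self.letters[nxt].expect()
        return self.seen == self.word

e_detector = LetterRecognizer("E")
apple = WordRecognizer("APPLE", {"E": e_detector})

for ch in "APPL":
    apple.feed(ch)                           # after "L", the E detector is primed

sloppy_e = 0.6                               # ordinarily too weak to count as an E
if e_detector.recognize(sloppy_e):           # passes only because the threshold dropped
    print(apple.feed("E"))                   # True -> "Yeah, I've seen an Apple."
```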

05:03
Go up another five levels, and you're now at a pretty high level of this hierarchy, and stretch down into the different senses, and you may have a module that sees a certain fabric, hears a certain voice quality, smells a certain perfume, and will say, "My wife has entered the room." Go up another 10 levels, and now you're at a very high level. You're probably in the frontal cortex, and you'll have modules that say, "That was ironic. That's funny. She's pretty." You might think that those are more sophisticated, but actually what's more complicated is the hierarchy beneath them.

05:36
There was a 16-year-old girl, she had brain surgery, and she was conscious because the surgeons wanted to talk to her. You can do that because there's no pain receptors in the brain. And whenever they stimulated particular, very small points on her neocortex, shown here in red, she would laugh. So at first they thought they were triggering some kind of laugh reflex, but no, they quickly realized they had found the points in her neocortex that detect humor, and she just found everything hilarious whenever they stimulated these points. "You guys are so funny just standing around," was the typical comment, and they weren't funny, not while doing surgery.

06:14
So how are we doing today? Well, computers are actually beginning to master human language with techniques that are similar to the neocortex. I actually described the algorithm, which is similar to something called a hierarchical hidden Markov model, something I've worked on since the '90s.
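
For reference, the sketch below shows the forward pass of an ordinary (non-hierarchical) hidden Markov model, which scores how likely an observation sequence is under a probabilistic pattern model; the hierarchical variant Kurzweil refers to nests models like this inside the states of another, which is not shown here. The states, probabilities, and observations are toy values of my own.

```python
# For reference: the forward pass of a plain hidden Markov model, which
# computes P(observation sequence) by summing over hidden-state paths.
# The hierarchical variant mentioned in the talk nests models like this
# inside the states of another; that nesting is not shown here.
# All probabilities below are made-up toy numbers.

states = ("crossbar", "no-crossbar")
start = {"crossbar": 0.5, "no-crossbar": 0.5}
transition = {
    "crossbar":    {"crossbar": 0.7, "no-crossbar": 0.3},
    "no-crossbar": {"crossbar": 0.4, "no-crossbar": 0.6},
}
emission = {                      # P(observed stroke | hidden state)
    "crossbar":    {"stroke": 0.9, "blank": 0.1},
    "no-crossbar": {"stroke": 0.2, "blank": 0.8},
}

def forward(observations):
    """Return P(observations) under the toy model above."""
    alpha = {s: start[s] * emission[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {
            s: emission[s][obs] * sum(alpha[p] * transition[p][s] for p in states)
            for s in states
        }
    return sum(alpha.values())

print(forward(["stroke", "stroke", "blank"]))  # likelihood of this stroke sequence
```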

06:36
"Jeopardy" is a very broad natural language game, and Watson got a higher score than the best two players combined. It got this query correct: "A long, tiresome speech delivered by a frothy pie topping," and it quickly responded, "What is a meringue harangue?" And Jennings and the other guy didn't get that. It's a pretty sophisticated example of computers actually understanding human language, and it actually got its knowledge by reading Wikipedia and several other encyclopedias.

07:04
Five to 10 years from now, search engines will actually be based on not just looking for combinations of words and links but actually understanding, reading for understanding the billions of pages on the web and in books. So you'll be walking along, and Google will pop up and say, "You know, Mary, you expressed concern to me a month ago that your glutathione supplement wasn't getting past the blood-brain barrier. Well, new research just came out 13 seconds ago that shows a whole new approach to that and a new way to take glutathione. Let me summarize it for you."

07:38
Twenty years from now, we'll have nanobots, because another exponential trend is the shrinking of technology. They'll go into our brain through the capillaries and basically connect our neocortex to a synthetic neocortex in the cloud, providing an extension of our neocortex. Now today, I mean, you have a computer in your phone, but if you need 10,000 computers for a few seconds to do a complex search, you can access that for a second or two in the cloud. In the 2030s, if you need some extra neocortex, you'll be able to connect to that in the cloud directly from your brain. So I'm walking along and I say, "Oh, there's Chris Anderson. He's coming my way. I'd better think of something clever to say. I've got three seconds. My 300 million modules in my neocortex isn't going to cut it. I need a billion more." I'll be able to access that in the cloud.

08:34
And our thinking, then, will be a hybrid of biological and non-biological thinking, but the non-biological portion is subject to my law of accelerating returns. It will grow exponentially. And remember what happened the last time we expanded our neocortex? That was two million years ago when we became humanoids and developed these large foreheads. Other primates have a slanted brow. They don't have the frontal cortex. But the frontal cortex is not really qualitatively different. It's a quantitative expansion of neocortex, but that additional quantity of thinking was the enabling factor for us to take a qualitative leap and invent language and art and science and technology and TED conferences. No other species has done that.

09:20
And so, over the next few decades, we're going to do it again. We're going to again expand our neocortex, only this time we won't be limited by a fixed architecture of enclosure. It'll be expanded without limit. That additional quantity will again be the enabling factor for another qualitative leap in culture and technology.

09:42
Thank you very much.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7