Ray Kurzweil: Get ready for hybrid thinking

534,016 views ・ 2014-06-02

TED



Translator: Minging Zhang / Reviewer: Yuanqing Edberg

00:12
Let me tell you a story. It goes back 200 million years. It's a story of the neocortex, which means "new rind." So in these early mammals, because only mammals have a neocortex, rodent-like creatures, it was the size of a postage stamp and just as thin, and was a thin covering around their walnut-sized brain, but it was capable of a new type of thinking. Rather than the fixed behaviors that non-mammalian animals have, it could invent new behaviors.

00:44
So a mouse is escaping a predator, its path is blocked, it'll try to invent a new solution. That may work, it may not, but if it does, it will remember that and have a new behavior, and that can actually spread virally through the rest of the community. Another mouse watching this could say, "Hey, that was pretty clever, going around that rock," and it could adopt a new behavior as well.

01:06
Non-mammalian animals couldn't do any of those things. They had fixed behaviors. Now they could learn a new behavior, but not in the course of one lifetime. In the course of maybe a thousand lifetimes, it could evolve a new fixed behavior. That was perfectly okay 200 million years ago. The environment changed very slowly. It could take 10,000 years for there to be a significant environmental change, and during that period of time it would evolve a new behavior.

01:33
Now that went along fine, but then something happened. Sixty-five million years ago, there was a sudden, violent change to the environment. We call it the Cretaceous extinction event. That's when the dinosaurs went extinct, that's when 75 percent of the animal and plant species went extinct, and that's when mammals overtook their ecological niche, and to anthropomorphize, biological evolution said, "Hmm, this neocortex is pretty good stuff," and it began to grow it. And mammals got bigger, their brains got bigger at an even faster pace, and the neocortex got bigger even faster than that and developed these distinctive ridges and folds, basically to increase its surface area.

02:19
If you took the human neocortex and stretched it out, it's about the size of a table napkin, and it's still a thin structure. It's about the thickness of a table napkin. But it has so many convolutions and ridges, it's now 80 percent of our brain, and that's where we do our thinking, and it's the great sublimator. We still have that old brain that provides our basic drives and motivations, but I may have a drive for conquest, and that'll be sublimated by the neocortex into writing a poem or inventing an app or giving a TED Talk, and it's really the neocortex that's where the action is.

02:56
Fifty years ago, I wrote a paper describing how I thought the brain worked, and I described it as a series of modules. Each module could do things with a pattern. It could learn a pattern. It could remember a pattern. It could implement a pattern. And these modules were organized in hierarchies, and we created that hierarchy with our own thinking. And there was actually very little to go on 50 years ago. It led me to meet President Johnson. I've been thinking about this for 50 years, and a year and a half ago I came out with the book "How To Create A Mind," which has the same thesis, but now there's a plethora of evidence.

03:32
The amount of data we're getting about the brain from neuroscience is doubling every year. Spatial resolution of brain scanning of all types is doubling every year. We can now see inside a living brain and see individual interneural connections connecting in real time, firing in real time. We can see your brain create your thoughts. We can see your thoughts create your brain, which is really key to how it works.

03:55
So let me describe briefly how it works. I've actually counted these modules. We have about 300 million of them, and we create them in these hierarchies. I'll give you a simple example. I've got a bunch of modules that can recognize the crossbar to a capital A, and that's all they care about. A beautiful song can play, a pretty girl could walk by, they don't care, but they see a crossbar to a capital A, they get very excited and they say "crossbar," and they put out a high probability on their output axon. That goes to the next level, and these layers are organized in conceptual levels. Each is more abstract than the next one, so the next one might say "capital A." That goes up to a higher level that might say "Apple." Information flows down also. If the apple recognizer has seen A-P-P-L, it'll think to itself, "Hmm, I think an E is probably likely," and it'll send a signal down to all the E recognizers saying, "Be on the lookout for an E, I think one might be coming." The E recognizers will lower their threshold and they see some sloppy thing, could be an E. Ordinarily you wouldn't think so, but we're expecting an E, it's good enough, and yeah, I've seen an E, and then apple says, "Yeah, I've seen an Apple."
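
The up-and-down message passing described here lends itself to a small simulation. The sketch below is not the model from "How To Create A Mind"; the Recognizer class, the thresholds, and the 0.6 priming factor are all illustrative assumptions. It only shows the two mechanisms in the passage: a module that fires when its pattern clears a confidence threshold, and top-down priming that lowers that threshold when a higher level expects the pattern.

```python
# Minimal sketch of hierarchical pattern recognition with top-down priming.
# Names, thresholds, and the priming factor are illustrative assumptions,
# not the actual model described in the talk.

class Recognizer:
    def __init__(self, name, pattern, threshold=0.9):
        self.name = name
        self.pattern = pattern        # sequence of lower-level labels it cares about
        self.threshold = threshold    # confidence needed to fire
        self.primed = False

    def prime(self):
        """Top-down signal: 'be on the lookout' -> accept sloppier input."""
        self.primed = True

    def fire(self, observed, confidences):
        """Bottom-up signal: fire (return a probability on the 'output axon')
        if the observed labels match the pattern and their average confidence
        clears the (possibly lowered) threshold."""
        if list(observed) != list(self.pattern):
            return 0.0
        score = sum(confidences) / len(confidences)
        threshold = self.threshold * (0.6 if self.primed else 1.0)
        return score if score >= threshold else 0.0


# Level 1: a stroke recognizer; level 2: a letter; level 3: a word.
crossbar = Recognizer("crossbar", ["horizontal stroke"], threshold=0.8)
capital_a = Recognizer("capital A", ["left slant", "right slant", "crossbar"])
apple = Recognizer("APPLE", ["A", "P", "P", "L", "E"], threshold=0.8)

# The APPLE recognizer has seen A-P-P-L, so it primes the E recognizer below it.
e_recognizer = Recognizer("E", ["E-shape"])   # default threshold 0.9
e_recognizer.prime()

print(crossbar.fire(["horizontal stroke"], [0.95]))                        # 0.95 clears 0.8 -> fires
print(capital_a.fire(["left slant", "right slant", "crossbar"],
                     [0.95, 0.90, 0.95]))                                  # avg ~0.93 clears 0.9 -> fires
print(e_recognizer.fire(["E-shape"], [0.6]))                               # sloppy E, but primed threshold is 0.54 -> fires
print(apple.fire(["A", "P", "P", "L", "E"], [0.9, 0.9, 0.9, 0.9, 0.6]))    # avg 0.84 clears 0.8 -> "I've seen an Apple"
```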

05:03
Go up another five levels, and you're now at a pretty high level of this hierarchy, and stretch down into the different senses, and you may have a module that sees a certain fabric, hears a certain voice quality, smells a certain perfume, and will say, "My wife has entered the room." Go up another 10 levels, and now you're at a very high level. You're probably in the frontal cortex, and you'll have modules that say, "That was ironic. That's funny. She's pretty." You might think that those are more sophisticated, but actually what's more complicated is the hierarchy beneath them.

05:36
There was a 16-year-old girl, she had brain surgery, and she was conscious because the surgeons wanted to talk to her. You can do that because there's no pain receptors in the brain. And whenever they stimulated particular, very small points on her neocortex, shown here in red, she would laugh. So at first they thought they were triggering some kind of laugh reflex, but no, they quickly realized they had found the points in her neocortex that detect humor, and she just found everything hilarious whenever they stimulated these points. "You guys are so funny just standing around," was the typical comment, and they weren't funny, not while doing surgery.

06:14
So how are we doing today? Well, computers are actually beginning to master human language with techniques that are similar to the neocortex. I actually described the algorithm, which is similar to something called a hierarchical hidden Markov model, something I've worked on since the '90s.
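
For readers who have not met the term: a hidden Markov model assigns a probability to an observed sequence by summing over the hidden state paths that could have generated it. The sketch below is a plain, single-level forward algorithm with made-up states and probabilities; it is meant only to illustrate the family of models being named here, not the hierarchical variant or the speaker's own algorithm.

```python
# Forward algorithm for an ordinary hidden Markov model (illustrative toy,
# not the hierarchical variant mentioned in the talk). The states,
# observations, and probabilities are made up for the example.

states = ["vowel", "consonant"]
start = {"vowel": 0.4, "consonant": 0.6}
trans = {
    "vowel":     {"vowel": 0.2, "consonant": 0.8},
    "consonant": {"vowel": 0.7, "consonant": 0.3},
}
emit = {
    "vowel":     {"a": 0.5, "e": 0.4, "p": 0.05, "l": 0.05},
    "consonant": {"a": 0.05, "e": 0.05, "p": 0.5, "l": 0.4},
}

def forward(observations):
    """Return P(observations) by summing over all hidden state paths."""
    # Initialize with the first observation.
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    # Propagate forward one observation at a time.
    for obs in observations[1:]:
        alpha = {
            s: sum(alpha[prev] * trans[prev][s] for prev in states) * emit[s][obs]
            for s in states
        }
    return sum(alpha.values())

# Likelihood of the letter sequence A-P-P-L-E under the toy model.
print(forward(["a", "p", "p", "l", "e"]))
```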

06:36
"Jeopardy" is a very broad natural language game, and Watson got a higher score than the best two players combined. It got this query correct: "A long, tiresome speech delivered by a frothy pie topping," and it quickly responded, "What is a meringue harangue?" And Jennings and the other guy didn't get that. It's a pretty sophisticated example of computers actually understanding human language, and it actually got its knowledge by reading Wikipedia and several other encyclopedias.

07:04
Five to 10 years from now, search engines will actually be based on not just looking for combinations of words and links but actually understanding, reading for understanding the billions of pages on the web and in books. So you'll be walking along, and Google will pop up and say, "You know, Mary, you expressed concern to me a month ago that your glutathione supplement wasn't getting past the blood-brain barrier. Well, new research just came out 13 seconds ago that shows a whole new approach to that and a new way to take glutathione. Let me summarize it for you."

07:38
Twenty years from now, we'll have nanobots, because another exponential trend is the shrinking of technology. They'll go into our brain through the capillaries and basically connect our neocortex to a synthetic neocortex in the cloud, providing an extension of our neocortex.

07:59
Now today, I mean, you have a computer in your phone, but if you need 10,000 computers for a few seconds to do a complex search, you can access that for a second or two in the cloud. In the 2030s, if you need some extra neocortex, you'll be able to connect to that in the cloud directly from your brain. So I'm walking along and I say, "Oh, there's Chris Anderson. He's coming my way. I'd better think of something clever to say. I've got three seconds. My 300 million modules in my neocortex isn't going to cut it. I need a billion more." I'll be able to access that in the cloud.

08:34
And our thinking, then, will be a hybrid of biological and non-biological thinking, but the non-biological portion is subject to my law of accelerating returns. It will grow exponentially.
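
A quick back-of-the-envelope calculation shows why an exponentially growing non-biological portion would dominate so quickly. The one-year doubling time below is an assumption carried over from the earlier "doubling every year" remarks about neuroscience data, not a figure given in this passage.

```python
# Toy illustration of exponential growth; the one-year doubling time is an
# assumption for the example, not a figure from the talk.

def growth_factor(years, doubling_time_years=1.0):
    """Return how many times larger a quantity becomes after `years`."""
    return 2 ** (years / doubling_time_years)

for years in (10, 20, 30):
    print(f"after {years} years: {growth_factor(years):,.0f}x")
# after 10 years: 1,024x
# after 20 years: 1,048,576x
# after 30 years: 1,073,741,824x
```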

08:47
And remember what happened the last time we expanded our neocortex? That was two million years ago when we became humanoids and developed these large foreheads. Other primates have a slanted brow. They don't have the frontal cortex. But the frontal cortex is not really qualitatively different. It's a quantitative expansion of neocortex, but that additional quantity of thinking was the enabling factor for us to take a qualitative leap and invent language and art and science and technology and TED conferences. No other species has done that.

09:20
And so, over the next few decades, we're going to do it again. We're going to again expand our neocortex, only this time we won't be limited by a fixed architecture of enclosure. It'll be expanded without limit. That additional quantity will again be the enabling factor for another qualitative leap in culture and technology. Thank you very much.

(Applause)