Ray Kurzweil: Get ready for hybrid thinking

521,726 views ・ 2014-06-02

TED



00:12
Let me tell you a story. It goes back 200 million years. It's a story of the neocortex, which means "new rind." So in these early mammals, because only mammals have a neocortex, rodent-like creatures, it was the size of a postage stamp and just as thin, and was a thin covering around their walnut-sized brain, but it was capable of a new type of thinking. Rather than the fixed behaviors that non-mammalian animals have, it could invent new behaviors.

00:44
So a mouse is escaping a predator, its path is blocked, it'll try to invent a new solution. That may work, it may not, but if it does, it will remember that and have a new behavior, and that can actually spread virally through the rest of the community. Another mouse watching this could say, "Hey, that was pretty clever, going around that rock," and it could adopt a new behavior as well.

01:06
Non-mammalian animals couldn't do any of those things. They had fixed behaviors. Now they could learn a new behavior, but not in the course of one lifetime. In the course of maybe a thousand lifetimes, it could evolve a new fixed behavior. That was perfectly okay 200 million years ago. The environment changed very slowly. It could take 10,000 years for there to be a significant environmental change, and during that period of time it would evolve a new behavior.

01:33
Now that went along fine, but then something happened. Sixty-five million years ago, there was a sudden, violent change to the environment. We call it the Cretaceous extinction event. That's when the dinosaurs went extinct, that's when 75 percent of the animal and plant species went extinct, and that's when mammals overtook their ecological niche, and to anthropomorphize, biological evolution said, "Hmm, this neocortex is pretty good stuff," and it began to grow it.

02:05
And mammals got bigger, their brains got bigger at an even faster pace, and the neocortex got bigger even faster than that and developed these distinctive ridges and folds, basically to increase its surface area. If you took the human neocortex and stretched it out, it's about the size of a table napkin, and it's still a thin structure. It's about the thickness of a table napkin. But it has so many convolutions and ridges it's now 80 percent of our brain, and that's where we do our thinking, and it's the great sublimator. We still have that old brain that provides our basic drives and motivations, but I may have a drive for conquest, and that'll be sublimated by the neocortex into writing a poem or inventing an app or giving a TED Talk, and it's really the neocortex that's where the action is.

02:56
Fifty years ago, I wrote a paper describing how I thought the brain worked, and I described it as a series of modules. Each module could do things with a pattern. It could learn a pattern. It could remember a pattern. It could implement a pattern. And these modules were organized in hierarchies, and we created that hierarchy with our own thinking. And there was actually very little to go on 50 years ago. It led me to meet President Johnson.

03:22
I've been thinking about this for 50 years, and a year and a half ago I came out with the book "How To Create A Mind," which has the same thesis, but now there's a plethora of evidence. The amount of data we're getting about the brain from neuroscience is doubling every year. Spatial resolution of brain scanning of all types is doubling every year. We can now see inside a living brain and see individual interneural connections connecting in real time, firing in real time. We can see your brain create your thoughts. We can see your thoughts create your brain, which is really key to how it works.

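To put that doubling rate in perspective, here is a small worked illustration. The yearly doubling is the claim from the talk; the time horizons and the code itself are only an illustration:

```python
# Growth under yearly doubling: a factor of 2**n after n years.
# The doubling period comes from the talk; the horizons are illustrative.
for years in (5, 10, 20):
    print(f"factor after {years} years: {2**years:,}x")
# factor after 5 years: 32x
# factor after 10 years: 1,024x
# factor after 20 years: 1,048,576x
```
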
03:55
So let me describe briefly how it works. I've actually counted these modules. We have about 300 million of them, and we create them in these hierarchies.

04:03
I'll give you a simple example. I've got a bunch of modules that can recognize the crossbar to a capital A, and that's all they care about. A beautiful song can play, a pretty girl could walk by, they don't care, but they see a crossbar to a capital A, they get very excited and they say "crossbar," and they put out a high probability on their output axon. That goes to the next level, and these layers are organized in conceptual levels. Each is more abstract than the next one, so the next one might say "capital A." That goes up to a higher level that might say "Apple." Information flows down also. If the apple recognizer has seen A-P-P-L, it'll think to itself, "Hmm, I think an E is probably likely," and it'll send a signal down to all the E recognizers saying, "Be on the lookout for an E, I think one might be coming." The E recognizers will lower their threshold and they see some sloppy thing, could be an E. Ordinarily you wouldn't think so, but we're expecting an E, it's good enough, and yeah, I've seen an E, and then apple says, "Yeah, I've seen an Apple."

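In code, that combination of bottom-up recognition and top-down priming might look roughly like the following minimal Python sketch. The class names, thresholds, and evidence values are illustrative assumptions, not Kurzweil's actual model:

```python
# Toy sketch of hierarchical pattern recognition with top-down priming.
# All names, thresholds, and probabilities are illustrative assumptions.

class Recognizer:
    def __init__(self, name, threshold=0.9):
        self.name = name
        self.threshold = threshold   # evidence needed before this module "fires"
        self.primed = False          # set by a higher level expecting this pattern

    def prime(self):
        """Top-down signal: 'be on the lookout', so accept sloppier input."""
        self.primed = True
        self.threshold = 0.5

    def observe(self, evidence: float) -> bool:
        """Fire if bottom-up evidence clears the (possibly lowered) threshold."""
        return evidence >= self.threshold


class WordRecognizer:
    def __init__(self, word, letter_recognizers):
        self.word = word
        self.letters = letter_recognizers
        self.position = 0

    def feed(self, evidence: float):
        letter = self.letters[self.position]
        if letter.observe(evidence):
            self.position += 1
            if self.position == len(self.letters):
                print(f"Recognized: {self.word}")
                return
            # Prime the next expected letter (e.g. after A-P-P-L, expect an E).
            self.letters[self.position].prime()


# Usage: strong evidence for A, P, P, L, then only weak, "sloppy" evidence for E.
letters = [Recognizer(c) for c in "APPLE"]
apple = WordRecognizer("APPLE", letters)
for e in [0.95, 0.95, 0.95, 0.95, 0.6]:   # 0.6 would not clear the default 0.9,
    apple.feed(e)                          # but the primed E recognizer accepts it.
```

The only point of the sketch is the last step: a weak "sloppy" E that would normally be rejected is accepted because the word-level recognizer has already lowered the E recognizer's threshold.
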
05:03
Go up another five levels, and you're now at a pretty high level of this hierarchy, and stretch down into the different senses, and you may have a module that sees a certain fabric, hears a certain voice quality, smells a certain perfume, and will say, "My wife has entered the room." Go up another 10 levels, and now you're at a very high level. You're probably in the frontal cortex, and you'll have modules that say, "That was ironic. That's funny. She's pretty."

05:29
You might think that those are more sophisticated, but actually what's more complicated is the hierarchy beneath them.

05:36
There was a 16-year-old girl, she had brain surgery, and she was conscious because the surgeons wanted to talk to her. You can do that because there's no pain receptors in the brain. And whenever they stimulated particular, very small points on her neocortex, shown here in red, she would laugh. So at first they thought they were triggering some kind of laugh reflex, but no, they quickly realized they had found the points in her neocortex that detect humor, and she just found everything hilarious whenever they stimulated these points. "You guys are so funny just standing around" was the typical comment, and they weren't funny, not while doing surgery.

06:14
So how are we doing today? Well, computers are actually beginning to master human language with techniques that are similar to the neocortex. I actually described the algorithm, which is similar to something called a hierarchical hidden Markov model, something I've worked on since the '90s.

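For readers unfamiliar with the term, a hidden Markov model scores an observed sequence by summing over hidden state paths; the hierarchical variant Kurzweil refers to stacks such models into levels. The sketch below is a plain, minimal forward pass, with made-up states and probabilities, purely for orientation and not his actual algorithm:

```python
# Minimal forward-algorithm sketch for a plain hidden Markov model.
# States, symbols, and probabilities are illustrative assumptions only.

states = ["vowel", "consonant"]
start  = {"vowel": 0.4, "consonant": 0.6}
trans  = {"vowel":     {"vowel": 0.2, "consonant": 0.8},
          "consonant": {"vowel": 0.7, "consonant": 0.3}}
emit   = {"vowel":     {"a": 0.5,  "e": 0.4,  "p": 0.05, "l": 0.05},
          "consonant": {"a": 0.05, "e": 0.05, "p": 0.5,  "l": 0.4}}

def forward(observations):
    """Probability of an observation sequence, summing over hidden state paths."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: sum(alpha[prev] * trans[prev][s] for prev in states) * emit[s][obs]
                 for s in states}
    return sum(alpha.values())

print(forward(list("apple")))   # likelihood of the letter sequence a-p-p-l-e
```
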
06:36
"Jeopardy" is a very broad natural language game, and Watson got a higher score than the best two players combined. It got this query correct: "A long, tiresome speech delivered by a frothy pie topping," and it quickly responded, "What is a meringue harangue?" And Jennings and the other guy didn't get that. It's a pretty sophisticated example of computers actually understanding human language, and it actually got its knowledge by reading Wikipedia and several other encyclopedias.

07:04
Five to 10 years from now, search engines will actually be based on not just looking for combinations of words and links but actually understanding, reading for understanding the billions of pages on the web and in books.

07:19
So you'll be walking along, and Google will pop up and say, "You know, Mary, you expressed concern to me a month ago that your glutathione supplement wasn't getting past the blood-brain barrier. Well, new research just came out 13 seconds ago that shows a whole new approach to that and a new way to take glutathione. Let me summarize it for you."

07:38
Twenty years from now, we'll have nanobots, because another exponential trend is the shrinking of technology. They'll go into our brain through the capillaries and basically connect our neocortex to a synthetic neocortex in the cloud, providing an extension of our neocortex.

07:59
Now today, I mean, you have a computer in your phone, but if you need 10,000 computers for a few seconds to do a complex search, you can access that for a second or two in the cloud.

08:09
In the 2030s, if you need some extra neocortex, you'll be able to connect to that in the cloud directly from your brain. So I'm walking along and I say, "Oh, there's Chris Anderson. He's coming my way. I'd better think of something clever to say. I've got three seconds. My 300 million modules in my neocortex isn't going to cut it. I need a billion more." I'll be able to access that in the cloud.

08:34
And our thinking, then, will be a hybrid of biological and non-biological thinking, but the non-biological portion is subject to my law of accelerating returns. It will grow exponentially.

08:47
And remember what happens
227
527260
2016
大家是否還記得,上次新皮層大幅擴張時
08:49
the last time we expanded our neocortex?
228
529276
2645
發生了哪些重大變化?
08:51
That was two million years ago
229
531921
1426
那是200萬年前,
08:53
when we became humanoids
230
533347
1236
我們那時還只是猿人,
08:54
and developed these large foreheads.
231
534583
1594
開始發育出碩大的前額。
08:56
Other primates have a slanted brow.
232
536177
2583
而其他靈長類動物的前額向後傾斜,
08:58
They don't have the frontal cortex.
233
538760
1745
因為牠們沒有額葉皮層。
09:00
But the frontal cortex is not really qualitatively different.
234
540505
3685
但是,額葉皮層並不意味著質的變化;
09:04
It's a quantitative expansion of neocortex,
235
544190
2743
而是新皮層量的提升,
09:06
but that additional quantity of thinking
236
546933
2703
帶來了額外的思考能力,
09:09
was the enabling factor for us to take
237
549636
1779
最終促成了質的飛躍。
09:11
a qualitative leap and invent language
238
551415
3346
我們因而能夠發明語言,
09:14
and art and science and technology
239
554761
1967
創造藝術,發展科技,
09:16
and TED conferences.
240
556728
1454
並舉辦TED演講,
09:18
No other species has done that.
241
558182
2131
這都是其他物種難以完成的創舉。
09:20
And so, over the next few decades,
242
560313
2075
我相信未來數十年,
09:22
we're going to do it again.
243
562388
1760
我們將再次創造偉大的奇蹟。
09:24
We're going to again expand our neocortex,
244
564148
2274
我們將借助科技,再次擴張新皮層,
09:26
only this time we won't be limited
245
566422
1756
不同之處在於,
09:28
by a fixed architecture of enclosure.
246
568178
4280
我們將不再受到頭顱空間的局限,
09:32
It'll be expanded without limit.
247
572458
3304
意味著擴張並無止境。
09:35
That additional quantity will again
248
575762
2243
隨之而來的量的增加
09:38
be the enabling factor for another qualitative leap
249
578005
3005
在人文和科技領域,
09:41
in culture and technology.
250
581010
1635
將再次引發一輪質的飛躍。
09:42
Thank you very much.
251
582645
2054
謝謝大家!
09:44
(Applause)
252
584699
3086
(掌聲)