Sebastian Seung: I am my connectome

248,430 views ・ 2010-09-28

TED



Translator: Lili Liang  Reviewer: Xiaoqiao Xie
We live in a remarkable time, the age of genomics. Your genome is the entire sequence of your DNA. Your sequence and mine are slightly different. That's why we look different. I've got brown eyes; you might have blue or gray. But it's not just skin-deep. The headlines tell us that genes can give us scary diseases, maybe even shape our personality, or give us mental disorders. Our genes seem to have awesome power over our destinies. And yet, I would like to think that I am more than my genes.
What do you guys think? Are you more than your genes? (Audience: Yes.) Yes? I think some people agree with me. I think we should make a statement. I think we should say it all together. All right: "I'm more than my genes" -- all together. Everybody: I am more than my genes. (Cheering)
Sebastian Seung: What am I? (Laughter) I am my connectome.

Now, since you guys are really great, maybe you can humor me and say this all together too. (Laughter) Right. All together now. Everybody: I am my connectome.

SS: That sounded great. You know, you guys are so great, you don't even know what a connectome is, and you're willing to play along with me. I could just go home now.
Well, so far only one connectome is known, that of this tiny worm. Its modest nervous system consists of just 300 neurons. And in the 1970s and '80s, a team of scientists mapped all 7,000 connections between the neurons. In this diagram, every node is a neuron, and every line is a connection. This is the connectome of the worm C. elegans.
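A diagram like this -- every node a neuron, every line a connection -- is simply a directed graph, and a graph can be held in an adjacency list. A minimal sketch (the connections below are invented for illustration; the real C. elegans graph has about 300 nodes and 7,000 edges):

```python
# A connectome diagram is a graph: nodes are neurons, lines are
# connections. The wiring below is invented for illustration only.
connectome = {
    "AVAL": ["AVBL", "PVCL"],   # neuron -> neurons it connects to
    "AVBL": ["PVCL"],
    "PVCL": [],
}

n_neurons = len(connectome)
n_connections = sum(len(targets) for targets in connectome.values())
print(n_neurons, n_connections)  # 3 3
```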
Your connectome is far more complex than this, because your brain contains 100 billion neurons and 10,000 times as many connections. There's a diagram like this for your brain, but there's no way it would fit on this slide. Your connectome contains one million times more connections than your genome has letters. That's a lot of information. What's in that information?
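The scale comparison can be checked with back-of-envelope arithmetic, using the speaker's round numbers and the commonly cited ~3 billion letters of the human genome:

```python
# Rough check of the scale comparison in the talk.
# All figures are round numbers, not precise measurements.
neurons = 100e9                 # ~100 billion neurons in a human brain
connections = 10_000 * neurons  # "10,000 times as many connections"
genome_letters = 3e9            # ~3 billion base pairs in the human genome

ratio = connections / genome_letters
print(f"{ratio:.0e}")  # on the order of a million times more
```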
We don't know for sure, but there are theories. Since the 19th century, neuroscientists have speculated that maybe your memories -- the information that makes you, you -- maybe your memories are stored in the connections between your brain's neurons. And perhaps other aspects of your personal identity -- maybe your personality and your intellect -- maybe they're also encoded in the connections between your neurons. And so now you can see why I proposed this hypothesis: I am my connectome. I didn't ask you to chant it because it's true; I just want you to remember it.
And in fact, we don't know if this hypothesis is correct, because we have never had technologies powerful enough to test it. Finding that worm connectome took over a dozen years of tedious labor. And to find the connectomes of brains more like our own, we need more sophisticated technologies, that are automated, that will speed up the process of finding connectomes. And in the next few minutes, I'll tell you about some of these technologies, which are currently under development in my lab and the labs of my collaborators.
Now you've probably seen pictures of neurons before. You can recognize them instantly by their fantastic shapes. They extend long and delicate branches, and in short, they look like trees. But this is just a single neuron. In order to find connectomes, we have to see all the neurons at the same time.
So let's meet Bobby Kasthuri, who works in the laboratory of Jeff Lichtman at Harvard University. Bobby is holding fantastically thin slices of a mouse brain. And we're zooming in by a factor of 100,000 times to obtain the resolution, so that we can see the branches of neurons all at the same time. Except, you still may not really recognize them, and that's because we have to work in three dimensions.
If we take many images of many slices of the brain and stack them up, we get a three-dimensional image. And still, you may not see the branches. So we start at the top, and we color in the cross-section of one branch in red, and we do that for the next slice and for the next slice. And we keep on doing that, slice after slice. If we continue through the entire stack, we can reconstruct the three-dimensional shape of a small fragment of a branch of a neuron. And we can do that for another neuron in green. And you can see that the green neuron touches the red neuron at two locations, and these are what are called synapses.
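The slice-by-slice reconstruction can be sketched in a few lines. The arrays here are tiny stand-ins for real electron-microscope images, where nonzero pixels mark the colored cross-section of one branch:

```python
import numpy as np

# Each "slice" stands in for one electron-microscope image; the 1s
# mark the cross-section of a single neuron's branch in that slice.
slice_a = np.array([[0, 1], [0, 0]])
slice_b = np.array([[0, 1], [0, 1]])
slice_c = np.array([[0, 0], [0, 1]])

# Stacking the 2D slices gives a 3D volume; following the colored
# cross-sections through the stack recovers the branch's 3D shape.
volume = np.stack([slice_a, slice_b, slice_c])
print(volume.shape)         # (3, 2, 2): depth x height x width
print(np.argwhere(volume))  # voxel coordinates of the reconstructed branch
```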
Let's zoom in on one synapse, and keep your eyes on the interior of the green neuron. You should see small circles -- these are called vesicles. They contain a molecule known as a neurotransmitter. And so when the green neuron wants to communicate, it wants to send a message to the red neuron, it spits out neurotransmitter. At the synapse, the two neurons are said to be connected, like two friends talking on the telephone.
So you see how to find a synapse. How can we find an entire connectome? Well, we take this three-dimensional stack of images and treat it as a gigantic three-dimensional coloring book. We color every neuron in, in a different color, and then we look through all of the images, find the synapses and note the colors of the two neurons involved in each synapse. If we can do that throughout all the images, we could find a connectome.
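That bookkeeping -- color every neuron, then record the pair of colors that meet at each synapse -- amounts to building an edge list. A minimal sketch, with the synapse data invented for illustration:

```python
# Toy sketch of the "coloring book" bookkeeping described above.
# Each synapse found in the image stack records which two neuron
# "colors" (labels) meet there; collecting the pairs is the connectome.
synapses_found = [("red", "green"), ("red", "green"), ("green", "blue")]

connectome = {}
for pre, post in synapses_found:
    # Count multiple synapses between the same pair of neurons.
    connectome[(pre, post)] = connectome.get((pre, post), 0) + 1

print(connectome)  # {('red', 'green'): 2, ('green', 'blue'): 1}
```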
Now, at this point, you've learned the basics of neurons and synapses. And so I think we're ready to tackle one of the most important questions in neuroscience: how are the brains of men and women different? (Laughter)
According to this self-help book, guys' brains are like waffles; they keep their lives compartmentalized in boxes. Girls' brains are like spaghetti; everything in their life is connected to everything else. (Laughter) You guys are laughing, but you know, this book changed my life. (Laughter)
But seriously, what's wrong with this? You already know enough to tell me -- what's wrong with this statement? It doesn't matter whether you're a guy or girl, everyone's brains are like spaghetti. Or maybe really, really fine capellini with branches. Just as one strand of spaghetti contacts many other strands on your plate, one neuron touches many other neurons through their entangled branches. One neuron can be connected to so many other neurons, because there can be synapses at these points of contact.
By now, you might have sort of lost perspective on how large this cube of brain tissue actually is. And so let's do a series of comparisons to show you. I assure you, this is very tiny. It's just six microns on a side. So, here's how it stacks up against an entire neuron. And you can tell that, really, only the smallest fragments of branches are contained inside this cube. And a neuron, well, that's smaller than a brain. And that's just a mouse brain -- it's a lot smaller than a human brain.
So when I show my friends this, sometimes they've told me, "You know, Sebastian, you should just give up. Neuroscience is hopeless." Because if you look at a brain with your naked eye, you don't really see how complex it is, but when you use a microscope, finally the hidden complexity is revealed.
In the 17th century, the mathematician and philosopher Blaise Pascal wrote of his dread of the infinite, his feeling of insignificance at contemplating the vast reaches of outer space. And, as a scientist, I'm not supposed to talk about my feelings -- too much information, professor. (Laughter) But may I? (Laughter) (Applause)
I feel curiosity, and I feel wonder, but at times I have also felt despair. Why did I choose to study this organ that is so awesome in its complexity that it might well be infinite? It's absurd. How could we even dare to think that we might ever understand this?
And yet, I persist in this quixotic endeavor. And indeed, these days I harbor new hopes. Someday, a fleet of microscopes will capture every neuron and every synapse in a vast database of images. And some day, artificially intelligent supercomputers will analyze the images without human assistance to summarize them in a connectome. I do not know, but I hope that I will live to see that day, because finding an entire human connectome is one of the greatest technological challenges of all time. It will take the work of generations to succeed.
At the present time, my collaborators and I, what we're aiming for is much more modest -- just to find partial connectomes of tiny chunks of mouse and human brain. But even that will be enough for the first tests of this hypothesis that I am my connectome.
For now, let me try to convince you of the plausibility of this hypothesis, that it's actually worth taking seriously. As you grow during childhood and age during adulthood, your personal identity changes slowly. Likewise, every connectome changes over time. What kinds of changes happen? Well, neurons, like trees, can grow new branches, and they can lose old ones. Synapses can be created, and they can be eliminated. And synapses can grow larger, and they can grow smaller.
Second question: what causes these changes? Well, it's true. To some extent, they are programmed by your genes. But that's not the whole story, because there are signals, electrical signals, that travel along the branches of neurons, and chemical signals that jump across from branch to branch. These signals are called neural activity. And there's a lot of evidence that neural activity is encoding our thoughts, feelings and perceptions, our mental experiences. And there's a lot of evidence that neural activity can cause your connections to change. And if you put those two facts together, it means that your experiences can change your connectome. And that's why every connectome is unique, even those of genetically identical twins. The connectome is where nature meets nurture. And it might be true that just the mere act of thinking can change your connectome -- an idea that you may find empowering.
What's in this picture? A cool and refreshing stream of water, you say. What else is in this picture? Do not forget that groove in the Earth called the stream bed. Without it, the water would not know in which direction to flow.
And with the stream, I would like to propose a metaphor for the relationship between neural activity and connectivity. Neural activity is constantly changing. It's like the water of the stream; it never sits still. The connections of the brain's neural network determine the pathways along which neural activity flows. And so the connectome is like the bed of the stream; but the metaphor is richer than that, because it's true that the stream bed guides the flow of the water, but over long timescales, the water also reshapes the bed of the stream. And as I told you just now, neural activity can change the connectome. And if you'll allow me to ascend to metaphorical heights, I will remind you that neural activity is the physical basis -- or so neuroscientists think -- of thoughts, feelings and perceptions. And so we might even speak of the stream of consciousness. Neural activity is its water, and the connectome is its bed.
So let's return from the heights of metaphor and return to science. Suppose our technologies for finding connectomes actually work. How will we go about testing the hypothesis "I am my connectome"? Well, I propose a direct test. Let us attempt to read out memories from connectomes.
Consider the memory of long temporal sequences of movements, like a pianist playing a Beethoven sonata. According to a theory that dates back to the 19th century, such memories are stored as chains of synaptic connections inside your brain. Because, if the first neurons in the chain are activated, through their synapses they send messages to the second neurons, which are activated, and so on down the line, like a chain of falling dominoes. And this sequence of neural activation is hypothesized to be the neural basis of those sequences of movements. So one way of trying to test the theory is to look for such chains inside connectomes.
But it won't be easy, because they're not going to look like this. They're going to be scrambled up. So we'll have to use our computers to try to unscramble the chain. And if we can do that, the sequence of the neurons we recover from that unscrambling will be a prediction of the pattern of neural activity that is replayed in the brain during memory recall. And if that were successful, that would be the first example of reading a memory from a connectome.
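In its simplest form, unscrambling such a chain is an ordering problem: given the synaptic links in scrambled order, recover the firing sequence. A toy sketch, assuming a single unbranched chain with made-up neuron names (real connectomes are far messier):

```python
# Toy version of "unscrambling" a synaptic chain: given the chain's
# links in scrambled order, recover the firing sequence. Assumes one
# unbranched chain; neuron names are invented for illustration.
links = [("C", "D"), ("A", "B"), ("B", "C")]  # pre -> post, scrambled

successor = dict(links)
# The start of the chain is the one neuron that is never a target.
start = (set(successor) - set(successor.values())).pop()

sequence = [start]
while sequence[-1] in successor:
    sequence.append(successor[sequence[-1]])

print(sequence)  # ['A', 'B', 'C', 'D']
```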
(Laughter) What a mess -- have you ever tried to wire up a system as complex as this? I hope not. But if you have, you know it's very easy to make a mistake. The branches of neurons are like the wires of the brain. Can anyone guess: what's the total length of wires in your brain? I'll give you a hint. It's a big number. (Laughter)
I estimate millions of miles, all packed in your skull. And if you appreciate that number, you can easily see there is huge potential for mis-wiring of the brain. And indeed, the popular press loves headlines like, "Anorexic brains are wired differently," or "Autistic brains are wired differently." These are plausible claims, but in truth, we can't see the brain's wiring clearly enough to tell if these are really true. And so the technologies for seeing connectomes will allow us to finally read mis-wiring of the brain, to see mental disorders in connectomes.
Sometimes the best way to test a hypothesis is to consider its most extreme implication. Philosophers know this game very well. If you believe that I am my connectome, I think you must also accept the idea that death is the destruction of your connectome.
I mention this because there are prophets today who claim that technology will fundamentally alter the human condition and perhaps even transform the human species. One of their most cherished dreams is to cheat death by that practice known as cryonics. If you pay 100,000 dollars, you can arrange to have your body frozen after death and stored in liquid nitrogen in one of these tanks in an Arizona warehouse, awaiting a future civilization that is advanced enough to resurrect you.
Should we ridicule the modern seekers of immortality, calling them fools? Or will they someday chuckle over our graves? I don't know -- I prefer to test their beliefs, scientifically.
I propose that we attempt to find a connectome of a frozen brain. We know that damage to the brain occurs after death and during freezing. The question is: has that damage erased the connectome? If it has, there is no way that any future civilization will be able to recover the memories of these frozen brains. Resurrection might succeed for the body, but not for the mind. On the other hand, if the connectome is still intact, we cannot ridicule the claims of cryonics so easily.
I've described a quest that begins in the world of the very small, and propels us to the world of the far future. Connectomes will mark a turning point in human history. As we evolved from our ape-like ancestors on the African savanna, what distinguished us was our larger brains. We have used our brains to fashion ever more amazing technologies. Eventually, these technologies will become so powerful that we will use them to know ourselves by deconstructing and reconstructing our own brains.
I believe that this voyage of self-discovery is not just for scientists, but for all of us. And I'm grateful for the opportunity to share this voyage with you today. Thank you. (Applause)