What we'll learn about the brain in the next century | Sam Rodriques

173,707 views ・ 2018-07-03

TED


00:13
I want to tell you guys something about neuroscience. I'm a physicist by training. About three years ago, I left physics to come and try to understand how the brain works. And this is what I found. Lots of people are working on depression. And that's really good; depression is something that we really want to understand. Here's how you do it: you take a jar and you fill it up, about halfway, with water. And then you take a mouse, and you put the mouse in the jar, OK? And the mouse swims around for a little while, and then at some point, the mouse gets tired and decides to stop swimming. And when it stops swimming, that's depression. OK?
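That jar-and-water procedure is the forced swim test, and the whole "model of depression" reduces to a single number: how long the mouse stays immobile. Here is a minimal sketch of how that number is typically computed from video tracking, assuming per-frame swim speeds as input; the frame rate, speed threshold, and trial data are illustrative assumptions, not values from the talk.

```python
import numpy as np

def immobility_time(speeds, fps=30, threshold=0.5):
    """Score a forced swim test: total seconds the mouse is immobile.

    speeds:    per-frame speed of the mouse's centroid (cm/s),
               e.g. from video tracking of the jar.
    fps:       video frame rate (illustrative).
    threshold: speed below which the mouse counts as immobile (illustrative).
    """
    speeds = np.asarray(speeds, dtype=float)
    immobile_frames = np.sum(speeds < threshold)
    return immobile_frames / fps

# Hypothetical 6-minute trial in which the mouse gradually slows down.
rng = np.random.default_rng(0)
speeds = np.clip(rng.normal(loc=np.linspace(3.0, 0.2, 6 * 60 * 30), scale=0.3), 0, None)
print(f"Immobile for {immobility_time(speeds):.0f} s of a 360 s trial")
```

The point of the sketch is how little the readout captures: one scalar per animal stands in for the whole phenomenology of depression.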
00:52
And I'm from theoretical physics, so I'm used to people making very sophisticated mathematical models to precisely describe physical phenomena, so when I saw that this is the model for depression, I thought to myself, "Oh my God, we have a lot of work to do."
(Laughter)
01:09
But this is a kind of general problem in neuroscience. So for example, take emotion. Lots of people want to understand emotion. But you can't study emotion in mice or monkeys, because you can't ask them how they're feeling or what they're experiencing. So instead, people who want to understand emotion typically end up studying what's called motivated behavior, which is code for "what the mouse does when it really, really wants cheese."
01:33
OK, I could go on and on. I mean, the point is, the NIH spends about 5.5 billion dollars a year on neuroscience research. And yet there have been almost no significant improvements in outcomes for patients with brain diseases in the past 40 years. And I think a lot of that is basically due to the fact that mice might be OK as a model for cancer or diabetes, but the mouse brain is just not sophisticated enough to reproduce human psychology or human brain disease. OK?
02:05
So if the mouse models are so bad, why are we still using them? Well, it basically boils down to this: the brain is made up of neurons, which are these little cells that send electrical signals to each other. If you want to understand how the brain works, you have to be able to measure the electrical activity of these neurons. But to do that, you have to get really close to the neurons with some kind of electrical recording device or a microscope. And so you can do that in mice, and you can do it in monkeys, because you can physically put things into their brain, but for some reason we still can't do that in humans, OK?
02:40
So instead, we've invented all these proxies. So the most popular one is probably this: functional MRI, fMRI, which allows you to make these pretty pictures like this, that show which parts of your brain light up when you're engaged in different activities. But this is a proxy. You're not actually measuring neural activity here. What you're doing is you're measuring, essentially, like, blood flow in the brain. Where there's more blood. It's actually where there's more oxygen, but you get the idea, OK?
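Why "proxy" is the right word can be made concrete. The standard model of the fMRI (BOLD) signal is neural activity convolved with a slow hemodynamic response function, so events milliseconds apart get smeared into one bump lasting many seconds. A minimal sketch, assuming the common two-gamma HRF shape; the parameters are textbook illustrations, not anything from the talk:

```python
import numpy as np
from scipy.stats import gamma

dt = 0.1                       # seconds per sample
t = np.arange(0, 30, dt)

# Two-gamma HRF: a peak around 5 s plus a later undershoot.
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 12)
hrf /= hrf.sum()

# Neural activity: two brief bursts, 2 seconds apart.
neural = np.zeros_like(t)
neural[[50, 70]] = 1.0         # events at t = 5 s and t = 7 s

# What the scanner sees: both bursts blurred into one slow wave.
bold = np.convolve(neural, hrf)[: len(t)]
print(f"BOLD response peaks near t = {t[bold.argmax()]:.1f} s")
```

Two distinct neural events become one indistinct hemodynamic bump: that is the resolution cost of measuring blood instead of neurons.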
03:10
The other thing that you can do is you can do this: electroencephalography -- you can put these electrodes on your head, OK? And then you can measure your brain waves. And here, you're actually measuring electrical activity. But you're not measuring the activity of neurons. You're measuring these electrical currents, sloshing back and forth in your brain.
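To see what EEG actually delivers: each scalp electrode records the summed field of millions of neurons, and the usual analysis quantifies power in frequency bands rather than any single cell's activity. A minimal sketch on a synthetic signal; the 10 Hz rhythm, amplitudes, and noise level are invented for illustration:

```python
import numpy as np
from scipy.signal import welch

fs = 250                                # samples per second
t = np.arange(0, 10, 1 / fs)

# Synthetic scalp signal: a 10 Hz alpha rhythm buried in noise.
rng = np.random.default_rng(1)
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * rng.standard_normal(t.size)

# Power spectrum, then mean power in the alpha band (8-12 Hz).
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
alpha_power = psd[(freqs >= 8) & (freqs <= 12)].mean()
print(f"Mean alpha-band power: {alpha_power:.2e} V^2/Hz")
```

Everything here is aggregate: a band-power number summarizing currents sloshing across the whole head, with no route back to individual neurons.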
03:30
So the point is just that these technologies that we have are really measuring the wrong thing. Because, for most of the diseases that we want to understand -- like, Parkinson's is the classic example. In Parkinson's, there's one particular kind of neuron deep in your brain that is responsible for the disease, and these technologies just don't have the resolution that you need to get at that. And so that's why we're still stuck with the animals. Not that anyone wants to be studying depression by putting mice into jars, right? It's just that there's this pervasive sense that it's not possible to look at the activity of neurons in healthy humans.
04:08
So here's what I want to do. I want to take you into the future. To have a look at one way in which I think it could potentially be possible. And I want to preface this by saying, I don't have all the details. So I'm just going to provide you with a kind of outline. But we're going to go to the year 2100.
04:27
Now what does the year 2100 look like? Well, to start with, the climate is a bit warmer than what you're used to. (Laughter) And that robotic vacuum cleaner that you know and love went through a few generations, and the improvements were not always so good. (Laughter) It was not always for the better.
04:52
But actually, in the year 2100, most things are surprisingly recognizable. It's just the brain is totally different. For example, in the year 2100, we understand the root causes of Alzheimer's. So we can deliver targeted genetic therapies or drugs to stop the degenerative process before it begins.
05:13
So how did we do it? Well, there were essentially three steps. The first step was that we had to figure out some way to get electrical connections through the skull so we could measure the electrical activity of neurons. And not only that, it had to be easy and risk-free. Something that basically anyone would be OK with, like getting a piercing. Because back in 2017, the only way that we knew of to get through the skull was to drill these holes the size of quarters. You would never let someone do that to you.
05:48
So in the 2020s, people began to experiment -- rather than drilling these gigantic holes, drilling microscopic holes, no thicker than a piece of hair. And the idea here was really for diagnosis -- there are lots of times in the diagnosis of brain disorders when you would like to be able to look at the neural activity beneath the skull, and being able to drill these microscopic holes would make that much easier for the patient. In the end, it would be like getting a shot. You just go in and you sit down, and there's a thing that comes down on your head, and a momentary sting, and then it's done, and you can go back about your day.
06:24
So we're eventually able to do it using lasers to drill the holes. And with the lasers, it was fast and extremely reliable; you couldn't even tell the holes were there, any more than you could tell that one of your hairs was missing. And I know it might sound crazy, using lasers to drill holes in your skull, but back in 2017, people were OK with surgeons shooting lasers into their eyes for corrective surgery. So when you're already here, it's not that big of a step. OK?
06:58
So the next step, that happened in the 2030s, was that it's not just about getting through the skull. To measure the activity of neurons, you have to actually make it into the brain tissue itself. And the risk, whenever you put something into the brain tissue, is essentially that of stroke: that you would hit a blood vessel and burst it, and that causes a stroke. So, by the mid-2030s, we had invented these flexible probes that were capable of going around blood vessels, rather than through them. And thus, we could put huge batteries of these probes into the brains of patients and record from thousands of their neurons without any risk to them.
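"Record from thousands of their neurons" means, mechanically, turning each probe channel's raw voltage trace into spike times, most simply by thresholding against an estimate of the noise. A minimal sketch of that step; the sampling rate, threshold multiplier, and synthetic trace are illustrative assumptions:

```python
import numpy as np

def detect_spikes(trace, fs=30000, k=4.5):
    """Return spike times (s) from one extracellular voltage trace.

    Uses a common robust rule: threshold at k times a MAD-based
    noise estimate; fs is samples per second. Both are illustrative.
    """
    sigma = np.median(np.abs(trace)) / 0.6745          # robust noise estimate
    thr = -k * sigma                                   # spikes deflect downward
    crossings = np.flatnonzero((trace[1:] < thr) & (trace[:-1] >= thr))
    return crossings / fs

# Synthetic channel: 1 s of noise with three injected spike-like events.
rng = np.random.default_rng(2)
trace = rng.normal(0, 10, 30000)
for s in (5000, 12000, 24000):
    trace[s] -= 120
print("Detected spike times (s):", np.round(detect_spikes(trace), 3))
```

Multiply this by thousands of channels and you have the raw material the rest of the story depends on: spike trains from identified neurons in a living human.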
07:39
And what we discovered, sort of to our surprise, is that the neurons that we could identify were not responding to things like ideas or emotion, which was what we had expected. They were mostly responding to things like Jennifer Aniston or Halle Berry or Justin Trudeau. I mean -- (Laughter) In hindsight, we shouldn't have been that surprised. I mean, what do your neurons spend most of their time thinking about? (Laughter)
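This riffs on a real result: the "Jennifer Aniston neuron" reported in epilepsy patients by Quiroga and colleagues in 2005, where single human neurons fired selectively to one person across many different pictures. A toy version of the selectivity test; every spike count below is invented for illustration:

```python
import numpy as np

# Spike counts from one neuron across repeated picture presentations
# (all numbers invented for illustration).
responses = {
    "Jennifer Aniston":   [14, 11, 16, 13],
    "Halle Berry":        [1, 0, 2, 1],
    "Justin Trudeau":     [0, 1, 1, 0],
    "Sydney Opera House": [1, 2, 0, 1],
}

means = {name: np.mean(counts) for name, counts in responses.items()}
best = max(means, key=means.get)
rest = np.mean([m for name, m in means.items() if name != best])

# Call the neuron "selective" if its best stimulus drives it far more
# than everything else; the 5x criterion is an illustrative choice.
verdict = "selective" if means[best] > 5 * rest else "not selective"
print(f"Prefers {best}: {means[best]:.1f} vs {rest:.1f} spikes ({verdict})")
```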
08:09
But really, the point is that this technology enabled us to begin studying neuroscience in individuals. So much like the transition to genetics at the single-cell level, we started to study neuroscience at the single-human level.
08:23
But we weren't quite there yet. Because these technologies were still restricted to medical applications, which meant that we were studying sick brains, not healthy brains. Because no matter how safe your technology is, you can't stick something into someone's brain for research purposes. They have to want it. And why would they want it? Because as soon as you have an electrical connection to the brain, you can use it to hook the brain up to a computer.
08:53
Oh, well, you know, the general public was very skeptical at first. I mean, who wants to hook their brain up to their computers? Well, just imagine being able to send an email with a thought. (Laughter) Imagine being able to take a picture with your eyes, OK? (Laughter) Imagine never forgetting anything anymore, because anything that you choose to remember will be stored permanently on a hard drive somewhere, able to be recalled at will. (Laughter)
09:25
The line here between crazy and visionary was never quite clear. But the systems were safe. So when the FDA decided to deregulate these laser-drilling systems in 2043, commercial demand just exploded. People started signing their emails, "Please excuse any typos. Sent from my brain." (Laughter) Commercial systems popped up left and right, offering the latest and greatest in neural interfacing technology. There were 100 electrodes. A thousand electrodes. High bandwidth for only 99.99 a month. (Laughter) Soon, everyone had them.
10:01
And that was the key. Because, in the 2050s, if you were a neuroscientist, you could have someone come into your lab essentially from off the street. And you could have them engaged in some emotional task or social behavior or abstract reasoning, things you could never study in mice. And you could record the activity of their neurons using the interfaces that they already had. And then you could also ask them about what they were experiencing. So this link between psychology and neuroscience that you could never make in the animals was suddenly there.
10:35
So perhaps the classic example of this was the discovery of the neural basis for insight. That "Aha!" moment, the moment it all comes together, it clicks. And this was discovered by two scientists in 2055, Barry and Late, who observed, in the dorsal prefrontal cortex, how in the brain of someone trying to understand an idea, different populations of neurons would reorganize themselves -- you're looking at neural activity here in orange -- until finally their activity aligns in a way that leads to positive feedback. Right there. That is understanding.
11:15
So finally, we were able to get at the things that make us human. And that's what really opened the way to major insights from medicine.
11:27
Because, starting in the 2060s, with the ability to record the neural activity in the brains of patients with these different mental diseases, rather than defining the diseases on the basis of their symptoms, as we had at the beginning of the century, we started to define them on the basis of the actual pathology that we observed at the neural level.
11:48
So for example, in the case of ADHD, we discovered that there are dozens of different diseases, all of which had been called ADHD at the start of the century, that actually had nothing to do with each other, except that they had similar symptoms. And they needed to be treated in different ways. So it was kind of incredible, in retrospect, that at the beginning of the century, we had been treating all those different diseases with the same drug -- just giving people amphetamine, basically, is what we were doing. And schizophrenia and depression are the same way. So rather than prescribing drugs to people essentially at random, as we had, we learned how to predict which drugs would be most effective in which patients, and that just led to this huge improvement in outcomes.
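In code, "define diseases by pathology instead of symptoms" is an unsupervised-clustering move: take patients who share one symptom label, describe each by features of their recorded neural activity, and let the structure in those features split the label apart. A minimal sketch; the features and group structure are entirely invented, and nothing here derives from real patient data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# 90 patients, all labeled "ADHD" by symptoms, drawn from three
# invented neural-feature profiles (e.g. band power, firing statistics).
profiles = np.array([[1.0, 0.2], [0.2, 1.0], [0.6, 0.6]])
features = np.vstack([p + 0.08 * rng.standard_normal((30, 2)) for p in profiles])

# Cluster on neural features alone, ignoring the shared symptom label.
subtype = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print("Patients per neural subtype:", np.bincount(subtype))
```

Once the subtypes exist, "predicting which drugs work in which patients" becomes an ordinary supervised problem on top of the same features.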
12:33
OK, I want to bring you back now to the year 2017. Some of this may sound satirical or even far-fetched. And some of it is. I mean, I can't actually see into the future. I don't actually know if we're going to be drilling hundreds or thousands of microscopic holes in our heads in 30 years. But what I can tell you is that we're not going to make any progress towards understanding the human brain or human diseases until we figure out how to get at the electrical activity of neurons in healthy humans. And almost no one is working on figuring out how to do that today. That is the future of neuroscience. And I think it's time for neuroscientists to put down the mouse brain and to dedicate the thought and investment necessary to understand the human brain and human disease.
13:27
Thank you.
(Applause)