What we'll learn about the brain in the next century | Sam Rodriques

174,774 views ・ 2018-07-03

TED



Translator: Helen Chang ・ Reviewer: Adrienne Lin
00:13
I want to tell you guys something about neuroscience. I'm a physicist by training. About three years ago, I left physics to come and try to understand how the brain works. And this is what I found. Lots of people are working on depression. And that's really good, depression is something that we really want to understand. Here's how you do it: you take a jar and you fill it up, about halfway, with water. And then you take a mouse, and you put the mouse in the jar, OK? And the mouse swims around for a little while and then at some point, the mouse gets tired and decides to stop swimming. And when it stops swimming, that's depression. OK?
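To make concrete just how simple this "model" is, here is a minimal sketch of the measurement behind the forced swim test described above. The sampling rate, trial length, and readout are illustrative assumptions, not details from the talk.

```python
import numpy as np

def immobility_time(is_swimming: np.ndarray, fs_hz: float = 10.0) -> float:
    """Total time (in seconds) the mouse spends immobile.

    is_swimming: boolean array sampled at fs_hz, True while the mouse swims.
    """
    return float(np.sum(~is_swimming)) / fs_hz

# Illustrative 6-minute trial at 10 Hz: the mouse swims for 2 minutes,
# then floats for the remaining 4 (entirely made-up numbers).
trial = np.concatenate([np.ones(1200, dtype=bool), np.zeros(2400, dtype=bool)])
print(f"immobility: {immobility_time(trial):.0f} s")  # -> 240 s
```

The entire "depression model" reduces to this one number, which is the speaker's point.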
00:52
And I'm from theoretical physics, so I'm used to people making very sophisticated mathematical models to precisely describe physical phenomena, so when I saw that this is the model for depression, I thought to myself, "Oh my God, we have a lot of work to do." (Laughter) But this is a kind of general problem in neuroscience.
01:12
So for example, take emotion. Lots of people want to understand emotion. But you can't study emotion in mice or monkeys because you can't ask them how they're feeling or what they're experiencing. So instead, people who want to understand emotion typically end up studying what's called motivated behavior, which is code for "what the mouse does when it really, really wants cheese." OK, I could go on and on.
01:35
I mean, the point is, the NIH spends about 5.5 billion dollars a year on neuroscience research. And yet there have been almost no significant improvements in outcomes for patients with brain diseases in the past 40 years. And I think a lot of that is basically due to the fact that mice might be OK as a model for cancer or diabetes, but the mouse brain is just not sophisticated enough to reproduce human psychology or human brain disease. OK? So if the mouse models are so bad, why are we still using them?
02:10
Well, it basically boils down to this: the brain is made up of neurons, which are these little cells that send electrical signals to each other. If you want to understand how the brain works, you have to be able to measure the electrical activity of these neurons. But to do that, you have to get really close to the neurons with some kind of electrical recording device or a microscope. And so you can do that in mice and you can do it in monkeys, because you can physically put things into their brain, but for some reason we still can't do that in humans, OK?
02:40
So instead, we've invented all these proxies. So the most popular one is probably this, functional MRI, fMRI, which allows you to make these pretty pictures like this, that show which parts of your brain light up when you're engaged in different activities. But this is a proxy. You're not actually measuring neural activity here. What you're doing is you're measuring, essentially, like, blood flow in the brain. Where there's more blood. It's actually where there's more oxygen, but you get the idea, OK?
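The proxy relationship can be sketched in a few lines: what fMRI reports is, roughly, neural activity smeared through a slow blood-oxygen (hemodynamic) response. Everything below, including the toy spike train and the gamma-shaped response function, is an illustrative assumption, not how any particular scanner works.

```python
import numpy as np

dt = 0.1                      # seconds per sample
t = np.arange(0, 30, dt)      # a 30-second window

# Toy "neural activity": three brief bursts at t = 2, 10, and 18 s.
neural = np.zeros_like(t)
neural[[20, 100, 180]] = 1.0

# A crude gamma-shaped hemodynamic response function (HRF):
# the blood-flow response peaks ~5 s after the neural event.
hrf = (t ** 5) * np.exp(-t)
hrf /= hrf.sum()

# The BOLD signal fMRI measures is (approximately) neural activity
# convolved with the HRF: slow and blurred, not the spikes themselves.
bold = np.convolve(neural, hrf)[: len(t)]
print("neural bursts at t =", t[neural > 0])
print("BOLD maximum at t ~", t[np.argmax(bold)])
```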
03:10
The other thing that you can do is you can do this -- electroencephalography -- you can put these electrodes on your head, OK? And then you can measure your brain waves. And here, you're actually measuring electrical activity. But you're not measuring the activity of neurons. You're measuring these electrical currents, sloshing back and forth in your brain.
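As a rough illustration of what EEG gives you, the sketch below takes one channel of scalp voltage and computes the power in the classic alpha band (8-12 Hz). The synthetic signal, sampling rate, and band edges are my own illustrative choices.

```python
import numpy as np

fs = 250.0                            # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)          # 10 s of "scalp voltage"

# Synthetic trace: a 10 Hz alpha rhythm buried in noise.
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(0.0, 1.0, t.size)

# Power spectrum via the FFT.
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2

# Alpha-band power: a bulk property of currents "sloshing back and
# forth," not the activity of any individual neuron.
alpha = power[(freqs >= 8) & (freqs <= 12)].sum()
print(f"alpha-band power: {alpha:.1f}")
```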
03:30
So the point is just that these technologies that we have are really measuring the wrong thing. Because, for most of the diseases that we want to understand -- like, Parkinson's is the classic example. In Parkinson's, there's one particular kind of neuron deep in your brain that is responsible for the disease, and these technologies just don't have the resolution that you need to get at that. And so that's why we're still stuck with the animals. Not that anyone wants to be studying depression by putting mice into jars, right? It's just that there's this pervasive sense that it's not possible to look at the activity of neurons in healthy humans.
04:08
So here's what I want to do. I want to take you into the future, to have a look at one way in which I think it could potentially be possible. And I want to preface this by saying, I don't have all the details. So I'm just going to provide you with a kind of outline. But we're going to go to the year 2100. Now what does the year 2100 look like? Well, to start with, the climate is a bit warmer than what you're used to. (Laughter) And that robotic vacuum cleaner that you know and love went through a few generations, and the improvements were not always so good. (Laughter) It was not always for the better. But actually, in the year 2100 most things are surprisingly recognizable. It's just the brain is totally different.
05:00
For example, in the year 2100, we understand the root causes of Alzheimer's. So we can deliver targeted genetic therapies or drugs to stop the degenerative process before it begins. So how did we do it? Well, there were essentially three steps.
05:18
The first step was that we had to figure out some way to get electrical connections through the skull so we could measure the electrical activity of neurons. And not only that, it had to be easy and risk-free. Something that basically anyone would be OK with, like getting a piercing. Because back in 2017, the only way that we knew of to get through the skull was to drill these holes the size of quarters. You would never let someone do that to you.
05:48
So in the 2020s, people began to experiment -- rather than drilling these gigantic holes, drilling microscopic holes, no thicker than a piece of hair. And the idea here was really for diagnosis -- there are lots of times in the diagnosis of brain disorders when you would like to be able to look at the neural activity beneath the skull, and being able to drill these microscopic holes would make that much easier for the patient. In the end, it would be like getting a shot. You just go in and you sit down and there's a thing that comes down on your head, and a momentary sting and then it's done, and you can go back about your day.
06:24
So we're eventually able to do it using lasers to drill the holes. And with the lasers, it was fast and extremely reliable, you couldn't even tell the holes were there, any more than you could tell that one of your hairs was missing. And I know it might sound crazy, using lasers to drill holes in your skull, but back in 2017, people were OK with surgeons shooting lasers into their eyes for corrective surgery. So when you're already here, it's not that big of a step. OK?
06:58
So the next step, that happened in the 2030s, was that it's not just about getting through the skull. To measure the activity of neurons, you have to actually make it into the brain tissue itself. And the risk, whenever you put something into the brain tissue, is essentially that of stroke. That you would hit a blood vessel and burst it, and that causes a stroke. So, by the mid 2030s, we had invented these flexible probes that were capable of going around blood vessels, rather than through them. And thus, we could put huge batteries of these probes into the brains of patients and record from thousands of their neurons without any risk to them.
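"Recording from thousands of neurons" bottoms out in something like the following: one voltage trace per recording site, with spikes detected as threshold crossings. This is a toy version with fabricated numbers; real spike sorting is considerably more involved.

```python
import numpy as np

def detect_spikes(voltage_uv: np.ndarray, fs_hz: float,
                  thresh_uv: float = -40.0) -> np.ndarray:
    """Spike times (s): samples where the trace first crosses below threshold."""
    below = voltage_uv < thresh_uv
    onsets = np.flatnonzero(below[1:] & ~below[:-1]) + 1
    return onsets / fs_hz

# Toy extracellular trace: 1 s of ~5 uV noise plus two injected "spikes".
fs = 30_000.0
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 5.0, int(fs))
trace[10_000] = trace[25_000] = -80.0
print(detect_spikes(trace, fs))  # ~[0.333, 0.833]
```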
07:39
And what we discovered, sort of to our surprise, is that the neurons that we could identify were not responding to things like ideas or emotion, which was what we had expected. They were mostly responding to things like Jennifer Aniston or Halle Berry or Justin Trudeau. I mean -- (Laughter) In hindsight, we shouldn't have been that surprised. I mean, what do your neurons spend most of their time thinking about? (Laughter)
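Findings like the "Jennifer Aniston neuron" are usually quantified as stimulus selectivity: compare a cell's firing rate across stimuli and ask whether one stimulus dominates. A minimal example with fabricated rates:

```python
import numpy as np

# Fabricated mean firing rates (Hz) for one neuron across stimuli.
rates = {
    "Jennifer Aniston": 14.2,
    "Halle Berry": 1.1,
    "Justin Trudeau": 0.9,
    "a jar of water": 1.0,
}

preferred = max(rates, key=rates.get)
others = [r for s, r in rates.items() if s != preferred]
selectivity = rates[preferred] / np.mean(others)  # >> 1 means strongly tuned
print(f"preferred stimulus: {preferred} (~{selectivity:.0f}x the others)")
```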
08:09
But really, the point is that this technology enabled us to begin studying neuroscience in individuals. So much like the transition to genetics, at the single cell level, we started to study neuroscience, at the single human level. But we weren't quite there yet.
08:25
Because these technologies were still restricted to medical applications, which meant that we were studying sick brains, not healthy brains. Because no matter how safe your technology is, you can't stick something into someone's brain for research purposes. They have to want it. And why would they want it? Because as soon as you have an electrical connection to the brain, you can use it to hook the brain up to a computer.
08:53
Oh, well, you know, the general public was very skeptical at first. I mean, who wants to hook their brain up to their computers? Well just imagine being able to send an email with a thought. (Laughter) Imagine being able to take a picture with your eyes, OK? (Laughter) Imagine never forgetting anything anymore, because anything that you choose to remember will be stored permanently on a hard drive somewhere, able to be recalled at will. (Laughter)
09:25
The line here between crazy and visionary was never quite clear. But the systems were safe. So when the FDA decided to deregulate these laser-drilling systems, in 2043, commercial demand just exploded. People started signing their emails, "Please excuse any typos. Sent from my brain." (Laughter)
09:45
Commercial systems popped up left and right, offering the latest and greatest in neural interfacing technology. There were 100 electrodes. A thousand electrodes. High bandwidth for only 99.99 a month. (Laughter) Soon, everyone had them. And that was the key.
10:03
Because, in the 2050s, if you were a neuroscientist, you could have someone come into your lab essentially from off the street. And you could have them engaged in some emotional task or social behavior or abstract reasoning, things you could never study in mice. And you could record the activity of their neurons using the interfaces that they already had. And then you could also ask them about what they were experiencing. So this link between psychology and neuroscience that you could never make in the animals, was suddenly there.
10:35
So perhaps the classic example of this was the discovery of the neural basis for insight. That "Aha!" moment, the moment it all comes together, it clicks. And this was discovered by two scientists in 2055, Barry and Late, who observed, in the dorsal prefrontal cortex, how in the brain of someone trying to understand an idea, how different populations of neurons would reorganize themselves -- you're looking at neural activity here in orange -- until finally their activity aligns in a way that leads to positive feedback. Right there. That is understanding. So finally, we were able to get at the things that make us human.
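The talk leaves "activity aligns" undefined, but one common way to operationalize alignment is the cosine similarity between the moment-to-moment population activity vector and some target pattern, with the "Aha!" called at a threshold crossing. Everything in this sketch (the target pattern, the drift toward it, the 0.8 threshold) is my own illustrative assumption, not the fictional Barry and Late's method.

```python
import numpy as np

def alignment(pop: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Cosine similarity of each time step's population vector to a target.

    pop: (T, N) array of N neurons' activity over T time steps.
    """
    num = pop @ target
    den = np.linalg.norm(pop, axis=1) * np.linalg.norm(target) + 1e-12
    return num / den

# Toy data: 100 neurons drifting from random activity toward a target.
rng = np.random.default_rng(2)
target = rng.normal(size=100)
mix = np.linspace(0, 1, 50)[:, None]
pop = (1 - mix) * rng.normal(size=(50, 100)) + mix * target

a = alignment(pop, target)
aha = int(np.argmax(a > 0.8))  # first step where alignment crosses 0.8
print(f"'Aha!' at step {aha}, alignment {a[aha]:.2f}")
```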
11:21
And that's what really opened the way to major insights from medicine. Because, starting in the 2060s, with the ability to record the neural activity in the brains of patients with these different mental diseases, rather than defining the diseases on the basis of their symptoms, as we had at the beginning of the century, we started to define them on the basis of the actual pathology that we observed at the neural level.
11:48
So for example, in the case of ADHD, we discovered that there are dozens of different diseases, all of which had been called ADHD at the start of the century, that actually had nothing to do with each other, except that they had similar symptoms. And they needed to be treated in different ways. So it was kind of incredible, in retrospect, that at the beginning of the century, we had been treating all those different diseases with the same drug, just by giving people amphetamine, basically is what we were doing. And schizophrenia and depression are the same way. So rather than prescribing drugs to people essentially at random, as we had, we learned how to predict which drugs would be most effective in which patients, and that just led to this huge improvement in outcomes.
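The ADHD example implies a concrete pipeline: describe each patient by features of their neural recordings, cluster in that feature space rather than by the shared symptom label, and treat each cluster on its own terms. A minimal sketch with fabricated data; the two-feature setup, the cluster count, and the use of k-means are all illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Fabricated "neural-level" features for 300 patients who all carry
# the same symptom label: say, firing rates in two different circuits.
rng = np.random.default_rng(3)
subtype_a = rng.normal([5.0, 1.0], 0.5, size=(150, 2))
subtype_b = rng.normal([1.0, 5.0], 0.5, size=(150, 2))
patients = np.vstack([subtype_a, subtype_b])

# Cluster by neural pathology instead of by symptoms.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(patients)
print("patients per neural subtype:", np.bincount(labels))
# Each subtype could then be matched to the drug that actually works
# for it, instead of everyone getting the same stimulant.
```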
12:33
OK, I want to bring you back now to the year 2017. Some of this may sound satirical or even far-fetched. And some of it is. I mean, I can't actually see into the future. I don't actually know if we're going to be drilling hundreds or thousands of microscopic holes in our heads in 30 years. But what I can tell you is that we're not going to make any progress towards understanding the human brain or human diseases until we figure out how to get at the electrical activity of neurons in healthy humans. And almost no one is working on figuring out how to do that today.
13:12
That is the future of neuroscience. And I think it's time for neuroscientists to put down the mouse brain and to dedicate the thought and investment necessary to understand the human brain and human disease.

13:27
Thank you.

13:28
(Applause)