What we'll learn about the brain in the next century | Sam Rodriques

174,236 views ・ 2018-07-03

TED



I want to tell you guys something about neuroscience. I'm a physicist by training. About three years ago, I left physics to come and try to understand how the brain works. And this is what I found.
Lots of people are working on depression. And that's really good, depression is something that we really want to understand. Here's how you do it: you take a jar and you fill it up, about halfway, with water. And then you take a mouse, and you put the mouse in the jar, OK? And the mouse swims around for a little while and then at some point, the mouse gets tired and decides to stop swimming. And when it stops swimming, that's depression. OK? And I'm from theoretical physics, so I'm used to people making very sophisticated mathematical models to precisely describe physical phenomena, so when I saw that this is the model for depression, I thought to myself, "Oh my God, we have a lot of work to do." (Laughter)
But this is a kind of general problem in neuroscience. So for example, take emotion. Lots of people want to understand emotion. But you can't study emotion in mice or monkeys, because you can't ask them how they're feeling or what they're experiencing. So instead, people who want to understand emotion typically end up studying what's called motivated behavior, which is code for "what the mouse does when it really, really wants cheese."
OK, I could go on and on. I mean, the point is, the NIH spends about 5.5 billion dollars a year on neuroscience research. And yet there have been almost no significant improvements in outcomes for patients with brain diseases in the past 40 years. And I think a lot of that is basically due to the fact that mice might be OK as a model for cancer or diabetes, but the mouse brain is just not sophisticated enough to reproduce human psychology or human brain disease. OK?
So if the mouse models are so bad, why are we still using them? Well, it basically boils down to this: the brain is made up of neurons, which are these little cells that send electrical signals to each other. If you want to understand how the brain works, you have to be able to measure the electrical activity of these neurons. But to do that, you have to get really close to the neurons with some kind of electrical recording device or a microscope. And so you can do that in mice and you can do it in monkeys, because you can physically put things into their brain, but for some reason we still can't do that in humans, OK?
So instead, we've invented all these proxies. The most popular one is probably this, functional MRI, fMRI, which allows you to make these pretty pictures like this, that show which parts of your brain light up when you're engaged in different activities. But this is a proxy. You're not actually measuring neural activity here. What you're doing is you're measuring, essentially, like, blood flow in the brain. Where there's more blood. It's actually where there's more oxygen, but you get the idea, OK? The other thing that you can do is you can do this -- electroencephalography -- you can put these electrodes on your head, OK? And then you can measure your brain waves. And here, you're actually measuring electrical activity. But you're not measuring the activity of neurons. You're measuring these electrical currents, sloshing back and forth in your brain.
So the point is just that these technologies that we have are really measuring the wrong thing. Because, for most of the diseases that we want to understand -- like, Parkinson's is the classic example. In Parkinson's, there's one particular kind of neuron deep in your brain that is responsible for the disease, and these technologies just don't have the resolution that you need to get at that. And so that's why we're still stuck with the animals. Not that anyone wants to be studying depression by putting mice into jars, right? It's just that there's this pervasive sense that it's not possible to look at the activity of neurons in healthy humans.
So here's what I want to do. I want to take you into the future, to have a look at one way in which I think it could potentially be possible. And I want to preface this by saying, I don't have all the details. So I'm just going to provide you with a kind of outline. But we're going to go to the year 2100. Now what does the year 2100 look like? Well, to start with, the climate is a bit warmer than what you're used to. (Laughter) And that robotic vacuum cleaner that you know and love went through a few generations, and the improvements were not always so good. (Laughter) It was not always for the better. But actually, in the year 2100 most things are surprisingly recognizable. It's just the brain that's totally different.
For example, in the year 2100, we understand the root causes of Alzheimer's. So we can deliver targeted genetic therapies or drugs to stop the degenerative process before it begins. So how did we do it? Well, there were essentially three steps.
The first step was that we had to figure out some way to get electrical connections through the skull so we could measure the electrical activity of neurons. And not only that, it had to be easy and risk-free. Something that basically anyone would be OK with, like getting a piercing. Because back in 2017, the only way that we knew of to get through the skull was to drill these holes the size of quarters. You would never let someone do that to you.

So in the 2020s, people began to experiment -- rather than drilling these gigantic holes, drilling microscopic holes, no thicker than a piece of hair. And the idea here was really for diagnosis -- there are lots of times in the diagnosis of brain disorders when you would like to be able to look at the neural activity beneath the skull, and being able to drill these microscopic holes would make that much easier for the patient. In the end, it would be like getting a shot. You just go in and you sit down, and there's a thing that comes down on your head, and a momentary sting, and then it's done, and you can go back about your day.

So we were eventually able to do it using lasers to drill the holes. And with the lasers, it was fast and extremely reliable; you couldn't even tell the holes were there, any more than you could tell that one of your hairs was missing. And I know it might sound crazy, using lasers to drill holes in your skull, but back in 2017, people were OK with surgeons shooting lasers into their eyes for corrective surgery. So when you're already here, it's not that big of a step. OK?
So the next step, which happened in the 2030s, was that it's not just about getting through the skull. To measure the activity of neurons, you have to actually make it into the brain tissue itself. And the risk, whenever you put something into the brain tissue, is essentially that of stroke: that you would hit a blood vessel and burst it, and that causes a stroke. So, by the mid-2030s, we had invented these flexible probes that were capable of going around blood vessels, rather than through them. And thus, we could put huge batteries of these probes into the brains of patients and record from thousands of their neurons without any risk to them.
And what we discovered, sort of to our surprise, is that the neurons that we could identify were not responding to things like ideas or emotion, which was what we had expected. They were mostly responding to things like Jennifer Aniston or Halle Berry or Justin Trudeau. I mean -- (Laughter) In hindsight, we shouldn't have been that surprised. I mean, what do your neurons spend most of their time thinking about? (Laughter)
But really, the point is that this technology enabled us to begin studying neuroscience in individuals. So much like the transition to genetics at the single-cell level, we started to study neuroscience at the single-human level. But we weren't quite there yet. Because these technologies were still restricted to medical applications, which meant that we were studying sick brains, not healthy brains.
Because no matter how safe your technology is, you can't stick something into someone's brain for research purposes. They have to want it. And why would they want it? Because as soon as you have an electrical connection to the brain, you can use it to hook the brain up to a computer.

Oh, well, you know, the general public was very skeptical at first. I mean, who wants to hook their brain up to their computers? Well, just imagine being able to send an email with a thought. (Laughter) Imagine being able to take a picture with your eyes, OK? (Laughter) Imagine never forgetting anything anymore, because anything that you choose to remember will be stored permanently on a hard drive somewhere, able to be recalled at will. (Laughter) The line here between crazy and visionary was never quite clear. But the systems were safe. So when the FDA decided to deregulate these laser-drilling systems in 2043, commercial demand just exploded. People started signing their emails, "Please excuse any typos. Sent from my brain." (Laughter) Commercial systems popped up left and right, offering the latest and greatest in neural interfacing technology. There were 100 electrodes. A thousand electrodes. High bandwidth for only 99.99 a month. (Laughter) Soon, everyone had them.
And that was the key. Because, in the 2050s, if you were a neuroscientist, you could have someone come into your lab essentially from off the street. And you could have them engaged in some emotional task or social behavior or abstract reasoning, things you could never study in mice. And you could record the activity of their neurons using the interfaces that they already had. And then you could also ask them about what they were experiencing. So this link between psychology and neuroscience that you could never make in the animals was suddenly there. So perhaps the classic example of this was the discovery of the neural basis for insight. That "Aha!" moment, the moment it all comes together, it clicks.
And this was discovered by two scientists in 2055, Barry and Late, who observed, in the dorsal prefrontal cortex, how in the brain of someone trying to understand an idea, different populations of neurons would reorganize themselves -- you're looking at neural activity here in orange -- until finally their activity aligns in a way that leads to positive feedback. Right there. That is understanding. So finally, we were able to get at the things that make us human.
And that's what really opened the way to major insights from medicine. Because, starting in the 2060s, with the ability to record the neural activity in the brains of patients with these different mental diseases, rather than defining the diseases on the basis of their symptoms, as we had at the beginning of the century, we started to define them on the basis of the actual pathology that we observed at the neural level.

So for example, in the case of ADHD, we discovered that there are dozens of different diseases, all of which had been called ADHD at the start of the century, that actually had nothing to do with each other, except that they had similar symptoms. And they needed to be treated in different ways. So it was kind of incredible, in retrospect, that at the beginning of the century, we had been treating all those different diseases with the same drug, basically just by giving people amphetamine. And schizophrenia and depression are the same way. So rather than prescribing drugs to people essentially at random, as we had, we learned how to predict which drugs would be most effective in which patients, and that just led to this huge improvement in outcomes.
OK, I want to bring you back now to the year 2017. Some of this may sound satirical or even far-fetched. And some of it is. I mean, I can't actually see into the future. I don't actually know if we're going to be drilling hundreds or thousands of microscopic holes in our heads in 30 years. But what I can tell you is that we're not going to make any progress towards understanding the human brain or human diseases until we figure out how to get at the electrical activity of neurons in healthy humans. And almost no one is working on figuring out how to do that today. That is the future of neuroscience. And I think it's time for neuroscientists to put down the mouse brain and to dedicate the thought and investment necessary to understand the human brain and human disease.

Thank you.

(Applause)