Read Montague: What we're learning from 5,000 brains

TED ・ 2012-09-24

00:00
Translator: Joseph Geni
Reviewer: Morton Bast

00:15
Other people. Everyone is interested in other people. Everyone has relationships with other people, and they're interested in these relationships for a variety of reasons. Good relationships, bad relationships, annoying relationships, agnostic relationships, and what I'm going to do is focus on the central piece of an interaction that goes on in a relationship. So I'm going to take as inspiration the fact that we're all interested in interacting with other people, I'm going to completely strip it of all its complicating features, and I'm going to turn that object, that simplified object, into a scientific probe, and provide the early stages, embryonic stages of new insights into what happens in two brains while they simultaneously interact.

00:58
But before I do that, let me tell you a couple of things that made this possible. The first is we can now eavesdrop safely on healthy brain activity. Without needles and radioactivity, without any kind of clinical reason, we can go down the street and record from your friends' and neighbors' brains while they do a variety of cognitive tasks, and we use a method called functional magnetic resonance imaging. You've probably all read about it or heard about it in some incarnation. Let me give you a two-sentence version of it. So we've all heard of MRIs. MRIs use magnetic fields and radio waves and they take snapshots of your brain or your knee or your stomach, grayscale images that are frozen in time. In the 1990s, it was discovered you could use the same machines in a different mode, and in that mode, you could make microscopic blood flow movies from hundreds of thousands of sites independently in the brain. Okay, so what? In fact, the so what is, in the brain, changes in neural activity, the things that make your brain work, the things that make your software work in your brain, are tightly correlated with changes in blood flow. You make a blood flow movie, you have an independent proxy of brain activity.

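(Editor's sketch, not from the talk: one common way to make that blood-flow proxy concrete is to model the measured BOLD signal as neural events convolved with a slow hemodynamic response function. The double-gamma parameters below are a standard textbook choice, not anything specific to this lab.)

    import numpy as np
    from math import gamma

    def double_gamma_hrf(t, a1=6.0, a2=16.0, b1=1.0, b2=1.0, c=1.0 / 6.0):
        """Textbook-style double-gamma hemodynamic response function (t in seconds)."""
        peak = (b1 ** a1) * t ** (a1 - 1) * np.exp(-b1 * t) / gamma(a1)
        undershoot = (b2 ** a2) * t ** (a2 - 1) * np.exp(-b2 * t) / gamma(a2)
        return peak - c * undershoot

    dt = 0.1                               # sampling step, seconds
    t = np.arange(0.0, 30.0, dt)
    hrf = double_gamma_hrf(t)

    neural = np.zeros(t.size)              # 30 s of toy "neural activity"
    neural[50] = 1.0                       # one brief neural event at t = 5 s

    # Predicted blood-flow (BOLD) signal: neural activity convolved with the HRF.
    bold = np.convolve(neural, hrf)[: neural.size] * dt
    print(f"BOLD peaks at ~{t[np.argmax(bold)]:.1f} s for a neural event at 5.0 s")
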
02:06
This has literally revolutionized cognitive science. Take any cognitive domain you want, memory, motor planning, thinking about your mother-in-law, getting angry at people, emotional response, it goes on and on, put people into functional MRI devices, and image how these kinds of variables map onto brain activity. It's in its early stages, and it's crude by some measures, but in fact, 20 years ago, we were at nothing. You couldn't do people like this. You couldn't do healthy people. That's caused a literal revolution, and it's opened us up to a new experimental preparation. Neurobiologists, as you well know, have lots of experimental preps, worms and rodents and fruit flies and things like this. And now, we have a new experimental prep: human beings. We can now use human beings to study and model the software in human beings, and we have a few burgeoning biological measures.

02:56
Okay, let me give you one example of the kinds of experiments that people do, and it's in the area of what you'd call valuation. Valuation is just what you think it is, you know? If you went and you were valuing two companies against one another, you'd want to know which was more valuable. Cultures discovered the key feature of valuation thousands of years ago. If you want to compare oranges to windshields, what do you do? Well, you can't compare oranges to windshields. They're immiscible. They don't mix with one another. So instead, you convert them to a common currency scale, put them on that scale, and value them accordingly.

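(Editor's sketch of that common-currency idea; the items and dollar figures are invented purely for illustration. Once incommensurable things are mapped onto one scale, comparing them is just a lookup.)

    # Map incommensurable items onto one illustrative scale (dollars), then compare.
    values_usd = {"orange": 0.80, "windshield": 250.00}   # made-up numbers
    best = max(values_usd, key=values_usd.get)
    print(f"On a common currency scale, the {best} is the more valuable item.")
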
03:26
Well, your brain has to do something just like that as well, and we're now beginning to understand and identify brain systems involved in valuation, and one of them includes a neurotransmitter system whose cells are located in your brainstem and deliver the chemical dopamine to the rest of your brain. I won't go through the details of it, but that's an important discovery, and we know a good bit about that now, and it's just a small piece of it, but it's important because those are the neurons that you would lose if you had Parkinson's disease, and they're also the neurons that are hijacked by literally every drug of abuse, and that makes sense. Drugs of abuse would come in, and they would change the way you value the world. They change the way you value the symbols associated with your drug of choice, and they make you value that over everything else.

04:07
Here's the key feature though. These neurons are also involved in the way you can assign value to literally abstract ideas, and I put some symbols up here that we assign value to for various reasons. We have a behavioral superpower in our brain, and it at least in part involves dopamine. We can deny every instinct we have for survival for an idea, for a mere idea. No other species can do that. In 1997, the cult Heaven's Gate committed mass suicide predicated on the idea that there was a spaceship hiding in the tail of the then-visible comet Hale-Bopp, waiting to take them to the next level. It was an incredibly tragic event. More than two thirds of them had college degrees. But the point here is they were able to deny their instincts for survival using exactly the same systems that were put there to make them survive. That's a lot of control, okay?

04:59
One thing that I've left out of this narrative is the obvious thing, which is the focus of the rest of my little talk, and that is other people. These same valuation systems are redeployed when we're valuing interactions with other people. So this same dopamine system that gets addicted to drugs, that makes you freeze when you get Parkinson's disease, that contributes to various forms of psychosis, is also redeployed to value interactions with other people and to assign value to gestures that you do when you're interacting with somebody else.

05:29
Let me give you an example of this. You bring to the table such enormous processing power in this domain that you hardly even notice it. Let me just give you a few examples. So here's a baby. She's three months old. She still poops in her diapers and she can't do calculus. She's related to me. Somebody will be very glad that she's up here on the screen. You can cover up one of her eyes, and you can still read something in the other eye, and I see sort of curiosity in one eye, I see maybe a little bit of surprise in the other. Here's a couple. They're sharing a moment together, and we've even done an experiment where you can cut out different pieces of this frame and you can still see that they're sharing it. They're sharing it sort of in parallel. Now, the elements of the scene also communicate this to us, but you can read it straight off their faces, and if you compare their faces to normal faces, it would be a very subtle cue.

06:13
Here's another couple. He's projecting out at us, and she's clearly projecting, you know, love and admiration at him. Here's another couple. (Laughter) And I'm thinking I'm not seeing love and admiration on the left. (Laughter) In fact, I know this is his sister, and you can just see him saying, "Okay, we're doing this for the camera, and then afterwards you steal my candy and you punch me in the face." (Laughter) He'll kill me for showing that.

06:43
All right, so what does this mean? It means we bring an enormous amount of processing power to the problem. It engages deep systems in our brain, in dopaminergic systems that are there to make you chase sex, food and salt. They keep you alive. It gives them that kind of behavioral punch which we've called a superpower.

07:01
So how can we take that and arrange a kind of staged social interaction and turn that into a scientific probe? And the short answer is games. Economic games. So what we do is we go into two areas. One area is called experimental economics. The other area is called behavioral economics. And we steal their games. And we contrive them to our own purposes.

07:22
So this shows you one particular game called an ultimatum game. Red person is given a hundred dollars and can offer a split to blue. Let's say red wants to keep 70, and offers blue 30. So he offers a 70-30 split with blue. Control passes to blue, and blue says, "I accept it," in which case he'd get the money, or blue says, "I reject it," in which case no one gets anything. Okay? So a rational choice economist would say, well, you should take all non-zero offers. What do people do? People are indifferent at an 80-20 split. At 80-20, it's a coin flip whether you accept that or not. Why is that? You know, because you're pissed off. You're mad. That's an unfair offer, and you know what an unfair offer is. This is the kind of game done by my lab and many around the world.

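(Editor's sketch of the ultimatum game just described, assuming nothing about the lab's actual task code: a rational-choice responder takes any non-zero offer, while a "behavioral" responder accepts with a probability chosen here as an arbitrary logistic curve so that an 80-20 split is roughly a coin flip, as in the talk.)

    import math
    import random

    POT = 100  # dollars split between red (proposer) and blue (responder)

    def rational_responder(offer: int) -> bool:
        """Rational-choice prediction: accept any non-zero offer."""
        return offer > 0

    def behavioral_responder(offer: int) -> bool:
        """Toy behavioral rule: acceptance probability is ~0.5 at a 20-dollar offer."""
        p_accept = 1.0 / (1.0 + math.exp(-(offer - 20) / 5.0))
        return random.random() < p_accept

    def play(offer: int, responder) -> tuple:
        """One round: (proposer payoff, responder payoff); rejection means no one gets anything."""
        return (POT - offer, offer) if responder(offer) else (0, 0)

    random.seed(0)
    trials = 10_000
    for offer in (50, 30, 20, 10):
        accepted = sum(1 for _ in range(trials) if play(offer, behavioral_responder)[1] > 0)
        print(f"{POT - offer}-{offer} split: simulated acceptance rate {accepted / trials:.2f} "
              f"(rational responder would accept: {rational_responder(offer)})")
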
08:06
That just gives you an example of the kind of thing that these games probe. The interesting thing is, these games require that you have a lot of cognitive apparatus on line. You have to be able to come to the table with a proper model of another person. You have to be able to remember what you've done. You have to stand up in the moment to do that. Then you have to update your model based on the signals coming back, and you have to do something that is interesting, which is you have to do a kind of depth of thought assay. That is, you have to decide what that other person expects of you. You have to send signals to manage your image in their mind. Like a job interview. You sit across the desk from somebody, they have some prior image of you, you send signals across the desk to move their image of you from one place to a place where you want it to be. We're so good at this we don't really even notice it. These kinds of probes exploit it. Okay?

08:57
In doing this, what we've discovered is that humans are literal canaries in social exchanges. Canaries used to be used as kind of biosensors in mines. When methane built up, or carbon dioxide built up, or oxygen was diminished, the birds would swoon before people would -- so it acted as an early warning system: Hey, get out of the mine. Things aren't going so well. People come to the table, and even in these very blunt, staged social interactions, where there are just numbers going back and forth between the people, they bring enormous sensitivities to it. So we realized we could exploit this, and in fact we've done this now in many thousands of people, I think on the order of five or six thousand. To make this a biological probe we actually need bigger numbers than that, remarkably so. But anyway, patterns have emerged, and we've been able to take those patterns, convert them into mathematical models, and use those mathematical models to gain new insights into these exchanges. Okay, so what?

09:55
Well, the so what is, that's a really nice behavioral measure. The economic games bring to us notions of optimal play. We can compute that during the game, and we can use that to sort of carve up the behavior.

10:07
Here's the cool thing. Six or seven years ago, we developed a team. It was at the time in Houston, Texas. It's now in Virginia and London. And we built software that'll link functional magnetic resonance imaging devices up over the Internet. I guess we've done up to six machines at a time, but let's just focus on two. So it synchronizes machines anywhere in the world. We synchronize the machines, set them into these staged social interactions, and we eavesdrop on both of the interacting brains. So for the first time, we don't have to look at just averages over single individuals, or have individuals playing computers, or try to make inferences that way. We can study individual dyads.

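(Editor's sketch of the synchronization idea only; this is not the lab's actual software. The assumption here is that each scanning site checks in with a coordinator, receives one shared start time, and logs task events against that common clock so the two recordings can later be aligned.)

    import threading
    import time

    class Coordinator:
        """Hands every site the same start time once all sites have checked in."""
        def __init__(self, n_sites):
            self._barrier = threading.Barrier(n_sites)
            self._lock = threading.Lock()
            self._start = None

        def check_in(self, site_name):
            self._barrier.wait()                  # block until every site is ready
            with self._lock:
                if self._start is None:           # first site through sets the clock
                    self._start = time.time() + 1.0
            return self._start

    def run_site(name, coordinator):
        start = coordinator.check_in(name)
        time.sleep(max(0.0, start - time.time()))  # wait for the shared start
        stamp = time.time() - start                # a task event on the shared clock
        print(f"{name}: first stimulus logged at +{stamp:.3f} s (shared clock)")

    coord = Coordinator(n_sites=2)
    sites = [threading.Thread(target=run_site, args=(s, coord)) for s in ("site A", "site B")]
    for s in sites:
        s.start()
    for s in sites:
        s.join()
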
10:46
We can study the way that one person interacts with another person, turn the numbers up, and start to gain new insights into the boundaries of normal cognition, but more importantly, we can put people with classically defined mental illnesses, or brain damage, into these social interactions, and use these as probes of that.

11:03
So we've started this effort. We've made a few hits, a few, I think, embryonic discoveries. We think there's a future to this. But it's our way of going in and redefining mental illness with a new lexicon, a mathematical one actually, as opposed to the standard ways that we characterize these diseases, by using the people as birds in the exchanges. That is, we exploit the healthy partner, playing somebody with major depression, or somebody with autism spectrum disorder, or somebody with attention deficit hyperactivity disorder; we use that healthy partner as a kind of biosensor, then we use computer programs to model that person, and it gives us a kind of assay of this.

11:45
Early days, and we're just beginning. We're setting up sites around the world. Here are a few of our collaborating sites. The hub, ironically enough, is centered in little Roanoke, Virginia. There's another hub in London now, and the rest are getting set up. We hope to give the data away at some stage. That's a complicated issue about making it available to the rest of the world. But we're also studying just a small part of what makes us interesting as human beings, and so I would invite other people who are interested in this to ask us for the software, or even for guidance on how to move forward with that.

12:19
Let me leave you with one thought in closing. The interesting thing about studying cognition has been that we've been limited, in a way. We just haven't had the tools to look at interacting brains simultaneously. The fact is, though, that even when we're alone, we're a profoundly social creature. We're not a solitary mind built out of properties that kept it alive in the world independent of other people. In fact, our minds depend on other people. They depend on other people, and they're expressed in other people, so the notion of who you are, you often don't know who you are until you see yourself in interaction with people that are close to you, people that are enemies of you, people that are agnostic to you. So this is the first sort of step into using that insight into what makes us human beings, turning it into a tool, and trying to gain new insights into mental illness. Thanks for having me. (Applause)