Read Montague: What we're learning from 5,000 brains

46,909 views ・ 2012-09-24

TED


00:00
Translator: Joseph Geni Reviewer: Morton Bast

00:15
Other people. Everyone is interested in other people. Everyone has relationships with other people, and they're interested in these relationships for a variety of reasons. Good relationships, bad relationships, annoying relationships, agnostic relationships, and what I'm going to do is focus on the central piece of an interaction that goes on in a relationship.

00:35
So I'm going to take as inspiration the fact that we're all interested in interacting with other people, I'm going to completely strip it of all its complicating features, and I'm going to turn that object, that simplified object, into a scientific probe, and provide the early stages, embryonic stages of new insights into what happens in two brains while they simultaneously interact.

00:58
But before I do that, let me tell you a couple of things that made this possible. The first is we can now eavesdrop safely on healthy brain activity. Without needles and radioactivity, without any kind of clinical reason, we can go down the street and record from your friends' and neighbors' brains while they do a variety of cognitive tasks, and we use a method called functional magnetic resonance imaging.

01:23
You've probably all read about it or heard about it in some incarnation. Let me give you a two-sentence version of it. So we've all heard of MRIs. MRIs use magnetic fields and radio waves and they take snapshots of your brain or your knee or your stomach, grayscale images that are frozen in time. In the 1990s, it was discovered you could use the same machines in a different mode, and in that mode, you could make microscopic blood flow movies from hundreds of thousands of sites independently in the brain.

01:50
Okay, so what? In fact, the so what is, in the brain, changes in neural activity, the things that make your brain work, the things that make your software work in your brain, are tightly correlated with changes in blood flow. You make a blood flow movie, you have an independent proxy of brain activity.
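
The logic of that blood-flow proxy can be sketched in a few lines. The following is a toy illustration in Python, not anything from the speaker's lab: the task timeline, the made-up smoothing kernel standing in for a hemodynamic response, and the noise level are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task timeline: 1 while the subject does the task, 0 at rest.
task = np.tile([0] * 10 + [1] * 10, 6).astype(float)

# Fake "blood flow" signal: a sluggish, noisy copy of the neural events.
# Real fMRI analysis convolves with a hemodynamic response function and fits
# a regression per voxel; here we just smear and delay the task timeline.
kernel = np.exp(-np.arange(8) / 3.0)            # crude stand-in for that response
bold = np.convolve(task, kernel)[: task.size]   # delayed, smeared response
bold += 0.5 * rng.standard_normal(task.size)    # measurement noise

# How tightly does this voxel's signal track the task?
r = np.corrcoef(task, bold)[0, 1]
print(f"correlation between task and blood-flow signal: r = {r:.2f}")
```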

02:06
This has literally revolutionized cognitive science. Take any cognitive domain you want, memory, motor planning, thinking about your mother-in-law, getting angry at people, emotional response, it goes on and on, put people into functional MRI devices, and image how these kinds of variables map onto brain activity. It's in its early stages, and it's crude by some measures, but in fact, 20 years ago, we were at nothing. You couldn't do people like this. You couldn't do healthy people.

02:31
That's caused a literal revolution, and it's opened us up to a new experimental preparation. Neurobiologists, as you well know, have lots of experimental preps, worms and rodents and fruit flies and things like this. And now, we have a new experimental prep: human beings. We can now use human beings to study and model the software in human beings, and we have a few burgeoning biological measures.

02:56
Okay, let me give you one example of the kinds of experiments that people do, and it's in the area of what you'd call valuation. Valuation is just what you think it is, you know? If you went and you were valuing two companies against one another, you'd want to know which was more valuable. Cultures discovered the key feature of valuation thousands of years ago. If you want to compare oranges to windshields, what do you do? Well, you can't compare oranges to windshields. They're immiscible. They don't mix with one another. So instead, you convert them to a common currency scale, put them on that scale, and value them accordingly.
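
That "common currency" move is simple enough to show directly. A throwaway Python sketch, with made-up prices that are purely illustrative:

```python
# Hypothetical prices: map incommensurable goods onto one scale (dollars),
# and only then does the comparison become meaningful.
price_per_unit = {"orange": 0.80, "windshield": 250.00}
basket = {"orange": 300, "windshield": 1}

value = {item: qty * price_per_unit[item] for item, qty in basket.items()}
print(value)                      # {'orange': 240.0, 'windshield': 250.0}
print(max(value, key=value.get))  # 'windshield', but only on the common scale
```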

03:26
Well, your brain has to do something just like that as well, and we're now beginning to understand and identify brain systems involved in valuation, and one of them includes a neurotransmitter system whose cells are located in your brainstem and deliver the chemical dopamine to the rest of your brain.

03:43
I won't go through the details of it, but that's an important discovery, and we know a good bit about that now, and it's just a small piece of it, but it's important because those are the neurons that you would lose if you had Parkinson's disease, and they're also the neurons that are hijacked by literally every drug of abuse, and that makes sense. Drugs of abuse would come in, and they would change the way you value the world. They change the way you value the symbols associated with your drug of choice, and they make you value that over everything else.

04:07
Here's the key feature though. These neurons are also involved in the way you can assign value to literally abstract ideas, and I put some symbols up here that we assign value to for various reasons. We have a behavioral superpower in our brain, and it at least in part involves dopamine.

04:23
We can deny every instinct we have for survival for an idea, for a mere idea. No other species can do that. In 1997, the cult Heaven's Gate committed mass suicide predicated on the idea that there was a spaceship hiding in the tail of the then-visible comet Hale-Bopp waiting to take them to the next level. It was an incredibly tragic event. More than two thirds of them had college degrees. But the point here is they were able to deny their instincts for survival using exactly the same systems that were put there to make them survive. That's a lot of control, okay?

04:59
One thing that I've left out of this narrative is the obvious thing, which is the focus of the rest of my little talk, and that is other people. These same valuation systems are redeployed when we're valuing interactions with other people. So this same dopamine system that gets addicted to drugs, that makes you freeze when you get Parkinson's disease, that contributes to various forms of psychosis, is also redeployed to value interactions with other people and to assign value to gestures that you do when you're interacting with somebody else.

05:29
Let me give you an example of this. You bring to the table such enormous processing power in this domain that you hardly even notice it. Let me just give you a few examples. So here's a baby. She's three months old. She still poops in her diapers and she can't do calculus. She's related to me. Somebody will be very glad that she's up here on the screen. You can cover up one of her eyes, and you can still read something in the other eye, and I see sort of curiosity in one eye, I see maybe a little bit of surprise in the other.

05:55
Here's a couple. They're sharing a moment together, and we've even done an experiment where you can cut out different pieces of this frame and you can still see that they're sharing it. They're sharing it sort of in parallel. Now, the elements of the scene also communicate this to us, but you can read it straight off their faces, and if you compare their faces to normal faces, it would be a very subtle cue.

06:13
Here's another couple. He's projecting out at us, and she's clearly projecting, you know, love and admiration at him. Here's another couple. (Laughter) And I'm thinking I'm not seeing love and admiration on the left. (Laughter) In fact, I know this is his sister, and you can just see him saying, "Okay, we're doing this for the camera, and then afterwards you steal my candy and you punch me in the face." (Laughter) He'll kill me for showing that.

06:43
All right, so what does this mean? It means we bring an enormous amount of processing power to the problem. It engages deep systems in our brain, in dopaminergic systems that are there to make you chase sex, food and salt. They keep you alive. It gives them the pie, it gives that kind of a behavioral punch which we've called a superpower.

07:01
So how can we take that and arrange a kind of staged social interaction and turn that into a scientific probe? And the short answer is games. Economic games. So what we do is we go into two areas. One area is called experimental economics. The other area is called behavioral economics. And we steal their games. And we contrive them to our own purposes.

07:22
So this shows you one particular game called an ultimatum game. Red person is given a hundred dollars and can offer a split to blue. Let's say red wants to keep 70, and offers blue 30. So he offers a 70-30 split with blue. Control passes to blue, and blue says, "I accept it," in which case he'd get the money, or blue says, "I reject it," in which case no one gets anything. Okay? So a rational choice economist would say, well, you should take all non-zero offers. What do people do? People are indifferent at an 80-20 split. At 80-20, it's a coin flip whether you accept that or not. Why is that? You know, because you're pissed off. You're mad. That's an unfair offer, and you know what an unfair offer is.
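
The structure of that game is easy to simulate. Here is a minimal Python sketch of one proposer and responder round; the soft rejection threshold around an 80-20 split is an illustrative assumption meant to echo the behavior he describes, not a fitted value from the lab.

```python
import random

random.seed(1)

def responder_accepts(offer, pot=100):
    """Toy responder: accept fair-ish offers, flip a coin near an 80-20 split,
    reject the most lopsided ones. A rational-choice agent would accept any
    offer greater than zero."""
    share = offer / pot
    if share >= 0.30:
        return True
    if share >= 0.15:               # around 80-20: roughly a coin flip
        return random.random() < 0.5
    return False

def play_round(keep, pot=100):
    offer = pot - keep
    if responder_accepts(offer, pot):
        return keep, offer          # proposer payoff, responder payoff
    return 0, 0                     # rejection: no one gets anything

for keep in (50, 70, 80, 95):
    print(f"proposer keeps {keep}:", play_round(keep))
```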

08:03
This is the kind of game done by my lab and many around the world. That just gives you an example of the kind of thing that these games probe. The interesting thing is, these games require that you have a lot of cognitive apparatus on line.

08:16
You have to be able to come to the table with a proper model of another person. You have to be able to remember what you've done. You have to stand up in the moment to do that. Then you have to update your model based on the signals coming back, and you have to do something that is interesting, which is you have to do a kind of depth of thought assay. That is, you have to decide what that other person expects of you. You have to send signals to manage your image in their mind. Like a job interview. You sit across the desk from somebody, they have some prior image of you, you send signals across the desk to move their image of you from one place to a place where you want it to be. We're so good at this we don't really even notice it. These kinds of probes exploit it. Okay?
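
That loop, arriving with a model of the other person and revising it as signals come back, can be caricatured as a running estimate. A hypothetical sketch, with an invented learning rate and offer sequence:

```python
def update_belief(belief, observed_offer, pot=100, learning_rate=0.3):
    """Revise an estimate of how generous the partner is (their typical
    offered share) after seeing one more offer. Purely illustrative."""
    observed_share = observed_offer / pot
    return belief + learning_rate * (observed_share - belief)

belief = 0.5                        # prior: expect roughly an even split
for offer in (30, 25, 40, 20):      # the signals coming back, round by round
    belief = update_belief(belief, offer)
    print(f"saw offer {offer:>2} -> now expect about a {belief:.0%} share")
```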

08:57
In doing this, what we've discovered is that humans are literal canaries in social exchanges. Canaries used to be used as kind of biosensors in mines. When methane built up, or carbon dioxide built up, or oxygen was diminished, the birds would swoon before people would -- so it acted as an early warning system: Hey, get out of the mine. Things aren't going so well. People come to the table, and even in these very blunt, staged social interactions, where there are just numbers going back and forth between the people, they bring enormous sensitivities to it.

09:29
So we realized we could exploit this, and in fact, as we've done that, and we've done this now in many thousands of people, I think on the order of five or six thousand. We actually, to make this a biological probe, need bigger numbers than that, remarkably so. But anyway, patterns have emerged, and we've been able to take those patterns, convert them into mathematical models, and use those mathematical models to gain new insights into these exchanges. Okay, so what?

09:55
Well, the so what is, that's a really nice behavioral measure, the economic games bring to us notions of optimal play. We can compute that during the game. And we can use that to sort of carve up the behavior.
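
One simple way to carve up behavior against a notion of optimal play is to score each choice by whether it matches the rational-choice benchmark. Again a hypothetical sketch rather than the lab's actual measure; the example rounds are invented.

```python
def departs_from_benchmark(offer, accepted):
    """Rational-choice benchmark for the responder: accept any positive offer.
    Returns 1 when the observed choice departs from that benchmark."""
    benchmark = offer > 0
    return int(accepted != benchmark)

rounds = [(50, True), (30, True), (20, False), (10, False)]   # (offer, accepted)
score = sum(departs_from_benchmark(o, a) for o, a in rounds) / len(rounds)
print(f"fraction of choices departing from the rational benchmark: {score:.2f}")
```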

10:07
Here's the cool thing. Six or seven years ago, we developed a team. It was at the time in Houston, Texas. It's now in Virginia and London. And we built software that'll link functional magnetic resonance imaging devices up over the Internet. I guess we've done up to six machines at a time, but let's just focus on two. So it synchronizes machines anywhere in the world.
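
The talk doesn't describe how that linking software works, but the core scheduling idea, two distant scanners stepping through the same trials against a shared clock, can be sketched. Everything below (the TrialClock class, the trial length, the start time) is a simplified assumption; a real system would also have to handle network delay and clock drift.

```python
import time

class TrialClock:
    """Both sites schedule trial k at start + k * trial_length seconds,
    so stimulus events line up even though the machines are far apart."""
    def __init__(self, start, trial_length=10.0):
        self.start = start
        self.trial_length = trial_length

    def trial_onset(self, k):
        return self.start + k * self.trial_length

    def wait_for_trial(self, k):
        delay = self.trial_onset(k) - time.time()
        if delay > 0:
            time.sleep(delay)

# Both sites are handed the same agreed-upon start time in advance.
shared_start = time.time() + 2.0
site_a = TrialClock(shared_start)
site_b = TrialClock(shared_start)
assert site_a.trial_onset(3) == site_b.trial_onset(3)
```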

10:30
We synchronize the machines, set them into these staged social interactions, and we eavesdrop on both of the interacting brains. So for the first time, we don't have to look at just averages over single individuals, or have individuals playing computers, or try to make inferences that way. We can study individual dyads. We can study the way that one person interacts with another person, turn the numbers up, and start to gain new insights into the boundaries of normal cognition, but more importantly, we can put people with classically defined mental illnesses, or brain damage, into these social interactions, and use these as probes of that.

11:03
So we've started this effort. We've made a few hits, a few, I think, embryonic discoveries. We think there's a future to this. But it's our way of going in and redefining, with a new lexicon, a mathematical one actually, as opposed to the standard ways that we think about mental illness, characterizing these diseases, by using the people as birds in the exchanges. That is, we exploit the fact that the healthy partner, playing somebody with major depression, or playing somebody with autism spectrum disorder, or playing somebody with attention deficit hyperactivity disorder, we use that as a kind of biosensor, and then we use computer programs to model that person, and it gives us a kind of assay of this.
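
A toy version of that "partner as biosensor" idea might collapse the healthy partner's side of the exchange into a few summary numbers that a downstream model could use. The features below are invented for illustration and are nothing like the lab's actual assay.

```python
import statistics

def summarize_partner(offers_received, responses):
    """Collapse one exchange, seen from the healthy partner's side, into a
    small feature vector. Feature choices are illustrative only."""
    return {
        "mean_offer_seen": statistics.mean(offers_received),
        "rejection_rate": responses.count(False) / len(responses),
        "offer_variability": statistics.pstdev(offers_received),
    }

# Two hypothetical exchanges: same healthy partner, different counterparts.
print(summarize_partner([45, 40, 50, 45], [True, True, True, True]))
print(summarize_partner([20, 15, 35, 10], [False, False, True, False]))
```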

11:45
Early days, and we're just beginning, we're setting up sites around the world. Here are a few of our collaborating sites. The hub, ironically enough, is centered in little Roanoke, Virginia. There's another hub in London, now, and the rest are getting set up. We hope to give the data away at some stage. That's a complicated issue about making it available to the rest of the world.

12:08
But we're also studying just a small part of what makes us interesting as human beings, and so I would invite other people who are interested in this to ask us for the software, or even for guidance on how to move forward with that.

12:19
Let me leave you with one thought in closing. The interesting thing about studying cognition has been that we've been limited, in a way. We just haven't had the tools to look at interacting brains simultaneously. The fact is, though, that even when we're alone, we're a profoundly social creature. We're not a solitary mind built out of properties that kept it alive in the world independent of other people. In fact, our minds depend on other people. They depend on other people, and they're expressed in other people, so the notion of who you are, you often don't know who you are until you see yourself in interaction with people that are close to you, people that are enemies of you, people that are agnostic to you.

13:02
So this is the first sort of step into using that insight into what makes us human beings, turning it into a tool, and trying to gain new insights into mental illness. Thanks for having me. (Applause)

13:14
(Applause)