Steve Ramirez and Xu Liu: A mouse. A laser beam. A manipulated memory.

137,324 views ・ 2013-08-15

TED


00:12
Steve Ramirez: My first year of grad school, I found myself in my bedroom eating lots of Ben & Jerry's, watching some trashy TV, and maybe, maybe listening to Taylor Swift. I had just gone through a breakup.

(Laughter)

00:26
So for the longest time, all I would do is recall the memory of this person over and over again, wishing that I could get rid of that gut-wrenching, visceral "blah" feeling. Now, as it turns out, I'm a neuroscientist, so I knew that the memory of that person and the awful, emotional undertones that color in that memory are largely mediated by separate brain systems. And so I thought, what if we could go into the brain and edit out that nauseating feeling but while keeping the memory of that person intact? Then I realized, maybe that's a little bit lofty for now. So what if we could start off by going into the brain and just finding a single memory to begin with? Could we jump-start that memory back to life, maybe even play with the contents of that memory? All that said, there is one person in the entire world right now that I really hope is not watching this talk.

(Laughter)

01:17
So there is a catch. There is a catch. These ideas probably remind you of "Total Recall," "Eternal Sunshine of the Spotless Mind," or of "Inception." But the movie stars that we work with are the celebrities of the lab.
01:30
Xu Liu: Test mice.

(Laughter)

01:33
As neuroscientists, we work in the lab with mice, trying to understand how memory works. And today, we hope to convince you that now we are actually able to activate a memory in the brain at the speed of light. To do this, there are only two simple steps to follow. First, you find and label a memory in the brain, and then you activate it with a switch. As simple as that.

(Laughter)

02:01
SR: Are you convinced? So, turns out finding a memory in the brain isn't all that easy.

02:07
XL: Indeed. This is way more difficult than, let's say, finding a needle in a haystack, because at least, you know, the needle is still something you can physically put your fingers on. But memory is not. And also, there are way more cells in your brain than the number of straws in a typical haystack. So yeah, this task does seem to be daunting. But luckily, we got help from the brain itself. It turned out that all we need to do is basically to let the brain form a memory, and then the brain will tell us which cells are involved in that particular memory.

02:44
SR: So what was going on in my brain while I was recalling the memory of an ex? If you were to just completely ignore human ethics for a second and slice up my brain right now, you would see that there was an amazing number of brain regions that were active while recalling that memory. Now, one brain region that would be robustly active in particular is called the hippocampus, which for decades has been implicated in processing the kinds of memories that we hold near and dear, which also makes it an ideal target to go into and to try and find and maybe reactivate a memory.

03:13
XL: When you zoom in on the hippocampus, of course you will see lots of cells, but we are able to find which cells are involved in a particular memory, because whenever a cell is active, like when it's forming a memory, it will also leave a footprint that will later allow us to know that these cells were recently active.

03:32
SR: So the same way that building lights at night let you know that somebody's probably working there at any given moment, in a very real sense, there are biological sensors within a cell that are turned on only when that cell was just working. They're sort of biological windows that light up to let us know that that cell was just active.

03:49
XL: So we clipped part of this sensor and attached it to a switch to control the cells, and we packed this switch into an engineered virus and injected that into the brain of the mice. So whenever a memory is being formed, any active cells for that memory will also have this switch installed.
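To make the logic of that step concrete, here is a minimal Python sketch of the tagging rule as described in the talk. It is our own illustration, not the lab's code or the underlying biology: any cell whose activity crosses a threshold during the memory-formation window is recorded as "tagged," standing in for the cells that end up carrying the switch. All names and numbers are made up.

```python
import random

def tag_active_cells(activity_by_cell, threshold=0.8):
    """Cells whose activity during the memory-formation window crosses
    the threshold are the ones that would end up carrying the switch."""
    return {cell for cell, activity in activity_by_cell.items()
            if activity >= threshold}

# Toy "memory formation": every cell gets a random activity level;
# only the most active ones are tagged. Numbers are illustrative only.
random.seed(0)
activity = {f"cell_{i}": random.random() for i in range(1000)}
tagged = tag_active_cells(activity)
print(f"{len(tagged)} of {len(activity)} cells tagged with the switch")
```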
04:09
SR: So here is what the hippocampus looks like after forming a fear memory, for example. The sea of blue that you see here is densely packed brain cells, but the green brain cells, the green brain cells are the ones that are holding on to a specific fear memory. So you are looking at the crystallization of the fleeting formation of fear. You're actually looking at the cross-section of a memory right now.

04:31
XL: Now, for the switch we have been talking about, ideally, the switch has to act really fast. It shouldn't take minutes or hours to work. It should act at the speed of the brain, in milliseconds.

04:43
SR: So what do you think, Xu? Could we use, let's say, pharmacological drugs to activate or inactivate brain cells?

04:49
XL: Nah. Drugs are pretty messy. They spread everywhere. And also it takes them forever to act on cells. So they will not allow us to control a memory in real time. So Steve, how about we zap the brain with electricity?

05:04
SR: So electricity is pretty fast, but we probably wouldn't be able to target it to just the specific cells that hold onto a memory, and we'd probably fry the brain.

05:12
XL: Oh. That's true. So it looks like, hmm, indeed we need to find a better way to impact the brain at the speed of light.

05:21
SR: So it just so happens that light travels at the speed of light. So maybe we could activate or inactivate memories by just using light --

XL: That's pretty fast.

05:33
SR: -- and because normally brain cells don't respond to pulses of light, those that would respond to pulses of light are those that contain a light-sensitive switch. Now to do that, first we need to trick brain cells to respond to laser beams.

05:44
XL: Yep. You heard it right. We are trying to shoot lasers into the brain.

(Laughter)

05:49
SR: And the technique that lets us do that is optogenetics. Optogenetics gave us this light switch that we can use to turn brain cells on or off, and the name of that switch is channelrhodopsin, seen here as these green dots attached to this brain cell. You can think of channelrhodopsin as a sort of light-sensitive switch that can be artificially installed in brain cells so that now we can use that switch to activate or inactivate the brain cell simply by clicking it, and in this case we click it on with pulses of light.

06:15
XL: So we attach this light-sensitive switch of channelrhodopsin to the sensor we've been talking about and inject this into the brain. So whenever a memory is being formed, any cell active for that particular memory will also have this light-sensitive switch installed in it, so that we can control these cells by the flipping of a laser just like this one you see.
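A toy Python sketch of the gating rule just described, purely for illustration: a light pulse drives only the cells that were tagged with the channelrhodopsin switch during memory formation, and every other cell ignores it. Cell names are invented.

```python
def deliver_light_pulse(tagged_cells, all_cells):
    """Only cells carrying the light-sensitive switch respond to a pulse;
    every other cell ignores the light."""
    return {cell: (cell in tagged_cells) for cell in all_cells}

all_cells = ["cell_1", "cell_2", "cell_3", "cell_4"]
tagged = {"cell_2", "cell_4"}            # labeled during memory formation
responses = deliver_light_pulse(tagged, all_cells)
print(responses)  # True only for the tagged cells
```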
06:39
SR: So let's put all of this to the test now. What we can do is we can take our mice and then we can put them in a box that looks exactly like this box here, and then we can give them a very mild foot shock so that they form a fear memory of this box. They learn that something bad happened here. Now with our system, only the cells that were active in the hippocampus during the making of this memory will contain channelrhodopsin.

07:02
XL: When you are as small as a mouse, it feels as if the whole world is trying to get you. So your best defensive response is trying to be undetected. Whenever a mouse is in fear, it will show this very typical behavior of staying in one corner of the box, trying not to move any part of its body, and this posture is called freezing. So if a mouse remembers that something bad happened in this box, when we put it back into the same box, it will basically show freezing, because it doesn't want to be detected by any potential threats in this box.
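Freezing is typically quantified from video as the share of time the animal shows essentially no movement. Here is a small Python sketch of that idea; the motion trace and the threshold are made up for illustration, not values from the experiment.

```python
def freezing_fraction(speeds, threshold=0.5):
    """Fraction of video frames in which the animal's speed is below a
    small threshold, a simple proxy for freezing."""
    frozen = sum(1 for speed in speeds if speed < threshold)
    return frozen / len(speeds)

# Toy motion trace (speed per frame, arbitrary units): the mouse explores,
# then stops moving once it recognizes the feared context.
trace = [3.1, 2.7, 2.9, 0.1, 0.0, 0.2, 0.1, 0.0]
print(f"freezing: {freezing_fraction(trace):.0%}")
```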
07:38
SR: So you can think of freezing like this: you're walking down the street minding your own business, and then out of nowhere you almost run into an ex-girlfriend or ex-boyfriend, and now come those terrifying two seconds where you start thinking, "What do I do? Do I say hi? Do I shake their hand? Do I turn around and run away? Do I sit here and pretend like I don't exist?" Those kinds of fleeting thoughts that physically incapacitate you, that temporarily give you that deer-in-headlights look.

07:59
XL: However, if you put the mouse in a completely different new box, like the next one, it will not be afraid of this box, because there's no reason that it would be afraid of this new environment. But what if we put the mouse in this new box but, at the same time, activate the fear memory using lasers just like we did before? Are we going to bring back the fear memory for the first box into this completely new environment?

08:29
SR: All right, and here's the million-dollar experiment. Now to bring back to life the memory of that day: I remember that the Red Sox had just won, it was a green spring day, perfect for going up and down the river and then maybe going to the North End to get some cannolis, #justsaying. Now Xu and I, on the other hand, were in a completely windowless black room, not making any ocular movement that even remotely resembles an eye blink, because our eyes were fixed onto a computer screen. We were looking at this mouse here, trying to activate a memory for the first time using our technique.

09:02
XL: And this is what we saw. When we first put the mouse into this box, it's exploring, sniffing around, walking around, minding its own business, because actually by nature, mice are pretty curious animals. They want to know, what's going on in this new box? It's interesting. But the moment we turned on the laser, like you see now, all of a sudden the mouse entered this freezing mode. It stayed here and tried not to move any part of its body. Clearly it's freezing. So indeed, it looks like we are able to bring back the fear memory for the first box in this completely new environment.
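One way to read such a session is to compare freezing during laser-off and laser-on periods. The numbers in this Python sketch are invented for illustration; they only mirror the qualitative pattern the speakers describe.

```python
def mean(values):
    return sum(values) / len(values)

# Hypothetical freezing scores (fraction of time frozen) for one mouse in
# the new box, measured over alternating light-off and light-on epochs.
light_off_epochs = [0.05, 0.08, 0.06]
light_on_epochs = [0.42, 0.51, 0.47]

print(f"light off: {mean(light_off_epochs):.0%} freezing")
print(f"light on:  {mean(light_on_epochs):.0%} freezing")
# Freezing that rises only while the laser is on is the signature that the
# labeled cells can bring the fear memory back in the new environment.
```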
09:39
XL: While watching this, Steve and I are as shocked as the mouse itself.

(Laughter)

09:45
So after the experiment, the two of us just left the room without saying anything. After a kind of long, awkward period of time, Steve broke the silence.

09:55
SR: "Did that just work?"

09:58
XL: "Yes," I said. "Indeed it worked!" We're really excited about this. And then we published our findings in the journal Nature. Ever since the publication of our work, we've been receiving numerous comments from all over the Internet. Maybe we can take a look at some of those.

10:18
["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. Ahhh the future is awesome"]
10:20
SR: So the first thing that you'll notice is that people have really strong opinions about this kind of work. Now I happen to completely agree with the optimism of this first quote, because on a scale of zero to Morgan Freeman's voice, it happens to be one of the most evocative accolades that I've heard come our way.

(Laughter)

10:37
But as you'll see, it's not the only opinion that's out there.

10:39
["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"]

10:41
XL: Indeed, if we take a look at the second one, I think we can all agree that it's, meh, probably not as positive. But this also reminds us that, although we are still working with mice, it's probably a good idea to start thinking and discussing the possible ethical ramifications of memory control.

11:00
SR: Now, in the spirit of the third quote, we want to tell you about a recent project that we've been working on in the lab, which we've called Project Inception.

11:07
["They should make a movie about this. Where they plant ideas into peoples minds, so they can control them for their own personal gain. We'll call it: Inception."]
11:10
SR: So we reasoned that now that we can reactivate a memory, what if we do so but then begin to tinker with that memory? Could we possibly even turn it into a false memory?

11:20
XL: So all memory is sophisticated and dynamic, but just for simplicity, let's imagine memory as a movie clip. So far what we've told you is basically that we can control the "play" button of the clip so that we can play this video clip any time, anywhere. But is there a possibility that we can actually get inside the brain and edit this movie clip so that we can make it different from the original? Yes, we can. It turned out that all we need to do is basically reactivate a memory using lasers just like we did before, but at the same time, if we present new information and allow this new information to be incorporated into this old memory, this will change the memory. It's sort of like making a remix tape.
12:08
SR: So how do we do this? Rather than finding a fear memory in the brain, we can start by taking our animals, and let's say we put them in a blue box like this blue box here, and we find the brain cells that represent that blue box, and we trick them to respond to pulses of light exactly like we had said before. Now the next day, we can take our animals and place them in a red box that they've never experienced before. We can shoot light into the brain to reactivate the memory of the blue box. So what would happen here if, while the animal is recalling the memory of the blue box, we gave it a couple of mild foot shocks? So here we're trying to artificially make an association between the memory of the blue box and the foot shocks themselves. We're just trying to connect the two.

12:47
So to test if we had done so, we can take our animals once again and place them back in the blue box. Again, we had just reactivated the memory of the blue box while the animal got a couple of mild foot shocks, and now the animal suddenly freezes. It's as though it's recalling being mildly shocked in this environment, even though that never actually happened. So it formed a false memory, because it's falsely fearing an environment where, technically speaking, nothing bad actually happened to it.
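Laid out as data, the three-day design reads like the Python sketch below. The box colors, day numbers, and laser/shock pairings simply restate the protocol from the talk; the structure and wording of the code are our own illustration.

```python
# Schematic outline of the false-memory experiment described above.
protocol = [
    {"day": 1, "box": "blue", "laser": False, "shock": False,
     "goal": "label the cells that represent the blue box"},
    {"day": 2, "box": "red", "laser": True, "shock": True,
     "goal": "reactivate the blue-box memory while giving mild foot shocks"},
    {"day": 3, "box": "blue", "laser": False, "shock": False,
     "goal": "test: the mouse freezes although nothing bad happened here"},
]

for step in protocol:
    print(f"Day {step['day']}: {step['box']} box "
          f"(laser={step['laser']}, shock={step['shock']}) - {step['goal']}")
```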
13:13
XL: So, so far we are only talking about this light-controlled "on" switch. In fact, we also have a light-controlled "off" switch, and it's very easy to imagine that by installing this light-controlled "off" switch, we can also turn off a memory, any time, anywhere. So everything we've been talking about today is based on this philosophically charged principle of neuroscience: that the mind, with its seemingly mysterious properties, is actually made of physical stuff that we can tinker with.

13:46
SR: And for me personally, I see a world where we can reactivate any kind of memory that we'd like. I also see a world where we can erase unwanted memories. Now, I even see a world where editing memories is something of a reality, because we're living in a time where it's possible to pluck questions from the tree of science fiction and to ground them in experimental reality.

14:04
XL: Nowadays, people in the lab and people in other groups all over the world are using similar methods to activate or edit memories, whether that's old or new, positive or negative, all sorts of memories, so that we can understand how memory works.

14:20
SR: For example, one group in our lab was able to find the brain cells that make up a fear memory and converted them into a pleasurable memory, just like that. That's exactly what I mean about editing these kinds of processes. Now one dude in lab was even able to reactivate memories of female mice in male mice, which rumor has it is a pleasurable experience.

14:38
XL: Indeed, we are living in a very exciting moment where science doesn't have any arbitrary speed limits but is only bound by our own imagination.

14:49
SR: And finally, what do we make of all this? How do we push this technology forward? These are the questions that should not remain just inside the lab, and so one goal of today's talk was to bring everybody up to speed with the kind of stuff that's possible in modern neuroscience, but now, just as importantly, to actively engage everybody in this conversation. So let's think together as a team about what this all means and where we can and should go from here, because Xu and I think we all have some really big decisions ahead of us. Thank you.

XL: Thank you.

(Applause)