Steve Ramirez and Xu Liu: A mouse. A laser beam. A manipulated memory.

137,324 views

2013-08-15 ・ TED




Translator: Senzos Osijek　Reviewer: Ivan Stamenković
00:12 Steve Ramirez: My first year of grad school, I found myself in my bedroom eating lots of Ben & Jerry's, watching some trashy TV, and maybe, maybe listening to Taylor Swift. I had just gone through a breakup. (Laughter) So for the longest time, all I would do is recall the memory of this person over and over again, wishing that I could get rid of that gut-wrenching, visceral "blah" feeling. Now, as it turns out, I'm a neuroscientist, so I knew that the memory of that person and the awful, emotional undertones that color in that memory are largely mediated by separate brain systems. And so I thought, what if we could go into the brain and edit out that nauseating feeling but while keeping the memory of that person intact? Then I realized, maybe that's a little bit lofty for now. So what if we could start off by going into the brain and just finding a single memory to begin with? Could we jump-start that memory back to life, maybe even play with the contents of that memory?

01:09 All that said, there is one person in the entire world right now that I really hope is not watching this talk. (Laughter) So there is a catch. There is a catch. These ideas probably remind you of "Total Recall," "Eternal Sunshine of the Spotless Mind," or of "Inception." But the movie stars that we work with are the celebrities of the lab.

01:30 Xu Liu: Test mice. (Laughter) As neuroscientists, we work in the lab with mice trying to understand how memory works. And today, we hope to convince you that now we are actually able to activate a memory in the brain at the speed of light. To do this, there's only two simple steps to follow. First, you find and label a memory in the brain, and then you activate it with a switch. As simple as that. (Laughter)
02:01 SR: Are you convinced? So, turns out finding a memory in the brain isn't all that easy.

02:07 XL: Indeed. This is way more difficult than, let's say, finding a needle in a haystack, because at least, you know, the needle is still something you can physically put your fingers on. But memory is not. And also, there's way more cells in your brain than the number of straws in a typical haystack. So yeah, this task does seem to be daunting. But luckily, we got help from the brain itself. It turned out that all we need to do is basically to let the brain form a memory, and then the brain will tell us which cells are involved in that particular memory.

02:44 SR: So what was going on in my brain while I was recalling the memory of an ex? If you were to just completely ignore human ethics for a second and slice up my brain right now, you would see that there was an amazing number of brain regions that were active while recalling that memory. Now one brain region that would be robustly active in particular is called the hippocampus, which for decades has been implicated in processing the kinds of memories that we hold near and dear, which also makes it an ideal target to go into and to try and find and maybe reactivate a memory.

03:13 XL: When you zoom in into the hippocampus, of course you will see lots of cells, but we are able to find which cells are involved in a particular memory, because whenever a cell is active, like when it's forming a memory, it will also leave a footprint that will later allow us to know these cells are recently active.

03:32 SR: So the same way that building lights at night let you know that somebody's probably working there at any given moment, in a very real sense, there are biological sensors within a cell that are turned on only when that cell was just working. They're sort of biological windows that light up to let us know that that cell was just active.
03:49 XL: So we clipped part of this sensor, and attached that to a switch to control the cells, and we packed this switch into an engineered virus and injected that into the brain of the mice. So whenever a memory is being formed, any active cells for that memory will also have this switch installed.

04:09 SR: So here is what the hippocampus looks like after forming a fear memory, for example. The sea of blue that you see here are densely packed brain cells, but the green brain cells, the green brain cells are the ones that are holding on to a specific fear memory. So you are looking at the crystallization of the fleeting formation of fear. You're actually looking at the cross-section of a memory right now.

04:31 XL: Now, for the switch we have been talking about, ideally, the switch has to act really fast. It shouldn't take minutes or hours to work. It should act at the speed of the brain, in milliseconds.

04:43 SR: So what do you think, Xu? Could we use, let's say, pharmacological drugs to activate or inactivate brain cells?

04:49 XL: Nah. Drugs are pretty messy. They spread everywhere. And also it takes them forever to act on cells. So it will not allow us to control a memory in real time. So Steve, how about let's zap the brain with electricity?

05:04 SR: So electricity is pretty fast, but we probably wouldn't be able to target it to just the specific cells that hold onto a memory, and we'd probably fry the brain.

05:12 XL: Oh. That's true. So it looks like, hmm, indeed we need to find a better way to impact the brain at the speed of light.

05:21 SR: So it just so happens that light travels at the speed of light. So maybe we could activate or inactivate memories by just using light --

05:31 XL: That's pretty fast.

05:33 SR: -- and because normally brain cells don't respond to pulses of light, those that would respond to pulses of light are those that contain a light-sensitive switch. Now to do that, first we need to trick brain cells to respond to laser beams.

05:44 XL: Yep. You heard it right. We are trying to shoot lasers into the brain. (Laughter)
05:49 SR: And the technique that lets us do that is optogenetics. Optogenetics gave us this light switch that we can use to turn brain cells on or off, and the name of that switch is channelrhodopsin, seen here as these green dots attached to this brain cell. You can think of channelrhodopsin as a sort of light-sensitive switch that can be artificially installed in brain cells so that now we can use that switch to activate or inactivate the brain cell simply by clicking it, and in this case we click it on with pulses of light.

06:15 XL: So we attach this light-sensitive switch of channelrhodopsin to the sensor we've been talking about and inject this into the brain. So whenever a memory is being formed, any active cell for that particular memory will also have this light-sensitive switch installed in it so that we can control these cells by the flipping of a laser just like this one you see.

06:39 SR: So let's put all of this to the test now. What we can do is we can take our mice and then we can put them in a box that looks exactly like this box here, and then we can give them a very mild foot shock so that they form a fear memory of this box. They learn that something bad happened here. Now with our system, the cells that are active in the hippocampus in the making of this memory, only those cells will now contain channelrhodopsin.

07:02 XL: When you are as small as a mouse, it feels as if the whole world is trying to get you. So your best response of defense is trying to be undetected. Whenever a mouse is in fear, it will show this very typical behavior by staying at one corner of the box, trying to not move any part of its body, and this posture is called freezing. So if a mouse remembers that something bad happened in this box, when we put it back into the same box, it will basically show freezing because it doesn't want to be detected by any potential threats in this box.

07:38 SR: So you can think of freezing as, you're walking down the street minding your own business, and then out of nowhere you almost run into an ex-girlfriend or ex-boyfriend, and now those terrifying two seconds where you start thinking, "What do I do? Do I say hi? Do I shake their hand? Do I turn around and run away? Do I sit here and pretend like I don't exist?" Those kinds of fleeting thoughts that physically incapacitate you, that temporarily give you that deer-in-headlights look.

07:59 XL: However, if you put the mouse in a completely different new box, like the next one, it will not be afraid of this box because there's no reason that it will be afraid of this new environment. But what if we put the mouse in this new box but at the same time, we activate the fear memory using lasers just like we did before? Are we going to bring back the fear memory for the first box into this completely new environment?
08:29 SR: All right, and here's the million-dollar experiment. Now to bring back to life the memory of that day, I remember that the Red Sox had just won, it was a green spring day, perfect for going up and down the river and then maybe going to the North End to get some cannolis, #justsaying. Now Xu and I, on the other hand, were in a completely windowless black room not making any ocular movement that even remotely resembles an eye blink, because our eyes were fixed onto a computer screen. We were looking at this mouse here trying to activate a memory for the first time using our technique.

09:02 XL: And this is what we saw. When we first put the mouse into this box, it's exploring, sniffing around, walking around, minding its own business, because actually by nature, mice are pretty curious animals. They want to know, what's going on in this new box? It's interesting. But the moment we turned on the laser, like you see now, all of a sudden the mouse entered this freezing mode. It stayed here and tried not to move any part of its body. Clearly it's freezing. So indeed, it looks like we are able to bring back the fear memory for the first box in this completely new environment. While watching this, Steve and I are as shocked as the mouse itself. (Laughter) So after the experiment, the two of us just left the room without saying anything. After a kind of long, awkward period of time, Steve broke the silence.

09:55 SR: "Did that just work?"
09:58 XL: "Yes," I said. "Indeed it worked!" We're really excited about this. And then we published our findings in the journal Nature. Ever since the publication of our work, we've been receiving numerous comments from all over the Internet. Maybe we can take a look at some of those.

10:18 ["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. Ahhh the future is awesome"]

10:20 SR: So the first thing that you'll notice is that people have really strong opinions about this kind of work. Now I happen to completely agree with the optimism of this first quote, because on a scale of zero to Morgan Freeman's voice, it happens to be one of the most evocative accolades that I've heard come our way. (Laughter) But as you'll see, it's not the only opinion that's out there.

10:39 ["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"]

10:41 XL: Indeed, if we take a look at the second one, I think we can all agree that it's, meh, probably not as positive. But this also reminds us that, although we are still working with mice, it's probably a good idea to start thinking and discussing about the possible ethical ramifications of memory control.

11:00 SR: Now, in the spirit of the third quote, we want to tell you about a recent project that we've been working on in lab that we've called Project Inception.

11:07 ["They should make a movie about this. Where they plant ideas into peoples minds, so they can control them for their own personal gain. We'll call it: Inception."]

11:10 So we reasoned that now that we can reactivate a memory, what if we do so but then begin to tinker with that memory? Could we possibly even turn it into a false memory?

11:20 XL: So all memory is sophisticated and dynamic, but if just for simplicity, let's imagine memory as a movie clip. So far what we've told you is basically we can control this "play" button of the clip so that we can play this video clip any time, anywhere. But is there a possibility that we can actually get inside the brain and edit this movie clip so that we can make it different from the original? Yes we can. Turned out that all we need to do is basically reactivate a memory using lasers just like we did before, but at the same time, if we present new information and allow this new information to incorporate into this old memory, this will change the memory. It's sort of like making a remix tape.

12:08 SR: So how do we do this? Rather than finding a fear memory in the brain, we can start by taking our animals, and let's say we put them in a blue box like this blue box here and we find the brain cells that represent that blue box and we trick them to respond to pulses of light exactly like we had said before. Now the next day, we can take our animals and place them in a red box that they've never experienced before. We can shoot light into the brain to reactivate the memory of the blue box. So what would happen here if, while the animal is recalling the memory of the blue box, we gave it a couple of mild foot shocks? So here we're trying to artificially make an association between the memory of the blue box and the foot shocks themselves. We're just trying to connect the two. So to test if we had done so, we can take our animals once again and place them back in the blue box. Again, we had just reactivated the memory of the blue box while the animal got a couple of mild foot shocks, and now the animal suddenly freezes. It's as though it's recalling being mildly shocked in this environment even though that never actually happened. So it formed a false memory, because it's falsely fearing an environment where, technically speaking, nothing bad actually happened to it.

13:13 XL: So, so far we are only talking about this light-controlled "on" switch. In fact, we also have a light-controlled "off" switch, and it's very easy to imagine that by installing this light-controlled "off" switch, we can also turn off a memory, any time, anywhere. So everything we've been talking about today is based on this philosophically charged principle of neuroscience
je bazirano na filozofski nastrojenom principu neuroznanosti
13:38
that the mind, with its seemingly mysterious properties,
318
818410
4094
da je um, sa svojim naizgled misterioznim svojstvima,
13:42
is actually made of physical stuff that we can tinker with.
319
822528
3621
zapravo napravljen od fizičke tvari kojom možemo petljati.
13:46
SR: And for me personally,
320
826173
1451
SR: I za mene osobno,
13:47
I see a world where we can reactivate
321
827648
1762
vidim svijet gdje možemo ponovno aktivirati
13:49
any kind of memory that we'd like.
322
829434
1887
bilo koju uspomenu koju želimo.
13:51
I also see a world where we can erase unwanted memories.
323
831345
3274
Također vidim svijet gdje možemo obrisati neželjene uspomene.
13:54
Now, I even see a world where editing memories
324
834643
2143
Čak vidim i svijet gdje je uređivanje uspomena
13:56
is something of a reality,
325
836810
1284
nešto realno,
13:58
because we're living in a time where it's possible
326
838118
1681
jer živimo u vremenu kada je moguće
13:59
to pluck questions from the tree of science fiction
327
839823
2429
ubrati pitanja s drveta znanstvene fantastike
14:02
and to ground them in experimental reality.
328
842276
2168
i prizemljiti ih eksperimentalnom stvarnošću.
14:04
XL: Nowadays, people in the lab
329
844468
1859
XL: Danas ljudi u laboratorijima
14:06
and people in other groups all over the world
330
846351
2362
i ljudi u drugim grupama svuda po svijetu
14:08
are using similar methods to activate or edit memories,
331
848737
3793
koriste slične metode kako bi aktivirali ili obradili uspomene
14:12
whether that's old or new, positive or negative,
332
852554
3817
bilo da su stare ili nove, pozitivne ili negativne,
14:16
all sorts of memories so that we can understand
333
856395
2648
različite vrste uspomena kako bi razumjeli
14:19
how memory works.
334
859067
1840
kako uspomene rade.
14:20
SR: For example, one group in our lab
335
860931
1760
SR: Na primjer, jedna grupa u našem laboratoriju
14:22
was able to find the brain cells that make up a fear memory
336
862715
2614
je uspjela pronaći moždane stanice koje čine uspomene na strah
14:25
and converted them into a pleasurable memory, just like that.
337
865353
2751
i prevela ih je u ugodne uspomene, samo tako.
14:28
That's exactly what I mean about editing these kinds of processes.
338
868128
3143
Upravo to mislim o obrađivanju takvih procesa.
14:31
Now one dude in lab was even able to reactivate
339
871295
2239
Jedan lik je u laboratoriju čak uspio ponovno aktivirati
14:33
memories of female mice in male mice,
340
873558
1921
uspomene ženskog miša u muškom mišu,
14:35
which rumor has it is a pleasurable experience.
341
875503
2971
a glasine govore da je to ugodno iskustvo.
14:38
XL: Indeed, we are living in a very exciting moment
342
878498
4093
XL: Uistinu, živimo u vrlo uzbudljivom trenutku
14:42
where science doesn't have any arbitrary speed limits
343
882615
3781
gdje znanost nema određenu brzinu,
14:46
but is only bound by our own imagination.
344
886420
3163
već je jedino ograničena našom maštom.
14:49
SR: And finally, what do we make of all this?
345
889607
2143
SR: I konačno, što možemo zaključiti iz svega ovoga?
14:51
How do we push this technology forward?
346
891774
1927
Kako progurati ovu tehnologiju naprijed?
14:53
These are the questions that should not remain
347
893725
2191
Ova pitanje ne trebaju ostati
14:55
just inside the lab,
348
895940
1273
samo unutar laboratorija,
14:57
and so one goal of today's talk was to bring everybody
349
897237
2572
a jedan od ciljeva današnjeg govora je da svima
14:59
up to speed with the kind of stuff that's possible
350
899833
2381
pokažemo što je sve moguće
15:02
in modern neuroscience,
351
902238
1250
u modernoj neuroznanosti,
15:03
but now, just as importantly,
352
903512
1486
ali sad, jednako važno,
15:05
to actively engage everybody in this conversation.
353
905022
3308
da aktivno uključimo sve u ovaj razgovor.
15:08
So let's think together as a team about what this all means
354
908354
2762
Hajdemo zajedno kao tim razmisliti o tome što sve ovo znači
i gdje možemo i trebamo ići dalje,
15:11
and where we can and should go from here,
355
911140
1993
15:13
because Xu and I think we all have
356
913157
2074
jer Xu i ja mislimo da su ispred nas
15:15
some really big decisions ahead of us.
357
915255
2512
neke velike odluke.
15:17
Thank you. XL: Thank you.
358
917791
1101
Hvala vam. XL: Hvala vam.
15:18
(Applause)
359
918916
1634
(Pljesak)