Steve Ramirez and Xu Liu: A mouse. A laser beam. A manipulated memory.

2013-08-15

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

00:12
Steve Ramirez: My first year of grad school, I found myself in my bedroom eating lots of Ben & Jerry's, watching some trashy TV, and maybe, maybe listening to Taylor Swift. I had just gone through a breakup.

00:24
(Laughter)

00:26
So for the longest time, all I would do is recall the memory of this person over and over again, wishing that I could get rid of that gut-wrenching, visceral "blah" feeling.

00:37
Now, as it turns out, I'm a neuroscientist, so I knew that the memory of that person and the awful, emotional undertones that color in that memory are largely mediated by separate brain systems. And so I thought, what if we could go into the brain and edit out that nauseating feeling but while keeping the memory of that person intact? Then I realized, maybe that's a little bit lofty for now. So what if we could start off by going into the brain and just finding a single memory to begin with? Could we jump-start that memory back to life, maybe even play with the contents of that memory?

01:09
All that said, there is one person in the entire world right now that I really hope is not watching this talk.

01:13
(Laughter)

01:17
So there is a catch. There is a catch. These ideas probably remind you of "Total Recall," "Eternal Sunshine of the Spotless Mind," or of "Inception." But the movie stars that we work with are the celebrities of the lab.

01:30
Xu Liu: Test mice.

01:32
(Laughter)

01:33
As neuroscientists, we work in the lab with mice trying to understand how memory works. And today, we hope to convince you that now we are actually able to activate a memory in the brain at the speed of light. To do this, there's only two simple steps to follow. First, you find and label a memory in the brain, and then you activate it with a switch. As simple as that.

01:59
(Laughter)

02:01
SR: Are you convinced?

02:03
So, turns out finding a memory in the brain isn't all that easy.

02:07
XL: Indeed. This is way more difficult than, let's say, finding a needle in a haystack, because at least, you know, the needle is still something you can physically put your fingers on. But memory is not. And also, there's way more cells in your brain than the number of straws in a typical haystack.

02:27
So yeah, this task does seem to be daunting. But luckily, we got help from the brain itself. It turned out that all we need to do is basically to let the brain form a memory, and then the brain will tell us which cells are involved in that particular memory.

02:44
SR: So what was going on in my brain while I was recalling the memory of an ex? If you were to just completely ignore human ethics for a second and slice up my brain right now, you would see that there was an amazing number of brain regions that were active while recalling that memory. Now one brain region that would be robustly active in particular is called the hippocampus, which for decades has been implicated in processing the kinds of memories that we hold near and dear, which also makes it an ideal target to go into and to try and find and maybe reactivate a memory.

03:13
XL: When you zoom in into the hippocampus, of course you will see lots of cells, but we are able to find which cells are involved in a particular memory, because whenever a cell is active, like when it's forming a memory, it will also leave a footprint that will later allow us to know these cells are recently active.

03:32
SR: So the same way that building lights at night let you know that somebody's probably working there at any given moment, in a very real sense, there are biological sensors within a cell that are turned on only when that cell was just working. They're sort of biological windows that light up to let us know that that cell was just active.

03:49
XL: So we clipped part of this sensor, and attached that to a switch to control the cells, and we packed this switch into an engineered virus and injected that into the brain of the mice. So whenever a memory is being formed, any active cells for that memory will also have this switch installed.

04:09
SR: So here is what the hippocampus looks like after forming a fear memory, for example. The sea of blue that you see here are densely packed brain cells, but the green brain cells, the green brain cells are the ones that are holding on to a specific fear memory. So you are looking at the crystallization of the fleeting formation of fear. You're actually looking at the cross-section of a memory right now.

04:31
XL: Now, for the switch we have been talking about, ideally, the switch has to act really fast. It shouldn't take minutes or hours to work. It should act at the speed of the brain, in milliseconds.

04:43
SR: So what do you think, Xu? Could we use, let's say, pharmacological drugs to activate or inactivate brain cells?

04:49
XL: Nah. Drugs are pretty messy. They spread everywhere. And also it takes them forever to act on cells. So it will not allow us to control a memory in real time. So Steve, how about let's zap the brain with electricity?

05:04
SR: So electricity is pretty fast, but we probably wouldn't be able to target it to just the specific cells that hold onto a memory, and we'd probably fry the brain.

05:12
XL: Oh. That's true. So it looks like, hmm, indeed we need to find a better way to impact the brain at the speed of light.

05:21
SR: So it just so happens that light travels at the speed of light. So maybe we could activate or inactivate memories by just using light --

05:31
XL: That's pretty fast.

05:33
SR: -- and because normally brain cells don't respond to pulses of light, so those that would respond to pulses of light are those that contain a light-sensitive switch. Now to do that, first we need to trick brain cells to respond to laser beams.

05:44
XL: Yep. You heard it right. We are trying to shoot lasers into the brain.

05:47
(Laughter)

05:49
SR: And the technique that lets us do that is optogenetics. Optogenetics gave us this light switch that we can use to turn brain cells on or off, and the name of that switch is channelrhodopsin, seen here as these green dots attached to this brain cell. You can think of channelrhodopsin as a sort of light-sensitive switch that can be artificially installed in brain cells so that now we can use that switch to activate or inactivate the brain cell simply by clicking it, and in this case we click it on with pulses of light.

06:15
XL: So we attach this light-sensitive switch of channelrhodopsin to the sensor we've been talking about and inject this into the brain. So whenever a memory is being formed, any active cell for that particular memory will also have this light-sensitive switch installed in it so that we can control these cells by the flipping of a laser just like this one you see.

06:39
SR: So let's put all of this to the test now. What we can do is we can take our mice and then we can put them in a box that looks exactly like this box here, and then we can give them a very mild foot shock so that they form a fear memory of this box. They learn that something bad happened here. Now with our system, the cells that are active in the hippocampus in the making of this memory, only those cells will now contain channelrhodopsin.

07:02
XL: When you are as small as a mouse, it feels as if the whole world is trying to get you. So your best response of defense is trying to be undetected. Whenever a mouse is in fear, it will show this very typical behavior by staying at one corner of the box, trying to not move any part of its body, and this posture is called freezing. So if a mouse remembers that something bad happened in this box, and when we put them back into the same box, it will basically show freezing because it doesn't want to be detected by any potential threats in this box.

07:38
SR: So you can think of freezing as, you're walking down the street minding your own business, and then out of nowhere you almost run into an ex-girlfriend or ex-boyfriend, and now those terrifying two seconds where you start thinking, "What do I do? Do I say hi? Do I shake their hand? Do I turn around and run away? Do I sit here and pretend like I don't exist?" Those kinds of fleeting thoughts that physically incapacitate you, that temporarily give you that deer-in-headlights look.

07:59
XL: However, if you put the mouse in a completely different new box, like the next one, it will not be afraid of this box because there's no reason that it will be afraid of this new environment. But what if we put the mouse in this new box but at the same time, we activate the fear memory using lasers just like we did before? Are we going to bring back the fear memory for the first box into this completely new environment?

08:29
SR: All right, and here's the million-dollar experiment. Now to bring back to life the memory of that day, I remember that the Red Sox had just won, it was a green spring day, perfect for going up and down the river and then maybe going to the North End to get some cannolis, #justsaying. Now Xu and I, on the other hand, were in a completely windowless black room not making any ocular movement that even remotely resembles an eye blink because our eyes were fixed onto a computer screen. We were looking at this mouse here trying to activate a memory for the first time using our technique.

09:02
XL: And this is what we saw. When we first put the mouse into this box, it's exploring, sniffing around, walking around, minding its own business, because actually by nature, mice are pretty curious animals. They want to know, what's going on in this new box? It's interesting. But the moment we turned on the laser, like you see now, all of a sudden the mouse entered this freezing mode. It stayed here and tried not to move any part of its body. Clearly it's freezing. So indeed, it looks like we are able to bring back the fear memory for the first box in this completely new environment.

09:39
While watching this, Steve and I were as shocked as the mouse itself.

09:44
(Laughter)

09:45
So after the experiment, the two of us just left the room without saying anything. After a kind of long, awkward period of time, Steve broke the silence.

09:55
SR: "Did that just work?"

09:58
XL: "Yes," I said. "Indeed it worked!" We're really excited about this. And then we published our findings in the journal Nature. Ever since the publication of our work, we've been receiving numerous comments from all over the Internet. Maybe we can take a look at some of those.

10:18
["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. Ahhh the future is awesome"]

10:20
SR: So the first thing that you'll notice is that people have really strong opinions about this kind of work. Now I happen to completely agree with the optimism of this first quote, because on a scale of zero to Morgan Freeman's voice, it happens to be one of the most evocative accolades that I've heard come our way.

10:36
(Laughter)

10:37
But as you'll see, it's not the only opinion that's out there.

10:39
["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"]

10:41
XL: Indeed, if we take a look at the second one, I think we can all agree that it's, meh, probably not as positive. But this also reminds us that, although we are still working with mice, it's probably a good idea to start thinking and discussing about the possible ethical ramifications of memory control.

11:00
SR: Now, in the spirit of the third quote, we want to tell you about a recent project that we've been working on in lab that we've called Project Inception.

11:07
["They should make a movie about this. Where they plant ideas into peoples minds, so they can control them for their own personal gain. We'll call it: Inception."]

11:10
So we reasoned that now that we can reactivate a memory, what if we do so but then begin to tinker with that memory? Could we possibly even turn it into a false memory?

11:20
XL: So all memory is sophisticated and dynamic, but if just for simplicity, let's imagine memory as a movie clip. So far what we've told you is basically we can control this "play" button of the clip so that we can play this video clip any time, anywhere. But is there a possibility that we can actually get inside the brain and edit this movie clip so that we can make it different from the original? Yes we can. Turned out that all we need to do is basically reactivate a memory using lasers just like we did before, but at the same time, if we present new information and allow this new information to incorporate into this old memory, this will change the memory. It's sort of like making a remix tape.

12:08
SR: So how do we do this? Rather than finding a fear memory in the brain, we can start by taking our animals, and let's say we put them in a blue box like this blue box here and we find the brain cells that represent that blue box and we trick them to respond to pulses of light exactly like we had said before. Now the next day, we can take our animals and place them in a red box that they've never experienced before. We can shoot light into the brain to reactivate the memory of the blue box. So what would happen here if, while the animal is recalling the memory of the blue box, we gave it a couple of mild foot shocks? So here we're trying to artificially make an association between the memory of the blue box and the foot shocks themselves. We're just trying to connect the two.

12:47
So to test if we had done so, we can take our animals once again and place them back in the blue box. Again, we had just reactivated the memory of the blue box while the animal got a couple of mild foot shocks, and now the animal suddenly freezes. It's as though it's recalling being mildly shocked in this environment even though that never actually happened. So it formed a false memory, because it's falsely fearing an environment where, technically speaking, nothing bad actually happened to it.

13:13
XL: So, so far we are only talking about this light-controlled "on" switch. In fact, we also have a light-controlled "off" switch, and it's very easy to imagine that by installing this light-controlled "off" switch, we can also turn off a memory, any time, anywhere.

13:31
So everything we've been talking about today is based on this philosophically charged principle of neuroscience that the mind, with its seemingly mysterious properties, is actually made of physical stuff that we can tinker with.

13:46
SR: And for me personally, I see a world where we can reactivate any kind of memory that we'd like. I also see a world where we can erase unwanted memories. Now, I even see a world where editing memories is something of a reality, because we're living in a time where it's possible to pluck questions from the tree of science fiction and to ground them in experimental reality.

14:04
XL: Nowadays, people in the lab and people in other groups all over the world are using similar methods to activate or edit memories, whether that's old or new, positive or negative, all sorts of memories so that we can understand how memory works.

14:20
SR: For example, one group in our lab was able to find the brain cells that make up a fear memory and converted them into a pleasurable memory, just like that. That's exactly what I mean about editing these kinds of processes. Now one dude in lab was even able to reactivate memories of female mice in male mice, which rumor has it is a pleasurable experience.

14:38
XL: Indeed, we are living in a very exciting moment where science doesn't have any arbitrary speed limits but is only bound by our own imagination.

14:49
SR: And finally, what do we make of all this? How do we push this technology forward? These are the questions that should not remain just inside the lab, and so one goal of today's talk was to bring everybody up to speed with the kind of stuff that's possible in modern neuroscience, but now, just as importantly, to actively engage everybody in this conversation. So let's think together as a team about what this all means and where we can and should go from here, because Xu and I think we all have some really big decisions ahead of us. Thank you.

15:17
XL: Thank you.

15:18
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7