What we'll learn about the brain in the next century | Sam Rodriques

173,708 views ・ 2018-07-03

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: Yoonyoung Chang / Review: TJ Kim
00:13
I want to tell you guys something about neuroscience. I'm a physicist by training. About three years ago, I left physics to come and try to understand how the brain works. And this is what I found.

00:24
Lots of people are working on depression. And that's really good, depression is something that we really want to understand. Here's how you do it: you take a jar and you fill it up, about halfway, with water. And then you take a mouse, and you put the mouse in the jar, OK? And the mouse swims around for a little while, and then at some point, the mouse gets tired and decides to stop swimming. And when it stops swimming, that's depression. OK?
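What he is describing is the forced swim test, where the standard readout of "depression" is simply how long the animal stays immobile. A minimal sketch of that scoring, with made-up tracking data, frame rate, and threshold, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

fps = 25                                    # hypothetical camera frame rate
speed = rng.gamma(2.0, 1.0, 6 * 60 * fps)   # toy per-frame speed (cm/s), 6 min test
speed[4500:] *= 0.05                        # the mouse slows down late in the test

immobile = speed < 0.5                      # "stopped swimming" threshold, cm/s
immobility_time = immobile.sum() / fps      # total seconds spent immobile

print(f"Immobility: {immobility_time:.0f} s out of {speed.size / fps:.0f} s")
```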
00:52
And I'm from theoretical physics, so I'm used to people making very sophisticated mathematical models to precisely describe physical phenomena, so when I saw that this is the model for depression, I thought to myself, "Oh my God, we have a lot of work to do."

(Laughter)
01:09
But this is a kind of general problem in neuroscience. So for example, take emotion. Lots of people want to understand emotion. But you can't study emotion in mice or monkeys, because you can't ask them how they're feeling or what they're experiencing. So instead, people who want to understand emotion typically end up studying what's called motivated behavior, which is code for "what the mouse does when it really, really wants cheese."

01:33
OK, I could go on and on.
01:35
I mean, the point is, the NIH spends about 5.5 billion dollars a year on neuroscience research. And yet there have been almost no significant improvements in outcomes for patients with brain diseases in the past 40 years. And I think a lot of that is basically due to the fact that mice might be OK as a model for cancer or diabetes, but the mouse brain is just not sophisticated enough to reproduce human psychology or human brain disease. OK?

02:05
So if the mouse models are so bad, why are we still using them?
02:10
Well, it basically boils down to this: the brain is made up of neurons, which are these little cells that send electrical signals to each other. If you want to understand how the brain works, you have to be able to measure the electrical activity of these neurons. But to do that, you have to get really close to the neurons with some kind of electrical recording device or a microscope. And so you can do that in mice and you can do it in monkeys, because you can physically put things into their brain, but for some reason we still can't do that in humans, OK?

02:40
So instead, we've invented all these proxies.
02:43
So the most popular one is probably this: functional MRI, fMRI, which allows you to make these pretty pictures like this, that show which parts of your brain light up when you're engaged in different activities. But this is a proxy. You're not actually measuring neural activity here. What you're doing is you're measuring, essentially, like, blood flow in the brain. Where there's more blood. It's actually where there's more oxygen, but you get the idea, OK?
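To make the "proxy" point concrete, here is a toy model of my own (nothing from the talk): the measured BOLD signal is, roughly, neural activity convolved with a slow hemodynamic response, so bursts of spiking that are close in time smear into a single sluggish bump.

```python
import numpy as np

dt = 0.1                                 # seconds per sample
t = np.arange(0, 30, dt)                 # 30 s of toy "recording"

# Toy neural activity: three brief bursts of firing
neural = np.zeros_like(t)
for burst in (2.0, 10.0, 10.5):
    neural[int(burst / dt)] = 1.0

# Gamma-shaped kernel standing in for the hemodynamic response,
# peaking about 5 s after each burst
hrf_t = np.arange(0, 20, dt)
hrf = hrf_t**5 * np.exp(-hrf_t)
hrf /= hrf.max()

bold = np.convolve(neural, hrf)[: t.size]   # what the scanner "sees"

# The bursts at 10.0 s and 10.5 s merge into one delayed BOLD peak:
print(f"Global BOLD peak at t = {t[np.argmax(bold)]:.1f} s")
```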
03:10
The other thing that you can do is you can do this -- electroencephalography -- you can put these electrodes on your head, OK? And then you can measure your brain waves. And here, you're actually measuring electrical activity. But you're not measuring the activity of neurons. You're measuring these electrical currents, sloshing back and forth in your brain.
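A minimal sketch of why scalp recordings can't resolve neurons, under a toy linear mixing assumption (the numbers and the random "lead field" here are all made up): each channel is one weighted sum over thousands of sources, so the inverse problem is hopelessly underdetermined.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_channels, n_samples = 10_000, 64, 250

# Hypothetical lead field: how strongly each current source couples to
# each scalp electrode (set by head geometry in reality; random here)
lead_field = rng.normal(size=(n_channels, n_sources)) / n_sources

currents = rng.normal(size=(n_sources, n_samples))  # toy source currents

# Each EEG channel is a single weighted sum over all 10,000 sources
scalp_eeg = lead_field @ currents                   # shape (64, 250)

# Recovering `currents` from `scalp_eeg` means solving 64 equations for
# 10,000 unknowns per sample, which is why EEG can't see single neurons.
print(scalp_eeg.shape)
```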
03:30
So the point is just that these technologies that we have are really measuring the wrong thing. Because, for most of the diseases that we want to understand -- like, Parkinson's is the classic example. In Parkinson's, there's one particular kind of neuron deep in your brain that is responsible for the disease, and these technologies just don't have the resolution that you need to get at that. And so that's why we're still stuck with the animals. Not that anyone wants to be studying depression by putting mice into jars, right? It's just that there's this pervasive sense that it's not possible to look at the activity of neurons in healthy humans.
04:08
So here's what I want to do. I want to take you into the future, to have a look at one way in which I think it could potentially be possible. And I want to preface this by saying, I don't have all the details. So I'm just going to provide you with a kind of outline. But we're going to go to the year 2100.

04:27
Now what does the year 2100 look like? Well, to start with, the climate is a bit warmer than what you're used to.

(Laughter)
04:37
And that robotic vacuum cleaner that you know and love went through a few generations, and the improvements were not always so good.

(Laughter)

04:48
It was not always for the better.

04:52
But actually, in the year 2100, most things are surprisingly recognizable. It's just the brain is totally different.
05:00
For example, in the year 2100, we understand the root causes of Alzheimer's. So we can deliver targeted genetic therapies or drugs to stop the degenerative process before it begins.

05:13
So how did we do it? Well, there were essentially three steps. The first step was that we had to figure out some way to get electrical connections through the skull so we could measure the electrical activity of neurons. And not only that, it had to be easy and risk-free -- something that basically anyone would be OK with, like getting a piercing. Because back in 2017, the only way that we knew of to get through the skull was to drill these holes the size of quarters. You would never let someone do that to you.
05:48
So in the 2020s, people began to experiment -- rather than drilling these gigantic holes, drilling microscopic holes, no thicker than a piece of hair. And the idea here was really for diagnosis -- there are lots of times in the diagnosis of brain disorders when you would like to be able to look at the neural activity beneath the skull, and being able to drill these microscopic holes would make that much easier for the patient. In the end, it would be like getting a shot. You just go in and you sit down, and there's a thing that comes down on your head, and a momentary sting, and then it's done, and you can go back about your day.
06:24
So we're eventually able to do it using lasers to drill the holes. And with the lasers, it was fast and extremely reliable; you couldn't even tell the holes were there, any more than you could tell that one of your hairs was missing.

06:40
And I know it might sound crazy, using lasers to drill holes in your skull, but back in 2017, people were OK with surgeons shooting lasers into their eyes for corrective surgery. So when you're already here, it's not that big of a step. OK?
06:58
So the next step, that happened in the 2030s, was that it's not just about getting through the skull. To measure the activity of neurons, you have to actually make it into the brain tissue itself. And the risk, whenever you put something into the brain tissue, is essentially that of stroke: that you would hit a blood vessel and burst it, and that causes a stroke. So, by the mid-2030s, we had invented these flexible probes that were capable of going around blood vessels, rather than through them. And thus, we could put huge batteries of these probes into the brains of patients and record from thousands of their neurons without any risk to them.
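The talk doesn't say how the recording itself works; as a hedged sketch of the standard extracellular approach such probes would rely on, spikes are detected by thresholding the voltage trace at a few robust standard deviations below the noise floor.

```python
import numpy as np

rng = np.random.default_rng(1)

fs = 30_000                     # samples/s, typical for extracellular probes
t = np.arange(0, 1.0, 1 / fs)   # one second on one hypothetical channel

# Toy extracellular trace: Gaussian noise plus a few injected "spikes"
trace = rng.normal(0, 5e-6, t.size)               # ~5 uV noise floor
for st in (0.12, 0.45, 0.46, 0.80):               # spike times, seconds
    i = int(st * fs)
    trace[i : i + 30] -= 60e-6 * np.hanning(30)   # ~60 uV downward deflection

# Classic rule of thumb: threshold at ~5 robust standard deviations
sigma = np.median(np.abs(trace)) / 0.6745         # robust noise estimate
threshold = -5 * sigma
crossings = np.flatnonzero((trace[1:] < threshold) & (trace[:-1] >= threshold))

print("Detected spikes at t =", np.round(crossings / fs, 3))
```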
07:39
And what we discovered, sort of to our surprise, is that the neurons that we could identify were not responding to things like ideas or emotion, which was what we had expected. They were mostly responding to things like Jennifer Aniston or Halle Berry or Justin Trudeau. I mean --

(Laughter)

08:02
In hindsight, we shouldn't have been that surprised. I mean, what do your neurons spend most of their time thinking about?

(Laughter)
08:09
But really, the point is that this technology enabled us to begin studying neuroscience in individuals. So much like the transition to genetics at the single-cell level, we started to study neuroscience at the single-human level. But we weren't quite there yet. Because these technologies were still restricted to medical applications, which meant that we were studying sick brains, not healthy brains.

08:35
Because no matter how safe your technology is, you can't stick something into someone's brain for research purposes. They have to want it. And why would they want it? Because as soon as you have an electrical connection to the brain, you can use it to hook the brain up to a computer.
08:53
Oh, well, you know, the general public was very skeptical at first. I mean, who wants to hook their brain up to their computers? Well, just imagine being able to send an email with a thought.

(Laughter)

09:06
Imagine being able to take a picture with your eyes, OK?

(Laughter)

09:12
Imagine never forgetting anything anymore, because anything that you choose to remember will be stored permanently on a hard drive somewhere, able to be recalled at will.

(Laughter)
09:25
The line here between crazy and visionary was never quite clear. But the systems were safe. So when the FDA decided to deregulate these laser-drilling systems in 2043, commercial demand just exploded. People started signing their emails, "Please excuse any typos. Sent from my brain."

(Laughter)

09:45
Commercial systems popped up left and right, offering the latest and greatest in neural interfacing technology. There were 100 electrodes. A thousand electrodes. High bandwidth for only 99.99 a month.

(Laughter)
09:59
Soon, everyone had them. And that was the key. Because, in the 2050s, if you were a neuroscientist, you could have someone come into your lab essentially from off the street. And you could have them engaged in some emotional task or social behavior or abstract reasoning, things you could never study in mice. And you could record the activity of their neurons using the interfaces that they already had. And then you could also ask them about what they were experiencing. So this link between psychology and neuroscience that you could never make in the animals was suddenly there.
10:35
So perhaps the classic example of this was the discovery of the neural basis for insight: that "Aha!" moment, the moment it all comes together, it clicks. And this was discovered by two scientists in 2055, Barry and Late, who observed, in the dorsal prefrontal cortex, how in the brain of someone trying to understand an idea, different populations of neurons would reorganize themselves -- you're looking at neural activity here in orange -- until finally their activity aligns in a way that leads to positive feedback. Right there. That is understanding.

11:15
So finally, we were able to get at the things that make us human.
11:21
And that's what really opened the way to major insights from medicine. Because, starting in the 2060s, with the ability to record the neural activity in the brains of patients with these different mental diseases, rather than defining the diseases on the basis of their symptoms, as we had at the beginning of the century, we started to define them on the basis of the actual pathology that we observed at the neural level.
11:48
So for example, in the case of ADHD, we discovered that there are dozens of different diseases, all of which had been called ADHD at the start of the century, that actually had nothing to do with each other, except that they had similar symptoms. And they needed to be treated in different ways. So it was kind of incredible, in retrospect, that at the beginning of the century, we had been treating all those different diseases with the same drug: basically, we were just giving people amphetamine. And schizophrenia and depression are the same way. So rather than prescribing drugs to people essentially at random, as we had, we learned how to predict which drugs would be most effective in which patients, and that just led to this huge improvement in outcomes.
12:33
OK, I want to bring you back now to the year 2017. Some of this may sound satirical or even far-fetched. And some of it is. I mean, I can't actually see into the future. I don't actually know if we're going to be drilling hundreds or thousands of microscopic holes in our heads in 30 years. But what I can tell you is that we're not going to make any progress towards understanding the human brain or human diseases until we figure out how to get at the electrical activity of neurons in healthy humans. And almost no one is working on figuring out how to do that today.
13:12
That is the future of neuroscience. And I think it's time for neuroscientists to put down the mouse brain and to dedicate the thought and investment necessary to understand the human brain and human disease.

13:27
Thank you.

(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7