Jeff Hawkins: How brain science will change computing

TED · 2007-05-23

00:25
I do two things:
00:26
I design mobile computers and I study brains.
00:28
Today's talk is about brains and -- (Audience member cheers)
00:31
Yay! I have a brain fan out there.
00:33
(Laughter)
00:36
If I could have my first slide,
00:38
you'll see the title of my talk and my two affiliations.
00:41
So what I'm going to talk about is why we don't have a good brain theory,
00:44
why it is important that we should develop one
00:47
and what we can do about it.
00:48
I'll try to do all that in 20 minutes.
00:50
I have two affiliations.
00:51
Most of you know me from my Palm and Handspring days,
00:54
but I also run a nonprofit scientific research institute
00:56
called the Redwood Neuroscience Institute in Menlo Park.
00:59
We study theoretical neuroscience and how the neocortex works.
01:02
I'm going to talk all about that.
01:04
I have one slide on my other life, the computer life,
01:07
and that's this slide here.
01:08
These are some of the products I've worked on over the last 20 years,
01:11
starting from the very original laptop
01:13
to some of the first tablet computers
01:15
and so on, ending up most recently with the Treo,
01:17
and we're continuing to do this.
01:19
I've done this because I believe mobile computing
01:21
is the future of personal computing,
01:23
and I'm trying to make the world a little bit better
01:25
by working on these things.
01:27
But this was, I admit, all an accident.
01:29
I really didn't want to do any of these products.
01:31
Very early in my career
01:32
I decided I was not going to be in the computer industry.
01:35
Before that, I just have to tell you
01:37
about this picture of Graffiti I picked off the web the other day.
01:40
I was looking for a picture of Graffiti, that little text input language.
01:43
I found a website dedicated to teachers who want to make script-writing things
01:47
across the top of their blackboard,
01:49
and they had added Graffiti to it, and I'm sorry about that.
01:52
(Laughter)
01:54
So what happened was,
01:55
when I was young and got out of engineering school at Cornell in '79,
02:00
I went to work for Intel and was in the computer industry,
02:03
and three months into that, I fell in love with something else.
02:07
I said, "I made the wrong career choice here,"
02:10
and I fell in love with brains.
02:12
This is not a real brain.
02:14
This is a picture of one, a line drawing.
02:16
And I don't remember exactly how it happened,
02:19
but I have one recollection, which was pretty strong in my mind.
02:22
In September of 1979,
02:24
Scientific American came out with a single-topic issue about the brain.
02:27
It was one of their best issues ever.
02:29
They talked about the neuron, development, disease, vision
02:32
and all the things you might want to know about brains.
02:35
It was really quite impressive.
02:36
One might've had the impression we knew a lot about brains.
02:39
But the last article in that issue was written by Francis Crick of DNA fame.
02:43
Today is, I think, the 50th anniversary of the discovery of DNA.
02:46
And he wrote a story basically saying, this is all well and good,
02:49
but you know, we don't know diddly squat about brains,
02:52
and no one has a clue how they work,
02:54
so don't believe what anyone tells you.
02:56
This is a quote from that article, he says:
02:58
"What is conspicuously lacking" -- he's a very proper British gentleman --
03:02
"What is conspicuously lacking is a broad framework of ideas
03:05
in which to interpret these different approaches."
03:07
I thought the word "framework" was great.
03:09
He didn't say we didn't have a theory.
03:11
He says we don't even know how to begin to think about it.
03:14
We don't even have a framework.
03:16
We are in the pre-paradigm days, if you want to use Thomas Kuhn.
03:19
So I fell in love with this.
03:20
I said, look: We have all this knowledge about brains -- how hard can it be?
03:24
It's something we can work on in my lifetime; I could make a difference.
03:27
So I tried to get out of the computer business, into the brain business.
03:31
First, I went to MIT, the AI lab was there.
03:33
I said, I want to build intelligent machines too,
03:35
but I want to study how brains work first.
03:38
And they said, "Oh, you don't need to do that.
03:40
You're just going to program computers, that's all."
03:42
I said, you really ought to study brains.
03:44
They said, "No, you're wrong."
03:46
I said, "No, you're wrong," and I didn't get in.
03:48
(Laughter)
03:49
I was a little disappointed -- pretty young --
03:51
but I went back again a few years later,
03:53
this time in California, and I went to Berkeley.
03:56
And I said, I'll go in from the biological side.
03:58
So I got in the PhD program in biophysics.
04:01
I was like, I'm studying brains now. Well, I want to study theory.
04:05
They said, "You can't study theory about brains.
04:07
You can't get funded for that.
04:09
And as a graduate student, you can't do that."
04:11
So I said, oh my gosh.
04:13
I was depressed; I said, but I can make a difference in this field.
04:16
I went back in the computer industry
04:18
and said, I'll have to work here for a while.
04:20
That's when I designed all those computer products.
04:22
(Laughter)
04:24
I said, I want to do this for four years, make some money,
04:27
I was having a family, and I would mature a bit,
04:31
and maybe the business of neuroscience would mature a bit.
04:33
Well, it took longer than four years. It's been about 16 years.
04:36
But I'm doing it now, and I'm going to tell you about it.
04:39
So why should we have a good brain theory?
04:41
Well, there's lots of reasons people do science.
04:45
The most basic one is, people like to know things.
04:47
We're curious, and we go out and get knowledge.
04:50
Why do we study ants? It's interesting.
04:52
Maybe we'll learn something useful, but it's interesting and fascinating.
04:55
But sometimes a science has other attributes
04:57
which makes it really interesting.
04:59
Sometimes a science will tell something about ourselves;
05:02
it'll tell us who we are.
05:03
Evolution did this and Copernicus did this,
05:06
where we have a new understanding of who we are.
05:08
And after all, we are our brains. My brain is talking to your brain.
05:12
Our bodies are hanging along for the ride,
05:14
but my brain is talking to your brain.
05:15
And if we want to understand who we are and how we feel and perceive,
05:19
we need to understand brains.
05:20
Another thing is sometimes science leads to big societal benefits, technologies,
05:24
or businesses or whatever.
05:25
This is one, too, because when we understand how brains work,
05:28
we'll be able to build intelligent machines.
05:30
That's a good thing on the whole,
05:32
with tremendous benefits to society,
05:34
just like a fundamental technology.
05:36
So why don't we have a good theory of brains?
05:38
People have been working on it for 100 years.
05:41
Let's first take a look at what normal science looks like.
05:43
This is normal science.
05:45
Normal science is a nice balance between theory and experimentalists.
05:49
The theorist guy says, "I think this is what's going on,"
05:51
the experimentalist says, "You're wrong."
05:53
It goes back and forth, this works in physics, this in geology.
05:56
But if this is normal science, what does neuroscience look like?
05:59
This is what neuroscience looks like.
06:01
We have this mountain of data,
06:03
which is anatomy, physiology and behavior.
06:05
You can't imagine how much detail we know about brains.
06:08
There were 28,000 people who went to the neuroscience conference this year,
06:12
and every one of them is doing research in brains.
06:14
A lot of data, but no theory.
06:16
There's a little wimpy box on top there.
06:18
And theory has not played a role in any sort of grand way
06:21
in the neurosciences.
06:23
And it's a real shame.
06:24
Now, why has this come about?
06:25
If you ask neuroscientists why is this the state of affairs,
06:28
first, they'll admit it.
06:30
But if you ask them, they say,
06:31
there's various reasons we don't have a good brain theory.
06:34
Some say we still don't have enough data,
06:36
we need more information, there's all these things we don't know.
06:39
Well, I just told you there's data coming out of your ears.
06:42
We have so much information, we don't even know how to organize it.
06:45
What good is more going to do?
06:46
Maybe we'll be lucky and discover some magic thing, but I don't think so.
06:50
This is a symptom of the fact that we just don't have a theory.
06:53
We don't need more data, we need a good theory.
06:56
Another one is sometimes people say,
06:57
"Brains are so complex, it'll take another 50 years."
07:01
I even think Chris said something like this yesterday, something like,
07:04
it's one of the most complicated things in the universe.
07:07
That's not true -- you're more complicated than your brain.
07:09
You've got a brain.
07:11
And although the brain looks very complicated,
07:13
things look complicated until you understand them.
07:15
That's always been the case.
07:16
So we can say, my neocortex, the part of the brain I'm interested in,
07:20
has 30 billion cells.
07:21
But, you know what? It's very, very regular.
07:23
In fact, it looks like it's the same thing repeated over and over again.
07:27
It's not as complex as it looks. That's not the issue.
07:29
Some people say, brains can't understand brains.
07:32
Very Zen-like. Woo.
07:34
(Laughter)
07:36
You know, it sounds good, but why? I mean, what's the point?
07:39
It's just a bunch of cells. You understand your liver.
07:41
It's got a lot of cells in it too, right?
07:43
So, you know, I don't think there's anything to that.
07:46
And finally, some people say,
07:48
"I don't feel like a bunch of cells -- I'm conscious.
07:51
I've got this experience, I'm in the world.
07:53
I can't be just a bunch of cells."
07:55
Well, people used to believe there was a life force to be living,
07:58
and we now know that's really not true at all.
08:01
And there's really no evidence,
08:03
other than that people just disbelieve that cells can do what they do.
08:06
So some people have fallen into the pit of metaphysical dualism,
08:09
some really smart people, too, but we can reject all that.
08:12
(Laughter)
08:15
No, there's something else,
08:16
something really fundamental, and it is:
08:19
another reason why we don't have a good brain theory
08:21
is because we have an intuitive, strongly held but incorrect assumption
08:27
that has prevented us from seeing the answer.
08:29
There's something we believe that just, it's obvious, but it's wrong.
08:32
Now, there's a history of this in science and before I tell you what it is,
08:36
I'll tell you about the history of it in science.
08:38
Look at other scientific revolutions --
08:40
the solar system, that's Copernicus,
08:42
Darwin's evolution, and tectonic plates, that's Wegener.
08:46
They all have a lot in common with brain science.
08:48
First, they had a lot of unexplained data. A lot of it.
08:51
But it got more manageable once they had a theory.
08:53
The best minds were stumped -- really smart people.
08:56
We're not smarter now than they were then;
08:58
it just turns out it's really hard to think of things,
09:01
but once you've thought of them, it's easy to understand.
09:03
My daughters understood these three theories,
09:06
in their basic framework, in kindergarten.
09:08
It's not that hard -- here's the apple, here's the orange,
09:11
the Earth goes around, that kind of stuff.
09:14
Another thing is the answer was there all along,
09:16
but we kind of ignored it because of this obvious thing.
09:19
It was an intuitive, strongly held belief that was wrong.
09:22
In the case of the solar system,
09:24
the idea that the Earth is spinning,
09:25
the surface is going a thousand miles an hour,
09:28
and it's going through the solar system at a million miles an hour --
09:31
this is lunacy; we all know the Earth isn't moving.
09:33
Do you feel like you're moving a thousand miles an hour?
09:36
If you said Earth was spinning around in space and was huge --
09:39
they would lock you up, that's what they did back then.
09:42
So it was intuitive and obvious. Now, what about evolution?
09:45
Evolution, same thing.
09:46
We taught our kids the Bible says God created all these species,
09:49
cats are cats; dogs are dogs; people are people; plants are plants;
09:52
they don't change.
09:54
Noah put them on the ark in that order, blah, blah.
09:56
The fact is, if you believe in evolution, we all have a common ancestor.
10:00
We all have a common ancestor with the plant in the lobby!
10:03
This is what evolution tells us. And it's true. It's kind of unbelievable.
10:07
And the same thing about tectonic plates.
10:09
All the mountains and the continents
10:11
are kind of floating around on top of the Earth.
10:14
It doesn't make any sense.
10:15
So what is the intuitive, but incorrect assumption,
10:19
that's kept us from understanding brains?
10:21
I'll tell you. It'll seem obvious that it's correct. That's the point.
10:25
Then I'll make an argument why you're incorrect on the other assumption.
10:28
The intuitive but obvious thing is:
10:30
somehow, intelligence is defined by behavior;
10:32
we're intelligent because of how we do things
10:35
and how we behave intelligently.
10:36
And I'm going to tell you that's wrong.
10:38
Intelligence is defined by prediction.
10:40
I'm going to work you through this in a few slides,
10:43
and give you an example of what this means.
10:45
Here's a system.
10:46
Engineers and scientists like to look at systems like this.
10:49
They say, we have a thing in a box. We have its inputs and outputs.
10:52
The AI people said, the thing in the box is a programmable computer,
10:56
because it's equivalent to a brain.
10:57
We'll feed it some inputs and get it to do something, have some behavior.
11:01
Alan Turing defined the Turing test, which essentially says,
11:04
we'll know if something's intelligent if it behaves identical to a human --
11:07
a behavioral metric of what intelligence is
11:09
that has stuck in our minds for a long time.
11:12
Reality, though -- I call it real intelligence.
11:14
Real intelligence is built on something else.
11:16
We experience the world through a sequence of patterns,
11:19
and we store them, and we recall them.
11:22
When we recall them, we match them up against reality,
11:24
and we're making predictions all the time.
11:26
It's an internal metric; there's an internal metric about us,
11:29
saying, do we understand the world, am I making predictions, and so on.
11:33
You're all being intelligent now, but you're not doing anything.
11:36
Maybe you're scratching yourself, but you're not doing anything.
11:39
But you're being intelligent; you're understanding what I'm saying.
11:42
Because you're intelligent and you speak English,
11:44
you know the word at the end of this
11:46
sentence.
11:47
The word came to you; you make these predictions all the time.
11:50
What I'm saying is,
11:52
the internal prediction is the output in the neocortex,
11:55
and somehow, prediction leads to intelligent behavior.
11:57
Here's how that happens:
11:59
Let's start with a non-intelligent brain.
12:01
I'll argue a non-intelligent brain, we'll call it an old brain.
12:04
And we'll say it's a non-mammal, like a reptile,
12:06
say, an alligator; we have an alligator.
12:08
And the alligator has some very sophisticated senses.
12:12
It's got good eyes and ears and touch senses and so on,
12:15
a mouth and a nose.
12:17
It has very complex behavior.
12:19
It can run and hide. It has fears and emotions. It can eat you.
12:23
It can attack. It can do all kinds of stuff.
12:27
But we don't consider the alligator very intelligent,
12:30
not in a human sort of way.
12:31
But it has all this complex behavior already.
12:34
Now in evolution, what happened?
12:36
First thing that happened in evolution with mammals
12:38
is we started to develop a thing called the neocortex.
12:41
I'm going to represent the neocortex by this box on top of the old brain.
12:45
Neocortex means "new layer." It's a new layer on top of your brain.
12:48
It's the wrinkly thing on the top of your head
12:50
that got wrinkly because it got shoved in there and doesn't fit.
12:53
(Laughter)
12:55
Literally, it's about the size of a table napkin
12:57
and doesn't fit, so it's wrinkly.
12:58
Now, look at how I've drawn this.
13:00
The old brain is still there.
13:02
You still have that alligator brain. You do. It's your emotional brain.
13:05
It's all those gut reactions you have.
13:08
On top of it, we have this memory system called the neocortex.
13:11
And the memory system is sitting over the sensory part of the brain.
13:16
So as the sensory input comes in and feeds from the old brain,
13:19
it also goes up into the neocortex.
13:21
And the neocortex is just memorizing.
13:23
It's sitting there saying, I'm going to memorize all the things going on:
13:26
where I've been, people I've seen, things I've heard, and so on.
13:29
And in the future, when it sees something similar to that again,
13:33
in a similar environment, or the exact same environment,
13:35
it'll start playing it back: "Oh, I've been here before,"
13:39
and when you were here before, this happened next.
13:41
It allows you to predict the future.
13:43
It literally feeds back the signals into your brain;
13:47
they'll let you see what's going to happen next,
13:49
will let you hear the word "sentence" before I said it.
13:52
And it's this feeding back into the old brain
13:55
that will allow you to make more intelligent decisions.
13:57
This is the most important slide of my talk, so I'll dwell on it a little.
14:01
And all the time you say, "Oh, I can predict things,"
14:04
so if you're a rat and you go through a maze, and you learn the maze,
14:08
next time you're in one, you have the same behavior.
14:10
But suddenly, you're smarter; you say, "I recognize this maze,
14:13
I know which way to go; I've been here before; I can envision the future."
14:17
That's what it's doing.
14:18
This is true for all mammals --
14:21
in humans, it got a lot worse.
14:23
Humans actually developed the front of the neocortex,
14:26
called the anterior part of the neocortex.
14:28
And nature did a little trick.
14:29
It copied the posterior, the back part, which is sensory,
14:32
and put it in the front.
14:33
Humans uniquely have the same mechanism on the front,
14:36
but we use it for motor control.
14:37
So we're now able to do very sophisticated motor planning, things like that.
14:41
I don't have time to explain, but to understand how a brain works,
14:44
you have to understand how the first part of the mammalian neocortex works,
14:48
how it is we store patterns and make predictions.
14:50
Let me give you a few examples of predictions.
14:52
I already said the word "sentence."
14:54
In music, if you've heard a song before,
14:57
when you hear it, the next note pops into your head already --
15:00
you anticipate it.
15:01
With an album, at the end of a song, the next song pops into your head.
15:05
It happens all the time, you make predictions.
15:07
I have this thing called the "altered door" thought experiment.
15:10
It says, you have a door at home;
15:13
when you're here, I'm changing it --
15:15
I've got a guy back at your house right now, moving the door around,
15:18
moving your doorknob over two inches.
15:20
When you go home tonight, you'll put your hand out, reach for the doorknob,
15:23
notice it's in the wrong spot
15:25
and go, "Whoa, something happened."
15:26
It may take a second, but something happened.
15:29
I can change your doorknob in other ways --
15:31
make it larger, smaller, change its brass to silver, make it a lever,
15:34
I can change the door; put colors on, put windows in.
15:36
I can change a thousand things about your door
15:39
and in the two seconds you take to open it,
15:41
you'll notice something has changed.
15:42
Now, the engineering approach, the AI approach to this,
15:45
is to build a door database with all the door attributes.
15:48
And as you go up to the door, we check them off one at a time:
15:51
door, door, color ...
15:52
We don't do that. Your brain doesn't do that.
15:54
Your brain is making constant predictions all the time
15:57
about what will happen in your environment.
15:59
As I put my hand on this table, I expect to feel it stop.
16:01
When I walk, every step, if I missed it by an eighth of an inch,
16:04
I'll know something has changed.
16:06
You're constantly making predictions about your environment.
16:09
I'll talk about vision, briefly.
377
969352
1593
๊ฐ„๋žตํ•˜๊ฒŒ ์‹œ๊ฐ์— ๋Œ€ํ•ด์„œ ๋งํ•˜์ฃ . ์—ฌ์„ฑ ์‚ฌ์ง„์ž…๋‹ˆ๋‹ค.
16:10
This is a picture of a woman.
378
970969
1383
16:12
When we look at people, our eyes saccade over two to three times a second.
379
972376
3490
์—ฌ๋Ÿฌ๋ถ„์ด ์‚ฌ๋žŒ์„ ๋ณผ ๋•Œ, ์—ฌ๋Ÿฌ๋ถ„์˜ ๋ˆˆ์€ ์‚ฌ๋กœ์žกํž™๋‹ˆ๋‹ค
์ผ์ดˆ์— ๋‘๋ฒˆ์—์„œ ์„ธ๋ฒˆ์ •๋„ ๋ง์ด์ฃ .
16:15
We're not aware of it, but our eyes are always moving.
380
975890
2529
์ด๊ฒƒ์— ๋Œ€ํ•˜์—ฌ ์ธ์ง€ ํ•  ์ˆ˜ ์—†์ง€๋งŒ ๋ˆˆ์€ ํ•ญ์ƒ ์›€์ง์ด๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
๊ทธ๋ž˜์„œ ์–ด๋–ค์ด์˜ ์–ผ๊ตด์„ ๋ณด์•˜์„ ๋–„,
16:18
When we look at a face, we typically go from eye to eye to nose to mouth.
381
978443
3435
์ผ๋ฐ˜์ ์œผ๋กœ ๋ˆˆ์—์„œ ๋ˆˆ์œผ๋กœ ๋ˆˆ์—์„œ ์ฝ”๋กœ ์ž…์œผ๋กœ ์ด๋™ํ•˜์ฃ .
16:21
When your eye moves from eye to eye,
382
981902
1869
์ง€๊ธˆ, ์—ฌ๋Ÿฌ๋ถ„์˜ ์‹œ์„ ์ด ๋ˆˆ์—์„œ ๋ˆˆ์œผ๋กœ ์›€์ง์ผ ๋•Œ,
16:23
if there was something else there like a nose,
383
983795
2158
๋งŒ์•ฝ ๋ˆˆ์— ์ฝ”์™€ ๊ฐ™์€๊ฒŒ ์žˆ๋‹ค๋ฉด
16:25
you'd see a nose where an eye is supposed to be and go, "Oh, shit!"
384
985977
3546
์—ฌ๋Ÿฌ๋ถ„์€ ๋ˆˆ์ด ์žˆ์–ด์•ผ ๋  ์ฝ”๋ฅผ ๋ณด์‹ค ๊ฒƒ์ž…๋‹ˆ๋‹ค
'์˜ค ์ด๋Ÿฐ',์ด๋Ÿฐ ๋ฐ˜์‘์„ ๋ณด์ด์‹œ๊ฒ ์ฃ , ์•Œ๋‹ค์‹œํ”ผ --
16:29
(Laughter)
385
989547
1396
16:30
"There's something wrong about this person."
386
990967
2109
(์›ƒ์Œ)
์ด ์‚ฌ๋žŒ, ๋ฌด์–ธ๊ฐ€ ์ž˜๋ชป๋๋‹ค-๋ผ๊ณ  ๋Š๋ผ๊ฒ ์ฃ .
16:33
That's because you're making a prediction.
387
993100
2005
์™œ๋ƒํ•˜๋ฉด ์—ฌ๋Ÿฌ๋ถ„์ด ์˜ˆ์ธกํ–ˆ๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
16:35
It's not like you just look over and say, "What am I seeing? A nose? OK."
388
995129
3439
์‹œ์„ ์„ ๊ทธ ์ชฝ์œผ๋กœ ํ–ฅํ•ด์„œ ๋ณด๋ฉด์„œ ๋‚ด๊ฐ€ ์ง€๊ธˆ ๋ฌด์–ผ ๋ณด๊ณ  ์žˆ๋Š”๊ฑธ๊นŒ-ํ•˜๊ณ  ๋งํ•˜๋Š” ๊ทธ๋Ÿฐ ์ƒํ™ฉ์ด ์•„๋‹Œ๊ฑฐ์ฃ .
์ฝ”, ๊ดœ์ฐฎ์Šต๋‹ˆ๋‹ค, ์•„๋‹ˆ์ฃ , ์—ฌ๋Ÿฌ๋ถ„์€ ๋ณด๋Š” ๊ฒƒ์— ๋Œ€ํ•˜์—ฌ ์˜ˆ์ธก์„ ํ•ฉ๋‹ˆ๋‹ค..
16:38
No, you have an expectation of what you're going to see.
16:41
Every single moment.
16:42
And finally, let's think about how we test intelligence.
16:45
We test it by prediction: What is the next word in this ...?
16:48
This is to this as this is to this. What is the next number in this sentence?
16:51
Here are three views of an object. What's the fourth one?
16:54
That's how we test it. It's all about prediction.
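As a toy illustration of testing by prediction, the sketch below scores a "predictor" on exactly this kind of task: given a sequence, produce the next element. The rule, the test cases, and the names predict_next and score are invented for illustration, not taken from the talk.

```python
# A toy version of "we test intelligence by prediction": the test is
# whether the next element of a sequence can be produced correctly.

def predict_next(sequence):
    """Guess the next number by assuming a constant step (arithmetic rule)."""
    step = sequence[-1] - sequence[-2]
    return sequence[-1] + step

def score(tests):
    """Count how many next-element predictions match the expected answer."""
    return sum(predict_next(seq) == answer for seq, answer in tests)

tests = [([2, 4, 6], 8), ([10, 7, 4], 1)]
print(score(tests), "of", len(tests))  # 2 of 2
```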
16:57
So what is the recipe for brain theory?
17:00
First of all, we have to have the right framework.
17:02
And the framework is a memory framework,
17:04
not a computational or behavior framework,
17:06
it's a memory framework.
17:07
How do you store and recall these sequences of patterns?
17:10
It's spatiotemporal patterns.
17:11
Then, if in that framework, you take a bunch of theoreticians --
17:14
biologists generally are not good theoreticians.
17:17
Not always, but generally, there's not a good history of theory in biology.
17:20
I've found the best people to work with are physicists,
17:23
engineers and mathematicians,
17:24
who tend to think algorithmically.
17:26
Then they have to learn the anatomy and the physiology.
17:29
You have to make these theories very realistic in anatomical terms.
17:34
Anyone who tells you their theory about how the brain works
17:37
and doesn't tell you exactly how it's working
17:39
and how the wiring works --
17:40
it's not a theory.
17:41
And that's what we do at the Redwood Neuroscience Institute.
17:44
I'd love to tell you we're making fantastic progress in this thing,
17:48
and I expect to be back on this stage sometime in the not too distant future,
17:51
to tell you about it.
17:52
I'm really excited; this is not going to take 50 years.
17:55
What will brain theory look like?
17:57
First of all, it's going to be about memory.
17:59
Not like computer memory -- not at all like computer memory.
18:02
It's very different.
18:03
It's a memory of very high-dimensional patterns,
18:05
like the things that come from your eyes.
18:07
It's also memory of sequences:
18:08
you cannot learn or recall anything outside of a sequence.
18:11
A song must be heard in sequence over time,
18:14
and you must play it back in sequence over time.
18:16
And these sequences are auto-associatively recalled,
18:19
so if I see something, I hear something, it reminds me of it,
18:22
and it plays back automatically.
18:23
It's an automatic playback.
18:25
And prediction of future inputs is the desired output.
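A minimal sketch of this kind of memory follows, under the assumption that each high-dimensional pattern can be stood in for by a simple token. SequenceMemory and its methods are illustrative names, not the institute's actual design: sequences are stored as pattern-to-next-pattern transitions, a single cue auto-associatively pulls the stored sequence back, and playback is itself a prediction of the coming inputs.

```python
# A minimal sketch of auto-associative sequence memory. Tokens stand in
# for high-dimensional sensory patterns; this illustrates the idea only.

class SequenceMemory:
    def __init__(self):
        self.transitions = {}  # pattern -> the pattern that followed it

    def store(self, sequence):
        """Learn a sequence by remembering which pattern follows which."""
        for current, nxt in zip(sequence, sequence[1:]):
            self.transitions[current] = nxt

    def recall(self, cue, length):
        """Auto-associative recall: one cue pulls in the stored sequence,
        and playing it forward doubles as a prediction of future inputs."""
        out = [cue]
        while len(out) < length and out[-1] in self.transitions:
            out.append(self.transitions[out[-1]])
        return out

memory = SequenceMemory()
memory.store(["do", "re", "mi", "fa", "sol"])  # a song, stored in sequence
print(memory.recall("re", 4))                  # ['re', 'mi', 'fa', 'sol'] -- automatic playback
```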
18:27
And as I said, the theory must be biologically accurate,
18:30
it must be testable and you must be able to build it.
18:32
If you don't build it, you don't understand it.
18:35
One more slide.
18:36
What is this going to result in?
18:39
Are we going to really build intelligent machines?
18:41
Absolutely. And it's going to be different than people think.
18:45
No doubt that it's going to happen, in my mind.
18:47
First of all, we're going to build this stuff out of silicon.
18:51
The same techniques we use to build silicon computer memories,
18:54
we can use here.
18:55
But they're very different types of memories.
18:57
And we'll attach these memories to sensors,
18:59
and the sensors will experience real live, real-world data,
19:02
and learn about their environment.
19:03
Now, it's very unlikely the first things you'll see are like robots.
19:07
Not that robots aren't useful; people can build robots.
19:10
But the robotics part is the hardest part. That's old brain. That's really hard.
19:13
The new brain is easier than the old brain.
19:15
So first we'll do things that don't require a lot of robotics.
19:18
So you're not going to see C-3PO.
19:21
You're going to see things more like intelligent cars
19:23
that really understand what traffic is, what driving is
19:26
and have learned that cars with the blinkers on for half a minute
19:29
probably aren't going to turn.
19:31
(Laughter)
19:32
We can also do intelligent security systems.
19:34
Anytime we're basically using our brain but not doing a lot of mechanics --
19:38
those are the things that will happen first.
19:40
But ultimately, the world's the limit.
19:42
I don't know how this will turn out.
19:44
I know a lot of people who invented the microprocessor.
19:46
And if you talk to them,
19:48
they knew what they were doing was really significant,
19:51
but they didn't really know what was going to happen.
19:53
They couldn't anticipate cell phones and the Internet
19:56
and all this kind of stuff.
19:58
They just knew like, "We're going to build calculators
20:01
and traffic-light controllers.
20:02
But it's going to be big!"
20:03
In the same way, brain science and these memories
20:06
are going to be a very fundamental technology,
20:08
and it will lead to unbelievable changes in the next 100 years.
20:12
And I'm most excited about how we're going to use them in science.
20:15
So I think that's all my time -- I'm over,
20:18
and I'm going to end my talk right there.