Laura Schulz: The surprisingly logical minds of babies

TED ・ 2015-06-02

Translator: Ju Hye Lim    Reviewer: Jihyeon J. Kim
00:12 Mark Twain summed up what I take to be one of the fundamental problems of cognitive science with a single witticism. He said, "There's something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment in fact."

00:29 (Laughter)

00:32 Twain meant it as a joke, of course, but he's right: There's something fascinating about science.
00:37 From a few bones, we infer the existence of dinosaurs. From spectral lines, the composition of nebulae. From fruit flies, the mechanisms of heredity, and from reconstructed images of blood flowing through the brain, or in my case, from the behavior of very young children, we try to say something about the fundamental mechanisms of human cognition.
01:07 In particular, in my lab in the Department of Brain and Cognitive Sciences at MIT, I have spent the past decade trying to understand the mystery of how children learn so much from so little so quickly. Because, it turns out that the fascinating thing about science is also a fascinating thing about children, which, to put a gentler spin on Mark Twain, is precisely their ability to draw rich, abstract inferences rapidly and accurately from sparse, noisy data.

01:40 I'm going to give you just two examples today. One is about a problem of generalization, and the other is about a problem of causal reasoning. And although I'm going to talk about work in my lab, this work is inspired by and indebted to a field. I'm grateful to mentors, colleagues, and collaborators around the world.
01:59 Let me start with the problem of generalization. Generalizing from small samples of data is the bread and butter of science. We poll a tiny fraction of the electorate and we predict the outcome of national elections. We see how a handful of patients responds to treatment in a clinical trial, and we bring drugs to a national market. But this only works if our sample is randomly drawn from the population. If our sample is cherry-picked in some way -- say, we poll only urban voters, or say, in our clinical trials for treatments for heart disease, we include only men -- the results may not generalize to the broader population.

02:38 So scientists care whether evidence is randomly sampled or not, but what does that have to do with babies? Well, babies have to generalize from small samples of data all the time. They see a few rubber ducks and learn that they float, or a few balls and learn that they bounce. And they develop expectations about ducks and balls that they're going to extend to rubber ducks and balls for the rest of their lives. And the kinds of generalizations babies have to make about ducks and balls they have to make about almost everything: shoes and ships and sealing wax and cabbages and kings.

03:14 So do babies care whether the tiny bit of evidence they see is plausibly representative of a larger population? Let's find out.
03:23 I'm going to show you two movies, one from each of two conditions of an experiment, and because you're going to see just two movies, you're going to see just two babies, and any two babies differ from each other in innumerable ways. But these babies, of course, here stand in for groups of babies, and the differences you're going to see represent average group differences in babies' behavior across conditions.

03:47 In each movie, you're going to see a baby doing maybe just exactly what you might expect a baby to do, and we can hardly make babies more magical than they already are. But to my mind the magical thing, and what I want you to pay attention to, is the contrast between these two conditions, because the only thing that differs between these two movies is the statistical evidence the babies are going to observe.

04:13 We're going to show babies a box of blue and yellow balls, and my then-graduate student, now colleague at Stanford, Hyowon Gweon, is going to pull three blue balls in a row out of this box, and when she pulls those balls out, she's going to squeeze them, and the balls are going to squeak. And if you're a baby, that's like a TED Talk. It doesn't get better than that.

04:34 (Laughter)
04:38 But the important point is it's really easy to pull three blue balls in a row out of a box of mostly blue balls. You could do that with your eyes closed. It's plausibly a random sample from this population. And if you can reach into a box at random and pull out things that squeak, then maybe everything in the box squeaks. So maybe babies should expect those yellow balls to squeak as well.

05:00 Now, those yellow balls have funny sticks on the end, so babies could do other things with them if they wanted to. They could pound them or whack them. But let's see what the baby does.

05:12 (Video) Hyowon Gweon: See this? (Ball squeaks)

05:16 Did you see that? (Ball squeaks)

05:20 Cool.

05:24 See this one? (Ball squeaks)

05:28 Wow.

05:33 Laura Schulz: Told you. (Laughs)

05:35 (Video) HG: See this one? (Ball squeaks)

05:39 Hey Clara, this one's for you. You can go ahead and play.

05:51 (Laughter)

05:56 LS: I don't even have to talk, right?
05:59 All right, it's nice that babies will generalize properties of blue balls to yellow balls, and it's impressive that babies can learn from imitating us, but we've known those things about babies for a very long time. The really interesting question is what happens when we show babies exactly the same thing, and we can ensure it's exactly the same because we have a secret compartment and we actually pull the balls from there, but this time, all we change is the apparent population from which that evidence was drawn.

06:27 This time, we're going to show babies three blue balls pulled out of a box of mostly yellow balls, and guess what? You [probably won't] randomly draw three blue balls in a row out of a box of mostly yellow balls. That is not plausibly randomly sampled evidence. That evidence suggests that maybe Hyowon was deliberately sampling the blue balls. Maybe there's something special about the blue balls. Maybe only the blue balls squeak.
06:55 Let's see what the baby does.

06:57 (Video) HG: See this? (Ball squeaks)

07:02 See this toy? (Ball squeaks)

07:05 Oh, that was cool. See? (Ball squeaks)

07:10 Now this one's for you to play. You can go ahead and play.

07:18 (Fussing) (Laughter)

07:26 LS: So you just saw two 15-month-old babies do entirely different things based only on the probability of the sample they observed.

07:35 Let me show you the experimental results. On the vertical axis, you'll see the percentage of babies who squeezed the ball in each condition, and as you'll see, babies are much more likely to generalize the evidence when it's plausibly representative of the population than when the evidence is clearly cherry-picked.

07:53 And this leads to a fun prediction: Suppose you pulled just one blue ball out of the mostly yellow box. You [probably won't] pull three blue balls in a row at random out of a yellow box, but you could randomly sample just one blue ball. That's not an improbable sample. And if you could reach into a box at random and pull out something that squeaks, maybe everything in the box squeaks.
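The same back-of-the-envelope arithmetic (still assuming roughly 20% blue in the mostly yellow box) shows why one draw reads so differently from three:

```python
p_blue = 0.2                  # assumed share of blue balls in the mostly yellow box
print(p_blue)                 # one blue ball at random:   0.2   -- not an improbable sample
print(round(p_blue ** 3, 3))  # three blue balls in a row: 0.008 -- hard to get by chance
```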
08:15 So even though babies are going to see much less evidence for squeaking, and have many fewer actions to imitate in this one ball condition than in the condition you just saw, we predicted that babies themselves would squeeze more, and that's exactly what we found.

08:32 So 15-month-old babies, in this respect, like scientists, care whether evidence is randomly sampled or not, and they use this to develop expectations about the world: what squeaks and what doesn't, what to explore and what to ignore.
08:50 Let me show you another example now, this time about a problem of causal reasoning. And it starts with a problem of confounded evidence that all of us have, which is that we are part of the world. And this might not seem like a problem to you, but like most problems, it's only a problem when things go wrong.

09:07 Take this baby, for instance. Things are going wrong for him. He would like to make this toy go, and he can't. I'll show you a few-second clip.

09:21 And there's two possibilities, broadly: Maybe he's doing something wrong, or maybe there's something wrong with the toy.

09:30 So in this next experiment, we're going to give babies just a tiny bit of statistical data supporting one hypothesis over the other, and we're going to see if babies can use that to make different decisions about what to do. Here's the setup.

09:46 Hyowon is going to try to make the toy go and succeed. I am then going to try twice and fail both times, and then Hyowon is going to try again and succeed, and this roughly sums up my relationship to my graduate students in technology across the board.

10:02 But the important point here is it provides a little bit of evidence that the problem isn't with the toy, it's with the person. Some people can make this toy go, and some can't.

10:12 Now, when the baby gets the toy, he's going to have a choice. His mom is right there, so he can go ahead and hand off the toy and change the person, but there's also going to be another toy at the end of that cloth, and he can pull the cloth towards him and change the toy. So let's see what the baby does.
10:30 (Video) HG: Two, three. Go! (Music)

10:34 LS: One, two, three, go!

10:37 Arthur, I'm going to try again. One, two, three, go!

10:45 HG: Arthur, let me try again, okay?

10:48 One, two, three, go! (Music)

10:53 Look at that. Remember these toys? See these toys? Yeah, I'm going to put this one over here, and I'm going to give this one to you. You can go ahead and play.
11:23 LS: Okay, Laura, but of course, babies love their mommies. Of course babies give toys to their mommies when they can't make them work. So again, the really important question is what happens when we change the statistical data ever so slightly.

11:38 This time, babies are going to see the toy work and fail in exactly the same order, but we're changing the distribution of evidence. This time, Hyowon is going to succeed once and fail once, and so am I. And this suggests it doesn't matter who tries this toy, the toy is broken. It doesn't work all the time.
11:57 Again, the baby's going to have a choice. Her mom is right next to her, so she can change the person, and there's going to be another toy at the end of the cloth. Let's watch what she does.

12:07 (Video) HG: Two, three, go! (Music)

12:11 Let me try one more time. One, two, three, go!

12:17 Hmm.

12:19 LS: Let me try, Clara. One, two, three, go!

12:27 Hmm, let me try again. One, two, three, go! (Music)

12:35 HG: I'm going to put this one over here, and I'm going to give this one to you. You can go ahead and play.

12:58 (Applause)
13:04 LS: Let me show you the experimental results. On the vertical axis, you'll see the distribution of children's choices in each condition, and you'll see that the distribution of the choices children make depends on the evidence they observe. So in the second year of life, babies can use a tiny bit of statistical data to decide between two fundamentally different strategies for acting in the world: asking for help and exploring.

13:33 I've just shown you two laboratory experiments out of literally hundreds in the field that make similar points, because the really critical point is that children's ability to make rich inferences from sparse data underlies all the species-specific cultural learning that we do. Children learn about new tools from just a few examples. They learn new causal relationships from just a few examples. They even learn new words, in this case in American Sign Language.

14:08 I want to close with just two points.
14:12 If you've been following my world, the field of brain and cognitive sciences, for the past few years, three big ideas will have come to your attention. The first is that this is the era of the brain. And indeed, there have been staggering discoveries in neuroscience: localizing functionally specialized regions of cortex, turning mouse brains transparent, activating neurons with light.

14:36 A second big idea is that this is the era of big data and machine learning, and machine learning promises to revolutionize our understanding of everything from social networks to epidemiology, and maybe, as it tackles problems of scene understanding and natural language processing, to tell us something about human cognition.
14:59 And the final big idea you'll have heard is that maybe it's a good idea we're going to know so much about brains and have so much access to big data, because left to our own devices, humans are fallible, we take shortcuts, we err, we make mistakes, we're biased, and in innumerable ways, we get the world wrong.

15:24 I think these are all important stories, and they have a lot to tell us about what it means to be human, but I want you to note that today I told you a very different story. It's a story about minds and not brains, and in particular, it's a story about the kinds of computations that uniquely human minds can perform, which involve rich, structured knowledge and the ability to learn from small amounts of data, the evidence of just a few examples.

15:56 And fundamentally, it's a story about how starting as very small children and continuing out all the way to the greatest accomplishments of our culture, we get the world right.
16:12 Folks, human minds do not only learn from small amounts of data. Human minds think of altogether new ideas. Human minds generate research and discovery, and human minds generate art and literature and poetry and theater, and human minds take care of other humans: our old, our young, our sick. We even heal them.

16:39 In the years to come, we're going to see technological innovations beyond anything I can even envision, but we are very unlikely to see anything even approximating the computational power of a human child in my lifetime or in yours.

16:58 If we invest in these most powerful learners and their development, in babies and children and mothers and fathers and caregivers and teachers the ways we invest in our other most powerful and elegant forms of technology, engineering and design, we will not just be dreaming of a better future, we will be planning for one.

17:23 Thank you very much.

17:25 (Applause)
17:29 Chris Anderson: Laura, thank you. I do actually have a question for you. First of all, the research is insane. I mean, who would design an experiment like that? (Laughter)

17:41 I've seen that a couple of times, and I still don't honestly believe that that can truly be happening, but other people have done similar experiments; it checks out. The babies really are that genius.

17:50 LS: You know, they look really impressive in our experiments, but think about what they look like in real life, right? It starts out as a baby. Eighteen months later, it's talking to you, and babies' first words aren't just things like balls and ducks, they're things like "all gone," which refer to disappearance, or "uh-oh," which refer to unintentional actions. It has to be that powerful. It has to be much more powerful than anything I showed you. They're figuring out the entire world. A four-year-old can talk to you about almost anything.

18:17 (Applause)
18:19 CA: And if I understand you right, the other key point you're making is, we've been through these years where there's all this talk of how quirky and buggy our minds are, that behavioral economics and the whole theories behind that that we're not rational agents. You're really saying that the bigger story is how extraordinary, and there really is genius there that is underappreciated.

18:40 LS: One of my favorite quotes in psychology comes from the social psychologist Solomon Asch, and he said the fundamental task of psychology is to remove the veil of self-evidence from things. There are orders of magnitude more decisions you make every day that get the world right. You know about objects and their properties. You know them when they're occluded. You know them in the dark. You can walk through rooms. You can figure out what other people are thinking. You can talk to them. You can navigate space. You know about numbers. You know causal relationships. You know about moral reasoning. You do this effortlessly, so we don't see it, but that is how we get the world right, and it's a remarkable and very difficult-to-understand accomplishment.

19:19 CA: I suspect there are people in the audience who have this view of accelerating technological power who might dispute your statement that never in our lifetimes will a computer do what a three-year-old child can do, but what's clear is that in any scenario, our machines have so much to learn from our toddlers.

19:38 LS: I think so. You'll have some machine learning folks up here. I mean, you should never bet against babies or chimpanzees or technology as a matter of practice, but it's not just a difference in quantity, it's a difference in kind. We have incredibly powerful computers, and they do do amazingly sophisticated things, often with very big amounts of data. Human minds do, I think, something quite different, and I think it's the structured, hierarchical nature of human knowledge that remains a real challenge.

20:11 CA: Laura Schulz, wonderful food for thought. Thank you so much.

20:14 LS: Thank you. (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7