Laura Schulz: The surprisingly logical minds of babies

233,878 views · 2015-06-02

TED



ืžืชืจื’ื: Zeeva Livshitz ืžื‘ืงืจ: Ido Dekkers
00:12
Mark Twain summed up what I take to be one of the fundamental problems of cognitive science with a single witticism.

00:20
He said, "There's something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment in fact."

00:29
(Laughter)

00:32
Twain meant it as a joke, of course, but he's right: There's something fascinating about science.
00:37
From a few bones, we infer the existence of dinosaurs. From spectral lines, the composition of nebulae. From fruit flies, the mechanisms of heredity, and from reconstructed images of blood flowing through the brain, or in my case, from the behavior of very young children, we try to say something about the fundamental mechanisms of human cognition.

01:07
In particular, in my lab in the Department of Brain and Cognitive Sciences at MIT, I have spent the past decade trying to understand the mystery of how children learn so much from so little so quickly.
01:20
Because it turns out that the fascinating thing about science is also a fascinating thing about children, which, to put a gentler spin on Mark Twain, is precisely their ability to draw rich, abstract inferences rapidly and accurately from sparse, noisy data.

01:40
I'm going to give you just two examples today. One is about a problem of generalization, and the other is about a problem of causal reasoning. And although I'm going to talk about work in my lab, this work is inspired by and indebted to a field. I'm grateful to mentors, colleagues, and collaborators around the world.
01:59
Let me start with the problem of generalization. Generalizing from small samples of data is the bread and butter of science. We poll a tiny fraction of the electorate and we predict the outcome of national elections.

02:12
We see how a handful of patients responds to treatment in a clinical trial, and we bring drugs to a national market. But this only works if our sample is randomly drawn from the population. If our sample is cherry-picked in some way -- say, we poll only urban voters, or say, in our clinical trials for treatments for heart disease, we include only men -- the results may not generalize to the broader population.

02:38
So scientists care whether evidence is randomly sampled or not, but what does that have to do with babies?
02:44
Well, babies have to generalize from small samples of data all the time. They see a few rubber ducks and learn that they float, or a few balls and learn that they bounce. And they develop expectations about ducks and balls that they're going to extend to rubber ducks and balls for the rest of their lives. And the kinds of generalizations babies have to make about ducks and balls they have to make about almost everything: shoes and ships and sealing wax and cabbages and kings.

03:14
So do babies care whether the tiny bit of evidence they see is plausibly representative of a larger population? Let's find out.
03:23
I'm going to show you two movies, one from each of two conditions of an experiment, and because you're going to see just two movies, you're going to see just two babies, and any two babies differ from each other in innumerable ways. But these babies, of course, here stand in for groups of babies, and the differences you're going to see represent average group differences in babies' behavior across conditions.

03:47
In each movie, you're going to see a baby doing maybe just exactly what you might expect a baby to do, and we can hardly make babies more magical than they already are. But to my mind the magical thing, and what I want you to pay attention to, is the contrast between these two conditions, because the only thing that differs between these two movies is the statistical evidence the babies are going to observe.
04:13
We're going to show babies a box of blue and yellow balls, and my then-graduate student, now colleague at Stanford, Hyowon Gweon, is going to pull three blue balls in a row out of this box, and when she pulls those balls out, she's going to squeeze them, and the balls are going to squeak. And if you're a baby, that's like a TED Talk. It doesn't get better than that.

04:34
(Laughter)

04:38
But the important point is it's really easy to pull three blue balls in a row out of a box of mostly blue balls. You could do that with your eyes closed. It's plausibly a random sample from this population. And if you can reach into a box at random and pull out things that squeak, then maybe everything in the box squeaks. So maybe babies should expect those yellow balls to squeak as well.

05:00
Now, those yellow balls have funny sticks on the end, so babies could do other things with them if they wanted to. They could pound them or whack them. But let's see what the baby does.
05:12
(Video) Hyowon Gweon: See this? (Ball squeaks) Did you see that? (Ball squeaks) Cool. See this one? (Ball squeaks) Wow.

05:33
Laura Schulz: Told you. (Laughs)

05:35
(Video) HG: See this one? (Ball squeaks) Hey Clara, this one's for you. You can go ahead and play.

05:51
(Laughter)

05:56
LS: I don't even have to talk, right?
05:59
All right, it's nice that babies will generalize properties of blue balls to yellow balls, and it's impressive that babies can learn from imitating us, but we've known those things about babies for a very long time. The really interesting question is what happens when we show babies exactly the same thing, and we can ensure it's exactly the same because we have a secret compartment and we actually pull the balls from there, but this time, all we change is the apparent population from which that evidence was drawn.

06:27
This time, we're going to show babies three blue balls pulled out of a box of mostly yellow balls, and guess what? You [probably won't] randomly draw three blue balls in a row out of a box of mostly yellow balls. That is not plausibly randomly sampled evidence. That evidence suggests that maybe Hyowon was deliberately sampling the blue balls. Maybe there's something special about the blue balls. Maybe only the blue balls squeak.
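The plausibility judgment here is just arithmetic. As a quick sketch (the ball counts are my illustrative assumptions; the talk never states the actual contents of the boxes), the probability of drawing three blue balls in a row differs sharply between the two boxes:

```python
def p_three_blue_in_a_row(blue, yellow):
    """Chance of drawing three blue balls in a row at random,
    without replacement, from a box of blue + yellow balls."""
    total = blue + yellow
    return (blue / total) * ((blue - 1) / (total - 1)) * ((blue - 2) / (total - 2))

# Mostly blue box (illustrative counts): the sample is unsurprising.
print(p_three_blue_in_a_row(blue=18, yellow=2))   # ≈ 0.716

# Mostly yellow box: the same sample is very unlikely by chance.
print(p_three_blue_in_a_row(blue=4, yellow=16))   # ≈ 0.0035
```

An observer who sees the improbable sample can reasonably conclude the draws were not random — exactly the inference the babies appear to make.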
06:55
Let's see what the baby does.

06:57
(Video) HG: See this? (Ball squeaks) See this toy? (Ball squeaks) Oh, that was cool. See? (Ball squeaks) Now this one's for you to play. You can go ahead and play.

07:18
(Fussing) (Laughter)
07:26
LS: So you just saw two 15-month-old babies do entirely different things based only on the probability of the sample they observed.

07:35
Let me show you the experimental results. On the vertical axis, you'll see the percentage of babies who squeezed the ball in each condition, and as you'll see, babies are much more likely to generalize the evidence when it's plausibly representative of the population than when the evidence is clearly cherry-picked.

07:53
And this leads to a fun prediction: Suppose you pulled just one blue ball out of the mostly yellow box. You [probably won't] pull three blue balls in a row at random out of a yellow box, but you could randomly sample just one blue ball. That's not an improbable sample.
08:09
And if you could reach into a box at random and pull out something that squeaks, maybe everything in the box squeaks.

08:15
So even though babies are going to see much less evidence for squeaking, and have many fewer actions to imitate in this one ball condition than in the condition you just saw, we predicted that babies themselves would squeeze more, and that's exactly what we found.

08:32
So 15-month-old babies, in this respect, like scientists, care whether evidence is randomly sampled or not, and they use this to develop expectations about the world: what squeaks and what doesn't, what to explore and what to ignore.
08:50
Let me show you another example now, this time about a problem of causal reasoning. And it starts with a problem of confounded evidence that all of us have, which is that we are part of the world. And this might not seem like a problem to you, but like most problems, it's only a problem when things go wrong.

09:07
Take this baby, for instance. Things are going wrong for him. He would like to make this toy go, and he can't. I'll show you a few-second clip.
09:21
And there's two possibilities, broadly: Maybe he's doing something wrong, or maybe there's something wrong with the toy.

09:30
So in this next experiment, we're going to give babies just a tiny bit of statistical data supporting one hypothesis over the other, and we're going to see if babies can use that to make different decisions about what to do.

09:43
Here's the setup. Hyowon is going to try to make the toy go and succeed. I am then going to try twice and fail both times, and then Hyowon is going to try again and succeed, and this roughly sums up my relationship to my graduate students in technology across the board.

10:02
But the important point here is it provides a little bit of evidence that the problem isn't with the toy, it's with the person. Some people can make this toy go, and some can't.
10:12
Now, when the baby gets the toy, he's going to have a choice. His mom is right there, so he can go ahead and hand off the toy and change the person, but there's also going to be another toy at the end of that cloth, and he can pull the cloth towards him and change the toy.

10:28
So let's see what the baby does.

10:30
(Video) HG: Two, three. Go! (Music)

10:34
LS: One, two, three, go! Arthur, I'm going to try again. One, two, three, go!

10:45
HG: Arthur, let me try again, okay? One, two, three, go! (Music)

10:53
Look at that. Remember these toys? See these toys? Yeah, I'm going to put this one over here, and I'm going to give this one to you. You can go ahead and play.
11:23
LS: Okay, Laura, but of course, babies love their mommies. Of course babies give toys to their mommies when they can't make them work. So again, the really important question is what happens when we change the statistical data ever so slightly.

11:38
This time, babies are going to see the toy work and fail in exactly the same order, but we're changing the distribution of evidence. This time, Hyowon is going to succeed once and fail once, and so am I. And this suggests it doesn't matter who tries this toy, the toy is broken. It doesn't work all the time.
11:57
Again, the baby's going to have a choice. Her mom is right next to her, so she can change the person, and there's going to be another toy at the end of the cloth. Let's watch what she does.

12:07
(Video) HG: Two, three, go! (Music) Let me try one more time. One, two, three, go! Hmm.

12:19
LS: Let me try, Clara. One, two, three, go! Hmm, let me try again. One, two, three, go! (Music)

12:35
HG: I'm going to put this one over here, and I'm going to give this one to you. You can go ahead and play.

12:58
(Applause)
13:04
LS: Let me show you the experimental results. On the vertical axis, you'll see the distribution of children's choices in each condition, and you'll see that the distribution of the choices children make depends on the evidence they observe.

13:19
So in the second year of life, babies can use a tiny bit of statistical data to decide between two fundamentally different strategies for acting in the world: asking for help and exploring.
13:33
I've just shown you two laboratory experiments out of literally hundreds in the field that make similar points, because the really critical point is that children's ability to make rich inferences from sparse data underlies all the species-specific cultural learning that we do. Children learn about new tools from just a few examples. They learn new causal relationships from just a few examples. They even learn new words, in this case in American Sign Language.

14:08
I want to close with just two points.
14:12
If you've been following my world, the field of brain and cognitive sciences,
242
852050
3688
ืื ืขืงื‘ืชื ืื—ืจื™ ื”ืขื•ืœื ืฉืœื™, ืชื—ื•ื ืžื“ืขื™ ื”ืžื•ื— ื•ื”ืงื•ื’ื ื™ืฆื™ื”
14:15
for the past few years,
243
855738
1927
ื‘ืฉื ื™ื ื”ืื—ืจื•ื ื•ืช,
14:17
three big ideas will have come to your attention.
244
857665
2415
ื”ื™ื™ืชื ืฉืžื™ื ืœื‘ ืœืฉืœื•ืฉื” ืจืขื™ื•ื ื•ืช ื’ื“ื•ืœื™ื.
14:20
The first is that this is the era of the brain.
245
860080
3436
ื”ืจืืฉื•ืŸ ื”ื•ื ืฉื–ื” ืขื™ื“ืŸ ื”ืžื•ื—.
14:23
And indeed, there have been staggering discoveries in neuroscience:
246
863516
3669
ื•ืื›ืŸ, ื”ื™ื• ืชื’ืœื™ื•ืช ืžื“ื”ื™ืžื•ืช ื‘ื—ืงืจ ืžื“ืขื™ ื”ืžื•ื—:
14:27
localizing functionally specialized regions of cortex,
247
867185
3436
ืœื•ืงืœื™ื–ืฆื™ื” ืฉืœ ืชืคืงื•ื“ื™ื ืžื™ื•ื—ื“ื™ื ื‘ืื–ื•ืจื™ื ืฉืœ ืงืœื™ืคืช ื”ืžื•ื—,
14:30
turning mouse brains transparent,
248
870621
2601
ื”ืคื™ื›ืช ืžื•ื—ื•ืช ืฉืœ ืขื›ื‘ืจื™ื ืœืฉืงื•ืคื™ื.
14:33
activating neurons with light.
249
873222
3776
ื”ืคืขืœืช ื ื•ื™ืจื•ื ื™ื ืขื ืื•ืจ.
14:36
A second big idea
250
876998
1996
ืจืขื™ื•ืŸ ื’ื“ื•ืœ ืฉื ื™
14:38
is that this is the era of big data and machine learning,
251
878994
4104
ื”ื•ื ืฉื–ื” ื”ืขื™ื“ืŸ ืฉืœ ื ืชื•ื ื™ื ื’ื“ื•ืœื™ื ื•ืœืžื™ื“ืช ืžื›ื•ื ื”,
14:43
and machine learning promises to revolutionize our understanding
252
883098
3141
ื•ืœืžื™ื“ืช ืžื›ื•ื ื” ืžื‘ื˜ื™ื—ื” ืœื—ื•ืœืœ ืžื”ืคื™ื›ื” ื‘ื”ื‘ื ื” ืฉืœื ื•
14:46
of everything from social networks to epidemiology.
253
886239
4667
ืฉืœ ื›ืœ ื“ื‘ืจ, ื”ื—ืœ ืžืจืฉืชื•ืช ื—ื‘ืจืชื™ื•ืช ืœืืคื™ื“ืžื™ื•ืœื•ื’ื™ื”.
14:50
And maybe, as it tackles problems of scene understanding
254
890906
2693
ื•ืื•ืœื™, ื‘ืขืช ื”ืชืžื•ื“ื“ื•ืช ืขื ื‘ืขื™ื•ืช ืฉืœ ื”ื‘ื ืช ืกืฆื ื”
14:53
and natural language processing,
255
893599
1993
ื•ืขื™ื‘ื•ื“ ื˜ื‘ืขื™ ืฉืœ ืฉืคื”,
14:55
to tell us something about human cognition.
256
895592
3324
ืœืกืคืจ ืœื ื• ืžืฉื”ื• ืขืœ ืงื•ื’ื ื™ืฆื™ื” ืื ื•ืฉื™ืช.
14:59
And the final big idea you'll have heard
257
899756
1937
ื•ื”ืจืขื™ื•ืŸ ื”ื’ื“ื•ืœ ื”ืื—ืจื•ืŸ ืฉื”ื™ื™ืชื ืฉื•ืžืขื™ื ืขืœื™ื•
15:01
is that maybe it's a good idea we're going to know so much about brains
258
901693
3387
ื”ื•ื ืฉืื•ืœื™ ื–ื” ืจืขื™ื•ืŸ ื˜ื•ื‘ ืฉืื ื—ื ื• ื”ื•ืœื›ื™ื ืœื“ืขืช ื›ืœ ื›ืš ื”ืจื‘ื” ืขืœ ืžื•ื—ื•ืช
15:05
and have so much access to big data,
259
905080
1917
ื•ื™ืฉ ื’ื™ืฉื” ืจื‘ื” ื›ืœ ื›ืš ืœื ืชื•ื ื™ื ืจื‘ื™ื,
15:06
because left to our own devices,
260
906997
2507
ื›ื™ ืื ื ื™ืขื–ื‘ ืœื ืคืฉื ื•
15:09
humans are fallible, we take shortcuts,
261
909504
3831
ื‘ื ื™ ืื“ื ื ื•ื˜ื™ื ืœื˜ืขื•ืช, ืื ื—ื ื• ืขื•ืฉื™ื ืงื™ืฆื•ืจื™ ื“ืจืš,
15:13
we err, we make mistakes,
262
913335
3437
ืื ื—ื ื• ืฉื•ื’ื™ื, ืื ื—ื ื• ืขื•ืฉื™ื ื˜ืขื•ื™ื•ืช,
15:16
we're biased, and in innumerable ways,
263
916772
3684
ืื ื—ื ื• ืžื•ื˜ื™ื, ื•ื‘ืžื•ื‘ื ื™ื ืจื‘ื™ื ืžืกืคื•ืจ,
15:20
we get the world wrong.
264
920456
2969
ืื ื• ืชื•ืคืฉื™ื ืœื ื ื›ื•ืŸ ืืช ื”ืขื•ืœื.
15:24
I think these are all important stories,
265
924843
2949
ืื ื™ ื—ื•ืฉื‘ืช ืฉืืœื” ื›ื•ืœื ืกื™ืคื•ืจื™ื ื—ืฉื•ื‘ื™ื,
15:27
and they have a lot to tell us about what it means to be human,
266
927792
3785
ื•ื™ืฉ ืœื”ื ื”ืจื‘ื” ืžื” ืœืกืคืจ ืœื ื• ืขืœ ืžื” ื–ื” ืœื”ื™ื•ืช ืื ื•ืฉื™,
15:31
but I want you to note that today I told you a very different story.
267
931577
3529
ืื‘ืœ ืื ื™ ืจื•ืฆื” ืฉืชืฉื™ืžื• ืœื‘ ืฉื”ื™ื•ื ืกื™ืคืจืชื™ ืœื›ื ืกื™ืคื•ืจ ืฉื•ื ื” ืžืื•ื“.
15:35
It's a story about minds and not brains,
268
935966
3807
ื–ื” ืกื™ืคื•ืจ ืขืœ ื”ืžื•ื“ืขื•ืช ื•ืœื ืขืœ ื”ืžื•ื—,
15:39
and in particular, it's a story about the kinds of computations
269
939773
3006
ื•ื‘ืžื™ื•ื—ื“ ื–ื” ืกื™ืคื•ืจ ืขืœ ืกื•ื’ื™ ื—ื™ืฉื•ื‘ื™ื
15:42
that uniquely human minds can perform,
270
942779
2590
ืฉื‘ืื•ืคืŸ ื™ื™ื—ื•ื“ื™ ื”ืชื•ื“ืขื” ื”ืื ื•ืฉื™ืช ื™ื›ื•ืœื” ืœื‘ืฆืข,
15:45
which involve rich, structured knowledge and the ability to learn
271
945369
3944
ืฉืžืขื•ืจื‘ื™ื ื‘ื• ื™ื“ืข ืžื•ื‘ื ื” ืขืฉื™ืจ, ื•ื”ื™ื›ื•ืœืช ืœืœืžื•ื“
15:49
from small amounts of data, the evidence of just a few examples.
272
949313
5268
ืžื›ืžื•ื™ื•ืช ืงื˜ื ื•ืช ืฉืœ ื ืชื•ื ื™ื, ืจืื™ื™ื” ืฉืœ ืจืง ืžืขื˜ ื“ื•ื’ืžืื•ืช.
15:56
And fundamentally, it's a story about how starting as very small children
273
956301
4299
ื•ื‘ื™ืกื•ื“ื•, ื–ื” ืกื™ืคื•ืจ ืขืœ ืื™ืš ืžืชื—ื™ืœื™ื ื›ื™ืœื“ื™ื ืงื˜ื ื™ื ืžืื•ื“
16:00
and continuing out all the way to the greatest accomplishments
274
960600
4180
ื•ืžืžืฉื™ื›ื™ื ื›ืœ ื”ื“ืจืš ืœื”ื™ืฉื’ื™ื ื”ื’ื“ื•ืœื™ื ื‘ื™ื•ืชืจ
16:04
of our culture,
275
964780
3843
ืฉืœ ื”ืชืจื‘ื•ืช ืฉืœื ื•.
16:08
we get the world right.
276
968623
1997
ืื ื• ืชื•ืคืฉื™ื ื ื›ื•ืŸ ืืช ื”ืขื•ืœื .
16:12
Folks, human minds do not only learn from small amounts of data.
277
972433
5267
ื”ืชื•ื“ืขื” ื”ืื ื•ืฉื™ืช ืœื ืจืง ืœื•ืžื“ืช ืžื›ืžื•ื™ื•ืช ืงื˜ื ื•ืช ืฉืœ ื ืชื•ื ื™ื,
16:18
Human minds think of altogether new ideas.
278
978285
2101
ื”ืชื•ื“ืขื” ื”ืื ื•ืฉื™ืช ื—ื•ืฉื‘ืช ืขืœ ืจืขื™ื•ื ื•ืช ืœื’ืžืจื™ ื—ื“ืฉื™ื.
16:20
Human minds generate research and discovery,
279
980746
3041
ื”ืชื•ื“ืขื” ื”ืื ื•ืฉื™ืช ื™ื•ืฆืจืช ืžื—ืงืจ ื•ืžื—ื•ืœืœืช ื’ื™ืœื•ื™.
16:23
and human minds generate art and literature and poetry and theater,
280
983787
5273
ื•ื”ืชื•ื“ืขื” ื”ืื ื•ืฉื™ืช ื™ื•ืฆืจืช ืืžื ื•ืช ื•ืกืคืจื•ืช ื•ืฉื™ืจื” ื•ืชื™ืื˜ืจื•ืŸ,
16:29
and human minds take care of other humans:
281
989070
3760
ื•ื”ืชื•ื“ืขื” ื”ืื ื•ืฉื™ืช ื“ื•ืื’ืช ืœื‘ื ื™ ืื“ื ืื—ืจื™ื:
16:32
our old, our young, our sick.
282
992830
3427
ื”ื–ืงื ื™ื ืฉืœื ื•, ื”ืฆืขื™ืจื™ื ืฉืœื ื•, ื”ื—ื•ืœื™ื ืฉืœื ื•.
16:36
We even heal them.
283
996517
2367
ืื ื• ืืคื™ืœื• ืžืจืคืื™ื ืื•ืชื.
16:39
In the years to come, we're going to see technological innovations
284
999564
3103
ื‘ืฉื ื™ื ื”ื‘ืื•ืช, ืื ื—ื ื• ืขื•ืžื“ื™ื ืœืจืื•ืช ื—ื™ื“ื•ืฉื™ื ื˜ื›ื ื•ืœื•ื’ื™ื™ื
16:42
beyond anything I can even envision,
285
1002667
3797
ืžืขื‘ืจ ืœื›ืœ ืžื” ืฉืื ื™ ื™ื›ื•ืœื” ืœื“ืžื™ื™ืŸ,
16:46
but we are very unlikely
286
1006464
2150
ืื‘ืœ ืžืื•ื“ ืœื ืกื‘ื™ืจ ืฉืื ื—ื ื•
16:48
to see anything even approximating the computational power of a human child
287
1008614
5709
ื ืจืื” ืžืฉื”ื• ืฉืžืชืงืจื‘ ืืคื™ืœื• ืœื›ื•ื— ื”ื—ื™ืฉื•ื‘ื™ ืฉืœ ื”ื™ืœื“ ื”ืื ื•ืฉื™
16:54
in my lifetime or in yours.
288
1014323
4298
ื‘ื—ื™ื™ื ืฉืœื™ ืื• ื‘ืฉืœื›ื.
16:58
If we invest in these most powerful learners and their development,
289
1018621
5047
ืื ื ืฉืงื™ืข ื”ืจื‘ื” ื‘ืœื•ืžื“ื™ื ืืœื” ื•ื‘ืคื™ืชื•ื—ื
17:03
in babies and children
290
1023668
2917
ื‘ืชื™ื ื•ืงื•ืช ื•ื‘ื™ืœื“ื™ื
17:06
and mothers and fathers
291
1026585
1826
ื•ืื™ืžื”ื•ืช ื•ืื‘ื•ืช
17:08
and caregivers and teachers
292
1028411
2699
ื•ื‘ืžื˜ืคืœื™ื ื•ืžื•ืจื™ื
17:11
the ways we invest in our other most powerful and elegant forms
293
1031110
4170
ื‘ื“ืจืš ืฉืื ื• ืžืฉืงื™ืขื™ื ื‘ืฆื•ืจื•ืช ื”ืื—ืจื•ืช ื”ืืœื’ื ื˜ื™ื•ืช ื•ืจื‘ื•ืช ื”ืขืฆืžื” ืฉืœื ื•
17:15
of technology, engineering and design,
294
1035280
3218
ืฉืœ ื˜ื›ื ื•ืœื•ื’ื™ื”, ื”ื ื“ืกื” ื•ืขื™ืฆื•ื‘,
17:18
we will not just be dreaming of a better future,
295
1038498
2939
ืœื ืจืง ื ื—ืœื•ื ืขืœ ืขืชื™ื“ ื˜ื•ื‘ ื™ื•ืชืจ,
17:21
we will be planning for one.
296
1041437
2127
ื ืชื›ื ืŸ ื›ื–ื”.
17:23
Thank you very much.
297
1043564
2345
ืชื•ื“ื” ืจื‘ื”.
17:25
(Applause)
298
1045909
3421
(ืžื—ื™ืื•ืช ื›ืคื™ื™ื)
17:29
17:29
Chris Anderson: Laura, thank you. I do actually have a question for you.
17:34
First of all, the research is insane.
17:36
I mean, who would design an experiment like that? (Laughter)
17:41
I've seen that a couple of times,
17:42
and I still don't honestly believe that that can truly be happening,
17:46
but other people have done similar experiments; it checks out.
17:49
The babies really are that genius.
17:50
LS: You know, they look really impressive in our experiments,
17:53
but think about what they look like in real life, right?
17:56
It starts out as a baby.
17:57
Eighteen months later, it's talking to you,
17:59
and babies' first words aren't just things like balls and ducks,
18:02
they're things like "all gone," which refer to disappearance,
18:05
or "uh-oh," which refer to unintentional actions.
18:07
It has to be that powerful.
18:09
It has to be much more powerful than anything I showed you.
18:12
They're figuring out the entire world.
18:14
A four-year-old can talk to you about almost anything.
18:17
(Applause)
18:19
CA: And if I understand you right, the other key point you're making is,
18:22
we've been through these years where there's all this talk
18:25
of how quirky and buggy our minds are,
18:27
that behavioral economics and the whole theories behind that
18:29
that we're not rational agents.
18:31
You're really saying that the bigger story is how extraordinary,
18:35
and there really is genius there that is underappreciated.
18:40
LS: One of my favorite quotes in psychology
18:42
comes from the social psychologist Solomon Asch,
18:45
and he said the fundamental task of psychology is to remove
18:47
the veil of self-evidence from things.
18:50
There are orders of magnitude more decisions you make every day
18:55
that get the world right.
18:56
You know about objects and their properties.
18:58
You know them when they're occluded. You know them in the dark.
19:01
You can walk through rooms.
19:02
You can figure out what other people are thinking. You can talk to them.
19:06
You can navigate space. You know about numbers.
19:08
You know causal relationships. You know about moral reasoning.
19:11
You do this effortlessly, so we don't see it,
19:14
but that is how we get the world right, and it's a remarkable
19:16
and very difficult-to-understand accomplishment.
19:19
CA: I suspect there are people in the audience who have
19:21
this view of accelerating technological power
19:24
who might dispute your statement that never in our lifetimes
19:27
will a computer do what a three-year-old child can do,
19:29
but what's clear is that in any scenario,
19:32
our machines have so much to learn from our toddlers.
19:38
LS: I think so. You'll have some machine learning folks up here.
19:41
I mean, you should never bet against babies or chimpanzees
19:45
or technology as a matter of practice,
19:49
but it's not just a difference in quantity,
19:53
it's a difference in kind.
19:55
We have incredibly powerful computers,
19:57
and they do do amazingly sophisticated things,
20:00
often with very big amounts of data.
20:03
Human minds do, I think, something quite different,
20:05
and I think it's the structured, hierarchical nature of human knowledge
20:09
that remains a real challenge.
20:11
CA: Laura Schulz, wonderful food for thought. Thank you so much.
20:14
LS: Thank you. (Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7