Laura Schulz: The surprisingly logical minds of babies

225,846 views ・ 2015-06-02

TED



Translator: Kitty Lau ・ Reviewer: Jack Ricardo
00:12
Mark Twain summed up what I take to be one of the fundamental problems of cognitive science with a single witticism. He said, "There's something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment in fact." (Laughter)

00:32
Twain meant it as a joke, of course, but he's right: There's something fascinating about science. From a few bones, we infer the existence of dinosaurs. From spectral lines, the composition of nebulae. From fruit flies, the mechanisms of heredity, and from reconstructed images of blood flowing through the brain, or in my case, from the behavior of very young children, we try to say something about the fundamental mechanisms of human cognition.

01:07
In particular, in my lab in the Department of Brain and Cognitive Sciences at MIT, I have spent the past decade trying to understand the mystery of how children learn so much from so little so quickly. Because, it turns out that the fascinating thing about science is also a fascinating thing about children, which, to put a gentler spin on Mark Twain, is precisely their ability to draw rich, abstract inferences rapidly and accurately from sparse, noisy data.

01:40
I'm going to give you just two examples today. One is about a problem of generalization, and the other is about a problem of causal reasoning. And although I'm going to talk about work in my lab, this work is inspired by and indebted to a field. I'm grateful to mentors, colleagues, and collaborators around the world.

01:59
Let me start with the problem of generalization. Generalizing from small samples of data is the bread and butter of science. We poll a tiny fraction of the electorate and we predict the outcome of national elections. We see how a handful of patients responds to treatment in a clinical trial, and we bring drugs to a national market. But this only works if our sample is randomly drawn from the population. If our sample is cherry-picked in some way -- say, we poll only urban voters, or say, in our clinical trials for treatments for heart disease, we include only men -- the results may not generalize to the broader population. So scientists care whether evidence is randomly sampled or not, but what does that have to do with babies?
02:44
Well, babies have to generalize from small samples of data all the time. They see a few rubber ducks and learn that they float, or a few balls and learn that they bounce. And they develop expectations about ducks and balls that they're going to extend to rubber ducks and balls for the rest of their lives. And the kinds of generalizations babies have to make about ducks and balls they have to make about almost everything: shoes and ships and sealing wax and cabbages and kings.

03:14
So do babies care whether the tiny bit of evidence they see is plausibly representative of a larger population? Let's find out. I'm going to show you two movies, one from each of two conditions of an experiment, and because you're going to see just two movies, you're going to see just two babies, and any two babies differ from each other in innumerable ways. But these babies, of course, here stand in for groups of babies, and the differences you're going to see represent average group differences in babies' behavior across conditions.

03:47
In each movie, you're going to see a baby doing maybe just exactly what you might expect a baby to do, and we can hardly make babies more magical than they already are. But to my mind the magical thing, and what I want you to pay attention to, is the contrast between these two conditions, because the only thing that differs between these two movies is the statistical evidence the babies are going to observe.

04:13
We're going to show babies a box of blue and yellow balls, and my then-graduate student, now colleague at Stanford, Hyowon Gweon, is going to pull three blue balls in a row out of this box, and when she pulls those balls out, she's going to squeeze them, and the balls are going to squeak. And if you're a baby, that's like a TED Talk. It doesn't get better than that. (Laughter)

04:38
But the important point is it's really easy to pull three blue balls in a row out of a box of mostly blue balls. You could do that with your eyes closed. It's plausibly a random sample from this population. And if you can reach into a box at random and pull out things that squeak, then maybe everything in the box squeaks. So maybe babies should expect those yellow balls to squeak as well. Now, those yellow balls have funny sticks on the end, so babies could do other things with them if they wanted to. They could pound them or whack them. But let's see what the baby does.
05:12
(Video) Hyowon Gweon: See this? (Ball squeaks) Did you see that? (Ball squeaks) Cool. See this one? (Ball squeaks) Wow.

05:33
Laura Schulz: Told you. (Laughs)

05:35
(Video) HG: See this one? (Ball squeaks) Hey Clara, this one's for you. You can go ahead and play. (Laughter)

05:56
LS: I don't even have to talk, right?
05:59
All right, it's nice that babies will generalize properties of blue balls to yellow balls, and it's impressive that babies can learn from imitating us, but we've known those things about babies for a very long time. The really interesting question is what happens when we show babies exactly the same thing, and we can ensure it's exactly the same because we have a secret compartment and we actually pull the balls from there, but this time, all we change is the apparent population from which that evidence was drawn. This time, we're going to show babies three blue balls pulled out of a box of mostly yellow balls, and guess what? You [probably won't] randomly draw three blue balls in a row out of a box of mostly yellow balls. That is not plausibly randomly sampled evidence. That evidence suggests that maybe Hyowon was deliberately sampling the blue balls. Maybe there's something special about the blue balls. Maybe only the blue balls squeak. Let's see what the baby does.
06:57
(Video) HG: See this? (Ball squeaks) See this toy? (Ball squeaks) Oh, that was cool. See? (Ball squeaks) Now this one's for you to play. You can go ahead and play. (Fussing) (Laughter)
07:26
LS: So you just saw two 15-month-old babies do entirely different things based only on the probability of the sample they observed. Let me show you the experimental results. On the vertical axis, you'll see the percentage of babies who squeezed the ball in each condition, and as you'll see, babies are much more likely to generalize the evidence when it's plausibly representative of the population than when the evidence is clearly cherry-picked.

07:53
And this leads to a fun prediction: Suppose you pulled just one blue ball out of the mostly yellow box. You [probably won't] pull three blue balls in a row at random out of a yellow box, but you could randomly sample just one blue ball. That's not an improbable sample. And if you could reach into a box at random and pull out something that squeaks, maybe everything in the box squeaks. So even though babies are going to see much less evidence for squeaking, and have many fewer actions to imitate in this one ball condition than in the condition you just saw, we predicted that babies themselves would squeeze more, and that's exactly what we found.
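The sampling logic behind these conditions can be made concrete with a little arithmetic. The sketch below computes the chance of drawing an all-blue sequence at random without replacement; the box compositions (100 balls, 75 or 15 of them blue) are assumptions for illustration only, since the talk never gives exact counts:

```python
def p_all_blue(n_blue: int, n_total: int, draws: int) -> float:
    """Chance of pulling `draws` blue balls in a row when sampling
    uniformly at random without replacement from the box."""
    p = 1.0
    for i in range(draws):
        p *= (n_blue - i) / (n_total - i)
    return p

# Assumed compositions for illustration: 100 balls per box.
print(p_all_blue(75, 100, 3))   # mostly blue, 3 draws: ~0.42, plausible
print(p_all_blue(15, 100, 3))   # mostly yellow, 3 draws: ~0.003, suspicious
print(p_all_blue(15, 100, 1))   # mostly yellow, 1 draw: 0.15, unremarkable
```

Under these assumed numbers, three blue draws are roughly 150 times more likely from the mostly-blue box, which is the asymmetry the babies' behavior tracks, and a single blue draw from the mostly-yellow box is common enough to pass as random.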
08:32
So 15-month-old babies, in this respect, like scientists, care whether evidence is randomly sampled or not, and they use this to develop expectations about the world: what squeaks and what doesn't, what to explore and what to ignore.

08:50
Let me show you another example now, this time about a problem of causal reasoning. And it starts with a problem of confounded evidence that all of us have, which is that we are part of the world. And this might not seem like a problem to you, but like most problems, it's only a problem when things go wrong. Take this baby, for instance. Things are going wrong for him. He would like to make this toy go, and he can't. I'll show you a few-second clip.

09:21
And there's two possibilities, broadly: Maybe he's doing something wrong, or maybe there's something wrong with the toy. So in this next experiment, we're going to give babies just a tiny bit of statistical data supporting one hypothesis over the other, and we're going to see if babies can use that to make different decisions about what to do.

09:43
Here's the setup.
09:46
Hyowon is going to try to make the toy go and succeed. I am then going to try twice and fail both times, and then Hyowon is going to try again and succeed, and this roughly sums up my relationship to my graduate students in technology across the board.

10:02
But the important point here is it provides a little bit of evidence that the problem isn't with the toy, it's with the person. Some people can make this toy go, and some can't. Now, when the baby gets the toy, he's going to have a choice. His mom is right there, so he can go ahead and hand off the toy and change the person, but there's also going to be another toy at the end of that cloth, and he can pull the cloth towards him and change the toy. So let's see what the baby does.
10:30
(Video) HG: Two, three. Go! (Music)
LS: One, two, three, go! Arthur, I'm going to try again. One, two, three, go!
HG: Arthur, let me try again, okay? One, two, three, go! (Music) Look at that. Remember these toys? See these toys? Yeah, I'm going to put this one over here, and I'm going to give this one to you. You can go ahead and play.
11:23
LS: Okay, Laura, but of course, babies love their mommies. Of course babies give toys to their mommies when they can't make them work. So again, the really important question is what happens when we change the statistical data ever so slightly. This time, babies are going to see the toy work and fail in exactly the same order, but we're changing the distribution of evidence. This time, Hyowon is going to succeed once and fail once, and so am I. And this suggests it doesn't matter who tries this toy, the toy is broken. It doesn't work all the time. Again, the baby's going to have a choice. Her mom is right next to her, so she can change the person, and there's going to be another toy at the end of the cloth. Let's watch what she does.

12:07
(Video) HG: Two, three, go! (Music) Let me try one more time. One, two, three, go! Hmm.
LS: Let me try, Clara. One, two, three, go! Hmm, let me try again. One, two, three, go! (Music)
HG: I'm going to put this one over here, and I'm going to give this one to you. You can go ahead and play.

12:58
(Applause)

13:04
LS: Let me show you the experimental results. On the vertical axis, you'll see the distribution of children's choices in each condition, and you'll see that the distribution of the choices children make depends on the evidence they observe. So in the second year of life, babies can use a tiny bit of statistical data to decide between two fundamentally different strategies for acting in the world: asking for help and exploring.
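The contrast between the two conditions can be sketched as a likelihood comparison between the two hypotheses the talk names: the person is the problem, or the toy is. All the per-try success rates below are made-up numbers for illustration; nothing in the talk specifies them.

```python
def likelihood(outcomes, p_success):
    """Probability of an observed sequence of tries, treating each try
    as an independent success/failure with the given success rate."""
    p = 1.0
    for ok in outcomes:
        p *= p_success if ok else 1.0 - p_success
    return p

# Condition 1: Hyowon succeeds twice, Laura fails twice.
# "Person" hypothesis: a skilled user succeeds 90% of the time, an unskilled one 10%.
# "Toy" hypothesis: a flaky toy works half the time for everyone.
h_person = likelihood([True, True], 0.9) * likelihood([False, False], 0.1)
h_toy = likelihood([True, True, False, False], 0.5)
assert h_person > h_toy  # evidence favors "change the person": hand it to mom

# Condition 2: each person succeeds once and fails once.
h_person2 = likelihood([True, False], 0.9) * likelihood([True, False], 0.1)
h_toy2 = likelihood([True, False, True, False], 0.5)
assert h_toy2 > h_person2  # evidence favors "change the toy": pull the cloth
```

The two asserts mirror the babies' choices: failures confounded with one person favor handing the toy off, while failures spread evenly across people favor swapping the toy.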
13:33
I've just shown you two laboratory experiments out of literally hundreds in the field that make similar points, because the really critical point is that children's ability to make rich inferences from sparse data underlies all the species-specific cultural learning that we do. Children learn about new tools from just a few examples. They learn new causal relationships from just a few examples. They even learn new words, in this case in American Sign Language.

14:08
I want to close with just two points. If you've been following my world, the field of brain and cognitive sciences, for the past few years, three big ideas will have come to your attention. The first is that this is the era of the brain. And indeed, there have been staggering discoveries in neuroscience: localizing functionally specialized regions of cortex, turning mouse brains transparent, activating neurons with light. A second big idea is that this is the era of big data and machine learning, and machine learning promises to revolutionize our understanding of everything from social networks to epidemiology. And maybe, as it tackles problems of scene understanding and natural language processing, to tell us something about human cognition. And the final big idea you'll have heard is that maybe it's a good idea we're going to know so much about brains and have so much access to big data, because left to our own devices, humans are fallible, we take shortcuts, we err, we make mistakes, we're biased, and in innumerable ways, we get the world wrong.
15:24
I think these are all important stories, and they have a lot to tell us about what it means to be human, but I want you to note that today I told you a very different story. It's a story about minds and not brains, and in particular, it's a story about the kinds of computations that uniquely human minds can perform, which involve rich, structured knowledge and the ability to learn from small amounts of data, the evidence of just a few examples. And fundamentally, it's a story about how starting as very small children and continuing out all the way to the greatest accomplishments of our culture, we get the world right.

16:12
Folks, human minds do not only learn from small amounts of data. Human minds think of altogether new ideas. Human minds generate research and discovery, and human minds generate art and literature and poetry and theater, and human minds take care of other humans: our old, our young, our sick. We even heal them. In the years to come, we're going to see technological innovations beyond anything I can even envision, but we are very unlikely to see anything even approximating the computational power of a human child in my lifetime or in yours. If we invest in these most powerful learners and their development, in babies and children and mothers and fathers and caregivers and teachers the ways we invest in our other most powerful and elegant forms of technology, engineering and design, we will not just be dreaming of a better future, we will be planning for one. Thank you very much.

17:25
(Applause)
17:29
Chris Anderson: Laura, thank you. I do actually have a question for you. First of all, the research is insane. I mean, who would design an experiment like that? (Laughter) I've seen that a couple of times, and I still don't honestly believe that that can truly be happening, but other people have done similar experiments; it checks out. The babies really are that genius.

17:50
LS: You know, they look really impressive in our experiments, but think about what they look like in real life, right? It starts out as a baby. Eighteen months later, it's talking to you, and babies' first words aren't just things like balls and ducks, they're things like "all gone," which refer to disappearance, or "uh-oh," which refer to unintentional actions. It has to be that powerful. It has to be much more powerful than anything I showed you. They're figuring out the entire world. A four-year-old can talk to you about almost anything. (Applause)

18:19
CA: And if I understand you right, the other key point you're making is, we've been through these years where there's all this talk of how quirky and buggy our minds are, that behavioral economics and the whole theories behind that that we're not rational agents. You're really saying that the bigger story is how extraordinary, and there really is genius there that is underappreciated.

18:40
LS: One of my favorite quotes in psychology comes from the social psychologist Solomon Asch, and he said the fundamental task of psychology is to remove the veil of self-evidence from things. There are orders of magnitude more decisions you make every day that get the world right. You know about objects and their properties. You know them when they're occluded. You know them in the dark. You can walk through rooms. You can figure out what other people are thinking. You can talk to them. You can navigate space. You know about numbers. You know causal relationships. You know about moral reasoning. You do this effortlessly, so we don't see it, but that is how we get the world right, and it's a remarkable and very difficult-to-understand accomplishment.

19:19
CA: I suspect there are people in the audience who have this view of accelerating technological power who might dispute your statement that never in our lifetimes will a computer do what a three-year-old child can do, but what's clear is that in any scenario, our machines have so much to learn from our toddlers.

19:38
LS: I think so. You'll have some machine learning folks up here. I mean, you should never bet against babies or chimpanzees or technology as a matter of practice, but it's not just a difference in quantity, it's a difference in kind. We have incredibly powerful computers, and they do do amazingly sophisticated things, often with very big amounts of data. Human minds do, I think, something quite different, and I think it's the structured, hierarchical nature of human knowledge that remains a real challenge.

20:11
CA: Laura Schulz, wonderful food for thought. Thank you so much.
LS: Thank you. (Applause)