Peter Donnelly: How stats fool juries

2007-01-12

TED

Translator: Marie Wu / Reviewer: Wang-Ju Tsai
00:25
As other speakers have said, it's a rather daunting experience --
00:27
a particularly daunting experience -- to be speaking in front of this audience.
00:30
But unlike the other speakers, I'm not going to tell you about
00:33
the mysteries of the universe, or the wonders of evolution,
00:35
or the really clever, innovative ways people are attacking
00:39
the major inequalities in our world.
00:41
Or even the challenges of nation-states in the modern global economy.
00:46
My brief, as you've just heard, is to tell you about statistics --
00:50
and, to be more precise, to tell you some exciting things about statistics.
00:53
And that's --
00:54
(Laughter)
00:55
-- that's rather more challenging
00:57
than all the speakers before me and all the ones coming after me.
00:59
(Laughter)
01:01
One of my senior colleagues told me, when I was a youngster in this profession,
01:06
rather proudly, that statisticians were people who liked figures
01:10
but didn't have the personality skills to become accountants.
01:13
(Laughter)
01:15
And there's another in-joke among statisticians, and that's,
01:18
"How do you tell the introverted statistician from the extroverted statistician?"
01:21
To which the answer is,
01:23
"The extroverted statistician's the one who looks at the other person's shoes."
01:28
(Laughter)
01:31
But I want to tell you something useful -- and here it is, so concentrate now.
01:36
This evening, there's a reception in the University's Museum of Natural History.
01:39
And it's a wonderful setting, as I hope you'll find,
01:41
and a great icon to the best of the Victorian tradition.
01:46
It's very unlikely -- in this special setting, and this collection of people --
01:51
but you might just find yourself talking to someone you'd rather wish that you weren't.
01:54
So here's what you do.
01:56
When they say to you, "What do you do?" -- you say, "I'm a statistician."
02:00
(Laughter)
02:01
Well, except they've been pre-warned now, and they'll know you're making it up.
02:05
And then one of two things will happen.
02:07
They'll either discover their long-lost cousin in the other corner of the room
02:09
and run over and talk to them.
02:11
Or they'll suddenly become parched and/or hungry -- and often both --
02:14
and sprint off for a drink and some food.
02:16
And you'll be left in peace to talk to the person you really want to talk to.
02:20
It's one of the challenges in our profession to try and explain what we do.
02:23
We're not top on people's lists for dinner party guests and conversations and so on.
02:28
And it's something I've never really found a good way of doing.
02:30
But my wife -- who was then my girlfriend --
02:33
managed it much better than I've ever been able to.
02:36
Many years ago, when we first started going out, she was working for the BBC in Britain,
02:39
and I was, at that stage, working in America.
02:41
I was coming back to visit her.
02:43
She told this to one of her colleagues, who said, "Well, what does your boyfriend do?"
02:49
Sarah thought quite hard about the things I'd explained --
02:51
and she concentrated, in those days, on listening.
02:55
(Laughter)
02:58
Don't tell her I said that.
03:00
And she was thinking about the work I did developing mathematical models
03:04
for understanding evolution and modern genetics.
03:07
So when her colleague said, "What does he do?"
03:10
She paused and said, "He models things."
03:14
(Laughter)
03:15
Well, her colleague suddenly got much more interested than I had any right to expect
03:19
and went on and said, "What does he model?"
03:22
Well, Sarah thought a little bit more about my work and said, "Genes."
03:25
(Laughter)
03:29
"He models genes."
03:31
That is my first love, and that's what I'll tell you a little bit about.
03:35
What I want to do more generally is to get you thinking about
03:39
the place of uncertainty and randomness and chance in our world,
03:42
and how we react to that, and how well we do or don't think about it.
03:47
So you've had a pretty easy time up till now --
03:49
a few laughs, and all that kind of thing -- in the talks to date.
03:51
You've got to think, and I'm going to ask you some questions.
03:54
So here's the scene for the first question I'm going to ask you.
03:56
Can you imagine tossing a coin successively?
03:59
And for some reason -- which shall remain rather vague --
04:02
we're interested in a particular pattern.
04:04
Here's one -- a head, followed by a tail, followed by a tail.
04:07
So suppose we toss a coin repeatedly.
04:10
Then the pattern, head-tail-tail, that we've suddenly become fixated with happens here.
04:15
And you can count: one, two, three, four, five, six, seven, eight, nine, 10 --
04:19
it happens after the 10th toss.
04:21
So you might think there are more interesting things to do, but humor me for the moment.
04:24
Imagine this half of the audience each get out coins, and they toss them
04:28
until they first see the pattern head-tail-tail.
04:31
The first time they do it, maybe it happens after the 10th toss, as here.
04:33
The second time, maybe it's after the fourth toss.
04:35
The next time, after the 15th toss.
04:37
So you do that lots and lots of times, and you average those numbers.
04:40
That's what I want this side to think about.
04:43
The other half of the audience doesn't like head-tail-tail --
04:45
they think, for deep cultural reasons, that's boring --
04:48
and they're much more interested in a different pattern -- head-tail-head.
04:51
So, on this side, you get out your coins, and you toss and toss and toss.
04:54
And you count the number of times until the pattern head-tail-head appears
04:57
and you average them. OK?
05:00
So on this side, you've got a number --
05:02
you've done it lots of times, so you get it accurately --
05:04
which is the average number of tosses until head-tail-tail.
05:07
On this side, you've got a number -- the average number of tosses until head-tail-head.
05:11
So here's a deep mathematical fact --
05:13
if you've got two numbers, one of three things must be true.
05:16
Either they're the same, or this one's bigger than this one,
05:19
or this one's bigger than that one.
05:20
So what's going on here?
05:23
So you've all got to think about this, and you've all got to vote --
05:25
and we're not moving on.
05:26
And I don't want to end up in the two-minute silence
05:28
to give you more time to think about it, until everyone's expressed a view. OK.
05:32
So what you want to do is compare the average number of tosses until we first see
05:36
head-tail-head with the average number of tosses until we first see head-tail-tail.
05:41
Who thinks that A is true --
05:43
that, on average, it'll take longer to see head-tail-head than head-tail-tail?
05:47
Who thinks that B is true -- that on average, they're the same?
05:51
Who thinks that C is true -- that, on average, it'll take less time
05:53
to see head-tail-head than head-tail-tail?
05:57
OK, who hasn't voted yet? Because that's really naughty -- I said you had to.
06:00
(Laughter)
06:02
OK. So most people think B is true.
06:05
And you might be relieved to know even rather distinguished mathematicians think that.
06:08
It's not. A is true here.
06:12
It takes longer, on average.
06:14
In fact, the average number of tosses till head-tail-head is 10
06:16
and the average number of tosses until head-tail-tail is eight.
06:21
How could that be?
06:24
Anything different about the two patterns?
06:30
There is. Head-tail-head overlaps itself.
06:35
If you went head-tail-head-tail-head, you can cunningly get two occurrences
06:39
of the pattern in only five tosses.
06:42
You can't do that with head-tail-tail.
06:44
That turns out to be important.
06:46
There are two ways of thinking about this.
06:48
I'll give you one of them.
06:50
So imagine -- let's suppose we're doing it.
06:52
On this side -- remember, you're excited about head-tail-tail;
06:54
you're excited about head-tail-head.
06:56
We start tossing a coin, and we get a head --
06:59
and you start sitting on the edge of your seat
07:00
because something great and wonderful, or awesome, might be about to happen.
07:05
The next toss is a tail -- you get really excited.
07:07
The champagne's on ice just next to you; you've got the glasses chilled to celebrate.
07:11
You're waiting with bated breath for the final toss.
07:13
And if it comes down a head, that's great.
07:15
You're done, and you celebrate.
07:17
If it's a tail -- well, rather disappointedly, you put the glasses away
07:19
and put the champagne back.
07:21
And you keep tossing, to wait for the next head, to get excited.
07:25
On this side, there's a different experience.
07:27
It's the same for the first two parts of the sequence.
07:30
You're a little bit excited with the first head --
07:32
you get rather more excited with the next tail.
07:34
Then you toss the coin.
07:36
If it's a tail, you crack open the champagne.
07:39
If it's a head you're disappointed,
07:41
but you're still a third of the way to your pattern again.
07:44
And that's an informal way of presenting it -- that's why there's a difference.
07:48
Another way of thinking about it --
07:50
if we tossed a coin eight million times,
07:52
then we'd expect a million head-tail-heads
07:54
and a million head-tail-tails -- but the head-tail-heads could occur in clumps.
08:01
So if you want to put a million things down amongst eight million positions
08:03
and you can have some of them overlapping, the clumps will be further apart.
08:08
It's another way of getting the intuition.
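The 10-versus-8 claim is easy to check numerically. Here is a minimal Monte Carlo sketch (the function names are mine, not from the talk): toss a fair coin until the pattern first appears, and average the waiting times over many trials.

```python
import random

def tosses_until(pattern, rng):
    # Toss a fair coin until `pattern` (e.g. "HTH") first appears;
    # return how many tosses that took.
    seq = ""
    while not seq.endswith(pattern):
        seq += rng.choice("HT")
    return len(seq)

def average_wait(pattern, trials=100_000, seed=0):
    rng = random.Random(seed)
    return sum(tosses_until(pattern, rng) for _ in range(trials)) / trials

print(average_wait("HTH"))  # converges to 10
print(average_wait("HTT"))  # converges to 8
```

The asymmetry the talk describes shows up directly: head-tail-head overlaps itself, so its occurrences clump and its average wait is longer.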
08:10
What's the point I want to make?
08:12
It's a very, very simple example, an easily stated question in probability,
08:16
which every -- you're in good company -- everybody gets wrong.
08:19
This is my little diversion into my real passion, which is genetics.
08:23
There's a connection between head-tail-heads and head-tail-tails in genetics,
08:26
and it's the following.
08:29
When you toss a coin, you get a sequence of heads and tails.
08:32
When you look at DNA, there's a sequence of not two things -- heads and tails --
08:35
but four letters -- As, Gs, Cs and Ts.
08:38
And there are little chemical scissors, called restriction enzymes
08:41
which cut DNA whenever they see particular patterns.
08:43
And they're an enormously useful tool in modern molecular biology.
08:48
And instead of asking the question, "How long until I see a head-tail-head?" --
08:51
you can ask, "How big will the chunks be when I use a restriction enzyme
08:54
which cuts whenever it sees G-A-A-G, for example?
08:58
How long will those chunks be?"
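That chunk-length question can also be simulated. The sketch below uses a deliberately simplified model that is not in the talk -- each base drawn independently and uniformly from A, C, G, T (real genomes are not uniform) -- to show it is the same waiting-time problem as the coin:

```python
import random

def gaag_fragment_lengths(n_bases=2_000_000, site="GAAG", seed=1):
    # Random DNA under a uniform model; cut after every occurrence
    # of `site` and collect the resulting fragment lengths.
    rng = random.Random(seed)
    dna = "".join(rng.choice("ACGT") for _ in range(n_bases))
    lengths, start = [], 0
    i = dna.find(site)
    while i != -1:
        lengths.append(i + len(site) - start)
        start = i + len(site)
        i = dna.find(site, start)
    return lengths

lengths = gaag_fragment_lengths()
# Mean fragment length comes out near 260: 4**4 = 256 plus a small
# correction, because G-A-A-G overlaps itself (its last G can start
# the next occurrence), just as head-tail-head does.
print(sum(lengths) / len(lengths))
```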
09:00
That's a rather trivial connection between probability and genetics.
09:05
There's a much deeper connection, which I don't have time to go into
09:08
and that is that modern genetics is a really exciting area of science.
09:11
And we'll hear some talks later in the conference specifically about that.
09:15
But it turns out that unlocking the secrets in the information generated by modern
09:19
experimental technologies, a key part of that has to do with fairly sophisticated --
09:24
you'll be relieved to know that I do something useful in my day job,
09:27
rather more sophisticated than the head-tail-head story --
09:29
but quite sophisticated computer modelings and mathematical modelings
09:33
and modern statistical techniques.
09:35
And I will give you two little snippets -- two examples --
09:38
of projects we're involved in in my group in Oxford,
09:41
both of which I think are rather exciting.
09:43
You know about the Human Genome Project.
09:45
That was a project which aimed to read one copy of the human genome.
09:51
The natural thing to do after you've done that --
09:53
and that's what this project, the International HapMap Project,
09:55
which is a collaboration between labs in five or six different countries.
10:00
Think of the Human Genome Project as learning what we've got in common,
10:04
and the HapMap Project is trying to understand
10:06
where there are differences between different people.
10:08
Why do we care about that?
10:10
Well, there are lots of reasons.
10:12
The most pressing one is that we want to understand how some differences
10:16
make some people susceptible to one disease -- type-2 diabetes, for example --
10:20
and other differences make people more susceptible to heart disease,
10:25
or stroke, or autism and so on.
10:27
That's one big project.
10:29
There's a second big project,
10:31
recently funded by the Wellcome Trust in this country,
10:33
involving very large studies --
10:35
thousands of individuals, with each of eight different diseases,
10:38
common diseases like type-1 and type-2 diabetes, and coronary heart disease,
10:42
bipolar disease and so on -- to try and understand the genetics.
10:46
To try and understand what it is about genetic differences that causes the diseases.
10:49
Why do we want to do that?
10:51
Because we understand very little about most human diseases.
10:54
We don't know what causes them.
10:56
And if we can get in at the bottom and understand the genetics,
10:58
we'll have a window on the way the disease works,
11:01
and a whole new way about thinking about disease therapies
11:03
and preventative treatment and so on.
11:06
So that's, as I said, the little diversion on my main love.
11:09
Back to some of the more mundane issues of thinking about uncertainty.
11:14
Here's another quiz for you --
11:16
now suppose we've got a test for a disease
11:18
which isn't infallible, but it's pretty good.
11:20
It gets it right 99 percent of the time.
11:23
And I take one of you, or I take someone off the street,
11:26
and I test them for the disease in question.
11:28
Let's suppose there's a test for HIV -- the virus that causes AIDS --
11:32
and the test says the person has the disease.
11:35
What's the chance that they do?
11:38
The test gets it right 99 percent of the time.
11:40
So a natural answer is 99 percent.
11:44
Who likes that answer?
11:46
Come on -- everyone's got to get involved.
11:47
Don't think you don't trust me anymore.
11:49
(Laughter)
11:50
Well, you're right to be a bit skeptical, because that's not the answer.
11:53
That's what you might think.
11:55
It's not the answer, and it's not because it's only part of the story.
11:58
It actually depends on how common or how rare the disease is.
12:01
So let me try and illustrate that.
12:03
Here's a little caricature of a million individuals.
12:07
So let's think about a disease that affects --
12:10
it's pretty rare, it affects one person in 10,000.
12:12
Amongst these million individuals, most of them are healthy
12:15
and some of them will have the disease.
12:17
And in fact, if this is the prevalence of the disease,
12:20
about 100 will have the disease and the rest won't.
12:23
So now suppose we test them all.
12:25
What happens?
12:27
Well, amongst the 100 who do have the disease,
12:29
the test will get it right 99 percent of the time, and 99 will test positive.
12:34
Amongst all these other people who don't have the disease,
12:36
the test will get it right 99 percent of the time.
12:39
It'll only get it wrong one percent of the time.
12:41
But there are so many of them that there'll be an enormous number of false positives.
12:45
Put that another way --
12:47
of all of them who test positive -- so here they are, the individuals involved --
12:52
less than one in 100 actually have the disease.
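The arithmetic behind that "less than one in 100" is short enough to write out, using the same numbers as the million-person picture:

```python
# A million people, prevalence 1 in 10,000, and a test that is
# right 99 percent of the time.
population = 1_000_000
sick = population // 10_000                  # 100 people have the disease
healthy = population - sick                  # 999,900 do not

true_positives = 0.99 * sick                 # 99 sick people test positive
false_positives = 0.01 * healthy             # 9,999 healthy people also test positive

p_sick_given_positive = true_positives / (true_positives + false_positives)
print(p_sick_given_positive)                 # about 0.0098 -- under 1 in 100
```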
12:57
So even though we think the test is accurate, the important part of the story is
13:01
there's another bit of information we need.
13:04
Here's the key intuition.
13:07
What we have to do, once we know the test is positive,
13:10
is to weigh up the plausibility, or the likelihood, of two competing explanations.
13:16
Each of those explanations has a likely bit and an unlikely bit.
13:19
One explanation is that the person doesn't have the disease --
13:22
that's overwhelmingly likely, if you pick someone at random --
13:25
but the test gets it wrong, which is unlikely.
13:29
The other explanation is that the person does have the disease -- that's unlikely --
13:32
but the test gets it right, which is likely.
13:35
And the number we end up with --
13:37
that number which is a little bit less than one in 100 --
13:40
is to do with how likely one of those explanations is relative to the other.
13:46
Each of them taken together is unlikely.
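The weighing of the two competing explanations can be written as an odds comparison. A small sketch, using the same prevalence and accuracy figures as the earlier quiz:

```python
# Explanation 1: the person is healthy (very likely, a priori) but the
# test errs (unlikely).
# Explanation 2: the person is sick (unlikely, a priori) and the test
# is right (likely).
healthy_but_test_wrong = (9_999 / 10_000) * (1 / 100)
sick_and_test_right = (1 / 10_000) * (99 / 100)

odds = sick_and_test_right / healthy_but_test_wrong
print(odds)   # about 0.0099 -- roughly 100-to-1 against actually being sick
```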
13:49
Here's a more topical example of exactly the same thing.
13:52
Those of you in Britain will know about what's become rather a celebrated case
13:56
of a woman called Sally Clark, who had two babies who died suddenly.
14:01
And initially, it was thought that they died of what's known informally as "cot death,"
14:05
and more formally as "Sudden Infant Death Syndrome."
14:08
For various reasons, she was later charged with murder.
14:10
And at the trial, her trial, a very distinguished pediatrician gave evidence
14:14
that the chance of two cot deaths, innocent deaths, in a family like hers --
14:19
which was professional and non-smoking -- was one in 73 million.
14:26
To cut a long story short, she was convicted at the time.
14:29
Later, and fairly recently, acquitted on appeal -- in fact, on the second appeal.
14:34
And just to set it in context, you can imagine how awful it is for someone
14:38
to have lost one child, and then two, if they're innocent,
14:41
to be convicted of murdering them.
14:43
To be put through the stress of the trial, convicted of murdering them --
14:45
and to spend time in a women's prison, where all the other prisoners
14:48
think you killed your children -- is a really awful thing to happen to someone.
14:53
And it happened in large part here because the expert got the statistics
14:58
horribly wrong, in two different ways.
15:01
So where did he get the one in 73 million number?
15:05
He looked at some research, which said the chance of one cot death in a family
15:08
like Sally Clark's is about one in 8,500.
15:13
So he said, "I'll assume that if you have one cot death in a family,
15:17
the chance of a second child dying from cot death aren't changed."
15:21
So that's what statisticians would call an assumption of independence.
15:24
It's like saying, "If you toss a coin and get a head the first time,
15:26
that won't affect the chance of getting a head the second time."
15:29
So if you toss a coin twice, the chance of getting a head twice are a half --
15:34
that's the chance the first time -- times a half -- the chance a second time.
15:37
So he said, "Here,
15:39
I'll assume that these events are independent.
15:43
When you multiply 8,500 together twice,
15:45
you get about 73 million."
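The expert's independence calculation, and its sensitivity to that assumption, fits in a few lines. The tenfold dependence factor below is purely hypothetical -- my illustration, not a figure from the case -- chosen only to show how fast the headline number moves:

```python
p_first = 1 / 8_500                       # published rate of one cot death
p_both_independent = p_first ** 2         # the expert's assumption
print(round(1 / p_both_independent))      # 72,250,000 -- "about 73 million"

# If a first cot death flags unknown genetic or environmental risk,
# a second is more likely than the first. A hypothetical tenfold
# increase already shifts the figure by an order of magnitude:
p_second_given_first = 10 * p_first
print(round(1 / (p_first * p_second_given_first)))  # 7,225,000
```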
15:47
And none of this was stated to the court as an assumption
305
947000
2000
但是這個前題假設並沒有在法庭上說明,
15:49
or presented to the jury that way.
306
949000
2000
也沒有對陪審團說明。
15:52
Unfortunately here -- and, really, regrettably --
307
952000
3000
很不幸也很遺憾的是,
15:55
first of all, in a situation like this you'd have to verify it empirically.
308
955000
4000
首先,像這種情形就該憑經驗先進行驗證,
15:59
And secondly, it's palpably false.
309
959000
2000
第二,這很明顯就是錯的。
16:02
There are lots and lots of things that we don't know about sudden infant deaths.
310
962000
5000
我們對於嬰兒猝死症所知真的不多,
16:07
It might well be that there are environmental factors that we're not aware of,
311
967000
3000
有可能是因為某些我們並不瞭解的環境因素所造成,
16:10
and it's pretty likely to be the case that there are
312
970000
2000
而這個個案更有可能是因為
16:12
genetic factors we're not aware of.
313
972000
2000
我們所不知道的基因缺陷所造成,
16:14
So if a family suffers from one cot death, you'd put them in a high-risk group.
314
974000
3000
所以當某個家庭裡有一個嬰孩猝死時,他們就算是高風險的家庭,
16:17
They've probably got these environmental risk factors
315
977000
2000
有可能存在著某些環境風險因子,
16:19
and/or genetic risk factors we don't know about.
316
979000
3000
或是有我們不知道的基因缺陷,或是二者都有。
16:22
And to argue, then, that the chance of a second death is as if you didn't know
317
982000
3000
真要計較起來,若完全不考慮這些因素,
16:25
that information is really silly.
318
985000
3000
就來計算第二個嬰孩的猝死機率,是很可笑的。
16:28
It's worse than silly -- it's really bad science.
319
988000
4000
甚至比可笑還糟,簡直就是爛透了的科學證據。
16:32
Nonetheless, that's how it was presented, and at trial nobody even argued it.
320
992000
5000
但這個數據就這樣被當成呈堂證供,法庭上也沒有人懷疑,
16:37
That's the first problem.
321
997000
2000
這就是第一個問題。
16:39
The second problem is, what does the number of one in 73 million mean?
322
999000
4000
第二個問題是,7千3百萬分之一代表著什麼?
16:43
So after Sally Clark was convicted --
323
1003000
2000
當莎拉.克拉克被定罪之後,
16:45
you can imagine, it made rather a splash in the press --
324
1005000
4000
你可以想見又在媒體上掀起了多大的波瀾,
16:49
one of the journalists from one of Britain's more reputable newspapers wrote that
325
1009000
7000
英國某家聲譽卓著的報社記者
16:56
what the expert had said was,
326
1016000
2000
就引用專家的話說:
16:58
"The chance that she was innocent was one in 73 million."
327
1018000
5000
「莎拉清白的機率是7千3百萬之一」
17:03
Now, that's a logical error.
328
1023000
2000
這犯了邏輯上的錯誤,
17:05
It's exactly the same logical error as the logical error of thinking that
329
1025000
3000
這個錯誤就和我們剛才所談到的疾病測試一樣,
17:08
after the disease test, which is 99 percent accurate,
330
1028000
2000
同樣具有邏輯上的錯誤,有人會以為試劑有99%的準確度,
17:10
the chance of having the disease is 99 percent.
331
1030000
4000
得到這種疾病的機率就是99%。
17:14
In the disease example, we had to bear in mind two things,
332
1034000
4000
在疾病試劑的例子裡,我們得記住二件事,
17:18
one of which was the possibility that the test got it right or not.
333
1038000
4000
其中之一是試劑的準確度,
17:22
And the other one was the chance, a priori, that the person had the disease or not.
334
1042000
4000
另一個則是人們染病的先驗機率。
17:26
It's exactly the same in this context.
335
1046000
3000
這和這個案子是一樣的情形,
17:29
There are two things involved -- two parts to the explanation.
336
1049000
4000
這個案子也有二種解釋的方向,
17:33
We want to know how likely, or relatively how likely, two different explanations are.
337
1053000
4000
我們得釐清這二種解釋發生的機率。
17:37
One of them is that Sally Clark was innocent --
338
1057000
3000
第一種解釋是莎拉是清白的,
17:40
which is, a priori, overwhelmingly likely --
339
1060000
2000
這在先驗機率上是很有可能的,
17:42
most mothers don't kill their children.
340
1062000
3000
大部分的母親都不會殺害自己的小孩。
17:45
And the second part of the explanation
341
1065000
2000
這種解釋的第二個部分是,
17:47
is that she suffered an incredibly unlikely event.
342
1067000
3000
莎拉的遭遇真的是令人難以置信,
17:50
Not as unlikely as one in 73 million, but nonetheless rather unlikely.
343
1070000
4000
雖然機率不像7千3百萬分之一那麼小,但確實是不太可能。
17:54
The other explanation is that she was guilty.
344
1074000
2000
第二種解釋是莎拉確實是有罪的,
17:56
Now, we probably think a priori that's unlikely.
345
1076000
2000
就先驗機率來說,這不太可能,
17:58
And we certainly should think in the context of a criminal trial
346
1078000
3000
而且我們當然認為在這起犯罪的審判中,
18:01
that that's unlikely, because of the presumption of innocence.
347
1081000
3000
一開始就要假設被告是無罪的,所以說莎拉有罪並不太可能。
18:04
And then if she were trying to kill the children, she succeeded.
348
1084000
4000
但若她真的想要殺害小孩,她也成功了,
18:08
So the chance that she's innocent isn't one in 73 million.
349
1088000
4000
所以她是清白的機率就不是7千3百萬分之一,
18:12
We don't know what it is.
350
1092000
2000
沒人知道是多少,
18:14
It has to do with weighing up the strength of the other evidence against her
351
1094000
4000
這個機率反而是和其他對她不利的證據和統計數據有關,
18:18
and the statistical evidence.
352
1098000
2000
得視證據強度而定。
18:20
We know the children died.
353
1100000
2000
我們只知道嬰孩死了,
18:22
What matters is how likely or unlikely, relative to each other,
354
1102000
4000
重要的是要找出這二種解釋
18:26
the two explanations are.
355
1106000
2000
彼此相對之下的可能性有多大。
18:28
And they're both implausible.
356
1108000
2000
這二種解釋都無法使人信服,
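The weighing-up described above can be sketched as prior times likelihood for each explanation, then normalizing. All numbers below are hypothetical placeholders for illustration, not the real case figures:

```python
# Hypothetical, illustrative numbers only -- not actual case statistics.
prior_innocent = 0.999999          # most mothers don't kill their children
prior_guilty = 1 - prior_innocent

# Assumed chance of observing two infant deaths under each explanation:
p_evidence_if_innocent = 1 / 500_000   # a rare but possible double natural death
p_evidence_if_guilty = 0.9             # "if she were trying, she succeeded"

num_innocent = prior_innocent * p_evidence_if_innocent
num_guilty = prior_guilty * p_evidence_if_guilty
posterior_innocent = num_innocent / (num_innocent + num_guilty)
print(f"{posterior_innocent:.3f}")
```

Even with these made-up inputs, the posterior chance of innocence comes out nowhere near one in 73 million: both explanations start out implausible, and only their relative plausibility matters.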
18:31
There's a situation where errors in statistics had really profound
357
1111000
4000
有時統計上的錯誤所造成的影響,
18:35
and really unfortunate consequences.
358
1115000
3000
是很深遠且會造成不幸的。
18:38
In fact, there are two other women who were convicted on the basis of the
359
1118000
2000
事實上，還有二位婦女因為這位小兒科醫生的證詞，
18:40
evidence of this pediatrician, who have subsequently been released on appeal.
360
1120000
4000
而被判有罪,但在後來的上訴後又被無罪釋放。
18:44
Many cases were reviewed.
361
1124000
2000
以往許多案子又被大家拿出來討論,
18:46
And it's particularly topical because he's currently facing a disrepute charge
362
1126000
4000
因此又掀起一波話題，因為這位醫生目前正被英國醫學總會
18:50
at Britain's General Medical Council.
363
1130000
3000
控以不名譽的罪名。
18:53
So just to conclude -- what are the take-home messages from this?
364
1133000
4000
結論是,這個故事帶給我們什麼樣的啟示?
18:57
Well, we know that randomness and uncertainty and chance
365
1137000
4000
我們知道隨機、不確定性及機率等,
19:01
are very much a part of our everyday life.
366
1141000
3000
都是我們日常生活的一部分,
19:04
It's also true -- and, although, you, as a collective, are very special in many ways,
367
1144000
5000
而雖然我們每一個人都與眾不同,
19:09
you're completely typical in not getting the examples I gave right.
368
1149000
4000
但就未能正確回答我所提出的問題這一點而言，各位的表現卻十分典型。
19:13
It's very well documented that people get things wrong.
369
1153000
3000
很多過去的記錄顯示人們確實有時會做出錯誤判斷。
19:16
They make errors of logic in reasoning with uncertainty.
370
1156000
3000
在不確定的情況下進行推理時，人們會犯下邏輯上的錯誤。
19:20
We can cope with the subtleties of language brilliantly --
371
1160000
2000
我們能出色地掌握語言的各種精妙之處，
19:22
and there are interesting evolutionary questions about how we got here.
372
1162000
3000
關於我們如何演化出這種能力，也有許多有趣的問題，
19:25
We are not good at reasoning with uncertainty.
373
1165000
3000
但我們就是不擅長在不確定的情況下推理，
19:28
That's an issue in our everyday lives.
374
1168000
2000
這是我們每天都必須面對的問題。
19:30
As you've heard from many of the talks, statistics underpins an enormous amount
375
1170000
3000
如同其他講者所提到的,統計學是其他許多科學研究的基礎,
19:33
of research in science -- in social science, in medicine
376
1173000
3000
不管是社會科學還是醫學都一樣,
19:36
and indeed, quite a lot of industry.
377
1176000
2000
還包括大部分的工業,
19:38
All of quality control, which has had a major impact on industrial processing,
378
1178000
4000
那些品質控制理論,對於工業流程管制具有重大的影響,
19:42
is underpinned by statistics.
379
1182000
2000
都是靠統計學做基礎。
19:44
It's something we're bad at doing.
380
1184000
2000
但這卻是我們所不擅長的事,
19:46
At the very least, we should recognize that, and we tend not to.
381
1186000
3000
至少我們該承認這一點,但我們卻沒人願意承認。
19:49
To go back to the legal context, at the Sally Clark trial
382
1189000
4000
回到法律層面,回到莎拉的案子上,
19:53
all of the lawyers just accepted what the expert said.
383
1193000
4000
所有的律師都接受這位專家的說法,
19:57
So if a pediatrician had come out and said to a jury,
384
1197000
2000
所以如果有一位小兒科醫生站出來對陪審團說,
19:59
"I know how to build bridges. I've built one down the road.
385
1199000
3000
「我知道如何建造橋樑,我已經在這條路上蓋了一座橋,
20:02
Please drive your car home over it,"
386
1202000
2000
請把你的車開上橋回家吧!」
20:04
they would have said, "Well, pediatricians don't know how to build bridges.
387
1204000
2000
陪審團會說:「小兒科醫生不是建造橋樑的專家,
20:06
That's what engineers do."
388
1206000
2000
這是工程師該做的事。」
20:08
On the other hand, he came out and effectively said, or implied,
389
1208000
3000
而在另一方面,這位醫師卻站出來發表專業意見,甚至暗示:
20:11
"I know how to reason with uncertainty. I know how to do statistics."
390
1211000
3000
「我知道如何解釋不確定性,我瞭解統計方法。」
20:14
And everyone said, "Well, that's fine. He's an expert."
391
1214000
3000
然後大家附和:「對,他是專家。」
20:17
So we need to understand where our competence is and isn't.
392
1217000
3000
我們必須瞭解自己的專業能力何者能及、何者不及。
20:20
Exactly the same kinds of issues arose in the early days of DNA profiling,
393
1220000
4000
早期DNA鑑定技術剛出現時，也曾引發完全相同的問題，
20:24
when scientists, and lawyers and in some cases judges,
394
1224000
4000
有些科學家、律師,或甚至法官,
20:28
routinely misrepresented evidence.
395
1228000
3000
都曾不斷地錯誤解讀他們所看到的證據。
20:32
Usually -- one hopes -- innocently, but misrepresented evidence.
396
1232000
3000
他們通常不是故意的,我們也衷心希望不是,但卻還是扭曲了證據的本質。
20:35
Forensic scientists said, "The chance that this guy's innocent is one in three million."
397
1235000
5000
鑑識專家說:「這傢伙清白的機率是三百萬分之一。」
20:40
Even if you believe the number, just like the 73 million to one,
398
1240000
2000
即使各位相信這個數據,就像先前提到的7千3百萬分之一那樣,
20:42
that's not what it meant.
399
1242000
2000
但這數據的意義並非如此,
20:44
And there have been celebrated appeal cases
400
1244000
2000
在英國和其他地方,
20:46
in Britain and elsewhere because of that.
401
1246000
2000
都有因為誤解數據而誤判的有名案例。
20:48
And just to finish in the context of the legal system.
402
1248000
3000
再讓我們回過頭來看看我們的法庭,
20:51
It's all very well to say, "Let's do our best to present the evidence."
403
1251000
4000
你大可以說:「我們得盡力將證據的原貌呈現出來。」
20:55
But more and more, in cases of DNA profiling -- this is another one --
404
1255000
3000
但是在DNA鑑定的案例裡，一次又一次我們看到，這是另一個例子，
20:58
we expect juries, who are ordinary people --
405
1258000
3000
我們期望陪審團這些一般大眾,
21:01
and it's documented they're very bad at this --
406
1261000
2000
這些本來就對統計不甚在行的大眾，
21:03
we expect juries to be able to cope with the sorts of reasoning that goes on.
407
1263000
4000
我們竟然期望他們能解讀這些統計數據。
21:07
In other spheres of life, if people argued -- well, except possibly for politics --
408
1267000
5000
但在現實生活裡,如果有人爭論...嗯,除了政治話題之外,
21:12
but in other spheres of life, if people argued illogically,
409
1272000
2000
在現實生活裡,如果有人不合邏輯地爭論,
21:14
we'd say that's not a good thing.
410
1274000
2000
我們會說這樣做不好,
21:16
We sort of expect it of politicians and don't hope for much more.
411
1276000
4000
我們會認為這是政客做的事,因為我們對政客沒什麽太大的期望。
21:20
In the case of uncertainty, we get it wrong all the time --
412
1280000
3000
在面對不確定的事情時,我們總是犯錯,
21:23
and at the very least, we should be aware of that,
413
1283000
2000
但是至少我們應該知道我們會犯錯。
21:25
and ideally, we might try and do something about it.
414
1285000
2000
並希望我們能嘗試去減少錯誤的發生。
21:27
Thanks very much.
415
1287000
1000
謝謝各位!