What to trust in a "post-truth" world | Alex Edmans

148,811 views ・ 2018-12-03

TED



Translator: Lilian Chiu · Reviewer: SF Huang
00:13 Belle Gibson was a happy young Australian. She lived in Perth, and she loved skateboarding. But in 2009, Belle learned that she had brain cancer and four months to live. Two months of chemo and radiotherapy had no effect. But Belle was determined. She'd been a fighter her whole life. From age six, she had to cook for her brother, who had autism, and her mother, who had multiple sclerosis. Her father was out of the picture. So Belle fought, with exercise, with meditation and by ditching meat for fruit and vegetables. And she made a complete recovery.

00:50 Belle's story went viral. It was tweeted, blogged about, shared and reached millions of people. It showed the benefits of shunning traditional medicine for diet and exercise. In August 2013, Belle launched a healthy eating app, The Whole Pantry, downloaded 200,000 times in the first month.

01:13 But Belle's story was a lie. Belle never had cancer. People shared her story without ever checking if it was true.

01:24 This is a classic example of confirmation bias. We accept a story uncritically if it confirms what we'd like to be true. And we reject any story that contradicts it. How often do we see this in the stories that we share and we ignore? In politics, in business, in health advice.

01:47 The Oxford Dictionary's word of 2016 was "post-truth." And the recognition that we now live in a post-truth world has led to a much needed emphasis on checking the facts. But the punch line of my talk is that just checking the facts is not enough. Even if Belle's story were true, it would be just as irrelevant.

02:10 Why? Well, let's look at one of the most fundamental techniques in statistics. It's called Bayesian inference. And the very simple version is this: We care about "does the data support the theory?" Does the data increase our belief that the theory is true? But instead, we end up asking, "Is the data consistent with the theory?" But being consistent with the theory does not mean that the data supports the theory. Why? Because of a crucial but forgotten third term -- the data could also be consistent with rival theories. But due to confirmation bias, we never consider the rival theories, because we're so protective of our own pet theory.
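The distinction between "consistent with" and "supports" can be made precise with Bayes' rule: data supports a theory only if it raises the theory's probability relative to its rivals. A minimal sketch (the numbers are illustrative assumptions, not from the talk):

```python
# Bayes' rule: P(theory | data) = P(data | theory) * P(theory) / P(data),
# where P(data) sums over the theory and its rival.

def posterior(prior, likelihood, rival_prior, rival_likelihood):
    """Probability of the theory after seeing the data."""
    evidence = likelihood * prior + rival_likelihood * rival_prior
    return likelihood * prior / evidence

# "Diet cures cancer" vs. the rival "occasional misdiagnosis".
# A story like Belle's is likely under BOTH theories (illustrative numbers):
updated = posterior(prior=0.5, likelihood=0.9,
                    rival_prior=0.5, rival_likelihood=0.9)
print(updated)  # 0.5 -- consistent with the theory, yet belief is unchanged

# Data supports the theory only when it is much less likely under the rival:
updated = posterior(prior=0.5, likelihood=0.9,
                    rival_prior=0.5, rival_likelihood=0.1)
print(updated)  # 0.9 -- the same data now genuinely raises our belief
```

The forgotten "third term" in the talk is exactly the `rival_likelihood` factor in the denominator: when it equals the theory's own likelihood, the data teaches us nothing.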
02:58 Now, let's look at this for Belle's story. Well, we care about: Does Belle's story support the theory that diet cures cancer? But instead, we end up asking, "Is Belle's story consistent with diet curing cancer?" And the answer is yes. If diet did cure cancer, we'd see stories like Belle's. But even if diet did not cure cancer, we'd still see stories like Belle's. A single story in which a patient apparently self-cured just due to being misdiagnosed in the first place. Just like, even if smoking was bad for your health, you'd still see one smoker who lived until 100. (Laughter) Just like, even if education was good for your income, you'd still see one multimillionaire who didn't go to university. (Laughter)

03:56 So the biggest problem with Belle's story is not that it was false. It's that it's only one story. There might be thousands of other stories where diet alone failed, but we never hear about them. We share the outlier cases because they are new, and therefore they are news. We never share the ordinary cases. They're too ordinary, they're what normally happens. And that's the true 99 percent that we ignore. Just like in society, you can't just listen to the one percent, the outliers, and ignore the 99 percent, the ordinary.

04:34 Because that's the second example of confirmation bias. We accept a fact as data. The biggest problem is not that we live in a post-truth world; it's that we live in a post-data world. We prefer a single story to tons of data.

04:54 Now, stories are powerful, they're vivid, they bring it to life. They tell you to start every talk with a story. I did. But a single story is meaningless and misleading unless it's backed up by large-scale data. But even if we had large-scale data, that might still not be enough. Because it could still be consistent with rival theories.

05:20 Let me explain. A classic study by psychologist Peter Wason gives you a set of three numbers and asks you to think of the rule that generated them. So if you're given two, four, six, what's the rule? Well, most people would think, it's successive even numbers. How would you test it? Well, you'd propose other sets of successive even numbers: 4, 6, 8 or 12, 14, 16. And Peter would say these sets also work. But knowing that these sets also work, knowing that perhaps hundreds of sets of successive even numbers also work, tells you nothing. Because this is still consistent with rival theories. Perhaps the rule is any three even numbers. Or any three increasing numbers.

06:14 And that's the third example of confirmation bias: accepting data as evidence, even if it's consistent with rival theories. Data is just a collection of facts. Evidence is data that supports one theory and rules out others.

06:34 So the best way to support your theory is actually to try to disprove it, to play devil's advocate. So test something, like 4, 12, 26. If you got a yes to that, that would disprove your theory of successive even numbers. Yet this test is powerful, because if you got a no, it would rule out "any three even numbers" and "any three increasing numbers." It would rule out the rival theories, but not rule out yours. But most people are too afraid of testing the 4, 12, 26, because they don't want to get a yes and prove their pet theory to be wrong.
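The 2-4-6 task above can be sketched in code. In Wason's original experiment the hidden rule was "any three increasing numbers"; the predicates and test triples below are a simplified illustration of why confirming tests teach nothing while the devil's-advocate test does:

```python
# Wason's 2-4-6 task: the experimenter's hidden rule is broader than
# the "successive even numbers" theory most people form from (2, 4, 6).

def hidden_rule(a, b, c):
    """The actual rule: any three increasing numbers."""
    return a < b < c

def successive_evens(a, b, c):
    """The pet theory formed from seeing (2, 4, 6)."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirming tests: consistent with the pet theory AND the hidden rule,
# so the experimenter's "yes" teaches us nothing.
for triple in [(4, 6, 8), (12, 14, 16)]:
    assert hidden_rule(*triple) and successive_evens(*triple)

# The devil's-advocate test: deliberately chosen to violate the pet theory.
triple = (4, 12, 26)
print(hidden_rule(*triple))       # True -- a "yes", disproving "successive evens"
print(successive_evens(*triple))  # False
```

Only the test that *could* fail under your own theory separates it from its rivals; that is the asymmetry the talk is pointing at.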
07:16 Confirmation bias is not only about failing to search for new data, but it's also about misinterpreting data once you receive it. And this applies outside the lab to important, real-world problems. Indeed, Thomas Edison famously said, "I have not failed, I have found 10,000 ways that won't work." Finding out that you're wrong is the only way to find out what's right.

07:46 Say you're a university admissions director and your theory is that only students with good grades from rich families do well. So you only let in such students. And they do well. But that's also consistent with the rival theory. Perhaps all students with good grades do well, rich or poor. But you never test that theory because you never let in poor students, because you don't want to be proven wrong.

08:14 So, what have we learned? A story is not fact, because it may not be true. A fact is not data; it may not be representative if it's only one data point. And data is not evidence; it may not be supportive if it's consistent with rival theories.

08:36 So, what do you do? When you're at the inflection points of life, deciding on a strategy for your business, a parenting technique for your child or a regimen for your health, how do you ensure that you don't have a story, but you have evidence?

08:56 Let me give you three tips. The first is to actively seek other viewpoints. Read and listen to people you flagrantly disagree with. Ninety percent of what they say may be wrong, in your view. But what if 10 percent is right? As Aristotle said, "The mark of an educated man is the ability to entertain a thought without necessarily accepting it."

09:24 Surround yourself with people who challenge you, and create a culture that actively encourages dissent. Some banks suffered from groupthink, where staff were too afraid to challenge management's lending decisions, contributing to the financial crisis. In a meeting, appoint someone to be devil's advocate against your pet idea. And don't just hear another viewpoint -- listen to it, as well. As psychologist Stephen Covey said, "Listen with the intent to understand, not the intent to reply." A dissenting viewpoint is something to learn from, not to argue against.

10:07 Which takes us to the other forgotten terms in Bayesian inference. Because data allows you to learn, but learning is only relative to a starting point. If you started with complete certainty that your pet theory must be true, then your view won't change -- regardless of what data you see. Only if you are truly open to the possibility of being wrong can you ever learn. As Leo Tolstoy wrote, "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already. But the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already."
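The "starting point" here is the Bayesian prior, and Tolstoy's point falls straight out of the update formula: if the prior is 1, the posterior is 1 no matter what the data say. A small sketch (the likelihood numbers are illustrative assumptions):

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of a theory after one observation."""
    num = likelihood_if_true * prior
    return num / (num + likelihood_if_false * (1 - prior))

# With an open mind, data that is far more likely if the theory is false
# substantially lowers your belief:
print(bayes_update(0.7, 0.1, 0.9))   # ~0.21

# With complete certainty (prior = 1), the rival term vanishes from the
# denominator, so no observation can ever move you:
print(bayes_update(1.0, 0.1, 0.9))   # 1.0
```

In other words, learning requires leaving the rival theory a nonzero prior; certainty is mathematically, not just psychologically, immune to evidence.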
10:56 Tip number two is "listen to experts." Now, that's perhaps the most unpopular advice that I could give you. (Laughter) British politician Michael Gove famously said that people in this country have had enough of experts. A recent poll showed that more people would trust their hairdresser -- (Laughter) -- or the man on the street than they would leaders of businesses, the health service and even charities.

11:26 So we respect a teeth-whitening formula discovered by a mom, or we listen to an actress's view on vaccination. We like people who tell it like it is, who go with their gut, and we call them authentic. But gut feel can only get you so far. Gut feel would tell you never to give water to a baby with diarrhea, because it would just flow out the other end. Expertise tells you otherwise.

11:53 You'd never trust your surgery to the man on the street. You'd want an expert who spent years doing surgery and knows the best techniques. But that should apply to every major decision. Politics, business, health advice require expertise, just like surgery.

12:16 So then, why are experts so mistrusted? Well, one reason is they're seen as out of touch. A millionaire CEO couldn't possibly speak for the man on the street. But true expertise is founded on evidence. And evidence stands up for the man on the street and against the elites. Because evidence forces you to prove it. Evidence prevents the elites from imposing their own view without proof.

12:49 A second reason why experts are not trusted is that different experts say different things. For every expert who claimed that leaving the EU would be bad for Britain, another expert claimed it would be good. Half of these so-called experts will be wrong. And I have to admit that most papers written by experts are wrong. Or at best, make claims that the evidence doesn't actually support. So we can't just take an expert's word for it. In November 2016, a study on executive pay hit national headlines, even though none of the newspapers who covered the study had even seen the study. It wasn't even out yet. They just took the author's word for it, just like with Belle.

13:38 Nor does it mean that we can just handpick any study that happens to support our viewpoint -- that would, again, be confirmation bias. Nor does it mean that if seven studies show A and three show B, that A must be true. What matters is the quality, and not the quantity, of expertise.

13:57 So we should do two things. First, we should critically examine the credentials of the authors, just like you'd critically examine the credentials of a potential surgeon. Are they truly experts in the matter, or do they have a vested interest? Second, we should pay particular attention to papers published in the top academic journals.

14:24 Now, academics are often accused of being detached from the real world. But this detachment gives you years to spend on a study: to really nail down a result, to rule out those rival theories, and to distinguish correlation from causation. And academic journals involve peer review, where a paper is rigorously scrutinized (Laughter) by the world's leading minds. The better the journal, the higher the standard. The most elite journals reject 95 percent of papers.

14:59 Now, academic evidence is not everything. Real-world experience is critical, also. And peer review is not perfect; mistakes are made. But it's better to go with something checked than something unchecked. If we latch onto a study because we like the findings, without considering who it's by or whether it's even been vetted, there is a massive chance that that study is misleading.

15:26 And those of us who claim to be experts should recognize the limitations of our analysis. Very rarely is it possible to prove or predict something with certainty, yet it's so tempting to make a sweeping, unqualified statement. It's easier to turn into a headline or to be tweeted in 140 characters. But even evidence may not be proof. It may not be universal, it may not apply in every setting. So don't say, "Red wine causes longer life," when the evidence is only that red wine is correlated with longer life -- and only then in people who exercise as well.

16:11 Tip number three is "pause before sharing anything." The Hippocratic oath says, "First, do no harm." What we share is potentially contagious, so be very careful about what we spread. Our goal should not be to get likes or retweets. Otherwise, we only share the consensus; we don't challenge anyone's thinking. Otherwise, we only share what sounds good, regardless of whether it's evidence.

16:42 Instead, we should ask the following: If it's a story, is it true? If it's true, is it backed up by large-scale evidence? If it is, who is it by, what are their credentials? Is it published, how rigorous is the journal? And ask yourself the million-dollar question: If the same study was written by the same authors with the same credentials but found the opposite results, would you still be willing to believe it and to share it?

17:13 Treating any problem -- a nation's economic problem or an individual's health problem -- is difficult. So we must ensure that we have the very best evidence to guide us. Only if it's true can it be fact. Only if it's representative can it be data. Only if it's supportive can it be evidence. And only with evidence can we move from a post-truth world to a pro-truth world.

17:44 Thank you very much. (Applause)