What to trust in a "post-truth" world | Alex Edmans

148,811 views ・ 2018-12-03

TED



Translator: jacks peng
00:13
Belle Gibson was a happy young Australian. She lived in Perth, and she loved skateboarding. But in 2009, Belle learned that she had brain cancer and four months to live. Two months of chemo and radiotherapy had no effect. But Belle was determined. She'd been a fighter her whole life. From age six, she had to cook for her brother, who had autism, and her mother, who had multiple sclerosis. Her father was out of the picture. So Belle fought, with exercise, with meditation and by ditching meat for fruit and vegetables. And she made a complete recovery.

00:50
Belle's story went viral. It was tweeted, blogged about, shared and reached millions of people. It showed the benefits of shunning traditional medicine for diet and exercise. In August 2013, Belle launched a healthy eating app, The Whole Pantry, downloaded 200,000 times in the first month.

01:13
But Belle's story was a lie. Belle never had cancer. People shared her story without ever checking if it was true. This is a classic example of confirmation bias. We accept a story uncritically if it confirms what we'd like to be true. And we reject any story that contradicts it. How often do we see this in the stories that we share and we ignore? In politics, in business, in health advice.

01:47
The Oxford Dictionary's word of 2016 was "post-truth." And the recognition that we now live in a post-truth world has led to a much needed emphasis on checking the facts. But the punch line of my talk is that just checking the facts is not enough. Even if Belle's story were true, it would be just as irrelevant.

02:10
Why? Well, let's look at one of the most fundamental techniques in statistics. It's called Bayesian inference. And the very simple version is this: We care about "does the data support the theory?" Does the data increase our belief that the theory is true? But instead, we end up asking, "Is the data consistent with the theory?" But being consistent with the theory does not mean that the data supports the theory. Why? Because of a crucial but forgotten third term -- the data could also be consistent with rival theories. But due to confirmation bias, we never consider the rival theories, because we're so protective of our own pet theory.
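This "forgotten third term" is the likelihood of the data under the rival theories, which sits in the denominator of Bayes' rule. A minimal sketch (the numbers are illustrative assumptions, not from the talk) of why consistency alone moves your belief nowhere:

```python
# Bayes' rule with one rival theory: the posterior rises above the
# prior only if the data is MORE likely under your theory than under
# the rival. Mere consistency is not enough.

def posterior(prior, likelihood, rival_prior, rival_likelihood):
    """P(theory | data), with the rival theory in the denominator."""
    evidence = prior * likelihood + rival_prior * rival_likelihood
    return prior * likelihood / evidence

# Theory: "diet cures cancer". Rival: "it doesn't; the patient was
# misdiagnosed". Suppose a story like Belle's is equally likely
# under both theories (all probabilities here are assumed toy values):
p = posterior(prior=0.5, likelihood=0.9, rival_prior=0.5, rival_likelihood=0.9)
# p == prior: the story is consistent with the theory, yet teaches nothing.

# Only if the story were much LESS likely under the rival would the
# same data genuinely support the theory:
q = posterior(prior=0.5, likelihood=0.9, rival_prior=0.5, rival_likelihood=0.1)
```

Here `p` stays at the prior of 0.5, while `q` rises to 0.9 -- the data only counts as support when it discriminates between theories.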
02:58
Now, let's look at this for Belle's story. Well, we care about: Does Belle's story support the theory that diet cures cancer? But instead, we end up asking, "Is Belle's story consistent with diet curing cancer?" And the answer is yes. If diet did cure cancer, we'd see stories like Belle's. But even if diet did not cure cancer, we'd still see stories like Belle's. A single story in which a patient apparently self-cured just due to being misdiagnosed in the first place. Just like, even if smoking was bad for your health, you'd still see one smoker who lived until 100. (Laughter) Just like, even if education was good for your income, you'd still see one multimillionaire who didn't go to university. (Laughter)

03:56
So the biggest problem with Belle's story is not that it was false. It's that it's only one story. There might be thousands of other stories where diet alone failed, but we never hear about them. We share the outlier cases because they are new, and therefore they are news. We never share the ordinary cases. They're too ordinary, they're what normally happens. And that's the true 99 percent that we ignore. Just like in society, you can't just listen to the one percent, the outliers, and ignore the 99 percent, the ordinary.

04:34
Because that's the second example of confirmation bias. We accept a fact as data. The biggest problem is not that we live in a post-truth world; it's that we live in a post-data world. We prefer a single story to tons of data.

04:54
Now, stories are powerful, they're vivid, they bring it to life. They tell you to start every talk with a story. I did. But a single story is meaningless and misleading unless it's backed up by large-scale data. But even if we had large-scale data, that might still not be enough. Because it could still be consistent with rival theories. Let me explain.

05:22
A classic study by psychologist Peter Wason gives you a set of three numbers and asks you to think of the rule that generated them. So if you're given two, four, six, what's the rule? Well, most people would think, it's successive even numbers. How would you test it? Well, you'd propose other sets of successive even numbers: 4, 6, 8 or 12, 14, 16. And Peter would say these sets also work. But knowing that these sets also work, knowing that perhaps hundreds of sets of successive even numbers also work, tells you nothing. Because this is still consistent with rival theories. Perhaps the rule is any three even numbers. Or any three increasing numbers.

06:14
And that's the third example of confirmation bias: accepting data as evidence, even if it's consistent with rival theories. Data is just a collection of facts. Evidence is data that supports one theory and rules out others.

06:34
So the best way to support your theory is actually to try to disprove it, to play devil's advocate. So test something, like 4, 12, 26. If you got a yes to that, that would disprove your theory of successive even numbers. Yet this test is powerful, because if you got a no, it would rule out "any three even numbers" and "any three increasing numbers." It would rule out the rival theories, but not rule out yours.
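The logic of Wason's task can be made concrete in a few lines. A sketch (my own encoding of the three rules, not code from the study) showing why the confirming probes are uninformative while 4, 12, 26 discriminates:

```python
# Three candidate rules for Wason's 2-4-6 task.

def successive_even(s):
    """Your pet theory: consecutive even numbers."""
    a, b, c = s
    return a % 2 == 0 and b == a + 2 and c == b + 2

def any_three_even(s):
    """Rival theory 1: any three even numbers."""
    return all(n % 2 == 0 for n in s)

def any_increasing(s):
    """Rival theory 2: any three increasing numbers."""
    a, b, c = s
    return a < b < c

rules = [successive_even, any_three_even, any_increasing]

# Confirming probes satisfy ALL three rules, so a "yes" from Peter
# cannot tell you which rule generated the numbers:
confirming = [(2, 4, 6), (4, 6, 8), (12, 14, 16)]
uninformative = all(rule(p) for p in confirming for rule in rules)

# The devil's-advocate probe violates the pet theory but satisfies
# both rivals -- so whichever answer comes back, something is ruled out:
verdicts = [rule((4, 12, 26)) for rule in rules]
```

With this encoding, `verdicts` comes out `[False, True, True]`: a "yes" to 4, 12, 26 kills the pet theory, and a "no" kills both rivals.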
07:05
But most people are too afraid of testing the 4, 12, 26, because they don't want to get a yes and prove their pet theory to be wrong. Confirmation bias is not only about failing to search for new data, but it's also about misinterpreting data once you receive it. And this applies outside the lab to important, real-world problems. Indeed, Thomas Edison famously said, "I have not failed, I have found 10,000 ways that won't work." Finding out that you're wrong is the only way to find out what's right.

07:46
Say you're a university admissions director and your theory is that only students with good grades from rich families do well. So you only let in such students. And they do well. But that's also consistent with the rival theory. Perhaps all students with good grades do well, rich or poor. But you never test that theory because you never let in poor students because you don't want to be proven wrong.

08:14
So, what have we learned? A story is not fact, because it may not be true. A fact is not data, it may not be representative if it's only one data point. And data is not evidence -- it may not be supportive if it's consistent with rival theories.

08:36
So, what do you do? When you're at the inflection points of life, deciding on a strategy for your business, a parenting technique for your child or a regimen for your health, how do you ensure that you don't have a story but you have evidence?

08:56
Let me give you three tips. The first is to actively seek other viewpoints. Read and listen to people you flagrantly disagree with. Ninety percent of what they say may be wrong, in your view. But what if 10 percent is right? As Aristotle said, "The mark of an educated man is the ability to entertain a thought without necessarily accepting it." Surround yourself with people who challenge you, and create a culture that actively encourages dissent.

09:31
Some banks suffered from groupthink, where staff were too afraid to challenge management's lending decisions, contributing to the financial crisis. In a meeting, appoint someone to be devil's advocate against your pet idea. And don't just hear another viewpoint -- listen to it, as well. As psychologist Stephen Covey said, "Listen with the intent to understand, not the intent to reply." A dissenting viewpoint is something to learn from, not to argue against.

10:07
Which takes us to the other forgotten terms in Bayesian inference. Because data allows you to learn, but learning is only relative to a starting point. If you started with complete certainty that your pet theory must be true, then your view won't change -- regardless of what data you see. Only if you are truly open to the possibility of being wrong can you ever learn. As Leo Tolstoy wrote, "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already. But the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already."

10:56
Tip number two is "listen to experts." Now, that's perhaps the most unpopular advice that I could give you. (Laughter) British politician Michael Gove famously said that people in this country have had enough of experts. A recent poll showed that more people would trust their hairdresser -- (Laughter) or the man on the street than they would leaders of businesses, the health service and even charities.

11:26
So we respect a teeth-whitening formula discovered by a mom, or we listen to an actress's view on vaccination. We like people who tell it like it is, who go with their gut, and we call them authentic. But gut feel can only get you so far. Gut feel would tell you never to give water to a baby with diarrhea, because it would just flow out the other end. Expertise tells you otherwise.

11:53
You'd never trust your surgery to the man on the street. You'd want an expert who spent years doing surgery and knows the best techniques. But that should apply to every major decision. Politics, business, health advice require expertise, just like surgery.

12:16
So then, why are experts so mistrusted? Well, one reason is they're seen as out of touch. A millionaire CEO couldn't possibly speak for the man on the street. But true expertise is founded on evidence. And evidence stands up for the man on the street and against the elites. Because evidence forces you to prove it. Evidence prevents the elites from imposing their own view without proof.

12:49
A second reason why experts are not trusted is that different experts say different things. For every expert who claimed that leaving the EU would be bad for Britain, another expert claimed it would be good. Half of these so-called experts will be wrong. And I have to admit that most papers written by experts are wrong. Or at best, make claims that the evidence doesn't actually support.

13:14
So we can't just take an expert's word for it. In November 2016, a study on executive pay hit national headlines. Even though none of the newspapers who covered the study had even seen the study. It wasn't even out yet. They just took the author's word for it, just like with Belle.

13:38
Nor does it mean that we can just handpick any study that happens to support our viewpoint -- that would, again, be confirmation bias. Nor does it mean that if seven studies show A and three show B, that A must be true. What matters is the quality, and not the quantity of expertise.

13:57
So we should do two things. First, we should critically examine the credentials of the authors. Just like you'd critically examine the credentials of a potential surgeon. Are they truly experts in the matter, or do they have a vested interest? Second, we should pay particular attention to papers published in the top academic journals.

14:24
Now, academics are often accused of being detached from the real world. But this detachment gives you years to spend on a study. To really nail down a result, to rule out those rival theories, and to distinguish correlation from causation. And academic journals involve peer review, where a paper is rigorously scrutinized (Laughter) by the world's leading minds. The better the journal, the higher the standard. The most elite journals reject 95 percent of papers.

14:59
Now, academic evidence is not everything. Real-world experience is critical, also. And peer review is not perfect, mistakes are made. But it's better to go with something checked than something unchecked. If we latch onto a study because we like the findings, without considering who it's by or whether it's even been vetted, there is a massive chance that that study is misleading.

15:26
And those of us who claim to be experts should recognize the limitations of our analysis. Very rarely is it possible to prove or predict something with certainty, yet it's so tempting to make a sweeping, unqualified statement. It's easier to turn into a headline or to be tweeted in 140 characters. But even evidence may not be proof. It may not be universal, it may not apply in every setting. So don't say, "Red wine causes longer life," when the evidence is only that red wine is correlated with longer life. And only then in people who exercise as well.
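The red-wine trap is a confounder: a third variable drives both the habit and the outcome. A toy simulation (entirely synthetic data with assumed effect sizes, just to illustrate the mechanism) in which exercise adds years and wine adds none, yet wine still correlates with longer life:

```python
import random

random.seed(0)

people = []
for _ in range(10_000):
    exercises = random.random() < 0.5
    # Assumed confounding: in this toy world, exercisers are more
    # likely to drink wine...
    drinks_wine = random.random() < (0.7 if exercises else 0.3)
    # ...and exercise alone adds 5 years; wine adds nothing.
    lifespan = 75 + (5 if exercises else 0) + random.gauss(0, 2)
    people.append((drinks_wine, lifespan))

def avg(xs):
    return sum(xs) / len(xs)

wine = [life for w, life in people if w]
no_wine = [life for w, life in people if not w]

# Wine drinkers live longer on average -- but only because more of
# them exercise. The correlation is real; the causation is not.
gap = avg(wine) - avg(no_wine)
```

With these assumed numbers the raw gap comes out around two years in wine drinkers' favor, even though wine has zero effect in the model; conditioning on exercise would make it vanish.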
16:11
Tip number three is "pause before sharing anything." The Hippocratic oath says, "First, do no harm." What we share is potentially contagious, so be very careful about what we spread. Our goal should not be to get likes or retweets. Otherwise, we only share the consensus; we don't challenge anyone's thinking. Otherwise, we only share what sounds good, regardless of whether it's evidence.

16:42
Instead, we should ask the following: If it's a story, is it true? If it's true, is it backed up by large-scale evidence? If it is, who is it by, what are their credentials? Is it published, how rigorous is the journal? And ask yourself the million-dollar question: If the same study was written by the same authors with the same credentials but found the opposite results, would you still be willing to believe it and to share it?

17:13
Treating any problem -- a nation's economic problem or an individual's health problem -- is difficult. So we must ensure that we have the very best evidence to guide us. Only if it's true can it be fact. Only if it's representative can it be data. Only if it's supportive can it be evidence. And only with evidence can we move from a post-truth world to a pro-truth world.

17:44
Thank you very much. (Applause)