How to seek truth in the era of fake news | Christiane Amanpour

152,680 views ・ 2017-10-30

TED



Translator: Jack Zhang Reviewer: Chen Zou
00:12
Chris Anderson: Christiane, great to have you here.
00:15
So you've had this amazing viewpoint,
00:17
and perhaps it's fair to say that in the last few years,
00:20
there have been some alarming developments that you're seeing.
00:24
What's alarmed you most?
00:25
Christiane Amanpour: Well, just listening to the earlier speakers,
00:28
I can frame it in what they've been saying:
00:31
climate change, for instance -- cities, the threat to our environment
00:34
and our lives.
00:36
It basically also boils down to understanding the truth
00:40
and to be able to get to the truth of what we're talking about
00:43
in order to really be able to solve it.
00:45
So if 99.9 percent of the science on climate
00:49
is empirical, scientific evidence,
00:52
but it's competing almost equally with a handful of deniers,
00:57
that is not the truth;
00:58
that is the epitome of fake news.
01:01
And so for me, the last few years -- certainly this last year --
01:06
has crystallized the notion of fake news in a way that's truly alarming
01:10
and not just some slogan to be thrown around.
01:13
Because when you can't distinguish between the truth and fake news,
01:17
you have a very much more difficult time trying to solve
01:21
some of the great issues that we face.
01:24
CH: Well, you've been involved in this question of,
01:27
what is balance, what is truth, what is impartiality,
01:30
for a long time.
01:32
You were on the front lines reporting the Balkan Wars 25 years ago.
01:38
And back then, you famously said,
01:41
by calling out human rights abuses,
01:44
you said, "Look, there are some situations one simply cannot be neutral about,
01:48
because when you're neutral,
01:49
you are an accomplice."
01:53
So, do you feel that today's journalists aren't heeding that advice
01:58
about balance?
01:59
CA: Well, look, I think for journalists, objectivity is the golden rule.
02:03
But I think sometimes we don't understand what objectivity means.
02:08
And I actually learned this very, very young in my career,
02:11
which was during the Balkan Wars.
02:12
I was young then.
02:14
It was about 25 years ago.
02:16
And what we faced was the wholesale violation, not just of human rights,
02:22
but all the way to ethnic cleansing and genocide,
02:25
and that has been adjudicated in the highest war crimes court
02:29
in the world.
02:30
So, we know what we were seeing.
02:32
Trying to tell the world what we were seeing
02:34
brought us accusations of bias,
02:37
of siding with one side,
02:39
of not seeing the whole side,
02:41
and just, you know, trying to tell one story.
02:43
I particularly and personally was accused of siding with,
02:48
for instance, the citizens of Sarajevo --
02:50
"siding with the Muslims,"
02:51
because they were the minority who were being attacked
02:54
by Christians on the Serb side
02:58
in this area.
03:00
And it worried me.
03:01
It worried me that I was being accused of this.
03:03
I thought maybe I was wrong,
03:05
maybe I'd forgotten what objectivity was.
03:07
But then I started to understand that what people wanted
03:10
was actually not to do anything --
03:12
not to step in,
03:13
not to change the situation,
03:15
not to find a solution.
03:16
And so, their fake news at that time,
03:19
their lie at that time --
03:20
including our government's, our democratically elected government's,
03:24
with values and principles of human rights --
03:26
their lie was to say that all sides are equally guilty,
03:30
that this has been centuries of ethnic hatred,
03:32
whereas we knew that wasn't true,
03:34
that one side had decided to kill, slaughter and ethnically cleanse
03:38
another side.
03:39
So that is where, for me,
03:41
I understood that objectivity means giving all sides an equal hearing
03:46
and talking to all sides,
03:48
but not treating all sides equally,
03:52
not creating a forced moral equivalence or a factual equivalence.
03:57
And when you come up against that crisis point
04:01
in situations of grave violations of international and humanitarian law,
04:07
if you don't understand what you're seeing,
04:09
if you don't understand the truth
04:11
and if you get trapped in the fake news paradigm,
04:15
then you are an accomplice.
04:17
And I refuse to be an accomplice to genocide.
04:20
(Applause)
04:26
CH: So there have always been these propaganda battles,
04:29
and you were courageous in taking the stand you took back then.
04:33
Today, there's a whole new way, though,
04:37
in which news seems to be becoming fake.
04:39
How would you characterize that?
04:41
CA: Well, look -- I am really alarmed.
04:43
And everywhere I look,
04:45
you know, we're buffeted by it.
04:47
Obviously, when the leader of the free world,
04:49
when the most powerful person in the entire world,
04:52
which is the president of the United States --
04:54
this is the most important, most powerful country in the whole world,
04:59
economically, militarily, politically in every which way --
05:04
and it seeks to, obviously, promote its values and power around the world.
05:09
So we journalists, who only seek the truth --
05:13
I mean, that is our mission --
05:15
we go around the world looking for the truth
05:17
in order to be everybody's eyes and ears,
05:19
people who can't go out in various parts of the world
05:21
to figure out what's going on about things that are vitally important
05:25
to everybody's health and security.
05:27
So when you have a major world leader accusing you of fake news,
05:33
it has an exponential ripple effect.
05:37
And what it does is, it starts to chip away
05:42
at not just our credibility,
05:45
but at people's minds --
05:48
people who look at us, and maybe they're thinking,
05:50
"Well, if the president of the United States says that,
05:53
maybe somewhere there's a truth in there."
05:56
CH: Presidents have always been critical of the media --
06:00
CA: Not in this way.
06:01
CH: So, to what extent --
06:03
(Laughter)
06:04
(Applause)
06:07
CH: I mean, someone a couple years ago looking at the avalanche of information
06:14
pouring through Twitter and Facebook and so forth,
06:17
might have said,
06:19
"Look, our democracies are healthier than they've ever been.
06:21
There's more news than ever.
06:23
Of course presidents will say what they'll say,
06:25
but everyone else can say what they will say.
06:28
What's not to like? How is there an extra danger?"
06:32
CA: So, I wish that was true.
06:34
I wish that the proliferation of platforms upon which we get our information
06:41
meant that there was a proliferation of truth and transparency
06:45
and depth and accuracy.
06:46
But I think the opposite has happened.
06:49
You know, I'm a little bit of a Luddite,
06:51
I will confess.
06:53
Even when we started to talk about the information superhighway,
06:56
which was a long time ago,
06:58
before social media, Twitter and all the rest of it,
07:00
I was actually really afraid
07:02
that that would put people into certain lanes and tunnels
07:06
and have them just focusing on areas of their own interest
07:11
instead of seeing the broad picture.
07:13
And I'm afraid to say that with algorithms, with logarithms,
07:18
with whatever the "-ithms" are
07:19
that direct us into all these particular channels of information,
07:24
that seems to be happening right now.
07:25
I mean, people have written about this phenomenon.
07:28
People have said that yes, the internet came,
07:30
its promise was to exponentially explode our access to more democracy,
07:36
more information,
07:38
less bias,
07:40
more varied information.
07:42
And, in fact, the opposite has happened.
07:44
And so that, for me, is incredibly dangerous.
07:48
And again, when you are the president of this country and you say things,
07:53
it also gives leaders in other undemocratic countries the cover
08:00
to affront us even worse,
08:02
and to really whack us -- and their own journalists --
08:05
with this bludgeon of fake news.
08:08
CH: To what extent is what happened, though,
08:10
in part, just an unintended consequence,
08:12
that the traditional media that you worked in
08:15
had this curation-mediation role,
08:17
where certain norms were observed,
08:19
certain stories would be rejected because they weren't credible,
08:22
but now that the standard for publication and for amplification
08:28
is just interest, attention, excitement, click,
08:32
"Did it get clicked on?"
08:33
"Send it out there!"
08:34
and that's what's -- is that part of what's caused the problem?
08:38
CA: I think it's a big problem, and we saw this in the election of 2016,
171
518224
3595
08:41
where the idea of "clickbait" was very sexy and very attractive,
172
521843
5107
关于“标题党”这个概念, 是非常性感和吸引人的,
08:46
and so all these fake news sites and fake news items
173
526974
4306
那么所有这些虚假新闻网站和虚假的内容
08:51
were not just haphazardly and by happenstance being put out there,
174
531304
4122
不仅仅是胡乱地, 而且是偶然地被发布出去,
08:55
there's been a whole industry in the creation of fake news
175
535450
4451
有整个行业都在制造虚假新闻,
08:59
in parts of Eastern Europe, wherever,
176
539925
2990
在东欧的一部分,或者其它地方,
09:02
and you know, it's planted in real space and in cyberspace.
177
542939
3260
都在真实的空间以及网络空间中生根。
09:06
So I think that, also,
178
546223
2359
所以我也在想,
我们科技以声速或者光速
09:08
the ability of our technology to proliferate this stuff
179
548606
5121
09:13
at the speed of sound or light, just about --
180
553751
3511
扩散这样东西的能力——
09:17
we've never faced that before.
181
557286
1983
这种事我们从来没有面对过。
09:19
And we've never faced such a massive amount of information
182
559293
4867
而且我们从来没有面对过 这样庞大的信息量,
09:24
which is not curated
183
564184
1565
而这些信息不是被
09:25
by those whose profession leads them to abide by the truth,
184
565773
5296
那些职业记者去捍卫真相,
去做事实调查,
09:31
to fact-check
185
571093
1202
去维护一个操守准则 和一个职业道德的守则。
09:32
and to maintain a code of conduct and a code of professional ethics.
186
572319
4834
09:37
CH: Many people here may know people who work at Facebook
09:40
or Twitter and Google and so on.
09:42
They all seem like great people with good intention --
09:46
let's assume that.
09:47
If you could speak with the leaders of those companies,
09:51
what would you say to them?
09:52
CA: Well, you know what --
09:54
I'm sure they are incredibly well-intentioned,
09:56
and they certainly developed an unbelievable, game-changing system,
10:01
where everybody's connected on this thing called Facebook.
10:05
And they've created a massive economy for themselves
10:08
and an amazing amount of income.
10:11
I would just say,
10:12
"Guys, you know, it's time to wake up and smell the coffee
10:17
and look at what's happening to us right now."
10:19
Mark Zuckerberg wants to create a global community.
10:22
I want to know: What is that global community going to look like?
10:26
I want to know where the codes of conduct actually are.
10:30
Mark Zuckerberg said --
10:31
and I don't blame him, he probably believed this --
10:34
that it was crazy to think
10:37
that the Russians or anybody else could be tinkering and messing around
10:41
with this avenue.
10:42
And what have we just learned in the last few weeks?
10:44
That, actually, there has been a major problem in that regard,
10:47
and now they're having to investigate it and figure it out.
10:51
Yes, they're trying to do what they can now
10:54
to prevent the rise of fake news,
10:56
but, you know,
10:58
it went pretty unrestricted for a long, long time.
11:03
So I guess I would say, you know,
11:05
you guys are brilliant at technology;
11:07
let's figure out another algorithm.
11:09
Can we not?
11:10
CH: An algorithm that includes journalistic investigation --
11:13
CA: I don't really know how they do it, but somehow, you know --
11:16
filter out the crap!
11:18
(Laughter)
11:19
And not just the unintentional --
11:21
(Applause)
11:24
but the deliberate lies that are planted
11:27
by people who've been doing this as a matter of warfare
11:31
for decades.
11:32
The Soviets, the Russians --
11:34
they are the masters of war by other means, of hybrid warfare.
11:40
And this is a --
11:42
this is what they've decided to do.
11:45
It worked in the United States,
11:47
it didn't work in France,
11:48
it hasn't worked in Germany.
11:50
During the elections there, where they've tried to interfere,
11:53
the president of France right now, Emmanuel Macron,
11:55
took a very tough stand and confronted it head on,
11:58
as did Angela Merkel.
11:59
CH: There's some hope to be had from some of this, isn't there?
12:02
That the world learns.
12:03
We get fooled once,
12:05
maybe we get fooled again,
12:06
but maybe not the third time.
12:08
Is that true?
12:09
CA: I mean, let's hope.
12:10
But I think in this regard that so much of it is also about technology,
12:13
that the technology has to also be given some kind of moral compass.
12:17
I know I'm talking nonsense, but you know what I mean.
12:20
CH: We need a filter-the-crap algorithm with a moral compass --
12:23
CA: There you go.
12:25
CH: I think that's good.
12:26
CA: No -- "moral technology."
12:27
We all have moral compasses -- moral technology.
12:31
CH: I think that's a great challenge. CA: You know what I mean.
12:34
CH: Talk just a minute about leadership.
12:36
You've had a chance to speak with so many people across the world.
12:39
I think for some of us --
12:40
I speak for myself, I don't know if others feel this --
12:43
there's kind of been a disappointment of:
12:45
Where are the leaders?
12:47
So many of us have been disappointed --
12:49
Aung San Suu Kyi, what's happened recently,
12:51
it's like, "No! Another one bites the dust."
12:53
You know, it's heartbreaking.
12:55
(Laughter)
12:56
Who have you met
12:58
who you have been impressed by, inspired by?
13:01
CA: Well, you talk about the world in crisis,
13:03
which is absolutely true,
13:05
and those of us who spend our whole lives immersed in this crisis --
13:09
I mean, we're all on the verge of a nervous breakdown.
13:12
So it's pretty stressful right now.
13:15
And you're right --
13:16
there is this perceived and actual vacuum of leadership,
13:19
and it's not me saying it, I ask all these --
13:22
whoever I'm talking to, I ask about leadership.
13:25
I was speaking to the outgoing president of Liberia today,
13:29
[Ellen Johnson Sirleaf,]
13:31
who --
13:32
(Applause)
13:34
in three weeks' time,
13:36
will be one of the very rare heads of an African country
13:40
who actually abides by the constitution
13:42
and gives up power after her prescribed term.
13:46
She has said she wants to do that as a lesson.
13:50
But when I asked her about leadership,
13:52
and I gave a quick-fire round of certain names,
13:54
I presented her with the name of the new French president,
13:57
Emmanuel Macron.
13:59
And she said --
14:00
I said, "So what do you think when I say his name?"
14:03
And she said,
14:05
"Shaping up potentially to be
14:07
a leader to fill our current leadership vacuum."
14:12
I thought that was really interesting.
14:13
Yesterday, I happened to have an interview with him.
14:16
I'm very proud to say,
14:17
I got his first international interview. It was great. It was yesterday.
14:20
And I was really impressed.
14:22
I don't know whether I should be saying that in an open forum,
14:25
but I was really impressed.
14:26
(Laughter)
14:28
And it could be just because it was his first interview,
14:31
but -- I asked questions, and you know what?
14:33
He answered them!
14:34
(Laughter)
14:36
(Applause)
14:40
There was no spin,
14:41
there was no wiggle and waggle,
14:44
there was no spend-five-minutes-to-come-back-to-the-point.
14:47
I didn't have to keep interrupting,
14:48
which I've become rather renowned for doing,
14:50
because I want people to answer the question.
14:53
And he answered me,
14:55
and it was pretty interesting.
14:58
And he said --
14:59
CH: Tell me what he said.
15:01
CA: No, no, you go ahead.
15:02
CH: You're the interrupter, I'm the listener.
15:04
CA: No, no, go ahead.
15:06
CH: What'd he say?
15:07
CA: OK. You've talked about nationalism and tribalism here today.
15:10
I asked him, "How did you have the guts to confront the prevailing winds
15:14
of anti-globalization, nationalism, populism
15:18
when you can see what happened in Brexit,
15:20
when you could see what happened in the United States
15:23
and what might have happened in many European elections
15:25
at the beginning of 2017?"
15:27
And he said,
15:29
"For me, nationalism means war.
15:33
We have seen it before,
15:35
we have lived through it before on my continent,
15:37
and I am very clear about that."
15:40
So he was not going to, just for political expediency,
15:44
embrace the, kind of, lowest common denominator
15:47
that had been embraced in other political elections.
15:51
And he stood against Marine Le Pen, who is a very dangerous woman.
15:56
CH: Last question for you, Christiane.
16:00
TED is about ideas worth spreading.
16:02
If you could plant one idea into the minds of everyone here,
16:06
what would that be?
16:08
CA: I would say really be careful where you get your information from;
16:13
really take responsibility for what you read, listen to and watch;
16:18
make sure that you go to the trusted brands to get your main information,
16:23
no matter whether you have a wide, eclectic intake,
16:28
really stick with the brand names that you know,
16:31
because in this world right now, at this moment right now,
16:34
our crises, our challenges, our problems are so severe,
16:39
that unless we are all engaged as global citizens
16:42
who appreciate the truth,
16:44
who understand science, empirical evidence and facts,
16:48
then we are just simply going to be wandering along
16:52
to a potential catastrophe.
16:54
So I would say, the truth,
16:55
and then I would come back to Emmanuel Macron
16:58
and talk about love.
17:00
I would say that there's not enough love going around.
17:04
And I asked him to tell me about love.
17:07
I said, "You know, your marriage is the subject of global obsession."
17:10
(Laughter)
17:12
"Can you tell me about love?
17:13
What does it mean to you?"
17:15
I've never asked a president or an elected leader about love.
17:18
I thought I'd try it.
17:19
And he said -- you know, he actually answered it.
17:23
And he said, "I love my wife, she is part of me,
17:27
we've been together for decades."
17:29
But here's where it really counted,
17:30
what really stuck with me.
17:32
He said,
17:33
"It is so important for me to have somebody at home
17:37
who tells me the truth."
17:40
So you see, I brought it home. It's all about the truth.
17:43
(Laughter)
17:44
CH: So there you go. Truth and love. Ideas worth spreading.
17:47
Christiane Amanpour, thank you so much. That was great.
17:49
(Applause)
17:50
CA: Thank you. CH: That was really lovely.
17:53
(Applause)
17:54
CA: Thank you.