How to seek truth in the era of fake news | Christiane Amanpour

147,805 views · 2017-10-30

TED



Translator: Lilian Chiu  Reviewer: Helen Chang
Chris Anderson: Christiane, great to have you here. So you've had this amazing viewpoint, and perhaps it's fair to say that in the last few years, there have been some alarming developments that you're seeing. What's alarmed you most?
Christiane Amanpour: Well, just listening to the earlier speakers, I can frame it in what they've been saying: climate change, for instance -- cities, the threat to our environment and our lives. It basically also boils down to understanding the truth and to be able to get to the truth of what we're talking about in order to really be able to solve it. So if 99.9 percent of the science on climate is empirical, scientific evidence, but it's competing almost equally with a handful of deniers, that is not the truth; that is the epitome of fake news. And so for me, the last few years -- certainly this last year -- has crystallized the notion of fake news in a way that's truly alarming and not just some slogan to be thrown around. Because when you can't distinguish between the truth and fake news, you have a very much more difficult time trying to solve some of the great issues that we face.
CH: Well, you've been involved in this question of, what is balance, what is truth, what is impartiality, for a long time. You were on the front lines reporting the Balkan Wars 25 years ago. And back then, you famously said, by calling out human rights abuses, you said, "Look, there are some situations one simply cannot be neutral about, because when you're neutral, you are an accomplice." So, do you feel that today's journalists aren't heeding that advice about balance?
CA: Well, look, I think for journalists, objectivity is the golden rule. But I think sometimes we don't understand what objectivity means. And I actually learned this very, very young in my career, which was during the Balkan Wars. I was young then. It was about 25 years ago. And what we faced was the wholesale violation, not just of human rights, but all the way to ethnic cleansing and genocide, and that has been adjudicated in the highest war crimes court in the world. So, we know what we were seeing. Trying to tell the world what we were seeing brought us accusations of bias, of siding with one side, of not seeing the whole side, and just, you know, trying to tell one story. I particularly and personally was accused of siding with, for instance, the citizens of Sarajevo -- "siding with the Muslims," because they were the minority who were being attacked by Christians on the Serb side in this area.

And it worried me. It worried me that I was being accused of this. I thought maybe I was wrong, maybe I'd forgotten what objectivity was. But then I started to understand that what people wanted was actually not to do anything -- not to step in, not to change the situation, not to find a solution. And so, their fake news at that time, their lie at that time -- including our government's, our democratically elected government's, with values and principles of human rights -- their lie was to say that all sides are equally guilty, that this has been centuries of ethnic hatred, whereas we knew that wasn't true, that one side had decided to kill, slaughter and ethnically cleanse another side.

So that is where, for me, I understood that objectivity means giving all sides an equal hearing and talking to all sides, but not treating all sides equally, not creating a forced moral equivalence or a factual equivalence. And when you come up against that crisis point in situations of grave violations of international and humanitarian law, if you don't understand what you're seeing, if you don't understand the truth and if you get trapped in the fake news paradigm, then you are an accomplice. And I refuse to be an accomplice to genocide.

(Applause)
CH: So there have always been these propaganda battles, and you were courageous in taking the stand you took back then. Today, there's a whole new way, though, in which news seems to be becoming fake. How would you characterize that?
CA: Well, look -- I am really alarmed. And everywhere I look, you know, we're buffeted by it. Obviously, when the leader of the free world, when the most powerful person in the entire world, which is the president of the United States -- this is the most important, most powerful country in the whole world, economically, militarily, politically in every which way -- and it seeks to, obviously, promote its values and power around the world. So we journalists, who only seek the truth -- I mean, that is our mission -- we go around the world looking for the truth in order to be everybody's eyes and ears, people who can't go out in various parts of the world to figure out what's going on about things that are vitally important to everybody's health and security. So when you have a major world leader accusing you of fake news, it has an exponential ripple effect. And what it does is, it starts to chip away at not just our credibility, but at people's minds -- people who look at us, and maybe they're thinking, "Well, if the president of the United States says that, maybe somewhere there's a truth in there."
CH: Presidents have always been critical of the media --

CA: Not in this way.

CH: So, to what extent --

(Laughter)

(Applause)

CH: I mean, someone a couple years ago looking at the avalanche of information pouring through Twitter and Facebook and so forth, might have said, "Look, our democracies are healthier than they've ever been. There's more news than ever. Of course presidents will say what they'll say, but everyone else can say what they will say. What's not to like? How is there an extra danger?"
CA: So, I wish that was true. I wish that the proliferation of platforms upon which we get our information meant that there was a proliferation of truth and transparency and depth and accuracy. But I think the opposite has happened. You know, I'm a little bit of a Luddite, I will confess. Even when we started to talk about the information superhighway, which was a long time ago, before social media, Twitter and all the rest of it, I was actually really afraid that that would put people into certain lanes and tunnels and have them just focusing on areas of their own interest instead of seeing the broad picture. And I'm afraid to say that with algorithms, with logarithms, with whatever the "-ithms" are that direct us into all these particular channels of information, that seems to be happening right now.

I mean, people have written about this phenomenon. People have said that yes, the internet came, its promise was to exponentially explode our access to more democracy, more information, less bias, more varied information. And, in fact, the opposite has happened. And so that, for me, is incredibly dangerous. And again, when you are the president of this country and you say things, it also gives leaders in other undemocratic countries the cover to affront us even worse, and to really whack us -- and their own journalists -- with this bludgeon of fake news.
CH: To what extent is what happened, though, in part, just an unintended consequence, that the traditional media that you worked in had this curation-mediation role, where certain norms were observed, certain stories would be rejected because they weren't credible, but now that the standard for publication and for amplification is just interest, attention, excitement, click, "Did it get clicked on?" "Send it out there!" -- is that part of what's caused the problem?
CA: I think it's a big problem, and we saw this in the election of 2016, where the idea of "clickbait" was very sexy and very attractive, and so all these fake news sites and fake news items were not just haphazardly and by happenstance being put out there, there's been a whole industry in the creation of fake news in parts of Eastern Europe, wherever, and you know, it's planted in real space and in cyberspace. So I think that, also, the ability of our technology to proliferate this stuff at the speed of sound or light, just about -- we've never faced that before. And we've never faced such a massive amount of information which is not curated by those whose profession leads them to abide by the truth, to fact-check and to maintain a code of conduct and a code of professional ethics.
CH: Many people here may know people who work at Facebook or Twitter and Google and so on. They all seem like great people with good intention -- let's assume that. If you could speak with the leaders of those companies, what would you say to them?
CA: Well, you know what -- I'm sure they are incredibly well-intentioned, and they certainly developed an unbelievable, game-changing system, where everybody's connected on this thing called Facebook. And they've created a massive economy for themselves and an amazing amount of income. I would just say, "Guys, you know, it's time to wake up and smell the coffee and look at what's happening to us right now." Mark Zuckerberg wants to create a global community. I want to know: What is that global community going to look like? I want to know where the codes of conduct actually are. Mark Zuckerberg said -- and I don't blame him, he probably believed this -- that it was crazy to think that the Russians or anybody else could be tinkering and messing around with this avenue. And what have we just learned in the last few weeks? That, actually, there has been a major problem in that regard, and now they're having to investigate it and figure it out. Yes, they're trying to do what they can now to prevent the rise of fake news, but, you know, it went pretty unrestricted for a long, long time. So I guess I would say, you know, you guys are brilliant at technology; let's figure out another algorithm. Can we not?
CH: An algorithm that includes journalistic investigation --

CA: I don't really know how they do it, but somehow, you know -- filter out the crap!

(Laughter)

And not just the unintentional --

(Applause)

but the deliberate lies that are planted by people who've been doing this as a matter of warfare for decades. The Soviets, the Russians -- they are the masters of war by other means, of hybrid warfare. And this is a -- this is what they've decided to do. It worked in the United States, it didn't work in France, it hasn't worked in Germany. During the elections there, where they've tried to interfere, the president of France right now, Emmanuel Macron, took a very tough stand and confronted it head on, as did Angela Merkel.
CH: There's some hope to be had from some of this, isn't there? That the world learns. We get fooled once, maybe we get fooled again, but maybe not the third time. Is that true?

CA: I mean, let's hope. But I think in this regard that so much of it is also about technology, that the technology has to also be given some kind of moral compass. I know I'm talking nonsense, but you know what I mean.

CH: We need a filter-the-crap algorithm with a moral compass --

CA: There you go.

CH: I think that's good.

CA: No -- "moral technology." We all have moral compasses -- moral technology.

CH: I think that's a great challenge. CA: You know what I mean.
CH: Talk just a minute about leadership. You've had a chance to speak with so many people across the world. I think for some of us -- I speak for myself, I don't know if others feel this -- there's kind of been a disappointment of: Where are the leaders? So many of us have been disappointed -- Aung San Suu Kyi, what's happened recently, it's like, "No! Another one bites the dust." You know, it's heartbreaking.

(Laughter)

Who have you met who you have been impressed by, inspired by?
CA: Well, you talk about the world in crisis, which is absolutely true, and those of us who spend our whole lives immersed in this crisis -- I mean, we're all on the verge of a nervous breakdown. So it's pretty stressful right now. And you're right -- there is this perceived and actual vacuum of leadership, and it's not me saying it, I ask all these -- whoever I'm talking to, I ask about leadership. I was speaking to the outgoing president of Liberia today, [Ellen Johnson Sirleaf,] who --

(Applause)

in three weeks' time, will be one of the very rare heads of an African country who actually abides by the constitution and gives up power after her prescribed term. She has said she wants to do that as a lesson.

But when I asked her about leadership, and I gave a quick-fire round of certain names, I presented her with the name of the new French president, Emmanuel Macron. And she said -- I said, "So what do you think when I say his name?" And she said, "Shaping up potentially to be a leader to fill our current leadership vacuum." I thought that was really interesting. Yesterday, I happened to have an interview with him. I'm very proud to say, I got his first international interview. It was great. It was yesterday. And I was really impressed. I don't know whether I should be saying that in an open forum, but I was really impressed.

(Laughter)

And it could be just because it was his first interview, but -- I asked questions, and you know what? He answered them!

(Laughter)

(Applause)

There was no spin, there was no wiggle and waggle, there was no spend-five-minutes-to-come-back-to-the-point. I didn't have to keep interrupting, which I've become rather renowned for doing, because I want people to answer the question. And he answered me, and it was pretty interesting. And he said --

CH: Tell me what he said.

CA: No, no, you go ahead.

CH: You're the interrupter, I'm the listener.

CA: No, no, go ahead.

CH: What'd he say?

CA: OK. You've talked about nationalism and tribalism here today. I asked him, "How did you have the guts to confront the prevailing winds of anti-globalization, nationalism, populism when you can see what happened in Brexit, when you could see what happened in the United States and what might have happened in many European elections at the beginning of 2017?" And he said, "For me, nationalism means war. We have seen it before, we have lived through it before on my continent, and I am very clear about that." So he was not going to, just for political expediency, embrace the, kind of, lowest common denominator that had been embraced in other political elections. And he stood against Marine Le Pen, who is a very dangerous woman.
CH: Last question for you, Christiane. TED is about ideas worth spreading. If you could plant one idea into the minds of everyone here, what would that be?
CA: I would say really be careful where you get your information from; really take responsibility for what you read, listen to and watch; make sure that you go to the trusted brands to get your main information, no matter whether you have a wide, eclectic intake, really stick with the brand names that you know, because in this world right now, at this moment right now, our crises, our challenges, our problems are so severe, that unless we are all engaged as global citizens who appreciate the truth, who understand science, empirical evidence and facts, then we are just simply going to be wandering along to a potential catastrophe.

So I would say, the truth, and then I would come back to Emmanuel Macron and talk about love. I would say that there's not enough love going around. And I asked him to tell me about love. I said, "You know, your marriage is the subject of global obsession."

(Laughter)

"Can you tell me about love? What does it mean to you?" I've never asked a president or an elected leader about love. I thought I'd try it. And he said -- you know, he actually answered it. And he said, "I love my wife, she is part of me, we've been together for decades." But here's where it really counted, what really stuck with me. He said, "It is so important for me to have somebody at home who tells me the truth."

So you see, I brought it home. It's all about the truth.

(Laughter)

CH: So there you go. Truth and love. Ideas worth spreading. Christiane Amanpour, thank you so much. That was great.

(Applause)

CA: Thank you. CH: That was really lovely.

(Applause)

CA: Thank you.