How we can protect truth in the age of misinformation | Sinan Aral

235,868 views ・ 2020-01-16

TED



00:00
Translator: Ivana Korom
Reviewer: Krystian Aparta
00:13
So, on April 23 of 2013, the Associated Press put out the following tweet on Twitter. It said, "Breaking news: Two explosions at the White House and Barack Obama has been injured." This tweet was retweeted 4,000 times in less than five minutes, and it went viral thereafter.

00:40
Now, this tweet wasn't real news put out by the Associated Press. In fact, it was false news, or fake news, that was propagated by Syrian hackers that had infiltrated the Associated Press Twitter handle. Their purpose was to disrupt society, but they disrupted much more. Because automated trading algorithms immediately seized on the sentiment on this tweet, and began trading based on the potential that the president of the United States had been injured or killed in this explosion. And as they started tweeting, they immediately sent the stock market crashing, wiping out 140 billion dollars in equity value in a single day.
01:25
Robert Mueller, special counsel prosecutor in the United States, issued indictments against three Russian companies and 13 Russian individuals on a conspiracy to defraud the United States by meddling in the 2016 presidential election. And what this indictment tells as a story is the story of the Internet Research Agency, the shadowy arm of the Kremlin on social media.

01:54
During the presidential election alone, the Internet Agency's efforts reached 126 million people on Facebook in the United States, issued three million individual tweets and 43 hours' worth of YouTube content. All of which was fake -- misinformation designed to sow discord in the US presidential election.
02:20
A recent study by Oxford University showed that in the recent Swedish elections, one third of all of the information spreading on social media about the election was fake or misinformation. In addition, these types of social-media misinformation campaigns can spread what has been called "genocidal propaganda," for instance against the Rohingya in Burma, triggering mob killings in India.
02:49
We studied fake news and began studying it before it was a popular term. And we recently published the largest-ever longitudinal study of the spread of fake news online on the cover of "Science" in March of this year. We studied all of the verified true and false news stories that ever spread on Twitter, from its inception in 2006 to 2017. And when we studied this information, we studied verified news stories that were verified by six independent fact-checking organizations. So we knew which stories were true and which stories were false. We can measure their diffusion, the speed of their diffusion, the depth and breadth of their diffusion, how many people become entangled in this information cascade and so on.
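To make "depth," "breadth," and cascade size concrete, here is a minimal sketch (not the study's actual measurement code) that computes those quantities for a single retweet cascade, assuming the cascade is given as parent-to-child retweet edges rooted at the original poster.

```python
# A minimal, hypothetical sketch of the cascade metrics described above,
# assuming a retweet cascade represented as (parent_user, child_user) edges.
from collections import defaultdict, deque

def cascade_metrics(root, edges):
    """Return size, depth, and maximum breadth of a retweet cascade."""
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    size, depth, max_breadth = 0, 0, 0
    level_counts = defaultdict(int)
    frontier = deque([(root, 0)])          # breadth-first walk from the root
    while frontier:
        user, level = frontier.popleft()
        size += 1
        depth = max(depth, level)
        level_counts[level] += 1
        max_breadth = max(max_breadth, level_counts[level])
        for child in children[user]:
            frontier.append((child, level + 1))
    return {"size": size, "depth": depth, "max_breadth": max_breadth}

# Example: the root is retweeted by A and B, and A is retweeted by C.
print(cascade_metrics("root", [("root", "A"), ("root", "B"), ("A", "C")]))
# -> {'size': 4, 'depth': 2, 'max_breadth': 2}
```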
03:40
And what we did in this paper was we compared the spread of true news to the spread of false news. And here's what we found. We found that false news diffused further, faster, deeper and more broadly than the truth in every category of information that we studied, sometimes by an order of magnitude. And in fact, false political news was the most viral. It diffused further, faster, deeper and more broadly than any other type of false news.

04:09
When we saw this, we were at once worried but also curious. Why? Why does false news travel so much further, faster, deeper and more broadly than the truth?
04:20
The first hypothesis that we came up with was, "Well, maybe people who spread false news have more followers or follow more people, or tweet more often, or maybe they're more often 'verified' users of Twitter, with more credibility, or maybe they've been on Twitter longer." So we checked each one of these in turn. And what we found was exactly the opposite. False-news spreaders had fewer followers, followed fewer people, were less active, less often "verified" and had been on Twitter for a shorter period of time. And yet, false news was 70 percent more likely to be retweeted than the truth, controlling for all of these and many other factors.
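As an illustration of what "controlling for all of these factors" can mean, one could fit a logistic regression of a retweet outcome on veracity plus account covariates; exponentiating the veracity coefficient gives an odds ratio, and a value around 1.7 corresponds to roughly 70 percent higher odds. The sketch below uses synthetic data and hypothetical column names, and is not the paper's actual model specification.

```python
# Illustrative only: estimating how veracity relates to retweet odds while
# controlling for account covariates. Data and column names are made up.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "retweeted": rng.binomial(1, 0.3, n),        # 1 if the tweet got retweeted
    "is_false": rng.binomial(1, 0.5, n),         # 1 if fact-checkers rated it false
    "followers": rng.lognormal(5, 1, n),
    "followees": rng.lognormal(5, 1, n),
    "account_age_days": rng.uniform(30, 3000, n),
    "verified": rng.binomial(1, 0.05, n),
})

X = df[["is_false", "verified", "account_age_days"]].copy()
X["log_followers"] = np.log1p(df["followers"])
X["log_followees"] = np.log1p(df["followees"])
X = sm.add_constant(X)

model = sm.Logit(df["retweeted"], X).fit(disp=False)

# The exponentiated coefficient on is_false is an odds ratio:
# ~1.7 would mean about 70 percent higher odds of being retweeted.
print(np.exp(model.params["is_false"]))
```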
05:00
So we had to come up with other explanations. And we devised what we called a "novelty hypothesis." So if you read the literature, it is well known that human attention is drawn to novelty, things that are new in the environment. And if you read the sociology literature, you know that we like to share novel information. It makes us seem like we have access to inside information, and we gain in status by spreading this kind of information. So what we did was we measured the novelty of an incoming true or false tweet, compared to the corpus of what that individual had seen in the 60 days prior on Twitter.
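As a simplified stand-in for that novelty measurement, the sketch below scores an incoming tweet as one minus its maximum TF-IDF cosine similarity to the tweets a user was exposed to in the prior 60 days. The study itself used topic-model-based distance measures, so this is only an illustration, and the function and variable names are hypothetical.

```python
# Simplified stand-in for a novelty score: compare an incoming tweet to the
# corpus a user saw in the prior 60 days (higher = more novel).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def novelty_score(incoming_tweet, seen_tweets):
    """Return 1 minus the max cosine similarity to recently seen tweets."""
    vectorizer = TfidfVectorizer().fit(seen_tweets + [incoming_tweet])
    seen_vecs = vectorizer.transform(seen_tweets)
    new_vec = vectorizer.transform([incoming_tweet])
    return 1.0 - cosine_similarity(new_vec, seen_vecs).max()

seen = ["markets rally after earnings", "local team wins championship"]
print(novelty_score("explosions reported near the stadium", seen))  # near 1 = novel
```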
05:43
But that wasn't enough, because we thought to ourselves, "Well, maybe false news is more novel in an information-theoretic sense, but maybe people don't perceive it as more novel." So to understand people's perceptions of false news, we looked at the information and the sentiment contained in the replies to true and false tweets. And what we found was that across a bunch of different measures of sentiment -- surprise, disgust, fear, sadness, anticipation, joy and trust -- false news exhibited significantly more surprise and disgust in the replies to false tweets. And true news exhibited significantly more anticipation, joy and trust in reply to true tweets. The surprise corroborates our novelty hypothesis. This is new and surprising, and so we're more likely to share it.
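One simple way to approximate that reply-emotion measurement is to count hits against a word-emotion lexicon covering these categories (the NRC emotion lexicon is one such resource). The sketch below uses a tiny inline lexicon purely for illustration; it is not the study's method.

```python
# Toy version of the reply-emotion measurement: count lexicon hits per emotion
# category across replies. The inline lexicon is an illustrative subset only;
# a real analysis would use a full word-emotion lexicon (e.g. NRC EmoLex).
from collections import Counter

EMOTION_LEXICON = {  # word -> emotion (illustrative subset)
    "wow": "surprise", "unbelievable": "surprise", "shocking": "surprise",
    "gross": "disgust", "disgusting": "disgust",
    "hope": "anticipation", "finally": "anticipation",
    "great": "joy", "love": "joy",
    "reliable": "trust", "credible": "trust",
}

def emotion_profile(replies):
    """Return the share of lexicon hits falling into each emotion category."""
    counts = Counter()
    for reply in replies:
        for word in reply.lower().split():
            emotion = EMOTION_LEXICON.get(word.strip(".,!?"))
            if emotion:
                counts[emotion] += 1
    total = sum(counts.values()) or 1
    return {emotion: n / total for emotion, n in counts.items()}

print(emotion_profile(["Wow, unbelievable!", "This is disgusting."]))
# -> {'surprise': 0.666..., 'disgust': 0.333...}
```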
06:43
At the same time, there was congressional testimony in front of both houses of Congress in the United States, looking at the role of bots in the spread of misinformation. So we looked at this too -- we used multiple sophisticated bot-detection algorithms to find the bots in our data and to pull them out. So we pulled them out, we put them back in and we compared what happens to our measurement.
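Schematically, the "pull them out, put them back in" comparison amounts to recomputing the same diffusion statistic on the full data and on the data with bot-flagged accounts removed, then comparing the two. The bot flags themselves would come from separate detection algorithms; in this hypothetical sketch they are simply given.

```python
# Schematic of the with-bots vs. without-bots comparison: recompute a
# diffusion statistic after dropping retweets by bot-flagged accounts.

def mean_cascade_size(cascades, bot_accounts=frozenset()):
    """Average cascade size, excluding participants flagged as bots."""
    sizes = []
    for cascade in cascades:                # each cascade: list of user ids
        kept = [user for user in cascade if user not in bot_accounts]
        if kept:
            sizes.append(len(kept))
    return sum(sizes) / len(sizes) if sizes else 0.0

cascades = [["alice", "bot7", "bob"], ["carol", "dan", "bot7", "erin"]]
bots = {"bot7"}                             # assumed output of bot detection

with_bots = mean_cascade_size(cascades)
without_bots = mean_cascade_size(cascades, bots)
print(with_bots, without_bots)              # compare how much bots contribute
```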
07:07
And what we found was that, yes indeed, bots were accelerating the spread of false news online, but they were accelerating the spread of true news at approximately the same rate. Which means bots are not responsible for the differential diffusion of truth and falsity online. We can't abdicate that responsibility, because we, humans, are responsible for that spread.

07:34
Now, everything that I have told you so far, unfortunately for all of us, is the good news.
07:42
The reason is because it's about to get a whole lot worse. And two specific technologies are going to make it worse. We are going to see the rise of a tremendous wave of synthetic media: fake video, fake audio that is very convincing to the human eye. And this will be powered by two technologies.
08:06
The first of these is known as "generative adversarial networks." This is a machine-learning model with two networks: a discriminator, whose job it is to determine whether something is true or false, and a generator, whose job it is to generate synthetic media. So the synthetic generator generates synthetic video or audio, and the discriminator tries to tell, "Is this real or is this fake?" And in fact, it is the job of the generator to maximize the likelihood that it will fool the discriminator into thinking the synthetic video and audio that it is creating is actually true. Imagine a machine in a hyperloop, trying to get better and better at fooling us.
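To make the generator-discriminator loop concrete, here is a minimal, hypothetical GAN sketch in PyTorch on toy one-dimensional data; real synthetic-media systems are vastly larger, but the adversarial training structure is the same.

```python
# A minimal GAN sketch (PyTorch) illustrating the generator/discriminator game
# on toy 1-D data. This is a didactic toy, not a deepfake system.
import torch
import torch.nn as nn

real_data = lambda n: torch.randn(n, 1) * 0.5 + 3.0   # "real" samples ~ N(3, 0.5)

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # discriminator

loss = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-3)

for step in range(2000):
    # 1) Train the discriminator to tell real samples from generated ones.
    real, noise = real_data(64), torch.randn(64, 8)
    fake = G(noise).detach()
    d_loss = loss(D(real), torch.ones(64, 1)) + loss(D(fake), torch.zeros(64, 1))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # 2) Train the generator to fool the discriminator into answering "real".
    fake = G(torch.randn(64, 8))
    g_loss = loss(D(fake), torch.ones(64, 1))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()

print(G(torch.randn(5, 8)).detach().squeeze())  # samples should drift toward ~3.0
```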
08:51
This, combined with the second technology, which is essentially the democratization of artificial intelligence to the people, the ability for anyone, without any background in artificial intelligence or machine learning, to deploy these kinds of algorithms to generate synthetic media, makes it ultimately so much easier to create videos.

09:14
The White House issued a false, doctored video of a journalist interacting with an intern who was trying to take his microphone. They removed frames from this video in order to make his actions seem more punchy. And when videographers and stuntmen and women were interviewed about this type of technique, they said, "Yes, we use this in the movies all the time to make our punches and kicks look more choppy and more aggressive." They then put out this video and partly used it as justification to revoke Jim Acosta, the reporter's, press pass from the White House. And CNN had to sue to have that press pass reinstated.
10:00
There are about five different paths that I can think of that we can follow to try and address some of these very difficult problems today. Each one of them has promise, but each one of them has its own challenges.

10:15
The first one is labeling. Think about it this way: when you go to the grocery store to buy food to consume, it's extensively labeled. You know how many calories it has, how much fat it contains -- and yet when we consume information, we have no labels whatsoever. What is contained in this information? Is the source credible? Where is this information gathered from? We have none of that information when we are consuming information. That is a potential avenue, but it comes with its challenges. For instance, who gets to decide, in society, what's true and what's false? Is it the governments? Is it Facebook? Is it an independent consortium of fact-checkers? And who's checking the fact-checkers?
11:02
Another potential avenue is incentives. We know that during the US presidential election there was a wave of misinformation that came from Macedonia that didn't have any political motive but instead had an economic motive. And this economic motive existed because false news travels so much farther, faster and more deeply than the truth, and you can earn advertising dollars as you garner eyeballs and attention with this type of information. But if we can depress the spread of this information, perhaps it would reduce the economic incentive to produce it at all in the first place.
11:40
Third, we can think about regulation, and certainly, we should think about this option. In the United States, currently, we are exploring what might happen if Facebook and others are regulated. While we should consider things like regulating political speech, labeling the fact that it's political speech, making sure foreign actors can't fund political speech, it also has its own dangers. For instance, Malaysia just instituted a six-year prison sentence for anyone found spreading misinformation. And in authoritarian regimes, these kinds of policies can be used to suppress minority opinions and to continue to extend repression.
12:24
The fourth possible option is transparency. We want to know how Facebook's algorithms work. How does the data combine with the algorithms to produce the outcomes that we see? We want them to open the kimono and show us exactly the inner workings of how Facebook is working. And if we want to know social media's effect on society, we need scientists, researchers and others to have access to this kind of information. But at the same time, we are asking Facebook to lock everything down, to keep all of the data secure. So, Facebook and the other social media platforms are facing what I call a transparency paradox. We are asking them, at the same time, to be open and transparent and, simultaneously, secure. This is a very difficult needle to thread, but they will need to thread this needle if we are to achieve the promise of social technologies while avoiding their peril.
13:24
The final thing that we could think about is algorithms and machine learning: technology devised to root out and understand fake news, how it spreads, and to try and dampen its flow. Humans have to be in the loop of this technology, because we can never escape that underlying any technological solution or approach is a fundamental ethical and philosophical question about how we define truth and falsity, to whom we give the power to define truth and falsity, which opinions are legitimate, which type of speech should be allowed and so on. Technology is not a solution for that. Ethics and philosophy is a solution for that.
14:10
Nearly every theory of human decision making, human cooperation and human coordination has some sense of the truth at its core. But with the rise of fake news, the rise of fake video, the rise of fake audio, we are teetering on the brink of the end of reality, where we cannot tell what is real from what is fake. And that's potentially incredibly dangerous.

14:38
We have to be vigilant in defending the truth against misinformation. With our technologies, with our policies and, perhaps most importantly, with our own individual responsibilities, decisions, behaviors and actions.

14:57
Thank you very much.

14:59
(Applause)