00:00
Translator: Ivana Korom
Reviewer: Krystian Aparta
Chinese translation: Wanting Zhong
Chinese review: Yolanda Zhang
00:13
So, on April 23 of 2013, the Associated Press put out the following tweet on Twitter. It said, "Breaking news: Two explosions at the White House and Barack Obama has been injured." This tweet was retweeted 4,000 times in less than five minutes, and it went viral thereafter.
00:40
Now, this tweet wasn't real news put out by the Associated Press. In fact it was false news, or fake news, that was propagated by Syrian hackers that had infiltrated the Associated Press Twitter handle. Their purpose was to disrupt society, but they disrupted much more. Because automated trading algorithms immediately seized on the sentiment on this tweet, and began trading based on the potential that the president of the United States had been injured or killed in this explosion.

01:14
And as they started tweeting, they immediately sent the stock market crashing, wiping out 140 billion dollars in equity value in a single day.
01:25
Robert Mueller, special counsel prosecutor in the United States, issued indictments against three Russian companies and 13 Russian individuals on a conspiracy to defraud the United States by meddling in the 2016 presidential election. And what this indictment tells as a story is the story of the Internet Research Agency, the shadowy arm of the Kremlin on social media.
01:54
During the presidential election alone, the Internet Agency's efforts reached 126 million people on Facebook in the United States, issued three million individual tweets and 43 hours' worth of YouTube content. All of which was fake -- misinformation designed to sow discord in the US presidential election.
02:20
A recent study by Oxford University showed that in the recent Swedish elections, one third of all of the information spreading on social media about the election was fake or misinformation.
02:35
In addition, these types of social-media misinformation campaigns can spread what has been called "genocidal propaganda," for instance against the Rohingya in Burma, triggering mob killings in India.
02:49
We studied fake news and began studying it before it was a popular term. And we recently published the largest-ever longitudinal study of the spread of fake news online on the cover of "Science" in March of this year.
03:06
We studied all of the verified true and false news stories that ever spread on Twitter, from its inception in 2006 to 2017. And when we studied this information, we studied verified news stories that were verified by six independent fact-checking organizations. So we knew which stories were true and which stories were false. We can measure their diffusion, the speed of their diffusion, the depth and breadth of their diffusion, how many people become entangled in this information cascade and so on.
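To make these cascade measures concrete, here is a minimal sketch of how size, depth, breadth and a speed proxy could be computed for one retweet cascade, assuming the cascade is given as parent-child retweet pairs with per-user timestamps. It is illustrative only, not the study's actual pipeline or data schema.

```python
from collections import defaultdict, deque

def cascade_metrics(edges, root, times):
    """Compute size, depth, max breadth, and a speed proxy for one cascade.

    edges : list of (parent_user, child_user) retweet pairs
    root  : the user who posted the original tweet
    times : dict user -> datetime of that user's (re)tweet
    (Hypothetical structures chosen for illustration.)
    """
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    depth_of = {root: 0}
    queue = deque([root])
    while queue:                       # breadth-first walk of the retweet tree
        node = queue.popleft()
        for child in children[node]:
            depth_of[child] = depth_of[node] + 1
            queue.append(child)

    size = len(depth_of)               # unique users entangled in the cascade
    depth = max(depth_of.values())     # longest retweet chain from the root
    per_level = defaultdict(int)
    for d in depth_of.values():
        per_level[d] += 1
    breadth = max(per_level.values())  # widest single level of the tree

    minutes = (max(times[u] for u in depth_of) - times[root]).total_seconds() / 60
    speed = size / minutes if minutes > 0 else float("inf")  # users per minute
    return {"size": size, "depth": depth, "breadth": breadth, "speed": speed}
```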
03:40
And what we did in this paper was we compared the spread of true news to the spread of false news. And here's what we found. We found that false news diffused further, faster, deeper and more broadly than the truth in every category of information that we studied, sometimes by an order of magnitude. And in fact, false political news was the most viral. It diffused further, faster, deeper and more broadly than any other type of false news.
04:09
When we saw this, we were at once worried but also curious. Why? Why does false news travel so much further, faster, deeper and more broadly than the truth?

04:20
The first hypothesis that we came up with was, "Well, maybe people who spread false news have more followers or follow more people, or tweet more often, or maybe they're more often 'verified' users of Twitter, with more credibility, or maybe they've been on Twitter longer." So we checked each one of these in turn.
04:38
And what we found was exactly the opposite. False-news spreaders had fewer followers, followed fewer people, were less active, less often "verified" and had been on Twitter for a shorter period of time. And yet, false news was 70 percent more likely to be retweeted than the truth, controlling for all of these and many other factors.
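One standard way to read "controlling for all of these and many other factors" is a regression in which the retweet outcome is modeled with falsehood plus the account covariates as predictors. The sketch below is a generic illustration using statsmodels, with made-up file and column names; it is not the model specification from the published study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-tweet data: whether it was retweeted, whether it was false,
# and the account covariates mentioned in the talk (all names are made up).
df = pd.read_csv("tweets.csv")  # assumed file, for illustration only

covariates = ["followers", "followees", "tweets_per_day",
              "verified", "account_age_days"]
X = sm.add_constant(df[["is_false"] + covariates])
model = sm.Logit(df["retweeted"], X).fit()

# An exponentiated coefficient on is_false of about 1.7 would correspond to
# roughly "70 percent higher odds of being retweeted" after the controls.
print(model.summary())
print("odds ratio for falsehood:", float(np.exp(model.params["is_false"])))
```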
05:00
So we had to come up with other explanations. And we devised what we called a "novelty hypothesis." So if you read the literature, it is well known that human attention is drawn to novelty, things that are new in the environment. And if you read the sociology literature, you know that we like to share novel information. It makes us seem like we have access to inside information, and we gain in status by spreading this kind of information.
05:29
So what we did was we measured the novelty of an incoming true or false tweet, compared to the corpus of what that individual had seen in the 60 days prior on Twitter.
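One simple way to operationalize this kind of novelty score is to compare an incoming tweet's word distribution with the corpus the user was exposed to over the prior 60 days, for example as cosine distance over TF-IDF vectors. The sketch below only illustrates that idea; the published study used topic models and information-theoretic distances, and this is not its code.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def novelty_score(incoming_tweet, prior_tweets):
    """Novelty of a tweet relative to what the user saw in the prior 60 days.

    incoming_tweet : str, the new true or false tweet
    prior_tweets   : list[str], tweets the user was exposed to beforehand
    Returns 1 - cosine similarity between the tweet and the prior corpus,
    so higher values mean more novel. Illustrative metric only.
    """
    corpus = [" ".join(prior_tweets), incoming_tweet]
    tfidf = TfidfVectorizer().fit_transform(corpus)
    similarity = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    return 1.0 - similarity

# Example usage with toy data:
seen = ["markets steady ahead of earnings", "white house press briefing today"]
print(novelty_score("two explosions reported at the white house", seen))
```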
05:43
But that wasn't enough, because we thought to ourselves, "Well, maybe false news is more novel in an information-theoretic sense, but maybe people don't perceive it as more novel." So to understand people's perceptions of false news, we looked at the information and the sentiment contained in the replies to true and false tweets.
06:06
And what we found was that across a bunch of different measures of sentiment -- surprise, disgust, fear, sadness, anticipation, joy and trust -- false news exhibited significantly more surprise and disgust in the replies to false tweets. And true news exhibited significantly more anticipation, joy and trust in reply to true tweets. The surprise corroborates our novelty hypothesis. This is new and surprising, and so we're more likely to share it.
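These seven categories match a standard word-emotion lexicon approach to scoring replies. As a rough illustration of lexicon-based emotion scoring (with a tiny toy lexicon standing in for a real one such as the NRC word-emotion lexicon), the profile of replies to false versus true tweets could be computed like this; it is a sketch, not the study's measurement code.

```python
import re
from collections import Counter

# Toy stand-in for a word-emotion lexicon; a real lexicon maps thousands
# of words to these emotion categories.
LEXICON = {
    "wow": {"surprise"}, "unbelievable": {"surprise"},
    "gross": {"disgust"}, "awful": {"disgust", "fear"},
    "hope": {"anticipation", "joy"}, "thanks": {"joy", "trust"},
    "scared": {"fear"}, "sad": {"sadness"},
}
EMOTIONS = ["surprise", "disgust", "fear", "sadness",
            "anticipation", "joy", "trust"]

def emotion_profile(replies):
    """Fraction of emotion-bearing words in each category across replies."""
    counts = Counter()
    for reply in replies:
        for word in re.findall(r"[a-z']+", reply.lower()):
            for emotion in LEXICON.get(word, ()):
                counts[emotion] += 1
    total = sum(counts.values()) or 1
    return {e: counts[e] / total for e in EMOTIONS}

# Compare the profiles of replies to false vs. true tweets:
false_replies = ["wow, unbelievable!", "this is gross and awful"]
true_replies = ["thanks for sharing", "hope this helps"]
print(emotion_profile(false_replies))
print(emotion_profile(true_replies))
```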
06:43
At the same time, there was congressional testimony in front of both houses of Congress in the United States, looking at the role of bots in the spread of misinformation. So we looked at this too -- we used multiple sophisticated bot-detection algorithms to find the bots in our data and to pull them out. So we pulled them out, we put them back in and we compared what happens to our measurement.
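The robustness check described here, removing suspected bot accounts, recomputing the diffusion statistics, then putting the bots back, can be sketched as below. The bot classifier, data layout and metric are placeholders, not the study's actual tooling; running it separately for true and false cascades shows whether removing bots changes the gap between them.

```python
def diffusion_with_and_without_bots(cascades, is_bot, metric):
    """Compare a diffusion metric with and without suspected bot accounts.

    cascades : list of cascades, each a list of (user, parent, timestamp) tuples
    is_bot   : callable user -> bool, a stand-in for a bot-detection algorithm
    metric   : callable cascade -> float (e.g. cascade size or depth)
    Returns the average metric over all cascades, with bots included and
    with bot accounts filtered out. Illustrative comparison only.
    """
    def average(cs):
        return sum(metric(c) for c in cs) / len(cs)

    filtered = [[row for row in cascade if not is_bot(row[0])]
                for cascade in cascades]
    filtered = [c for c in filtered if c]   # drop cascades that become empty
    return {"with_bots": average(cascades), "without_bots": average(filtered)}
```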
07:07
And what we found was that, yes indeed, bots were accelerating the spread of false news online, but they were accelerating the spread of true news at approximately the same rate. Which means bots are not responsible for the differential diffusion of truth and falsity online. We can't abdicate that responsibility, because we, humans, are responsible for that spread.
07:34
Now, everything that I have told you so far, unfortunately for all of us, is the good news. The reason is because it's about to get a whole lot worse. And two specific technologies are going to make it worse. We are going to see the rise of a tremendous wave of synthetic media. Fake video, fake audio that is very convincing to the human eye. And this will be powered by two technologies.
08:06
The first of these is known as "generative adversarial networks." This is a machine-learning model with two networks: a discriminator, whose job it is to determine whether something is true or false, and a generator, whose job it is to generate synthetic media. So the synthetic generator generates synthetic video or audio, and the discriminator tries to tell, "Is this real or is this fake?" And in fact, it is the job of the generator to maximize the likelihood that it will fool the discriminator into thinking the synthetic video and audio that it is creating is actually true. Imagine a machine in a hyperloop, trying to get better and better at fooling us.
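For readers who want the adversarial setup in code, here is a minimal, generic GAN training loop in PyTorch on toy one-dimensional data, just to show the generator-versus-discriminator game the talk describes; it is an illustrative sketch, not any particular deepfake system.

```python
import torch
import torch.nn as nn

# Toy "real" data: samples from a Gaussian; a real system would use video/audio.
def real_batch(n):
    return torch.randn(n, 1) * 0.5 + 2.0

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())  # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # Discriminator step: label real samples 1 and generated samples 0.
    real = real_batch(64)
    fake = G(torch.randn(64, 8)).detach()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to make the discriminator call its output "real".
    fake = G(torch.randn(64, 8))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```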
08:51
This, combined with the second technology, which is essentially the democratization of artificial intelligence to the people, the ability for anyone, without any background in artificial intelligence or machine learning, to deploy these kinds of algorithms to generate synthetic media, makes it ultimately so much easier to create videos.
09:14
The White House issued a false, doctored video of a journalist interacting with an intern who was trying to take his microphone. They removed frames from this video in order to make his actions seem more punchy. And when videographers and stuntmen and women were interviewed about this type of technique, they said, "Yes, we use this in the movies all the time to make our punches and kicks look more choppy and more aggressive." They then put out this video and partly used it as justification to revoke Jim Acosta, the reporter's, press pass from the White House. And CNN had to sue to have that press pass reinstated.
10:00
There are about five different paths that I can think of that we can follow to try and address some of these very difficult problems today. Each one of them has promise, but each one of them has its own challenges.
10:15
The first one is labeling. Think about it this way: when you go to the grocery store to buy food to consume, it's extensively labeled. You know how many calories it has, how much fat it contains -- and yet when we consume information, we have no labels whatsoever. What is contained in this information? Is the source credible? Where is this information gathered from? We have none of that information when we are consuming information. That is a potential avenue, but it comes with its challenges. For instance, who gets to decide, in society, what's true and what's false? Is it the governments? Is it Facebook? Is it an independent consortium of fact-checkers? And who's checking the fact-checkers?
11:02
Another potential avenue is incentives. We know that during the US presidential election there was a wave of misinformation that came from Macedonia that didn't have any political motive but instead had an economic motive. And this economic motive existed, because false news travels so much farther, faster and more deeply than the truth, and you can earn advertising dollars as you garner eyeballs and attention with this type of information. But if we can depress the spread of this information, perhaps it would reduce the economic incentive to produce it at all in the first place.
11:40
Third, we can think about regulation, and certainly, we should think about this option. In the United States, currently, we are exploring what might happen if Facebook and others are regulated. While we should consider things like regulating political speech, labeling the fact that it's political speech, making sure foreign actors can't fund political speech, it also has its own dangers. For instance, Malaysia just instituted a six-year prison sentence for anyone found spreading misinformation. And in authoritarian regimes, these kinds of policies can be used to suppress minority opinions and to continue to extend repression.
12:24
The fourth possible option is transparency. We want to know how do Facebook's algorithms work. How does the data combine with the algorithms to produce the outcomes that we see? We want them to open the kimono and show us exactly the inner workings of how Facebook is working. And if we want to know social media's effect on society, we need scientists, researchers and others to have access to this kind of information. But at the same time, we are asking Facebook to lock everything down, to keep all of the data secure. So, Facebook and the other social media platforms are facing what I call a transparency paradox. We are asking them, at the same time, to be open and transparent and, simultaneously, secure. This is a very difficult needle to thread, but they will need to thread this needle if we are to achieve the promise of social technologies while avoiding their peril.
13:24
The final thing that we could think about is algorithms and machine learning. Technology devised to root out and understand fake news, how it spreads, and to try and dampen its flow. Humans have to be in the loop of this technology, because we can never escape that underlying any technological solution or approach is a fundamental ethical and philosophical question about how do we define truth and falsity, to whom do we give the power to define truth and falsity and which opinions are legitimate, which type of speech should be allowed and so on. Technology is not a solution for that. Ethics and philosophy is a solution for that.
14:10
Nearly every theory of human decision making, human cooperation and human coordination has some sense of the truth at its core. But with the rise of fake news, the rise of fake video, the rise of fake audio, we are teetering on the brink of the end of reality, where we cannot tell what is real from what is fake. And that's potentially incredibly dangerous.
14:38
We have to be vigilant in defending the truth against misinformation. With our technologies, with our policies and, perhaps most importantly, with our own individual responsibilities, decisions, behaviors and actions.

14:57
Thank you very much.

14:59
(Applause)