How deepfakes undermine truth and threaten democracy | Danielle Citron

86,286 views ・ 2019-10-07

TED



Translator: Nan Yang  Reviewer: Yolanda Zhang
00:12
[This talk contains mature content]
00:17
Rana Ayyub is a journalist in India whose work has exposed government corruption and human rights violations. And over the years, she's gotten used to vitriol and controversy around her work. But none of it could have prepared her for what she faced in April 2018.
00:38
She was sitting in a café with a friend when she first saw it: a two-minute, 20-second video of her engaged in a sex act. And she couldn't believe her eyes. She had never made a sex video. But unfortunately, thousands upon thousands of people would believe it was her.
00:58
I interviewed Ms. Ayyub about three months ago, in connection with my book on sexual privacy. I'm a law professor, lawyer and civil rights advocate. So it's incredibly frustrating knowing that right now, law could do very little to help her.
01:15
And as we talked, she explained that she should have seen the fake sex video coming. She said, "After all, sex is so often used to demean and to shame women, especially minority women, and especially minority women who dare to challenge powerful men," as she had in her work.
01:37
The fake sex video went viral in 48 hours. All of her online accounts were flooded with screenshots of the video, with graphic rape and death threats and with slurs about her Muslim faith.
01:53
Online posts suggested that she was "available" for sex. And she was doxed, which means that her home address and her cell phone number were spread across the internet. The video was shared more than 40,000 times.
02:09
Now, when someone is targeted with this kind of cybermob attack, the harm is profound. Rana Ayyub's life was turned upside down. For weeks, she could hardly eat or speak. She stopped writing and closed all of her social media accounts, which is, you know, a tough thing to do when you're a journalist. And she was afraid to go outside her family's home. What if the posters made good on their threats? The UN Council on Human Rights confirmed that she wasn't being crazy. It issued a public statement saying that they were worried about her safety.
02:48
What Rana Ayyub faced was a deepfake: machine-learning technology that manipulates or fabricates audio and video recordings to show people doing and saying things that they never did or said. Deepfakes appear authentic and realistic, but they're not; they're total falsehoods. Although the technology is still developing in its sophistication, it is widely available.
03:17
Now, the most recent attention to deepfakes arose, as so many things do online, with pornography. In early 2018, someone posted a tool on Reddit to allow users to insert faces into porn videos. And what followed was a cascade of fake porn videos featuring people's favorite female celebrities. And today, you can go on YouTube and pull up countless tutorials with step-by-step instructions on how to make a deepfake on your desktop application. And soon we may be even able to make them on our cell phones.
03:55
Now, it's the interaction of some of our most basic human frailties and network tools that can turn deepfakes into weapons. So let me explain.
04:06
As human beings, we have a visceral reaction to audio and video. We believe they're true, on the notion that of course you can believe what your eyes and ears are telling you. And it's that mechanism that might undermine our shared sense of reality. Although we believe deepfakes to be true, they're not. And we're attracted to the salacious, the provocative. We tend to believe and to share information that's negative and novel. And researchers have found that online hoaxes spread 10 times faster than accurate stories.
04:46
Now, we're also drawn to information that aligns with our viewpoints. Psychologists call that tendency "confirmation bias." And social media platforms supercharge that tendency, by allowing us to instantly and widely share information that accords with our viewpoints.
05:08
Now, deepfakes have the potential to cause grave individual and societal harm. So, imagine a deepfake that shows American soldiers in Afghanistan burning a Koran. You can imagine that that deepfake would provoke violence against those soldiers. And what if the very next day there's another deepfake that drops, that shows a well-known imam based in London praising the attack on those soldiers? We might see violence and civil unrest, not only in Afghanistan and the United Kingdom, but across the globe.
05:48
And you might say to me, "Come on, Danielle, that's far-fetched." But it's not. We've seen falsehoods spread on WhatsApp and other online message services lead to violence against ethnic minorities. And that was just text -- imagine if it were video.
06:06
Now, deepfakes have the potential to corrode the trust that we have in democratic institutions. So, imagine the night before an election. There's a deepfake showing one of the major party candidates gravely sick. The deepfake could tip the election and shake our sense that elections are legitimate. Imagine if the night before an initial public offering of a major global bank, there was a deepfake showing the bank's CEO drunkenly spouting conspiracy theories. The deepfake could tank the IPO, and worse, shake our sense that financial markets are stable.
06:51
So deepfakes can exploit and magnify the deep distrust that we already have in politicians, business leaders and other influential leaders. They find an audience primed to believe them.
07:07
And the pursuit of truth is on the line as well. Technologists expect that with advances in AI, soon it may be difficult if not impossible to tell the difference between a real video and a fake one.
07:23
So how can the truth emerge in a deepfake-ridden marketplace of ideas? Will we just proceed along the path of least resistance and believe what we want to believe, truth be damned? And not only might we believe the fakery, we might start disbelieving the truth. We've already seen people invoke the phenomenon of deepfakes to cast doubt on real evidence of their wrongdoing. We've heard politicians say of audio of their disturbing comments, "Come on, that's fake news. You can't believe what your eyes and ears are telling you."
08:04
And it's that risk that professor Robert Chesney and I call the "liar's dividend": the risk that liars will invoke deepfakes to escape accountability for their wrongdoing.
08:18
So we've got our work cut out for us, there's no doubt about it. And we're going to need a proactive solution from tech companies, from lawmakers, law enforcers and the media. And we're going to need a healthy dose of societal resilience.
08:37
So now, we're right now engaged in a very public conversation about the responsibility of tech companies. And my advice to social media platforms has been to change their terms of service and community guidelines to ban deepfakes that cause harm. That determination, that's going to require human judgment, and it's expensive. But we need human beings to look at the content and context of a deepfake to figure out if it is a harmful impersonation or instead, if it's valuable satire, art or education.
09:16
So now, what about the law? Law is our educator. It teaches us about what's harmful and what's wrong. And it shapes behavior it deters by punishing perpetrators and securing remedies for victims.
09:33
Right now, law is not up to the challenge of deepfakes. Across the globe, we lack well-tailored laws that would be designed to tackle digital impersonations that invade sexual privacy, that damage reputations and that cause emotional distress. What happened to Rana Ayyub is increasingly commonplace. Yet, when she went to law enforcement in Delhi, she was told nothing could be done. And the sad truth is that the same would be true in the United States and in Europe.
10:07
So we have a legal vacuum that needs to be filled. My colleague Dr. Mary Anne Franks and I are working with US lawmakers to devise legislation that would ban harmful digital impersonations that are tantamount to identity theft. And we've seen similar moves in Iceland, the UK and Australia. But of course, that's just a small piece of the regulatory puzzle.
10:34
Now, I know law is not a cure-all. Right? It's a blunt instrument. And we've got to use it wisely. It also has some practical impediments. You can't leverage law against people you can't identify and find. And if a perpetrator lives outside the country where a victim lives, then you may not be able to insist that the perpetrator come into local courts to face justice. And so we're going to need a coordinated international response.
11:07
Education has to be part of our response as well. Law enforcers are not going to enforce laws they don't know about and proffer problems they don't understand. In my research on cyberstalking, I found that law enforcement lacked the training to understand the laws available to them and the problem of online abuse. And so often they told victims, "Just turn your computer off. Ignore it. It'll go away." And we saw that in Rana Ayyub's case. She was told, "Come on, you're making such a big deal about this. It's boys being boys."
11:47
And so we need to pair new legislation with efforts at training. And education has to be aimed at the media as well. Journalists need educating about the phenomenon of deepfakes so they don't amplify and spread them.
12:06
And this is the part where we're all involved. Each and every one of us needs educating. We click, we share, we like, and we don't even think about it. We need to do better. We need far better radar for fakery.
12:25
So as we're working through these solutions, there's going to be a lot of suffering to go around. Rana Ayyub is still wrestling with the fallout. She still doesn't feel free to express herself on- and offline. And as she told me, she still feels like there are thousands of eyes on her naked body, even though, intellectually, she knows it wasn't her body. And she has frequent panic attacks, especially when someone she doesn't know tries to take her picture. "What if they're going to make another deepfake?" she thinks to herself.
13:03
And so for the sake of individuals like Rana Ayyub and the sake of our democracy, we need to do something right now. Thank you.
13:12
(Applause)