How deepfakes undermine truth and threaten democracy | Danielle Citron

86,369 views ・ 2019-10-07

TED



[This talk contains mature content]

Rana Ayyub is a journalist in India whose work has exposed government corruption and human rights violations. And over the years, she's gotten used to vitriol and controversy around her work. But none of it could have prepared her for what she faced in April 2018. She was sitting in a café with a friend when she first saw it: a two-minute, 20-second video of her engaged in a sex act. And she couldn't believe her eyes. She had never made a sex video. But unfortunately, thousands upon thousands of people would believe it was her.

I interviewed Ms. Ayyub about three months ago, in connection with my book on sexual privacy. I'm a law professor, lawyer and civil rights advocate. So it's incredibly frustrating knowing that right now, law could do very little to help her. And as we talked, she explained that she should have seen the fake sex video coming. She said, "After all, sex is so often used to demean and to shame women, especially minority women, and especially minority women who dare to challenge powerful men," as she had in her work.

The fake sex video went viral in 48 hours. All of her online accounts were flooded with screenshots of the video, with graphic rape and death threats and with slurs about her Muslim faith. Online posts suggested that she was "available" for sex. And she was doxed, which means that her home address and her cell phone number were spread across the internet. The video was shared more than 40,000 times.

Now, when someone is targeted with this kind of cybermob attack, the harm is profound. Rana Ayyub's life was turned upside down. For weeks, she could hardly eat or speak. She stopped writing and closed all of her social media accounts, which is, you know, a tough thing to do when you're a journalist. And she was afraid to go outside her family's home. What if the posters made good on their threats? The UN Human Rights Council confirmed that she wasn't being crazy. It issued a public statement saying that it was worried about her safety.

What Rana Ayyub faced was a deepfake: machine-learning technology that manipulates or fabricates audio and video recordings to show people doing and saying things that they never did or said. Deepfakes appear authentic and realistic, but they're not; they're total falsehoods. Although the technology is still developing in its sophistication, it is widely available.

Now, the most recent attention to deepfakes arose, as so many things do online, with pornography. In early 2018, someone posted a tool on Reddit to allow users to insert faces into porn videos. And what followed was a cascade of fake porn videos featuring people's favorite female celebrities. And today, you can go on YouTube and pull up countless tutorials with step-by-step instructions on how to make a deepfake on your desktop application. And soon we may even be able to make them on our cell phones.

Now, it's the interaction of some of our most basic human frailties and network tools that can turn deepfakes into weapons. So let me explain.

As human beings, we have a visceral reaction to audio and video. We believe they're true, on the notion that of course you can believe what your eyes and ears are telling you. And it's that mechanism that might undermine our shared sense of reality. Although we believe deepfakes to be true, they're not. And we're attracted to the salacious, the provocative. We tend to believe and to share information that's negative and novel. And researchers have found that online hoaxes spread 10 times faster than accurate stories.

Now, we're also drawn to information that aligns with our viewpoints. Psychologists call that tendency "confirmation bias." And social media platforms supercharge that tendency, by allowing us to instantly and widely share information that accords with our viewpoints.

Now, deepfakes have the potential to cause grave individual and societal harm. So, imagine a deepfake that shows American soldiers in Afghanistan burning a Koran. You can imagine that that deepfake would provoke violence against those soldiers. And what if the very next day there's another deepfake that drops, that shows a well-known imam based in London praising the attack on those soldiers? We might see violence and civil unrest, not only in Afghanistan and the United Kingdom, but across the globe.

And you might say to me, "Come on, Danielle, that's far-fetched." But it's not. We've seen falsehoods spread on WhatsApp and other online message services lead to violence against ethnic minorities. And that was just text -- imagine if it were video.

Now, deepfakes have the potential to corrode the trust that we have in democratic institutions. So, imagine the night before an election. There's a deepfake showing one of the major party candidates gravely sick. The deepfake could tip the election and shake our sense that elections are legitimate.

Imagine if the night before an initial public offering of a major global bank, there was a deepfake showing the bank's CEO drunkenly spouting conspiracy theories. The deepfake could tank the IPO, and worse, shake our sense that financial markets are stable.

So deepfakes can exploit and magnify the deep distrust that we already have in politicians, business leaders and other influential leaders. They find an audience primed to believe them.

And the pursuit of truth is on the line as well. Technologists expect that with advances in AI, soon it may be difficult if not impossible to tell the difference between a real video and a fake one. So how can the truth emerge in a deepfake-ridden marketplace of ideas? Will we just proceed along the path of least resistance and believe what we want to believe, truth be damned?

And not only might we believe the fakery, we might start disbelieving the truth. We've already seen people invoke the phenomenon of deepfakes to cast doubt on real evidence of their wrongdoing. We've heard politicians say of audio of their disturbing comments, "Come on, that's fake news. You can't believe what your eyes and ears are telling you." And it's that risk that Professor Robert Chesney and I call the "liar's dividend": the risk that liars will invoke deepfakes to escape accountability for their wrongdoing.

So we've got our work cut out for us, there's no doubt about it. And we're going to need a proactive solution from tech companies, from lawmakers, law enforcers and the media. And we're going to need a healthy dose of societal resilience.

Right now, we're engaged in a very public conversation about the responsibility of tech companies. And my advice to social media platforms has been to change their terms of service and community guidelines to ban deepfakes that cause harm. That determination, that's going to require human judgment, and it's expensive. But we need human beings to look at the content and context of a deepfake to figure out if it is a harmful impersonation or instead, if it's valuable satire, art or education.

So now, what about the law? Law is our educator. It teaches us about what's harmful and what's wrong. And it shapes behavior: it deters by punishing perpetrators and securing remedies for victims.

Right now, law is not up to the challenge of deepfakes. Across the globe, we lack well-tailored laws that would be designed to tackle digital impersonations that invade sexual privacy, that damage reputations and that cause emotional distress.

What happened to Rana Ayyub is increasingly commonplace. Yet, when she went to law enforcement in Delhi, she was told nothing could be done. And the sad truth is that the same would be true in the United States and in Europe. So we have a legal vacuum that needs to be filled.

My colleague Dr. Mary Anne Franks and I are working with US lawmakers to devise legislation that would ban harmful digital impersonations that are tantamount to identity theft. And we've seen similar moves in Iceland, the UK and Australia. But of course, that's just a small piece of the regulatory puzzle.

Now, I know law is not a cure-all. Right? It's a blunt instrument. And we've got to use it wisely. It also has some practical impediments. You can't leverage law against people you can't identify and find. And if a perpetrator lives outside the country where a victim lives, then you may not be able to insist that the perpetrator come into local courts to face justice. And so we're going to need a coordinated international response.

Education has to be part of our response as well. Law enforcers are not going to enforce laws they don't know about or tackle problems they don't understand.

In my research on cyberstalking, I found that law enforcement lacked the training to understand the laws available to them and the problem of online abuse. And so often they told victims, "Just turn your computer off. Ignore it. It'll go away." And we saw that in Rana Ayyub's case. She was told, "Come on, you're making such a big deal about this. It's boys being boys." And so we need to pair new legislation with efforts at training.

And education has to be aimed at the media as well. Journalists need educating about the phenomenon of deepfakes so they don't amplify and spread them. And this is the part where we're all involved. Each and every one of us needs educating. We click, we share, we like, and we don't even think about it. We need to do better. We need far better radar for fakery.

So as we're working through these solutions, there's going to be a lot of suffering to go around. Rana Ayyub is still wrestling with the fallout. She still doesn't feel free to express herself on- and offline. And as she told me, she still feels like there are thousands of eyes on her naked body, even though, intellectually, she knows it wasn't her body. And she has frequent panic attacks, especially when someone she doesn't know tries to take her picture. "What if they're going to make another deepfake?" she thinks to herself.

And so for the sake of individuals like Rana Ayyub and the sake of our democracy, we need to do something right now. Thank you.

(Applause)