How deepfakes undermine truth and threaten democracy | Danielle Citron

86,286 views ・ 2019-10-07

TED



Translator: SF Huang / Reviewer: 易帆 余
00:12
[This talk contains mature content]
00:17
Rana Ayyub is a journalist in India whose work has exposed government corruption and human rights violations. And over the years, she's gotten used to vitriol and controversy around her work. But none of it could have prepared her for what she faced in April 2018.
00:38
She was sitting in a café with a friend when she first saw it: a two-minute, 20-second video of her engaged in a sex act. And she couldn't believe her eyes. She had never made a sex video. But unfortunately, thousands upon thousands of people would believe it was her.
00:58
I interviewed Ms. Ayyub about three months ago, in connection with my book on sexual privacy. I'm a law professor, lawyer and civil rights advocate. So it's incredibly frustrating knowing that right now, law could do very little to help her.
01:15
And as we talked, she explained that she should have seen the fake sex video coming. She said, "After all, sex is so often used to demean and to shame women, especially minority women, and especially minority women who dare to challenge powerful men," as she had in her work.
01:37
The fake sex video went viral in 48 hours. All of her online accounts were flooded with screenshots of the video, with graphic rape and death threats and with slurs about her Muslim faith. Online posts suggested that she was "available" for sex. And she was doxed, which means that her home address and her cell phone number were spread across the internet. The video was shared more than 40,000 times.
02:09
Now, when someone is targeted with this kind of cybermob attack, the harm is profound. Rana Ayyub's life was turned upside down. For weeks, she could hardly eat or speak. She stopped writing and closed all of her social media accounts, which is, you know, a tough thing to do when you're a journalist. And she was afraid to go outside her family's home. What if the posters made good on their threats? The UN Council on Human Rights confirmed that she wasn't being crazy. It issued a public statement saying that they were worried about her safety.
02:48
What Rana Ayyub faced was a deepfake: machine-learning technology that manipulates or fabricates audio and video recordings to show people doing and saying things that they never did or said. Deepfakes appear authentic and realistic, but they're not; they're total falsehoods. Although the technology is still developing in its sophistication, it is widely available.
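For readers curious how this machine-learning trick works, the classic face-swap approach can be sketched as two autoencoders that share one encoder but keep a decoder per identity. This is a heavily simplified illustrative sketch, not a real deepfake pipeline: the `TinyAutoencoder` class, the toy layer sizes, and the random (untrained) weights are all assumptions made up for illustration; real systems train deep convolutional networks on thousands of frames.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM, LATENT = 64, 8  # toy sizes, purely illustrative

class TinyAutoencoder:
    """One shared encoder, one identity-specific decoder."""
    def __init__(self, encoder_w, decoder_w):
        self.encoder_w = encoder_w  # shared across identities
        self.decoder_w = decoder_w  # specific to one person

    def encode(self, x):
        # Compress a frame into a shared latent code (expression, pose)
        return np.tanh(self.encoder_w @ x)

    def decode(self, z):
        # Reconstruct a face image from the latent code
        return self.decoder_w @ z

# Hypothetical untrained weights standing in for trained networks
shared_encoder = rng.normal(size=(LATENT, DIM))
decoder_a = rng.normal(size=(DIM, LATENT))  # would be trained on person A
decoder_b = rng.normal(size=(DIM, LATENT))  # would be trained on person B

model_a = TinyAutoencoder(shared_encoder, decoder_a)
model_b = TinyAutoencoder(shared_encoder, decoder_b)

frame_of_a = rng.normal(size=DIM)  # stand-in for a video frame of person A

# The swap: encode A's frame into the shared latent space,
# then render it with B's decoder, so B's face carries A's pose.
swapped = model_b.decode(model_a.encode(frame_of_a))
print(swapped.shape)
```

Because both decoders are trained against the same latent space, encoding a frame of one person and decoding it with the other person's decoder is what produces the swap; that single design choice is why the technique scaled so easily once consumer tools packaged it.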
03:17
Now, the most recent attention to deepfakes arose, as so many things do online, with pornography. In early 2018, someone posted a tool on Reddit to allow users to insert faces into porn videos. And what followed was a cascade of fake porn videos featuring people's favorite female celebrities. And today, you can go on YouTube and pull up countless tutorials with step-by-step instructions on how to make a deepfake on your desktop application. And soon we may be even able to make them on our cell phones.
03:55
Now, it's the interaction of some of our most basic human frailties and network tools that can turn deepfakes into weapons. So let me explain.
04:06
As human beings, we have a visceral reaction to audio and video. We believe they're true, on the notion that of course you can believe what your eyes and ears are telling you. And it's that mechanism that might undermine our shared sense of reality. Although we believe deepfakes to be true, they're not.
04:27
And we're attracted to the salacious, the provocative. We tend to believe and to share information that's negative and novel. And researchers have found that online hoaxes spread 10 times faster than accurate stories.
04:46
Now, we're also drawn to information that aligns with our viewpoints. Psychologists call that tendency "confirmation bias." And social media platforms supercharge that tendency, by allowing us to instantly and widely share information that accords with our viewpoints.
05:08
Now, deepfakes have the potential to cause grave individual and societal harm.
05:15
So, imagine a deepfake that shows American soldiers in Afghanistan burning a Koran. You can imagine that that deepfake would provoke violence against those soldiers. And what if the very next day there's another deepfake that drops, that shows a well-known imam based in London praising the attack on those soldiers? We might see violence and civil unrest, not only in Afghanistan and the United Kingdom, but across the globe.
05:48
And you might say to me, "Come on, Danielle, that's far-fetched." But it's not. We've seen falsehoods spread on WhatsApp and other online message services lead to violence against ethnic minorities. And that was just text -- imagine if it were video.
06:06
Now, deepfakes have the potential to corrode the trust that we have in democratic institutions.
06:15
So, imagine the night before an election. There's a deepfake showing one of the major party candidates gravely sick. The deepfake could tip the election and shake our sense that elections are legitimate.
06:30
Imagine if the night before an initial public offering of a major global bank, there was a deepfake showing the bank's CEO drunkenly spouting conspiracy theories. The deepfake could tank the IPO, and worse, shake our sense that financial markets are stable.
06:51
So deepfakes can exploit and magnify the deep distrust that we already have in politicians, business leaders and other influential leaders. They find an audience primed to believe them.
07:07
And the pursuit of truth is on the line as well. Technologists expect that with advances in AI, soon it may be difficult if not impossible to tell the difference between a real video and a fake one.
07:23
So how can the truth emerge in a deepfake-ridden marketplace of ideas? Will we just proceed along the path of least resistance and believe what we want to believe, truth be damned? And not only might we believe the fakery, we might start disbelieving the truth. We've already seen people invoke the phenomenon of deepfakes to cast doubt on real evidence of their wrongdoing. We've heard politicians say of audio of their disturbing comments, "Come on, that's fake news. You can't believe what your eyes and ears are telling you."
08:04
And it's that risk that professor Robert Chesney and I call the "liar's dividend": the risk that liars will invoke deepfakes to escape accountability for their wrongdoing.
08:18
So we've got our work cut out for us, there's no doubt about it. And we're going to need a proactive solution from tech companies, from lawmakers, law enforcers and the media. And we're going to need a healthy dose of societal resilience.
08:37
So now, we're right now engaged in a very public conversation about the responsibility of tech companies. And my advice to social media platforms has been to change their terms of service and community guidelines to ban deepfakes that cause harm. That determination, that's going to require human judgment, and it's expensive. But we need human beings to look at the content and context of a deepfake to figure out if it is a harmful impersonation or instead, if it's valuable satire, art or education.
09:16
So now, what about the law? Law is our educator. It teaches us about what's harmful and what's wrong. And it shapes behavior it deters by punishing perpetrators and securing remedies for victims. Right now, law is not up to the challenge of deepfakes.
09:38
Across the globe, we lack well-tailored laws that would be designed to tackle digital impersonations that invade sexual privacy, that damage reputations and that cause emotional distress. What happened to Rana Ayyub is increasingly commonplace. Yet, when she went to law enforcement in Delhi, she was told nothing could be done. And the sad truth is that the same would be true in the United States and in Europe.
10:07
So we have a legal vacuum that needs to be filled. My colleague Dr. Mary Anne Franks and I are working with US lawmakers to devise legislation that would ban harmful digital impersonations that are tantamount to identity theft. And we've seen similar moves in Iceland, the UK and Australia. But of course, that's just a small piece of the regulatory puzzle.
10:34
Now, I know law is not a cure-all. Right? It's a blunt instrument. And we've got to use it wisely. It also has some practical impediments. You can't leverage law against people you can't identify and find. And if a perpetrator lives outside the country where a victim lives, then you may not be able to insist that the perpetrator come into local courts to face justice. And so we're going to need a coordinated international response.
11:07
Education has to be part of our response as well. Law enforcers are not going to enforce laws they don't know about and proffer problems they don't understand. In my research on cyberstalking, I found that law enforcement lacked the training to understand the laws available to them and the problem of online abuse. And so often they told victims, "Just turn your computer off. Ignore it. It'll go away." And we saw that in Rana Ayyub's case. She was told, "Come on, you're making such a big deal about this. It's boys being boys." And so we need to pair new legislation with efforts at training.
11:54
And education has to be aimed on the media as well. Journalists need educating about the phenomenon of deepfakes so they don't amplify and spread them. And this is the part where we're all involved. Each and every one of us needs educating. We click, we share, we like, and we don't even think about it. We need to do better. We need far better radar for fakery.
12:25
So as we're working through these solutions, there's going to be a lot of suffering to go around. Rana Ayyub is still wrestling with the fallout. She still doesn't feel free to express herself on- and offline. And as she told me, she still feels like there are thousands of eyes on her naked body, even though, intellectually, she knows it wasn't her body. And she has frequent panic attacks, especially when someone she doesn't know tries to take her picture. "What if they're going to make another deepfake?" she thinks to herself.
13:03
And so for the sake of individuals like Rana Ayyub and the sake of our democracy, we need to do something right now. Thank you.

(Applause)