How we can eliminate child sexual abuse material from the internet | Julie Cordua

108,642 views ・ 2019-11-12

TED


00:12
[This talk contains mature content]

00:17
Five years ago, I received a phone call that would change my life. I remember so vividly that day. It was about this time of year, and I was sitting in my office. I remember the sun streaming through the window. And my phone rang. And I picked it up, and it was two federal agents, asking for my help in identifying a little girl featured in hundreds of child sexual abuse images they had found online.

00:53
They had just started working the case, but what they knew was that her abuse had been broadcast to the world for years on dark web sites dedicated to the sexual abuse of children. And her abuser was incredibly technologically sophisticated: new images and new videos every few weeks, but very few clues as to who she was or where she was.
01:25
And so they called us, because they had heard we were a new nonprofit building technology to fight child sexual abuse. But we were only two years old, and we had only worked on child sex trafficking. And I had to tell them we had nothing. We had nothing that could help them stop this abuse.

01:49
It took those agents another year to ultimately find that child. And by the time she was rescued, hundreds of images and videos documenting her rape had gone viral, from the dark web to peer-to-peer networks, private chat rooms and to the websites you and I use every single day.

02:17
And today, as she struggles to recover, she lives with the fact that thousands around the world continue to watch her abuse.

02:29
I have come to learn in the last five years that this case is far from unique.
02:36
How did we get here as a society?

02:41
In the late 1980s, child pornography -- or what it actually is, child sexual abuse material -- was nearly eliminated. New laws and increased prosecutions made it simply too risky to trade it through the mail. And then came the internet, and the market exploded.

03:05
The amount of content in circulation today is massive and growing. This is a truly global problem, but if we just look at the US: in the US alone last year, more than 45 million images and videos of child sexual abuse material were reported to the National Center for Missing and Exploited Children, and that is nearly double the amount the year prior.

03:34
And the details behind these numbers are hard to contemplate, with more than 60 percent of the images featuring children younger than 12, and most of them including extreme acts of sexual violence.

03:50
Abusers are cheered on in chat rooms dedicated to the abuse of children, where they gain rank and notoriety with more abuse and more victims. In this market, the currency has become the content itself.
04:10
It's clear that abusers have been quick to leverage new technologies, but our response as a society has not. These abusers don't read user agreements of websites, and the content doesn't honor geographic boundaries. And they win when we look at one piece of the puzzle at a time, which is exactly how our response today is designed. Law enforcement works in one jurisdiction. Companies look at just their platform. And whatever data they learn along the way is rarely shared.

04:49
It is so clear that this disconnected approach is not working. We have to redesign our response to this epidemic for the digital age. And that's exactly what we're doing at Thorn. We're building the technology to connect these dots, to arm everyone on the front lines -- law enforcement, NGOs and companies -- with the tools they need to ultimately eliminate child sexual abuse material from the internet.

05:21
Let's talk for a minute --

(Applause)

Thank you.

(Applause)
05:29
Let's talk for a minute about what those dots are. As you can imagine, this content is horrific. If you don't have to look at it, you don't want to look at it. And so, most companies or law enforcement agencies that have this content can translate every file into a unique string of numbers. This is called a "hash." It's essentially a fingerprint for each file or each video. And what this allows them to do is use the information in investigations or for a company to remove the content from their platform, without having to relook at every image and every video each time.
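[Editor's note: to make the "fingerprint" idea concrete, here is a minimal sketch in Python, assuming an ordinary cryptographic hash (SHA-256) and a hypothetical set of known hashes. Production systems, such as Microsoft's PhotoDNA, typically use perceptual hashing so that re-encoded copies still match; those details are not covered in the talk, so this is only an illustration.]

```python
import hashlib

def file_fingerprint(path: str) -> str:
    """Return a hex digest ("hash") that identifies the file's exact bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Stream in chunks so large video files never sit fully in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical set of hashes of files already classified as abuse material,
# e.g. loaded from a database shared by companies and law enforcement.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_material(path: str) -> bool:
    """Match a file against the shared list without anyone viewing it again."""
    return file_fingerprint(path) in KNOWN_HASHES
```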
06:10
The problem today, though, is that there are hundreds of millions of these hashes sitting in siloed databases all around the world. In a silo, it might work for the one agency that has control over it, but not connecting this data means we don't know how many are unique. We don't know which ones represent children who have already been rescued or need to be identified still. So our first, most basic premise is that all of this data must be connected.
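[Editor's note: the "connect the data" premise can be sketched as a simple merge of hash records. The silo contents and the two status labels below are hypothetical, chosen only to show why pooling the databases reveals how many files are actually unique and which ones still belong to unidentified children; this is not Thorn's actual data model.]

```python
# Hypothetical, simplified view of three siloed hash databases:
# each maps a file hash to what that one organization knows about it.
agency_a  = {"hash_001": "victim identified", "hash_002": "unidentified"}
agency_b  = {"hash_002": "unidentified",      "hash_003": "unidentified"}
company_c = {"hash_003": "victim identified", "hash_004": "unidentified"}

def connect(*silos: dict[str, str]) -> dict[str, str]:
    """Merge siloed records; a victim identified anywhere counts everywhere."""
    combined: dict[str, str] = {}
    for silo in silos:
        for h, status in silo.items():
            if combined.get(h) == "victim identified":
                continue  # keep the stronger status already recorded
            combined[h] = status
    return combined

shared = connect(agency_a, agency_b, company_c)
print(len(shared), "unique files")  # 4 unique files, not 6 separate records
print([h for h, s in shared.items() if s == "unidentified"])  # still need work
```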
06:42
There are two ways where this data, combined with software on a global scale, can have transformative impact in this space. The first is with law enforcement: helping them identify new victims faster, stopping abuse and stopping those producing this content. The second is with companies: using it as clues to identify the hundreds of millions of files in circulation today, pulling it down and then stopping the upload of new material before it ever goes viral.
07:21
Four years ago, when that case ended, our team sat there, and we just felt this, um ... deep sense of failure, is the way I can put it, because we watched that whole year while they looked for her. And we saw every place in the investigation where, if the technology would have existed, they would have found her faster. And so we walked away from that and we went and we did the only thing we knew how to do: we began to build software.

07:57
So we've started with law enforcement. Our dream was an alarm bell on the desks of officers all around the world so that if anyone dare post a new victim online, someone would start looking for them immediately. I obviously can't talk about the details of that software, but today it's at work in 38 countries, having reduced the time it takes to get to a child by more than 65 percent.

08:24
(Applause)
08:33
And now we're embarking on that second horizon: building the software to help companies identify and remove this content.

08:43
Let's talk for a minute about these companies. So, I told you -- 45 million images and videos in the US alone last year. Those come from just 12 companies.

08:57
Twelve companies, 45 million files of child sexual abuse material. These come from those companies that have the money to build the infrastructure that it takes to pull this content down. But there are hundreds of other companies, small- to medium-size companies around the world, that need to do this work, but they either: 1) can't imagine that their platform would be used for abuse, or 2) don't have the money to spend on something that is not driving revenue. So we went ahead and built it for them, and this system now gets smarter with the more companies that participate.

09:39
Let me give you an example.
09:42
Our first partner, Imgur -- if you haven't heard of this company, it's one of the most visited websites in the US -- millions of pieces of user-generated content uploaded every single day, in a mission to make the internet a more fun place. They partnered with us first. Within 20 minutes of going live on our system, someone tried to upload a known piece of abuse material. They were able to stop it, they pull it down, they report it to the National Center for Missing and Exploited Children. But they went a step further, and they went and inspected the account of the person who had uploaded it. Hundreds more pieces of child sexual abuse material that we had never seen.

10:26
And this is where we start to see exponential impact. We pull that material down, it gets reported to the National Center for Missing and Exploited Children and then those hashes go back into the system and benefit every other company on it. And when the millions of hashes we have lead to millions more and, in real time, companies around the world are identifying and pulling this content down, we will have dramatically increased the speed at which we are removing child sexual abuse material from the internet around the world.

10:58
(Applause)
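[Editor's note: a rough sketch of the loop the Imgur example describes: hash an incoming upload, block and report it if it matches the shared list, and feed hashes found in follow-up investigations back into that list so every participating company benefits. The function names and the in-memory set below are placeholders, not Thorn's actual API.]

```python
# `shared_hashes` stands in for the pooled hash database; the reporting
# function is a placeholder for a real submission to NCMEC's reporting system.
shared_hashes: set[str] = set()

def report_to_ncmec(file_hash: str, uploader: str) -> None:
    print(f"reported {file_hash} uploaded by {uploader}")

def screen_upload(file_hash: str, uploader: str) -> bool:
    """Return True if the upload is allowed, False if it was blocked."""
    if file_hash in shared_hashes:
        report_to_ncmec(file_hash, uploader)
        return False  # stopped before it ever goes viral
    return True

def add_confirmed_material(new_hashes: set[str]) -> None:
    """Hashes surfaced by an investigation flow back into the shared list,
    so every other participating platform can now block them too."""
    shared_hashes.update(new_hashes)
```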
11:06
But this is why it can't just be about software and data, it has to be about scale. We have to activate thousands of officers, hundreds of companies around the world if technology will allow us to outrun the perpetrators and dismantle the communities that are normalizing child sexual abuse around the world today. And the time to do this is now.

11:30
We can no longer say we don't know the impact this is having on our children. The first generation of children whose abuse has gone viral are now young adults. The Canadian Centre for Child Protection just did a recent study of these young adults to understand the unique trauma they try to recover from, knowing that their abuse lives on.

11:57
Eighty percent of these young adults have thought about suicide. More than 60 percent have attempted suicide. And most of them live with the fear every single day that as they walk down the street or they interview for a job or they go to school or they meet someone online, that that person has seen their abuse. And the reality came true for more than 30 percent of them. They had been recognized from their abuse material online.
12:38
This is not going to be easy, but it is not impossible. Now it's going to take the will, the will of our society to look at something that is really hard to look at, to take something out of the darkness so these kids have a voice; the will of companies to take action and make sure that their platforms are not complicit in the abuse of a child; the will of governments to invest with their law enforcement for the tools they need to investigate a digital-first crime, even when the victims cannot speak for themselves.

13:21
This audacious commitment is part of that will. It's a declaration of war against one of humanity's darkest evils. But what I hang on to is that it's actually an investment in a future where every child can simply be a kid.

13:41
Thank you.

13:42
(Applause)