How we can eliminate child sexual abuse material from the internet | Julie Cordua

110,537 views

2019-11-12 ・ TED



00:12
[This talk contains mature content]

00:17
Five years ago, I received a phone call that would change my life.

00:23
I remember so vividly that day. It was about this time of year, and I was sitting in my office. I remember the sun streaming through the window. And my phone rang. And I picked it up, and it was two federal agents, asking for my help in identifying a little girl featured in hundreds of child sexual abuse images they had found online.

00:53
They had just started working the case, but what they knew was that her abuse had been broadcast to the world for years on dark web sites dedicated to the sexual abuse of children. And her abuser was incredibly technologically sophisticated: new images and new videos every few weeks, but very few clues as to who she was or where she was.

01:25
And so they called us, because they had heard we were a new nonprofit building technology to fight child sexual abuse. But we were only two years old, and we had only worked on child sex trafficking. And I had to tell them we had nothing. We had nothing that could help them stop this abuse.

01:49
It took those agents another year to ultimately find that child. And by the time she was rescued, hundreds of images and videos documenting her rape had gone viral, from the dark web to peer-to-peer networks, private chat rooms and to the websites you and I use every single day.

02:17
And today, as she struggles to recover, she lives with the fact that thousands around the world continue to watch her abuse.

02:29
I have come to learn in the last five years that this case is far from unique.

02:36
How did we get here as a society?

02:41
In the late 1980s, child pornography -- or what it actually is, child sexual abuse material -- was nearly eliminated. New laws and increased prosecutions made it simply too risky to trade it through the mail.

03:00
And then came the internet, and the market exploded. The amount of content in circulation today is massive and growing.

03:12
This is a truly global problem, but if we just look at the US: in the US alone last year, more than 45 million images and videos of child sexual abuse material were reported to the National Center for Missing and Exploited Children, and that is nearly double the amount from the year prior.

03:34
And the details behind these numbers are hard to contemplate, with more than 60 percent of the images featuring children younger than 12, and most of them including extreme acts of sexual violence.

03:50
Abusers are cheered on in chat rooms dedicated to the abuse of children, where they gain rank and notoriety with more abuse and more victims. In this market, the currency has become the content itself.

04:10
It's clear that abusers have been quick to leverage new technologies, but our response as a society has not. These abusers don't read user agreements of websites, and the content doesn't honor geographic boundaries. And they win when we look at one piece of the puzzle at a time, which is exactly how our response today is designed. Law enforcement works in one jurisdiction. Companies look at just their platform. And whatever data they learn along the way is rarely shared. It is so clear that this disconnected approach is not working.

04:55
We have to redesign our response to this epidemic for the digital age. And that's exactly what we're doing at Thorn. We're building the technology to connect these dots, to arm everyone on the front lines -- law enforcement, NGOs and companies -- with the tools they need to ultimately eliminate child sexual abuse material from the internet.

05:21
Let's talk for a minute --

(Applause)

Thank you.

(Applause)

05:29
Let's talk for a minute about what those dots are.

05:33
As you can imagine, this content is horrific. If you don't have to look at it, you don't want to look at it. And so, most companies or law enforcement agencies that have this content can translate every file into a unique string of numbers. This is called a "hash." It's essentially a fingerprint for each file or each video. And what this allows them to do is use the information in investigations, or for a company to remove the content from their platform, without having to re-examine every image and every video each time.
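
To make the fingerprint idea concrete, here is a minimal sketch in Python, assuming a simple exact-match digest. The talk does not name an algorithm, and production systems typically use perceptual hashes (such as PhotoDNA) so that resized or re-encoded copies still match; SHA-256 merely illustrates the mechanics:

```python
# A hedged sketch of hash-based matching. SHA-256 fingerprints the
# file's exact bytes; real systems use perceptual hashes so that
# edited copies still match. `known_hashes` is a hypothetical set
# loaded from a database of previously verified material.
import hashlib

def fingerprint(path: str) -> str:
    """Translate a file into a unique string of numbers: its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_material(path: str, known_hashes: set[str]) -> bool:
    """Match a file against verified fingerprints, so it can be flagged
    or removed without anyone having to view the content again."""
    return fingerprint(path) in known_hashes
```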

06:10
The problem today, though, is that there are hundreds of millions of these hashes sitting in siloed databases all around the world. In a silo, it might work for the one agency that has control over it, but not connecting this data means we don't know how many are unique. We don't know which ones represent children who have already been rescued or who still need to be identified. So our first, most basic premise is that all of this data must be connected.
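
A toy illustration of why connecting the silos matters, with invented hash values and organization names; only the union of the sets reveals how many fingerprints are actually unique:

```python
# Hypothetical silos: each organization holds its own hash database.
agency_a = {"h1", "h2", "h3"}
agency_b = {"h2", "h3", "h4"}
ngo_c = {"h3", "h5"}

# Counted separately, the silos hold 8 hashes; connected, only 5 are
# unique -- and only the connected view can say which fingerprints
# belong to children who still need to be identified.
connected = agency_a | agency_b | ngo_c
print(len(agency_a) + len(agency_b) + len(ngo_c))  # 8
print(len(connected))                              # 5
```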

06:42
There are two ways in which this data, combined with software on a global scale, can have transformative impact in this space. The first is with law enforcement: helping them identify new victims faster, stopping abuse and stopping those producing this content. The second is with companies: using it as clues to identify the hundreds of millions of files in circulation today, pulling them down and then stopping the upload of new material before it ever goes viral.

07:21
Four years ago, when that case ended, our team sat there, and we just felt this, um ... deep sense of failure is the way I can put it, because we watched that whole year while they looked for her. And we saw every place in the investigation where, if the technology had existed, they would have found her faster.

07:49
And so we walked away from that, and we went and did the only thing we knew how to do: we began to build software.

07:57
So we started with law enforcement. Our dream was an alarm bell on the desks of officers all around the world, so that if anyone dared to post a new victim online, someone would start looking for them immediately.

08:13
I obviously can't talk about the details of that software, but today it's at work in 38 countries, having reduced the time it takes to get to a child by more than 65 percent.

08:24
(Applause)

08:33
And now we're embarking on that second horizon: building the software to help companies identify and remove this content.

08:43
Let's talk for a minute about these companies. So, I told you -- 45 million images and videos in the US alone last year. Those come from just 12 companies. Twelve companies, 45 million files of child sexual abuse material. These come from those companies that have the money to build the infrastructure that it takes to pull this content down.

09:11
But there are hundreds of other companies, small- to medium-size companies around the world, that need to do this work, but they either can't imagine that their platform would be used for abuse, or don't have the money to spend on something that is not driving revenue.

09:30
So we went ahead and built it for them, and this system now gets smarter as more companies participate.

09:39
Let me give you an example. Our first partner, Imgur -- if you haven't heard of this company, it's one of the most visited websites in the US -- millions of pieces of user-generated content uploaded every single day, on a mission to make the internet a more fun place. They partnered with us first.

09:59
Within 20 minutes of going live on our system, someone tried to upload a known piece of abuse material. They were able to stop it, pull it down and report it to the National Center for Missing and Exploited Children. But they went a step further and inspected the account of the person who had uploaded it: hundreds more pieces of child sexual abuse material that we had never seen.

10:26
And this is where we start to see exponential impact. We pull that material down, it gets reported to the National Center for Missing and Exploited Children, and then those hashes go back into the system and benefit every other company on it.
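
A minimal sketch of that feedback loop, using invented names (SharedHashDB is a stand-in, not Thorn's actual API): once any platform confirms new material, its hash joins the shared set, and every participating company blocks it at upload from then on:

```python
# Hedged sketch of the shared feedback loop; all names are hypothetical.
class SharedHashDB:
    def __init__(self) -> None:
        self.known: set[str] = set()  # fingerprints of verified material

    def check_upload(self, file_hash: str) -> bool:
        """True if an upload should be blocked as known material."""
        return file_hash in self.known

    def report_new(self, file_hash: str) -> None:
        """A newly confirmed file's hash goes back into the system,
        benefiting every other company on it."""
        self.known.add(file_hash)

db = SharedHashDB()
db.report_new("abc123")           # one company confirms new material
print(db.check_upload("abc123"))  # True: every other company now blocks it
```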

10:40
And when the millions of hashes we have lead to millions more and, in real time, companies around the world are identifying and pulling this content down, we will have dramatically increased the speed at which we are removing child sexual abuse material from the internet around the world.

10:58
(Applause)

11:06
But this is why it can't just be about software and data; it has to be about scale. We have to activate thousands of officers and hundreds of companies around the world if technology is going to allow us to outrun the perpetrators and dismantle the communities that are normalizing child sexual abuse around the world today. And the time to do this is now.

11:30
We can no longer say we don't know the impact this is having on our children. The first generation of children whose abuse has gone viral are now young adults. The Canadian Centre for Child Protection recently studied these young adults to understand the unique trauma they try to recover from, knowing that their abuse lives on.

11:57
Eighty percent of these young adults have thought about suicide. More than 60 percent have attempted suicide. And most of them live with the fear, every single day, that as they walk down the street, or interview for a job, or go to school, or meet someone online, that person has seen their abuse. And that reality came true for more than 30 percent of them: they had been recognized from their abuse material online.

12:38
This is not going to be easy, but it is not impossible. Now it's going to take the will -- the will of our society to look at something that is really hard to look at, to take something out of the darkness so these kids have a voice; the will of companies to take action and make sure that their platforms are not complicit in the abuse of a child; the will of governments to invest in their law enforcement for the tools they need to investigate a digital-first crime, even when the victims cannot speak for themselves.

13:21
This audacious commitment is part of that will. It's a declaration of war against one of humanity's darkest evils. But what I hang on to is that it's actually an investment in a future where every child can simply be a kid.

13:41
Thank you.

(Applause)