How technology can fight extremism and online harassment | Yasmin Green

75,290 views ・ 2018-06-27

TED


00:13
My relationship with the internet reminds me of the setup to a clichéd horror movie. You know, the blissfully happy family moves in to their perfect new home, excited about their perfect future, and it's sunny outside and the birds are chirping ... And then it gets dark. And there are noises from the attic. And we realize that that perfect new house isn't so perfect.

00:40
When I started working at Google in 2006, Facebook was just a two-year-old, and Twitter hadn't yet been born. And I was in absolute awe of the internet and all of its promise to make us closer and smarter and more free. But as we were doing the inspiring work of building search engines and video-sharing sites and social networks, criminals, dictators and terrorists were figuring out how to use those same platforms against us. And we didn't have the foresight to stop them.

01:16
Over the last few years, geopolitical forces have come online to wreak havoc. And in response, Google supported a few colleagues and me to set up a new group called Jigsaw, with a mandate to make people safer from threats like violent extremism, censorship, persecution -- threats that feel very personal to me because I was born in Iran, and I left in the aftermath of a violent revolution. But I've come to realize that even if we had all of the resources of all of the technology companies in the world, we'd still fail if we overlooked one critical ingredient: the human experiences of the victims and perpetrators of those threats.

02:04
There are many challenges I could talk to you about today. I'm going to focus on just two. The first is terrorism.

02:13
So in order to understand the radicalization process, we met with dozens of former members of violent extremist groups. One was a British schoolgirl, who had been taken off of a plane at London Heathrow as she was trying to make her way to Syria to join ISIS. And she was 13 years old.

02:37
So I sat down with her and her father, and I said, "Why?" And she said, "I was looking at pictures of what life is like in Syria, and I thought I was going to go and live in the Islamic Disney World." That's what she saw in ISIS. She thought she'd meet and marry a jihadi Brad Pitt and go shopping in the mall all day and live happily ever after.

03:02
ISIS understands what drives people, and they carefully craft a message for each audience. Just look at how many languages they translate their marketing material into. They make pamphlets, radio shows and videos in not just English and Arabic, but German, Russian, French, Turkish, Kurdish, Hebrew, Mandarin Chinese. I've even seen an ISIS-produced video in sign language. Just think about that for a second: ISIS took the time and made the effort to ensure their message is reaching the deaf and hard of hearing.

03:45
It's actually not tech-savviness that is the reason why ISIS wins hearts and minds. It's their insight into the prejudices, the vulnerabilities, the desires of the people they're trying to reach that does that. That's why it's not enough for the online platforms to focus on removing recruiting material. If we want to have a shot at building meaningful technology that's going to counter radicalization, we have to start with the human journey at its core.

04:13
So we went to Iraq to speak to young men who'd bought into ISIS's promise of heroism and righteousness, who'd taken up arms to fight for them and then who'd defected after they witnessed the brutality of ISIS's rule. And I'm sitting there in this makeshift prison in the north of Iraq with this 23-year-old who had actually trained as a suicide bomber before defecting. And he says, "I arrived in Syria full of hope, and immediately, I had two of my prized possessions confiscated: my passport and my mobile phone." The symbols of his physical and digital liberty were taken away from him on arrival.

04:57
And then this is the way he described that moment of loss to me. He said, "You know in 'Tom and Jerry,' when Jerry wants to escape, and then Tom locks the door and swallows the key and you see it bulging out of his throat as it travels down?" And of course, I really could see the image that he was describing, and I really did connect with the feeling that he was trying to convey, which was one of doom, when you know there's no way out.

05:26
And I was wondering: What, if anything, could have changed his mind the day that he left home? So I asked, "If you knew everything that you know now about the suffering and the corruption, the brutality -- that day you left home, would you still have gone?" And he said, "Yes." And I thought, "Holy crap, he said 'Yes.'" And then he said, "At that point, I was so brainwashed, I wasn't taking in any contradictory information. I couldn't have been swayed." "Well, what if you knew everything that you know now six months before the day that you left?" "At that point, I think it probably would have changed my mind."

06:10
Radicalization isn't this yes-or-no choice. It's a process, during which people have questions -- about ideology, religion, the living conditions. And they're coming online for answers, which is an opportunity to reach them. And there are videos online from people who have answers -- defectors, for example, telling the story of their journey into and out of violence; stories like the one from that man I met in the Iraqi prison. There are locals who've uploaded cell phone footage of what life is really like in the caliphate under ISIS's rule. There are clerics who are sharing peaceful interpretations of Islam.

06:48
But you know what? These people don't generally have the marketing prowess of ISIS. They risk their lives to speak up and confront terrorist propaganda, and then they tragically don't reach the people who most need to hear from them. And we wanted to see if technology could change that.

07:06
So in 2016, we partnered with Moonshot CVE to pilot a new approach to countering radicalization called the "Redirect Method." It uses the power of online advertising to bridge the gap between those susceptible to ISIS's messaging and those credible voices that are debunking that messaging.

07:28
And it works like this: someone looking for extremist material -- say they search for "How do I join ISIS?" -- will see an ad appear that invites them to watch a YouTube video of a cleric, of a defector -- someone who has an authentic answer. And that targeting is based not on a profile of who they are, but on something that's directly relevant to their query or question.

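The talk stays at the conceptual level, but the mechanism is simple to picture. Below is a minimal TypeScript sketch of query-keyed counter-advertising in the spirit of the Redirect Method -- not Jigsaw's or Moonshot CVE's actual system; the campaign table, ad fields and URL are all hypothetical. The one property the sketch preserves is the one Green emphasizes: the trigger is the search query itself, never a profile of the person.

```typescript
// Hypothetical sketch of Redirect Method-style targeting (not Jigsaw's code).
// An ad campaign keyed on risk-indicator search phrases serves a
// counter-narrative video alongside the regular results.

interface CounterAd {
  headline: string;
  videoUrl: string; // a defector's or cleric's testimony on YouTube
}

// Keyword -> counter-narrative mapping, curated by subject-matter experts.
const campaigns: Map<string, CounterAd> = new Map([
  ["how do i join isis", {
    headline: "Hear from someone who went -- and came back",
    videoUrl: "https://youtube.com/watch?v=DEFECTOR_STORY", // placeholder
  }],
]);

// Targeting is keyed on the query, never on who the searcher is.
function selectAd(searchQuery: string): CounterAd | undefined {
  const normalized = searchQuery.toLowerCase().trim();
  for (const [keywords, ad] of campaigns) {
    if (normalized.includes(keywords)) return ad;
  }
  return undefined; // no risk indicators: show no counter-ad
}

console.log(selectAd("How do I join ISIS?")); // -> the counter-narrative ad
```
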
07:54
During our eight-week pilot in English and Arabic, we reached over 300,000 people who had expressed an interest in or sympathy towards a jihadi group. These people were now watching videos that could prevent them from making devastating choices.

08:13
And because violent extremism isn't confined to any one language, religion or ideology, the Redirect Method is now being deployed globally to protect people being courted online by violent ideologues, whether they're Islamists, white supremacists or other violent extremists, with the goal of giving them the chance to hear from someone on the other side of that journey; to give them the chance to choose a different path.

08:40
It turns out that often the bad guys are good at exploiting the internet, not because they're some kind of technological geniuses, but because they understand what makes people tick.

08:54
I want to give you a second example: online harassment. Online harassers also work to figure out what will resonate with another human being -- not to recruit them, as ISIS does, but to cause them pain.

09:11
Imagine this: you're a woman, you're married, you have a kid. You post something on social media, and in a reply, you're told that you'll be raped, that your son will be watching, with details of when and where. In fact, your home address is put online for everyone to see. That feels like a pretty real threat. Do you think you'd go home? Do you think you'd continue doing the thing that you were doing? Would you continue doing that thing that's irritating your attacker?

09:48
Online abuse has been this perverse art of figuring out what makes people angry, what makes people afraid, what makes people insecure, and then pushing those pressure points until they're silenced. When online harassment goes unchecked, free speech is stifled. And even the people hosting the conversation throw up their arms and call it quits, closing their comment sections and their forums altogether. That means we're actually losing spaces online to meet and exchange ideas. And where online spaces remain, we descend into echo chambers with people who think just like us. But that enables the spread of disinformation; that facilitates polarization.

10:34
What if technology instead could enable empathy at scale? This was the question that motivated our partnership with Google's Counter Abuse team, Wikipedia and newspapers like the New York Times. We wanted to see if we could build machine-learning models that could understand the emotional impact of language. Could we predict which comments were likely to make someone else leave the online conversation?

11:00
And that's no mean feat -- no trivial accomplishment for AI to be able to do something like that. I mean, just consider these two examples of messages that could have been sent to me last week: "Break a leg at TED!" ... and "I'll break your legs at TED." (Laughter) You are human; that's why that's an obvious difference to you, even though the words are pretty much the same. But for AI, it takes some training to teach the models to recognize that difference.

11:32
The beauty of building AI that can tell the difference is that AI can then scale to the size of the online toxicity phenomenon, and that was our goal in building our technology called Perspective.

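Perspective is exposed publicly through the Comment Analyzer API. Here is a minimal TypeScript sketch of scoring the two messages from the example above; the endpoint and field names follow the public API documentation as best understood here, and the API key is a placeholder you would obtain from Google Cloud.

```typescript
// Minimal sketch: scoring the talk's two example messages with the public
// Perspective (Comment Analyzer) API. The API key is a placeholder.

const ENDPOINT =
  "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze";
const API_KEY = "YOUR_API_KEY"; // placeholder

async function toxicity(text: string): Promise<number> {
  const res = await fetch(`${ENDPOINT}?key=${API_KEY}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      comment: { text },
      requestedAttributes: { TOXICITY: {} },
    }),
  });
  const data = await res.json();
  // Probability-like score in [0, 1] that readers would find the text toxic.
  return data.attributeScores.TOXICITY.summaryScore.value;
}

// Nearly identical words, very different intent -- the model should separate them.
for (const msg of ["Break a leg at TED!", "I'll break your legs at TED."]) {
  toxicity(msg).then((score) => console.log(`${score.toFixed(2)}  ${msg}`));
}
```
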
11:45
With the help of Perspective, the New York Times, for example, has increased spaces online for conversation. Before our collaboration, they only had comments enabled on just 10 percent of their articles. With the help of machine learning, they have that number up to 30 percent. So they've tripled it, and we're still just getting started.

12:04
But this is about way more than just making moderators more efficient. Right now I can see you, and I can gauge how what I'm saying is landing with you. You don't have that opportunity online. Imagine if machine learning could give commenters, as they're typing, real-time feedback about how their words might land, just like facial expressions do in a face-to-face conversation.

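As a sketch of how that real-time feedback might look in practice -- hypothetical, reusing the toxicity() helper above, with made-up element IDs and threshold -- a comment box could score the draft after each pause in typing and nudge the writer before they post:

```typescript
// Hypothetical sketch of as-you-type feedback, reusing toxicity() from above.
// Element IDs and the 0.8 threshold are made up for illustration.

const box = document.getElementById("comment-box") as HTMLTextAreaElement;
const hint = document.getElementById("comment-hint") as HTMLElement;

let timer: ReturnType<typeof setTimeout> | undefined;
box.addEventListener("input", () => {
  clearTimeout(timer); // debounce: only score once the user pauses typing
  timer = setTimeout(async () => {
    const score = await toxicity(box.value);
    hint.textContent =
      score > 0.8
        ? "Your comment may come across as hostile -- consider rephrasing."
        : "";
  }, 500);
});
```
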
12:32
Machine learning isn't perfect, and it still makes plenty of mistakes. But if we can build technology that understands the emotional impact of language, we can build empathy. That means that we can have dialogue between people with different politics, different worldviews, different values. And we can reinvigorate the spaces online that most of us have given up on.

12:57
When people use technology to exploit and harm others, they're preying on our human fears and vulnerabilities. If we ever thought that we could build an internet insulated from the dark side of humanity, we were wrong. If we want today to build technology that can overcome the challenges that we face, we have to throw our entire selves into understanding the issues and into building solutions that are as human as the problems they aim to solve. Let's make that happen. Thank you.

13:33
(Applause)