The price of a "clean" internet | Hans Block and Moritz Riesewieck

66,518 views ・ 2019-11-21

TED



00:12
[This talk contains mature content]

00:16
Moritz Riesewieck: On March 23, 2013, users worldwide discovered in their news feeds a video of a young girl being raped by an older man. Before this video was removed from Facebook, it had already been shared 16,000 times, and it was even liked 4,000 times. This video went viral and infected the net.

00:49
Hans Block: And that was the moment we asked ourselves: How could something like this get on Facebook? And at the same time, why don't we see such content more often? After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google?
01:08
MR: While image-recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos, it has immense difficulty distinguishing pornographic content from holiday pictures, Adonis statues or breast-cancer screening campaigns. It can't distinguish Romeo and Juliet dying onstage from a real knife attack. It can't distinguish satire from propaganda, or irony from hatred, and so on and so forth.

01:50
Therefore, humans are needed to decide which of the suspicious content should be deleted and which should remain -- humans whom we know almost nothing about, because they work in secret. They sign nondisclosure agreements, which prohibit them from talking about or sharing what they see on their screens and what this work does to them. They are forced to use code words to hide who they work for. They are monitored by private security firms to ensure that they don't talk to journalists. And they are threatened with fines if they speak. All of this sounds like a weird crime story, but it's true. These people exist, and they are called content moderators.
02:42
HB: We are the directors of the feature documentary film "The Cleaners," and we would like to take you to a world that many of you may not know yet. Here's a short clip of our film.

02:58
(Music)
03:04
(Video) Moderator: I need to be anonymous, because we have a contract signed. We are not allowed to declare whom we are working with. The reason why I speak to you is because the world should know that we are here. There is somebody who is checking the social media. We are doing our best to make this platform safe for all of them.

03:42
Delete. Ignore. Delete. Ignore. Delete. Ignore. Ignore. Delete.
03:58
HB: The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves, but from outsourcing firms around the world, in order to keep the wages low. Tens of thousands of young people looking at everything we are not supposed to see. And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse. Thousands of images in one shift -- ignore, delete, day and night.

04:27
And much of this work is done in Manila, where the analog toxic waste of the Western world was shipped for years in containers; now the digital waste is dumped there via fiber-optic cable. And just as the so-called scavengers rummage through gigantic tips on the edge of the city, the content moderators click their way through an endless toxic ocean of images, videos and all manner of intellectual garbage, so that we don't have to look at it.
04:58
MR: But unlike the wounds of the scavengers, those of the content moderators remain invisible. Full of shocking and disturbing content, these pictures and videos burrow into their memories, where, at any time, they can have unpredictable effects: eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide. The pictures and videos infect them and often never let them go again. If they are unlucky, they develop post-traumatic stress disorder, like soldiers after war missions.

05:39
In our film, we tell the story of a young man who had to monitor livestreams of self-mutilations and suicide attempts, again and again, and who eventually committed suicide himself. It's not an isolated case, as we've been told. This is the price all of us pay for our so-called clean and safe and "healthy" environments on social media.
06:10
Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds. What is posted on social media spreads so quickly, goes viral and excites the minds of people all around the globe. By the time it is deleted, it is often already too late. Millions of people have already been infected with hatred and anger, and they either become active online, by spreading or amplifying hatred, or they take to the streets and take up arms.
06:45
HB: Therefore, an army of content moderators sits in front of a screen to avoid new collateral damage. And they are deciding, as quickly as possible, whether the content stays on the platform -- ignore -- or disappears -- delete. But not every decision is as clear as the decision about a child-abuse video. What about controversial content, ambivalent content, uploaded by civil rights activists or citizen journalists? The content moderators often decide on such cases at the same speed as the clear cases.
07:21
MR: We will show you a video now, and we would like to ask you to decide: Would you delete it, or would you not delete it?

07:31
(Video) (Air strike sounds)

07:33
(Explosion)

07:40
(People speaking in Arabic)

07:46
MR: Yeah, we did some blurring for you. A child would potentially be dangerously disturbed and extremely frightened by such content. So, would you rather delete it?
07:59
But what if this video could help investigate war crimes in Syria? What if nobody had heard about this air strike, because Facebook, YouTube and Twitter had decided to take it down?

08:12
Airwars, a nongovernmental organization based in London, tries to find those videos as quickly as possible whenever they are uploaded to social media, in order to archive them, because they know that, sooner or later, Facebook, YouTube and Twitter will take such content down. People armed with their mobile phones can make visible what journalists often do not have access to. Civil rights groups often have no better option for quickly making their recordings accessible to a large audience than uploading them to social media.
08:47
Wasn't this the empowering potential the World Wide Web was supposed to have? Weren't these the dreams people had about the World Wide Web in its early days? Can't pictures and videos like these persuade people who have become insensitive to facts to rethink?
09:09
HB: But instead, everything that might be disturbing is deleted. And there's a general shift in society. The media, for example, more and more often place trigger warnings at the top of articles whose content some people may perceive as offensive or troubling. And more and more students at universities in the United States demand that classical works depicting sexual violence or assault be banished from the curriculum. But how far should we go with that?

09:37
Physical integrity is guaranteed as a human right in constitutions worldwide. In the Charter of Fundamental Rights of the European Union, this right expressly extends to mental integrity. But even if the potentially traumatic effect of images and videos is hard to predict, do we want to become so cautious that we risk losing social awareness of injustice?

10:03
So what to do?
10:04
Mark Zuckerberg recently stated that in the future, the users -- we, or almost everybody -- will decide individually what they would like to see on the platform, through personal filter settings. So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like ...

10:25
MR: I'm the type of guy who doesn't mind seeing breasts, and I'm very interested in global warming, but I don't like war so much.

10:37
HB: Yeah, I'm more the opposite: I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes.

10:46
MR: Come on, if we don't share a similar social consciousness, how shall we discuss social problems? How shall we call people to action? Even more isolated bubbles would emerge.
10:59
One of the central questions is how, in the future, freedom of expression will be weighed against people's need for protection. It's a matter of principle. Do we want to design either an open or a closed society for the digital space? At the heart of the matter is "freedom versus security."
11:24
Facebook has always wanted to be a "healthy" platform. Above all, users should feel safe and secure. It's the same choice of words the content moderators in the Philippines used in a lot of our interviews.

11:40
(Video) The world that we are living in right now, I believe, is not really healthy. In this world, there is really an evil who exists. We need to watch for it. We need to control it -- good or bad.

(Music)

12:10
[Look up, Young man! --God]
12:14
MR: For the young content moderators in the strictly Catholic Philippines, this is linked to a Christian mission: to counter the sins of the world, which spread across the web. "Cleanliness is next to godliness" is a saying everybody in the Philippines knows.

12:36
HB: And others motivate themselves by comparing themselves with their president, Rodrigo Duterte. He has been ruling the Philippines since 2016, and he won the election with the promise "I will clean up." What that means is eliminating all kinds of problems by literally killing people on the streets who are supposedly criminals, whatever that means. And since he was elected, an estimated 20,000 people have been killed.

13:03
And one moderator in our film says, "What Duterte does on the streets, I do for the internet." And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world. They clean up, they polish everything clean, they free us from everything evil.
13:22
Tasks formerly reserved for state authorities have been taken over by college graduates in their early 20s, equipped with three to five days of training -- that is the qualification -- who work on nothing less than rescuing the world.

13:38
MR: National sovereignty has been outsourced to private companies, and they pass on their responsibilities to third parties. What is taking place is an outsourcing of the outsourcing of the outsourcing. With social networks, we are dealing with a completely new infrastructure, with its own mechanisms, its own logic of action and, therefore, also its own new dangers, which did not yet exist in the pre-digital public sphere.
14:08
HB: When Mark Zuckerberg was at the US Congress or at the European Parliament, he was confronted with all kinds of criticism. And his reaction was always the same: "We will fix that, and I will follow up on that with my team." But such a debate shouldn't be held in the back rooms of Facebook, Twitter or Google -- such a debate should be held openly in new, cosmopolitan parliaments, in new institutions that reflect the diversity of the people contributing to the utopian project of a global network. And while it may seem impossible to take into account the values of users worldwide, it's worth believing that there's more that connects us than separates us.
14:53
MR: Yeah, at a time when populism is gaining strength, it becomes popular to justify the symptoms, to eradicate them, to make them invisible. This ideology is spreading worldwide, analog as well as digital, and it's our duty to stop it before it's too late. The question of freedom and democracy must not have only these two options.

15:25
HB: Delete.

15:26
MR: Or ignore.

15:29
HB: Thank you very much.

15:30
(Applause)