How you can help transform the internet into a place of trust | Claire Wardle

53,671 views ・ 2019-11-15

TED



00:13 No matter who you are or where you live, I'm guessing that you have at least one relative who likes to forward those emails. You know the ones I'm talking about -- the ones with dubious claims or conspiracy videos. And you've probably already muted them on Facebook for sharing social posts like this one. It's an image of a banana with a strange red cross running through the center. And the text around it is warning people not to eat fruits that look like this, suggesting they've been injected with blood contaminated with HIV. And the social share message above it simply says, "Please forward to save lives."

00:49 Now, fact-checkers have been debunking this one for years, but it's one of those rumors that just won't die. A zombie rumor. And, of course, it's entirely false.

01:00 It might be tempting to laugh at an example like this, to say, "Well, who would believe this, anyway?" But the reason it's a zombie rumor is because it taps into people's deepest fears about their own safety and that of the people they love. And if you spend as much time as I have looking at misinformation, you know that this is just one example of many that taps into people's deepest fears and vulnerabilities.

01:23 Every day, across the world, we see scores of new memes on Instagram encouraging parents not to vaccinate their children. We see new videos on YouTube explaining that climate change is a hoax. And across all platforms, we see endless posts designed to demonize others on the basis of their race, religion or sexuality.

01:44 Welcome to one of the central challenges of our time. How can we maintain an internet with freedom of expression at the core, while also ensuring that the content being disseminated doesn't cause irreparable harm to our democracies, our communities and our physical and mental well-being?

02:01 Because we live in the information age, yet the central currency upon which we all depend -- information -- is no longer deemed entirely trustworthy and, at times, can appear downright dangerous. This is thanks in part to the runaway growth of social sharing platforms that allow us to scroll through feeds where lies and facts sit side by side, but with none of the traditional signals of trustworthiness.

02:24 And goodness -- our language around this is horribly muddled. People are still obsessed with the phrase "fake news," despite the fact that it's extraordinarily unhelpful and used to describe a number of things that are actually very different: lies, rumors, hoaxes, conspiracies, propaganda. And I really wish we could stop using a phrase that's been co-opted by politicians right around the world, from the left and the right, used as a weapon to attack a free and independent press.

(Applause)

02:57 Because we need our professional news media now more than ever. And besides, most of this content doesn't even masquerade as news. It's memes, videos, social posts. And most of it is not fake; it's misleading. We tend to fixate on what's true or false. But the biggest concern is actually the weaponization of context. Because the most effective disinformation has always been that which has a kernel of truth to it.

03:23 Let's take this example from London, from March 2017, a tweet that circulated widely in the aftermath of a terrorist incident on Westminster Bridge. This is a genuine image, not fake. The woman who appears in the photograph was interviewed afterwards, and she explained that she was utterly traumatized. She was on the phone to a loved one, and she wasn't looking at the victim out of respect. But it was still circulated widely with this Islamophobic framing, with multiple hashtags, including #BanIslam.

03:52 Now, if you worked at Twitter, what would you do? Would you take that down, or would you leave it up?

03:58 My gut reaction, my emotional reaction, is to take this down. I hate the framing of this image. But freedom of expression is a human right, and if we start taking down speech that makes us feel uncomfortable, we're in trouble. And this might look like a clear-cut case, but, actually, most speech isn't. These lines are incredibly difficult to draw. What's a well-meaning decision to one person is outright censorship to the next.

04:22 What we now know is that this account, Texas Lone Star, was part of a wider Russian disinformation campaign, one that has since been taken down. Would that change your view? It would mine, because now it's a case of a coordinated campaign to sow discord. And for those of you who'd like to think that artificial intelligence will solve all of our problems, I think we can agree that we're a long way away from AI that's able to make sense of posts like this.

04:48 So I'd like to explain three interlocking issues that make this so complex and then think about some ways we can consider these challenges.

04:57 First, we just don't have a rational relationship to information; we have an emotional one. It's just not true that more facts will make everything OK, because the algorithms that determine what content we see, well, they're designed to reward our emotional responses. And when we're fearful, oversimplified narratives, conspiratorial explanations and language that demonizes others are far more effective. And besides, many of these companies' business models are attached to attention, which means these algorithms will always be skewed towards emotion.
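To make that mechanism concrete, here is a minimal, hypothetical sketch of an engagement-driven ranker (not any platform's actual code; the Post fields, weights and example numbers are all invented for illustration):

```python
# Hypothetical sketch: a feed ranker that scores posts purely by predicted
# engagement never asks whether a post is true -- only whether people will
# react to it -- so fear-inducing content tends to rise to the top.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float  # assumed output of an engagement model
    predicted_shares: float  # assumed output of an engagement model

def engagement_score(post: Post) -> float:
    # Weights are illustrative; truthfulness appears nowhere in the formula.
    return 0.4 * post.predicted_clicks + 0.6 * post.predicted_shares

feed = [
    Post("Calm, factual explainer", predicted_clicks=0.2, predicted_shares=0.1),
    Post("Fear-inducing zombie rumor", predicted_clicks=0.8, predicted_shares=0.9),
]
feed.sort(key=engagement_score, reverse=True)
print([p.text for p in feed])  # the rumor ranks first
```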
05:30 Second, most of the speech I'm talking about here is legal. It would be a different matter if I was talking about child sexual abuse imagery or content that incites violence. It can be perfectly legal to post an outright lie. But people keep talking about taking down "problematic" or "harmful" content with no clear definition of what they mean by that, including Mark Zuckerberg, who recently called for global regulation to moderate speech. And my concern is that we're seeing governments right around the world rolling out hasty policy decisions that might actually trigger much more serious consequences when it comes to our speech.

06:08 And even if we could decide which speech to leave up or take down, we've never had so much speech. Every second, millions of pieces of content are uploaded by people right around the world in different languages, drawing on thousands of different cultural contexts. We've simply never had effective mechanisms to moderate speech at this scale, whether powered by humans or by technology.

06:30 And third, these companies -- Google, Twitter, Facebook, WhatsApp -- they're part of a wider information ecosystem. We like to lay all the blame at their feet, but the truth is, the mass media and elected officials can also play an equal role in amplifying rumors and conspiracies when they want to. As can we, when we mindlessly forward divisive or misleading content without checking it. We're adding to the pollution.

06:57 I know we're all looking for an easy fix. But there just isn't one. Any solution will have to be rolled out at a massive scale, internet scale, and yes, the platforms, they're used to operating at that level. But can and should we allow them to fix these problems?

07:13 They're certainly trying. But most of us would agree that, actually, we don't want global corporations to be the guardians of truth and fairness online. And I also think the platforms would agree with that. And at the moment, they're marking their own homework. They like to tell us that the interventions they're rolling out are working, but because they write their own transparency reports, there's no way for us to independently verify what's actually happening.

(Applause)

07:41 And let's also be clear that most of the changes we see only happen after journalists undertake an investigation and find evidence of bias or content that breaks their community guidelines. So yes, these companies have to play a really important role in this process, but they can't control it.

07:59 So what about governments? Many people believe that global regulation is our last hope in terms of cleaning up our information ecosystem. But what I see are lawmakers who are struggling to keep up to date with the rapid changes in technology. And worse, they're working in the dark, because they don't have access to data to understand what's happening on these platforms. And anyway, which governments would we trust to do this? We need a global response, not a national one.

08:27 So the missing link is us. It's those people who use these technologies every day. Can we design a new infrastructure to support quality information? Well, I believe we can, and I've got a few ideas about what we might be able to actually do.

08:43 So firstly, if we're serious about bringing the public into this, can we take some inspiration from Wikipedia? They've shown us what's possible. Yes, it's not perfect, but they've demonstrated that with the right structures, with a global outlook and lots and lots of transparency, you can build something that will earn the trust of most people. Because we have to find a way to tap into the collective wisdom and experience of all users. This is particularly the case for women, people of color and underrepresented groups. Because guess what? They are experts when it comes to hate and disinformation, because they have been the targets of these campaigns for so long. And over the years, they've been raising flags, and they haven't been listened to. This has got to change.

09:22 So could we build a Wikipedia for trust? Could we find a way that users can actually provide insights? They could offer insights around difficult content-moderation decisions. They could provide feedback when platforms decide they want to roll out new changes.

09:40 Second, people's experiences with information are personalized. My Facebook news feed is very different to yours. Your YouTube recommendations are very different to mine. That makes it impossible for us to actually examine what information people are seeing. So could we imagine developing some kind of centralized open repository for anonymized data, with privacy and ethical concerns built in? Because imagine what we would learn if we built out a global network of concerned citizens who wanted to donate their social data to science.
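As one illustration of what "privacy built in" might mean for such a repository, here is a hypothetical sketch that pseudonymizes a donated record before storage (the field names and salting scheme are invented; a real pipeline would need far stronger guarantees than this):

```python
# Hypothetical sketch: replace the donor's identifier with a salted hash
# so records in the shared repository are not directly linkable to a person.
import hashlib

SALT = "per-study-secret"  # assumed secret held by the research project

def anonymize(record: dict) -> dict:
    pseudonym = hashlib.sha256((SALT + record["user_id"]).encode()).hexdigest()
    return {
        "user": pseudonym,  # not linkable without the salt
        "post_text": record["post_text"],
        "timestamp": record["timestamp"],
    }

donated = {
    "user_id": "alice@example.com",
    "post_text": "forwarded a vaccine meme",
    "timestamp": "2019-11-15T10:00:00Z",
}
print(anonymize(donated))
```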
10:13 Because we actually know very little about the long-term consequences of hate and disinformation on people's attitudes and behaviors. And of what we do know, most of the research has been carried out in the US, despite the fact that this is a global problem. We need to work on that, too.

10:28 And third, can we find a way to connect the dots? No one sector, let alone nonprofit, start-up or government, is going to solve this. But there are very smart people right around the world working on these challenges, from newsrooms, civil society, academia, activist groups. And you can see some of them here. Some are building out indicators of content credibility. Others are fact-checking, so that false claims, videos and images can be down-ranked by the platforms. A nonprofit I helped to found, First Draft, is working with normally competitive newsrooms around the world to help them build out investigative, collaborative programs. And Danny Hillis, a software architect, is designing a new system called The Underlay, which will be a record of all public statements of fact connected to their sources, so that people and algorithms can better judge what is credible.
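The Underlay's actual design isn't described here, but the core idea -- statements of fact carrying machine-readable provenance -- can be sketched with a hypothetical data structure (all names and the example claim are illustrative, not the project's real schema):

```python
# Hypothetical sketch: store each public statement of fact together with
# its sources, so people and algorithms can trace where a claim came from.
from dataclasses import dataclass, field

@dataclass
class Source:
    url: str
    publisher: str

@dataclass
class Statement:
    claim: str
    sources: list[Source] = field(default_factory=list)

    def is_sourced(self) -> bool:
        # A crude credibility signal: does the claim cite any source at all?
        return len(self.sources) > 0

s = Statement(
    claim="Bananas cannot transmit HIV.",
    sources=[Source("https://www.who.int/", "World Health Organization")],
)
print(s.is_sourced())  # True: the claim carries traceable provenance
```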
11:16 And educators around the world are testing different techniques for finding ways to make people critical of the content they consume.

11:24 All of these efforts are wonderful, but they're working in silos, and many of them are woefully underfunded. There are also hundreds of very smart people working inside these companies, but again, these efforts can feel disjointed, because they're actually developing different solutions to the same problems. How can we find a way to bring people together in one physical location for days or weeks at a time, so they can actually tackle these problems together but from their different perspectives?

11:51 So can we do this? Can we build out a coordinated, ambitious response, one that matches the scale and the complexity of the problem? I really think we can. Together, let's rebuild our information commons.

12:04 Thank you.

(Applause)