The Real-World Danger of Online Myths | Vidhya Ramalingam | TED

20,785 views · 2024-12-16


00:08
"You are a disgusting liar."

00:12
"Someone, somewhere will hunt you down."

00:17
"I hope someone puts a bullet between your eyes."

00:24
These are messages received by climate scientists.

00:29
According to a recent survey, 39 percent of climate scientists have faced online abuse; 18 percent of those are threats of physical violence.

00:44
"At the end of the day, we're going to see just how much you believe in your global warming and whether you're willing to die for your so-called 'research.'"

00:58
No scientist should have to fear for their lives. But this is just another day in the life of a climate scientist.

01:08
I'm not a climate scientist. I'm not a climate change activist. I'm a counterterrorism expert.

01:15
I started my journey meeting with white supremacists in basements in Sweden, and went on to lead a global policy effort after Europe's first major terrorist attack perpetrated by a white supremacist. I went on to found Moonshot, an organization that works to end violence online.

01:37
I care about climate change denial because it's so often weaponized to serve as a justification for violence.

01:47
It would be easy to think that if only we could get people to understand climate change is real, we could put an end to this. Unfortunately, it's not that simple.

02:00
In 2019, a gunman walked into a Walmart in El Paso, Texas. He killed 23 people, many of immigrant background. He called himself an "ecofascist." He believed in climate change, but he had bought into mis- and disinformation that immigrants were the root cause of it, that sustainability would only be possible with the elimination of people of color.

02:34
Mis- and disinformation are so often weaponized to serve as a justification for violence.

02:43
Although they're often used interchangeably, misinformation is information that's false or misleading. Disinformation is spread intentionally to cause harm.

02:56
It's so powerful because it taps into your grievances, what makes you really angry, and it offers simplistic solutions. There's typically a villain and a hero.

03:09
Over the last two years, my team and I have been researching different kinds of manipulation tactics used all over the world to spread disinformation. Two of the most common were decontextualization and fearmongering.

03:25
Decontextualization is the practice of taking information out of its original context to deliberately mislead people.

03:33
For example, earlier this year, Europe experienced a series of protests by farmers against a range of proposed environmental regulations. There were street blockades and protests, demonstrations, occupations.

03:50
Adding to an already tense moment, several inauthentic images circulated. This one purported to show the Ukrainian embassy in Paris getting pummeled with manure. This was actually footage taken months earlier from an entirely different protest about an entirely different issue in Dijon, not even in Paris.

04:15
And this effort to mislead the public wouldn't be complete without the use of new technology. Here's an image showing the streets of Paris lined with bales of hay. It's a really striking image, isn't it? This never happened. It was entirely generated by AI.

04:36
And this isn't just happening in Europe.

04:40
Last year, after wildfires raged in Hawaii, a disinformation network linked to the Chinese Communist Party spread inauthentic images claiming that the US government had intentionally started the wildfires using a so-called "weather weapon."

05:01
Can you imagine? Over a hundred people died in those wildfires, and the idea that those fires were deliberately set by their own government against their own people? It's terrifying.

05:16
These kinds of conspiratorial narratives can sow widespread fear, which takes us to the next powerful tactic of disinformation: fearmongering, deliberately exaggerating an issue so that you can provoke fear and alarm.

05:33
We know that emotion-driven information processing can overtake evidence-based decision making, which is what makes this form of disinformation so effective.

05:46
It's for these reasons that a recent MIT study found a false story will travel six times quicker to reach 1,500 people than a true story will. And we know Facebook fact-checkers take up to 72 hours on average to identify and remove this content. By that time, most impressions have already been made.

06:11
Now I know we have all seen this online, and when you see it happen, it can be really tempting to respond with the facts. I get it. We pride ourselves on logic and science. The truth matters. So when someone is so obviously spreading false information, just correct them, right?

06:32
Unfortunately, this doesn't always work. Believe me, I spent the last two decades learning how to have conversations with people buying into white supremacy. That is disinformation at its worst.

06:46
Disinformation wins because of the emotions it inspires, because of the way it makes people feel. So if someone is so bought into disinformation, getting into debates on the facts with them can just risk backing them into a corner so that they get really defensive.

07:07
OK, so if we can't debate the facts endlessly, what can we do?

07:12
Last year, Moonshot partnered with Google to test an approach known as “prebunking.” Prebunking is a proven communication technique designed to help people spot and reject efforts to manipulate them in the future. By giving people forewarning and the tools to reject a manipulative message, you lessen the likelihood that they will be misled.

07:38
This is not about telling people what is true or false, or right or wrong. It's about empowering people to protect themselves. We've tapped into the universal human desire not to be manipulated, and this method has been tried and tested for decades, since the 1960s.

07:58
All prebunking messages contain three essential ingredients.

08:03
One: an emotional warning. You alert people that there are others out there who may be trying to mislead or manipulate them. Be aware, you may be targeted.

08:16
Two: stimulus. You show people examples of manipulative messaging so that they will be more likely to be able to identify those in the future.

08:30
And three: refutation. You give people the tools to be able to refute a message in real time. For example, if you see a headline that's really sensational, and it either seems too good to be true or it makes you really angry, always Google around for other sources. Always Google around.

08:55
OK, so we knew the steps to take, but we also knew that if we were going to really get at this problem around the world, a one-size-fits-all approach wouldn't work. We knew we needed to get local. So we partnered with civil society organizations in countries around the world, from Germany to Indonesia to Ukraine.

09:18
And we started first with the evidence. We met with dozens of experts, we surveyed the online space, and we identified the most common manipulation tactics being used in each country. We then partnered with local filmmakers to create educational videos that would teach people about those manipulation tactics being used in their home country.

09:44
In some contexts, we found that people trust close peers and relatives the most. So in Germany, we filmed close friends chatting in a park. In Ukraine, we filmed family dialogues around a kitchen table, a setting that's so familiar to so many of us, where so many of us have had those difficult conversations.

10:07
We wanted to encourage people to have these kinds of conversations within their own trusted circles, whether they're in El Salvador or Indonesia, and to do so before pivotal moments where online manipulation efforts intensify, like elections.

10:26
So as we prepared to head into the EU elections, we knew that distrust in climate science had already emerged as a critical misinformation theme.

10:38
Now one study had found that adults over the age of 45 are less likely to investigate false information when they stumble across it online. We also know that adults over the age of 45 have higher voter turnout, which means that when disinformation wins, it can have a disproportionate impact on the outcomes of elections.

11:02
So as we prepared to head into the EU elections, we created content for every EU country, in 27 languages, aiming to empower Europeans to spot and reject efforts to manipulate them before the elections.

11:20
Over the last year, we have reached millions of people around the globe with these videos. In Germany alone, we reached 42 million people. That's half the German population. And we found that, on average, viewers of these videos were up to 10 percent more likely to be able to identify manipulation efforts than those who hadn't seen them.

11:45
This is a winning formula. The evidence shows us that prebunking is effective at building resistance to disinformation.

11:56
It raises the question: how do we make that resistance last? How do we build long-term societal resilience to disinformation efforts?

12:09
There is an ongoing effort to use disinformation to undermine our democracies. Just last month, the US Justice Department seized 32 internet domains secretly deployed by the Russian government to spread disinformation across the US and Europe. This included deliberate efforts to exploit public anxieties and fears about the energy transition, specifically to encourage violence.

12:40
Now it's not just the Russian government that we need to be worried about. Easy access to generative AI tools means that anyone, not just those with resources, money and power, can create high-quality, effective, powerful disinformation content.

13:00
And the sources of disinformation are varied. They can come from our elected officials all the way through to our neighbors down the road. Many of us don't need to look further than our own families.

13:14
But so many of the tools we tested online are even more powerful when they come directly from the people you trust and love the most in real life, IRL. So instead of endlessly debating the facts, give your loved ones the tools that they need to protect themselves online.

13:35
Information manipulation is unfortunately the new norm. But that doesn't mean we need to accept our loved ones being misled, and we shouldn't accept our climate scientists living in fear.

13:50
So if we can't fact-check our way out of this problem, we need to beat disinformation at its own game: by reaching people before disinformation does, and giving them all the tools that they need to protect themselves online.

14:06
Thank you so much.

14:08
(Applause)