Dear Facebook, this is how you're breaking democracy | Yael Eisenstat

TED ・ 2020-09-24

00:13
Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home. As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led me to being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us and to talk about how to reclaim our public square.
01:19
I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest. The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.
02:14
So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem entrenching so many of us so far into absolutism that compromise has become a dirty word. Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible. And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own. So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.
03:33
When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often seem far harder to break out of their ideological mindsets than those vulnerable communities I worked with ever were.
04:15
So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try. I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in elections interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.
04:59
Now I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging. The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.
06:04
And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users. But keeping us engaged is how they make their money.
06:42
The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of influencing their behavior.
07:15
Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.
07:38
But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and profit model.
08:24
So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.
08:53
And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth. But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law.
09:45
Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies. Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking?
10:14
I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now. And it bears emphasizing, I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society.
10:42
It is time for our governments to step up and do their jobs of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening. You see, I want these companies held accountable not for whether an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.
11:35
I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability. My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.
12:15
And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed, to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we're heading into our election and, even more concerning, face our biggest potential crisis yet, if the results aren't trusted and if violence breaks out. So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees are shouting from the rooftops that your policies and your business practices are harming people and democracy. You own your decisions, but you can no longer say that you couldn't have seen it coming.

13:26
Thank you.