Dear Facebook, this is how you're breaking democracy | Yael Eisenstat

115,091 views · 2020-09-24

TED


ืื ื ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ืœืžื˜ื” ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ.

ืชืจื’ื•ื: Roey Perlstein ืขืจื™ื›ื”: zeeva livshitz
ืœืคื ื™ ื›ื—ืžืฉ ืฉื ื™ื
ื”ื‘ื ืชื™ ืคืชืื•ื ืฉืื ื™ ืžืื‘ื“ืช ืืช ื”ื™ื›ื•ืœืช
ืœืชืงืฉืจ ืขื ืื ืฉื™ื ืฉืœื ื—ื•ืฉื‘ื™ื ื›ืžื•ื ื™.
ื”ืžื—ืฉื‘ื” ืขืœ ื ื™ื”ื•ืœ ืฉื™ื—ื•ืช ืขืœ ื ื•ืฉืื™ื ื‘ื•ืขืจื™ื ืขื ื—ื‘ืจื™ ื”ืืžืจื™ืงืื™ื
00:13
Around five years ago,
00:15
it struck me that I was losing the ability
00:17
to engage with people who aren't like-minded.
00:20
The idea of discussing hot-button issues with my fellow Americans
00:24
was starting to give me more heartburn
00:27
than the times that I engaged with suspected extremists overseas.
00:31
It was starting to leave me feeling more embittered and frustrated.
00:35
And so just like that,
00:36
I shifted my entire focus
00:38
from global national security threats
00:41
to trying to understand what was causing this push
00:44
towards extreme polarization at home.
00:47
As a former CIA officer and diplomat
00:50
who spent years working on counterextremism issues,
00:53
I started to fear that this was becoming a far greater threat to our democracy
00:57
than any foreign adversary.
00:59
And so I started digging in,
01:01
and I started speaking out,
01:03
which eventually led me to being hired at Facebook
01:06
and ultimately brought me here today
01:08
to continue warning you about how these platforms
01:11
are manipulating and radicalizing so many of us
01:15
and to talk about how to reclaim our public square.
01:19
I was a foreign service officer in Kenya
01:21
just a few years after the September 11 attacks,
01:24
and I led what some call "hearts and minds" campaigns
01:27
along the Somalia border.
01:29
A big part of my job was to build trust with communities
01:32
deemed the most susceptible to extremist messaging.
01:36
I spent hours drinking tea with outspoken anti-Western clerics
01:40
and even dialogued with some suspected terrorists,
01:43
and while many of these engagements began with mutual suspicion,
01:47
I don't recall any of them resulting in shouting or insults,
01:50
and in some cases we even worked together on areas of mutual interest.
01:56
The most powerful tools we had were to simply listen, learn
02:00
and build empathy.
02:01
This is the essence of hearts and minds work,
02:05
because what I found again and again is that what most people wanted
02:08
was to feel heard, validated and respected.
02:12
And I believe that's what most of us want.
02:14
So what I see happening online today is especially heartbreaking
02:18
and a much harder problem to tackle.
02:20
We are being manipulated by the current information ecosystem
02:24
entrenching so many of us so far into absolutism
02:28
that compromise has become a dirty word.
02:31
Because right now,
02:32
social media companies like Facebook
02:35
profit off of segmenting us and feeding us personalized content
02:38
that both validates and exploits our biases.
02:42
Their bottom line depends on provoking a strong emotion
02:46
to keep us engaged,
02:47
often incentivizing the most inflammatory and polarizing voices,
02:52
to the point where finding common ground no longer feels possible.
02:56
And despite a growing chorus of people crying out for the platforms to change,
03:00
it's clear they will not do enough on their own.
03:04
So governments must define the responsibility
03:07
for the real-world harms being caused by these business models
03:11
and impose real costs on the damaging effects
03:13
they're having to our public health, our public square and our democracy.
03:19
But unfortunately, this won't happen in time for the US presidential election,
03:23
so I am continuing to raise this alarm,
03:26
because even if one day we do have strong rules in place,
03:29
it will take all of us to fix this.
03:33
When I started shifting my focus from threats abroad
03:36
to the breakdown in civil discourse at home,
03:38
I wondered if we could repurpose some of these hearts and minds campaigns
03:42
to help heal our divides.
03:44
Our more than 200-year experiment with democracy works
03:48
in large part because we are able to openly and passionately
03:52
debate our ideas for the best solutions.
03:55
But while I still deeply believe
03:56
in the power of face-to-face civil discourse,
03:59
it just cannot compete
04:00
with the polarizing effects and scale of social media right now.
04:04
The people who are sucked down these rabbit holes
04:07
of social media outrage
04:08
often feel far harder to break of their ideological mindsets
04:12
than those vulnerable communities I worked with ever were.
04:15
So when Facebook called me in 2018
04:17
and offered me this role
04:19
heading its elections integrity operations for political advertising,
04:23
I felt I had to say yes.
04:25
I had no illusions that I would fix it all,
04:27
but when offered the opportunity
04:29
to help steer the ship in a better direction,
04:31
I had to at least try.
04:34
I didn't work directly on polarization,
04:36
but I did look at which issues were the most divisive in our society
04:41
and therefore the most exploitable in elections interference efforts,
04:45
which was Russia's tactic ahead of 2016.
04:48
So I started by asking questions.
04:50
I wanted to understand the underlying systemic issues
04:53
that were allowing all of this to happen,
04:56
in order to figure out how to fix it.
04:59
Now I still do believe in the power of the internet
05:02
to bring more voices to the table,
05:04
but despite their stated goal of building community,
05:07
the largest social media companies as currently constructed
05:11
are antithetical to the concept of reasoned discourse.
05:14
There's no way to reward listening,
05:17
to encourage civil debate
05:19
and to protect people who sincerely want to ask questions
05:22
in a business where optimizing engagement and user growth
05:26
are the two most important metrics for success.
05:29
There's no incentive to help people slow down,
05:32
to build in enough friction that people have to stop,
05:36
recognize their emotional reaction to something,
05:38
and question their own assumptions before engaging.
05:42
The unfortunate reality is:
05:44
lies are more engaging online than truth,
05:47
and salaciousness beats out wonky, fact-based reasoning
05:51
in a world optimized for frictionless virality.
05:55
As long as algorithms' goals are to keep us engaged,
05:58
they will continue to feed us the poison that plays to our worst instincts
06:02
and human weaknesses.
06:04
And yes, anger, mistrust,
06:08
the culture of fear, hatred:
06:09
none of this is new in America.
06:12
But in recent years, social media has harnessed all of that
06:16
and, as I see it, dramatically tipped the scales.
06:19
And Facebook knows it.
06:22
A recent "Wall Street Journal" article
06:23
exposed an internal Facebook presentation from 2018
06:28
that specifically points to the company's own algorithms
06:31
for growing extremist groups' presence on their platform
06:35
and for polarizing their users.
06:38
But keeping us engaged is how they make their money.
06:42
The modern information environment is crystallized around profiling us
06:46
and then segmenting us into more and more narrow categories
06:49
to perfect this personalization process.
06:53
We're then bombarded with information confirming our views,
06:56
reinforcing our biases,
06:58
and making us feel like we belong to something.
07:02
These are the same tactics we would see terrorist recruiters
07:05
using on vulnerable youth,
07:07
albeit in smaller, more localized ways before social media,
07:11
with the ultimate goal of persuading their behavior.
07:15
Unfortunately, I was never empowered by Facebook to have an actual impact.
07:20
In fact, on my second day, my title and job description were changed
07:24
and I was cut out of decision-making meetings.
07:27
My biggest efforts,
07:28
trying to build plans
07:29
to combat disinformation and voter suppression in political ads,
07:33
were rejected.
07:35
And so I lasted just shy of six months.
07:38
But here is my biggest takeaway from my time there.
07:41
There are thousands of people at Facebook
07:44
who are passionately working on a product
07:46
that they truly believe makes the world a better place,
07:50
but as long as the company continues to merely tinker around the margins
07:53
of content policy and moderation,
07:56
as opposed to considering
07:57
how the entire machine is designed and monetized,
08:01
they will never truly address how the platform is contributing
08:04
to hatred, division and radicalization.
08:08
And that's the one conversation I never heard happen during my time there,
08:12
because that would require fundamentally accepting
08:16
that the thing you built might not be the best thing for society
08:20
and agreeing to alter the entire product and profit model.
08:24
So what can we do about this?
08:27
I'm not saying that social media bears the sole responsibility
08:30
for the state that we're in today.
08:32
Clearly, we have deep-seated societal issues that we need to solve.
08:38
But Facebook's response, that it is just a mirror to society,
08:42
is a convenient attempt to deflect any responsibility
08:45
from the way their platform is amplifying harmful content
08:49
and pushing some users towards extreme views.
08:53
And Facebook could, if they wanted to,
08:56
fix some of this.
08:58
They could stop amplifying and recommending the conspiracy theorists,
09:02
the hate groups, the purveyors of disinformation
09:04
and, yes, in some cases even our president.
09:08
They could stop using the same personalization techniques
09:12
to deliver political rhetoric that they use to sell us sneakers.
09:16
They could retrain their algorithms
09:18
to focus on a metric other than engagement,
09:20
and they could build in guardrails to stop certain content from going viral
09:24
before being reviewed.
09:26
And they could do all of this
09:29
without becoming what they call the arbiters of truth.
09:33
But they've made it clear that they will not go far enough
09:36
to do the right thing without being forced to,
09:39
and, to be frank, why should they?
09:42
The markets keep rewarding them, and they're not breaking the law.
09:45
Because as it stands,
09:47
there are no US laws compelling Facebook, or any social media company,
09:52
to protect our public square,
09:54
our democracy
09:55
and even our elections.
09:57
We have ceded the decision-making on what rules to write and what to enforce
10:01
to the CEOs of for-profit internet companies.
10:05
Is this what we want?
10:07
A post-truth world where toxicity and tribalism
10:11
trump bridge-building and consensus-seeking?
10:14
I do remain optimistic that we still have more in common with each other
10:18
than the current media and online environment portray,
10:22
and I do believe that having more perspectives surface
10:25
makes for a more robust and inclusive democracy.
10:29
But not the way it's happening right now.
10:32
And it bears emphasizing, I do not want to kill off these companies.
10:36
I just want them held to a certain level of accountability,
10:39
just like the rest of society.
10:42
It is time for our governments to step up and do their jobs
10:46
of protecting our citizenry.
10:48
And while there isn't one magical piece of legislation
10:51
that will fix this all,
10:53
I do believe that governments can and must find the balance
10:57
between protecting free speech
11:00
and holding these platforms accountable for their effects on society.
11:04
And they could do so in part by insisting on actual transparency
11:08
around how these recommendation engines are working,
11:11
around how the curation, amplification and targeting are happening.
11:16
You see, I want these companies held accountable
11:19
not for if an individual posts misinformation
11:22
or extreme rhetoric,
11:23
but for how their recommendation engines spread it,
11:27
how their algorithms are steering people towards it,
11:30
and how their tools are used to target people with it.
11:35
I tried to make change from within Facebook and failed,
11:39
and so I've been using my voice again for the past few years
11:42
to continue sounding this alarm
11:44
and hopefully inspire more people to demand this accountability.
11:48
My message to you is simple:
11:51
pressure your government representatives
11:53
to step up and stop ceding our public square to for-profit interests.
11:59
Help educate your friends and family
12:01
about how they're being manipulated online.
12:04
Push yourselves to engage with people who aren't like-minded.
12:07
Make this issue a priority.
12:10
We need a whole-society approach to fix this.
12:15
And my message to the leaders of my former employer Facebook is this:
12:20
right now, people are using your tools exactly as they were designed
12:26
to sow hatred, division and distrust,
12:29
and you're not just allowing it, you are enabling it.
12:33
And yes, there are lots of great stories
about positive things happening on your platform and around the world,
but that doesn't make any of this okay.
And it's only getting worse as we head into our election,
12:35
of positive things happening on your platform around the globe,
235
755958
3685
ื•ืืคื™ืœื• ื™ื•ืชืจ ืžื“ืื™ื’,
12:39
but that doesn't make any of this OK.
236
759667
3226
ื ื™ืฆื‘ื™ื ื‘ืคื ื™ ื”ืžืฉื‘ืจ ื”ืคื•ื˜ื ืฆื™ืืœื™ ื”ื’ื“ื•ืœ ื‘ื™ื•ืชืจ ืฉืœื ื• ืขื“ ื”ื™ื•ื -
ืื ื”ืชื•ืฆืื•ืช ืœื ื™ื–ื›ื• ืœืืžื•ืŸ ื•ืื ื™ืคืจืฆื• ืžืขืฉื™ ืืœื™ืžื•ืช.
12:42
And it's only getting worse as we're heading into our election,
237
762917
2976
12:45
and even more concerning,
238
765917
1767
ืื– ื‘-2021, ื›ืืฉืจ ืชื’ื™ื“ื• ืคืขื ื ื•ืกืคืช โ€œืื ื—ื ื• ื™ื•ื“ืขื™ื ืฉืื ื—ื ื• ืฆืจื™ื›ื™ื ืœื”ืฉืชืคืจโ€œ,
12:47
face our biggest potential crisis yet,
239
767708
2476
12:50
if the results aren't trusted, and if violence breaks out.
240
770208
4143
ืื ื™ ืจื•ืฆื” ืฉืชื–ื›ืจื• ืืช ื”ืจื’ืข ื”ื–ื”,
ืžืฉื•ื ืฉืืœื• ื›ื‘ืจ ืœื ืจืง ื›ืžื” ืงื•ืœื•ืช ื‘ื•ื“ื“ื™ื ื‘ืฉื•ืœื™ื™ื.
12:54
So when in 2021 you once again say, "We know we have to do better,"
241
774375
5018
ืžื ื”ื™ื’ื™ื ื‘ืชื—ื•ืžื™ ื–ื›ื•ื™ื•ืช ืื–ืจื—, ืืงื“ืžืื™ื,
12:59
I want you to remember this moment,
242
779417
2684
ืขื™ืชื•ื ืื™ื, ืžืคืจืกืžื™ื, ื•ื”ืขื•ื‘ื“ื™ื ืฉืœื›ื ืขืฆืžื,
13:02
because it's no longer just a few outlier voices.
243
782125
3226
ืฆื•ืขืงื™ื ืžื›ืœ ื”ื’ื’ื•ืช
ืฉืงื•ื•ื™ ื”ืžื“ื™ื ื™ื•ืช ืฉืœื›ื ื•ื”ืืชื™ืงื” ื‘ืขืกืงื™ื ืฉืœื›ื
13:05
Civil rights leaders, academics,
244
785375
2268
ืคื•ื’ืขื™ื ื‘ืื ืฉื™ื ื•ื‘ื“ืžื•ืงืจื˜ื™ื”.
13:07
journalists, advertisers, your own employees,
245
787667
2976
13:10
are shouting from the rooftops
246
790667
2059
ืืชื ืื—ืจืื™ื ืœื”ื—ืœื˜ื•ืช ืฉืœื›ื,
13:12
that your policies and your business practices
247
792750
2601
ืื‘ืœ ืืชื ืœื ื™ื›ื•ืœื™ื ืขื•ื“ ืœื•ืžืจ, ืฉืœื ื™ื›ื•ืœืชื ืœืจืื•ืช ืืช ื–ื” ื‘ื.
13:15
are harming people and democracy.
248
795375
2417
ืชื•ื“ื”.
13:18
You own your decisions,
249
798875
2351
13:21
but you can no longer say that you couldn't have seen it coming.
250
801250
3667
13:26
Thank you.
251
806167
1250
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7