Dear Facebook, this is how you're breaking democracy | Yael Eisenstat
00:13
Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home.
00:47
As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led me to being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us, and to talk about how to reclaim our public square.
01:19
I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest.
01:56
The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.
02:14
So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem, entrenching so many of us so far into absolutism that compromise has become a dirty word.
02:31
Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible. And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own.
03:04
So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.
03:33
When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now.
04:04
The people who are sucked down these rabbit holes of social media outrage often feel far harder to break of their ideological mindsets than those vulnerable communities I worked with ever were.
04:15
So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try.
04:34
I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in elections interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.
04:59
Now I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions, in a business where optimizing engagement and user growth are the two most important metrics for success.
05:29
There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging.
05:42
The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality.
05:55
As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses. And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it.
06:22
A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users. But keeping us engaged is how they make their money.
06:42
The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of changing their behavior.
07:15
Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.
07:38
But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and profit model.
08:24
So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views. And Facebook could, if they wanted to, fix some of this.
08:58
They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.
09:33
But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law. Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections.
09:57
We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies. Is this what we want?
10:07
A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking? I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now.
10:32
And it bears emphasizing, I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society. It is time for our governments to step up and do their jobs of protecting our citizenry.
10:48
And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening.
11:16
You see, I want these companies held accountable not for if an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.
11:35
I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability.
11:48
My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.
12:15
And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed, to sow hatred, division and distrust, and you're not just allowing it, you are enabling it.
12:33
And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we're heading into our election, and, even more concerning, we could face our biggest potential crisis yet if the results aren't trusted and if violence breaks out.
12:54
So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees are shouting from the rooftops that your policies and your business practices are harming people and democracy.
13:18
You own your decisions, but you can no longer say that you couldn't have seen it coming.

13:26
Thank you.