Dear Facebook, this is how you're breaking democracy | Yael Eisenstat
114,784 views ・ 2020-09-24
Translator: LoHsien Huang
Reviewer: Yi-Ping Cho (Marssi)
00:13
Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home. As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led me to being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us and to talk about how to reclaim our public square.
01:19
I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest.
01:56
The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.
02:14
So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem entrenching so many of us so far into absolutism that compromise has become a dirty word. Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible.
02:56
And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own. So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having to our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.
03:33
When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often feel far harder to break of their ideological mindsets than those vulnerable communities I worked with ever were.
04:15
So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try.
04:34
I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in elections interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.
04:59
Now I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging.
05:42
The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.
06:04
And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users. But keeping us engaged is how they make their money.
06:42
The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of persuading their behavior.
07:15
Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.
07:38
But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and profit model.
08:24
So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.
08:53
And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.

09:33
But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law. Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies.
10:05
Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking? I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now.
10:32
And it bears emphasizing, I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society. It is time for our governments to step up and do their jobs of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening.
11:16
You see, I want these companies held accountable not for if an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.
11:35
I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability.
11:48
My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.
12:15
And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we're heading into our election, and even more concerning, face our biggest potential crisis yet, if the results aren't trusted, and if violence breaks out.

12:54
So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees, are shouting from the rooftops that your policies and your business practices are harming people and democracy. You own your decisions, but you can no longer say that you couldn't have seen it coming.
13:26
Thank you.