The price of a "clean" internet | Hans Block and Moritz Riesewieck

66,430 views ・ 2019-11-21

TED


00:12
[This talk contains mature content]
00:16
Moritz Riesewieck: On March 23, 2013, users worldwide discovered in their news feed a video of a young girl being raped by an older man.
00:31
Before this video was removed from Facebook, it had already been shared 16,000 times and even liked 4,000 times.
00:45
This video went viral and infected the net.
00:49
Hans Block: And that was the moment we asked ourselves: How could something like this get on Facebook? And at the same time, why don't we see such content more often? After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google?
01:08
MR: While image-recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos, it has immense difficulty distinguishing pornographic content from holiday pictures, Adonis statues or breast-cancer screening campaigns. It can't distinguish Romeo and Juliet dying onstage from a real knife attack. It can't distinguish satire from propaganda, or irony from hatred, and so on and so forth.
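To make that limitation concrete, here is a minimal, purely illustrative sketch of the kind of pixel heuristic such software builds on; the RGB rule and the threshold are our assumptions, not any platform's actual model. It shows why a beach photo, a marble statue and a medical image can all score like pornography: the rule sees skin-toned pixels, never context.

```python
# Illustrative only: a crude skin-tone heuristic, not a real moderation model.
# The RGB rule and the 0.35 threshold are assumptions for demonstration.
from PIL import Image

def looks_like_skin(r: int, g: int, b: int) -> bool:
    # A classic naive rule: warm, red-dominant pixels read as "skin".
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def skin_ratio(path: str) -> float:
    # Fraction of pixels the rule classifies as skin.
    img = Image.open(path).convert("RGB").resize((128, 128))
    pixels = list(img.getdata())
    return sum(looks_like_skin(*p) for p in pixels) / len(pixels)

def flag_for_review(path: str, threshold: float = 0.35) -> bool:
    # A holiday picture, an Adonis statue or a breast-cancer screening
    # image can all cross this threshold; satire and irony are invisible.
    return skin_ratio(path) > threshold
```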
01:50
Therefore, humans are needed to decide which of the suspicious content should be deleted and which should remain.
02:00
Humans whom we know almost nothing about, because they work in secret.
02:06
They sign nondisclosure agreements, which prohibit them from talking about and sharing what they see on their screens and what this work does to them.
02:14
They are forced to use code words in order to hide who they work for.
02:19
They are monitored by private security firms in order to ensure that they don't talk to journalists.
02:26
And they are threatened with fines if they speak.
02:30
All of this sounds like a weird crime story, but it's true. These people exist, and they are called content moderators.
02:42
HB: We are the directors of the feature documentary film "The Cleaners," and we would like to take you to a world that many of you may not know yet. Here's a short clip of our film.
02:58
(Music)
03:04
(Video) Moderator: I need to be anonymous, because we have a contract signed. We are not allowed to declare whom we are working with. The reason why I speak to you is because the world should know that we are here. There is somebody who is checking the social media. We are doing our best to make this platform safe for all of them.
03:42
Delete.
Ignore.
Delete.
Ignore.
Delete.
Ignore.
Ignore.
Delete.
03:58
HB: The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves, but from outsourcing firms around the world in order to keep the wages low.
04:08
Tens of thousands of young people looking at everything we are not supposed to see. And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse. Thousands of images in one shift -- ignore, delete, day and night.
04:27
And much of this work is done in Manila, where for years the analog toxic waste of the Western world was transported by container ships; now the digital waste is dumped there via fiber-optic cable.
04:40
And just as the so-called scavengers rummage through gigantic tips on the edge of the city, the content moderators click their way through an endless toxic ocean of images and videos and all manner of intellectual garbage, so that we don't have to look at it.
04:58
MR: But unlike the wounds of the scavengers, those of the content moderators remain invisible.
05:06
Full of shocking and disturbing content, these pictures and videos burrow into their memories, where, at any time, they can have unpredictable effects: eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide.
05:26
The pictures and videos infect them and often never let them go again. If they are unlucky, they develop post-traumatic stress disorder, like soldiers after war missions.
05:39
In our film, we tell the story of a young man who had to monitor livestreams of self-mutilations and suicide attempts, again and again, and who eventually committed suicide himself. It's not an isolated case, as we've been told.
05:57
This is the price all of us pay for our so-called clean and safe and "healthy" environments on social media.
06:10
Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds.
06:18
What is posted on social media spreads so quickly, goes viral and excites the minds of people all around the globe. By the time it is deleted, it is often already too late. Millions of people have already been infected with hatred and anger, and they either become active online, by spreading or amplifying hatred, or they take to the streets and take up arms.
06:45
HB: Therefore, an army of content moderators sits in front of a screen to avoid new collateral damage. And they are deciding, as quickly as possible, whether the content stays on the platform -- ignore -- or disappears -- delete.
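As a thought experiment only (the queue, ticket fields and review callback here are hypothetical, not any platform's real system), the workflow just described reduces to a loop over a queue in which a human supplies the one bit the software cannot:

```python
# A deliberately minimal sketch of the ignore/delete workflow described above.
# All names are hypothetical; no real moderation API is depicted.
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Ticket:
    content_id: str
    media_url: str

def moderate(queue: Iterable[Ticket],
             human_review: Callable[[Ticket], str]) -> dict[str, str]:
    # Exactly two verdicts exist: "ignore" keeps the content on the
    # platform, "delete" removes it. There is no verdict for "it depends".
    verdicts: dict[str, str] = {}
    for ticket in queue:
        decision = human_review(ticket)  # a human, a few seconds per item
        if decision not in ("ignore", "delete"):
            raise ValueError(f"unknown verdict: {decision}")
        verdicts[ticket.content_id] = decision
    return verdicts
```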
07:01
But not every decision is as clear as the decision about a child-abuse video. What about controversial content, ambivalent content, uploaded by civil rights activists or citizen journalists? The content moderators often decide on such cases at the same speed as the [clear] cases.
07:21
MR: We will show you a video now, and we would like to ask you to decide: Would you delete it, or would you not delete it?
07:31
(Video) (Air strike sounds)
(Explosion)
(People speaking in Arabic)
07:46
MR: Yeah, we did some blurring for you. A child would potentially be dangerously disturbed and extremely frightened by such content. So, would you rather delete it?
07:59
But what if this video could help investigate war crimes in Syria? What if nobody had heard about this air strike, because Facebook, YouTube and Twitter had decided to take it down?
08:12
Airwars, a nongovernmental organization based in London, tries to find those videos as quickly as possible whenever they are uploaded to social media, in order to archive them. Because they know that, sooner or later, Facebook, YouTube and Twitter will take such content down.
08:31
People armed with their mobile phones can make visible what journalists often do not have access to. Civil rights groups often have no better option for quickly making their recordings accessible to a large audience than uploading them to social media.
08:47
Wasn't this the empowering potential the World Wide Web was supposed to have? Weren't these the dreams people had about the World Wide Web in its early days?
08:59
Can't pictures and videos like these persuade people who have become insensitive to facts to rethink?
09:09
HB: But instead, everything that might be disturbing is deleted. And there's a general shift in society. Media, for example, more and more often use trigger warnings at the top of articles whose content some people may perceive as offensive or troubling.
09:24
Or more and more students at universities in the United States demand that ancient classics depicting sexual violence or assault be banished from the curriculum.
09:34
But how far should we go with that? Physical integrity is guaranteed as a human right in constitutions worldwide. In the Charter of Fundamental Rights of the European Union, this right expressly applies to mental integrity.
09:51
But even if the potentially traumatic effect of images and videos is hard to predict, do we want to become so cautious that we risk losing social awareness of injustice?
10:03
So what to do? Mark Zuckerberg recently stated that in the future, the users -- we, or almost everybody -- will decide individually what they would like to see on the platform, by personal filter settings.
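What such personal filter settings might look like is easy to sketch; this is purely hypothetical (the category names and defaults are our invention, not anything Facebook has announced):

```python
# Hypothetical per-user filter settings, imagined from the statement above.
# Category names and defaults are invented for illustration.
from dataclasses import dataclass

@dataclass
class FilterSettings:
    # True = the user is willing to see this category in their feed.
    nudity: bool = False
    war_footage: bool = False
    weapons: bool = False

    def allows(self, labels: set[str]) -> bool:
        # Hide a post if it carries any label the user has opted out of.
        blocked = {name for name, opted_in in vars(self).items() if not opted_in}
        return not (labels & blocked)

# The two speakers' tongue-in-cheek preferences from the lines that follow:
moritz = FilterSettings(nudity=True, war_footage=False)
hans = FilterSettings(nudity=False, weapons=True)
```

Each user's feed would then be checked against their own settings, which is exactly what produces the isolated bubbles criticized a few lines below.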
10:18
So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like ...
10:25
MR: I'm the type of guy who doesn't mind seeing breasts, and I'm very interested in global warming, but I don't like war so much.
10:37
HB: Yeah, I'm more the opposite. I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes.
10:46
MR: Come on, if we don't share a similar social consciousness, how shall we discuss social problems? How shall we call people to action? Even more isolated bubbles would emerge.
10:59
One of the central questions is how, in the future, freedom of expression will be weighed against people's need for protection.
11:08
It's a matter of principle. Do we want to design either an open or a closed society for the digital space?
11:17
At the heart of the matter is "freedom versus security."
11:24
Facebook has always wanted to be a "healthy" platform. Above all, users should feel safe and secure. It's the same choice of words the content moderators in the Philippines used in a lot of our interviews.
11:40
(Video) The world that we are living in right now, I believe, is not really healthy.
(Music)
In this world, there is really an evil who exists.
(Music)
We need to watch for it.
(Music)
We need to control it -- good or bad.
(Music)
12:10
[Look up, Young man! --God]
12:14
MR: For the young content moderators in the strictly Catholic Philippines, this is linked to a Christian mission: to counter the sins of the world as they spread across the web.
12:28
"Cleanliness is next to godliness" is a saying everybody in the Philippines knows.
12:36
HB: And others motivate themselves by comparing themselves with their president, Rodrigo Duterte. He has been ruling the Philippines since 2016, and he won the election with the promise: "I will clean up."
12:49
And what that means is eliminating all kinds of problems by literally killing people on the streets who are supposed to be criminals, whatever that means. And since he was elected, an estimated 20,000 people have been killed.
13:03
And one moderator in our film says, "What Duterte does on the streets, I do for the internet."
13:10
And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world. They clean up, they polish everything clean, they free us from everything evil.
13:22
Tasks formerly reserved for state authorities have been taken over by college graduates in their early 20s, equipped with three to five days of training -- this is the qualification -- who work on nothing less than rescuing the world.
13:38
MR: National sovereignties have been outsourced to private companies, and they pass on their responsibilities to third parties. It's an outsourcing of the outsourcing of the outsourcing.
13:51
With social networks, we are dealing with a completely new infrastructure, with its own mechanisms, its own logic of action and, therefore, also its own new dangers, which did not yet exist in the pre-digital public sphere.
14:08
HB: When Mark Zuckerberg was at the US Congress or at the European Parliament, he was confronted with all kinds of criticism. And his reaction was always the same: "We will fix that, and I will follow up on that with my team."
14:23
But such a debate shouldn't be held in the back rooms of Facebook, Twitter or Google -- it should be discussed openly, in new, cosmopolitan parliaments, in new institutions that reflect the diversity of people contributing to a utopian project of a global network.
14:42
And while it may seem impossible to consider the values of users worldwide, it's worth believing that there's more that connects us than separates us.
14:53
MR: Yeah, at a time when populism is gaining strength, it becomes popular to justify the symptoms, to eradicate them, to make them invisible. This ideology is spreading worldwide, analog as well as digital, and it's our duty to stop it before it's too late.
15:17
The question of freedom and democracy must not have only these two options.
15:25
HB: Delete.
15:26
MR: Or ignore.
15:29
HB: Thank you very much.
15:30
(Applause)