The price of a "clean" internet | Hans Block and Moritz Riesewieck

TED · 2019-11-21
00:12
[This talk contains mature content]

00:16
Moritz Riesewieck: On March 23, 2013, users worldwide discovered in their news feed a video of a young girl being raped by an older man. Before this video was removed from Facebook, it had already been shared 16,000 times, and it had even been liked 4,000 times. This video went viral and infected the net.

00:49
Hans Block: And that was the moment we asked ourselves: How could something like this get on Facebook? And at the same time, why don't we see such content more often? After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google?

01:08
MR: While image-recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos, it has immense difficulty distinguishing pornographic content from holiday pictures, Adonis statues or breast-cancer screening campaigns. It can't distinguish Romeo and Juliet dying onstage from a real knife attack. It can't distinguish satire from propaganda, or irony from hatred, and so on and so forth.
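To make that limitation concrete, here is a deliberately naive sketch of such a pixel-level filter. The skin-tone rule is a classic heuristic from the computer-vision literature; the function names and the threshold are hypothetical, not any platform's actual system:

```python
# Illustrative only: a crude skin-exposure heuristic, not a real moderation
# system. It sees pixel statistics, never context or intent.
from PIL import Image

def skin_ratio(path: str) -> float:
    """Fraction of pixels that fall in a rough RGB skin-tone range."""
    pixels = list(Image.open(path).convert("RGB").getdata())
    skin = sum(
        1 for r, g, b in pixels
        if r > 95 and g > 40 and b > 20          # warm, skin-like tone
        and max(r, g, b) - min(r, g, b) > 15     # not grayscale
        and abs(r - g) > 15 and r > g and r > b
    )
    return skin / len(pixels)

def flag_for_human_review(path: str, threshold: float = 0.3) -> bool:
    # A beach holiday photo, an Adonis statue and pornography can all
    # clear the same threshold, while satire vs. propaganda leaves no
    # trace in pixel statistics at all.
    return skin_ratio(path) > threshold
```

Modern neural classifiers do far better than this sketch, but the gap the speakers describe is the same in kind: context and intent are not in the pixels, so the borderline cases go to a human.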
01:50
Therefore, humans are needed to decide which of the suspicious content should be deleted, and which should remain. Humans whom we know almost nothing about, because they work in secret. They sign nondisclosure agreements, which prohibit them from talking about and sharing what they see on their screens and what this work does to them. They are forced to use code words in order to hide who they work for. They are monitored by private security firms in order to ensure that they don't talk to journalists. And they are threatened with fines in case they speak. All of this sounds like a weird crime story, but it's true. These people exist, and they are called content moderators.

02:42
HB: We are the directors of the feature documentary film "The Cleaners," and we would like to take you to a world that many of you may not know yet. Here's a short clip of our film.

02:58
(Music)

03:04
(Video) Moderator: I need to be anonymous, because we have a contract signed. We are not allowed to declare whom we are working with. The reason why I speak to you is because the world should know that we are here. There is somebody who is checking the social media. We are doing our best to make this platform safe for all of them.

03:42
Delete. Ignore. Delete. Ignore. Delete. Ignore. Ignore. Delete.
03:58
HB: The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves, but from outsourcing firms around the world, in order to keep the wages low. Tens of thousands of young people looking at everything we are not supposed to see. And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse. Thousands of images in one shift -- ignore, delete, day and night.

04:27
And much of this work is done in Manila, where the analog toxic waste from the Western world was transported for years by container ships; now the digital waste is dumped there via fiber-optic cable. And just as the so-called scavengers rummage through gigantic tips on the edge of the city, the content moderators click their way through an endless toxic ocean of images and videos and all manner of intellectual garbage, so that we don't have to look at it.
04:58
MR: But unlike the wounds of the scavengers, those of the content moderators remain invisible. Full of shocking and disturbing content, these pictures and videos burrow into their memories where, at any time, they can have unpredictable effects: eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide. The pictures and videos infect them, and often never let them go again. If they are unlucky, they develop post-traumatic stress disorder, like soldiers after war missions.

05:39
In our film, we tell the story of a young man who had to monitor livestreams of self-mutilations and suicide attempts, again and again, and who eventually committed suicide himself. It's not an isolated case, as we've been told. This is the price all of us pay for our so-called clean and safe and "healthy" environments on social media.

06:10
Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds. What is posted on social media spreads so quickly, becomes viral and excites the minds of people all around the globe. Before it is deleted, it is often already too late. Millions of people have already been infected with hatred and anger, and they either become active online, by spreading or amplifying hatred, or they take to the streets and take up arms.
06:45
HB: Therefore, an army of content moderators sits in front of a screen to avoid new collateral damage. And they are deciding, as soon as possible, whether the content stays on the platform -- ignore -- or disappears -- delete.
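Stripped of everything proprietary, the workflow they describe reduces to a binary queue. A minimal sketch, with all names hypothetical, since the real tooling is secret:

```python
# Minimal sketch of the review loop described above; ticket fields and
# function names are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, Iterable

@dataclass
class Ticket:
    content_id: str
    reason: str  # e.g. "user report" or "classifier flag"

def work_shift(queue: Iterable[Ticket],
               decide: Callable[[Ticket], str]) -> Dict[str, int]:
    """Apply a human verdict to every ticket: 'ignore' keeps the content
    on the platform, 'delete' removes it."""
    tally = {"ignore": 0, "delete": 0}
    for ticket in queue:
        verdict = decide(ticket)  # seconds per item, thousands per shift
        if verdict not in tally:
            raise ValueError(f"expected 'ignore' or 'delete', got {verdict!r}")
        tally[verdict] += 1
    return tally
```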
07:01
But not every decision is as clear as the decision about a child-abuse video. What about controversial content, ambivalent content, uploaded by civil rights activists or citizen journalists? The content moderators often decide on such cases at the same speed as the [clear] cases.

07:21
MR: We will show you a video now, and we would like to ask you to decide: Would you delete it, or would you not delete it?

07:31
(Video) (Air strike sounds)

07:33
(Explosion)

07:40
(People speaking in Arabic)
07:46
MR: Yeah, we did some blurring for you. A child would potentially be dangerously disturbed and extremely frightened by such content. So, you'd rather delete it? But what if this video could help investigate the war crimes in Syria? What if nobody had heard about this air strike, because Facebook, YouTube and Twitter had decided to take it down?

08:12
Airwars, a nongovernmental organization based in London, tries to find those videos as quickly as possible, whenever they are uploaded to social media, in order to archive them. Because they know that, sooner or later, Facebook, YouTube and Twitter will take such content down.

08:31
People armed with their mobile phones can make visible what journalists often do not have access to. Civil rights groups often have no better option for quickly making their recordings accessible to a large audience than uploading them to social media. Wasn't this the empowering potential the World Wide Web was supposed to have? Weren't these the dreams people had about the World Wide Web in its early stages? Can't pictures and videos like these persuade people who have become insensitive to facts to rethink?
09:09
HB: But instead, everything that might be disturbing is deleted. And there's a general shift in society. Media, for example, more and more often use trigger warnings at the top of articles that some people may perceive as offensive or troubling. Or more and more students at universities in the United States demand the banishment from the curriculum of ancient classics that depict sexual violence or assault. But how far should we go with that?

09:37
Physical integrity is guaranteed as a human right in constitutions worldwide. In the Charter of Fundamental Rights of the European Union, this right expressly applies to mental integrity. But even if the potentially traumatic effect of images and videos is hard to predict, do we want to become so cautious that we risk losing social awareness of injustice? So what to do?
10:04
Mark Zuckerberg recently stated that in the future, the users -- we, or almost everybody -- will decide individually what they would like to see on the platform, by personal filter settings. So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like ...

10:25
MR: I'm the type of guy who doesn't mind seeing breasts, and I'm very interested in global warming, but I don't like war so much.

10:37
HB: Yeah, I'm more the opposite. I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes.
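As a hypothetical sketch, such "personal filter settings" might look like this; the category tags and the API are invented for illustration, not a real Facebook feature, and they encode only the on-stage example above:

```python
# Hypothetical per-user filter settings; category names and API are invented
# for illustration.
from dataclasses import dataclass, field
from typing import Set

@dataclass
class FilterSettings:
    hidden: Set[str] = field(default_factory=set)  # categories the user hides

    def allows(self, content_tags: Set[str]) -> bool:
        return not (content_tags & self.hidden)

mr = FilterSettings(hidden={"war"})      # fine with nudity, hides war
hb = FilterSettings(hidden={"nudity"})   # fine with guns, hides nudity

war_report = {"war", "guns"}
print(mr.allows(war_report))  # False -- MR never sees the report
print(hb.allows(war_report))  # True  -- HB does
```

The same post exists for one user and not for the other, which is exactly the objection that follows.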
10:46
MR: Come on, if we don't share a similar social consciousness, how shall we discuss social problems? How shall we call people to action? Even more isolated bubbles would emerge. One of the central questions is how, in the future, freedom of expression will be weighed against people's need for protection.

11:08
It's a matter of principle. Do we want to design a society for the digital space that is either open or closed? At the heart of the matter is "freedom versus security."

11:24
Facebook has always wanted to be a "healthy" platform. Above all, users should feel safe and secure. It's the same choice of words the content moderators in the Philippines used in a lot of our interviews.
11:40
(Video) The world that we are living in right now, I believe, is not really healthy. (Music) In this world, there is really an evil who exists. (Music) We need to watch for it. (Music) We need to control it -- good or bad. (Music)

12:10
[Look up, Young man! --God]
12:14
MR: For the young content moderators in the strictly Catholic Philippines, this is linked to a Christian mission: to counter the sins of the world, which spread across the web. "Cleanliness is next to godliness" is a saying everybody in the Philippines knows.

12:36
HB: And others motivate themselves by comparing themselves with their president, Rodrigo Duterte. He has been ruling the Philippines since 2016, and he won the election with the promise "I will clean up." And what that means is eliminating all kinds of problems by literally killing people on the streets who are supposed to be criminals, whatever that means. And since he was elected, an estimated 20,000 people have been killed.
13:03
And one moderator in our film says, "What Duterte does on the streets, I do for the internet."

13:10
And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world. They clean up, they polish everything clean, they free us from everything evil. Tasks formerly reserved for state authorities have been taken over by college graduates in their early 20s, equipped with three to five days of training -- this is the qualification -- who work on nothing less than the world's rescue.
13:38
MR: National sovereignties have been outsourced to private companies, which pass on their responsibilities to third parties. It's an outsourcing of the outsourcing of the outsourcing which takes place. With social networks, we are dealing with a completely new infrastructure, with its own mechanisms, its own logic of action and, therefore, its own new dangers, which had not yet existed in the predigitalized public sphere.
14:08
HB: When Mark Zuckerberg was at the US Congress or at the European Parliament, he was confronted with all kinds of critics. And his reaction was always the same: "We will fix that, and I will follow up on that with my team." But such a debate shouldn't be held in the back rooms of Facebook, Twitter or Google -- such a debate should be openly discussed in new, cosmopolitan parliaments, in new institutions that reflect the diversity of people contributing to a utopian project of a global network.

14:42
And while it may seem impossible to consider the values of users worldwide, it's worth believing that there's more that connects us than separates us.
14:53
MR: Yeah, at a time when populism is gaining strength, it becomes popular to justify the symptoms, to eradicate them, to make them invisible. This ideology is spreading worldwide, analog as well as digital, and it's our duty to stop it before it's too late. The question of freedom and democracy must not have only these two options.

15:25
HB: Delete.

15:26
MR: Or ignore.

15:29
HB: Thank you very much.

15:30
(Applause)