How you can help transform the internet into a place of trust | Claire Wardle

52,343 views ・ 2019-11-15

TED



Translator: Nan Yang    Reviewer: Jiasi Hao
00:13
No matter who you are or where you live, I'm guessing that you have at least one relative that likes to forward those emails. You know the ones I'm talking about -- the ones with dubious claims or conspiracy videos. And you've probably already muted them on Facebook for sharing social posts like this one.

00:31
It's an image of a banana with a strange red cross running through the center. And the text around it is warning people not to eat fruits that look like this, suggesting they've been injected with blood contaminated with the HIV virus. And the social share message above it simply says, "Please forward to save lives."

00:49
Now, fact-checkers have been debunking this one for years, but it's one of those rumors that just won't die. A zombie rumor. And, of course, it's entirely false.

01:00
It might be tempting to laugh at an example like this, to say, "Well, who would believe this, anyway?" But the reason it's a zombie rumor is because it taps into people's deepest fears about their own safety and that of the people they love. And if you spend as much time as I have looking at misinformation, you know that this is just one example of many that taps into people's deepest fears and vulnerabilities.

01:23
Every day, across the world, we see scores of new memes on Instagram encouraging parents not to vaccinate their children. We see new videos on YouTube explaining that climate change is a hoax. And across all platforms, we see endless posts designed to demonize others on the basis of their race, religion or sexuality.

01:44
Welcome to one of the central challenges of our time. How can we maintain an internet with freedom of expression at the core, while also ensuring that the content that's being disseminated doesn't cause irreparable harms to our democracies, our communities and to our physical and mental well-being?

02:01
Because we live in the information age, yet the central currency upon which we all depend -- information -- is no longer deemed entirely trustworthy and, at times, can appear downright dangerous. This is thanks in part to the runaway growth of social sharing platforms that allow us to scroll through, where lies and facts sit side by side, but with none of the traditional signals of trustworthiness.

02:24
And goodness -- our language around this is horribly muddled. People are still obsessed with the phrase "fake news," despite the fact that it's extraordinarily unhelpful and used to describe a number of things that are actually very different: lies, rumors, hoaxes, conspiracies, propaganda. And I really wish we could stop using a phrase that's been co-opted by politicians right around the world, from the left and the right, used as a weapon to attack a free and independent press.

02:52
(Applause)

02:57
Because we need our professional news media now more than ever. And besides, most of this content doesn't even masquerade as news. It's memes, videos, social posts. And most of it is not fake; it's misleading. We tend to fixate on what's true or false. But the biggest concern is actually the weaponization of context.

03:18
Because the most effective disinformation has always been that which has a kernel of truth to it. Let's take this example from London, from March 2017, a tweet that circulated widely in the aftermath of a terrorist incident on Westminster Bridge. This is a genuine image, not fake.

03:35
The woman who appears in the photograph was interviewed afterwards, and she explained that she was utterly traumatized. She was on the phone to a loved one, and she wasn't looking at the victim out of respect. But it still was circulated widely with this Islamophobic framing, with multiple hashtags, including: #BanIslam.

03:52
Now, if you worked at Twitter, what would you do? Would you take that down, or would you leave it up? My gut reaction, my emotional reaction, is to take this down. I hate the framing of this image. But freedom of expression is a human right, and if we start taking down speech that makes us feel uncomfortable, we're in trouble.

04:11
And this might look like a clear-cut case, but, actually, most speech isn't. These lines are incredibly difficult to draw. What's a well-meaning decision by one person is outright censorship to the next.

04:22
What we now know is that this account, Texas Lone Star, was part of a wider Russian disinformation campaign, one that has since been taken down. Would that change your view? It would mine, because now it's a case of a coordinated campaign to sow discord.

04:38
And for those of you who'd like to think that artificial intelligence will solve all of our problems, I think we can agree that we're a long way away from AI that's able to make sense of posts like this.

04:48
So I'd like to explain three interlocking issues that make this so complex and then think about some ways we can consider these challenges.

04:57
First, we just don't have a rational relationship to information, we have an emotional one. It's just not true that more facts will make everything OK, because the algorithms that determine what content we see, well, they're designed to reward our emotional responses. And when we're fearful, oversimplified narratives, conspiratorial explanations and language that demonizes others is far more effective. And besides, many of these companies, their business model is attached to attention, which means these algorithms will always be skewed towards emotion.

05:30
Second, most of the speech I'm talking about here is legal. It would be a different matter if I was talking about child sexual abuse imagery or content that incites violence. It can be perfectly legal to post an outright lie.

05:45
But people keep talking about taking down "problematic" or "harmful" content, but with no clear definition of what they mean by that, including Mark Zuckerberg, who recently called for global regulation to moderate speech. And my concern is that we're seeing governments right around the world rolling out hasty policy decisions that might actually trigger much more serious consequences when it comes to our speech.

06:08
And even if we could decide which speech to take up or take down, we've never had so much speech. Every second, millions of pieces of content are uploaded by people right around the world in different languages, drawing on thousands of different cultural contexts. We've simply never had effective mechanisms to moderate speech at this scale, whether powered by humans or by technology.

06:30
And third, these companies -- Google, Twitter, Facebook, WhatsApp -- they're part of a wider information ecosystem. We like to lay all the blame at their feet, but the truth is, the mass media and elected officials can also play an equal role in amplifying rumors and conspiracies when they want to. As can we, when we mindlessly forward divisive or misleading content without trying. We're adding to the pollution.

06:57
I know we're all looking for an easy fix. But there just isn't one. Any solution will have to be rolled out at a massive scale, internet scale, and yes, the platforms, they're used to operating at that level. But can and should we allow them to fix these problems?

07:13
They're certainly trying. But most of us would agree that, actually, we don't want global corporations to be the guardians of truth and fairness online. And I also think the platforms would agree with that. And at the moment, they're marking their own homework. They like to tell us that the interventions they're rolling out are working, but because they write their own transparency reports, there's no way for us to independently verify what's actually happening.

07:38
(Applause)

07:41
And let's also be clear that most of the changes we see only happen after journalists undertake an investigation and find evidence of bias or content that breaks their community guidelines. So yes, these companies have to play a really important role in this process, but they can't control it.

07:59
So what about governments? Many people believe that global regulation is our last hope in terms of cleaning up our information ecosystem. But what I see are lawmakers who are struggling to keep up to date with the rapid changes in technology. And worse, they're working in the dark, because they don't have access to data to understand what's happening on these platforms. And anyway, which governments would we trust to do this? We need a global response, not a national one.

08:27
So the missing link is us. It's those people who use these technologies every day. Can we design a new infrastructure to support quality information? Well, I believe we can, and I've got a few ideas about what we might be able to actually do.

08:43
So firstly, if we're serious about bringing the public into this, can we take some inspiration from Wikipedia? They've shown us what's possible. Yes, it's not perfect, but they've demonstrated that with the right structures, with a global outlook and lots and lots of transparency, you can build something that will earn the trust of most people.

08:59
Because we have to find a way to tap into the collective wisdom and experience of all users. This is particularly the case for women, people of color and underrepresented groups. Because guess what? They are experts when it comes to hate and disinformation, because they have been the targets of these campaigns for so long. And over the years, they've been raising flags, and they haven't been listened to. This has got to change.

09:22
So could we build a Wikipedia for trust? Could we find a way that users can actually provide insights? They could offer insights around difficult content-moderation decisions. They could provide feedback when platforms decide they want to roll out new changes.

09:40
Second, people's experiences with information are personalized. My Facebook news feed is very different to yours. Your YouTube recommendations are very different to mine. That makes it impossible for us to actually examine what information people are seeing.

09:54
So could we imagine developing some kind of centralized open repository for anonymized data, with privacy and ethical concerns built in? Because imagine what we would learn if we built out a global network of concerned citizens who wanted to donate their social data to science.

10:13
Because we actually know very little about the long-term consequences of hate and disinformation on people's attitudes and behaviors. And what we do know, most of that has been carried out in the US, despite the fact that this is a global problem. We need to work on that, too.

10:28
And third, can we find a way to connect the dots? No one sector, let alone nonprofit, start-up or government, is going to solve this. But there are very smart people right around the world working on these challenges, from newsrooms, civil society, academia, activist groups. And you can see some of them here.

10:46
Some are building out indicators of content credibility. Others are fact-checking, so that false claims, videos and images can be down-ranked by the platforms. A nonprofit I helped to found, First Draft, is working with normally competitive newsrooms around the world to help them build out investigative, collaborative programs.

11:03
And Danny Hillis, a software architect, is designing a new system called The Underlay, which will be a record of all public statements of fact connected to their sources, so that people and algorithms can better judge what is credible. And educators around the world are testing different techniques for finding ways to make people critical of the content they consume.

11:24
All of these efforts are wonderful, but they're working in silos, and many of them are woefully underfunded. There are also hundreds of very smart people working inside these companies, but again, these efforts can feel disjointed, because they're actually developing different solutions to the same problems.

11:41
How can we find a way to bring people together in one physical location for days or weeks at a time, so they can actually tackle these problems together but from their different perspectives?

11:51
So can we do this? Can we build out a coordinated, ambitious response, one that matches the scale and the complexity of the problem? I really think we can. Together, let's rebuild our information commons. Thank you.

12:06
(Applause)