How you can help transform the internet into a place of trust | Claire Wardle

52,343 views ・ 2019-11-15

TED



Translator: Lilian Chiu  Reviewer: 易帆 余
00:13
No matter who you are or where you live, I'm guessing that you have at least one relative that likes to forward those emails. You know the ones I'm talking about -- the ones with dubious claims or conspiracy videos. And you've probably already muted them on Facebook for sharing social posts like this one. It's an image of a banana with a strange red cross running through the center. And the text around it is warning people not to eat fruits that look like this, suggesting they've been injected with blood contaminated with the HIV virus. And the social share message above it simply says, "Please forward to save lives."

00:49
Now, fact-checkers have been debunking this one for years, but it's one of those rumors that just won't die. A zombie rumor. And, of course, it's entirely false.

01:00
It might be tempting to laugh at an example like this, to say, "Well, who would believe this, anyway?" But the reason it's a zombie rumor is because it taps into people's deepest fears about their own safety and that of the people they love. And if you spend as much time as I have looking at misinformation, you know that this is just one example of many that taps into people's deepest fears and vulnerabilities.

01:23
Every day, across the world, we see scores of new memes on Instagram encouraging parents not to vaccinate their children. We see new videos on YouTube explaining that climate change is a hoax. And across all platforms, we see endless posts designed to demonize others on the basis of their race, religion or sexuality.

01:44
Welcome to one of the central challenges of our time. How can we maintain an internet with freedom of expression at the core, while also ensuring that the content that's being disseminated doesn't cause irreparable harms to our democracies, our communities and to our physical and mental well-being?

02:01
Because we live in the information age, yet the central currency upon which we all depend -- information -- is no longer deemed entirely trustworthy and, at times, can appear downright dangerous. This is thanks in part to the runaway growth of social sharing platforms that allow us to scroll through, where lies and facts sit side by side, but with none of the traditional signals of trustworthiness.

02:24
And goodness -- our language around this is horribly muddled. People are still obsessed with the phrase "fake news," despite the fact that it's extraordinarily unhelpful and used to describe a number of things that are actually very different: lies, rumors, hoaxes, conspiracies, propaganda. And I really wish we could stop using a phrase that's been co-opted by politicians right around the world, from the left and the right, used as a weapon to attack a free and independent press.

02:52
(Applause)

02:57
Because we need our professional news media now more than ever. And besides, most of this content doesn't even masquerade as news. It's memes, videos, social posts. And most of it is not fake; it's misleading. We tend to fixate on what's true or false. But the biggest concern is actually the weaponization of context. Because the most effective disinformation has always been that which has a kernel of truth to it.

03:23
Let's take this example from London, from March 2017, a tweet that circulated widely in the aftermath of a terrorist incident on Westminster Bridge. This is a genuine image, not fake. The woman who appears in the photograph was interviewed afterwards, and she explained that she was utterly traumatized. She was on the phone to a loved one, and she wasn't looking at the victim out of respect. But it still was circulated widely with this Islamophobic framing, with multiple hashtags, including: #BanIslam.

03:52
Now, if you worked at Twitter, what would you do? Would you take that down, or would you leave it up?

03:58
My gut reaction, my emotional reaction, is to take this down. I hate the framing of this image. But freedom of expression is a human right, and if we start taking down speech that makes us feel uncomfortable, we're in trouble. And this might look like a clear-cut case, but, actually, most speech isn't. These lines are incredibly difficult to draw. What's a well-meaning decision by one person is outright censorship to the next.

04:22
What we now know is that this account, Texas Lone Star, was part of a wider Russian disinformation campaign, one that has since been taken down. Would that change your view? It would mine, because now it's a case of a coordinated campaign to sow discord.

04:38
And for those of you who'd like to think that artificial intelligence will solve all of our problems, I think we can agree that we're a long way away from AI that's able to make sense of posts like this.

04:48
So I'd like to explain three interlocking issues that make this so complex and then think about some ways we can consider these challenges.

04:57
First, we just don't have a rational relationship to information; we have an emotional one. It's just not true that more facts will make everything OK, because the algorithms that determine what content we see, well, they're designed to reward our emotional responses. And when we're fearful, oversimplified narratives, conspiratorial explanations and language that demonizes others are far more effective. And besides, many of these companies, their business model is attached to attention, which means these algorithms will always be skewed towards emotion.

05:30
Second, most of the speech I'm talking about here is legal. It would be a different matter if I was talking about child sexual abuse imagery or content that incites violence. It can be perfectly legal to post an outright lie.

05:45
But people keep talking about taking down "problematic" or "harmful" content, with no clear definition of what they mean by that, including Mark Zuckerberg, who recently called for global regulation to moderate speech. And my concern is that we're seeing governments right around the world rolling out hasty policy decisions that might actually trigger much more serious consequences when it comes to our speech.

06:08
And even if we could decide which speech to take up or take down, we've never had so much speech. Every second, millions of pieces of content are uploaded by people right around the world in different languages, drawing on thousands of different cultural contexts. We've simply never had effective mechanisms to moderate speech at this scale, whether powered by humans or by technology.

06:30
And third, these companies -- Google, Twitter, Facebook, WhatsApp -- they're part of a wider information ecosystem. We like to lay all the blame at their feet, but the truth is, the mass media and elected officials can also play an equal role in amplifying rumors and conspiracies when they want to. As can we, when we mindlessly forward divisive or misleading content without trying. We're adding to the pollution.

06:57
I know we're all looking for an easy fix. But there just isn't one. Any solution will have to be rolled out at a massive scale, internet scale, and yes, the platforms, they're used to operating at that level. But can and should we allow them to fix these problems?

07:13
They're certainly trying. But most of us would agree that, actually, we don't want global corporations to be the guardians of truth and fairness online. And I also think the platforms would agree with that. And at the moment, they're marking their own homework. They like to tell us that the interventions they're rolling out are working, but because they write their own transparency reports, there's no way for us to independently verify what's actually happening.

07:38
(Applause)

07:41
And let's also be clear that most of the changes we see only happen after journalists undertake an investigation and find evidence of bias or content that breaks their community guidelines. So yes, these companies have to play a really important role in this process, but they can't control it.

07:59
So what about governments? Many people believe that global regulation is our last hope in terms of cleaning up our information ecosystem. But what I see are lawmakers who are struggling to keep up to date with the rapid changes in technology. And worse, they're working in the dark, because they don't have access to data to understand what's happening on these platforms. And anyway, which governments would we trust to do this? We need a global response, not a national one.

08:27
So the missing link is us. It's those people who use these technologies every day. Can we design a new infrastructure to support quality information?

08:38
Well, I believe we can, and I've got a few ideas about what we might be able to actually do.

08:43
So firstly, if we're serious about bringing the public into this, can we take some inspiration from Wikipedia? They've shown us what's possible. Yes, it's not perfect, but they've demonstrated that with the right structures, with a global outlook and lots and lots of transparency, you can build something that will earn the trust of most people. Because we have to find a way to tap into the collective wisdom and experience of all users. This is particularly the case for women, people of color and underrepresented groups. Because guess what? They are experts when it comes to hate and disinformation, because they have been the targets of these campaigns for so long. And over the years, they've been raising flags, and they haven't been listened to. This has got to change.

09:22
So could we build a Wikipedia for trust? Could we find a way that users can actually provide insights? They could offer insights around difficult content-moderation decisions. They could provide feedback when platforms decide they want to roll out new changes.

09:40
Second, people's experiences with the information are personalized. My Facebook news feed is very different to yours. Your YouTube recommendations are very different to mine. That makes it impossible for us to actually examine what information people are seeing. So could we imagine developing some kind of centralized open repository for anonymized data, with privacy and ethical concerns built in? Because imagine what we would learn if we built out a global network of concerned citizens who wanted to donate their social data to science. Because we actually know very little about the long-term consequences of hate and disinformation on people's attitudes and behaviors. And of what we do know, most has been carried out in the US, despite the fact that this is a global problem. We need to work on that, too.

10:28
And third, can we find a way to connect the dots? No one sector, let alone nonprofit, start-up or government, is going to solve this. But there are very smart people right around the world working on these challenges, from newsrooms, civil society, academia, activist groups. And you can see some of them here. Some are building out indicators of content credibility. Others are fact-checking, so that false claims, videos and images can be down-ranked by the platforms. A nonprofit I helped to found, First Draft, is working with normally competitive newsrooms around the world to help them build out investigative, collaborative programs. And Danny Hillis, a software architect, is designing a new system called The Underlay, which will be a record of all public statements of fact connected to their sources, so that people and algorithms can better judge what is credible. And educators around the world are testing different techniques for finding ways to make people critical of the content they consume.

11:24
All of these efforts are wonderful, but they're working in silos, and many of them are woefully underfunded. There are also hundreds of very smart people working inside these companies, but again, these efforts can feel disjointed, because they're actually developing different solutions to the same problems. How can we find a way to bring people together in one physical location for days or weeks at a time, so they can actually tackle these problems together but from their different perspectives?

11:51
So can we do this? Can we build out a coordinated, ambitious response, one that matches the scale and the complexity of the problem? I really think we can. Together, let's rebuild our information commons. Thank you.

12:06
(Applause)