Translator: Lilian Chiu
Reviewer: SF Huang
00:08
"You are a disgusting liar."

00:12
"Someone, somewhere will hunt you down."

00:17
"I hope someone puts a bullet between your eyes."

00:24
These are messages received by climate scientists.

00:29
According to a recent survey, 39 percent of climate scientists have faced online abuse. 18 percent of those are threats of physical violence.

00:44
"At the end of the day, we're going to see just how much you believe in your global warming and whether you're willing to die for your so-called 'research.'"

00:58
No scientist should have to fear for their lives. But this is just another day in the life of a climate scientist.
01:08
I'm not a climate scientist. I'm not a climate change activist. I'm a counterterrorism expert. I started my journey meeting with white supremacists in basements in Sweden and went on to lead a global policy effort after Europe's first major terrorist attack perpetrated by a white supremacist. I went on to found Moonshot, an organization that works to end violence online.

01:37
I care about climate change denial because it's so often weaponized to serve as a justification for violence.

01:47
It would be easy to think that if only we could get people to understand climate change is real, we could put an end to this. Unfortunately, it's not that simple.
02:00
In 2019, a gunman walked into a Walmart in El Paso, Texas. He killed 23 people, many of immigrant background. He called himself an "ecofascist." He believed in climate change, but he had bought into mis- and disinformation that immigrants were the root cause of it, that sustainability would only be possible with the elimination of people of color.

02:34
Mis- and disinformation are so often weaponized to serve as a justification for violence.

02:43
Although they're often used interchangeably, misinformation is information that's false or misleading. Disinformation is spread intentionally to cause harm. It's so powerful because it taps into your grievances, what makes you really angry, and it offers simplistic solutions. There's typically a villain and a hero.
03:09
Over the last two years, my team and I have been researching different kinds of manipulation tactics used all over the world to spread disinformation. Two of the most common were decontextualization and fearmongering.

03:25
Decontextualization is the practice of taking information out of its original context to deliberately mislead people. For example, earlier this year, Europe experienced a series of protests by farmers against a range of proposed environmental regulations. There were street blockades and protests, demonstrations, occupations. Adding to an already tense moment, several inauthentic images circulated. This one purported to show the Ukrainian embassy in Paris getting pummeled with manure. This was actually footage taken months earlier from an entirely different protest about an entirely different issue in Dijon, not even in Paris.
04:15
And this effort to mislead the public, it wouldn't be complete without the use of new technology. Here's an image showing the streets of Paris lined with bales of hay. It's a really striking image, isn't it? This never happened. It was entirely generated by AI.

04:36
And this isn't just happening in Europe. Last year, after wildfires raged in Hawaii, a disinformation network linked to the Chinese Communist Party spread inauthentic images purporting that the US government had intentionally spread the wildfires using a so-called "weather weapon." Can you imagine? Over a hundred people died in those wildfires, and the idea that those fires were deliberately set by their own government against their own people? It's terrifying.
05:16
These kinds of conspiratorial narratives can spread widespread fear, which takes us to the next powerful tactic of disinformation: fearmongering, deliberately exaggerating an issue so that you can provoke fear and alarm. We know that emotion-driven information processing can overtake evidence-based decision making, which is what makes this form of disinformation so effective.

05:46
It's for these reasons that a recent MIT study found a false story will travel six times quicker to reach 1,500 people than a true story will. And we know Facebook fact-checkers take up to 72 hours on average to identify and remove this content. By that time, most impressions have already been made.
06:11
Now I know we have all seen this online, and when you see it happen, it can be really tempting to respond with the facts. I get it. We pride ourselves on logic and science. The truth matters. So when someone is so obviously spreading false information, just correct them, right?

06:32
Unfortunately, this doesn't always work. Believe me, I spent the last two decades learning how to have conversations with people buying into white supremacy. That is disinformation at its worst. Disinformation wins because of the emotions it inspires, because of the way it makes people feel. So if someone is so bought into disinformation, getting into debates on the facts with them can just risk backing them even further into a corner, so that they get really defensive.

07:07
OK, so if we can't debate the facts endlessly, what can we do?
07:12
Last year, Moonshot partnered with Google to test an approach known as "prebunking." Prebunking is a proven communication technique designed to help people spot and reject efforts to manipulate them in the future. By giving people forewarning and giving them tools to be able to reject a manipulative message, you lessen the likelihood that they will be misled. This is not about telling people what is true or false or right or wrong. It's about empowering people to protect themselves. We've tapped into the universal human desire not to be manipulated, and this method has been tried and tested for decades, since the 1960s.
07:58
All prebunking messages contain three essential ingredients.

08:03
One: an emotional warning. You alert people that there are others out there who may be trying to mislead or manipulate them. Be aware, you may be targeted.

08:16
Two: stimulus. You show people examples of manipulative messaging so that they will be more likely to be able to identify those in the future.

08:30
And three: refutation. You give people the tools to be able to refute a message in real time. For example, if you see a headline that's really sensational, and it either seems too good to be true or it makes you really angry, always Google around for other sources. Always Google around.
08:55
OK, so we knew the steps to take, but we also knew if we were going to really get at this problem around the world, a one-size-fits-all approach wouldn't work. We knew we needed to get local.

09:09
So we partnered with civil society organizations in countries around the world, from Germany to Indonesia to Ukraine. And we started first with the evidence. We met with dozens of experts, we surveyed the online space, and we identified the most common manipulation tactics being used in each country. We then partnered with local filmmakers to create educational videos that would teach people about those manipulation tactics that were being used in their home country.
09:44
In some contexts, we found that people trust close peers and relatives the most. So in Germany, we filmed close friends chatting in a park. In Ukraine, we filmed family dialogues around a kitchen table, a setting that's so familiar to so many of us, where so many of us have had those difficult conversations. We wanted to encourage people to have these kinds of conversations within their own trusted circles, whether they're in El Salvador or Indonesia. And to do so before pivotal moments where online manipulation efforts intensify, like elections.
10:26
So as we prepared to head into the EU elections, we knew that distrust in climate science had already emerged as a critical misinformation theme.

10:38
Now one study had found that adults over the age of 45 are less likely to investigate false information when they stumble across it online. Now we also know that adults over the age of 45 have higher voter turnout, which means if it wins, disinformation can have a disproportionate impact on the outcomes of elections.
11:02
So as we prepared to head into the EU elections, we created content for every EU country, in 27 languages, aiming to empower Europeans to spot and reject efforts to manipulate them before the elections.
11:20
Over the last year, we have reached millions of people around the globe with these videos. In Germany alone, we reached 42 million people. That's half the German population. And we found, on average, viewers of these videos were up to 10 percent more likely to be able to identify manipulation efforts than those who hadn't seen those videos.

11:45
This is a winning formula. The evidence shows us that prebunking is effective at building resistance to disinformation. It begs the question, how do we make that resistance last? How do we build long-term societal resilience to disinformation efforts?
12:09
There is an ongoing effort to use disinformation to undermine our democracies. Just last month, the US Justice Department seized 32 internet domains secretly deployed by the Russian government to spread disinformation across the US and Europe. This included deliberate efforts to exploit anxieties and fear across the public about the energy transition, specifically to encourage violence.

12:40
Now it's not just the Russian government that we need to be worried about. Easy access to generative AI tools means that anyone, not just those with resources, money and power, can create high-quality, effective, powerful disinformation content. And the sources of disinformation are varied. They can come from our elected officials all the way through to our neighbors down the road. Many of us don't need to look further than our own families.
13:14
But so many of the tools we tested online are even more powerful when they come directly from the people that you trust and love the most in real life, IRL. So instead of endlessly debating the facts, give your loved ones the tools that they need to protect themselves online.

13:35
Information manipulation is unfortunately the new norm. But that doesn't mean we need to accept our loved ones being misled, and we shouldn't accept our climate scientists living in fear. So if we can't fact-check our way out of this problem, we need to beat disinformation at its own game by reaching people before disinformation does and giving them all the tools that they need to protect themselves online.

14:06
Thank you so much.

14:08
(Applause)