Translator: Anney Ye
Reviewer: Helen Chang
Let's say you despise Western democracy. Democracy, in all its trappings: free elections, town halls, endless debates about the proper role of government. Too messy, too unpredictable, too constraining for your taste. And the way these democracies band together and lecture everyone else about individual rights and freedoms -- it gets under your skin.

So what to do about it?
You can call out the hypocrisy and failures of Western democracies and explain how your way is better, but that's never really worked for you. What if you could get the people whose support is the very foundation of these democracies to start questioning the system? Make the idea occur in their own minds that democracy and its institutions are failing them, their elite are corrupt puppet masters and the country they knew is in free fall.
To do that, you'll need to infiltrate the information spheres of these democracies. You'll need to turn their most powerful asset -- an open mind -- into their greatest vulnerability. You'll need people to question the truth.
Now, you'll be familiar with the hacking and leaks that happened in 2016. One was the Democratic National Committee's networks, and the personal email accounts of its staff, later released on WikiLeaks. After that, various online personas, like a supposed Romanian cybercriminal who didn't speak Romanian, aggressively pushed news of these leaks to journalists.
The media took the bait. They were consumed by how much the DNC hated Bernie. At the time, it was that narrative that far outshined the news that a group of Russian government-sponsored hackers who we called "Advanced Persistent Threat 28," or "APT28" for short, was carrying out these operations against the US.

And there was no shortage of evidence.
This group of Russian government hackers hadn't just appeared out of nowhere in 2016. We had started tracking this group back in 2014. And the tools that APT28 used to compromise its victims' networks demonstrated a thoughtful, well-resourced effort that had taken place for now over a decade in Moscow's time zone, from about 9 am to 6 pm. APT28 loved to prey on the emails and contacts of journalists in Chechnya, the Georgian government, eastern European defense attachés -- all targets with an undeniable interest to the Russian government.
We weren't the only ones onto this. Governments and research teams across the world were coming to similar conclusions and observing the same types of operations.
But what Russia was doing in 2016 went far beyond espionage. The DNC hack was just one of many where stolen data was posted online accompanied by a sensational narrative, then amplified in social media for lightning-speed adoption by the media.
This didn't ring the alarm bells that a nation-state was trying to interfere with the credibility of another's internal affairs.

So why, collectively, did we not see this coming? Why did it take months before Americans understood that they were under a state-sponsored information attack?
The easy answer is politics. The Obama Administration was caught in a perfect catch-22. By raising the specter that the Russian government was interfering in the US presidential campaign, the Administration risked appearing to meddle in the campaign itself.
But the better answer, I think, is that the US and the West were utterly unequipped to recognize and respond to a modern information operation, despite the fact that the US had wielded information with devastating success in an era not so long ago.
Look, so while the US and the West spent the last 20 years caught up in cybersecurity -- what networks to harden, which infrastructure to deem critical, how to set up armies of cyber warriors and cyber commands -- Russia was thinking in far more consequential terms.
Before the first iPhone even hit the shelf, the Russian government understood the risks and the opportunity that technology provided, and the inter-communication and instant communication it provided us.
As our realities are increasingly based on the information that we're consuming at the palm of our hand, and from the news feeds that we're scanning, and the hashtags and stories that we see trending, the Russian government was the first to recognize how this evolution had turned your mind into the most exploitable device on the planet.
And your mind is particularly exploitable if you're accustomed to an unfettered flow of information, now increasingly curated to your own tastes. This panorama of information that's so interesting to you gives a state, or anyone for that matter, a perfect back door into your mind.
It's this new brand of state-sponsored information operations that can be that much more successful, more insidious, and harder for the target audience -- that includes the media -- to decipher and characterize. If you can get a hashtag trending on Twitter, or chum the waters with fake news directed to audiences primed to receive it, or drive journalists to dissect terabytes of email for a cent of impropriety -- all tactics used in Russian operations -- then you've got a shot at effectively camouflaging your operations in the mind of your target.
This is what Russia's long called "reflexive control." It's the ability to use information on someone else so that they make a decision on their own accord that's favorable to you.
This is nation-state-grade image control and perception management, and it's conducted by any means, with any tools, network-based or otherwise, that will achieve it.
Take this for another example. In early February 2014, a few weeks before Russia would invade Crimea, a phone call is posted on YouTube. In it, there's two US diplomats. They sound like they're playing kingmaker in Ukraine, and worse, they curse the EU for its lack of speed and leadership in resolving the crisis. The media covers the phone call, and then the ensuing diplomatic backlash leaves Washington and Europe reeling. And it creates a fissured response and a feckless attitude towards Russia's land grab in Ukraine. Mission accomplished.
So while hacked phone calls and emails and networks keep grabbing the headlines, the real operations are the ones that are influencing the decisions you make and the opinions you hold, all in the service of a nation-state's strategic interest. This is power in the information age.
And this information is all that much more seductive, all that much easier to take at face value and pass on, when it's authentic. Who's not interested in the truth that's presented in phone calls and emails that were never intended for public consumption?
But how meaningful is that truth if you don't know why it's being revealed to you?
We must recognize that this place where we're increasingly living, which we've quaintly termed "cyberspace," isn't defined by ones and zeroes, but by information and the people behind it. This is far more than a network of computers and devices. This is a network composed of minds interacting with computers and devices.
And for this network, there's no encryption, there's no firewall, no two-factor authentication, no password complex enough to protect you.
What you have for defense is far stronger, it's more adaptable, it's always running the latest version. It's the ability to think critically: call out falsehood, press for the facts.
And above all, you must have the courage to unflinchingly pursue the truth.

(Applause)