How the NSA betrayed the world's trust -- time to act | Mikko Hypponen

405,654 views ・ 2013-11-07

TED



Translator: Zhiting Chen · Reviewer: NAN-KUN WU
00:12
The two most likely largest inventions of our generation are the Internet and the mobile phone. They've changed the world. However, largely to our surprise, they also turned out to be the perfect tools for the surveillance state. It turned out that the capability to collect data, information and connections about basically any of us and all of us is exactly what we've been hearing throughout the summer through revelations and leaks about Western intelligence agencies, mostly U.S. intelligence agencies, watching over the rest of the world.
01:02
We've heard about these starting with the revelations from June 6. Edward Snowden started leaking information, top secret classified information, from the U.S. intelligence agencies, and we started learning about things like PRISM and XKeyscore and others. And these are examples of the kinds of programs U.S. intelligence agencies are running right now, against the whole rest of the world.
01:32
And if you look back at the forecasts on surveillance by George Orwell, well, it turns out that George Orwell was an optimist. (Laughter) We are right now seeing a much larger scale of tracking of individual citizens than he could have ever imagined.
01:56
And this here is the infamous NSA data center in Utah. Due to be opened very soon, it will be both a supercomputing center and a data storage center. You could basically imagine it has a large hall filled with hard drives storing data they are collecting. And it's a pretty big building. How big? Well, I can give you the numbers -- 140,000 square meters -- but that doesn't really tell you very much. Maybe it's better to imagine it as a comparison. You think about the largest IKEA store you've ever been in. This is five times larger. How many hard drives can you fit in an IKEA store? Right? It's pretty big. We estimate that just the electricity bill for running this data center is going to be in the tens of millions of dollars a year.
02:51
And this kind of wholesale surveillance means that they can collect our data and keep it basically forever, keep it for extended periods of time, keep it for years, keep it for decades. And this opens up completely new kinds of risks to us all. And what this is is that it is wholesale blanket surveillance on everyone.
03:18
Well, not exactly everyone, because the U.S. intelligence only has a legal right to monitor foreigners. They can monitor foreigners when foreigners' data connections end up in the United States or pass through the United States. And monitoring foreigners doesn't sound too bad until you realize that I'm a foreigner and you're a foreigner. In fact, 96 percent of the planet are foreigners. (Laughter) Right? So it is wholesale blanket surveillance of all of us, all of us who use telecommunications and the Internet.
03:58
But don't get me wrong: there are actually types of surveillance that are okay. I love freedom, but even I agree that some surveillance is fine. If law enforcement is trying to find a murderer, or they're trying to catch a drug lord or trying to prevent a school shooting, and they have leads and they have suspects, then it's perfectly fine for them to tap the suspect's phone and to intercept his Internet communications. I'm not arguing that at all, but that's not what programs like PRISM are about. They are not about doing surveillance on people that they have reason to suspect of some wrongdoings. They're about doing surveillance on people they know are innocent.
04:46
So, the four main arguments supporting surveillance like this. Well, the first of all is that whenever you start discussing these revelations, there will be naysayers trying to minimize the importance of these revelations, saying that we knew all this already, we knew it was happening, there's nothing new here. And that's not true. Don't let anybody tell you that we knew this already, because we did not know this already. Our worst fears might have been something like this, but we didn't know this was happening. Now we know for a fact it's happening. We didn't know about this. We didn't know about PRISM. We didn't know about XKeyscore. We didn't know about Cybertrans. We didn't know about DoubleArrow. We did not know about Skywriter -- all these different programs run by U.S. intelligence agencies. But now we do.
05:39
And we did not know that U.S. intelligence agencies go to extremes such as infiltrating standardization bodies to sabotage encryption algorithms on purpose. And what that means is that you take something which is secure, an encryption algorithm which is so secure that if you use that algorithm to encrypt one file, nobody can decrypt that file. Even if they take every single computer on the planet just to decrypt that one file, it's going to take millions of years. So that's basically perfectly safe, uncrackable. You take something which is that good and then you weaken it on purpose, making all of us less secure as an end result.
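The "millions of years" claim holds up to a back-of-the-envelope check. The speaker does not name a specific algorithm, so as an illustrative assumption take a modern 128-bit cipher such as AES-128 and grant the attacker wildly generous hardware:

```python
# Rough estimate of brute-forcing a strong cipher key, as described in the talk.
# AES-128 and the attack rates below are illustrative assumptions, not figures
# from the talk itself.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_years(key_bits: int, machines: int, keys_per_second: float) -> float:
    """Expected years to find a key: on average half the keyspace is searched."""
    expected_trials = 2 ** (key_bits - 1)
    return expected_trials / (machines * keys_per_second) / SECONDS_PER_YEAR

# "Every single computer on the planet": assume 10 billion devices,
# each testing a very generous one billion keys per second.
years = brute_force_years(128, machines=10**10, keys_per_second=10**9)
print(f"{years:.2e} years")  # on the order of 10^11 years
```

Even with those assumptions the expected search time is hundreds of billions of years, which is why weakening the algorithm itself, rather than attacking the keys, is the only practical way in.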
06:23
A real-world equivalent would be that intelligence agencies would force some secret pin code into every single house alarm so they could get into every single house because, you know, bad people might have house alarms, but it will also make all of us less secure as an end result. Backdooring encryption algorithms just boggles the mind.
06:46
But of course, these intelligence agencies are doing their job. This is what they have been told to do: do signals intelligence, monitor telecommunications, monitor Internet traffic. That's what they're trying to do, and since most, a very big part of the Internet traffic today is encrypted, they're trying to find ways around the encryption. One way is to sabotage encryption algorithms, which is a great example of how U.S. intelligence agencies are running loose. They are completely out of control, and they should be brought back under control.
07:21
So what do we actually know about the leaks? Everything is based on the files leaked by Mr. Snowden. The very first PRISM slides from the beginning of June detail a collection program where the data is collected from service providers, and they actually go and name the service providers they have access to. They even have a specific date on when the collection of data began for each of the service providers. So for example, they name the collection from Microsoft started on September 11, 2007, for Yahoo on March 12, 2008, and then others: Google, Facebook, Skype, Apple and so on.
08:04
And every single one of these companies denies it. They all say that this simply isn't true, that they are not giving backdoor access to their data. Yet we have these files. So is one of the parties lying, or is there some other alternative explanation? And one explanation would be that these parties, these service providers, are not cooperating. Instead, they've been hacked. That would explain it. They aren't cooperating. They've been hacked. In this case, they've been hacked by their own government.
08:44
That might sound outlandish, but we already have cases where this has happened, for example, the case of the Flame malware which we strongly believe was authored by the U.S. government, and which, to spread, subverted the security of the Windows Update network, meaning here, the company was hacked by their own government.
09:08
And there's more evidence supporting this theory as well. Der Spiegel, from Germany, leaked more information about the operations run by the elite hacker units operating inside these intelligence agencies. Inside NSA, the unit is called TAO, Tailored Access Operations, and inside GCHQ, which is the U.K. equivalent, it's called NAC, Network Analysis Centre.
09:36
And these recent leaks of these three slides detail an operation run by this GCHQ intelligence agency from the United Kingdom targeting a telecom here in Belgium. And what this really means is that an E.U. country's intelligence agency is breaching the security of a telecom of a fellow E.U. country on purpose, and they discuss it in their slides completely casually, business as usual. Here's the primary target, here's the secondary target, here's the teaming. They probably have a team building on Thursday evening in a pub. They even use cheesy PowerPoint clip art like, you know, "Success," when they gain access to services like this. What the hell?
10:31
And then there's the argument that okay, yes, this might be going on, but then again, other countries are doing it as well. All countries spy. And maybe that's true. Many countries spy, not all of them, but let's take an example. Let's take, for example, Sweden. I'm speaking of Sweden because Sweden has a little bit of a similar law to the United States. When your data traffic goes through Sweden, their intelligence agency has a legal right by the law to intercept that traffic. All right, how many Swedish decision makers and politicians and business leaders use, every day, U.S.-based services -- like, you know, run Windows or OS X, or use Facebook or LinkedIn, or store their data in clouds like iCloud or SkyDrive or Dropbox, or maybe use online services like Amazon Web Services or Salesforce? And the answer is, every single Swedish business leader does that every single day. And then we turn it around. How many American leaders use Swedish webmails and cloud services? And the answer is zero. So this is not balanced. It's not balanced by any means, not even close.
11:44
And when we do have the occasional European success story, even those, then, typically end up being sold to the United States. Like, Skype used to be secure. It used to be end-to-end encrypted. Then it was sold to the United States. Today, it no longer is secure. So once again, we take something which is secure and then we make it less secure on purpose, making all of us less secure as an outcome.
12:12
And then there's the argument that the United States is only fighting terrorists. It's the war on terror. You shouldn't worry about it. Well, it's not the war on terror. Yes, part of it is war on terror, and yes, there are terrorists, and they do kill and maim, and we should fight them, but we know through these leaks that they have used the same techniques to listen to phone calls of European leaders, to tap the email of presidents of Mexico and Brazil, to read email traffic inside the United Nations Headquarters and E.U. Parliament, and I don't think they are trying to find terrorists from inside the E.U. Parliament, right? It's not the war on terror. Part of it might be, and there are terrorists, but are we really thinking about terrorists as such an existential threat that we are willing to do anything at all to fight them? Are the Americans ready to throw away the Constitution and throw it in the trash just because there are terrorists? And the same thing with the Bill of Rights and all the amendments and the Universal Declaration of Human Rights and the E.U. conventions on human rights and fundamental freedoms and the press freedom? Do we really think terrorism is such an existential threat, we are ready to do anything at all?
13:34
But people are scared about terrorists, and then they think that maybe that surveillance is okay because they have nothing to hide. Feel free to survey me if that helps. And whoever tells you that they have nothing to hide simply hasn't thought about this long enough. (Applause)
14:00
Because we have this thing called privacy, and if you really think that you have nothing to hide, please make sure that's the first thing you tell me, because then I know that I should not trust you with any secrets, because obviously you can't keep a secret.
14:17
But people are brutally honest with the Internet, and when these leaks started, many people were asking me about this. And I have nothing to hide. I'm not doing anything bad or anything illegal. Yet, I have nothing that I would in particular like to share with an intelligence agency, especially a foreign intelligence agency. And if we indeed need a Big Brother, I would much rather have a domestic Big Brother than a foreign Big Brother.
14:49
And when the leaks started, the very first thing I tweeted about this was a comment about how, when you've been using search engines, you've been potentially leaking all that to U.S. intelligence. And two minutes later, I got a reply by somebody called Kimberly from the United States challenging me, like, why am I worried about this? What am I sending to worry about this? Am I sending naked pictures or something? And my answer to Kimberly was that what I'm sending is none of your business, and it should be none of your government's business either. Because that's what it's about. It's about privacy. Privacy is nonnegotiable. It should be built in to all the systems we use. (Applause)
15:38
And one thing we should all understand is that we are brutally honest with search engines. You show me your search history, and I'll find something incriminating or something embarrassing there in five minutes. We are more honest with search engines than we are with our families. Search engines know more about you than your family members know about you. And this is all the kind of information we are giving away, we are giving away to the United States.
16:10
And surveillance changes history. We know this through examples of corrupt presidents like Nixon. Imagine if he would have had the kind of surveillance tools that are available today. And let me actually quote the president of Brazil, Ms. Dilma Rousseff. She was one of the targets of NSA surveillance. Her email was read, and she spoke at the United Nations Headquarters, and she said, "If there is no right to privacy, there can be no true freedom of expression and opinion, and therefore, there can be no effective democracy." That's what it's about. Privacy is the building block of our democracies.
16:52
And to quote a fellow security researcher, Marcus Ranum, he said that the United States is right now treating the Internet as it would be treating one of its colonies. So we are back to the age of colonization, and we, the foreign users of the Internet, we should think about Americans as our masters.
17:15
So Mr. Snowden, he's been blamed for many things. Some are blaming him for causing problems for the U.S. cloud industry and software companies with these revelations -- and blaming Snowden for causing problems for the U.S. cloud industry would be the equivalent of blaming Al Gore for causing global warming. (Laughter) (Applause)
17:43
So, what is there to be done? Should we worry? No, we shouldn't worry. We should be angry, because this is wrong, and it's rude, and it should not be done. But that's not going to really change the situation. What's going to change the situation for the rest of the world is to try to steer away from systems built in the United States. And that's much easier said than done. How do you do that? A single country, any single country in Europe cannot replace and build replacements for the U.S.-made operating systems and cloud services. But maybe you don't have to do it alone. Maybe you can do it together with other countries. The solution is open source. By building together open, free, secure systems, we can go around such surveillance, and then one country doesn't have to solve the problem by itself. It only has to solve one little problem. And to quote a fellow security researcher, Haroon Meer, one country only has to make a small wave, but those small waves together become a tide, and the tide will lift all the boats up at the same time, and the tide we will build with secure, free, open-source systems will become the tide that will lift all of us up and above the surveillance state.
19:09
Thank you very much. (Applause)