What will a future without secrets look like? | Alessandro Acquisti

202,332 views ・ 2013-10-18

TED


Translator: Bert Chen  Reviewer: Kuo-Hsien Chiang
00:12
I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years. You know the incident. Adam and Eve one day in the Garden of Eden realize they are naked. They freak out. And the rest is history. Nowadays, Adam and Eve would probably act differently.
00:44
[@Adam Last nite was a blast! loved dat apple LOL]
[@Eve yep.. babe, know what happened to my pants tho?]
00:48
We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.
01:15
We start with an observation which, in my mind, has become clearer and clearer in the past few years, that any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, only on Facebook, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude.
01:55
What happens when you combine these technologies together: increasing availability of facial data; improving facial recognizing ability by computers; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and do there hundreds of thousands of face metrics in a few seconds?
02:25
Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity. To test that, we did an experiment on Carnegie Mellon University campus. We asked students who were walking by to participate in a study, and we took a shot with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles. By the time the subject reached the last page on the survey, the page had been dynamically updated with the 10 best matching photos which the recognizer had found, and we asked the subjects to indicate whether he or she found themselves in the photo. Do you see the subject?
03:24
Well, the computer did, and in fact did so for one out of three subjects. So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face thanks to social media data.
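[Editor's note: a minimal sketch of the kind of matching loop described here, assuming the face has already been turned into a feature vector by some off-the-shelf face-embedding model, and that a gallery of labeled embeddings was built offline from publicly available profile photos. The names and the toy vectors are illustrative, not the authors' actual system.]

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def top_matches(probe_embedding, gallery, k=10):
    """Rank labeled gallery faces by similarity to a just-captured shot.

    probe_embedding: feature vector for the webcam photo (produced by
        any face-embedding model -- not specified here).
    gallery: list of (name, embedding) pairs built from public profiles.
    Returns the k best matches, mirroring the "10 best matching photos"
    shown to subjects.
    """
    scored = [(name, cosine_similarity(probe_embedding, vec))
              for name, vec in gallery]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Toy example with made-up 3-dimensional embeddings.
gallery = [("profile_a", np.array([0.9, 0.1, 0.0])),
           ("profile_b", np.array([0.1, 0.9, 0.2])),
           ("profile_c", np.array([0.2, 0.8, 0.1]))]
print(top_matches(np.array([0.15, 0.85, 0.15]), gallery, k=2))
```

The heavy work is done offline: once the gallery embeddings are precomputed, each new webcam shot only needs one embedding plus a nearest-neighbor search, which is why the lookup fits inside the time it takes to fill out a survey.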
03:40
But a few years back, we did something else. We started from social media data, we combined it statistically with data from U.S. government social security, and we ended up predicting social security numbers, which in the United States are extremely sensitive information. Do you see where I'm going with this?
03:58
So if you combine the two studies together, then the question becomes, can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available information, much more sensitive ones which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse.
[27% of subjects' first 5 SSN digits identified (with 4 attempts)]
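[Editor's note: the SSN result rests on a public fact: before randomization in 2011, the first three digits of a U.S. SSN reflected the state where it was issued, and the next two followed a roughly predictable assignment sequence over time, so a birth state and birth date (often visible on social media) narrow the first five digits considerably. The sketch below only illustrates that reasoning; the lookup table and the group-number model are invented placeholders, not real SSA data or the authors' statistical model.]

```python
# Illustrative sketch only: the area-number ranges and group-number
# "timeline" below are made-up placeholders, not real assignment data.
HYPOTHETICAL_AREA_NUMBERS = {
    "PA": range(159, 212),   # placeholder range for Pennsylvania
    "OR": range(540, 545),   # placeholder range for Oregon
}

def estimated_group_numbers(birth_year, window=3):
    """Placeholder model of the (pre-2011) group-number sequence:
    returns a few group numbers assumed to have been in use around the
    time that birth cohort was typically issued an SSN."""
    center = (birth_year * 7) % 99 + 1   # stand-in, not the real sequence
    return [max(1, min(99, center + d)) for d in range(-window, window + 1)]

def candidate_prefixes(birth_state, birth_year, attempts=4):
    """Rank candidate first-five-digit prefixes (area + group) from
    public facts such as birth state and birth date -- the kind of
    public-to-non-public inference step the talk refers to."""
    candidates = []
    for area in HYPOTHETICAL_AREA_NUMBERS.get(birth_state, []):
        for group in estimated_group_numbers(birth_year):
            candidates.append(f"{area:03d}-{group:02d}")
    return candidates[:attempts]

print(candidate_prefixes("PA", 1986))
```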
04:25
But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject and then upload it to a cloud and then do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending it back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.
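[Editor's note: put together, the proof of concept is the same matching-and-inference chain run in a loop over camera frames, with the result drawn back over the live image. A rough sketch of that loop, using OpenCV for capture and drawing; `match_face` and `infer_sensitive_info` are hypothetical stand-ins for the cloud calls described above, not a real API.]

```python
import cv2  # OpenCV for camera capture and on-frame drawing

def match_face(frame):
    """Hypothetical cloud call: best-guess identity for the face in the
    frame (e.g. via the gallery search sketched earlier)."""
    return "Jane Doe"                               # placeholder

def infer_sensitive_info(name):
    """Hypothetical cloud call: look up public information for the name
    and infer non-public attributes from it."""
    return {"predicted_ssn_prefix": "xxx-xx"}       # placeholder

def run_overlay_demo():
    cam = cv2.VideoCapture(0)
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        name = match_face(frame)
        info = infer_sensitive_info(name)
        label = f"{name} | {info['predicted_ssn_prefix']}"
        # Overlay the inferred data on the live image (augmented reality).
        cv2.putText(frame, label, (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)
        cv2.imshow("demo", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cam.release()
    cv2.destroyAllWindows()
```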
04:57
In fact, take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?
05:24
We may like to believe that the future with so much wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective.
05:39
In another experiment, we presented to our subjects information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge harshly our subject? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.
06:15
Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide.
06:27
Imagine that you are the H.R. director of a certain organization, and you receive résumés, and you decide to find more information about the candidates. Therefore, you Google their names and in a certain universe, you find this information. Or in a parallel universe, you find this information. Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who are, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out résumés to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates, and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.
07:19
Now marketers like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that be always the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walk in a mall and holographic personalized advertising would appear around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.
08:04
So as an example, this is another experiment actually we are running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends. Now studies prior to ours have shown that people don't recognize any longer even themselves in facial composites, but they react to those composites in a positive manner. So next time you are looking for a certain product, and there is an ad suggesting you to buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.
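[Editor's note: the simplest way to picture the "facial composite" step is plain image blending: average the two friends' aligned photos. Real composites warp facial landmarks before averaging, but a 50/50 blend already shows the idea. The friend-ranking heuristic, function names, and file paths below are illustrative assumptions, not the experiment's actual algorithm.]

```python
from PIL import Image

def pick_top_friends(interactions, k=2):
    """Stand-in for "some kind of algorithm": rank friends by a simple
    engagement count and keep the top k."""
    return sorted(interactions, key=interactions.get, reverse=True)[:k]

def naive_composite(photo_a_path, photo_b_path, out_path="composite.png"):
    """Blend two face photos 50/50 as a crude facial composite.
    Production systems would align facial landmarks first."""
    a = Image.open(photo_a_path).convert("RGB")
    b = Image.open(photo_b_path).convert("RGB").resize(a.size)
    Image.blend(a, b, alpha=0.5).save(out_path)

# Example (hypothetical data): most-liked friends by interaction count.
friends = {"alice": 124, "bob": 87, "carol": 52}
print(pick_top_friends(friends))   # -> ['alice', 'bob']
```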
08:49
Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient. Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information.
09:23
So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one. [Have you ever cheated in an exam?] Now to one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers." Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right?
09:52
But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time they actually started answering the questions. How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds were sufficient to have the two groups disclose the same amount of information, as if the second group now no longer cares for faculty reading their answers.
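[Editor's note: the claim being tested is whether the admission rate in the "students and faculty" condition catches up with the "only other students" condition once a short delay is inserted. A minimal way to check that on counts is a two-proportion z-test; the numbers below are invented placeholders, not the study's data.]

```python
from math import sqrt

def two_proportion_ztest(x1, n1, x2, n2):
    """z statistic for H0: the two groups disclose at the same rate."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: admissions of cheating out of n respondents.
z_no_delay = two_proportion_ztest(42, 100, 25, 100)   # big gap -> |z| large
z_15s_delay = two_proportion_ztest(42, 100, 40, 100)  # gap gone -> |z| small
print(round(z_no_delay, 2), round(z_15s_delay, 2))
```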
10:36
Now I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you the fact that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done. When someone tells you, "People don't care about privacy," consider whether the game has been designed and rigged so that they cannot care about privacy, and coming to the realization that these manipulations occur is already halfway through the process of being able to protect yourself.
11:12
When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transactions to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can have even privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy.
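[Editor's note: "emails that can only be read by the intended recipient" is ordinary public-key encryption. One concrete illustration, using the widely available Python `cryptography` package (any comparable library would do): the sender encrypts with the recipient's public key, and only the holder of the matching private key can decrypt.]

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and publishes only the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone can encrypt a (short) message with the public key...
ciphertext = public_key.encrypt(b"meet me at the garden gate", oaep)

# ...but only the recipient's private key can recover it.
assert private_key.decrypt(ciphertext, oaep) == b"meet me at the garden gate"
```

Anonymous browsing and privacy-preserving analytics rest on the same general idea: the benefit is delivered without handing over the raw personal data.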
11:47
Of course, these technologies imply a shifting of cost and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them.
11:58
Which brings me back to the Garden of Eden. There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.
12:50
Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.
13:20
Now there was one English author who anticipated this kind of future where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.
14:06
So I do believe that one of the defining fights of our times will be the fight for the control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will hiddenly manipulate us. Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away. Thank you.
14:49
(Applause)