What will a future without secrets look like? | Alessandro Acquisti

202,219 views ・ 2013-10-18

TED



Translator: xuan wang   Reviewer: Jing Peng
00:12
I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years.
00:28
You know the incident. Adam and Eve one day in the Garden of Eden realize they are naked. They freak out. And the rest is history.
00:39
Nowadays, Adam and Eve would probably act differently.
[@Adam Last nite was a blast! loved dat apple LOL]
[@Eve yep.. babe, know what happened to my pants tho?]
00:48
We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.
01:15
We start with an observation which, in my mind, has become clearer and clearer in the past few years, that any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, only on Facebook, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude.
01:55
What happens when you combine these technologies together: increasing availability of facial data; improving facial recognizing ability by computers; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and do there hundreds of thousands of face metrics in a few seconds? Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity.
02:35
To test that, we did an experiment on Carnegie Mellon University campus. We asked students who were walking by to participate in a study, and we took a shot with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles. By the time the subject reached the last page on the survey, the page had been dynamically updated with the 10 best matching photos which the recognizer had found, and we asked the subjects to indicate whether he or she found themselves in the photo. Do you see the subject? Well, the computer did, and in fact did so for one out of three subjects. So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face thanks to social media data.
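Today the pipeline described here can be approximated end to end with commodity tools. Below is a minimal sketch, assuming the open-source `face_recognition` Python library and a local folder `profile_photos/` of already-downloaded, name-labeled images standing in for the social-media gallery; it illustrates the idea only, not the study's actual cluster code.

```python
# Minimal sketch of the re-identification pipeline described above.
# Assumptions (not from the talk): the `face_recognition` library is
# installed, and profile_photos/ holds name-labeled images such as
# "jane_doe.jpg" standing in for the downloaded profile database.
import os
import face_recognition

def build_gallery(folder="profile_photos"):
    """Encode every labeled photo into a 128-d face embedding."""
    names, encodings = [], []
    for filename in os.listdir(folder):
        image = face_recognition.load_image_file(os.path.join(folder, filename))
        faces = face_recognition.face_encodings(image)
        if faces:  # skip photos with no detectable face
            names.append(os.path.splitext(filename)[0])
            encodings.append(faces[0])
    return names, encodings

def best_matches(webcam_shot_path, names, encodings, top_k=10):
    """Return the top-k gallery names closest to the face in the webcam shot."""
    probe = face_recognition.face_encodings(
        face_recognition.load_image_file(webcam_shot_path))
    if not probe:
        return []
    distances = face_recognition.face_distance(encodings, probe[0])
    ranked = sorted(zip(distances, names))  # smaller distance = better match
    return [name for _, name in ranked[:top_k]]

if __name__ == "__main__":
    names, encodings = build_gallery()
    print(best_matches("webcam_shot.jpg", names, encodings))
```

The point of the sketch is that each step, embedding a face and running a nearest-neighbor search over public photos, is now a commodity operation.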
03:40
But a few years back, we did something else. We started from social media data, we combined it statistically with data from U.S. government social security, and we ended up predicting social security numbers, which in the United States are extremely sensitive information.
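The talk does not walk through the method, but the published idea (Acquisti and Gross, 2009) rests on how SSNs were assigned before 2011: the first three digits tracked the state of issuance and the next two advanced over time in a predictable order, so a person's birth state and birth date sharply narrow the plausible first five digits. The toy sketch below only illustrates that narrowing; its lookup tables are small hypothetical stand-ins, not the real assignment data or the authors' statistical model.

```python
# Toy illustration, NOT the authors' model: before SSN randomization in
# 2011, the first three digits (area number) depended on the state of
# issuance and the next two (group number) advanced over time in a
# predictable order. Knowing a birth state and year therefore shrinks
# the space of likely 5-digit prefixes. The tables below are tiny
# hypothetical excerpts for illustration only.
HYPOTHETICAL_AREA_RANGES = {
    "NH": range(1, 4),       # e.g. area numbers 001-003
    "PA": range(159, 212),   # e.g. area numbers 159-211
}

# Hypothetical "group number reached by year" observations for a state,
# standing in for what the real study estimated from public records.
HYPOTHETICAL_GROUP_BY_YEAR = {1985: 30, 1990: 50, 1995: 70}

def candidate_prefixes(birth_state, birth_year, group_window=3):
    """Enumerate plausible first-five-digit prefixes for a birth state/year."""
    years = sorted(HYPOTHETICAL_GROUP_BY_YEAR)
    # crude nearest-year estimate of the group number in use at birth
    nearest = min(years, key=lambda y: abs(y - birth_year))
    center = HYPOTHETICAL_GROUP_BY_YEAR[nearest]
    groups = range(max(1, center - group_window), center + group_window + 1)
    return [f"{area:03d}-{group:02d}"
            for area in HYPOTHETICAL_AREA_RANGES[birth_state]
            for group in groups]

print(candidate_prefixes("NH", 1991))  # a short list instead of 100,000 possibilities
```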
03:56
Do you see where I'm going with this? So if you combine the two studies together, then the question becomes, can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available information, much more sensitive ones which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse.
[27% of subjects' first 5 SSN digits identified (with 4 attempts)]
04:25
But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject and then upload it to a cloud and then do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.
04:57
In fact, take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?
05:24
We may like to believe that the future with so much wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective. In another experiment, we presented to our subjects information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge harshly our subject? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.
06:15
Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide.
06:27
Imagine that you are the H.R. director of a certain organization, and you receive résumés, and you decide to find more information about the candidates. Therefore, you Google their names and in a certain universe, you find this information. Or in a parallel universe, you find this information. Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who are, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out résumés to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates, and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.
07:19
Now marketers like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that be always the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walk in a mall and holographic personalized advertising would appear around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.
08:04
So as an example, this is another experiment actually we are running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends. Now studies prior to ours have shown that people don't recognize any longer even themselves in facial composites, but they react to those composites in a positive manner. So next time you are looking for a certain product, and there is an ad suggesting you to buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.
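A crude facial composite of the kind described can be produced with nothing more than image blending. The sketch below uses Pillow and assumes two roughly aligned, similar-size photos; real morphing systems align facial landmarks first, which this deliberately skips.

```python
# Minimal sketch of a naive facial composite: a 50/50 pixel blend of two
# photos. Assumes the files exist, are similar in size, and are roughly
# pre-aligned; production morphing would warp facial landmarks first.
from PIL import Image

def naive_composite(path_a, path_b, out_path="composite.jpg", alpha=0.5):
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB").resize(a.size)  # force matching sizes
    Image.blend(a, b, alpha).save(out_path)               # alpha=0.5 -> equal mix
    return out_path

# naive_composite("friend1.jpg", "friend2.jpg")
```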
08:49
Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient. Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information.
09:23
So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one. [Have you ever cheated in an exam?] Now to one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers." Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right? But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time we actually started answering the questions.
10:09
How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds were sufficient to have the two groups disclose the same amount of information, as if the second group now no longer cares for faculty reading their answers.
10:36
Now I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you the fact that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done. When someone tells you, "People don't care about privacy," consider whether the game has been designed and rigged so that they cannot care about privacy, and coming to the realization that these manipulations occur is already halfway through the process of being able to protect yourself. When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transactions to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can have even privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy.
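One concrete flavor of privacy-preserving data mining is differential privacy, which releases aggregate statistics with calibrated noise so that any single person's record has little influence on the output. Below is a minimal sketch of the standard Laplace mechanism for a counting query, on made-up data; it is illustrative, not any specific production system.

```python
# Minimal sketch of the Laplace mechanism from differential privacy:
# a count query has sensitivity 1 (one person changes the count by at
# most 1), so adding Laplace noise with scale 1/epsilon makes the
# released count epsilon-differentially private.
import random

def private_count(records, predicate, epsilon=0.5):
    true_count = sum(1 for r in records if predicate(r))
    # difference of two Exp(epsilon) draws is Laplace(0, 1/epsilon)
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Example: how many users in this made-up dataset are over 40?
users = [{"age": 34}, {"age": 51}, {"age": 47}, {"age": 29}]
print(private_count(users, lambda u: u["age"] > 40))
```

Queries with larger sensitivity need proportionally more noise, which is exactly the kind of utility-versus-privacy tradeoff the talk is about.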
11:47
Of course, these technologies imply a shifting of cost and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them. Which brings me back to the Garden of Eden.
12:02
There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.
12:50
Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.
13:20
Now there was one English author who anticipated this kind of future where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.
14:06
So I do believe that one of the defining fights of our times will be the fight for the control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will hiddenly manipulate us.
14:26
Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away.
14:48
Thank you.
(Applause)