What will a future without secrets look like? | Alessandro Acquisti

202,533 views · 2013-10-18

TED


ืžืชืจื’ื: Shlomo Adam ืžื‘ืงืจ: Ido Dekkers
00:12 I would like to tell you a story
00:14 connecting the notorious privacy incident
00:18 involving Adam and Eve,
00:20 and the remarkable shift in the boundaries
00:24 between public and private which has occurred
00:27 in the past 10 years.
00:28 You know the incident.
00:30 Adam and Eve one day in the Garden of Eden
00:33 realize they are naked.
00:35 They freak out.
00:36 And the rest is history.
00:39 Nowadays, Adam and Eve
00:41 would probably act differently.
00:44 [@Adam Last nite was a blast! loved dat apple LOL]
00:46 [@Eve yep.. babe, know what happened to my pants tho?]
00:48 We do reveal so much more information
00:50 about ourselves online than ever before,
00:54 and so much information about us
00:55 is being collected by organizations.
00:58 Now there is much to gain and benefit
01:01 from this massive analysis of personal information,
01:03 or big data,
01:05 but there are also complex tradeoffs that come
01:08 from giving away our privacy. [The NSA among the Facebook friends]
01:11 And my story is about these tradeoffs.
01:15 We start with an observation which, in my mind,
01:18 has become clearer and clearer in the past few years,
01:21 that any personal information
01:23 can become sensitive information.
01:25 Back in the year 2000, about 100 billion photos
01:30 were shot worldwide,
01:31 but only a minuscule proportion of them
01:34 were actually uploaded online.
01:36 In 2010, only on Facebook, in a single month,
01:40 2.5 billion photos were uploaded,
01:43 most of them identified.
01:45 In the same span of time,
01:47 computers' ability to recognize people in photos
01:52 improved by three orders of magnitude.
01:55 What happens when you combine
01:57 these technologies together:
01:59 increasing availability of facial data;
02:01 improving facial recognizing ability by computers;
02:05 but also cloud computing,
02:07 which gives anyone in this theater
02:09 the kind of computational power
02:11 which a few years ago was only the domain
02:12 of three-letter agencies;
02:14 and ubiquitous computing,
02:16 which allows my phone, which is not a supercomputer,
02:18 to connect to the Internet
02:20 and do there hundreds of thousands
02:23 of face metrics in a few seconds?
02:25 Well, we conjecture that the result
02:28 of this combination of technologies
02:30 will be a radical change in our very notions
02:33 of privacy and anonymity.
02:35 To test that, we did an experiment
02:37 on Carnegie Mellon University campus.
02:39 We asked students who were walking by
02:41 to participate in a study,
02:43 and we took a shot with a webcam,
02:46 and we asked them to fill out a survey on a laptop.
02:48 While they were filling out the survey,
02:50 we uploaded their shot to a cloud-computing cluster,
02:53 and we started using a facial recognizer
02:55 to match that shot to a database
02:57 of some hundreds of thousands of images
03:00 which we had downloaded from Facebook profiles.
03:03 By the time the subject reached the last page
03:06 on the survey, the page had been dynamically updated
03:10 with the 10 best matching photos
03:12 which the recognizer had found,
03:14 and we asked the subjects to indicate
03:16 whether he or she found themselves in the photo.
03:20 Do you see the subject?
03:24 Well, the computer did, and in fact did so
03:27 for one out of three subjects.
03:29 So essentially, we can start from an anonymous face,
03:32 offline or online, and we can use facial recognition
03:36 to give a name to that anonymous face
03:38 thanks to social media data.
03:40 But a few years back, we did something else.
03:42 We started from social media data,
03:44 we combined it statistically with data
03:47 from U.S. government social security,
03:49 and we ended up predicting social security numbers,
03:52 which in the United States
03:54 are extremely sensitive information.
03:56 Do you see where I'm going with this?
03:58 So if you combine the two studies together,
04:01 then the question becomes,
04:02 can you start from a face and,
04:05 using facial recognition, find a name
04:07 and publicly available information
04:10 about that name and that person,
04:12 and from that publicly available information
04:14 infer non-publicly available information,
04:16 much more sensitive ones
04:18 which you link back to the face?
04:19 And the answer is, yes, we can, and we did.
04:21 Of course, the accuracy keeps getting worse.
04:24 [27% of subjects' first 5 SSN digits identified (with 4 attempts)]
04:25 But in fact, we even decided to develop an iPhone app
04:29 which uses the phone's internal camera
04:31 to take a shot of a subject
04:33 and then upload it to a cloud
04:34 and then do what I just described to you in real time:
04:37 looking for a match, finding public information,
04:39 trying to infer sensitive information,
04:41 and then sending back to the phone
04:44 so that it is overlaid on the face of the subject,
04:47 an example of augmented reality,
04:49 probably a creepy example of augmented reality.
04:51 In fact, we didn't develop the app to make it available,
04:55 just as a proof of concept.
04:57 In fact, take these technologies
04:59 and push them to their logical extreme.
05:01 Imagine a future in which strangers around you
05:04 will look at you through their Google Glasses
05:06 or, one day, their contact lenses,
05:08 and use seven or eight data points about you
05:12 to infer anything else
05:15 which may be known about you.
05:17 What will this future without secrets look like?
05:22 And should we care?
05:24 We may like to believe
05:26 that the future with so much wealth of data
05:29 would be a future with no more biases,
05:32 but in fact, having so much information
05:35 doesn't mean that we will make decisions
05:37 which are more objective.
05:39 In another experiment, we presented to our subjects
05:42 information about a potential job candidate.
05:44 We included in this information some references
05:47 to some funny, absolutely legal,
05:50 but perhaps slightly embarrassing information
05:52 that the subject had posted online.
05:54 Now interestingly, among our subjects,
05:57 some had posted comparable information,
06:00 and some had not.
06:02 Which group do you think
06:04 was more likely to judge harshly our subject?
06:09 Paradoxically, it was the group
06:10 who had posted similar information,
06:12 an example of moral dissonance.
06:15 Now you may be thinking,
06:17 this does not apply to me,
06:19 because I have nothing to hide.
06:21 But in fact, privacy is not about
06:23 having something negative to hide.
06:27 Imagine that you are the H.R. director
06:29 of a certain organization, and you receive rรฉsumรฉs,
06:32 and you decide to find more information about the candidates.
06:35 Therefore, you Google their names
06:37 and in a certain universe,
06:39 you find this information.
06:41 Or in a parallel universe, you find this information.
06:46 Do you think that you would be equally likely
06:49 to call either candidate for an interview?
06:51 If you think so, then you are not
06:54 like the U.S. employers who are, in fact,
06:56 part of our experiment, meaning we did exactly that.
07:00 We created Facebook profiles, manipulating traits,
07:03 then we started sending out rรฉsumรฉs to companies in the U.S.,
07:06 and we detected, we monitored,
07:07 whether they were searching for our candidates,
07:10 and whether they were acting on the information
07:12 they found on social media. And they were.
07:14 Discrimination was happening through social media
07:16 for equally skilled candidates.
07:19 Now marketers like us to believe
07:23 that all information about us will always
07:26 be used in a manner which is in our favor.
07:29 But think again. Why should that be always the case?
07:33 In a movie which came out a few years ago,
07:35 "Minority Report," a famous scene
07:38 had Tom Cruise walk in a mall
07:40 and holographic personalized advertising
07:44 would appear around him.
07:46 Now, that movie is set in 2054,
07:49 about 40 years from now,
07:51 and as exciting as that technology looks,
07:54 it already vastly underestimates
07:56 the amount of information that organizations
07:59 can gather about you, and how they can use it
08:01 to influence you in a way that you will not even detect.
08:04 So as an example, this is another experiment
08:07 actually we are running, not yet completed.
08:09 Imagine that an organization has access
08:11 to your list of Facebook friends,
08:13 and through some kind of algorithm
08:15 they can detect the two friends that you like the most.
08:19 And then they create, in real time,
08:21 a facial composite of these two friends.
08:24 Now studies prior to ours have shown that people
08:27 don't recognize any longer even themselves
08:30 in facial composites, but they react
08:32 to those composites in a positive manner.
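The facial composite described here is, at its simplest, a blend of two face images. The following is a crude sketch under the assumption that the two faces are already aligned to the same dimensions; real composite systems also warp facial landmarks into correspondence before blending, which this toy example skips. The tiny arrays stand in for actual photos.

```python
import numpy as np

def blend_faces(face_a, face_b, alpha=0.5):
    """Pixel-wise blend of two aligned face images (arrays of equal shape)."""
    if face_a.shape != face_b.shape:
        raise ValueError("faces must be aligned to the same dimensions")
    # Weighted average of the two images; alpha=0.5 mixes them equally.
    return (alpha * face_a + (1 - alpha) * face_b).astype(face_a.dtype)

# Tiny synthetic grayscale "faces" in place of the two friends' photos.
face_a = np.full((4, 4), 200, dtype=np.uint8)   # a brighter image
face_b = np.full((4, 4), 100, dtype=np.uint8)   # a darker image
composite = blend_faces(face_a, face_b)
print(composite[0, 0])  # → 150, halfway between the two source pixels
```

The point the talk makes is that a viewer would not consciously recognize either source face in such a composite, yet still responds to it more warmly than to a stranger's face.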
08:34 So next time you are looking for a certain product,
08:38 and there is an ad suggesting you to buy it,
08:40 it will not be just a standard spokesperson.
08:43 It will be one of your friends,
08:46 and you will not even know that this is happening.
08:49 Now the problem is that
08:51 the current policy mechanisms we have
08:54 to protect ourselves from the abuses of personal information
08:57 are like bringing a knife to a gunfight.
09:00 One of these mechanisms is transparency,
09:03 telling people what you are going to do with their data.
09:06 And in principle, that's a very good thing.
09:08 It's necessary, but it is not sufficient.
09:12 Transparency can be misdirected.
09:16 You can tell people what you are going to do,
09:18 and then you still nudge them to disclose
09:20 arbitrary amounts of personal information.
09:23 So in yet another experiment, this one with students,
09:26 we asked them to provide information
09:29 about their campus behavior,
09:31 including pretty sensitive questions, such as this one.
09:34 [Have you ever cheated in an exam?]
09:34 Now to one group of subjects, we told them,
09:36 "Only other students will see your answers."
09:39 To another group of subjects, we told them,
09:41 "Students and faculty will see your answers."
09:44 Transparency. Notification. And sure enough, this worked,
09:47 in the sense that the first group of subjects
09:48 were much more likely to disclose than the second.
09:51 It makes sense, right?
09:52 But then we added the misdirection.
09:54 We repeated the experiment with the same two groups,
09:57 this time adding a delay
09:59 between the time we told subjects
10:02 how we would use their data
10:04 and the time we actually started answering the questions.
10:09 How long a delay do you think we had to add
10:11 in order to nullify the inhibitory effect
10:16 of knowing that faculty would see your answers?
10:19 Ten minutes?
10:21 Five minutes?
10:23 One minute?
10:25 How about 15 seconds?
10:27 Fifteen seconds were sufficient to have the two groups
10:29 disclose the same amount of information,
10:31 as if the second group now no longer cares
10:34 for faculty reading their answers.
10:36 Now I have to admit that this talk so far
10:40 may sound exceedingly gloomy,
10:42 but that is not my point.
10:44 In fact, I want to share with you the fact that
10:46 there are alternatives.
10:48 The way we are doing things now is not the only way
10:51 they can be done, and certainly not the best way
10:54 they can be done.
10:56 When someone tells you, "People don't care about privacy,"
11:00 consider whether the game has been designed
11:03 and rigged so that they cannot care about privacy,
11:05 and coming to the realization that these manipulations occur
11:09 is already halfway through the process
11:10 of being able to protect yourself.
11:12 When someone tells you that privacy is incompatible
11:16 with the benefits of big data,
11:18 consider that in the last 20 years,
11:20 researchers have created technologies
11:22 to allow virtually any electronic transactions
11:26 to take place in a more privacy-preserving manner.
11:29 We can browse the Internet anonymously.
11:32 We can send emails that can only be read
11:35 by the intended recipient, not even the NSA.
11:38 We can have even privacy-preserving data mining.
11:41 In other words, we can have the benefits of big data
11:45 while protecting privacy.
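One concrete flavor of the privacy-preserving data mining mentioned here is differential privacy: publish aggregate statistics with carefully calibrated noise, so the aggregate stays useful while no individual's record can be pinned down. Below is a minimal sketch of the Laplace mechanism for a private count; the dataset, predicate, and epsilon value are all illustrative, not taken from the talk's studies.

```python
import random

def private_count(records, predicate, epsilon=0.5):
    """Count matching records, plus Laplace noise calibrated to epsilon.

    A count query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so noise drawn from Laplace(1/epsilon) makes
    the released value epsilon-differentially private.
    """
    true_count = sum(1 for r in records if predicate(r))
    # A Laplace(scale) sample is the difference of two Exp(mean=scale) samples.
    scale = 1.0 / epsilon
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

# Illustrative dataset: did each (hypothetical) student ever cheat on an exam?
records = [{"cheated": random.random() < 0.3} for _ in range(1000)]
noisy = private_count(records, lambda r: r["cheated"])
print(round(noisy))  # close to the true count, but deniable for any individual
```

The design point is the tradeoff the talk gestures at: the aggregate answer remains accurate to within a few units, yet any single respondent can plausibly deny having contributed a "yes."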
11:47
Of course, these technologies imply a shifting
272
707903
3791
ืžื•ื‘ืŸ ืฉืžื”ื˜ื›ื ื•ืœื•ื’ื™ื•ืช ื”ืืœื” ืžืฉืชืžืข ืฉื™ื ื•ื™
11:51
of cost and revenues
273
711694
1546
ืžื‘ื—ื™ื ืช ืขืœื•ืช ื•ื”ื›ื ืกื•ืช
11:53
between data holders and data subjects,
274
713240
2107
ื‘ื™ืŸ ื‘ืขืœื™ ื”ื ืชื•ื ื™ื ืœื‘ื™ืŸ ื ื•ืฉืื™ ื”ืžื™ื“ืข,
11:55
which is why, perhaps, you don't hear more about them.
275
715347
3453
ื•ืื•ืœื™ ื‘ื’ืœืœ ื–ื” ืœื ืžืจื‘ื™ื ืœืฉืžื•ืข ืขืœื™ื”ืŸ.
11:58
Which brings me back to the Garden of Eden.
276
718800
3706
ื•ื–ื” ืžื—ื–ื™ืจ ืื•ืชื™ ืœื’ืŸ-ืขื“ืŸ.
12:02
There is a second privacy interpretation
277
722506
2780
ื™ืฉ ืคืจืฉื ื•ืช ื ื•ืกืคืช, ืฉืงืฉื•ืจื” ื‘ืคืจื˜ื™ื•ืช,
12:05
of the story of the Garden of Eden
278
725286
1809
ืœืกื™ืคื•ืจ ื’ืŸ ื”ืขื“ืŸ
12:07
which doesn't have to do with the issue
279
727095
2096
ืฉืœื ื‘ื”ื›ืจื— ืžืชืขืกืง ืขื ื”ืกื•ื’ื™ื”
12:09
of Adam and Eve feeling naked
280
729191
2225
ืฉืœ ืขื™ืจื•ืžื ืฉืœ ืื“ื ื•ื—ื•ื•ื”
12:11
and feeling ashamed.
281
731416
2381
ื•ืชื—ื•ืฉืช ื”ื‘ื•ืฉื” ืฉืœื”ื.
12:13
You can find echoes of this interpretation
282
733797
2781
ืืคืฉืจ ืœืžืฆื•ื ื”ื“ื™ื ืœืคืจืฉื ื•ืช ื–ื•
12:16
in John Milton's "Paradise Lost."
283
736578
2782
ื‘"ื’ืŸ ื”ืขื“ืŸ ื”ืื‘ื•ื“" ืฉืœ ื’'ื•ืŸ ืžื™ืœื˜ื•ืŸ.
12:19
In the garden, Adam and Eve are materially content.
284
739360
4197
ื‘ื’ืŸ ื”ืขื“ืŸ, ืื“ื ื•ื—ื•ื•ื” ืžืกื•ืคืงื™ื ืžื‘ื—ื™ื ื” ื—ื•ืžืจื™ืช.
12:23
They're happy. They are satisfied.
285
743557
2104
ื”ื ืžืื•ืฉืจื™ื. ื”ื ืฉื‘ืขื™-ืจืฆื•ืŸ.
12:25
However, they also lack knowledge
286
745661
2293
ืื‘ืœ ื’ื ืื™ืŸ ืœื”ื ื™ื“ืข
12:27
and self-awareness.
287
747954
1640
ื•ืžื•ื“ืขื•ืช ืขืฆืžื™ืช.
12:29
The moment they eat the aptly named
288
749594
3319
ื‘ืจื’ืข ืฉื”ื ืื•ื›ืœื™ื ืžื”ืคืจื™ ื‘ืขืœ ื”ืฉื ื”ื”ื•ืœื,
12:32
fruit of knowledge,
289
752913
1293
ืคืจื™ ืขืฅ ื”ื“ืขืช,
12:34
that's when they discover themselves.
290
754206
2605
ื”ื ืžื’ืœื™ื ืืช ืขืฆืžื.
12:36
They become aware. They achieve autonomy.
291
756811
4031
ื”ื ื ืขืฉื™ื ืžื•ื“ืขื™ื. ื”ื ืจื•ื›ืฉื™ื ืขืฆืžืื•ืช.
12:40
The price to pay, however, is leaving the garden.
292
760842
3126
ืื‘ืœ ื”ืžื—ื™ืจ ืฉืขืœื™ื”ื ืœืฉืœื ื”ื•ื ืขื–ื™ื‘ืช ื’ืŸ ื”ืขื“ืŸ.
12:43
So privacy, in a way, is both the means
293
763968
3881
ื›ืš ืฉื”ืคืจื˜ื™ื•ืช, ื‘ืžื•ื‘ืŸ ืžืกื•ื™ื, ื”ื™ื ื’ื ื”ืืžืฆืขื™
12:47
and the price to pay for freedom.
294
767849
2962
ื•ื’ื ื”ืžื—ื™ืจ ืฉืœ ื”ื—ื•ืคืฉ.
12:50

Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us that they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.
13:20

Now there was one English author who anticipated this kind of future, where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.
14:06

So I do believe that one of the defining fights of our times will be the fight for control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will covertly manipulate us.
14:26

Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here: the awareness of what is going on, and in your hands, just a few clicks away.
14:48

Thank you.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7