What will a future without secrets look like? | Alessandro Acquisti

202,332 views ・ 2013-10-18

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Wooran Lee / Reviewer: Gemma Lee
00:12
I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years. You know the incident. Adam and Eve one day in the Garden of Eden realize they are naked. They freak out. And the rest is history. Nowadays, Adam and Eve would probably act differently.

00:44
[@Adam Last nite was a blast! loved dat apple LOL]
[@Eve yep.. babe, know what happened to my pants tho?]

00:48
We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.

01:15
We start with an observation which, in my mind, has become clearer and clearer in the past few years, that any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, only on Facebook, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude. What happens when you combine these technologies together: increasing availability of facial data; improving facial recognizing ability by computers; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and do there hundreds of thousands of face metrics in a few seconds? Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity.

02:35
To test that, we did an experiment on Carnegie Mellon University campus. We asked students who were walking by to participate in a study, and we took a shot with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles. By the time the subject reached the last page on the survey, the page had been dynamically updated with the 10 best matching photos which the recognizer had found, and we asked the subjects to indicate whether he or she found themselves in the photo.
03:20
Do you see the subject? Well, the computer did, and in fact did so for one out of three subjects. So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face thanks to social media data.

03:40
But a few years back, we did something else. We started from social media data, we combined it statistically with data from U.S. government social security, and we ended up predicting social security numbers, which in the United States are extremely sensitive information.

03:56
Do you see where I'm going with this? So if you combine the two studies together, then the question becomes, can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available information, much more sensitive ones which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse.
[27% of subjects' first 5 SSN digits identified (with 4 attempts)]

04:25
But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject and then upload it to a cloud and then do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.

04:57
In fact, take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?

05:24
We may like to believe that the future with so much wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective. In another experiment, we presented to our subjects information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge harshly our subject? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.

06:15
Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide.

06:27
Imagine that you are the H.R. director of a certain organization, and you receive rΓ©sumΓ©s, and you decide to find more information about the candidates. Therefore, you Google their names and in a certain universe, you find this information. Or in a parallel universe, you find this information. Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who are, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out rΓ©sumΓ©s to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates, and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.

07:19
Now marketers like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that be always the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walk in a mall and holographic personalized advertising would appear around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.

08:04
So as an example, this is another experiment actually we are running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends. Now studies prior to ours have shown that people don't recognize any longer even themselves in facial composites, but they react to those composites in a positive manner. So next time you are looking for a certain product, and there is an ad suggesting you to buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.
08:49
Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient. Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information.

09:23
So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one.
[Have you ever cheated in an exam?]
Now to one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers." Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right? But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time we actually started answering the questions. How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds were sufficient to have the two groups disclose the same amount of information, as if the second group now no longer cares for faculty reading their answers.

10:36
Now I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you the fact that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done.

10:56
When someone tells you, "People don't care about privacy," consider whether the game has been designed and rigged so that they cannot care about privacy, and coming to the realization that these manipulations occur is already halfway through the process of being able to protect yourself.

11:12
When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transactions to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can have even privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy. Of course, these technologies imply a shifting of cost and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them.
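
One family of techniques behind "privacy-preserving data mining" is differential privacy, where aggregate statistics are published with noise calibrated so that no single person's data has more than a bounded effect on the result. The sketch below shows the basic Laplace mechanism for a count query; it is an illustrative assumption on my part, not a description of the specific technologies the speaker has in mind, and the example dataset is made up.

```python
# Minimal sketch of the Laplace mechanism from differential privacy:
# publish a count with noise scaled to the query's sensitivity.
import numpy as np

def private_count(records, predicate, epsilon=0.1):
    """Return a differentially private count of records satisfying predicate."""
    true_count = sum(1 for r in records if predicate(r))
    sensitivity = 1.0  # adding or removing one person changes a count by at most 1
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Hypothetical example: how many users in a dataset visited a sensitive page?
users = [{"visited_sensitive_page": (i % 7 == 0)} for i in range(1000)]
print(private_count(users, lambda u: u["visited_sensitive_page"]))
```

The analyst still gets a usable aggregate, while any individual can plausibly deny being in the data, which is the sense in which the benefits of big data and privacy protection can coexist.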
11:58
Which brings me back to the Garden of Eden. There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.

12:50
Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.

13:20
Now there was one English author who anticipated this kind of future where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.

14:06
So I do believe that one of the defining fights of our times will be the fight for the control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will hiddenly manipulate us.

14:26
Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away.

14:48
Thank you.

14:49
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7