What will a future without secrets look like? | Alessandro Acquisti

TED ・ 2013-10-18

Translation: Wooran Lee · Review: Gemma Lee

00:12
I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years.

00:28
You know the incident. Adam and Eve one day in the Garden of Eden realize they are naked. They freak out. And the rest is history.

00:39
Nowadays, Adam and Eve would probably act differently.

00:44
[@Adam Last nite was a blast! loved dat apple LOL]

00:46
[@Eve yep.. babe, know what happened to my pants tho?]

00:48
We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.

01:15
We start with an observation which, in my mind, has become clearer and clearer in the past few years, that any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, only on Facebook, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude.

01:55
What happens when you combine these technologies together: increasing availability of facial data; improving facial recognizing ability by computers; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and do there hundreds of thousands of face metrics in a few seconds? Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity.

02:35
To test that, we did an experiment on Carnegie Mellon University campus. We asked students who were walking by to participate in a study, and we took a shot with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles. By the time the subject reached the last page on the survey, the page had been dynamically updated with the 10 best matching photos which the recognizer had found, and we asked the subjects to indicate whether he or she found themselves in the photo.

03:20
Do you see the subject? Well, the computer did, and in fact did so for one out of three subjects. So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face thanks to social media data.

03:40
But a few years back, we did something else. We started from social media data, we combined it statistically with data from U.S. government social security, and we ended up predicting social security numbers, which in the United States are extremely sensitive information.

03:56
Do you see where I'm going with this? So if you combine the two studies together, then the question becomes, can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available information, much more sensitive ones which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse.

04:24
[27% of subjects' first 5 SSN digits identified (with 4 attempts)]

04:25
But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject and then upload it to a cloud and then do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.

04:57
In fact, take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?

05:24
We may like to believe that the future with so much wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective. In another experiment, we presented to our subjects information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge harshly our subject? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.

06:15
Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide. Imagine that you are the H.R. director of a certain organization, and you receive résumés, and you decide to find more information about the candidates. Therefore, you Google their names and in a certain universe, you find this information. Or in a parallel universe, you find this information.

06:46
Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who are, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out résumés to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates, and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.

07:19
Now marketers like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that be always the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walk in a mall and holographic personalized advertising would appear around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.

08:04
So as an example, this is another experiment actually we are running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends.

08:24
Now studies prior to ours have shown that people don't recognize any longer even themselves in facial composites, but they react to those composites in a positive manner. So next time you are looking for a certain product, and there is an ad suggesting you to buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.

08:49
Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient.

09:12
Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information. So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one.

09:34
[Have you ever cheated in an exam?]

Now to one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers."

09:44
Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right? But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time we actually started answering the questions.

10:09
How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds were sufficient to have the two groups disclose the same amount of information, as if the second group now no longer cares for faculty reading their answers.

10:36
Now I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you the fact that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done. When someone tells you, "People don't care about privacy," consider whether the game has been designed and rigged so that they cannot care about privacy, and coming to the realization that these manipulations occur is already halfway through the process of being able to protect yourself.

11:12
When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transactions to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can have even privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy. Of course, these technologies imply a shifting of cost and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them.

11:58
Which brings me back to the Garden of Eden. There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.

12:50
Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.

13:20
Now there was one English author who anticipated this kind of future where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.

14:06
So I do believe that one of the defining fights of our times will be the fight for the control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will hiddenly manipulate us. Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away.

14:48
Thank you.

14:49
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7