What will a future without secrets look like? | Alessandro Acquisti

TED ・ 2013-10-18
00:12
I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years.

00:28
You know the incident. Adam and Eve one day in the Garden of Eden realize they are naked. They freak out. And the rest is history.

00:39
Nowadays, Adam and Eve would probably act differently.

00:44
[@Adam Last nite was a blast! loved dat apple LOL]
[@Eve yep.. babe, know what happened to my pants tho?]

00:48
We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.

01:15
We start with an observation which, in my mind, has become clearer and clearer in the past few years: that any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, on Facebook alone, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude.

01:55
What happens when you combine these technologies together: increasing availability of facial data; improving facial recognizing ability by computers; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and perform hundreds of thousands of face metrics there in a few seconds? Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity.

02:35
To test that, we did an experiment on the Carnegie Mellon University campus. We asked students who were walking by to participate in a study, and we took a shot with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles. By the time the subject reached the last page of the survey, the page had been dynamically updated with the 10 best matching photos which the recognizer had found, and we asked the subjects to indicate whether he or she found themselves in the photo.
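To make the matching step concrete, here is a minimal Python sketch of the idea, assuming the open-source face_recognition library; it is an illustration only, not the pipeline used in the study. The folder name profile_photos/ and the file webcam_shot.jpg are hypothetical placeholders, and the sketch simply ranks every face it finds by embedding distance and keeps the ten closest.

# Illustration only: rank downloaded profile photos by facial similarity
# to a webcam shot. Paths are hypothetical; requires the open-source
# face_recognition package (pip install face_recognition).
import glob
import face_recognition

# Encode the face in the webcam shot of the subject.
shot = face_recognition.load_image_file("webcam_shot.jpg")
shot_encodings = face_recognition.face_encodings(shot)
if not shot_encodings:
    raise SystemExit("No face found in the webcam shot.")
query = shot_encodings[0]

# Encode every face found in the downloaded profile photos.
candidates = []  # list of (path, 128-d face encoding) pairs
for path in glob.glob("profile_photos/*.jpg"):
    image = face_recognition.load_image_file(path)
    for encoding in face_recognition.face_encodings(image):
        candidates.append((path, encoding))

# Rank candidates by distance to the query face and keep the 10 best matches.
distances = face_recognition.face_distance([enc for _, enc in candidates], query)
best_matches = sorted(zip([path for path, _ in candidates], distances),
                      key=lambda pair: pair[1])[:10]
for path, distance in best_matches:
    print(f"{path}\t{distance:.3f}")

In the study itself the comparison ran on a cloud-computing cluster against hundreds of thousands of images; the brute-force loop here is only to make the idea tangible.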
03:20
Do you see the subject? Well, the computer did, and in fact did so for one out of three subjects.

03:29
So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face thanks to social media data. But a few years back, we did something else. We started from social media data, we combined it statistically with data from the U.S. government's Social Security records, and we ended up predicting Social Security numbers, which in the United States are extremely sensitive information.

03:56
Do you see where I'm going with this? So if you combine the two studies together, then the question becomes: can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available, much more sensitive information which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse. [27% of subjects' first 5 SSN digits identified (with 4 attempts)]
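The statistical link rests on how Social Security numbers were assigned before 2011: the first three digits (the area number) depended on the state of the application, and the next two (the group number) advanced in a predictable sequence over time, so a birth state and birth date taken from a public profile sharply narrow the candidate space. The sketch below illustrates only that narrowing; the area-number table and the date-to-group estimate are hypothetical placeholders, not the published prediction model.

# Illustration of the idea only: public birth data narrows the SSN space.
# The area-number table and the group estimate are hypothetical placeholders.
from datetime import date

HYPOTHETICAL_AREA_NUMBERS = {  # state -> made-up area numbers for illustration
    "PA": [159, 160, 161],
    "OH": [268, 269, 270],
}

def candidate_prefixes(state: str, birthdate: date) -> list[str]:
    """Enumerate plausible first-five-digit prefixes (area + group).

    Pre-2011, group numbers advanced roughly with issuance date within each
    area, so a birth date maps to a small window of group numbers. The
    window here is a toy estimate, purely for illustration.
    """
    estimated_group = min(99, max(1, birthdate.year - 1936))  # toy mapping
    groups = range(max(1, estimated_group - 2), min(99, estimated_group + 2) + 1)
    return [f"{area:03d}-{group:02d}"
            for area in HYPOTHETICAL_AREA_NUMBERS.get(state, [])
            for group in groups]

# A profile saying "born 1990-05-14 in Pennsylvania" leaves only a handful of
# plausible prefixes instead of tens of thousands of equally likely ones.
print(candidate_prefixes("PA", date(1990, 5, 14)))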
04:25
But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject, upload it to the cloud, and then do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending it back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.
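Architecturally, the app is just a client-server round trip: capture a frame, send it to a cloud service, and overlay whatever comes back. Below is a minimal client-side sketch of that loop; the endpoint URL, the response fields, and the capture and overlay helpers are all hypothetical stand-ins, not the actual app.

# Hypothetical client-side loop for the real-time demo described above.
# The URL, response fields, and helper stubs are illustrative only.
import requests

RECOGNIZER_URL = "https://example.org/recognize"  # hypothetical cloud endpoint

def capture_frame() -> bytes:
    """Stub standing in for the phone camera; returns JPEG bytes."""
    with open("subject.jpg", "rb") as f:  # placeholder image file
        return f.read()

def recognize(jpeg_bytes: bytes) -> dict:
    """Send the shot to the cloud service and return the inferred profile."""
    response = requests.post(
        RECOGNIZER_URL,
        files={"image": ("shot.jpg", jpeg_bytes, "image/jpeg")},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed shape: {"name": ..., "public_info": ..., "inferred": ...}
    return response.json()

def overlay(profile: dict) -> None:
    """Stub for the augmented-reality overlay drawn over the subject's face."""
    print(f"{profile.get('name')}: {profile.get('inferred')}")

overlay(recognize(capture_frame()))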
04:57
In fact, take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?

05:24
We may like to believe that the future with so much wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective. In another experiment, we presented to our subjects information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge our subject harshly? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.

06:15
Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide.

06:27
Imagine that you are the H.R. director of a certain organization, and you receive résumés, and you decide to find more information about the candidates. Therefore, you Google their names and in a certain universe, you find this information. Or in a parallel universe, you find this information. Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who are, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out résumés to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates, and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.

07:19
Now marketers like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that always be the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walk in a mall and holographic personalized advertising would appear around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.

08:04
So as an example, this is another experiment that we are actually running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends.
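A facial composite, in its simplest form, is just a pixel-wise average of two roughly aligned face photos. The sketch below shows that simplest form using Pillow; the file names are hypothetical, and a real system would align facial landmarks before blending.

# Minimal composite: a 50/50 blend of two (assumed pre-aligned) face photos.
# File names are hypothetical; real systems align facial landmarks first.
from PIL import Image

friend_a = Image.open("friend_a.jpg").convert("RGB").resize((256, 256))
friend_b = Image.open("friend_b.jpg").convert("RGB").resize((256, 256))

composite = Image.blend(friend_a, friend_b, alpha=0.5)  # pixel-wise average
composite.save("composite.jpg")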
08:24
Now, studies prior to ours have shown that people no longer recognize even themselves in facial composites, but they react to those composites in a positive manner. So the next time you are looking for a certain product and there is an ad suggesting you buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.

08:49
Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient.

09:12
Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information. So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one. [Have you ever cheated in an exam?] Now to one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers." Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right? But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time they actually started answering the questions.

10:09
How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds were sufficient to have the two groups disclose the same amount of information, as if the second group now no longer cared about faculty reading their answers.

10:36
Now I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you the fact that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done. When someone tells you, "People don't care about privacy," consider whether the game has been designed and rigged so that they cannot care about privacy, and coming to the realization that these manipulations occur is already halfway through the process of being able to protect yourself. When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transaction to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can even have privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy. Of course, these technologies imply a shifting of cost and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them.
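One concrete example of privacy-preserving data mining is randomized response, an old survey technique now described as local differential privacy: each respondent perturbs their own answer, yet the aggregate rate can still be recovered. The sketch below applies it to a sensitive yes/no question like the exam-cheating item shown earlier; it is a textbook illustration of the general approach, not one of the specific systems referred to in the talk.

# Randomized response: each person answers truthfully with probability P_TRUTH,
# otherwise reports a fair coin flip. Individual answers stay deniable, but the
# population rate can be recovered from the noisy tallies.
import random

P_TRUTH = 0.5  # probability of answering truthfully

def noisy_answer(true_answer: bool) -> bool:
    if random.random() < P_TRUTH:
        return true_answer
    return random.random() < 0.5  # otherwise report a coin flip

def estimate_rate(noisy_answers: list[bool]) -> float:
    """Invert the noise: observed = P_TRUTH * true + (1 - P_TRUTH) * 0.5."""
    observed = sum(noisy_answers) / len(noisy_answers)
    return (observed - (1 - P_TRUTH) * 0.5) / P_TRUTH

# Simulate 10,000 respondents of whom 30% truly answer "yes".
truths = [random.random() < 0.30 for _ in range(10_000)]
reports = [noisy_answer(t) for t in truths]
print(f"estimated true rate: {estimate_rate(reports):.3f}")  # close to 0.30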
11:58
Which brings me back to the Garden of Eden. There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.

12:50
Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.

13:20
Now there was one English author who anticipated this kind of future where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.

14:06
So I do believe that one of the defining fights of our times will be the fight for control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will covertly manipulate us.

14:26
Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away.

14:48
Thank you.

14:49
(Applause)