We're building a dystopia just to make people click on ads | Zeynep Tufekci

738,629 views

2017-11-17 ・ TED



Translator: Anja Kolobarić  Reviewer: Lucija Jelić
00:12
So when people voice fears of artificial intelligence,
00:16
very often, they invoke images of humanoid robots run amok.
00:20
You know? Terminator?
00:22
You know, that might be something to consider,
00:24
but that's a distant threat.
00:26
Or, we fret about digital surveillance
00:30
with metaphors from the past.
00:31
"1984," George Orwell's "1984,"
00:34
it's hitting the bestseller lists again.
00:37
It's a great book,
00:39
but it's not the correct dystopia for the 21st century.
00:44
What we need to fear most
00:45
is not what artificial intelligence will do to us on its own,
00:50
but how the people in power will use artificial intelligence
00:55
to control us and to manipulate us
00:57
in novel, sometimes hidden,
01:01
subtle and unexpected ways.
01:04
Much of the technology
01:06
that threatens our freedom and our dignity in the near-term future
01:10
is being developed by companies
01:12
in the business of capturing and selling our data and our attention
01:17
to advertisers and others:
01:19
Facebook, Google, Amazon,
01:22
Alibaba, Tencent.
01:26
Now, artificial intelligence has started bolstering their business as well.
01:31
And it may seem like artificial intelligence
01:33
is just the next thing after online ads.
01:36
It's not.
01:37
It's a jump in category.
01:40
It's a whole different world,
01:42
and it has great potential.
01:45
It could accelerate our understanding of many areas of study and research.
01:53
But to paraphrase a famous Hollywood philosopher,
01:56
"With prodigious potential comes prodigious risk."
02:01
Now let's look at a basic fact of our digital lives, online ads.
02:05
Right? We kind of dismiss them.
02:08
They seem crude, ineffective.
02:10
We've all had the experience of being followed on the web
02:14
by an ad based on something we searched or read.
02:17
You know, you look up a pair of boots
02:18
and for a week, those boots are following you around everywhere you go.
02:22
Even after you succumb and buy them, they're still following you around.
02:26
We're kind of inured to that kind of basic, cheap manipulation.
02:29
We roll our eyes and we think, "You know what? These things don't work."
02:33
Except, online,
02:35
the digital technologies are not just ads.
02:40
Now, to understand that, let's think of a physical world example.
02:43
You know how, at the checkout counters at supermarkets, near the cashier,
02:48
there's candy and gum at the eye level of kids?
02:52
That's designed to make them whine at their parents
02:56
just as the parents are about to sort of check out.
03:00
Now, that's a persuasion architecture.
03:03
It's not nice, but it kind of works.
03:06
That's why you see it in every supermarket.
03:08
Now, in the physical world,
03:10
such persuasion architectures are kind of limited,
03:12
because you can only put so many things by the cashier. Right?
03:17
And the candy and gum, it's the same for everyone,
03:22
even though it mostly works
03:23
only for people who have whiny little humans beside them.
03:29
In the physical world, we live with those limitations.
03:34
In the digital world, though,
03:36
persuasion architectures can be built at the scale of billions
03:41
and they can target, infer, understand
03:45
and be deployed at individuals
03:48
one by one
03:49
by figuring out your weaknesses,
03:52
and they can be sent to everyone's phone private screen,
03:57
so it's not visible to us.
03:59
And that's different.
04:01
And that's just one of the basic things that artificial intelligence can do.
04:04
Now, let's take an example.
04:06
Let's say you want to sell plane tickets to Vegas. Right?
04:08
So in the old world, you could think of some demographics to target
04:12
based on experience and what you can guess.
04:15
You might try to advertise to, oh,
04:18
men between the ages of 25 and 35,
04:20
or people who have a high limit on their credit card,
04:24
or retired couples. Right?
04:26
That's what you would do in the past.
04:28
With big data and machine learning,
04:31
that's not how it works anymore.
04:33
So to imagine that,
04:35
think of all the data that Facebook has on you:
04:39
every status update you ever typed,
04:41
every Messenger conversation,
04:44
every place you logged in from,
04:48
all your photographs that you uploaded there.
04:51
If you start typing something and change your mind and delete it,
04:55
Facebook keeps those and analyzes them, too.
04:59
Increasingly, it tries to match you with your offline data.
05:03
It also purchases a lot of data from data brokers.
05:06
It could be everything from your financial records
05:09
to a good chunk of your browsing history.
05:12
Right? In the US, such data is routinely collected,
05:17
collated and sold.
05:20
In Europe, they have tougher rules.
05:23
So what happens then is,
05:26
by churning through all that data, these machine-learning algorithms --
05:30
that's why they're called learning algorithms --
05:33
they learn to understand the characteristics of people
05:38
who purchased tickets to Vegas before.
05:41
When they learn this from existing data,
05:45
they also learn how to apply this to new people.
05:49
So if they're presented with a new person,
05:52
they can classify whether that person is likely to buy a ticket to Vegas or not.
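[Editor's note: the learn-from-past-buyers step described above is, at its core, supervised classification. Here is a minimal toy sketch of that idea in pure Python — the features, data, and k-nearest-neighbor method are invented for illustration and bear no resemblance to the real, far more complex systems the talk is about.]

```python
# Toy supervised classifier: learn from past buyers, score a new person.
# All features and data points are fabricated for illustration only.
from math import dist

# (age, credit limit in thousands) -> did this person buy a Vegas ticket?
past_people = [
    ((28, 12.0), True),
    ((31, 15.0), True),
    ((67, 8.0),  True),   # e.g. a retired traveler
    ((19, 1.0),  False),
    ((22, 2.0),  False),
    ((45, 3.0),  False),
]

def likely_buyer(person, k=3):
    """Classify a new person by majority vote of the k most similar past people."""
    neighbors = sorted(past_people, key=lambda p: dist(p[0], person))[:k]
    votes = sum(1 for _, bought in neighbors if bought)
    return votes > k // 2

print(likely_buyer((30, 14.0)))  # profile close to past buyers -> True
```

The point of the toy: nothing in `likely_buyer` was hand-written for Vegas; the behavior comes entirely from whatever patterns sit in the training data, which is why such systems scale to patterns no human intended.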
05:57
Fine. You're thinking, an offer to buy tickets to Vegas.
06:03
I can ignore that.
06:04
But the problem isn't that.
06:06
The problem is,
06:08
we no longer really understand how these complex algorithms work.
06:12
We don't understand how they're doing this categorization.
06:16
It's giant matrices, thousands of rows and columns,
06:20
maybe millions of rows and columns,
06:23
and not the programmers
06:26
and not anybody who looks at it,
06:29
even if you have all the data,
06:30
understands anymore how exactly it's operating
06:35
any more than you'd know what I was thinking right now
06:39
if you were shown a cross section of my brain.
06:44
It's like we're not programming anymore,
06:46
we're growing intelligence that we don't truly understand.
06:52
And these things only work if there's an enormous amount of data,
06:56
so they also encourage deep surveillance on all of us
07:01
so that the machine learning algorithms can work.
07:04
That's why Facebook wants to collect all the data it can about you.
07:07
The algorithms work better.
07:08
So let's push that Vegas example a bit.
07:11
What if the system that we do not understand
07:16
was picking up that it's easier to sell Vegas tickets
07:21
to people who are bipolar and about to enter the manic phase.
07:25
Such people tend to become overspenders, compulsive gamblers.
07:31
They could do this, and you'd have no clue that's what they were picking up on.
07:35
I gave this example to a bunch of computer scientists once
07:39
and afterwards, one of them came up to me.
07:41
He was troubled and he said, "That's why I couldn't publish it."
07:45
I was like, "Couldn't publish what?"
07:47
He had tried to see whether you can indeed figure out the onset of mania
07:53
from social media posts before clinical symptoms,
07:56
and it had worked,
07:58
and it had worked very well,
08:00
and he had no idea how it worked or what it was picking up on.
08:06
Now, the problem isn't solved if he doesn't publish it,
08:11
because there are already companies
08:13
that are developing this kind of technology,
08:15
and a lot of the stuff is just off the shelf.
08:19
This is not very difficult anymore.
08:21
Do you ever go on YouTube meaning to watch one video
08:25
and an hour later you've watched 27?
08:28
You know how YouTube has this column on the right
08:31
that says, "Up next"
08:33
and it autoplays something?
08:35
It's an algorithm
08:36
picking what it thinks that you might be interested in
08:40
and maybe not find on your own.
08:41
It's not a human editor.
08:43
It's what algorithms do.
08:44
It picks up on what you have watched and what people like you have watched,
08:49
and infers that that must be what you're interested in,
08:53
what you want more of,
08:54
and just shows you more.
08:56
It sounds like a benign and useful feature,
08:59
except when it isn't.
09:01
So in 2016, I attended rallies of then-candidate Donald Trump
09:09
to study as a scholar the movement supporting him.
09:13
I study social movements, so I was studying it, too.
09:16
And then I wanted to write something about one of his rallies,
09:20
so I watched it a few times on YouTube.
09:23
YouTube started recommending to me
09:26
and autoplaying to me white supremacist videos
09:30
in increasing order of extremism.
09:33
If I watched one,
09:35
it served up one even more extreme
09:38
and autoplayed that one, too.
09:40
If you watch Hillary Clinton or Bernie Sanders content,
09:44
YouTube recommends and autoplays conspiracy left,
09:49
and it goes downhill from there.
09:52
Well, you might be thinking, this is politics, but it's not.
09:55
This isn't about politics.
09:56
This is just the algorithm figuring out human behavior.
09:59
I once watched a video about vegetarianism on YouTube
10:04
and YouTube recommended and autoplayed a video about being vegan.
10:09
It's like you're never hardcore enough for YouTube.
10:12
(Laughter)
10:14
So what's going on?
10:16
Now, YouTube's algorithm is proprietary,
10:20
but here's what I think is going on.
10:23
The algorithm has figured out
10:25
that if you can entice people
10:29
into thinking that you can show them something more hardcore,
10:32
they're more likely to stay on the site
10:35
watching video after video going down that rabbit hole
10:39
while Google serves them ads.
10:43
Now, with nobody minding the ethics of the store,
10:47
these sites can profile people
10:53
who are Jew haters,
10:56
who think that Jews are parasites
11:00
and who have such explicit anti-Semitic content,
11:06
and let you target them with ads.
11:09
They can also mobilize algorithms
11:12
to find for you look-alike audiences,
11:15
people who do not have such explicit anti-Semitic content on their profile
11:21
but who the algorithm detects may be susceptible to such messages,
11:27
and lets you target them with ads, too.
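[Editor's note: the "look-alike audience" mechanism described above amounts to ranking candidate profiles by how closely their feature vectors resemble an advertiser's seed group. A minimal sketch using cosine similarity — the interest vectors and names are made up, and this is not any platform's actual method.]

```python
# Toy look-alike audience: rank candidates by cosine similarity
# to the centroid of a seed group. All vectors are fabricated.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return num / den

seed = [(1.0, 0.9, 0.0), (0.9, 1.0, 0.1)]  # profiles the advertiser supplied
centroid = [sum(col) / len(seed) for col in zip(*seed)]

candidates = {"alice": (0.8, 0.9, 0.0), "bob": (0.0, 0.1, 1.0)}
lookalikes = sorted(candidates,
                    key=lambda name: cosine(candidates[name], centroid),
                    reverse=True)
print(lookalikes)  # most seed-like candidate first
```

Here "alice" ranks first because her vector points the same way as the seed centroid, even though she shares no explicit label with the seed group, which is exactly the expansion the talk describes.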
11:30
Now, this may sound like an implausible example,
11:33
but this is real.
11:35
ProPublica investigated this
11:37
and found that you can indeed do this on Facebook,
11:41
and Facebook helpfully offered up suggestions
11:43
on how to broaden that audience.
11:46
BuzzFeed tried it for Google, and very quickly they found,
11:49
yep, you can do it on Google, too.
11:51
And it wasn't even expensive.
11:53
The ProPublica reporter spent about 30 dollars
11:57
to target this category.
12:02
So last year, Donald Trump's social media manager disclosed
12:07
that they were using Facebook dark posts to demobilize people,
12:13
not to persuade them,
12:14
but to convince them not to vote at all.
12:18
And to do that, they targeted specifically,
12:22
for example, African-American men in key cities like Philadelphia,
12:26
and I'm going to read exactly what he said.
12:28
I'm quoting.
12:29
They were using "nonpublic posts
12:32
whose viewership the campaign controls
12:35
so that only the people we want to see it see it.
12:38
We modeled this.
12:40
It will dramatically affect her ability to turn these people out."
12:45
What's in those dark posts?
12:48
We have no idea.
12:50
Facebook won't tell us.
12:52
So Facebook also algorithmically arranges the posts
12:56
that your friends put on Facebook, or the pages you follow.
13:00
It doesn't show you everything chronologically.
13:02
It puts the order in the way that the algorithm thinks will entice you
13:07
to stay on the site longer.
13:11
Now, so this has a lot of consequences.
13:14
You may be thinking somebody is snubbing you on Facebook.
13:18
The algorithm may never be showing your post to them.
13:22
The algorithm is prioritizing some of them and burying the others.
13:29
Experiments show
13:30
that what the algorithm picks to show you can affect your emotions.
13:36
But that's not all.
13:38
It also affects political behavior.
13:41
So in 2010, in the midterm elections,
13:46
Facebook did an experiment on 61 million people in the US
13:51
that was disclosed after the fact.
13:53
So some people were shown, "Today is election day,"
13:57
the simpler one,
13:58
and some people were shown the one with that tiny tweak
14:02
with those little thumbnails
14:04
of your friends who clicked on "I voted."
14:09
This simple tweak.
14:11
OK? So the pictures were the only change,
14:15
and that post shown just once
14:19
turned out an additional 340,000 voters
14:25
in that election,
14:26
according to this research
14:28
as confirmed by the voter rolls.
14:32
A fluke? No.
14:34
Because in 2012, they repeated the same experiment.
14:40
And that time,
14:42
that civic message shown just once
14:45
turned out an additional 270,000 voters.
14:51
For reference, the 2016 US presidential election
14:56
was decided by about 100,000 votes.
15:01
Now, Facebook can also very easily infer what your politics are,
15:06
even if you've never disclosed them on the site.
15:08
Right? These algorithms can do that quite easily.
15:11
What if a platform with that kind of power
15:15
decides to turn out supporters of one candidate over the other?
15:21
How would we even know about it?
15:25
Now, we started from someplace seemingly innocuous --
15:29
online ads following us around --
15:31
and we've landed someplace else.
15:35
As a public and as citizens,
15:37
we no longer know if we're seeing the same information
15:41
or what anybody else is seeing,
15:43
and without a common basis of information,
15:46
little by little,
15:47
public debate is becoming impossible,
15:51
and we're just at the beginning stages of this.
15:54
These algorithms can quite easily infer
15:57
things like your people's ethnicity,
16:00
religious and political views, personality traits,
16:03
intelligence, happiness, use of addictive substances,
16:06
parental separation, age and genders,
16:09
just from Facebook likes.
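[Editor's note: inference from likes of the kind cited above is usually framed as fitting a model that maps sparse like-vectors to traits. A deliberately crude sketch below scores traits by co-occurrence counts; the pages, traits, and data are fabricated, and the published studies used regression models on real like matrices, not this.]

```python
# Toy trait inference from "likes": score each trait by how often a
# person's likes co-occurred with it in (fabricated) training data.
from collections import Counter

training = [
    ({"page_a", "page_b"}, "trait_x"),
    ({"page_a", "page_c"}, "trait_x"),
    ({"page_d"},           "trait_y"),
    ({"page_d", "page_e"}, "trait_y"),
]

def infer(likes):
    """Return the trait whose training examples overlap most with `likes`."""
    scores = Counter()
    for pages, trait in training:
        scores[trait] += len(likes & pages)
    return scores.most_common(1)[0][0]

print(infer({"page_a", "page_e"}))  # overlaps trait_x twice, trait_y once
```

Even this crude counting version shows the asymmetry the talk highlights: the person never stated the trait; it is read off statistically from behavior they did share.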
16:13
These algorithms can identify protesters
16:17
even if their faces are partially concealed.
16:21
These algorithms may be able to detect people's sexual orientation
16:28
just from their dating profile pictures.
16:33
Now, these are probabilistic guesses,
16:36
so they're not going to be 100 percent right,
16:39
but I don't see the powerful resisting the temptation to use these technologies
16:44
just because there are some false positives,
16:46
which will of course create a whole other layer of problems.
16:49
Imagine what a state can do
16:52
with the immense amount of data it has on its citizens.
16:56
China is already using face detection technology
17:01
to identify and arrest people.
17:05
And here's the tragedy:
17:07
we're building this infrastructure of surveillance authoritarianism
17:13
merely to get people to click on ads.
17:17
And this won't be Orwell's authoritarianism.
17:19
This isn't "1984."
17:21
Now, if authoritarianism is using overt fear to terrorize us,
17:26
we'll all be scared, but we'll know it,
17:29
we'll hate it and we'll resist it.
17:32
But if the people in power are using these algorithms
17:37
to quietly watch us,
17:40
to judge us and to nudge us,
17:43
to predict and identify the troublemakers and the rebels,
17:47
to deploy persuasion architectures at scale
17:51
and to manipulate individuals one by one
17:56
using their personal, individual weaknesses and vulnerabilities,
18:02
and if they're doing it at scale
18:06
through our private screens
18:07
so that we don't even know
18:09
what our fellow citizens and neighbors are seeing,
18:13
that authoritarianism will envelop us like a spider's web
320
1093560
4816
taj će se autoritarizam omotati oko nas kao paukova mreža,
18:18
and we may not even know we're in it.
321
1098400
2480
a mi nećemo ni znati da smo u njoj.
18:22
So Facebook's market capitalization
18:25
is approaching half a trillion dollars.
18:28
It's because it works great as a persuasion architecture.
18:33
But the structure of that architecture
18:36
is the same whether you're selling shoes
18:39
or whether you're selling politics.
18:42
The algorithms do not know the difference.
18:46
The same algorithms set loose upon us
18:49
to make us more pliable for ads
18:52
are also organizing our political, personal and social information flows,
18:59
and that's what's got to change.
19:02
Now, don't get me wrong,
19:04
we use digital platforms because they provide us with great value.
19:09
I use Facebook to keep in touch with friends and family around the world.
19:14
I've written about how crucial social media is for social movements.
19:19
I have studied how these technologies can be used
19:22
to circumvent censorship around the world.
19:27
But it's not that the people who run, you know, Facebook or Google
19:33
are maliciously and deliberately trying
19:36
to make the country or the world more polarized
19:40
and encourage extremism.
19:43
I read the many well-intentioned statements
19:47
that these people put out.
19:51
But it's not the intent or the statements people in technology make that matter,
19:57
it's the structures and business models they're building.
20:02
And that's the core of the problem.
20:04
Either Facebook is a giant con of half a trillion dollars
20:10
and ads don't work on the site,
20:12
it doesn't work as a persuasion architecture,
20:14
or its power of influence is of great concern.
20:20
It's either one or the other.
20:22
It's similar for Google, too.
20:24
So what can we do?
20:27
This needs to change.
20:29
Now, I can't offer a simple recipe,
20:31
because we need to restructure
20:34
the whole way our digital technology operates.
20:37
Everything from the way technology is developed
20:41
to the way the incentives, economic and otherwise,
20:45
are built into the system.
20:48
We have to face and try to deal with
20:51
the lack of transparency created by the proprietary algorithms,
20:56
the structural challenge of machine learning's opacity,
21:00
all this indiscriminate data that's being collected about us.
21:05
We have a big task in front of us.
21:08
We have to mobilize our technology,
21:11
our creativity
21:13
and yes, our politics
21:16
so that we can build artificial intelligence
21:18
that supports us in our human goals
21:22
but that is also constrained by our human values.
21:27
And I understand this won't be easy.
21:30
We might not even easily agree on what those terms mean.
21:34
But if we take seriously
21:38
how these systems that we depend on for so much operate,
21:44
I don't see how we can postpone this conversation anymore.
21:49
These structures
21:51
are organizing how we function
21:55
and they're controlling
21:58
what we can and we cannot do.
22:00
And many of these ad-financed platforms,
22:03
they boast that they're free.
22:04
In this context, it means that we are the product that's being sold.
22:10
We need a digital economy
22:13
where our data and our attention
22:17
is not for sale to the highest-bidding authoritarian or demagogue.
22:23
(Applause)
22:30
So to go back to that Hollywood paraphrase,
22:33
we do want the prodigious potential
22:37
of artificial intelligence and digital technology to blossom,
22:41
but for that, we must face this prodigious menace,
22:46
open-eyed and now.
22:48
Thank you.
22:49
(Applause)