We're building a dystopia just to make people click on ads | Zeynep Tufekci

732,435 views ・ 2017-11-17

TED



00:12
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century.

00:44
What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

01:26
Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research.

01:53
But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."

02:01
Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work." Except, online, the digital technologies are not just ads.

02:40
Now, to understand that, let's think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.

03:34
In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's phone private screen, so it's not visible to us. And that's different. And that's just one of the basic things that artificial intelligence can do.

04:04
Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past.

04:28
With big data and machine learning, that's not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.

05:23
So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.

05:57
Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore, we're growing intelligence that we don't truly understand.

06:52
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.

07:08
So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.

08:06
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore.

08:21
Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn't.

09:01
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

09:52
Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

10:12
(Laughter)

10:14
So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.

10:43
Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too.

11:30
Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.

12:02
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out."

12:45
What's in those dark posts? We have no idea. Facebook won't tell us.

12:52
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.

13:11
Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others.

13:29
Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior.

13:41
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted." This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls.

14:32
A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes.

15:01
Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?

15:25
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else. As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.

15:54
These algorithms can quite easily infer things like your people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes.

16:13
These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.

16:33
Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.

17:05
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads.

17:17
And this won't be Orwell's authoritarianism. This isn't "1984." Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.

18:22
So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.

19:02
Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world.

19:27
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building.

20:02
And that's the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.

20:24
So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us. We have a big task in front of us.

21:08
We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won't be easy. We might not even easily agree on what those terms mean.

21:34
But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore. These structures are organizing how we function and they're controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they're free. In this context, it means that we are the product that's being sold.

22:10
We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.

22:23
(Applause)

22:30
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.

22:48
Thank you.

22:49
(Applause)