Machine intelligence makes human morals more important | Zeynep Tufekci

TED ・ 2016-11-11



So, I started my first job as a computer programmer in my very first year of college -- basically, as a teenager. Soon after I started working, writing software in a company, a manager who worked at the company came down to where I was, and he whispered to me, "Can he tell if I'm lying?"

There was nobody else in the room.

"Can who tell if you're lying? And why are we whispering?"

The manager pointed at the computer in the room. "Can he tell if I'm lying?"

Well, that manager was having an affair with the receptionist.

(Laughter)

And I was still a teenager. So I whisper-shouted back to him, "Yes, the computer can tell if you're lying."

(Laughter)

Well, I laughed, but actually, the laugh's on me.
Nowadays, there are computational systems that can suss out emotional states and even lying from processing human faces. Advertisers and even governments are very interested.

I had become a computer programmer because I was one of those kids crazy about math and science. But somewhere along the line I'd learned about nuclear weapons, and I'd gotten really concerned with the ethics of science. I was troubled. However, because of family circumstances, I also needed to start working as soon as possible. So I thought to myself, hey, let me pick a technical field where I can get a job easily and where I don't have to deal with any troublesome questions of ethics. So I picked computers.

(Laughter)

Well, ha, ha, ha! All the laughs are on me. Nowadays, computer scientists are building platforms that control what a billion people see every day. They're developing cars that could decide who to run over. They're even building machines, weapons, that might kill human beings in war. It's ethics all the way down.
Machine intelligence is here. We're now using computation to make all sorts of decisions, but also new kinds of decisions. We're asking questions to computation that have no single right answers, that are subjective and open-ended and value-laden. We're asking questions like, "Who should the company hire?" "Which update from which friend should you be shown?" "Which convict is more likely to reoffend?" "Which news item or movie should be recommended to people?"
Look, yes, we've been using computers for a while, but this is different. This is a historical twist, because we cannot anchor computation for such subjective decisions the way we can anchor computation for flying airplanes, building bridges, going to the moon. Are airplanes safer? Did the bridge sway and fall? There, we have agreed-upon, fairly clear benchmarks, and we have laws of nature to guide us. We have no such anchors and benchmarks for decisions in messy human affairs.

To make things more complicated, our software is getting more powerful, but it's also getting less transparent and more complex. Recently, in the past decade, complex algorithms have made great strides. They can recognize human faces. They can decipher handwriting. They can detect credit card fraud and block spam and they can translate between languages. They can detect tumors in medical imaging. They can beat humans in chess and Go.
Much of this progress comes from a method called "machine learning." Machine learning is different than traditional programming, where you give the computer detailed, exact, painstaking instructions. It's more like you take the system and you feed it lots of data, including unstructured data, like the kind we generate in our digital lives. And the system learns by churning through this data. And also, crucially, these systems don't operate under a single-answer logic. They don't produce a simple answer; it's more probabilistic: "This one is probably more like what you're looking for."
Now, the upside is: this method is really powerful. The head of Google's AI systems called it, "the unreasonable effectiveness of data." The downside is, we don't really understand what the system learned. In fact, that's its power. This is less like giving instructions to a computer; it's more like training a puppy-machine-creature we don't really understand or control.

So this is our problem. It's a problem when this artificial intelligence system gets things wrong. It's also a problem when it gets things right, because we don't even know which is which when it's a subjective problem. We don't know what this thing is thinking.
So, consider a hiring algorithm -- a system used to hire people, using machine-learning systems. Such a system would have been trained on previous employees' data and instructed to find and hire people like the existing high performers in the company. Sounds good.

I once attended a conference that brought together human resources managers and executives, high-level people, using such systems in hiring. They were super excited. They thought that this would make hiring more objective, less biased, and give women and minorities a better shot against biased human managers. And look -- human hiring is biased. I know.

I mean, in one of my early jobs as a programmer, my immediate manager would sometimes come down to where I was really early in the morning or really late in the afternoon, and she'd say, "Zeynep, let's go to lunch!" I'd be puzzled by the weird timing. It's 4pm. Lunch? I was broke, so free lunch. I always went. I later realized what was happening. My immediate managers had not confessed to their higher-ups that the programmer they hired for a serious job was a teen girl who wore jeans and sneakers to work. I was doing a good job, I just looked wrong and was the wrong age and gender.

So hiring in a gender- and race-blind way certainly sounds good to me.
But with these systems, it is more complicated, and here's why: Currently, computational systems can infer all sorts of things about you from your digital crumbs, even if you have not disclosed those things. They can infer your sexual orientation, your personality traits, your political leanings. They have predictive power with high levels of accuracy. Remember -- for things you haven't even disclosed. This is inference.

I have a friend who developed such computational systems to predict the likelihood of clinical or postpartum depression from social media data. The results are impressive. Her system can predict the likelihood of depression months before the onset of any symptoms -- months before. No symptoms, there's prediction. She hopes it will be used for early intervention. Great! But now put this in the context of hiring.
So at this human resources managers conference, I approached a high-level manager in a very large company, and I said to her, "Look, what if, unbeknownst to you, your system is weeding out people with high future likelihood of depression? They're not depressed now, just maybe in the future, more likely. What if it's weeding out women more likely to be pregnant in the next year or two but aren't pregnant now? What if it's hiring aggressive people because that's your workplace culture?"

You can't tell this by looking at gender breakdowns. Those may be balanced. And since this is machine learning, not traditional coding, there is no variable there labeled "higher risk of depression," "higher risk of pregnancy," "aggressive guy scale." Not only do you not know what your system is selecting on, you don't even know where to begin to look. It's a black box. It has predictive power, but you don't understand it.
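A purely synthetic sketch can make that worry concrete. Nothing here comes from a real hiring system: the features, the hidden attribute, and the correlations are all simulated (NumPy and scikit-learn assumed). The point is only that a model which never sees a column called "risk" can still score the two hidden groups differently, because innocuous-looking features happen to correlate with the hidden attribute.

```python
# A synthetic sketch (assumes NumPy and scikit-learn); every feature,
# label, and correlation here is simulated, not taken from a real system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000

# A hidden attribute the employer never measures and never asks about.
hidden_risk = rng.integers(0, 2, n)

# Ordinary-looking digital-trace features; one quietly correlates with it.
late_night_activity = 0.8 * hidden_risk + rng.normal(0, 1, n)
posting_frequency = rng.normal(0, 1, n)

# Suppose past "high performer" labels were themselves entangled with the
# hidden attribute (say, through attrition); the model inherits that link.
high_performer = ((1 - hidden_risk) + rng.normal(0, 1, n) > 0.7).astype(int)

X = np.column_stack([late_night_activity, posting_frequency])
scores = LogisticRegression().fit(X, high_performer).predict_proba(X)[:, 1]

# The model was never given a "risk" column, yet its scores differ by group.
print("mean score, hidden_risk = 0:", round(scores[hidden_risk == 0].mean(), 3))
print("mean score, hidden_risk = 1:", round(scores[hidden_risk == 1].mean(), 3))
```

The gap in average scores appears even though the sensitive attribute was never an input, which is why a balanced headline breakdown does not settle the question.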
"What safeguards," I asked, "do you have to make sure that your black box isn't doing something shady?"

She looked at me as if I had just stepped on 10 puppy tails.

(Laughter)

She stared at me and she said, "I don't want to hear another word about this." And she turned around and walked away. Mind you -- she wasn't rude. It was clearly: what I don't know isn't my problem, go away, death stare.

(Laughter)
Look, such a system may even be less biased than human managers in some ways. And it could make monetary sense. But it could also lead to a steady but stealthy shutting out of the job market of people with higher risk of depression. Is this the kind of society we want to build, without even knowing we've done this, because we turned decision-making to machines we don't totally understand?

Another problem is this: these systems are often trained on data generated by our actions, human imprints. Well, they could just be reflecting our biases, and these systems could be picking up on our biases and amplifying them and showing them back to us, while we're telling ourselves, "We're just doing objective, neutral computation."

Researchers found that on Google, women are less likely than men to be shown job ads for high-paying jobs. And searching for African-American names is more likely to bring up ads suggesting criminal history, even when there is none. Such hidden biases and black-box algorithms that researchers uncover sometimes but sometimes we don't know, can have life-altering consequences.
In Wisconsin, a defendant was sentenced to six years in prison for evading the police. You may not know this, but algorithms are increasingly used in parole and sentencing decisions. He wanted to know: How is this score calculated? It's a commercial black box. The company refused to have its algorithm be challenged in open court. But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.
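The shape of that kind of outcome audit is simple enough to sketch. This is not ProPublica's code, and the handful of records below are invented stand-ins; the idea is just that, given only the black box's risk labels and what later happened, you can compare error rates across groups without ever opening the model.

```python
# A sketch of an outcome audit; not ProPublica's code, and these few
# records are invented stand-ins for the public data they collected.
defendants = [
    # (group, labeled_high_risk_by_algorithm, reoffended_within_two_years)
    ("black", True,  False),
    ("black", True,  True),
    ("black", False, False),
    ("white", False, True),
    ("white", True,  True),
    ("white", False, False),
    # ... in practice, thousands of records
]

def false_positive_rate(records, group):
    """Share of people in `group` who did not reoffend but were
    still labeled high risk by the black box."""
    did_not_reoffend = [r for r in records if r[0] == group and not r[2]]
    wrongly_flagged = [r for r in did_not_reoffend if r[1]]
    return len(wrongly_flagged) / len(did_not_reoffend)

for group in ("black", "white"):
    print(f"{group}: false positive rate {false_positive_rate(defendants, group):.0%}")
```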
So, consider this case: This woman was late picking up her godsister from a school in Broward County, Florida, running down the street with a friend of hers. They spotted an unlocked kid's bike and a scooter on a porch and foolishly jumped on it. As they were speeding off, a woman came out and said, "Hey! That's my kid's bike!" They dropped it, they walked away, but they were arrested. She was wrong, she was foolish, but she was also just 18. She had a couple of juvenile misdemeanors. Meanwhile, that man had been arrested for shoplifting in Home Depot -- 85 dollars' worth of stuff, a similar petty crime. But he had two prior armed robbery convictions. But the algorithm scored her as high risk, and not him. Two years later, ProPublica found that she had not reoffended. It was just hard to get a job for her with her record. He, on the other hand, did reoffend and is now serving an eight-year prison term for a later crime.

Clearly, we need to audit our black boxes and not have them have this kind of unchecked power.

(Applause)
Audits are great and important, but they don't solve all our problems. Take Facebook's powerful news feed algorithm -- you know, the one that ranks everything and decides what to show you from all the friends and pages you follow. Should you be shown another baby picture?

(Laughter)

A sullen note from an acquaintance? An important but difficult news item? There's no right answer. Facebook optimizes for engagement on the site: likes, shares, comments.
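A deliberately crude toy shows why that objective matters. This is not Facebook's algorithm; the posts and their expected engagement counts are invented. If the feed is ordered purely by expected likes, shares, and comments, whatever people hesitate to "like" sinks.

```python
# A toy feed ranker; not Facebook's algorithm, and the posts and their
# expected engagement counts are invented.
posts = [
    {"title": "Friend's ice bucket challenge video",
     "likes": 180, "shares": 40, "comments": 25},
    {"title": "Friend's report from the Ferguson protests",
     "likes": 6, "shares": 3, "comments": 2},
    {"title": "Another baby picture",
     "likes": 90, "shares": 5, "comments": 12},
]

def engagement_score(post):
    # Optimize purely for on-site engagement: likes, shares, comments.
    return post["likes"] + post["shares"] + post["comments"]

# Higher score, higher in the feed; hard-to-"like" stories sink.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):4d}  {post['title']}")
```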
In August of 2014, protests broke out in Ferguson, Missouri, after the killing of an African-American teenager by a white police officer, under murky circumstances. The news of the protests was all over my algorithmically unfiltered Twitter feed, but nowhere on my Facebook. Was it my Facebook friends? I disabled Facebook's algorithm, which is hard because Facebook keeps wanting to make you come under the algorithm's control, and saw that my friends were talking about it. It's just that the algorithm wasn't showing it to me. I researched this and found this was a widespread problem.

The story of Ferguson wasn't algorithm-friendly. It's not "likable." Who's going to click on "like?" It's not even easy to comment on. Without likes and comments, the algorithm was likely showing it to even fewer people, so we didn't get to see this. Instead, that week, Facebook's algorithm highlighted this, which is the ALS Ice Bucket Challenge. Worthy cause; dump ice water, donate to charity, fine. But it was super algorithm-friendly. The machine made this decision for us. A very important but difficult conversation might have been smothered, had Facebook been the only channel.
Now, finally, these systems can also be wrong in ways that don't resemble human systems. Do you guys remember Watson, IBM's machine-intelligence system that wiped the floor with human contestants on Jeopardy? It was a great player. But then, for Final Jeopardy, Watson was asked this question: "Its largest airport is named for a World War II hero, its second-largest for a World War II battle."

(Hums Final Jeopardy music)

Chicago. The two humans got it right. Watson, on the other hand, answered "Toronto" -- for a US city category! The impressive system also made an error that a human would never make, a second-grader wouldn't make. Our machine intelligence can fail in ways that don't fit error patterns of humans, in ways we won't expect and be prepared for. It'd be lousy not to get a job one is qualified for, but it would triple suck if it was because of stack overflow in some subroutine.

(Laughter)
In May of 2010, a flash crash on Wall Street fueled by a feedback loop in Wall Street's "sell" algorithm wiped a trillion dollars of value in 36 minutes. I don't even want to think what "error" means in the context of lethal autonomous weapons.
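The feedback-loop mechanism behind such a crash can be caricatured in a few lines. This is a toy with invented numbers, not a model of the actual 2010 event, which involved many interacting strategies; it only shows how "sell because the price just fell" can feed on itself.

```python
# A toy feedback loop with invented numbers; not a model of the actual
# 2010 flash crash, which involved many interacting strategies.
price = 100.0
last_decline = 0.10   # the small dip (in dollars) that starts the cascade

for minute in range(1, 16):
    # The automated strategy sells harder the faster the price just fell...
    sell_orders = 10_000 + int(400_000 * last_decline)
    # ...and its own selling pushes the price down further.
    decline = 0.000005 * sell_orders
    price = max(price - decline, 0.0)
    last_decline = decline
    print(f"minute {minute:2d}: sold {sell_orders:,} shares, price {price:6.2f}")
    if price == 0.0:
        break  # the toy market has been wiped out
```

In this toy, each minute's selling roughly doubles the next minute's decline, so once the loop closes it needs no further outside shock.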
So yes, humans have always made biases. Decision makers and gatekeepers, in courts, in news, in war ... they make mistakes; but that's exactly my point. We cannot escape these difficult questions. We cannot outsource our responsibilities to machines.

(Applause)

Artificial intelligence does not give us a "Get out of ethics free" card. Data scientist Fred Benenson calls this math-washing. We need the opposite. We need to cultivate algorithm suspicion, scrutiny and investigation. We need to make sure we have algorithmic accountability, auditing and meaningful transparency. We need to accept that bringing math and computation to messy, value-laden human affairs does not bring objectivity; rather, the complexity of human affairs invades the algorithms. Yes, we can and we should use computation to help us make better decisions. But we have to own up to our moral responsibility to judgment, and use algorithms within that framework, not as a means to abdicate and outsource our responsibilities to one another as human to human.

Machine intelligence is here. That means we must hold on ever tighter to human values and human ethics.

Thank you.

(Applause)