The era of blind faith in big data must end | Cathy O'Neil

240,040 views ・ 2017-09-07

TED


00:12
Algorithms are everywhere. They sort and separate the winners from the losers. The winners get the job or a good credit card offer. The losers don't even get an interview or they pay more for insurance.

00:30
We're being scored with secret formulas that we don't understand that often don't have systems of appeal. That begs the question: What if the algorithms are wrong?

00:44
To build an algorithm you need two things: you need data, what happened in the past, and a definition of success, the thing you're looking for and often hoping for. You train an algorithm by looking, figuring out. The algorithm figures out what is associated with success. What situation leads to success?
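
Read as code, that recipe has just two inputs. The sketch below is a minimal, hypothetical illustration in Python (pandas and scikit-learn; the file and column names are invented, not anything from the talk): historical records, plus a definition of success that a person chose, and a model trained to find what correlates with it.

```python
# A minimal sketch of "data + a definition of success"; file and column names are invented.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

past = pd.read_csv("past_records.csv")          # the data: what happened in the past

# The definition of success is a human choice, not a property of the data;
# here it is an arbitrary threshold we picked.
past["success"] = (past["outcome_score"] >= 7).astype(int)

# Assume the remaining columns are numeric features, for brevity.
features = past.drop(columns=["success", "outcome_score"])

model = RandomForestClassifier(random_state=0)
model.fit(features, past["success"])            # learn what is associated with "success"

# Scoring new cases replays whatever patterns the historical data contained.
new_cases = pd.read_csv("new_records.csv")
print(model.predict(new_cases[features.columns]))
```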

01:04
Actually, everyone uses algorithms. They just don't formalize them in written code. Let me give you an example. I use an algorithm every day to make a meal for my family. The data I use is the ingredients in my kitchen, the time I have, the ambition I have, and I curate that data. I don't count those little packages of ramen noodles as food. (Laughter) My definition of success is: a meal is successful if my kids eat vegetables. It's very different from if my youngest son were in charge. He'd say success is if he gets to eat lots of Nutella.

01:40
But I get to choose success. I am in charge. My opinion matters. That's the first rule of algorithms. Algorithms are opinions embedded in code. It's really different from what you think most people think of algorithms. They think algorithms are objective and true and scientific. That's a marketing trick. It's also a marketing trick to intimidate you with algorithms, to make you trust and fear algorithms because you trust and fear mathematics. A lot can go wrong when we put blind faith in big data.

02:23
This is Kiri Soares. She's a high school principal in Brooklyn. In 2011, she told me her teachers were being scored with a complex, secret algorithm called the "value-added model." I told her, "Well, figure out what the formula is, show it to me. I'm going to explain it to you." She said, "Well, I tried to get the formula, but my Department of Education contact told me it was math and I wouldn't understand it."

02:47
It gets worse. The New York Post filed a Freedom of Information Act request, got all the teachers' names and all their scores and they published them as an act of teacher-shaming. When I tried to get the formulas, the source code, through the same means, I was told I couldn't. I was denied. I later found out that nobody in New York City had access to that formula. No one understood it.

03:13
Then someone really smart got involved, Gary Rubinstein. He found 665 teachers from that New York Post data that actually had two scores. That could happen if they were teaching seventh grade math and eighth grade math. He decided to plot them. Each dot represents a teacher. (Laughter) What is that? (Laughter) That should never have been used for individual assessment. It's almost a random number generator. (Applause) But it was. This is Sarah Wysocki. She got fired, along with 205 other teachers, from the Washington, DC school district, even though she had great recommendations from her principal and the parents of her kids.

03:57
I know what a lot of you guys are thinking, especially the data scientists, the AI experts here. You're thinking, "Well, I would never make an algorithm that inconsistent." But algorithms can go wrong, even have deeply destructive effects with good intentions. And whereas an airplane that's designed badly crashes to the earth and everyone sees it, an algorithm designed badly can go on for a long time, silently wreaking havoc.

04:27
This is Roger Ailes. (Laughter) He founded Fox News in 1996. More than 20 women complained about sexual harassment. They said they weren't allowed to succeed at Fox News. He was ousted last year, but we've seen recently that the problems have persisted. That begs the question: What should Fox News do to turn over another leaf?

04:53
Well, what if they replaced their hiring process with a machine-learning algorithm? That sounds good, right? Think about it. The data, what would the data be? A reasonable choice would be the last 21 years of applications to Fox News. Reasonable. What about the definition of success? Reasonable choice would be, well, who is successful at Fox News? I guess someone who, say, stayed there for four years and was promoted at least once. Sounds reasonable. And then the algorithm would be trained. It would be trained to look for people to learn what led to success, what kind of applications historically led to success by that definition.

05:36
Now think about what would happen if we applied that to a current pool of applicants. It would filter out women because they do not look like people who were successful in the past.
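
To see how that thought experiment plays out mechanically, here is a hedged sketch on synthetic data; every field and number below is invented for illustration. The label "stayed four years and was promoted at least once" encodes the old culture, so a model trained on it penalizes equally qualified women. In a real pipeline the same thing happens through proxies in the application text even if gender is never an explicit column.

```python
# Hypothetical hiring sketch on synthetic data: the historical "success" label
# reflects who was allowed to succeed, so the trained model reproduces that.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
is_woman = rng.integers(0, 2, n)
qualification = rng.normal(size=n)

# Past promotions depended on qualification, but women were rarely promoted,
# so the label encodes the old culture, not merit.
promoted = (qualification + rng.normal(size=n) > 0.5) & ((is_woman == 0) | (rng.random(n) < 0.2))

# Gender is made an explicit feature here only to keep the sketch short;
# in practice proxies in the application do the same job.
X = np.column_stack([qualification, is_woman])
model = LogisticRegression().fit(X, promoted)

# Two equally qualified new applicants, differing only in gender:
man, woman = [[1.0, 0]], [[1.0, 1]]
print(model.predict_proba(man)[0, 1], model.predict_proba(woman)[0, 1])
# The woman's predicted "success" probability comes out far lower: the status quo, automated.
```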

05:51
Algorithms don't make things fair if you just blithely, blindly apply algorithms. They don't make things fair. They repeat our past practices, our patterns. They automate the status quo. That would be great if we had a perfect world, but we don't. And I'll add that most companies don't have embarrassing lawsuits, but the data scientists in those companies are told to follow the data, to focus on accuracy. Think about what that means. Because we all have bias, it means they could be codifying sexism or any other kind of bigotry.

06:31
Thought experiment, because I like them: an entirely segregated society -- racially segregated, all towns, all neighborhoods and where we send the police only to the minority neighborhoods to look for crime. The arrest data would be very biased. What if, on top of that, we found the data scientists and paid the data scientists to predict where the next crime would occur? Minority neighborhood. Or to predict who the next criminal would be? A minority. The data scientists would brag about how great and how accurate their model would be, and they'd be right.

07:15
Now, reality isn't that drastic, but we do have severe segregations in many cities and towns, and we have plenty of evidence of biased policing and justice system data. And we actually do predict hotspots, places where crimes will occur. And we do predict, in fact, the individual criminality, the criminality of individuals.

07:38
The news organization ProPublica recently looked into one of those "recidivism risk" algorithms, as they're called, being used in Florida during sentencing by judges. Bernard, on the left, the black man, was scored a 10 out of 10. Dylan, on the right, 3 out of 10. 10 out of 10, high risk. 3 out of 10, low risk. They were both brought in for drug possession. They both had records, but Dylan had a felony but Bernard didn't. This matters, because the higher score you are, the more likely you're being given a longer sentence.

08:18
What's going on? Data laundering. It's a process by which technologists hide ugly truths inside black box algorithms and call them objective; call them meritocratic. When they're secret, important and destructive, I've coined a term for these algorithms: "weapons of math destruction." (Laughter) (Applause)

08:46
They're everywhere, and it's not a mistake. These are private companies building private algorithms for private ends. Even the ones I talked about for teachers and the public police, those were built by private companies and sold to the government institutions. They call it their "secret sauce" -- that's why they can't tell us about it. It's also private power. They are profiting for wielding the authority of the inscrutable.

09:16
Now you might think, since all this stuff is private and there's competition, maybe the free market will solve this problem. It won't. There's a lot of money to be made in unfairness. Also, we're not economic rational agents. We all are biased. We're all racist and bigoted in ways that we wish we weren't, in ways that we don't even know. We know this, though, in aggregate, because sociologists have consistently demonstrated this with these experiments they build, where they send a bunch of applications to jobs out, equally qualified but some have white-sounding names and some have black-sounding names, and it's always disappointing, the results -- always.

09:59
So we are the ones that are biased, and we are injecting those biases into the algorithms by choosing what data to collect, like I chose not to think about ramen noodles -- I decided it was irrelevant. But by trusting the data that's actually picking up on past practices and by choosing the definition of success, how can we expect the algorithms to emerge unscathed? We can't. We have to check them. We have to check them for fairness.

10:27
The good news is, we can check them for fairness. Algorithms can be interrogated, and they will tell us the truth every time. And we can fix them. We can make them better. I call this an algorithmic audit, and I'll walk you through it.

10:42
First, data integrity check. For the recidivism risk algorithm I talked about, a data integrity check would mean we'd have to come to terms with the fact that in the US, whites and blacks smoke pot at the same rate but blacks are far more likely to be arrested -- four or five times more likely, depending on the area. What is that bias looking like in other crime categories, and how do we account for it?
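
One possible way to "account for it", sketched with placeholder numbers (the roughly equal usage and the four-times arrest gap echo the figures in the talk; everything else is illustrative, not a method she describes): estimate a per-group enforcement ratio and use it to reweight arrest records before they feed a model.

```python
# A rough sketch of quantifying enforcement bias and reweighting for it.
# All numbers are placeholders for illustration only.
usage_rate = {"white": 0.12, "black": 0.12}      # roughly equal usage, per surveys
arrests_per_1000 = {"white": 2.0, "black": 8.0}  # an illustrative 4x arrest gap

# Arrests per unit of actual underlying behavior:
enforcement_bias = {g: arrests_per_1000[g] / usage_rate[g] for g in usage_rate}

# Downweight records from the over-policed group so the training data
# reflects behavior rather than policing intensity.
baseline = min(enforcement_bias.values())
reweight = {g: baseline / enforcement_bias[g] for g in enforcement_bias}
print(reweight)  # e.g. {'white': 1.0, 'black': 0.25}
```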

11:07
Second, we should think about the definition of success, audit that. Remember -- with the hiring algorithm? We talked about it. Someone who stays for four years and is promoted once? Well, that is a successful employee, but it's also an employee that is supported by their culture. That said, also it can be quite biased. We need to separate those two things.

11:27
We should look to the blind orchestra audition as an example. That's where the people auditioning are behind a sheet. What I want to think about there is the people who are listening have decided what's important and they've decided what's not important, and they're not getting distracted by that. When the blind orchestra auditions started, the number of women in orchestras went up by a factor of five.
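
The data-science analogue of that screen, as a small hypothetical sketch (the column names are invented): the people in charge decide which attributes matter, and everything else is removed before a model or a reviewer ever sees it.

```python
# A sketch of "blinding" a dataset: keep only the attributes the decision-makers
# chose as important; drop names, photos and other distracting or proxy fields.
import pandas as pd

RELEVANT = ["audition_score", "years_experience", "repertoire_breadth"]

def blind(applications: pd.DataFrame) -> pd.DataFrame:
    """Keep only the columns the listeners decided are important."""
    return applications[RELEVANT].copy()

# model.fit(blind(past_applications), past_labels)  # the screen, in code
```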

11:52
Next, we have to consider accuracy. This is where the value-added model for teachers would fail immediately. No algorithm is perfect, of course, so we have to consider the errors of every algorithm. How often are there errors, and for whom does this model fail? What is the cost of that failure?
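
A minimal sketch of what that accuracy check could look like in practice (standard error-rate bookkeeping, not a procedure from the talk): measure how often the model is wrong, separately for each group it affects, because each kind of error falls on different people and carries a different cost.

```python
# For whom does the model fail? False positive and false negative rates per group,
# assuming 0/1 arrays of true outcomes and predictions plus a group label.
import numpy as np

def error_rates_by_group(y_true, y_pred, group):
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    report = {}
    for g in np.unique(group):
        m = group == g
        fp = np.mean(y_pred[m][y_true[m] == 0])      # flagged high risk, did not reoffend
        fn = np.mean(1 - y_pred[m][y_true[m] == 1])  # missed, though they did reoffend
        report[g] = {"false_positive_rate": fp, "false_negative_rate": fn}
    return report

# Large gaps between groups, like the ones ProPublica reported, are the red flag,
# and the two kinds of error carry very different costs for the people scored.
```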

12:14
And finally, we have to consider the long-term effects of algorithms, the feedback loops that are engendering. That sounds abstract, but imagine if Facebook engineers had considered that before they decided to show us only things that our friends had posted.

12:33
I have two more messages, one for the data scientists out there. Data scientists: we should not be the arbiters of truth. We should be translators of ethical discussions that happen in larger society. (Applause)

12:49
And the rest of you, the non-data scientists: this is not a math test. This is a political fight. We need to demand accountability for our algorithmic overlords. (Applause)

13:05
The era of blind faith in big data must end. Thank you very much. (Applause)