How I'm fighting bias in algorithms | Joy Buolamwini

308,224 views ・ 2017-03-29

TED



00:12
Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?
01:08
Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.
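(For readers curious what "generic facial recognition software" looks like in practice, here is a minimal sketch assuming OpenCV and its stock Haar-cascade frontal-face detector. It illustrates the kind of off-the-shelf webcam detector described here; it is not the actual Aspire Mirror code.)

    # Minimal sketch: off-the-shelf face detection on a webcam feed,
    # assuming OpenCV's bundled Haar-cascade model (not the talk's own software).
    import cv2

    # Load a pre-trained frontal-face detector that ships with OpenCV.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    capture = cv2.VideoCapture(0)  # open the default (cheap) webcam
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # The detector only finds faces that resemble its training data;
        # a face it was not trained on may simply return no boxes at all.
        boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("detections", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    capture.release()
    cv2.destroyAllWindows()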
01:56
Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.
02:33
Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.
03:15
So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.
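(As a rough illustration of the training-set idea just described, here is a minimal sketch assuming scikit-learn, with random placeholder arrays standing in for real labeled face images; it shows the "face / not a face" training loop in miniature, not any production system.)

    # Minimal sketch of "this is a face / this is not a face" training,
    # assuming each image is already a fixed-length feature vector.
    # All data here is random placeholder data, purely for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical training set: each row is one example image (flattened pixels),
    # each label marks it as a face (1) or not a face (0).
    X_train = rng.random((200, 24 * 24))   # 200 example images, 24x24 pixels each
    y_train = rng.integers(0, 2, 200)      # the face / not-a-face labels

    # Over many labeled examples, the model learns a decision rule.
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)

    # A new image is then judged by that learned rule. If the training set held
    # few faces like this one, the prediction is far more likely to be wrong.
    new_image = rng.random((1, 24 * 24))
    print("face" if model.predict(new_image)[0] == 1 else "not a face")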
03:49
But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.
04:04
Now you've seen in my examples how social robots was how I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.
04:19
Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail proof, and labeling faces consistently remains a challenge.

04:56
You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.
05:12
Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform?

05:55
Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.
06:19
So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

07:05
And so these are the three tenets that will make up the "incoding" movement: who codes matters, how we code matters and why we code matters.
07:15
So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.
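(One way to picture "auditing existing software": a minimal sketch of measuring a detector's face-detection rate separately for each demographic group in a labeled benchmark. The detector, the benchmark, and the group labels are hypothetical placeholders, not an actual Algorithmic Justice League tool.)

    # Minimal sketch of a per-group accuracy audit for an existing face detector.
    # `detector` and `benchmark` are hypothetical stand-ins supplied by the caller.
    from collections import defaultdict

    def audit(detector, benchmark):
        """benchmark: iterable of (image, group_label) pairs; every image contains a face."""
        totals = defaultdict(int)
        detected = defaultdict(int)
        for image, group in benchmark:
            totals[group] += 1
            if detector(image):          # True if the detector found the face
                detected[group] += 1
        # Detection rate per group; large gaps between groups are evidence of bias.
        return {group: detected[group] / totals[group] for group in totals}

    # Hypothetical usage:
    # rates = audit(my_face_detector, labeled_benchmark)
    # print(rates)  # e.g. {"group A": 0.99, "group B": 0.65} would flag a disparity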
07:49
To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.
08:12
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

08:26
(Applause)

08:32
But I have one question: Will you join me in the fight?

08:37
(Laughter)

08:38
(Applause)