How I'm fighting bias in algorithms | Joy Buolamwini

319,900 views · 2017-03-29

TED



ืžืชืจื’ื: Ilan Caner ืžื‘ืงืจ: Ido Dekkers
00:12
Hello, I'm Joy, a poet of code,
00:16
on a mission to stop an unseen force that's rising,
00:21
a force that I called "the coded gaze,"
00:23
my term for algorithmic bias.
00:27
Algorithmic bias, like human bias, results in unfairness.
00:31
However, algorithms, like viruses, can spread bias on a massive scale
00:37
at a rapid pace.
00:39
Algorithmic bias can also lead to exclusionary experiences
00:44
and discriminatory practices.
00:46
Let me show you what I mean.
00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face.
00:51
Can you see my face?
00:53
No-glasses face?
00:55
You can see her face.
00:58
What about my face?
01:03
I've got a mask. Can you see my mask?
01:08
Joy Buolamwini: So how did this happen?
01:10
Why am I sitting in front of a computer
01:13
in a white mask,
01:15
trying to be detected by a cheap webcam?
01:18
Well, when I'm not fighting the coded gaze
01:21
as a poet of code,
01:22
I'm a graduate student at the MIT Media Lab,
01:26
and there I have the opportunity to work on all sorts of whimsical projects,
01:31
including the Aspire Mirror,
01:33
a project I did so I could project digital masks onto my reflection.
01:38
So in the morning, if I wanted to feel powerful,
01:40
I could put on a lion.
01:42
If I wanted to be uplifted, I might have a quote.
01:45
So I used generic facial recognition software
01:48
to build the system,
01:50
but found it was really hard to test it unless I wore a white mask.
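
For readers who want to see what such a "generic" pipeline looks like, here is a hedged sketch only: a webcam face-detection loop built on OpenCV's bundled Haar-cascade detector. It is not the actual Aspire Mirror code, but if the detector's training data under-represents certain faces, this exact loop reproduces the failure described above.

```python
# Hedged illustration only: a generic webcam face-detection loop using
# OpenCV's stock Haar cascade, standing in for "generic facial recognition
# software"; it is not the Aspire Mirror implementation from the talk.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
camera = cv2.VideoCapture(0)  # a cheap webcam

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Faces far from the detector's training distribution may simply
    # return no detections here -- the failure described in the talk.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Where a detection lands is where a digital mask could be overlaid.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("mirror", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

camera.release()
cv2.destroyAllWindows()
```

Swapping in a different pretrained detector is a one-line change, which is part of how the same biased model ends up in projects all over the world.
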
01:56
Unfortunately, I've run into this issue before.
02:00
When I was an undergraduate at Georgia Tech studying computer science,
02:04
I used to work on social robots,
02:06
and one of my tasks was to get a robot to play peek-a-boo,
02:10
a simple turn-taking game
02:12
where partners cover their face and then uncover it saying, "Peek-a-boo!"
02:16
The problem is, peek-a-boo doesn't really work if I can't see you,
02:21
and my robot couldn't see me.
02:23
But I borrowed my roommate's face to get the project done,
02:27
submitted the assignment,
02:29
and figured, you know what, somebody else will solve this problem.
02:33
Not too long after,
02:35
I was in Hong Kong for an entrepreneurship competition.
02:40
The organizers decided to take participants
02:42
on a tour of local start-ups.
02:45
One of the start-ups had a social robot,
02:48
and they decided to do a demo.
02:49
The demo worked on everybody until it got to me,
02:52
and you can probably guess it.
02:54
It couldn't detect my face.
02:57
I asked the developers what was going on,
03:00
and it turned out we had used the same generic facial recognition software.
03:05
Halfway around the world,
03:07
I learned that algorithmic bias can travel as quickly
03:11
as it takes to download some files off of the internet.
03:15
So what's going on? Why isn't my face being detected?
03:18
Well, we have to look at how we give machines sight.
03:22
Computer vision uses machine learning techniques
03:25
to do facial recognition.
03:27
So how this works is, you create a training set with examples of faces.
03:31
This is a face. This is a face. This is not a face.
03:34
And over time, you can teach a computer how to recognize other faces.
03:38
However, if the training sets aren't really that diverse,
03:42
any face that deviates too much from the established norm
03:46
will be harder to detect,
03:47
which is what was happening to me.
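
To make the training-set idea concrete, here is a minimal, hedged sketch: a classifier that learns "face / not a face" purely from the labeled examples it is given, so its notion of a face is only as broad as that training set. The scikit-learn model and helper names are illustrative, not the software from the talk.

```python
# Minimal sketch of supervised training from labeled examples:
# "this is a face, this is not a face." Illustrative only; real face
# detectors are far more sophisticated than logistic regression on pixels.
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_face_classifier(face_images, non_face_images):
    """Both arguments: lists of equally sized 2-D grayscale NumPy arrays."""
    X = np.array([img.ravel() for img in face_images + non_face_images],
                 dtype=float)
    y = np.array([1] * len(face_images) + [0] * len(non_face_images))
    model = LogisticRegression(max_iter=1000)
    model.fit(X, y)  # the model only ever sees the examples we chose
    return model

def looks_like_a_face(model, image):
    # Faces that deviate from the training set's "norm" are the ones most
    # likely to be misclassified here.
    features = image.ravel().astype(float).reshape(1, -1)
    return bool(model.predict(features)[0])
```
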
03:49
But don't worry -- there's some good news.
03:52
Training sets don't just materialize out of nowhere.
03:54
We actually can create them.
03:56
So there's an opportunity to create full-spectrum training sets
04:00
that reflect a richer portrait of humanity.
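
One concrete, hedged way to act on this: before training, measure how the collected examples are distributed across groups. The metadata field name below is an assumption for illustration; a real dataset would need self-reported labels gathered with consent.

```python
# Illustrative only: report how a candidate training set is distributed
# across groups before training. The metadata field name is an assumption.
from collections import Counter

def coverage_report(metadata, field="self_reported_group"):
    """metadata: one dict per training image, e.g. {"self_reported_group": "..."}."""
    counts = Counter(entry[field] for entry in metadata)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

# A report like {"group A": 0.78, "group B": 0.03, ...} flags a set that is
# far from "full-spectrum" and needs more examples of group B before training.
```
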
04:04
Now you've seen in my examples
04:07
how social robots
04:08
were how I found out about exclusion with algorithmic bias.
04:13
But algorithmic bias can also lead to discriminatory practices.
04:19
Across the US,
04:20
police departments are starting to use facial recognition software
04:24
in their crime-fighting arsenal.
04:27
Georgetown Law published a report
04:29
showing that one in two adults in the US -- that's 117 million people --
04:36
have their faces in facial recognition networks.
04:39
Police departments can currently look at these networks unregulated,
04:44
using algorithms that have not been audited for accuracy.
04:48
Yet we know facial recognition is not fail-proof,
04:52
and labeling faces consistently remains a challenge.
04:56
You might have seen this on Facebook.
04:58
My friends and I laugh all the time when we see other people
05:01
mislabeled in our photos.
05:04
But misidentifying a suspected criminal is no laughing matter,
05:09
nor is breaching civil liberties.
05:12
Machine learning is being used for facial recognition,
05:15
but it's also extending beyond the realm of computer vision.
05:21
In her book, "Weapons of Math Destruction,"
05:25
data scientist Cathy O'Neil talks about the rising new WMDs --
05:31
widespread, mysterious and destructive algorithms
05:36
that are increasingly being used to make decisions
05:39
that impact more aspects of our lives.
05:42
So who gets hired or fired?
05:44
Do you get that loan? Do you get insurance?
05:46
Are you admitted into the college you wanted to get into?
05:49
Do you and I pay the same price for the same product
05:53
purchased on the same platform?
05:55
Law enforcement is also starting to use machine learning
05:59
for predictive policing.
06:02
Some judges use machine-generated risk scores to determine
06:05
how long an individual is going to spend in prison.
06:09
So we really have to think about these decisions.
06:12
Are they fair?
06:13
And we've seen that algorithmic bias
06:16
doesn't necessarily always lead to fair outcomes.
06:19
So what can we do about it?
06:21
Well, we can start thinking about how we create more inclusive code
06:25
and employ inclusive coding practices.
06:28
It really starts with people.
06:31
So who codes matters.
06:33
Are we creating full-spectrum teams with diverse individuals
06:37
who can check each other's blind spots?
06:40
On the technical side, how we code matters.
06:43
Are we factoring in fairness as we're developing systems?
06:47
And finally, why we code matters.
06:50
We've used tools of computational creation to unlock immense wealth.
06:55
We now have the opportunity to unlock even greater equality
07:00
if we make social change a priority
07:03
and not an afterthought.
07:05
And so these are the three tenets that will make up the "incoding" movement.
07:10
Who codes matters,
07:12
how we code matters
07:13
and why we code matters.
07:15
So to go towards incoding, we can start thinking about
07:18
building platforms that can identify bias
07:21
by collecting people's experiences like the ones I shared,
07:25
but also auditing existing software.
07:28
We can also start to create more inclusive training sets.
07:31
Imagine a "Selfies for Inclusion" campaign
07:34
where you and I can help developers test and create
07:38
more inclusive training sets.
07:41
And we can also start thinking more conscientiously
07:43
about the social impact of the technology that we're developing.
07:49
To get the incoding movement started,
07:51
I've launched the Algorithmic Justice League,
07:54
where anyone who cares about fairness can help fight the coded gaze.
08:00
On codedgaze.com, you can report bias,
08:03
request audits, become a tester
08:06
and join the ongoing conversation,
08:09
#codedgaze.
08:12
So I invite you to join me
08:15
in creating a world where technology works for all of us,
08:18
not just some of us,
08:20
a world where we value inclusion and center social change.
08:25
Thank you.
08:26
(Applause)
08:32
But I have one question:
08:35
Will you join me in the fight?
08:37
(Laughter)
08:38
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7