How I'm fighting bias in algorithms | Joy Buolamwini

308,224 views ・ 2017-03-29

TED

Translation: Sanggyu Lee    Review: Gichung Lee

00:12
Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias.

00:27
Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?

01:08
Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

01:56
Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

02:33
Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

03:15
So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.

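As a rough illustration of the training-set idea described above (this sketch is not from the talk; the data, group names, and toy detector are invented), here is a minimal Python example: a detector learns an "average face" from a skewed training set, and faces from the under-represented group fall outside the norm it learned.

    # Minimal synthetic sketch (hypothetical, not the speaker's system):
    # the detector learns a "norm" from its training faces and accepts
    # anything close enough to that norm. 2-D points stand in for images.
    import numpy as np

    rng = np.random.default_rng(0)

    def sample(center, n, spread=0.5):
        # Draw n synthetic "face feature vectors" around a cluster center.
        return rng.normal(loc=center, scale=spread, size=(n, 2))

    # "This is a face. This is a face. This is not a face."
    # Group A faces dominate the training set; group B faces are scarce.
    faces_A = sample(center=[2.0, 2.0], n=500)
    faces_B = sample(center=[-2.0, 2.0], n=5)
    training_faces = np.vstack([faces_A, faces_B])

    # "Training": the learned norm is the average training face, and the
    # acceptance radius is chosen to cover most of the training faces.
    norm = training_faces.mean(axis=0)
    radius = np.percentile(np.linalg.norm(training_faces - norm, axis=1), 95)

    def is_face(x):
        # A point counts as a face if it lies close enough to the norm.
        return np.linalg.norm(x - norm) <= radius

    # Faces that "deviate too much from the established norm" get missed.
    test_A = sample(center=[2.0, 2.0], n=200)
    test_B = sample(center=[-2.0, 2.0], n=200)
    print("detection rate, group A:", np.mean([is_face(x) for x in test_A]))
    print("detection rate, group B:", np.mean([is_face(x) for x in test_B]))

Run as-is, group A is detected almost every time while group B is almost never detected, which is the effect the speaker describes.
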
03:49
But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

04:04
Now you've seen in my examples how social robots were how I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

04:19
Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

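As a hedged sketch of what even a basic accuracy audit could look like (the trial data and group labels below are invented for illustration, not drawn from any real deployment), error rates can be reported per group instead of as a single overall number:

    # Hypothetical audit sketch: report misidentification rates per group.
    from collections import defaultdict

    # Each trial: (group, predicted_identity, true_identity) -- made-up labels.
    trials = [
        ("group_A", "person_1", "person_1"),
        ("group_A", "person_2", "person_2"),
        ("group_A", "person_3", "person_3"),
        ("group_A", "person_4", "person_9"),  # one misidentification
        ("group_B", "person_5", "person_7"),  # misidentification
        ("group_B", "person_6", "person_8"),  # misidentification
        ("group_B", "person_9", "person_9"),
    ]

    totals, errors = defaultdict(int), defaultdict(int)
    for group, predicted, actual in trials:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1

    for group in sorted(totals):
        print(f"{group}: {errors[group]}/{totals[group]} misidentified "
              f"({errors[group] / totals[group]:.0%})")

A single overall accuracy figure can look acceptable while hiding a much higher error rate for one group, which is part of why unaudited use raises the civil-liberties concerns mentioned above.
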
05:12
Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform?

05:55
Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

06:19
So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

07:05
And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters. So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

07:49
To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

08:12
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you. (Applause)

08:32
But I have one question: Will you join me in the fight? (Laughter) (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7