How I'm fighting bias in algorithms | Joy Buolamwini

345,474 views ・ 2017-03-29

TED



Translator: Suzie Tang  Reviewer: Helen Chang
00:12
Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?

01:08
Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

01:56
Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

02:33
Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

03:15
So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.

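The training loop described here can be sketched in a few lines. The example below is a minimal illustration, not the software from the talk: synthetic feature vectors stand in for real image features, the two "groups" of faces are hypothetical, and scikit-learn's logistic regression stands in for whatever model a production detector would actually use.

```python
# Minimal sketch of the "this is a face / this is not a face" training loop,
# with synthetic features in place of real images.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
DIM = 4

def samples(shift, n):
    """Draw n synthetic feature vectors around a given offset."""
    return rng.normal(loc=shift, scale=1.0, size=(n, DIM))

# The training set, heavily skewed toward group A faces.
faces_a   = samples(shift=1.2, n=1000)   # well represented
faces_b   = samples(shift=0.5, n=30)     # deviates from the norm, rarely seen
non_faces = samples(shift=0.0, n=1000)

X = np.vstack([faces_a, faces_b, non_faces])
y = np.array([1] * 1030 + [0] * 1000)    # 1 = face, 0 = not a face

detector = LogisticRegression(max_iter=1000).fit(X, y)

# Evaluate the detection rate separately for each group of faces.
for name, shift in [("group A", 1.2), ("group B", 0.5)]:
    rate = detector.predict(samples(shift, 2000)).mean()
    print(f"detection rate, {name}: {rate:.2f}")
# With this skewed training set the boundary is tuned to group A, and
# group B faces are markedly harder to detect -- the effect described above.
```

The per-group evaluation at the end is the point: a single overall accuracy number hides how much worse a detector does on the faces it rarely saw during training.
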
03:49
But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

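In practice, a "full-spectrum" training set starts with something mundane: counting. A rough sketch, with hypothetical group labels and counts, of checking how a dataset breaks down before anyone trains on it:

```python
# Sketch of a dataset composition check: tally examples per group and flag
# under-represented ones. Group labels and counts are hypothetical placeholders;
# in practice they would come from the dataset's metadata.
from collections import Counter

training_labels = (
    ["lighter-skinned male"] * 700
    + ["lighter-skinned female"] * 180
    + ["darker-skinned male"] * 90
    + ["darker-skinned female"] * 30
)

counts = Counter(training_labels)
total = sum(counts.values())
fair_share = 1.0 / len(counts)   # naive target: roughly equal representation

for group, n in counts.most_common():
    share = n / total
    flag = "  <-- under-represented" if share < 0.5 * fair_share else ""
    print(f"{group:24s} {n:4d} ({share:6.1%}){flag}")
```
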
04:04
Now you've seen in my examples how social robots were where I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

04:19
Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

05:12
Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision.

05:21
In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

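"Are they fair?" can be asked concretely. The sketch below uses made-up records and one deliberately simple criterion, comparing false positive rates across groups; real audits of risk scores weigh several competing definitions of fairness, and none of this comes from the talk itself.

```python
# Minimal sketch of one fairness check for automated decisions:
# compare error rates across groups. Records are made up, and false positive
# rate parity is only one of several competing fairness criteria.
from collections import defaultdict

# (group, model_said_high_risk, actually_reoffended) -- hypothetical records
records = [
    ("group A", True,  False), ("group A", False, False), ("group A", True,  True),
    ("group A", False, False), ("group A", False, True),  ("group A", False, False),
    ("group B", True,  False), ("group B", True,  False), ("group B", True,  True),
    ("group B", False, False), ("group B", True,  False), ("group B", False, True),
]

flagged_innocent = defaultdict(int)   # false positives per group
innocent = defaultdict(int)           # all true negatives-or-false-positives per group

for group, predicted_high_risk, reoffended in records:
    if not reoffended:
        innocent[group] += 1
        if predicted_high_risk:
            flagged_innocent[group] += 1

for group in sorted(innocent):
    fpr = flagged_innocent[group] / innocent[group]
    print(f"{group}: false positive rate = {fpr:.0%}")
# A large gap between groups is a warning sign that the scores are not
# treating comparable people comparably.
```
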
06:19
So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

07:05
And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters. So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

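As a rough sketch of what auditing existing software could look like: run the detector over a benchmark balanced across groups and report results disaggregated by group rather than as one overall score. The `detect_face` callable and the benchmark entries below are hypothetical stand-ins for whatever system and labeled test images an audit would actually use.

```python
# Rough sketch of auditing an existing face detector for bias:
# run it over a group-balanced benchmark and report detection rates per group.
from typing import Callable, Iterable, Tuple


def audit(detect_face: Callable[[str], bool],
          benchmark: Iterable[Tuple[str, str]]) -> dict:
    """Return the detection rate per group for (image_path, group) pairs."""
    seen, detected = {}, {}
    for image_path, group in benchmark:
        seen[group] = seen.get(group, 0) + 1
        detected[group] = detected.get(group, 0) + int(detect_face(image_path))
    return {g: detected[g] / seen[g] for g in seen}


if __name__ == "__main__":
    # Toy stand-ins, just to make the sketch runnable end to end.
    def fake_detector(path: str) -> bool:
        """Pretends to only ever find group A faces."""
        return "group_a" in path

    fake_benchmark = (
        [(f"faces/group_a/{i}.jpg", "group A") for i in range(50)]
        + [(f"faces/group_b/{i}.jpg", "group B") for i in range(50)]
    )
    for group, rate in audit(fake_detector, fake_benchmark).items():
        print(f"{group}: detected {rate:.0%} of faces")
```
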
07:49
To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

08:12
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

08:26
(Applause)

08:32
But I have one question: Will you join me in the fight?

08:37
(Laughter)

08:38
(Applause)