Translator: Cong Zhu
Reviewer: Aviva Nassimi
00:12
Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?

01:08
Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.
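In practice, "generic facial recognition software" often means an off-the-shelf detector. Here is a minimal sketch of that kind of setup, using OpenCV's bundled Haar cascade purely as an illustrative stand-in; the talk does not name the software that was actually used, and the webcam index and parameters are assumptions.

```python
# Face detection with an off-the-shelf detector. OpenCV's stock frontal-face
# Haar cascade is only an illustrative stand-in for the unnamed generic
# software mentioned in the talk.
import cv2

cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)   # default (possibly cheap) webcam
ok, frame = capture.read()
capture.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # If the cascade's training data under-represents a face like yours,
    # this list can come back empty even though a face fills the frame.
    print(f"detected {len(faces)} face(s)")
```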
01:56
Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

02:33
Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

03:15
So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces.
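A minimal sketch of that "this is a face / this is not a face" training loop, assuming hypothetical image files, HOG features, and a logistic-regression classifier; real detectors use different models, but the labeled-examples idea is the same.

```python
# Teach a classifier "face" vs. "not a face" from labeled examples.
# File names, features, and model choice are illustrative assumptions.
import numpy as np
from skimage.io import imread
from skimage.transform import resize
from skimage.feature import hog
from sklearn.linear_model import LogisticRegression

def features(path):
    """Turn an image into a fixed-length feature vector."""
    img = resize(imread(path, as_gray=True), (64, 64))
    return hog(img, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Hypothetical training set: 1 = face, 0 = not a face.
labeled = [("face_001.jpg", 1), ("face_002.jpg", 1), ("street_001.jpg", 0)]
X = np.array([features(path) for path, _ in labeled])
y = np.array([label for _, label in labeled])

model = LogisticRegression(max_iter=1000).fit(X, y)

# Over time the model generalizes to faces it has never seen, but only to
# the kinds of faces the training set actually contains.
print(model.predict([features("new_photo.jpg")]))
```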
03:38
However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.
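One way to surface this effect is to measure detection rates per group rather than as a single average. A sketch, with hypothetical image paths and group labels, again using OpenCV's stock cascade only as the detector under test.

```python
# Disaggregated evaluation: a skewed training set shows up as a gap in
# detection rates between groups. Paths and group labels are hypothetical.
from collections import defaultdict
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return len(detector.detectMultiScale(gray, scaleFactor=1.1,
                                         minNeighbors=5)) > 0

# Hypothetical benchmark: every image contains exactly one face.
benchmark = [("photo_0001.jpg", "group_a"), ("photo_0002.jpg", "group_b")]

hits, totals = defaultdict(int), defaultdict(int)
for path, group in benchmark:
    totals[group] += 1
    hits[group] += int(detect_face(path))

for group, total in totals.items():
    print(f"{group}: detection rate {hits[group] / total:.1%}")
# A large gap between groups is the signature of a detector trained on a
# narrow notion of what a face looks like.
```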
03:49
But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.
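A sketch of one simple way to assemble such a set, assuming a pool of images annotated with hypothetical group labels and sampling the same number of examples from each group.

```python
# Build a more balanced ("full-spectrum") training set by drawing the same
# number of examples from every annotated group. Records are hypothetical.
import random
from collections import defaultdict

random.seed(0)

# Hypothetical pool: (image path, group annotation), typically skewed.
pool = [("a_0001.jpg", "group_a"), ("a_0002.jpg", "group_a"),
        ("b_0001.jpg", "group_b"), ("c_0001.jpg", "group_c")]

by_group = defaultdict(list)
for path, group in pool:
    by_group[group].append(path)

# Cap every group at the size of the smallest one so the model does not
# learn that "face" mostly means the over-represented group.
per_group = min(len(paths) for paths in by_group.values())
training_set = [path
                for paths in by_group.values()
                for path in random.sample(paths, per_group)]

print(f"{len(training_set)} images, {per_group} per group")
```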
04:04
Now you've seen in my examples how social robots was how I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

04:19
Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy.

04:48
Yet we know facial recognition is not fail proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

05:12
Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

06:19
So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought. And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters.

07:15
So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

07:49
To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

08:12
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

08:26
(Applause)

08:32
But I have one question: Will you join me in the fight?

08:37
(Laughter)

08:38
(Applause)