How to keep human bias out of AI | Kriti Sharma

99,664 views ・ 2019-04-12

TED



00:00
Translator: Ivana Korom Reviewer: Joanna Pietrulewicz
00:12
How many decisions have been made about you today,
00:16
or this week or this year,
00:19
by artificial intelligence?
00:22
I build AI for a living
00:24
so, full disclosure, I'm kind of a nerd.
00:27
And because I'm kind of a nerd,
00:30
whenever some news story comes out
00:32
about artificial intelligence stealing all our jobs,
00:35
or robots getting citizenship of an actual country,
00:40
I'm the person my friends and followers message
00:43
freaking out about the future.
00:45
We see this everywhere.
00:47
This media panic that our robot overlords are taking over.
00:52
We could blame Hollywood for that.
00:56
But in reality, that's not the problem we should be focusing on.
01:01
There is a more pressing danger, a bigger risk with AI,
01:04
that we need to fix first.
01:07
So we are back to this question:
01:09
How many decisions have been made about you today by AI?
01:15
And how many of these
01:17
were based on your gender, your race or your background?
01:24
Algorithms are being used all the time
01:27
to make decisions about who we are and what we want.
01:32
Some of the women in this room will know what I'm talking about
01:35
if you've been made to sit through those pregnancy test adverts on YouTube
01:39
like 1,000 times.
01:41
Or you've scrolled past adverts of fertility clinics
01:44
on your Facebook feed.
01:47
Or in my case, Indian marriage bureaus.
01:50
(Laughter)
01:51
But AI isn't just being used to make decisions
01:54
about what products we want to buy
01:56
or which show we want to binge watch next.
02:01
I wonder how you'd feel about someone who thought things like this:
02:06
"A black or Latino person
02:08
is less likely than a white person to pay off their loan on time."
02:13
"A person called John makes a better programmer
02:16
than a person called Mary."
02:19
"A black man is more likely to be a repeat offender than a white man."
02:26
You're probably thinking,
02:28
"Wow, that sounds like a pretty sexist, racist person," right?
02:33
These are some real decisions that AI has made very recently,
02:37
based on the biases it has learned from us,
02:40
from the humans.
02:43
AI is being used to help decide whether or not you get that job interview;
02:48
how much you pay for your car insurance;
02:51
how good your credit score is;
02:52
and even what rating you get in your annual performance review.
02:57
But these decisions are all being filtered through
03:00
its assumptions about our identity, our race, our gender, our age.
03:08
How is that happening?
03:10
Now, imagine an AI is helping a hiring manager
03:14
find the next tech leader in the company.
03:16
So far, the manager has been hiring mostly men.
03:20
So the AI learns men are more likely to be programmers than women.
03:25
And it's a very short leap from there to:
03:28
men make better programmers than women.
03:31
We have reinforced our own bias into the AI.
03:35
And now, it's screening out female candidates.
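The mechanism described here, bias in the historical data becoming bias in the model, can be sketched in a few lines. This is a deliberately naive, hypothetical toy (the data and the scoring rule are invented for illustration, not taken from any real hiring system): a model that scores a new candidate by the hire rate of similar past candidates simply replays the manager's pattern.

```python
# Hypothetical historical hiring data: (gender, was_hired).
# The manager mostly hired men, so the record is skewed.
history = [("male", True)] * 8 + [("male", False)] * 2 \
        + [("female", True)] * 1 + [("female", False)] * 9

def hire_rate(history, gender):
    """Fraction of past candidates of this gender who were hired."""
    outcomes = [hired for g, hired in history if g == gender]
    return sum(outcomes) / len(outcomes)

def score(candidate_gender):
    # A naive model: score a new candidate by the historical hire
    # rate of their group -- bias in, bias out.
    return hire_rate(history, candidate_gender)

print(score("male"))    # 0.8
print(score("female"))  # 0.1
```

Nothing in the code mentions programming skill; gender enters only through the skewed record, which is how a model can end up discriminating without ever being told to.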
03:40
Hang on, if a human hiring manager did that,
03:43
we'd be outraged, we wouldn't allow it.
03:46
This kind of gender discrimination is not OK.
03:49
And yet somehow, AI has become above the law,
03:54
because a machine made the decision.
03:57
That's not it.
03:59
We are also reinforcing our bias in how we interact with AI.
04:04
How often do you use a voice assistant like Siri, Alexa or even Cortana?
04:10
They all have two things in common:
04:13
one, they can never get my name right,
04:16
and second, they are all female.
04:20
They are designed to be our obedient servants,
04:23
turning your lights on and off, ordering your shopping.
04:27
You get male AIs too, but they tend to be more high-powered,
04:30
like IBM Watson, making business decisions,
04:33
Salesforce Einstein or ROSS, the robot lawyer.
04:38
So poor robots, even they suffer from sexism in the workplace.
04:42
(Laughter)
04:44
Think about how these two things combine
04:47
and affect a kid growing up in today's world around AI.
04:52
So they're doing some research for a school project
04:55
and they Google images of CEO.
04:58
The algorithm shows them results of mostly men.
05:01
And now, they Google personal assistant.
05:04
As you can guess, it shows them mostly females.
05:07
And then they want to put on some music, and maybe order some food,
05:11
and now, they are barking orders at an obedient female voice assistant.
05:19
Some of our brightest minds are creating this technology today.
05:24
Technology that they could have created in any way they wanted.
05:29
And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary.
05:34
Yay!
05:36
But OK, don't worry,
05:38
this is not going to end with me telling you
05:40
that we are all heading towards sexist, racist machines running the world.
05:44
The good news about AI is that it is entirely within our control.
05:51
We get to teach the right values, the right ethics to AI.
05:56
So there are three things we can do.
05:58
One, we can be aware of our own biases
06:01
and the bias in machines around us.
06:04
Two, we can make sure that diverse teams are building this technology.
06:09
And three, we have to give it diverse experiences to learn from.
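The third point, giving AI diverse experiences to learn from, can be pictured with a naive toy model (again, invented data and logic for illustration only, not a real system): the same scoring rule gives biased or even-handed answers depending purely on the history it is fed.

```python
def hire_rate(history, gender):
    """Fraction of past candidates of a gender who were hired."""
    outcomes = [hired for g, hired in history if g == gender]
    return sum(outcomes) / len(outcomes)

# A skewed history teaches the model that one group gets hired.
skewed = [("male", True)] * 8 + [("male", False)] * 2 \
       + [("female", True)] * 1 + [("female", False)] * 9

# A balanced, diverse history teaches no such thing.
balanced = [("male", True)] * 5 + [("male", False)] * 5 \
         + [("female", True)] * 5 + [("female", False)] * 5

print(hire_rate(skewed, "male"), hire_rate(skewed, "female"))      # 0.8 0.1
print(hire_rate(balanced, "male"), hire_rate(balanced, "female"))  # 0.5 0.5
```

The model itself never changes; only the experiences it learns from do, which is the point of the third recommendation.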
06:14
I can talk about the first two from personal experience.
06:18
When you work in technology
06:19
and you don't look like a Mark Zuckerberg or Elon Musk,
06:23
your life is a little bit difficult, your ability gets questioned.
06:27
Here's just one example.
06:29
Like most developers, I often join online tech forums
06:33
and share my knowledge to help others.
06:36
And I've found,
06:37
when I log on as myself, with my own photo, my own name,
06:41
I tend to get questions or comments like this:
06:46
"What makes you think you're qualified to talk about AI?"
06:50
"What makes you think you know about machine learning?"
06:53
So, as you do, I made a new profile,
06:57
and this time, instead of my own picture, I chose a cat with a jet pack on it.
07:02
And I chose a name that did not reveal my gender.
07:05
You can probably guess where this is going, right?
07:08
So, this time, I didn't get any of those patronizing comments about my ability
07:15
and I was able to actually get some work done.
07:19
And it sucks, guys.
07:21
I've been building robots since I was 15,
07:23
I have a few degrees in computer science,
07:26
and yet, I had to hide my gender
07:28
in order for my work to be taken seriously.
07:31
So, what's going on here?
07:33
Are men just better at technology than women?
07:37
Another study found
07:39
that when women coders on one platform hid their gender, like myself,
07:44
their code was accepted four percent more than men.
07:48
So this is not about the talent.
07:51
This is about an elitism in AI
07:54
that says a programmer needs to look like a certain person.
07:59
What we really need to do to make AI better
08:02
is bring people from all kinds of backgrounds.
08:06
We need people who can write and tell stories
08:09
to help us create personalities of AI.
08:12
We need people who can solve problems.
08:15
We need people who face different challenges
08:18
and we need people who can tell us what are the real issues that need fixing
08:24
and help us find ways that technology can actually fix it.
08:29
Because, when people from diverse backgrounds come together,
08:33
when we build things in the right way,
08:35
the possibilities are limitless.
08:38
And that's what I want to end by talking to you about.
08:42
Less racist robots, less machines that are going to take our jobs --
08:46
and more about what technology can actually achieve.
08:50
So, yes, some of the energy in the world of AI,
08:53
in the world of technology
08:55
is going to be about what ads you see on your stream.
08:59
But a lot of it is going towards making the world so much better.
09:05
Think about a pregnant woman in the Democratic Republic of Congo,
09:09
who has to walk 17 hours to her nearest rural prenatal clinic
09:13
to get a checkup.
09:15
What if she could get diagnosis on her phone, instead?
09:19
Or think about what AI could do
09:21
for those one in three women in South Africa
09:24
who face domestic violence.
09:27
If it wasn't safe to talk out loud,
09:29
they could get an AI service to raise alarm,
09:32
get financial and legal advice.
09:35
These are all real examples of projects that people, including myself,
09:41
are working on right now, using AI.
09:45
So, I'm sure in the next couple of days there will be yet another news story
09:49
about the existential risk,
09:51
robots taking over and coming for your jobs.
09:54
(Laughter)
09:55
And when something like that happens,
09:57
I know I'll get the same messages worrying about the future.
10:01
But I feel incredibly positive about this technology.
10:07
This is our chance to remake the world into a much more equal place.
10:14
But to do that, we need to build it the right way from the get go.
10:19
We need people of different genders, races, sexualities and backgrounds.
10:26
We need women to be the makers
10:28
and not just the machines who do the makers' bidding.
10:33
We need to think very carefully what we teach machines,
10:37
what data we give them,
10:39
so they don't just repeat our own past mistakes.
10:44
So I hope I leave you thinking about two things.
10:48
First, I hope you leave thinking about bias today.
10:53
And that the next time you scroll past an advert
10:56
that assumes you are interested in fertility clinics
10:59
or online betting websites,
11:02
that you think and remember
11:04
that the same technology is assuming that a black man will reoffend.
11:09
Or that a woman is more likely to be a personal assistant than a CEO.
11:14
And I hope that reminds you that we need to do something about it.
11:20
And second,
11:22
I hope you think about the fact
11:24
that you don't need to look a certain way
11:26
or have a certain background in engineering or technology
11:30
to create AI,
11:31
which is going to be a phenomenal force for our future.
11:36
You don't need to look like a Mark Zuckerberg,
11:38
you can look like me.
11:41
And it is up to all of us in this room
11:44
to convince the governments and the corporations
11:46
to build AI technology for everyone,
11:49
including the edge cases.
11:52
And for us all to get education
11:54
about this phenomenal technology in the future.
11:58
Because if we do that,
12:00
then we've only just scratched the surface of what we can achieve with AI.
12:05
Thank you.
12:06
(Applause)