Machine intelligence makes human morals more important | Zeynep Tufekci

180,336 views ・ 2016-11-11

TED



Translator: Yangyang Liu    Reviewer: Junyi Sha
So, I started my first job as a computer programmer in my very first year of college -- basically, as a teenager. Soon after I started working, writing software in a company, a manager who worked at the company came down to where I was, and he whispered to me, "Can he tell if I'm lying?" There was nobody else in the room.
"Can who tell if you're lying? And why are we whispering?" The manager pointed at the computer in the room. "Can he tell if I'm lying?" Well, that manager was having an affair with the receptionist. (Laughter) And I was still a teenager. So I whisper-shouted back to him, "Yes, the computer can tell if you're lying." (Laughter) Well, I laughed, but actually, the laugh's on me. Nowadays, there are computational systems that can suss out emotional states and even lying from processing human faces. Advertisers and even governments are very interested.
I had become a computer programmer because I was one of those kids crazy about math and science. But somewhere along the line I'd learned about nuclear weapons, and I'd gotten really concerned with the ethics of science. I was troubled. However, because of family circumstances, I also needed to start working as soon as possible. So I thought to myself, hey, let me pick a technical field where I can get a job easily and where I don't have to deal with any troublesome questions of ethics. So I picked computers. (Laughter) Well, ha, ha, ha! All the laughs are on me. Nowadays, computer scientists are building platforms that control what a billion people see every day. They're developing cars that could decide who to run over. They're even building machines, weapons, that might kill human beings in war. It's ethics all the way down.
Machine intelligence is here. We're now using computation to make all sorts of decisions, but also new kinds of decisions. We're asking questions to computation that have no single right answers, that are subjective and open-ended and value-laden. We're asking questions like, "Who should the company hire?" "Which update from which friend should you be shown?" "Which convict is more likely to reoffend?" "Which news item or movie should be recommended to people?"

Look, yes, we've been using computers for a while, but this is different. This is a historical twist, because we cannot anchor computation for such subjective decisions the way we can anchor computation for flying airplanes, building bridges, going to the moon. Are airplanes safer? Did the bridge sway and fall? There, we have agreed-upon, fairly clear benchmarks, and we have laws of nature to guide us. We have no such anchors and benchmarks for decisions in messy human affairs.
To make things more complicated, our software is getting more powerful, but it's also getting less transparent and more complex. Recently, in the past decade, complex algorithms have made great strides. They can recognize human faces. They can decipher handwriting. They can detect credit card fraud and block spam and they can translate between languages. They can detect tumors in medical imaging. They can beat humans in chess and Go.
Much of this progress comes from a method called "machine learning." Machine learning is different than traditional programming, where you give the computer detailed, exact, painstaking instructions. It's more like you take the system and you feed it lots of data, including unstructured data, like the kind we generate in our digital lives. And the system learns by churning through this data. And also, crucially, these systems don't operate under a single-answer logic. They don't produce a simple answer; it's more probabilistic: "This one is probably more like what you're looking for."
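The contrast between single-answer logic and probabilistic output can be sketched in a few lines. Everything below is a toy illustration: the spam example, the word weights, and the scoring rule are all invented, standing in for what a real system would learn from data.

```python
# A hand-written rule gives exactly one fixed answer.
def rule_based_spam_check(message):
    return "spam" if "free money" in message.lower() else "not spam"

# A learned model instead produces a graded score. This toy version uses
# made-up word weights in place of parameters learned from training data.
def probabilistic_spam_score(message):
    weights = {"free": 0.4, "money": 0.35, "meeting": -0.5}  # hypothetical
    text = message.lower()
    score = 0.5 + sum(w for word, w in weights.items() if word in text)
    return max(0.0, min(1.0, score))  # clamp into a probability-like range

print(rule_based_spam_check("Free money inside!"))        # one fixed answer
print(probabilistic_spam_score("Free money inside!"))     # high score
print(probabilistic_spam_score("Agenda for the meeting")) # low score
```

The rule is fully inspectable; the scored version answers "this one is probably more like what you're looking for," which is the shift the talk describes.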
Now, the upside is: this method is really powerful. The head of Google's AI systems called it, "the unreasonable effectiveness of data." The downside is, we don't really understand what the system learned. In fact, that's its power. This is less like giving instructions to a computer; it's more like training a puppy-machine-creature we don't really understand or control.

So this is our problem. It's a problem when this artificial intelligence system gets things wrong. It's also a problem when it gets things right, because we don't even know which is which when it's a subjective problem. We don't know what this thing is thinking.
So, consider a hiring algorithm -- a system used to hire people, using machine-learning systems. Such a system would have been trained on previous employees' data and instructed to find and hire people like the existing high performers in the company. Sounds good. I once attended a conference that brought together human resources managers and executives, high-level people, using such systems in hiring. They were super excited. They thought that this would make hiring more objective, less biased, and give women and minorities a better shot against biased human managers. And look -- human hiring is biased. I know.
I mean, in one of my early jobs as a programmer, my immediate manager would sometimes come down to where I was really early in the morning or really late in the afternoon, and she'd say, "Zeynep, let's go to lunch!" I'd be puzzled by the weird timing. It's 4pm. Lunch? I was broke, so free lunch. I always went. I later realized what was happening. My immediate managers had not confessed to their higher-ups that the programmer they hired for a serious job was a teen girl who wore jeans and sneakers to work. I was doing a good job, I just looked wrong and was the wrong age and gender. So hiring in a gender- and race-blind way certainly sounds good to me.
But with these systems, it is more complicated, and here's why: Currently, computational systems can infer all sorts of things about you from your digital crumbs, even if you have not disclosed those things. They can infer your sexual orientation, your personality traits, your political leanings. They have predictive power with high levels of accuracy. Remember -- for things you haven't even disclosed. This is inference. I have a friend who developed such computational systems to predict the likelihood of clinical or postpartum depression from social media data. The results are impressive. Her system can predict the likelihood of depression months before the onset of any symptoms -- months before. No symptoms, there's prediction. She hopes it will be used for early intervention. Great!
But now put this in the context of hiring. So at this human resources managers conference, I approached a high-level manager in a very large company, and I said to her, "Look, what if, unbeknownst to you, your system is weeding out people with high future likelihood of depression? They're not depressed now, just maybe in the future, more likely. What if it's weeding out women more likely to be pregnant in the next year or two but aren't pregnant now? What if it's hiring aggressive people because that's your workplace culture?" You can't tell this by looking at gender breakdowns. Those may be balanced. And since this is machine learning, not traditional coding, there is no variable there labeled "higher risk of depression," "higher risk of pregnancy," "aggressive guy scale." Not only do you not know what your system is selecting on, you don't even know where to begin to look. It's a black box. It has predictive power, but you don't understand it.
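The point that no variable is labeled "higher risk of depression" can be made concrete with a sketch. Nothing in the toy dataset below names a sensitive trait, yet an innocuous-looking feature can still track the historical hiring decision and become a hidden proxy. The candidates, features, and the crude gap measure are all fabricated for illustration.

```python
# Toy past-hiring records: no column mentions depression or pregnancy,
# but "late_night_posts" happens to correlate with who was hired.
# All rows and features are fabricated for illustration.
candidates = [
    {"years_exp": 5, "late_night_posts": 2, "hired": 1},
    {"years_exp": 4, "late_night_posts": 1, "hired": 1},
    {"years_exp": 5, "late_night_posts": 9, "hired": 0},
    {"years_exp": 4, "late_night_posts": 8, "hired": 0},
]

def average_gap(rows, feature):
    """Crude proxy check: hired-group mean minus rejected-group mean."""
    hired = [r[feature] for r in rows if r["hired"]]
    rejected = [r[feature] for r in rows if not r["hired"]]
    return sum(hired) / len(hired) - sum(rejected) / len(rejected)

print(average_gap(candidates, "years_exp"))        # 0.0: carries no signal
print(average_gap(candidates, "late_night_posts")) # -7.0: a hidden proxy
```

A model trained on this data would quietly select on the proxy, and nothing in its inputs would tell you what that proxy stands for -- which is why you don't even know where to begin to look.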
"What safeguards," I asked, "do you have to make sure that your black box isn't doing something shady?" She looked at me as if I had just stepped on 10 puppy tails. (Laughter) She stared at me and she said, "I don't want to hear another word about this." And she turned around and walked away. Mind you -- she wasn't rude. It was clearly: what I don't know isn't my problem, go away, death stare. (Laughter)
Look, such a system may even be less biased than human managers in some ways. And it could make monetary sense. But it could also lead to a steady but stealthy shutting out of the job market of people with higher risk of depression. Is this the kind of society we want to build, without even knowing we've done this, because we turned decision-making to machines we don't totally understand?
Another problem is this: these systems are often trained on data generated by our actions, human imprints. Well, they could just be reflecting our biases, and these systems could be picking up on our biases and amplifying them and showing them back to us, while we're telling ourselves, "We're just doing objective, neutral computation." Researchers found that on Google, women are less likely than men to be shown job ads for high-paying jobs. And searching for African-American names is more likely to bring up ads suggesting criminal history, even when there is none. Such hidden biases and black-box algorithms that researchers uncover sometimes but sometimes we don't know, can have life-altering consequences.
In Wisconsin, a defendant was sentenced to six years in prison for evading the police. You may not know this, but algorithms are increasingly used in parole and sentencing decisions. He wanted to know: How is this score calculated? It's a commercial black box. The company refused to have its algorithm be challenged in open court. But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.
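The kind of check behind that finding can be sketched briefly: among defendants who did not reoffend, how often was each group wrongly labeled high risk? The records below are fabricated and far smaller than any real audit; only the shape of the computation is being illustrated, not ProPublica's actual data or full methodology.

```python
# Fabricated mini-dataset standing in for audit records.
records = [
    {"group": "black", "high_risk": True,  "reoffended": False},
    {"group": "black", "high_risk": True,  "reoffended": True},
    {"group": "black", "high_risk": False, "reoffended": False},
    {"group": "white", "high_risk": False, "reoffended": False},
    {"group": "white", "high_risk": True,  "reoffended": True},
    {"group": "white", "high_risk": False, "reoffended": True},
]

def false_positive_rate(rows, group):
    """Share of non-reoffenders in `group` wrongly labeled high risk."""
    innocent = [r for r in rows if r["group"] == group and not r["reoffended"]]
    flagged = [r for r in innocent if r["high_risk"]]
    return len(flagged) / len(innocent)

print(false_positive_rate(records, "black"))  # 0.5 in this toy data
print(false_positive_rate(records, "white"))  # 0.0 in this toy data
```

Note that a score can look similarly "accurate" overall for both groups while its mistakes fall very unevenly; that is why the audit breaks errors down by group rather than reporting a single accuracy number.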
So, consider this case: This woman was late picking up her godsister from a school in Broward County, Florida, running down the street with a friend of hers. They spotted an unlocked kid's bike and a scooter on a porch and foolishly jumped on it. As they were speeding off, a woman came out and said, "Hey! That's my kid's bike!" They dropped it, they walked away, but they were arrested. She was wrong, she was foolish, but she was also just 18. She had a couple of juvenile misdemeanors. Meanwhile, that man had been arrested for shoplifting in Home Depot -- 85 dollars' worth of stuff, a similar petty crime. But he had two prior armed robbery convictions. But the algorithm scored her as high risk, and not him. Two years later, ProPublica found that she had not reoffended. It was just hard to get a job for her with her record. He, on the other hand, did reoffend and is now serving an eight-year prison term for a later crime. Clearly, we need to audit our black boxes and not have them have this kind of unchecked power. (Applause)
Audits are great and important, but they don't solve all our problems. Take Facebook's powerful news feed algorithm -- you know, the one that ranks everything and decides what to show you from all the friends and pages you follow. Should you be shown another baby picture? (Laughter) A sullen note from an acquaintance? An important but difficult news item? There's no right answer. Facebook optimizes for engagement on the site: likes, shares, comments.
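Engagement-optimized ranking, stripped to its core, looks something like the sketch below: score each story by predicted likes, shares, and comments, and show the highest scorers first. The stories, counts, and weights are invented, and real feed ranking is vastly more elaborate; the point is only that importance never enters the objective.

```python
# Toy engagement objective: weighted sum of predicted reactions.
# Weights and story data are invented for illustration.
def engagement_score(story):
    return 1.0 * story["likes"] + 2.0 * story["shares"] + 1.5 * story["comments"]

stories = [
    {"title": "ice bucket challenge video", "likes": 900, "shares": 300, "comments": 400},
    {"title": "difficult news report",      "likes": 40,  "shares": 120, "comments": 15},
]

# Rank purely by engagement: the hard-to-like story sinks,
# regardless of how important it is.
ranked = sorted(stories, key=engagement_score, reverse=True)
print([s["title"] for s in ranked])
```

Nothing in `engagement_score` can tell a worthy but uncomfortable story from a fun one; it only sees clicks.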
In August of 2014, protests broke out in Ferguson, Missouri, after the killing of an African-American teenager by a white police officer, under murky circumstances. The news of the protests was all over my algorithmically unfiltered Twitter feed, but nowhere on my Facebook. Was it my Facebook friends? I disabled Facebook's algorithm, which is hard because Facebook keeps wanting to make you come under the algorithm's control, and saw that my friends were talking about it. It's just that the algorithm wasn't showing it to me. I researched this and found this was a widespread problem. The story of Ferguson wasn't algorithm-friendly. It's not "likable." Who's going to click on "like?" It's not even easy to comment on. Without likes and comments, the algorithm was likely showing it to even fewer people, so we didn't get to see this.

Instead, that week, Facebook's algorithm highlighted this, which is the ALS Ice Bucket Challenge. Worthy cause; dump ice water, donate to charity, fine. But it was super algorithm-friendly. The machine made this decision for us. A very important but difficult conversation might have been smothered, had Facebook been the only channel.
Now, finally, these systems can also be wrong in ways that don't resemble human systems. Do you guys remember Watson, IBM's machine-intelligence system that wiped the floor with human contestants on Jeopardy? It was a great player. But then, for Final Jeopardy, Watson was asked this question: "Its largest airport is named for a World War II hero, its second-largest for a World War II battle." (Hums Final Jeopardy music) Chicago. The two humans got it right. Watson, on the other hand, answered "Toronto" -- for a US city category! The impressive system also made an error that a human would never make, a second-grader wouldn't make.

Our machine intelligence can fail in ways that don't fit error patterns of humans, in ways we won't expect and be prepared for. It'd be lousy not to get a job one is qualified for, but it would triple suck if it was because of stack overflow in some subroutine. (Laughter) In May of 2010, a flash crash on Wall Street fueled by a feedback loop in Wall Street's "sell" algorithm wiped a trillion dollars of value in 36 minutes. I don't even want to think what "error" means in the context of lethal autonomous weapons.
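A feedback loop of that flavor can be simulated in a few lines: an algorithm sells when it sees a price drop, and its own selling produces the next drop. All numbers here are invented, and real market microstructure is far more complicated; this only illustrates how a loop, once triggered, keeps feeding itself.

```python
# Toy sell-algorithm feedback loop. Numbers are invented for illustration;
# this models nothing about real market mechanics beyond the loop itself.
def simulate_sell_loop(price, steps, trigger_drop=0.01, impact=0.05):
    history = [price]
    for _ in range(steps):
        last = history[-1]
        # Sell on the first step, then whenever the price fell by more than
        # trigger_drop -- a drop the algorithm itself caused last step.
        if len(history) == 1 or last < history[-2] * (1 - trigger_drop):
            last = last * (1 - impact)  # selling pressure pushes price down
        history.append(last)
    return history

prices = simulate_sell_loop(100.0, steps=5)
print([round(p, 2) for p in prices])  # each sale triggers the next one
```

Once the first sale lands, every subsequent drop re-satisfies the trigger, so the loop never stops on its own; that runaway quality, compressed into minutes, is what a flash crash looks like.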
So yes, humans have always made biases. Decision makers and gatekeepers, in courts, in news, in war ... they make mistakes; but that's exactly my point. We cannot escape these difficult questions. We cannot outsource our responsibilities to machines. (Applause)

Artificial intelligence does not give us a "Get out of ethics free" card. Data scientist Fred Benenson calls this math-washing. We need the opposite. We need to cultivate algorithm suspicion, scrutiny and investigation. We need to make sure we have algorithmic accountability, auditing and meaningful transparency. We need to accept that bringing math and computation to messy, value-laden human affairs does not bring objectivity; rather, the complexity of human affairs invades the algorithms. Yes, we can and we should use computation to help us make better decisions. But we have to own up to our moral responsibility to judgment, and use algorithms within that framework, not as a means to abdicate and outsource our responsibilities to one another as human to human.

Machine intelligence is here. That means we must hold on ever tighter to human values and human ethics. Thank you. (Applause)