Machine intelligence makes human morals more important | Zeynep Tufekci

180,762 views · 2016-11-11

TED


ืžืชืจื’ื: Gili Baltsan ืžื‘ืงืจ: Ido Dekkers
00:12
So, I started my first job as a computer programmer
00:16
in my very first year of college --
00:18
basically, as a teenager.
00:20
Soon after I started working,
00:22
writing software in a company,
00:24
a manager who worked at the company came down to where I was,
00:28
and he whispered to me,
00:30
"Can he tell if I'm lying?"
00:33
There was nobody else in the room.
00:37
"Can who tell if you're lying? And why are we whispering?"
00:42
The manager pointed at the computer in the room.
00:45
"Can he tell if I'm lying?"
00:49
Well, that manager was having an affair with the receptionist.
00:53
(Laughter)
00:55
And I was still a teenager.
00:57
So I whisper-shouted back to him,
00:59
"Yes, the computer can tell if you're lying."
01:03
(Laughter)
01:04
Well, I laughed, but actually, the laugh's on me.
01:07
Nowadays, there are computational systems
01:11
that can suss out emotional states and even lying
01:14
from processing human faces.
01:17
Advertisers and even governments are very interested.
01:22
I had become a computer programmer
01:24
because I was one of those kids crazy about math and science.
01:27
But somewhere along the line I'd learned about nuclear weapons,
01:31
and I'd gotten really concerned with the ethics of science.
01:34
I was troubled.
01:35
However, because of family circumstances,
01:37
I also needed to start working as soon as possible.
01:41
So I thought to myself, hey, let me pick a technical field
01:44
where I can get a job easily
01:46
and where I don't have to deal with any troublesome questions of ethics.
01:51
So I picked computers.
01:52
(Laughter)
01:53
Well, ha, ha, ha! All the laughs are on me.
01:57
Nowadays, computer scientists are building platforms
01:59
that control what a billion people see every day.
02:05
They're developing cars that could decide who to run over.
02:09
They're even building machines, weapons,
02:12
that might kill human beings in war.
02:15
It's ethics all the way down.
02:19
Machine intelligence is here.
02:21
We're now using computation to make all sort of decisions,
02:25
but also new kinds of decisions.
02:27
We're asking questions to computation that have no single right answers,
02:32
that are subjective
02:33
and open-ended and value-laden.
02:36
We're asking questions like,
02:37
"Who should the company hire?"
02:40
"Which update from which friend should you be shown?"
02:42
"Which convict is more likely to reoffend?"
02:45
"Which news item or movie should be recommended to people?"
02:48
Look, yes, we've been using computers for a while,
02:51
but this is different.
02:53
This is a historical twist,
02:55
because we cannot anchor computation for such subjective decisions
03:00
the way we can anchor computation for flying airplanes, building bridges,
03:06
going to the moon.
03:08
Are airplanes safer? Did the bridge sway and fall?
03:11
There, we have agreed-upon, fairly clear benchmarks,
03:16
and we have laws of nature to guide us.
03:18
We have no such anchors and benchmarks
03:21
for decisions in messy human affairs.
03:25
To make things more complicated, our software is getting more powerful,
03:30
but it's also getting less transparent and more complex.
03:34
Recently, in the past decade,
03:36
complex algorithms have made great strides.
03:39
They can recognize human faces.
03:41
They can decipher handwriting.
03:44
They can detect credit card fraud
03:46
and block spam
03:47
and they can translate between languages.
03:49
They can detect tumors in medical imaging.
03:52
They can beat humans in chess and Go.
03:55
Much of this progress comes from a method called "machine learning."
04:00
Machine learning is different than traditional programming,
04:03
where you give the computer detailed, exact, painstaking instructions.
04:07
It's more like you take the system and you feed it lots of data,
04:11
including unstructured data,
04:13
like the kind we generate in our digital lives.
04:15
And the system learns by churning through this data.
04:18
And also, crucially,
04:20
these systems don't operate under a single-answer logic.
04:24
They don't produce a simple answer; it's more probabilistic:
04:27
"This one is probably more like what you're looking for."
04:32
Now, the upside is: this method is really powerful.
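To make the contrast concrete, here is a minimal sketch (purely illustrative, not any system mentioned in this talk) of an explicit, hand-written rule next to a toy model that learns a probabilistic score from labelled examples. Every word, example and number in it is hypothetical.

```python
# Illustrative sketch only, not any production system. It contrasts an
# explicit, hand-written rule with a toy model that learns a probabilistic
# score from labelled examples. All words, examples and numbers are made up.
from collections import Counter

# Traditional programming: a human spells out the exact rule.
def rule_based_spam(message: str) -> bool:
    return "free money" in message.lower()

# Machine learning, toy version: count how often each word appears in
# labelled spam vs. non-spam, then score new text by those counts.
training_data = [
    ("free money now", 1), ("claim your free prize", 1),
    ("lunch at noon?", 0), ("meeting moved to 4pm", 0),
]
spam_words, ham_words = Counter(), Counter()
for text, label in training_data:
    (spam_words if label else ham_words).update(text.lower().split())

def learned_spam_probability(message: str) -> float:
    odds = 1.0
    for word in message.lower().split():
        # Add-one smoothing keeps unseen words from zeroing out the ratio.
        p_spam = (spam_words[word] + 1) / (sum(spam_words.values()) + 2)
        p_ham = (ham_words[word] + 1) / (sum(ham_words.values()) + 2)
        odds *= p_spam / p_ham
    return odds / (odds + 1.0)  # squash the odds into a 0..1 score

print(rule_based_spam("Free money!!!"))               # True or False, by decree
print(learned_spam_probability("free prize inside"))  # about 0.86: "probably spam"
```

The learned version never answers yes or no outright; it only reports how strongly a new message resembles the examples it was trained on.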
04:35
The head of Google's AI systems called it,
04:37
"the unreasonable effectiveness of data."
04:39
The downside is,
04:41
we don't really understand what the system learned.
04:44
In fact, that's its power.
04:46
This is less like giving instructions to a computer;
04:51
it's more like training a puppy-machine-creature
04:55
we don't really understand or control.
04:58
So this is our problem.
05:00
It's a problem when this artificial intelligence system gets things wrong.
05:04
It's also a problem when it gets things right,
05:08
because we don't even know which is which when it's a subjective problem.
05:11
We don't know what this thing is thinking.
05:15
So, consider a hiring algorithm --
05:20
a system used to hire people, using machine-learning systems.
05:25
Such a system would have been trained on previous employees' data
05:28
and instructed to find and hire
05:31
people like the existing high performers in the company.
05:34
Sounds good.
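As a rough sketch of that setup (entirely hypothetical features and records, not any vendor's actual product): show the system the company's past high performers, then rank new applicants by how closely they resemble them.

```python
# Hypothetical sketch of "find people like our existing high performers".
# The features and records are invented for illustration.
high_performers = [
    {"years_exp": 6, "referrals": 2, "gap_months": 0},
    {"years_exp": 8, "referrals": 3, "gap_months": 1},
]

def similarity_to_past_hires(applicant: dict) -> float:
    # Higher means "more like the people who already did well here",
    # so whatever those historical records happen to encode becomes the yardstick.
    def distance(a: dict, b: dict) -> int:
        return sum(abs(a[key] - b[key]) for key in a)
    return -min(distance(applicant, hp) for hp in high_performers)

applicants = [
    {"years_exp": 7, "referrals": 2, "gap_months": 0},
    {"years_exp": 7, "referrals": 0, "gap_months": 14},  # e.g. a career break
]
for applicant in sorted(applicants, key=similarity_to_past_hires, reverse=True):
    print(applicant, similarity_to_past_hires(applicant))
```

Nothing in this sketch mentions gender, health or age, yet the ranking quietly inherits whatever shaped the historical records.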
05:35
I once attended a conference
05:38
that brought together human resources managers and executives,
05:41
high-level people,
05:42
using such systems in hiring.
05:43
They were super excited.
05:45
They thought that this would make hiring more objective, less biased,
05:50
and give women and minorities a better shot
05:53
against biased human managers.
05:55
And look -- human hiring is biased.
05:59
I know.
06:00
I mean, in one of my early jobs as a programmer,
06:03
my immediate manager would sometimes come down to where I was
06:07
really early in the morning or really late in the afternoon,
06:11
and she'd say, "Zeynep, let's go to lunch!"
06:14
I'd be puzzled by the weird timing.
06:16
It's 4pm. Lunch?
06:19
I was broke, so free lunch. I always went.
06:22
I later realized what was happening.
06:24
My immediate managers had not confessed to their higher-ups
06:29
that the programmer they hired for a serious job was a teen girl
06:32
who wore jeans and sneakers to work.
06:37
I was doing a good job, I just looked wrong
06:39
and was the wrong age and gender.
06:41
So hiring in a gender- and race-blind way
06:44
certainly sounds good to me.
06:47
But with these systems, it is more complicated, and here's why:
06:50
Currently, computational systems can infer all sorts of things about you
06:56
from your digital crumbs,
06:58
even if you have not disclosed those things.
07:01
They can infer your sexual orientation,
07:04
your personality traits,
07:06
your political leanings.
07:08
They have predictive power with high levels of accuracy.
07:13
Remember -- for things you haven't even disclosed.
07:15
This is inference.
07:17
I have a friend who developed such computational systems
07:20
to predict the likelihood of clinical or postpartum depression
07:24
from social media data.
07:26
The results are impressive.
07:28
Her system can predict the likelihood of depression
07:31
months before the onset of any symptoms --
07:35
months before.
07:37
No symptoms, there's prediction.
07:39
She hopes it will be used for early intervention. Great!
07:44
But now put this in the context of hiring.
07:48
So at this human resources managers conference,
07:51
I approached a high-level manager in a very large company,
07:55
and I said to her, "Look, what if, unbeknownst to you,
08:00
your system is weeding out people with high future likelihood of depression?
08:07
They're not depressed now, just maybe in the future, more likely.
08:11
What if it's weeding out women more likely to be pregnant
08:15
in the next year or two but aren't pregnant now?
08:18
What if it's hiring aggressive people because that's your workplace culture?"
08:25
You can't tell this by looking at gender breakdowns.
08:27
Those may be balanced.
08:29
And since this is machine learning, not traditional coding,
08:32
there is no variable there labeled "higher risk of depression,"
08:37
"higher risk of pregnancy,"
08:39
"aggressive guy scale."
08:41
Not only do you not know what your system is selecting on,
08:45
you don't even know where to begin to look.
08:48
It's a black box.
08:49
It has predictive power, but you don't understand it.
08:52
"What safeguards," I asked, "do you have
08:54
to make sure that your black box isn't doing something shady?"
09:00
She looked at me as if I had just stepped on 10 puppy tails.
09:04
(Laughter)
09:06
She stared at me and she said,
09:08
"I don't want to hear another word about this."
09:13
And she turned around and walked away.
09:16
Mind you -- she wasn't rude.
09:17
It was clearly: what I don't know isn't my problem, go away, death stare.
09:23
(Laughter)
09:25
Look, such a system may even be less biased
09:29
than human managers in some ways.
09:31
And it could make monetary sense.
09:34
But it could also lead
09:36
to a steady but stealthy shutting out of the job market
09:41
of people with higher risk of depression.
09:43
Is this the kind of society we want to build,
09:46
without even knowing we've done this,
09:48
because we turned decision-making to machines we don't totally understand?
09:53
Another problem is this:
09:55
these systems are often trained on data generated by our actions,
09:59
human imprints.
10:02
Well, they could just be reflecting our biases,
10:06
and these systems could be picking up on our biases
10:09
and amplifying them
10:10
and showing them back to us,
10:12
while we're telling ourselves,
10:13
"We're just doing objective, neutral computation."
10:18
Researchers found that on Google,
10:22
women are less likely than men to be shown job ads for high-paying jobs.
10:28
And searching for African-American names
10:31
is more likely to bring up ads suggesting criminal history,
10:35
even when there is none.
10:38
Such hidden biases and black-box algorithms
10:42
that researchers uncover sometimes but sometimes we don't know,
10:46
can have life-altering consequences.
10:49
In Wisconsin, a defendant was sentenced to six years in prison
10:54
for evading the police.
10:56
You may not know this,
10:58
but algorithms are increasingly used in parole and sentencing decisions.
11:02
He wanted to know: How is this score calculated?
11:05
It's a commercial black box.
11:07
The company refused to have its algorithm be challenged in open court.
11:12
But ProPublica, an investigative nonprofit, audited that very algorithm
11:17
with what public data they could find,
11:19
and found that its outcomes were biased
11:22
and its predictive power was dismal, barely better than chance,
11:25
and it was wrongly labeling black defendants as future criminals
11:30
at twice the rate of white defendants.
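That kind of external audit can be sketched from the outside, using nothing but the tool's inputs and outputs. The records below are invented for illustration; they are not ProPublica's data, only the shape of the comparison.

```python
# Sketch of an external fairness audit with invented records (not real data):
# among people who did NOT go on to reoffend, how often was each group
# flagged as "high risk" by the black-box score?
records = [
    # (group, flagged_high_risk, reoffended_within_two_years)
    ("black", True,  False), ("black", True,  False), ("black", False, False),
    ("black", True,  True),
    ("white", True,  False), ("white", False, False), ("white", False, False),
    ("white", True,  True),
]

def false_positive_rate(group: str) -> float:
    did_not_reoffend = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in did_not_reoffend if r[1]]
    return len(flagged) / len(did_not_reoffend)

for group in ("black", "white"):
    print(f"{group}: {false_positive_rate(group):.0%} of non-reoffenders flagged high risk")
```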
11:35
So, consider this case:
11:38
This woman was late picking up her godsister
11:41
from a school in Broward County, Florida,
11:44
running down the street with a friend of hers.
11:47
They spotted an unlocked kid's bike and a scooter on a porch
11:51
and foolishly jumped on it.
11:52
As they were speeding off, a woman came out and said,
11:55
"Hey! That's my kid's bike!"
11:57
They dropped it, they walked away, but they were arrested.
12:01
She was wrong, she was foolish, but she was also just 18.
12:04
She had a couple of juvenile misdemeanors.
12:07
Meanwhile, that man had been arrested for shoplifting in Home Depot --
12:13
85 dollars' worth of stuff, a similar petty crime.
12:16
But he had two prior armed robbery convictions.
12:21
But the algorithm scored her as high risk, and not him.
12:26
Two years later, ProPublica found that she had not reoffended.
12:30
It was just hard to get a job for her with her record.
12:33
He, on the other hand, did reoffend
12:35
and is now serving an eight-year prison term for a later crime.
12:40
Clearly, we need to audit our black boxes
12:43
and not have them have this kind of unchecked power.
12:46
(Applause)
12:50
Audits are great and important, but they don't solve all our problems.
12:54
Take Facebook's powerful news feed algorithm --
12:57
you know, the one that ranks everything and decides what to show you
13:01
from all the friends and pages you follow.
13:04
Should you be shown another baby picture?
13:07
(Laughter)
13:08
A sullen note from an acquaintance?
13:11
An important but difficult news item?
13:13
There's no right answer.
13:14
Facebook optimizes for engagement on the site:
13:17
likes, shares, comments.
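A toy ranker in that spirit (made-up weights and posts, not Facebook's actual news feed algorithm) shows how items that are easy to like, share and comment on float to the top:

```python
# Toy "optimize for engagement" ranker. The weights and posts are hypothetical,
# not Facebook's actual news feed algorithm.
def engagement_score(post: dict) -> float:
    return 1.0 * post["likes"] + 2.0 * post["shares"] + 1.5 * post["comments"]

feed = [
    {"title": "Another baby picture", "likes": 900, "shares": 300, "comments": 200},
    {"title": "Important but difficult news item", "likes": 40, "shares": 25, "comments": 10},
]
for post in sorted(feed, key=engagement_score, reverse=True):
    print(post["title"], engagement_score(post))
```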
13:20
In August of 2014,
13:22
protests broke out in Ferguson, Missouri,
13:25
after the killing of an African-American teenager by a white police officer,
13:30
under murky circumstances.
13:31
The news of the protests was all over
13:34
my algorithmically unfiltered Twitter feed,
13:36
but nowhere on my Facebook.
13:39
Was it my Facebook friends?
13:40
I disabled Facebook's algorithm,
13:43
which is hard because Facebook keeps wanting to make you
13:46
come under the algorithm's control,
13:48
and saw that my friends were talking about it.
13:50
It's just that the algorithm wasn't showing it to me.
13:53
I researched this and found this was a widespread problem.
13:56
The story of Ferguson wasn't algorithm-friendly.
14:00
It's not "likable."
14:01
Who's going to click on "like?"
14:03
It's not even easy to comment on.
14:05
Without likes and comments,
14:07
the algorithm was likely showing it to even fewer people,
14:10
so we didn't get to see this.
14:12
Instead, that week,
14:14
Facebook's algorithm highlighted this,
14:16
which is the ALS Ice Bucket Challenge.
14:18
Worthy cause; dump ice water, donate to charity, fine.
14:22
But it was super algorithm-friendly.
14:25
The machine made this decision for us.
14:27
A very important but difficult conversation
14:31
might have been smothered,
14:32
had Facebook been the only channel.
14:36
Now, finally, these systems can also be wrong
14:39
in ways that don't resemble human systems.
14:42
Do you guys remember Watson, IBM's machine-intelligence system
14:45
that wiped the floor with human contestants on Jeopardy?
14:49
It was a great player.
14:50
But then, for Final Jeopardy, Watson was asked this question:
14:54
"Its largest airport is named for a World War II hero,
14:57
its second-largest for a World War II battle."
14:59
(Hums Final Jeopardy music)
15:01
Chicago.
15:02
The two humans got it right.
15:04
Watson, on the other hand, answered "Toronto" --
15:09
for a US city category!
15:11
The impressive system also made an error
15:14
that a human would never make, a second-grader wouldn't make.
15:18
Our machine intelligence can fail
15:21
in ways that don't fit error patterns of humans,
15:25
in ways we won't expect and be prepared for.
15:28
It'd be lousy not to get a job one is qualified for,
15:31
but it would triple suck if it was because of stack overflow
15:35
in some subroutine.
15:36
(Laughter)
15:38
In May of 2010,
15:41
a flash crash on Wall Street fueled by a feedback loop
15:45
in Wall Street's "sell" algorithm
15:48
wiped a trillion dollars of value in 36 minutes.
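A grossly simplified toy loop (hypothetical numbers, none of the real market mechanics) illustrates how a rule that sells as prices fall can feed on its own output:

```python
# Toy feedback loop with hypothetical parameters, not a model of actual markets.
# A rule that sells more the further prices fall pushes prices down further,
# which triggers even more selling.
price = 100.0
for minute in range(10):
    sell_volume = max(0.0, 100.0 - price) * 50   # the deeper the drop, the bigger the sale
    price -= 0.2 + 0.01 * sell_volume            # selling pressure moves the price down
    print(f"minute {minute}: price {price:.2f}, sold {sell_volume:.0f}")
```

Each pass through the loop sells a little more and drops the price a little faster; no malice, just a rule reacting to its own consequences.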
15:53
I don't even want to think what "error" means
15:55
in the context of lethal autonomous weapons.
16:01
So yes, humans have always made biases.
16:05
Decision makers and gatekeepers,
16:07
in courts, in news, in war ...
16:11
they make mistakes; but that's exactly my point.
16:14
We cannot escape these difficult questions.
16:18
We cannot outsource our responsibilities to machines.
16:22
(Applause)
16:29
Artificial intelligence does not give us a "Get out of ethics free" card.
16:34
Data scientist Fred Benenson calls this math-washing.
16:38
We need the opposite.
16:39
We need to cultivate algorithm suspicion, scrutiny and investigation.
16:45
We need to make sure we have algorithmic accountability,
16:48
auditing and meaningful transparency.
16:51
We need to accept that bringing math and computation
16:54
to messy, value-laden human affairs
16:57
does not bring objectivity;
17:00
rather, the complexity of human affairs invades the algorithms.
17:04
Yes, we can and we should use computation
17:07
to help us make better decisions.
17:09
But we have to own up to our moral responsibility to judgment,
17:15
and use algorithms within that framework,
17:17
not as a means to abdicate and outsource our responsibilities
17:22
to one another as human to human.
17:25
Machine intelligence is here.
17:28
That means we must hold on ever tighter
17:31
to human values and human ethics.
17:34
Thank you.
17:35
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7