How to keep human bias out of AI | Kriti Sharma

100,036 views · 2019-04-12

TED


ืื ื ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ืœืžื˜ื” ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ.

00:00
Translator: Ivana Korom  Reviewer: Joanna Pietrulewicz
Hebrew translation: Ruth Veksler  Editing: Nurit Noy

00:12
How many decisions have been made about you today, or this week or this year, by artificial intelligence?

00:22
I build AI for a living so, full disclosure, I'm kind of a nerd. And because I'm kind of a nerd, whenever some new news story comes out about artificial intelligence stealing all our jobs, or robots getting citizenship of an actual country, I'm the person my friends and followers message freaking out about the future.

00:45
We see this everywhere. This media panic that our robot overlords are taking over. We could blame Hollywood for that. But in reality, that's not the problem we should be focusing on. There is a more pressing danger, a bigger risk with AI, that we need to fix first.

01:07
So we are back to this question: How many decisions have been made about you today by AI? And how many of these were based on your gender, your race or your background?

01:24
Algorithms are being used all the time to make decisions about who we are and what we want. Some of the women in this room will know what I'm talking about if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times. Or you've scrolled past adverts of fertility clinics on your Facebook feed. Or in my case, Indian marriage bureaus.

01:50
(Laughter)

01:51
But AI isn't just being used to make decisions about what products we want to buy or which show we want to binge watch next.

02:01
I wonder how you'd feel about someone who thought things like this: "A black or Latino person is less likely than a white person to pay off their loan on time." "A person called John makes a better programmer than a person called Mary." "A black man is more likely to be a repeat offender than a white man."

02:26
You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right?

02:33
These are some real decisions that AI has made very recently, based on the biases it has learned from us, from the humans.

02:43
AI is being used to help decide whether or not you get that job interview; how much you pay for your car insurance; how good your credit score is; and even what rating you get in your annual performance review.

02:57
But these decisions are all being filtered through its assumptions about our identity, our race, our gender, our age.

03:08
How is that happening?

03:10
Now, imagine an AI is helping a hiring manager find the next tech leader in the company. So far, the manager has been hiring mostly men. So the AI learns men are more likely to be programmers than women. And it's a very short leap from there to: men make better programmers than women. We have reinforced our own bias into the AI. And now, it's screening out female candidates.
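A minimal sketch of the feedback loop just described, assuming scikit-learn is available and using invented numbers and feature names rather than any real hiring system: a toy classifier fit to skewed historical decisions picks up a negative weight on the gender feature and scores an otherwise identical female candidate lower.

```python
# Hypothetical illustration only: invented data, toy model, not a real hiring tool.
from sklearn.linear_model import LogisticRegression

# Each row: [years_of_experience, is_female]; label: 1 = was hired, 0 = was not.
# The history is skewed: equally experienced women were hired far less often.
X = [
    [5, 0], [6, 0], [4, 0], [7, 0], [5, 0], [6, 0],   # men: mostly hired
    [5, 1], [6, 1], [4, 1], [7, 1], [5, 1], [6, 1],   # women: mostly rejected
]
y = [1, 1, 1, 1, 1, 0,
     0, 0, 0, 1, 0, 0]

model = LogisticRegression().fit(X, y)

# The model learns a negative weight on the "is_female" feature, so it now
# scores the same candidate lower when she is a woman.
print("feature weights:", model.coef_[0])
print("man,   5 yrs:", model.predict_proba([[5, 0]])[0][1])
print("woman, 5 yrs:", model.predict_proba([[5, 1]])[0][1])
```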
03:40
Hang on, if a human hiring manager did that, we'd be outraged, we wouldn't allow it. This kind of gender discrimination is not OK. And yet somehow, AI has become above the law, because a machine made the decision.

03:57
That's not it. We are also reinforcing our bias in how we interact with AI. How often do you use a voice assistant like Siri, Alexa or even Cortana? They all have two things in common: one, they can never get my name right, and second, they are all female. They are designed to be our obedient servants, turning your lights on and off, ordering your shopping.

04:27
You get male AIs too, but they tend to be more high-powered, like IBM Watson, making business decisions, Salesforce Einstein or ROSS, the robot lawyer. So poor robots, even they suffer from sexism in the workplace.

04:42
(Laughter)

04:44
Think about how these two things combine and affect a kid growing up in today's world around AI. So they're doing some research for a school project and they Google images of CEO. The algorithm shows them results of mostly men. And now, they Google personal assistant. As you can guess, it shows them mostly females. And then they want to put on some music, and maybe order some food, and now, they are barking orders at an obedient female voice assistant.

05:19
Some of our brightest minds are creating this technology today. Technology that they could have created in any way they wanted. And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary. Yay!

05:36
But OK, don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world.

05:44
The good news about AI is that it is entirely within our control. We get to teach the right values, the right ethics to AI. So there are three things we can do. One, we can be aware of our own biases and the bias in machines around us. Two, we can make sure that diverse teams are building this technology. And three, we have to give it diverse experiences to learn from.
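The first of those three points can be made concrete with a small audit. This is a minimal sketch with invented predictions; the group labels, counts and the 80-percent rule of thumb are illustrative assumptions, not anything prescribed in the talk.

```python
# Hypothetical bias check: compare how often a model's favorable decision
# lands on each group. All numbers below are invented for illustration.
from collections import defaultdict

# (group, model_decision) pairs: 1 = model recommends an interview, 0 = rejected.
predictions = [
    ("men", 1), ("men", 1), ("men", 0), ("men", 1), ("men", 1),
    ("women", 0), ("women", 1), ("women", 0), ("women", 0), ("women", 1),
]

totals, selected = defaultdict(int), defaultdict(int)
for group, decision in predictions:
    totals[group] += 1
    selected[group] += decision

rates = {g: selected[g] / totals[g] for g in totals}
print("selection rate per group:", rates)

# Disparate-impact style rule of thumb: flag the model if one group's
# selection rate is less than 80% of the highest group's rate.
best = max(rates.values())
for group, rate in rates.items():
    if rate < 0.8 * best:
        print(f"warning: {group} selected at {rate:.0%} vs best {best:.0%}")
```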
06:14
I can talk about the first two from personal experience. When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk, your life is a little bit difficult, your ability gets questioned.

06:27
Here's just one example. Like most developers, I often join online tech forums and share my knowledge to help others. And I've found, when I log on as myself, with my own photo, my own name, I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" "What makes you think you know about machine learning?"

06:53
So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it. And I chose a name that did not reveal my gender. You can probably guess where this is going, right? So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done.

07:19
And it sucks, guys. I've been building robots since I was 15, I have a few degrees in computer science, and yet, I had to hide my gender in order for my work to be taken seriously.

07:31
So, what's going on here? Are men just better at technology than women? Another study found that when women coders on one platform hid their gender, like myself, their code was accepted four percent more than men.

07:48
So this is not about the talent. This is about an elitism in AI that says a programmer needs to look like a certain person.

07:59
What we really need to do to make AI better is bring people from all kinds of backgrounds. We need people who can write and tell stories to help us create personalities of AI. We need people who can solve problems. We need people who face different challenges and we need people who can tell us what are the real issues that need fixing and help us find ways that technology can actually fix it.

08:29
Because, when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless.

08:38
And that's what I want to end by talking to you about. Less racist robots, less machines that are going to take our jobs -- and more about what technology can actually achieve.

08:50
So, yes, some of the energy in the world of AI, in the world of technology is going to be about what ads you see on your stream. But a lot of it is going towards making the world so much better.

09:05
Think about a pregnant woman in the Democratic Republic of Congo, who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup. What if she could get diagnosis on her phone, instead? Or think about what AI could do for those one in three women in South Africa who face domestic violence. If it wasn't safe to talk out loud, they could get an AI service to raise alarm, get financial and legal advice.

09:35
These are all real examples of projects that people, including myself, are working on right now, using AI.

09:45
So, I'm sure in the next couple of days there will be yet another news story about the existential risk, robots taking over and coming for your jobs.

09:54
(Laughter)

09:55
And when something like that happens, I know I'll get the same messages worrying about the future. But I feel incredibly positive about this technology.

10:07
This is our chance to remake the world into a much more equal place. But to do that, we need to build it the right way from the get go.

10:19
We need people of different genders, races, sexualities and backgrounds. We need women to be the makers and not just the machines who do the makers' bidding.

10:33
We need to think very carefully what we teach machines, what data we give them, so they don't just repeat our own past mistakes.
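One modest way to act on "what data we give them", sketched here with invented records and a naive oversampling step; real projects need far more care, but the idea is simply to stop the model from seeing one group many times more often than another.

```python
# Hypothetical illustration: invented training records, naively oversampled so
# both groups appear equally often before training. Names and counts are made up.
import random

random.seed(0)

# Hypothetical raw training set: 90 examples from one group, 10 from the other.
records = ([({"yrs": 5 + i % 3}, "men") for i in range(90)]
           + [({"yrs": 5 + i % 3}, "women") for i in range(10)])

by_group = {}
for features, group in records:
    by_group.setdefault(group, []).append((features, group))

# Oversample every group up to the size of the largest one.
target = max(len(rows) for rows in by_group.values())
balanced = []
for rows in by_group.values():
    balanced += rows + random.choices(rows, k=target - len(rows))

print({g: sum(1 for _, grp in balanced if grp == g) for g in by_group})
# -> {'men': 90, 'women': 90}: the model no longer sees one group
#    nine times as often as the other.
```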
10:44
So I hope I leave you thinking about two things.

10:48
First, I hope you leave thinking about bias today. And that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites, that you think and remember that the same technology is assuming that a black man will reoffend. Or that a woman is more likely to be a personal assistant than a CEO. And I hope that reminds you that we need to do something about it.

11:20
And second, I hope you think about the fact that you don't need to look a certain way or have a certain background in engineering or technology to create AI, which is going to be a phenomenal force for our future. You don't need to look like a Mark Zuckerberg, you can look like me.

11:41
And it is up to all of us in this room to convince the governments and the corporations to build AI technology for everyone, including the edge cases. And for us all to get education about this phenomenal technology in the future.

11:58
Because if we do that, then we've only just scratched the surface of what we can achieve with AI.

12:05
Thank you.

12:06
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7