How to keep human bias out of AI | Kriti Sharma

100,490 views · 2019-04-12

TED



00:00
Translator: Ivana Korom Reviewer: Joanna Pietrulewicz
00:12
How many decisions have been made about you today,
00:16
or this week or this year,
00:19
by artificial intelligence?
00:22
I build AI for a living
00:24
so, full disclosure, I'm kind of a nerd.
00:27
And because I'm kind of a nerd,
00:30
whenever a new news story comes out
00:32
about artificial intelligence stealing all our jobs,
00:35
or robots getting citizenship of an actual country,
00:40
I'm the person my friends and followers message
00:43
freaking out about the future.
00:45
We see this everywhere.
00:47
This media panic that our robot overlords are taking over.
00:52
We could blame Hollywood for that.
00:56
But in reality, that's not the problem we should be focusing on.
01:01
There is a more pressing danger, a bigger risk with AI,
01:04
that we need to fix first.
01:07
So we are back to this question:
01:09
How many decisions have been made about you today by AI?
01:15
And how many of these
01:17
were based on your gender, your race or your background?
01:24
Algorithms are being used all the time
01:27
to make decisions about who we are and what we want.
01:32
Some of the women in this room will know what I'm talking about
01:35
if you've been made to sit through those pregnancy test adverts on YouTube
01:39
like 1,000 times.
01:41
Or you've scrolled past adverts of fertility clinics
01:44
on your Facebook feed.
01:47
Or in my case, Indian marriage bureaus.
01:50
(Laughter)
01:51
But AI isn't just being used to make decisions
01:54
about what products we want to buy
01:56
or which show we want to binge watch next.
02:01
I wonder how you'd feel about someone who thought things like this:
02:06
"A black or Latino person
02:08
is less likely than a white person to pay off their loan on time."
02:13
"A person called John makes a better programmer
02:16
than a person called Mary."
02:19
"A black man is more likely to be a repeat offender than a white man."
02:26
You're probably thinking,
02:28
"Wow, that sounds like a pretty sexist, racist person," right?
02:33
These are some real decisions that AI has made very recently,
02:37
based on the biases it has learned from us,
02:40
from the humans.
02:43
AI is being used to help decide whether or not you get that job interview;
02:48
how much you pay for your car insurance;
02:51
how good your credit score is;
02:52
and even what rating you get in your annual performance review.
02:57
But these decisions are all being filtered through
03:00
its assumptions about our identity, our race, our gender, our age.
03:08
How is that happening?
03:10
Now, imagine an AI is helping a hiring manager
03:14
find the next tech leader in the company.
03:16
So far, the manager has been hiring mostly men.
03:20
So the AI learns men are more likely to be programmers than women.
03:25
And it's a very short leap from there to:
03:28
men make better programmers than women.
03:31
We have reinforced our own bias into the AI.
03:35
And now, it's screening out female candidates.
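To make that mechanism concrete, here is a minimal sketch in Python (my own illustration, not the speaker's system): a screening rule that does nothing but reproduce a skewed hiring history, and so ends up filtering one group out. Every name and number below is hypothetical.

    # Hypothetical past decisions made by human hiring managers: (gender, was_hired).
    history = ([("male", True)] * 80 + [("male", False)] * 20
               + [("female", True)] * 5 + [("female", False)] * 15)

    def hire_rate(records, gender):
        # Fraction of past candidates of this gender who were hired.
        outcomes = [hired for g, hired in records if g == gender]
        return sum(outcomes) / len(outcomes)

    def screen(candidate_gender, threshold=0.5):
        # A naive "AI": pass candidates to interview only if their group's
        # historical hire rate clears the threshold -- it simply replays the bias.
        return hire_rate(history, candidate_gender) >= threshold

    print(screen("male"))    # True: 80% historical hire rate
    print(screen("female"))  # False: 25% historical hire rate, so screened out

The rule never looks at ability; the only signal it has ever seen is who was hired before, which is exactly the short leap the talk describes.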
03:40
Hang on, if a human hiring manager did that,
03:43
we'd be outraged, we wouldn't allow it.
03:46
This kind of gender discrimination is not OK.
03:49
And yet somehow, AI has become above the law,
03:54
because a machine made the decision.
03:57
That's not it.
03:59
We are also reinforcing our bias in how we interact with AI.
04:04
How often do you use a voice assistant like Siri, Alexa or even Cortana?
04:10
They all have two things in common:
04:13
one, they can never get my name right,
04:16
and second, they are all female.
04:20
They are designed to be our obedient servants,
04:23
turning your lights on and off, ordering your shopping.
04:27
You get male AIs too, but they tend to be more high-powered,
04:30
like IBM Watson, making business decisions,
04:33
Salesforce Einstein or ROSS, the robot lawyer.
04:38
So poor robots, even they suffer from sexism in the workplace.
04:42
(Laughter)
04:44
Think about how these two things combine
04:47
and affect a kid growing up in today's world around AI.
04:52
So they're doing some research for a school project
04:55
and they Google images of CEO.
04:58
The algorithm shows them results of mostly men.
05:01
And now, they Google personal assistant.
05:04
As you can guess, it shows them mostly females.
05:07
And then they want to put on some music, and maybe order some food,
05:11
and now, they are barking orders at an obedient female voice assistant.
05:19
Some of our brightest minds are creating this technology today.
05:24
Technology that they could have created in any way they wanted.
05:29
And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary.
05:34
Yay!
05:36
But OK, don't worry,
05:38
this is not going to end with me telling you
05:40
that we are all heading towards sexist, racist machines running the world.
05:44
The good news about AI is that it is entirely within our control.
05:51
We get to teach the right values, the right ethics to AI.
05:56
So there are three things we can do.
05:58
One, we can be aware of our own biases
06:01
and the bias in machines around us.
06:04
Two, we can make sure that diverse teams are building this technology.
06:09
And three, we have to give it diverse experiences to learn from.
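On the first of those three points, a brief hedged sketch of what being aware of the bias in the machines around us can look like in practice: audit a system's decisions by group and compare selection rates. The figures and the 0.8 "four-fifths" rule of thumb below are illustrative assumptions, not numbers from the talk.

    from collections import defaultdict

    # Hypothetical screening decisions produced by some model: (group, passed_screening).
    decisions = ([("men", True)] * 42 + [("men", False)] * 58
                 + [("women", True)] * 18 + [("women", False)] * 82)

    def selection_rates(records):
        # Fraction of candidates in each group that the model lets through.
        passed, total = defaultdict(int), defaultdict(int)
        for group, ok in records:
            total[group] += 1
            passed[group] += ok
        return {g: passed[g] / total[g] for g in total}

    rates = selection_rates(decisions)
    impact_ratio = min(rates.values()) / max(rates.values())

    print(rates)                                 # {'men': 0.42, 'women': 0.18}
    print(f"impact ratio = {impact_ratio:.2f}")  # 0.43, well under the 0.8 rule of thumb

If a check like this runs before a system is deployed, the gender-screening failure described above shows up as a number rather than as a news story.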
06:14
I can talk about the first two from personal experience.
06:18
When you work in technology
06:19
and you don't look like a Mark Zuckerberg or Elon Musk,
06:23
your life is a little bit difficult, your ability gets questioned.
06:27
Here's just one example.
06:29
Like most developers, I often join online tech forums
06:33
and share my knowledge to help others.
06:36
And I've found,
06:37
when I log on as myself, with my own photo, my own name,
06:41
I tend to get questions or comments like this:
06:46
"What makes you think you're qualified to talk about AI?"
06:50
"What makes you think you know about machine learning?"
06:53
So, as you do, I made a new profile,
06:57
and this time, instead of my own picture, I chose a cat with a jet pack on it.
07:02
And I chose a name that did not reveal my gender.
07:05
You can probably guess where this is going, right?
07:08
So, this time, I didn't get any of those patronizing comments about my ability
07:15
and I was able to actually get some work done.
07:19
And it sucks, guys.
07:21
I've been building robots since I was 15,
07:23
I have a few degrees in computer science,
07:26
and yet, I had to hide my gender
07:28
in order for my work to be taken seriously.
07:31
So, what's going on here?
07:33
Are men just better at technology than women?
07:37
Another study found
07:39
that when women coders on one platform hid their gender, like myself,
07:44
their code was accepted four percent more than men.
07:48
So this is not about the talent.
07:51
This is about an elitism in AI
07:54
that says a programmer needs to look like a certain person.
07:59
What we really need to do to make AI better
08:02
is bring people from all kinds of backgrounds.
08:06
We need people who can write and tell stories
08:09
to help us create personalities of AI.
08:12
We need people who can solve problems.
08:15
We need people who face different challenges
08:18
and we need people who can tell us what are the real issues that need fixing
08:24
and help us find ways that technology can actually fix it.
08:29
Because, when people from diverse backgrounds come together,
08:33
when we build things in the right way,
08:35
the possibilities are limitless.
08:38
And that's what I want to end by talking to you about.
08:42
Less racist robots, fewer machines that are going to take our jobs --
08:46
and more about what technology can actually achieve.
08:50
So, yes, some of the energy in the world of AI,
08:53
in the world of technology
08:55
is going to be about what ads you see on your stream.
08:59
But a lot of it is going towards making the world so much better.
09:05
Think about a pregnant woman in the Democratic Republic of Congo,
09:09
who has to walk 17 hours to her nearest rural prenatal clinic
09:13
to get a checkup.
09:15
What if she could get diagnosis on her phone, instead?
09:19
Or think about what AI could do
09:21
for those one in three women in South Africa
09:24
who face domestic violence.
09:27
If it wasn't safe to talk out loud,
09:29
they could get an AI service to raise alarm,
09:32
get financial and legal advice.
09:35
These are all real examples of projects that people, including myself,
09:41
are working on right now, using AI.
09:45
So, I'm sure in the next couple of days there will be yet another news story
09:49
about the existential risk,
09:51
robots taking over and coming for your jobs.
09:54
(Laughter)
09:55
And when something like that happens,
09:57
I know I'll get the same messages worrying about the future.
10:01
But I feel incredibly positive about this technology.
10:07
This is our chance to remake the world into a much more equal place.
10:14
But to do that, we need to build it the right way from the get-go.
10:19
We need people of different genders, races, sexualities and backgrounds.
10:26
We need women to be the makers
10:28
and not just the machines who do the makers' bidding.
10:33
We need to think very carefully what we teach machines,
10:37
what data we give them,
10:39
so they don't just repeat our own past mistakes.
10:44
So I hope I leave you thinking about two things.
10:48
First, I hope you leave thinking about bias today.
10:53
And that the next time you scroll past an advert
10:56
that assumes you are interested in fertility clinics
10:59
or online betting websites,
11:02
that you think and remember
11:04
that the same technology is assuming that a black man will reoffend.
11:09
Or that a woman is more likely to be a personal assistant than a CEO.
11:14
And I hope that reminds you that we need to do something about it.
11:20
And second,
11:22
I hope you think about the fact
11:24
that you don't need to look a certain way
11:26
or have a certain background in engineering or technology
11:30
to create AI,
11:31
which is going to be a phenomenal force for our future.
11:36
You don't need to look like a Mark Zuckerberg,
11:38
you can look like me.
11:41
And it is up to all of us in this room
11:44
to convince the governments and the corporations
11:46
to build AI technology for everyone,
11:49
including the edge cases.
11:52
And for us all to get education
11:54
about this phenomenal technology in the future.
11:58
Because if we do that,
12:00
then we've only just scratched the surface of what we can achieve with AI.
12:05
Thank you.
12:06
(Applause)