Machine intelligence makes human morals more important | Zeynep Tufekci

180,762 views ใƒป 2016-11-11

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translation: Jeongmin Kim    Review: JY Kang
00:12
So, I started my first job as a computer programmer
0
12739
4122
์ €๋Š” ์ œ ์ฒซ ๋ฒˆ์งธ ์ง์—…์ธ ์ปดํ“จํ„ฐ ํ”„๋กœ๊ทธ๋ž˜๋จธ ์ผ์„
00:16
in my very first year of college --
1
16885
1956
๋Œ€ํ•™ 1ํ•™๋…„ ๋•Œ ์‹œ์ž‘ํ–ˆ์Šต๋‹ˆ๋‹ค.
00:18
basically, as a teenager.
2
18865
1507
์•„์ง ์‹ญ๋Œ€์˜€์ฃ .
00:20
Soon after I started working,
3
20889
1732
์ผ์„ ์‹œ์ž‘ํ•œ ์ง€ ์–ผ๋งˆ ์•ˆ ๋˜์–ด
00:22
writing software in a company,
4
22645
1610
ํšŒ์‚ฌ์—์„œ ํ”„๋กœ๊ทธ๋ž˜๋ฐ์„ ํ•˜๊ณ  ์žˆ๋Š”๋ฐ
00:24
a manager who worked at the company came down to where I was,
5
24799
3635
ํšŒ์‚ฌ์˜ ํ•œ ๊ด€๋ฆฌ์ž ์ œ ์ž๋ฆฌ๋กœ ์™€์„œ๋Š”
00:28
and he whispered to me,
6
28458
1268
์ €ํ•œํ…Œ ์†์‚ญ์˜€์–ด์š”.
00:30
"Can he tell if I'm lying?"
7
30229
2861
"๋‚ด๊ฐ€ ๊ฑฐ์ง“๋งํ•˜๋ฉด ์Ÿค๊ฐ€ ์•Œ์•„์ฑŒ๊นŒ์š”?"
00:33
There was nobody else in the room.
8
33806
2077
ํ•˜์ง€๋งŒ ๋ฐฉ์—๋Š” ๋‘˜๋ฐ–์— ์—†์—ˆ์–ด์š”.
00:37
"Can who tell if you're lying? And why are we whispering?"
9
37032
4389
"๋ˆ„๊ฐ€ ์•Œ์•„์ฑˆ๋‹ค๋Š” ๊ฑฐ์ฃ ? ์•„๋ฌด๋„ ์—†๋Š”๋ฐ ์™œ ์†์‚ญ์ด์‹œ๋Š” ๊ฑฐ์˜ˆ์š”?"
00:42
The manager pointed at the computer in the room.
10
42266
3107
๋งค๋‹ˆ์ €๋Š” ๋ฐฉ์— ์žˆ๋Š” ์ปดํ“จํ„ฐ๋ฅผ ๊ฐ€๋ฆฌ์ผฐ์–ด์š”.
00:45
"Can he tell if I'm lying?"
11
45397
3096
"์ €๋†ˆ์ด ์•Œ์•„์ฑŒ ์ˆ˜ ์žˆ์„๊นŒ์š”?"
00:49
Well, that manager was having an affair with the receptionist.
12
49613
4362
๊ทธ ๋งค๋‹ˆ์ €๋Š” ํšŒ์‚ฌ ์ ‘์ˆ˜๊ณ„ ์ง์›๊ณผ ๋ฐ”๋žŒ์„ ํ”ผ์šฐ๊ณ  ์žˆ์—ˆ์ฃ .
00:53
(Laughter)
13
53999
1112
(์›ƒ์Œ)
00:55
And I was still a teenager.
14
55135
1766
์ „ ์•„์ง ์‹ญ๋Œ€์˜€๊ธฐ์—
00:57
So I whisper-shouted back to him,
15
57447
2019
๊ทธ ์‚ฌ๋žŒ ๊ท€์— ๋Œ€๊ณ  ์†Œ๋ฆฌ์ณค์ฃ .
00:59
"Yes, the computer can tell if you're lying."
16
59490
3624
"๋„ค, ์ € ์ปดํ“จํ„ฐ๋Š” ๋‹น์‹  ๋ถ€์ •์„ ์•Œ ์ˆ˜ ์žˆ์„ ๊ฑฐ์˜ˆ์š”."
01:03
(Laughter)
17
63138
1806
(์›ƒ์Œ)
01:04
Well, I laughed, but actually, the laugh's on me.
18
64968
2923
์ „ ์›ƒ๊ณ  ๋ง์•˜์ง€๋งŒ, ๊ฒฐ๊ตญ ์ œ๊ฐ€ ์–ด๋ฆฌ์„์—ˆ์ฃ .
01:07
Nowadays, there are computational systems
19
67915
3268
์š”์ฆ˜ ์ปดํ“จํ„ฐ ์‹œ์Šคํ…œ์€
01:11
that can suss out emotional states and even lying
20
71207
3548
๊ฐ์ • ์ƒํƒœ๋‚˜, ์‹ฌ์ง€์–ด ๊ฑฐ์ง“๋ง๊นŒ์ง€
01:14
from processing human faces.
21
74779
2044
์ธ๊ฐ„ ํ‘œ์ •์œผ๋กœ ์•Œ์•„๋‚ผ ์ˆ˜ ์žˆ๊ฑฐ๋“ ์š”.
01:17
Advertisers and even governments are very interested.
22
77248
4153
๊ด‘๊ณ  ์—…์ฒด์™€ ์ •๋ถ€๊นŒ์ง€๋„ ์ด ๊ธฐ์ˆ ์— ๊ด€์‹ฌ์„ ๊ธฐ์šธ์ด๊ณ  ์žˆ์ฃ .
01:22
I had become a computer programmer
23
82319
1862
์ €๋Š” ์–ด๋ฆด ๋•Œ ์ˆ˜ํ•™๊ณผ ๊ณผํ•™์„ ๋งค์šฐ ์ข‹์•„ํ•ด์„œ
01:24
because I was one of those kids crazy about math and science.
24
84205
3113
์ž์—ฐ์Šค๋ ˆ ํ”„๋กœ๊ทธ๋ž˜๋จธ๊ฐ€ ๋˜์—ˆ์Šต๋‹ˆ๋‹ค.
01:27
But somewhere along the line I'd learned about nuclear weapons,
25
87942
3108
๊ทธ๋Ÿฐ๋ฐ ์–ธ์  ๊ฐ€ ํ•ต๋ฌด๊ธฐ๋ฅผ ์•Œ๊ฒŒ ๋˜์—ˆ์„ ๋•Œ
01:31
and I'd gotten really concerned with the ethics of science.
26
91074
2952
์ €๋Š” ๊ณผํ•™ ์œค๋ฆฌ์— ๋Œ€ํ•ด ๊ด€์‹ฌ์ด ๋งŽ์•„์กŒ์Šต๋‹ˆ๋‹ค.
01:34
I was troubled.
27
94050
1204
๊ณ ๋ฏผ์„ ๋งŽ์ด ํ–ˆ์ฃ .
01:35
However, because of family circumstances,
28
95278
2641
ํ•˜์ง€๋งŒ ์ง‘์•ˆ ์‚ฌ์ • ๋•Œ๋ฌธ์—
01:37
I also needed to start working as soon as possible.
29
97943
3298
์ตœ๋Œ€ํ•œ ๋นจ๋ฆฌ ์ผ์„ ํ•ด์•ผ ํ–ˆ์ฃ .
01:41
So I thought to myself, hey, let me pick a technical field
30
101265
3299
๊ทธ๋ž˜์„œ ๊ณผํ•™๊ธฐ์ˆ  ๋ถ„์•ผ ์ค‘์—์„œ
01:44
where I can get a job easily
31
104588
1796
์ง์—…์„ ์‰ฝ๊ฒŒ ๊ฐ€์งˆ ์ˆ˜ ์žˆ์œผ๋ฉด์„œ๋„
01:46
and where I don't have to deal with any troublesome questions of ethics.
32
106408
4018
๋ณต์žกํ•œ ์œค๋ฆฌ์  ๊ณ ๋ฏผ์„ ํ•  ํ•„์š”๊ฐ€ ์—†๋Š” ์ผ์„ ๊ณ ๋ฅด๊ธฐ๋กœ ํ–ˆ์Šต๋‹ˆ๋‹ค.
01:51
So I picked computers.
33
111022
1529
๊ทธ๋ž˜์„œ ์ปดํ“จํ„ฐ ๊ด€๋ จ๋œ ์ผ์„ ๊ณจ๋ž์ฃ .
01:52
(Laughter)
34
112575
1104
(์›ƒ์Œ)
01:53
Well, ha, ha, ha! All the laughs are on me.
35
113703
3410
ํ•˜ํ•˜ํ•˜! ๊ทธ๋Ÿฐ๋ฐ ๋˜ ์–ด๋ฆฌ์„์€ ์ƒ๊ฐ์ด์—ˆ๋„ค์š”.
01:57
Nowadays, computer scientists are building platforms
36
117137
2754
์š”์ฆ˜ ์ปดํ“จํ„ฐ ๊ณผํ•™์ž๋“ค์ด ๋งŒ๋“œ๋Š” ํ”Œ๋žซํผ์€
01:59
that control what a billion people see every day.
37
119915
4209
์‹ญ์–ต ๋ช…์ด ๋งค์ผ ์ ‘ํ•˜๊ฒŒ ๋˜๋Š” ์‹œ์Šคํ…œ์„ ํ†ต์ œํ•ฉ๋‹ˆ๋‹ค.
02:05
They're developing cars that could decide who to run over.
38
125052
3822
๋ˆ„๊ตฌ๋ฅผ ์น˜๊ณ  ์ง€๋‚˜๊ฐˆ์ง€ ์Šค์Šค๋กœ ๊ฒฐ์ •ํ•  ์ˆ˜๋„ ์žˆ๋Š” ์ฐจ๋ฅผ ๊ฐœ๋ฐœํ•˜๊ณ 
02:09
They're even building machines, weapons,
39
129707
3213
์ „์Ÿ์—์„œ ์‚ฌ๋žŒ์„ ์ฃฝ์ผ ์ˆ˜ ์žˆ๋Š”
02:12
that might kill human beings in war.
40
132944
2285
๊ธฐ๊ณ„๋‚˜ ๋ฌด๊ธฐ๋„ ์„ค๊ณ„ํ•˜๊ณ  ์žˆ์ฃ .
02:15
It's ethics all the way down.
41
135253
2771
๋ชจ๋‘ ์œค๋ฆฌ์— ๊ด€ํ•œ ๊ฑฐ์ฃ .
02:19
Machine intelligence is here.
42
139183
2058
์ธ๊ณต์ง€๋Šฅ์˜ ์‹œ๋Œ€์ž…๋‹ˆ๋‹ค.
02:21
We're now using computation to make all sort of decisions,
43
141823
3474
์ปดํ“จํ„ฐ๋Š” ์ ์  ๋‹ค์–‘ํ•œ ์˜์‚ฌ๊ฒฐ์ •์— ์‚ฌ์šฉ๋˜๊ณ  ์žˆ๊ณ 
02:25
but also new kinds of decisions.
44
145321
1886
๊ทธ ์ค‘์—” ์ƒˆ๋กœ์šด ๊ฒƒ๋„ ์žˆ์ฃ .
02:27
We're asking questions to computation that have no single right answers,
45
147231
5172
์ฃผ๊ด€์ ์ด๊ณ  ๊ฐ€์น˜ ํŒ๋‹จ์ด ํ•„์š”ํ•œ
์ •๋‹ต์ด ์—†๋Š” ์—ด๋ฆฐ ์งˆ๋ฌธ๊นŒ์ง€๋„ ์ปดํ“จํ„ฐ์—๊ฒŒ ๋ฌป๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
02:32
that are subjective
46
152427
1202
02:33
and open-ended and value-laden.
47
153653
2325
02:36
We're asking questions like,
48
156002
1758
์ด๋ฅผํ…Œ๋ฉด ์ด๋Ÿฐ ์งˆ๋ฌธ๋“ค์ด์ฃ .
02:37
"Who should the company hire?"
49
157784
1650
"๋ˆ„๊ตด ์ฑ„์šฉํ•ด์•ผ ํ• ๊นŒ์š”?"
02:40
"Which update from which friend should you be shown?"
50
160096
2759
"์–ด๋Š ์นœ๊ตฌ์˜ ์–ด๋–ค ์†Œ์‹์„ ์—…๋ฐ์ดํŠธํ•ด์•ผ ํ• ๊นŒ์š”?
02:42
"Which convict is more likely to reoffend?"
51
162879
2266
"์–ด๋Š ์žฌ์†Œ์ž๊ฐ€ ์žฌ๋ฒ” ๊ฐ€๋Šฅ์„ฑ์ด ๋” ๋†’์„๊นŒ์š”?"
02:45
"Which news item or movie should be recommended to people?"
52
165514
3054
"์–ด๋–ค ๋‰ด์Šค ๊ธฐ์‚ฌ๋‚˜ ์˜ํ™”๋ฅผ ์ถ”์ฒœ ๋ชฉ๋ก์— ๋„ฃ์„๊นŒ์š”?"
02:48
Look, yes, we've been using computers for a while,
53
168592
3372
์ธ๊ฐ„์€ ์ปดํ“จํ„ฐ๋ฅผ ๊ฝค ์˜ค๋ž˜ ์‚ฌ์šฉํ–ˆ์ง€๋งŒ
02:51
but this is different.
54
171988
1517
์ด๊ฑด ์ข€ ๋‹ค๋ฅธ ๋ฌธ์ œ์ฃ .
02:53
This is a historical twist,
55
173529
2067
์—ญ์‚ฌ์  ๋ฐ˜์ „์ž…๋‹ˆ๋‹ค.
02:55
because we cannot anchor computation for such subjective decisions
56
175620
5337
์™œ๋ƒํ•˜๋ฉด ๊ทธ๋Ÿฐ ์ฃผ๊ด€์ ์ธ ๊ฒฐ์ •๊นŒ์ง€ ์ปดํ“จํ„ฐ์— ์˜์ง€ํ•  ์ˆ˜๋Š” ์—†๊ฑฐ๋“ ์š”.
03:00
the way we can anchor computation for flying airplanes, building bridges,
57
180981
5420
๋น„ํ–‰๊ธฐ ์กฐ์ข…์ด๋‚˜ ๋‹ค๋ฆฌ๋ฅผ ์ง“๊ฑฐ๋‚˜ ๋‹ฌ์— ๊ฐ€๋Š” ๊ฒƒ๊ณผ ๋‹ค๋ฅด์ฃ .
03:06
going to the moon.
58
186425
1259
03:08
Are airplanes safer? Did the bridge sway and fall?
59
188449
3259
๋น„ํ–‰๊ธฐ๊ฐ€ ์•ˆ์ „ํ•  ๊ฒƒ์ธ๊ฐ€. ๋‹ค๋ฆฌ๊ฐ€ ํ”๋“ค๋ฆฌ๊ณ  ๋ฌด๋„ˆ์งˆ ๊ฒƒ์ธ๊ฐ€.
03:11
There, we have agreed-upon, fairly clear benchmarks,
60
191732
4498
์ด๋Ÿฐ ๋ฌธ์ œ๋Š” ๋ช…ํ™•ํ•˜๊ณ  ๋ชจ๋‘๊ฐ€ ๋™์˜ํ•  ๋งŒํ•œ ๊ธฐ์ค€์ด ์žˆ๊ณ 
03:16
and we have laws of nature to guide us.
61
196254
2239
์ž์—ฐ๋ฒ•์น™์— ๋”ฐ๋ผ ํŒ๋‹จํ•˜๋ฉด ๋˜์ฃ .
03:18
We have no such anchors and benchmarks
62
198517
3394
ํ•˜์ง€๋งŒ ์ธ๊ฐ„ ์‚ฌํšŒ์˜ ์ผ์„ ํŒ๋‹จํ•˜๋Š” ๋ฐ์—๋Š”
03:21
for decisions in messy human affairs.
63
201935
3963
๊ทธ๋Ÿฐ ๊ธฐ์ค€์ด ์กด์žฌํ•˜์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
03:25
To make things more complicated, our software is getting more powerful,
64
205922
4237
๋”์šฑ์ด ์š”์ฆ˜ ์†Œํ”„ํŠธ์›จ์–ด๋Š” ์ ์  ๋” ๊ฐ•๋ ฅํ•ด์ง€๊ณ  ์žˆ์ง€๋งŒ
03:30
but it's also getting less transparent and more complex.
65
210183
3773
๋™์‹œ์— ๋”์šฑ ๋ถˆํˆฌ๋ช…ํ•ด์ง€๊ณ  ์ดํ•ดํ•˜๊ธฐ ํž˜๋“ค์–ด์ง€๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
03:34
Recently, in the past decade,
66
214542
2040
์ง€๋‚œ ์‹ญ ๋…„ ๋™์•ˆ
03:36
complex algorithms have made great strides.
67
216606
2729
๋ณตํ•ฉ ์•Œ๊ณ ๋ฆฌ์ฆ˜์—๋Š” ๊ต‰์žฅํ•œ ์ง„์ „์ด ์žˆ์—ˆ์ฃ .
03:39
They can recognize human faces.
68
219359
1990
์‚ฌ๋žŒ ์–ผ๊ตด์„ ์ธ์‹ํ•  ์ˆ˜ ์žˆ๊ณ 
03:41
They can decipher handwriting.
69
221985
2055
์†๊ธ€์”จ๋„ ์ฝ์–ด๋‚ด๋ฉฐ
03:44
They can detect credit card fraud
70
224436
2066
์‹ ์šฉ์นด๋“œ ์‚ฌ๊ธฐ๋ฅผ ๊ฐ„ํŒŒํ•˜๊ณ 
03:46
and block spam
71
226526
1189
์ŠคํŒธ์„ ๋ง‰๊ณ 
03:47
and they can translate between languages.
72
227739
2037
๋ฒˆ์—ญ๋„ ํ•  ์ˆ˜ ์žˆ์–ด์š”.
03:49
They can detect tumors in medical imaging.
73
229800
2574
์˜์ƒ ์˜๋ฃŒ ์‚ฌ์ง„์—์„œ ์ข…์–‘์„ ์‹๋ณ„ํ•  ์ˆ˜ ์žˆ๊ณ 
03:52
They can beat humans in chess and Go.
74
232398
2205
์ธ๊ฐ„๊ณผ ์ฒด์Šค๋‚˜ ๋ฐ”๋‘‘์„ ๋‘์–ด ์ด๊ธธ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
03:55
Much of this progress comes from a method called "machine learning."
75
235264
4504
์ด ์ง„์ „์—๋Š” '๊ธฐ๊ณ„ ํ•™์Šต'์ด๋ผ๋Š” ๊ธฐ๋ฒ•์˜ ๊ณต์ด ํฝ๋‹ˆ๋‹ค.
04:00
Machine learning is different than traditional programming,
76
240175
3187
๊ธฐ๊ณ„ ํ•™์Šต์€ ๊ธฐ์กด์˜ ํ”„๋กœ๊ทธ๋ž˜๋ฐ๊ณผ๋Š” ๋‹ค๋ฆ…๋‹ˆ๋‹ค.
04:03
where you give the computer detailed, exact, painstaking instructions.
77
243386
3585
๊ธฐ์กด์—๋Š” ์ปดํ“จํ„ฐ์—๊ฒŒ ๋ช…ํ™•ํ•˜๊ณ  ์ž์„ธํ•œ ์ง€์‹œ๋ฅผ ๋‚ด๋ ค์•ผ ํ–ˆ์ฃ .
04:07
It's more like you take the system and you feed it lots of data,
78
247378
4182
์ด์ œ๋Š” ์‹œ์Šคํ…œ์„ ๋งŒ๋“  ๋’ค์— ๋Œ€๋Ÿ‰์˜ ๋ฐ์ดํ„ฐ๋ฅผ ์ž…๋ ฅํ•ฉ๋‹ˆ๋‹ค.
04:11
including unstructured data,
79
251584
1656
์šฐ๋ฆฌ์˜ ๋””์ง€ํ„ธ ์‹œ๋Œ€์— ์ƒ์„ฑ๋œ
04:13
like the kind we generate in our digital lives.
80
253264
2278
๊ตฌ์กฐํ™”๋˜์ง€ ์•Š์€ ๋ฐ์ดํ„ฐ๋“ค๊นŒ์ง€ ํฌํ•จํ•ด์„œ ๋ง์ด์ฃ .
04:15
And the system learns by churning through this data.
81
255566
2730
์‹œ์Šคํ…œ์€ ์ด ๋ฐ์ดํ„ฐ๋ฅผ ํ—ค์ณ๋‚˜๊ฐ€๋ฉด์„œ ํ•™์Šตํ•ฉ๋‹ˆ๋‹ค.
04:18
And also, crucially,
82
258669
1526
๋˜ ์ค‘์š”ํ•œ ์‚ฌ์‹ค์ด ์žˆ๋Š”๋ฐ
04:20
these systems don't operate under a single-answer logic.
83
260219
4380
์ด ์‹œ์Šคํ…œ์€ ์ •๋‹ต์„ ๋‹จ์ •์ง“๋Š” ๋…ผ๋ฆฌ๋กœ ์ž‘๋™ํ•˜์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
04:24
They don't produce a simple answer; it's more probabilistic:
84
264623
2959
ํ™•์ •์ ์ธ ์ •๋‹ต์„ ๋‚ด๊ธฐ๋ณด๋‹ค๋Š” ํ™•๋ฅ ์  ๊ฒฐ๋ก ์„ ๋‚ด๋ฆฌ์ฃ .
04:27
"This one is probably more like what you're looking for."
85
267606
3483
"์ด๊ฒŒ ๋‹น์‹ ์ด ์ฐพ๋˜ ๊ฒฐ๊ณผ์ผ ๊ฐ€๋Šฅ์„ฑ์ด ๋†’์Šต๋‹ˆ๋‹ค."๋ผ๊ณ ์š”.
04:32
Now, the upside is: this method is really powerful.
86
272023
3070
์žฅ์ ์€ ์ด ๋ฐฉ์‹์ด ์ •๋ง ๊ฐ•๋ ฅํ•˜๋‹ค๋Š” ๊ฒ๋‹ˆ๋‹ค.
04:35
The head of Google's AI systems called it,
87
275117
2076
๊ตฌ๊ธ€์˜ ์ธ๊ณต์ง€๋Šฅ ์‹œ์Šคํ…œ ์ฑ…์ž„์ž๋Š” ์ด๋ฅผ ๋‘๊ณ 
'์ •๋ณด์˜ ๊ณผ์ž‰ ํšจ์œจ์„ฑ'์ด๋ผ๊ณ  ํ‘œํ˜„ํ–ˆ์ฃ .
04:37
"the unreasonable effectiveness of data."
88
277217
2197
04:39
The downside is,
89
279791
1353
๋‹จ์ ์€
04:41
we don't really understand what the system learned.
90
281738
3071
๊ทธ ์‹œ์Šคํ…œ์ด ๋ฌด์—‡์„ ๋ฐฐ์› ๋Š”์ง€ ์šฐ๋ฆฌ๋Š” ์•Œ ์ˆ˜ ์—†๋‹ค๋Š” ๊ฒ๋‹ˆ๋‹ค.
04:44
In fact, that's its power.
91
284833
1587
์‚ฌ์‹ค ์žฅ์ ์ด๊ธฐ๋„ ํ•˜์ฃ .
04:46
This is less like giving instructions to a computer;
92
286946
3798
์ปดํ“จํ„ฐ์— ๋ช…๋ น์„ ๋‚ด๋ฆฐ๋‹ค๊ธฐ ๋ณด๋‹ค
04:51
it's more like training a puppy-machine-creature
93
291200
4064
์šฐ๋ฆฌ๊ฐ€ ์ดํ•ดํ•˜๊ฑฐ๋‚˜ ํ†ต์ œํ•˜์ง€ ๋ชปํ•˜๋Š” ๊ฐ•์•„์ง€ ๊ฐ™์€ ๊ธฐ๊ณ„๋ฅผ
04:55
we don't really understand or control.
94
295288
2371
ํ›ˆ๋ จ์‹œํ‚ค๋Š” ๊ฑฐ๋ผ๊ณ  ํ•  ์ˆ˜ ์žˆ์ฃ .
04:58
So this is our problem.
95
298362
1551
๊ทธ๋Ÿฐ๋ฐ ๋ฌธ์ œ๊ฐ€ ์žˆ์–ด์š”.
05:00
It's a problem when this artificial intelligence system gets things wrong.
96
300427
4262
์ด ์ธ๊ณต์ง€๋Šฅ์ด ์ž˜๋ชป๋œ ๊ฒƒ์„ ํ•™์Šตํ•  ๋•Œ ๋ฌธ์ œ๊ฐ€ ๋ฐœ์ƒํ•˜์ฃ .
05:04
It's also a problem when it gets things right,
97
304713
3540
๊ทธ๋ฆฌ๊ณ  ์ž˜ ํ•™์Šตํ–ˆ์–ด๋„ ๋ฌธ์ œ๊ฐ€ ๋ฉ๋‹ˆ๋‹ค.
05:08
because we don't even know which is which when it's a subjective problem.
98
308277
3628
์™œ๋ƒํ•˜๋ฉด ์ฃผ๊ด€์ ์ธ ๋ฌธ์ œ์—์„œ๋Š” ๋ญ๊ฐ€ ๋ญ”์ง€๋„ ๋ชจ๋ฅด๋‹ˆ๊นŒ์š”.
05:11
We don't know what this thing is thinking.
99
311929
2339
๋ฌด์Šจ ์ƒ๊ฐ์œผ๋กœ ์ด๋Ÿฐ ํŒ๋‹จ์„ ํ–ˆ๋Š”์ง€ ์•Œ ์ˆ˜๊ฐ€ ์—†๋Š” ๊ฑฐ์ฃ .
05:15
So, consider a hiring algorithm --
100
315493
3683
์ฑ„์šฉ ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์ƒ๊ฐํ•ด ๋ณด์„ธ์š”.
05:20
a system used to hire people, using machine-learning systems.
101
320123
4311
์‚ฌ๋žŒ์„ ๊ฐ€๋ ค๋‚ด๋Š” ๊ธฐ๊ณ„ ํ•™์Šต ์‹œ์Šคํ…œ์ž…๋‹ˆ๋‹ค.
05:25
Such a system would have been trained on previous employees' data
102
325052
3579
์ด๋Ÿฐ ์‹œ์Šคํ…œ์€ ์ด์ „ ์ง์›๋“ค ๋ฐ์ดํ„ฐ๋ฅผ ๋ฐ”ํƒ•์œผ๋กœ ํ›ˆ๋ จ๋˜์—ˆ์„ ๊ฒƒ์ด๊ณ 
05:28
and instructed to find and hire
103
328655
2591
์„ฑ๊ณผ๊ฐ€ ์ข‹์„ ๋งŒํ•œ ์ง์›๋“ค์„
05:31
people like the existing high performers in the company.
104
331270
3038
๋ฏธ๋ฆฌ ์ฐพ์•„์„œ ๊ณ ์šฉํ•˜๋ ค๊ณ  ํ•˜๊ฒ ์ฃ .
05:34
Sounds good.
105
334814
1153
๊ดœ์ฐฎ์•„ ๋ณด์ž…๋‹ˆ๋‹ค.
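The setup described above can be sketched in a few lines. The following is a hypothetical Python toy, not any real vendor's system: the past-employee records, feature names and scoring rule are invented. The point it illustrates is that the model only learns whatever correlates with the "high performer" label in past hires, and nothing in the data is ever labeled as a protected attribute.

```python
# Hypothetical sketch of a hiring model trained on previous employees' data.
# Records, features and labels are invented for illustration only.
from collections import defaultdict

PAST_EMPLOYEES = [
    # (features, was_rated_high_performer)
    ({"degree": "cs", "hobby": "football", "gap_year": False}, True),
    ({"degree": "cs", "hobby": "football", "gap_year": False}, True),
    ({"degree": "math", "hobby": "knitting", "gap_year": True}, False),
    ({"degree": "cs", "hobby": "knitting", "gap_year": True}, False),
]

# "Training": weight each feature value by how strongly it co-occurs with the
# high-performer label. Nothing here is named "gender" or "health" -- but a
# proxy such as a hobby or an employment gap can carry that signal anyway.
weights = defaultdict(float)
for features, high_performer in PAST_EMPLOYEES:
    for key, value in features.items():
        weights[(key, value)] += 1.0 if high_performer else -1.0

def score_candidate(features: dict) -> float:
    return sum(weights[(key, value)] for key, value in features.items())

candidate = {"degree": "math", "hobby": "knitting", "gap_year": True}
print(score_candidate(candidate))  # negative score: quietly filtered out, reason unlabeled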
05:35
I once attended a conference
106
335991
1999
์ €๋Š” ์ธ์‚ฌ๋ถ€์™€ ์ž„์›๋“ค
05:38
that brought together human resources managers and executives,
107
338014
3125
ํšŒ์‚ฌ ๊ณ ์œ„์ง๋“ค์ด ํ•œ๋ฐ ๋ชจ์ธ ๊ทธ๋Ÿฐ ์ฑ„์šฉ ์‹œ์Šคํ…œ ๋„์ž…์„
05:41
high-level people,
108
341163
1206
์ฃผ์ œ๋กœ ํ•œ ์ปจํผ๋Ÿฐ์Šค์— ์ฐธ์„ํ•œ ์ ์ด ์žˆ์Šต๋‹ˆ๋‹ค.
05:42
using such systems in hiring.
109
342393
1559
05:43
They were super excited.
110
343976
1646
๊ทธ ๋ถ„๋“ค์€ ์ •๋ง ๋“ค๋–  ์žˆ์—ˆ์ฃ .
05:45
They thought that this would make hiring more objective, less biased,
111
345646
4653
๊ทธ๋“ค์€ ์ด ์‹œ์Šคํ…œ์ด ํŽธํŒŒ์ ์ด์ง€ ์•Š๊ณ  ๊ฐ๊ด€์ ์ธ ์ฑ„์šฉ์„ ๊ฐ€๋Šฅ์ผ€ ํ•˜๊ณ 
05:50
and give women and minorities a better shot
112
350323
3000
ํŽธ๊ฒฌ์„ ๊ฐ€์ง„ ์ธ๊ฐ„ ๊ด€๋ฆฌ์ž๋ณด๋‹ค ์—ฌ์„ฑ๊ณผ ์†Œ์ˆ˜์ž์—๊ฒŒ
05:53
against biased human managers.
113
353347
2188
๋” ๋งŽ์€ ๊ธฐํšŒ๋ฅผ ์ฃผ๋ฆฌ๋ผ ๊ธฐ๋Œ€ํ–ˆ์Šต๋‹ˆ๋‹ค.
05:55
And look -- human hiring is biased.
114
355559
2843
์‚ฌ๋žŒ์— ์˜ํ•œ ๊ณ ์šฉ์€ ํŽธํ–ฅ๋์ฃ .
05:59
I know.
115
359099
1185
์ €๋„ ์•Œ์•„์š”.
06:00
I mean, in one of my early jobs as a programmer,
116
360308
3005
ํ”„๋กœ๊ทธ๋ž˜๋จธ๋กœ์„œ์˜ ์ œ ์ดˆ๊ธฐ ์ง์žฅ ์ค‘ ํ•˜๋‚˜์—์„œ
06:03
my immediate manager would sometimes come down to where I was
117
363337
3868
์ œ ์ง์† ์ƒ์‚ฌ๋Š” ๊ฐ€๋” ์ œ๊ฐ€ ์•„์นจ ์ผ์ฐ๋ถ€ํ„ฐ
06:07
really early in the morning or really late in the afternoon,
118
367229
3753
๋ฐค ๋Šฆ๊ฒŒ๊นŒ์ง€ ์ผํ•˜๋˜ ์ž๋ฆฌ๋กœ ์™€์„œ
06:11
and she'd say, "Zeynep, let's go to lunch!"
119
371006
3062
"์ œ์ด๋„ต, ์ ์‹ฌ ๋จน์œผ๋Ÿฌ ๊ฐ‘์‹œ๋‹ค" ๋ผ๊ณ  ๋งํ–ˆ์—ˆ์–ด์š”.
06:14
I'd be puzzled by the weird timing.
120
374724
2167
์‹œ๊ณ„๋ฅผ ๋ณด๊ณ  ์˜์•„ํ•ดํ–ˆ์ฃ .
06:16
It's 4pm. Lunch?
121
376915
2129
์˜คํ›„ 4์‹œ์— ์ ์‹ฌ์ด๋ผ๋‹ˆ?
06:19
I was broke, so free lunch. I always went.
122
379068
3094
์ „ ๋ˆ์ด ์—†์—ˆ์œผ๋‹ˆ ์ ์‹ฌ ์‚ฌ ์ค€๋‹ค๋‹ˆ๊นŒ ํ•ญ์ƒ ๊ฐ”์ฃ .
06:22
I later realized what was happening.
123
382618
2067
๋‚˜์ค‘์— ๋ฌด์Šจ ์ด์œ ์ธ์ง€ ์•Œ๊ฒŒ ๋˜์—ˆ์–ด์š”.
06:24
My immediate managers had not confessed to their higher-ups
124
384709
4546
์ œ ์ง์† ์ƒ์‚ฌ๋Š” ๊ทธ๋…€๊ฐ€ ๊ณ ์šฉํ•œ ํ”„๋กœ๊ทธ๋ž˜๋จธ๊ฐ€
06:29
that the programmer they hired for a serious job was a teen girl
125
389279
3113
์ฒญ๋ฐ”์ง€์— ์šด๋™ํ™”๋ฅผ ์‹ ๊ณ  ์ผํ„ฐ์— ์˜ค๋Š” ์‹ญ๋Œ€ ํ•™์ƒ์ด๋ž€ ๊ฑธ
06:32
who wore jeans and sneakers to work.
126
392416
3930
๋งํ•˜์ง€ ์•Š์•˜๋˜ ๊ฑฐ์ฃ .
06:37
I was doing a good job, I just looked wrong
127
397174
2202
์ผ์€ ์ž˜ ํ•˜๊ณ  ์žˆ์—ˆ์ง€๋งŒ ์ž…์€ ์˜ท๊ณผ ๋‚˜์ด์™€ ์„ฑ๋ณ„์ด
06:39
and was the wrong age and gender.
128
399400
1699
'์ ์ ˆ์น˜ ์•Š์•˜๋˜' ๊ฑฐ์ฃ .
06:41
So hiring in a gender- and race-blind way
129
401123
3346
๊ทธ๋ž˜์„œ ๋‚˜์ด์™€ ์ธ์ข…์„ ๊ฐ€๋ฆฌ์ง€ ์•Š์€ ์ฑ„์šฉ์€
06:44
certainly sounds good to me.
130
404493
1865
์ €์—๊ฒŒ๋Š” ์ข‹์•„ ๋ณด์ž…๋‹ˆ๋‹ค.
06:47
But with these systems, it is more complicated, and here's why:
131
407031
3341
ํ•˜์ง€๋งŒ ์ด ์‹œ์Šคํ…œ์ด ์–‘๋‚ ์˜ ์นผ์ธ ์ด์œ ๋Š” ๋”ฐ๋กœ ์žˆ์ฃ .
06:50
Currently, computational systems can infer all sorts of things about you
132
410968
5791
ํ˜„์žฌ ์ด๋Ÿฐ ์‹œ์Šคํ…œ์€ ์—ฌ๋Ÿฌ๋ถ„์ด ๊ณต๊ฐœํ•˜์ง€ ์•Š์€ ๊ฐœ์ธ ์ •๋ณด๋„
06:56
from your digital crumbs,
133
416783
1872
์—ฌ๋Ÿฌ๋ถ„์ด ๋‚จ๊ธด ์ •๋ณด ๋ถ€์Šค๋Ÿฌ๊ธฐ์—์„œ ์ถ”๋ก ํ•  ์ˆ˜ ์žˆ๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
06:58
even if you have not disclosed those things.
134
418679
2333
07:01
They can infer your sexual orientation,
135
421506
2927
์—ฌ๋Ÿฌ๋ถ„์˜ ์„ฑ์  ์ทจํ–ฅ์„ ์ถ”์ธกํ•  ์ˆ˜ ์žˆ๊ณ 
07:04
your personality traits,
136
424994
1306
์„ฑ๊ฒฉ ํŠน์„ฑ๊ณผ
07:06
your political leanings.
137
426859
1373
์ •์น˜ ์„ฑํ–ฅ๊นŒ์ง€ ์ถ”์ธกํ•˜์ฃ .
07:08
They have predictive power with high levels of accuracy.
138
428830
3685
์ƒ๋‹นํžˆ ๋†’์€ ์ˆ˜์ค€์˜ ์ ์ค‘๋ฅ ๋กœ์š”.
07:13
Remember -- for things you haven't even disclosed.
139
433362
2578
๊ณต๊ฐœํ•˜์ง€๋„ ์•Š์€ ์ •๋ณด๋ฅผ ์•Œ์•„๋ƒ…๋‹ˆ๋‹ค.
07:15
This is inference.
140
435964
1591
์ถ”๋ก ํ•œ ๊ฒƒ์ด์ฃ .
07:17
I have a friend who developed such computational systems
141
437579
3261
์ œ ์นœ๊ตฌ ํ•˜๋‚˜๊ฐ€ SNS ์ž๋ฃŒ๋ฅผ ํ†ตํ•ด
07:20
to predict the likelihood of clinical or postpartum depression
142
440864
3641
์งˆ๋ณ‘์ด๋‚˜ ์‚ฐํ›„ ์šฐ์šธ์ฆ ๊ฐ€๋Šฅ์„ฑ์„ ์˜ˆ์ธกํ•˜๋Š” ์‹œ์Šคํ…œ์„ ๊ฐœ๋ฐœํ–ˆ์Šต๋‹ˆ๋‹ค.
07:24
from social media data.
143
444529
1416
07:26
The results are impressive.
144
446676
1427
๊ฒฐ๊ณผ๋Š” ์ธ์ƒ์ ์ด์—ˆ์Šต๋‹ˆ๋‹ค.
07:28
Her system can predict the likelihood of depression
145
448492
3357
๊ทธ๋…€๊ฐ€ ๋งŒ๋“  ์‹œ์Šคํ…œ์€ ์ฆ์ƒ์ด ์‹œ์ž‘๋˜๊ธฐ ๋ช‡ ๋‹ฌ ์ „์—
07:31
months before the onset of any symptoms --
146
451873
3903
์šฐ์šธ์ฆ ๋ฐœ์ƒ ๊ฐ€๋Šฅ์„ฑ์„ ์˜ˆ์ธกํ•  ์ˆ˜ ์žˆ์—ˆ์–ด์š”.
07:35
months before.
147
455800
1373
๋ช‡ ๋‹ฌ ์ „์—์š”.
07:37
No symptoms, there's prediction.
148
457197
2246
์ฆ์ƒ๋„ ์—†์ด ์˜ˆ์ธกํ•œ ๊ฒƒ์ด์ฃ .
07:39
She hopes it will be used for early intervention. Great!
149
459467
4812
๊ทธ๋…€๋Š” ์ด๊ฒŒ ์šฐ์šธ์ฆ ์˜ˆ๋ฐฉ์— ์‚ฌ์šฉ๋  ๊ฑฐ๋ผ ์ƒ๊ฐํ–ˆ์–ด์š”. ์ข‹์€ ์ผ์ด์ฃ ?
07:44
But now put this in the context of hiring.
150
464911
2040
ํ•˜์ง€๋งŒ ์ง€๊ธˆ ๊ทธ ๊ธฐ์ˆ ์€ ์ฑ„์šฉ ์‹œ์Šคํ…œ์— ์ ์šฉ๋˜์—ˆ์Šต๋‹ˆ๋‹ค.
07:48
So at this human resources managers conference,
151
468027
3046
์ €๋Š” ์ธ์‚ฌ ๋‹ด๋‹น์ž๋“ค์ด ๋ชจ์ธ ์•„๊นŒ ๊ทธ ํ•™ํšŒ์—์„œ
07:51
I approached a high-level manager in a very large company,
152
471097
4709
๋Œ€๊ธฐ์—…์˜ ๊ณ ์œ„์ง ๊ด€๋ฆฌ์ž์—๊ฒŒ ๋‹ค๊ฐ€๊ฐ€์„œ ์ด๋ ‡๊ฒŒ ๋งํ–ˆ์Šต๋‹ˆ๋‹ค.
07:55
and I said to her, "Look, what if, unbeknownst to you,
153
475830
4578
"๋‹น์‹ ์ด ๋ชจ๋ฅด๋Š” ์‚ฌ์ด์— ํ”„๋กœ๊ทธ๋žจ์ด
08:00
your system is weeding out people with high future likelihood of depression?
154
480432
6549
์šฐ์šธ์ฆ ๋ฐœ๋ณ‘ ๊ฐ€๋Šฅ์„ฑ์ด ๋†’์€ ์‚ฌ๋žŒ๋“ค์„ ๊ฐ€๋ ค๋‚ด๊ณ  ์žˆ๋‹ค๋ฉด ์–ด๋–ป๊ฒŒ ๋ ๊นŒ์š”?
08:07
They're not depressed now, just maybe in the future, more likely.
155
487761
3376
์ง€๊ธˆ์€ ์šฐ์šธ์ฆ์ด ์—†์ง€๋งŒ, ๋ฏธ๋ž˜์— ์œ„ํ—˜ ๊ฐ€๋Šฅ์„ฑ์ด ์žˆ์ฃ .
08:11
What if it's weeding out women more likely to be pregnant
156
491923
3406
์ง€๊ธˆ์€ ์ž„์‹ ํ•˜์ง€ ์•Š์•˜์ง€๋งŒ
1,2๋…„ ๋‚ด์— ์ถœ์‚ฐ ํœด๊ฐ€๋ฅผ ๋‚ผ ๋งŒํ•œ ์—ฌ์„ฑ๋“ค์„ ๋ฏธ๋ฆฌ ๊ฐ€๋ ค๋‚ด๊ณ  ์žˆ๋‹ค๋ฉด์š”?
08:15
in the next year or two but aren't pregnant now?
157
495353
2586
08:18
What if it's hiring aggressive people because that's your workplace culture?"
158
498844
5636
์ง์žฅ ๋ฌธํ™”์— ์ ํ•ฉํ•˜๋‹ค๋Š” ์ด์œ ๋กœ ๊ณต๊ฒฉ์ ์ธ ์‚ฌ๋žŒ๋งŒ์„ ๊ณ ์šฉํ•œ๋‹ค๋ฉด์š”?"
08:25
You can't tell this by looking at gender breakdowns.
159
505173
2691
์ด๊ฑด ๋‚จ๋…€ ์„ฑ๋น„๋งŒ์œผ๋กœ๋Š” ํŒ๋‹จํ•  ์ˆ˜ ์—†์–ด์š”.
08:27
Those may be balanced.
160
507888
1502
์„ฑ๋น„๋Š” ์ด๋ฏธ ๊ท ํ˜•์žกํ˜€ ์žˆ๊ฒ ์ฃ .
08:29
And since this is machine learning, not traditional coding,
161
509414
3557
๊ทธ๋ฆฌ๊ณ  ์ „ํ†ต์ ์ธ ํ”„๋กœ๊ทธ๋ž˜๋ฐ ๋ฐฉ์‹์ด ์•„๋‹Œ ๊ธฐ๊ณ„ ํ•™์Šต์ด๊ธฐ ๋•Œ๋ฌธ์—
08:32
there is no variable there labeled "higher risk of depression,"
162
512995
4907
'์šฐ์šธ์ฆ ์œ„ํ—˜์„ฑ ๋†’์Œ'์ด๋ผ๋Š” ๋ณ€์ˆ˜๋ช…์€ ์กด์žฌํ•˜์ง€ ์•Š์•„์š”.
08:37
"higher risk of pregnancy,"
163
517926
1833
'์ž„์‹  ๊ฐ€๋Šฅ์„ฑ ๋†’์Œ'
08:39
"aggressive guy scale."
164
519783
1734
'๋‚จ์„ฑ ๊ณต๊ฒฉ์„ฑ ์ฒ™๋„'๋„ ์—†์ฃ .
08:41
Not only do you not know what your system is selecting on,
165
521995
3679
์‹œ์Šคํ…œ์ด ์–ด๋–ค ๊ธฐ์ค€์œผ๋กœ ์„ ํƒํ•˜๋Š”์ง€ ๋ชจ๋ฅด๋Š” ๊ฒƒ์€ ๋ฌผ๋ก 
08:45
you don't even know where to begin to look.
166
525698
2323
์–ด๋””๋ถ€ํ„ฐ ๋ด์•ผ ํ• ์ง€์กฐ์ฐจ ๋ชจ๋ฅด๋Š” ๊ฑฐ์ฃ .
08:48
It's a black box.
167
528045
1246
๋ธ”๋ž™๋ฐ•์Šค์ž…๋‹ˆ๋‹ค.
08:49
It has predictive power, but you don't understand it.
168
529315
2807
๊ทธ ์˜ˆ์ธก ๋Šฅ๋ ฅ์„ ์šฐ๋ฆฌ๋Š” ์•Œ์ง€ ๋ชปํ•˜์ฃ .
08:52
"What safeguards," I asked, "do you have
169
532486
2369
๊ทธ๋ž˜์„œ ์ œ๊ฐ€ ๋ฌผ์—ˆ์ฃ .
08:54
to make sure that your black box isn't doing something shady?"
170
534879
3673
"๋ธ”๋ž™๋ฐ•์Šค๊ฐ€ ๋ญ”๊ฐ€ ์ด์ƒํ•œ ์ง“์„ ๋ชปํ•˜๋„๋ก ์–ด๋–ค ์•ˆ์ „ ์žฅ์น˜๋ฅผ ๋งˆ๋ จํ•˜์‹œ๊ฒ ์–ด์š”?"
09:00
She looked at me as if I had just stepped on 10 puppy tails.
171
540863
3878
์ œ๊ฐ€ ์—„์ฒญ๋‚œ ์‚ฌ๊ฑด์„ ์ผ์œผํ‚จ ๊ฒƒ์ฒ˜๋Ÿผ ์ณ๋‹ค๋ณด๋”๋ผ๊ณ ์š”.
09:04
(Laughter)
172
544765
1248
(์›ƒ์Œ)
09:06
She stared at me and she said,
173
546037
2041
์ €๋ฅผ ๋นคํžˆ ์ณ๋‹ค๋ณด๊ณค ์ด๋ ‡๊ฒŒ ๋งํ•˜๋”๊ตฐ์š”.
09:08
"I don't want to hear another word about this."
174
548556
4333
"์ด๊ฒƒ์— ๋Œ€ํ•ด์„œ๋Š” ๋” ์ด์ƒ ๋“ฃ๊ณ  ์‹ถ์ง€ ์•Š๋„ค์š”."
09:13
And she turned around and walked away.
175
553458
2034
๊ทธ๋ฆฌ๊ณ  ๋Œ์•„์„œ์„œ ๊ฐ€ ๋ฒ„๋ ธ์–ด์š”.
09:16
Mind you -- she wasn't rude.
176
556064
1486
๊ทธ๋ ‡๋‹ค๊ณ  ๋ฌด๋ก€ํ–ˆ๋˜ ๊ฑด ์•„๋‹ˆ์—์š”.
09:17
It was clearly: what I don't know isn't my problem, go away, death stare.
177
557574
6308
์ž๊ธฐ๊ฐ€ ๋ชจ๋ฅด๋Š” ๊ฑด ์ž๊ธฐ ๋ฌธ์ œ๊ฐ€ ์•„๋‹ˆ๋‹ˆ ์‹ ๊ฒฝ์“ฐ๊ฒŒ ํ•˜์ง€ ๋ง๋ผ๋Š” ๊ฒฝ๊ณ ์˜€์ฃ .
09:23
(Laughter)
178
563906
1246
(์›ƒ์Œ)
09:25
Look, such a system may even be less biased
179
565862
3839
์‹œ์Šคํ…œ์€ ์ธ๊ฐ„ ๊ด€๋ฆฌ์ž์™€ ๋‹ฌ๋ฆฌ ํŽธ๊ฒฌ์ด ์—†์„ ์ˆ˜๋„ ์žˆ์–ด์š”.
09:29
than human managers in some ways.
180
569725
2103
09:31
And it could make monetary sense.
181
571852
2146
์ธ์‚ฌ ์—…๋ฌด์— ๋ˆ๋„ ๋œ ์จ๋„ ๋˜๊ฒ ์ฃ .
09:34
But it could also lead
182
574573
1650
ํ•˜์ง€๋งŒ ์ด ์‹œ์Šคํ…œ์€ ์ง€์†์ ์ด๊ณ  ์•”๋ฌต์ ์œผ๋กœ
09:36
to a steady but stealthy shutting out of the job market
183
576247
4748
์šฐ์šธ์ฆ ์œ„ํ—˜์ด ๋†’์€ ์‚ฌ๋žŒ๋“ค์˜ ๊ณ ์šฉ ๊ธฐํšŒ๋ฅผ ๋ฐ•ํƒˆํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
09:41
of people with higher risk of depression.
184
581019
2293
09:43
Is this the kind of society we want to build,
185
583753
2596
์ดํ•ดํ•˜์ง€๋„ ๋ชปํ•˜๋Š” ๊ธฐ๊ณ„์—๊ฒŒ ์˜์‚ฌ๊ฒฐ์ •์„ ๋งก๊ธฐ๋ฉด์„œ
09:46
without even knowing we've done this,
186
586373
2285
09:48
because we turned decision-making to machines we don't totally understand?
187
588682
3964
์šฐ๋ฆฌ ๋ชจ๋ฅด๊ฒŒ ๊ธฐํšŒ๋ฅผ ๋ฐ•ํƒˆํ•˜๋Š” ๊ฒŒ ๋ฐ”๋žŒ์งํ•œ ์‚ฌํšŒ์ธ๊ฐ€์š”?
09:53
Another problem is this:
188
593265
1458
๋˜ ๋‹ค๋ฅธ ๋ฌธ์ œ๋„ ์žˆ์Šต๋‹ˆ๋‹ค.
09:55
these systems are often trained on data generated by our actions,
189
595314
4452
์ด ์‹œ์Šคํ…œ์€ ์šฐ๋ฆฌ ์ธ๊ฐ„์˜ ํ–‰๋™๋ฐฉ์‹์ด ๋งŒ๋“ค์–ด ๋‚ธ ์ •๋ณด๋“ค๋กœ ํ•™์Šต์„ ํ•ฉ๋‹ˆ๋‹ค.
09:59
human imprints.
190
599790
1816
์ธ๊ฐ„์ด ๋‚จ๊ธด ํ”์ ๋“ค์ด์ฃ .
10:02
Well, they could just be reflecting our biases,
191
602188
3808
๊ทธ๋Ÿฌ๋ฉด ์šฐ๋ฆฌ์˜ ํŽธ๊ฒฌ์„ ๊ทธ๋Œ€๋กœ ๋ฐ˜์˜ํ•˜๊ฒŒ ๋˜๊ณ 
10:06
and these systems could be picking up on our biases
192
606020
3593
์‹œ์Šคํ…œ์€ ๊ทธ ํŽธ๊ฒฌ์„ ์ตํžˆ๊ณ  ํ™•๋Œ€์‹œ์ผœ ์šฐ๋ฆฌ์—๊ฒŒ ๊ฒฐ๊ณผ๋กœ ๋ณด์—ฌ์ฃผ๊ฒŒ ๋ฉ๋‹ˆ๋‹ค.
10:09
and amplifying them
193
609637
1313
10:10
and showing them back to us,
194
610974
1418
10:12
while we're telling ourselves,
195
612416
1462
์šฐ๋ฆฌ๋Š” ๊ทธ๊ฑธ ํ•ฉ๋ฆฌํ™”ํ•˜์ฃ .
10:13
"We're just doing objective, neutral computation."
196
613902
3117
"์ง€๊ธˆ ๊ฐ๊ด€์ ์ด๊ณ  ๊ณต์ •ํ•œ ๊ณ„์‚ฐ ๊ฒฐ๊ณผ๋ฅผ ๋ฝ‘๋Š” ์ค‘์ด์•ผ~" ๋ผ๋ฉด์„œ์š”.
10:18
Researchers found that on Google,
197
618314
2677
์—ฐ๊ตฌ ๊ฒฐ๊ณผ์— ๋”ฐ๋ฅด๋ฉด ๊ตฌ๊ธ€ ๊ฒ€์ƒ‰์˜ ๊ฒฝ์šฐ์—๋Š”
10:22
women are less likely than men to be shown job ads for high-paying jobs.
198
622134
5313
์—ฌ์„ฑ์€ ๋‚จ์„ฑ๋ณด๋‹ค ๊ณ ์†Œ๋“ ๊ตฌ์ธ ๊ด‘๊ณ ์— ๋œ ๋…ธ์ถœ๋œ๋‹ค๊ณ  ํ•ฉ๋‹ˆ๋‹ค.
10:28
And searching for African-American names
199
628463
2530
๊ทธ๋ฆฌ๊ณ  ํ‘์ธ๊ณ„ ์ด๋ฆ„์œผ๋กœ ๊ฒ€์ƒ‰ํ•ด ๋ณด๋ฉด
10:31
is more likely to bring up ads suggesting criminal history,
200
631017
4706
๋ฒ”์ฃ„ ์ „๊ณผ๋ฅผ ์‹œ์‚ฌํ•˜๋Š” ๊ด‘๊ณ ๊ฐ€ ๋” ๋งŽ์ด ๋‚˜์˜จ๋‹ค๊ณ  ํ•ด์š”.
10:35
even when there is none.
201
635747
1567
์ „๊ณผ๊ฐ€ ์—†๋Š”๋ฐ๋„ ๋ง์ด์ฃ .
10:38
Such hidden biases and black-box algorithms
202
638693
3549
์ด๋Ÿฐ ์•”๋ฌต์  ํŽธ๊ฒฌ๊ณผ ๋ธ”๋ž™๋ฐ•์Šค ์† ์•Œ๊ณ ๋ฆฌ์ฆ˜์€
10:42
that researchers uncover sometimes but sometimes we don't know,
203
642266
3973
์—ฐ๊ตฌ์ž๋“ค์ด ๋ชจ๋‘ ๋ฐํ˜€๋‚ผ ์ˆ˜ ์—†์–ด ์šฐ๋ฆฌ๋„ ๋ชจ๋ฅด๋Š” ์‚ฌ์ด์—
10:46
can have life-altering consequences.
204
646263
2661
๊ฐœ์ธ์˜ ์‚ถ์— ์˜ํ–ฅ์„ ๋ฏธ์น  ์ˆ˜ ์žˆ์ฃ .
10:49
In Wisconsin, a defendant was sentenced to six years in prison
205
649958
4159
์œ„์Šค์ฝ˜์‹ ์—์„œ๋Š” ์–ด๋Š ํ”ผ๊ณ ์ธ์ด ๊ฒฝ์ฐฐ ์ˆ˜์‚ฌ๋ฅผ ๊ฑฐ๋ถ€ํ•œ ์ฃ„๋กœ
10:54
for evading the police.
206
654141
1355
6๋…„ํ˜•์„ ์„ ๊ณ ๋ฐ›์•˜์Šต๋‹ˆ๋‹ค.
10:56
You may not know this,
207
656824
1186
์•Œ๊ณ  ๊ณ„์‹ค์ง€ ๋ชจ๋ฅด๊ฒ ์ง€๋งŒ
10:58
but algorithms are increasingly used in parole and sentencing decisions.
208
658034
3998
์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๊ฐ€์„๋ฐฉ๊ณผ ํ˜•๋Ÿ‰ ํŒ๊ฒฐ์— ์ ์  ๋” ๋งŽ์ด ์‚ฌ์šฉ๋˜๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
11:02
He wanted to know: How is this score calculated?
209
662056
2955
๊ทธ๋Š” ๋Œ€์ฒด ๊ทธ๋Ÿฐ ๊ฒฐ์ •์ด ์–ด๋–ป๊ฒŒ ๋‚˜์˜ค๋Š”์ง€ ์•Œ๊ณ  ์‹ถ์—ˆ์ฃ .
11:05
It's a commercial black box.
210
665795
1665
๊ทธ๊ฑด ์ƒ์—…์ ์ธ ๋ธ”๋ž™๋ฐ•์Šค์˜€๊ณ 
11:07
The company refused to have its algorithm be challenged in open court.
211
667484
4205
๊ฐœ๋ฐœ์‚ฌ๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๋ฒ•์ •์—์„œ ์‹ฌํŒ๋ฐ›๋Š” ๊ฒƒ์„ ๊ฑฐ๋ถ€ํ–ˆ์ฃ .
11:12
But ProPublica, an investigative nonprofit, audited that very algorithm
212
672396
5532
ํ•˜์ง€๋งŒ ProPublica๋ผ๋Š” ๋น„์˜๋ฆฌ ์ˆ˜์‚ฌ ๊ธฐ๊ตฌ๊ฐ€
๊ฐ์ข… ๋ฐ์ดํ„ฐ๋กœ ๊ทธ ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ๊ฒ€์‚ฌํ–ˆ๊ณ 
11:17
with what public data they could find,
213
677952
2016
11:19
and found that its outcomes were biased
214
679992
2316
๋ฐํ˜€์ง„ ์‚ฌ์‹ค์€ ๊ทธ ์•Œ๊ณ ๋ฆฌ์ฆ˜ ์˜ˆ์ธก ์„ฑ๋Šฅ์ด ์šฐ์—ฐ๊ณผ ๋ณ„๋‹ค๋ฅด์ง€ ์•Š์€ ์ˆ˜์ค€์ด์—ˆ์œผ๋ฉฐ
11:22
and its predictive power was dismal, barely better than chance,
215
682332
3629
11:25
and it was wrongly labeling black defendants as future criminals
216
685985
4416
ํ‘์ธ ํ”ผ๊ณ ๋ฅผ ์ž ์žฌ์  ๋ฒ”์ฃ„์ž๋กœ ๋‚™์ธ์ฐ๋Š” ํ™•๋ฅ ์ด
11:30
at twice the rate of white defendants.
217
690425
3895
๋ฐฑ์ธ ํ”ผ๊ณ ์— ๋น„ํ•ด ๋‘ ๋ฐฐ๋‚˜ ๋†’๋‹ค๋Š” ๊ฒƒ์ด์—ˆ์Šต๋‹ˆ๋‹ค.
11:35
So, consider this case:
218
695891
1564
๋‹ค๋ฅธ ๊ฒฝ์šฐ๋„ ์‚ดํŽด ๋ณผ๊นŒ์š”.
11:38
This woman was late picking up her godsister
219
698103
3852
์˜ค๋ฅธ์ชฝ ์—ฌ์„ฑ์€ ํ”Œ๋กœ๋ฆฌ๋‹ค ๋ธŒ๋กœ์›Œ๋“œ ์นด์šดํ‹ฐ์˜ ํ•™๊ต์— ๋‹ค๋‹ˆ๋Š”
11:41
from a school in Broward County, Florida,
220
701979
2075
๊ตํšŒ ๋™์ƒ์„ ๋ฐ๋ฆฌ๋Ÿฌ ๊ฐˆ ์•ฝ์†์— ๋Šฆ๋Š” ๋ฐ”๋žŒ์—
11:44
running down the street with a friend of hers.
221
704757
2356
์นœ๊ตฌ์™€ ํ•จ๊ป˜ ๋›ฐ์–ด๊ฐ€๊ณ  ์žˆ์—ˆ์ฃ .
11:47
They spotted an unlocked kid's bike and a scooter on a porch
222
707137
4099
๊ทธ๋Ÿฌ๋‹ค ์–ด๋Š ์ง‘ ํ˜„๊ด€์— ์žˆ๋˜ ์ž์ „๊ฑฐ์™€ ์Šค์ฟ ํ„ฐ๋ฅผ ๋ฐœ๊ฒฌํ•˜๊ณ ๋Š”
11:51
and foolishly jumped on it.
223
711260
1632
์–ด๋ฆฌ์„๊ฒŒ๋„ ๊ทธ๊ฑธ ์ง‘์–ด ํƒ”์–ด์š”.
11:52
As they were speeding off, a woman came out and said,
224
712916
2599
๊ทธ๋“ค์ด ์†๋„๋ฅผ ๋‚ด๋ฉฐ ๋‹ฌ์•„๋‚  ๋•Œ ํ•œ ์—ฌ์ž๊ฐ€ ๋›ฐ์–ด๋‚˜์™€ ์†Œ๋ฆฌ์ณค์ฃ .
11:55
"Hey! That's my kid's bike!"
225
715539
2205
"์šฐ๋ฆฌ ์•  ์ž์ „๊ฑฐ๋กœ ๋ญ ํ•˜๋Š” ๊ฑฐ์•ผ!"
11:57
They dropped it, they walked away, but they were arrested.
226
717768
3294
๊ทธ๋“ค์€ ์ž์ „๊ฑฐ๋ฅผ ๋ฒ„๋ฆฌ๊ณ  ๊ฑธ์–ด์„œ ๋‹ฌ์•„๋‚ฌ์ง€๋งŒ ์ฒดํฌ๋˜์—ˆ์Šต๋‹ˆ๋‹ค.
12:01
She was wrong, she was foolish, but she was also just 18.
227
721086
3637
๊ทธ๋“ค์ด ์ž˜๋ชปํ–ˆ๊ณ , ์–ด๋ฆฌ์„๊ธด ํ–ˆ์–ด์š”. ๊ทธ๋Ÿฐ๋ฐ ๊ฒจ์šฐ ์—ด์—ฌ๋Ÿ ์‚ด์ด์—ˆ์ฃ .
12:04
She had a couple of juvenile misdemeanors.
228
724747
2544
๊ทธ๋…€๋Š” ์ฒญ์†Œ๋…„ ๋ฒ”์ฃ„ ์ „๊ณผ๊ฐ€ ๋ช‡ ๊ฑด ์žˆ์—ˆ์ฃ .
12:07
Meanwhile, that man had been arrested for shoplifting in Home Depot --
229
727808
5185
ํ•œํŽธ, ์ด ๋‚จ์„ฑ์€ ๋งˆํŠธ์—์„œ 85๋‹ฌ๋Ÿฌ์–ด์น˜ ์ข€๋„๋‘‘์งˆ์„ ํ•˜๋‹ค๊ฐ€ ์ฒดํฌ๋˜์—ˆ์Šต๋‹ˆ๋‹ค.
12:13
85 dollars' worth of stuff, a similar petty crime.
230
733017
2924
๋น„์Šทํ•œ ๋ฒ”์ฃ„์ฃ .
12:16
But he had two prior armed robbery convictions.
231
736766
4559
ํ•˜์ง€๋งŒ ๊ทธ์—๊ฒŒ๋Š” ๋‘ ๋ฒˆ์˜ ๋ฌด์žฅ๊ฐ•๋„ ์ „๊ณผ๊ฐ€ ์žˆ์—ˆ์–ด์š”.
12:21
But the algorithm scored her as high risk, and not him.
232
741955
3482
๊ทธ๋Ÿฐ๋ฐ ์•Œ๊ณ ๋ฆฌ์ฆ˜์€ ๋‚จ์ž๊ฐ€ ์•„๋‹ˆ๋ผ ์—ฌ์ž๋ฅผ ๊ณ ์œ„ํ—˜๊ตฐ์œผ๋กœ ๋ถ„๋ฅ˜ํ–ˆ์ฃ .
12:26
Two years later, ProPublica found that she had not reoffended.
233
746746
3874
2๋…„ ๋’ค์—
ProPublica๊ฐ€ ์กฐ์‚ฌํ•ด๋ณด๋‹ˆ ์—ฌ์„ฑ์€ ์žฌ๋ฒ”ํ•˜์ง€ ์•Š์•˜์Šต๋‹ˆ๋‹ค.
12:30
It was just hard to get a job for her with her record.
234
750644
2550
์ „๊ณผ๊ฐ€ ์žˆ์—ˆ๊ธฐ์— ์ทจ์ง์ด ์–ด๋ ต๊ธด ํ–ˆ์ฃ .
12:33
He, on the other hand, did reoffend
235
753218
2076
๋ฐ˜๋ฉด ๋‚จ์ž๋Š” ์žฌ๋ฒ”ํ•˜์˜€๊ณ 
12:35
and is now serving an eight-year prison term for a later crime.
236
755318
3836
๊ทธ ์ดํ›„ ์ €์ง€๋ฅธ ๋ฒ”์ฃ„๋กœ ํ˜„์žฌ 8๋…„์„ ๋ณต์—ญ ์ค‘์ž…๋‹ˆ๋‹ค.
12:40
Clearly, we need to audit our black boxes
237
760088
3369
์šฐ๋ฆฌ๋Š” ๋ธ”๋ž™๋ฐ•์Šค๋ฅผ ์ž˜ ๊ฒ€์ˆ˜ํ•ด์„œ
12:43
and not have them have this kind of unchecked power.
238
763481
2615
์—‰๋šฑํ•œ ๊ถŒํ•œ์„ ๊ฐ–์ง€ ์•Š๋„๋ก ๋ถ„๋ช…ํžˆ ํ•ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.
12:46
(Applause)
239
766120
2879
(๋ฐ•์ˆ˜)
12:50
Audits are great and important, but they don't solve all our problems.
240
770087
4242
๊ฒ€์ˆ˜๋Š” ์ค‘์š”ํ•˜๊ณ  ๋˜ ์œ ํšจํ•˜์ง€๋งŒ ๋ชจ๋“  ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜์ง„ ๋ชปํ•ฉ๋‹ˆ๋‹ค.
12:54
Take Facebook's powerful news feed algorithm --
241
774353
2748
ํŽ˜์ด์Šค๋ถ์˜ ๊ฐ•๋ ฅํ•œ ๋‰ด์Šค ํ”ผ๋“œ ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์‚ดํŽด ๋ณด์ฃ .
12:57
you know, the one that ranks everything and decides what to show you
242
777125
4843
๋ชจ๋“  ๊ฒƒ์„ ์ˆœ์œ„๋Œ€๋กœ ๋‚˜์—ดํ•˜๊ณ 
ํŒ”๋กœ์šฐํ•˜๋Š” ๋ชจ๋“  ์นœ๊ตฌ์™€ ํŽ˜์ด์ง€์—์„œ ๋ฌด์—‡์„ ๋ณด์—ฌ ์ค„์ง€๋ฅผ ๊ฒฐ์ •ํ•˜์ฃ .
13:01
from all the friends and pages you follow.
243
781992
2284
13:04
Should you be shown another baby picture?
244
784898
2275
์•„๊ธฐ ์‚ฌ์ง„์„ ๋˜ ๋ด์•ผ ํ• ๊นŒ์š”?
13:07
(Laughter)
245
787197
1196
(์›ƒ์Œ)
13:08
A sullen note from an acquaintance?
246
788417
2596
์•„๋‹ˆ๋ฉด ์ง€์ธ์˜ ์‚์ง„ ๋“ฏํ•œ ๊ธ€?
13:11
An important but difficult news item?
247
791449
1856
์–ด๋ ต์ง€๋งŒ ์ค‘์š”ํ•œ ๋‰ด์Šค ๊ธฐ์‚ฌ?
13:13
There's no right answer.
248
793329
1482
์ •๋‹ต์€ ์—†์Šต๋‹ˆ๋‹ค.
13:14
Facebook optimizes for engagement on the site:
249
794835
2659
์•Œ๊ณ ๋ฆฌ์ฆ˜์€ ํŽ˜์ด์Šค๋ถ ํ™œ๋™์— ์ตœ์ ํ™”๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.
13:17
likes, shares, comments.
250
797518
1415
์ข‹์•„์š”, ๊ณต์œ , ๋Œ“๊ธ€์ด์ฃ .
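A feed "optimized for engagement" can be sketched as a simple weighted score. The code below is a hypothetical toy, not Facebook's actual ranking formula; the weights and posts are invented. It only shows how an objective built from likes, shares and comments buries an important story that people hesitate to interact with.

```python
# Hypothetical sketch of engagement-based ranking; weights and posts are invented.
POSTS = [
    {"title": "ALS Ice Bucket Challenge video", "likes": 900, "shares": 400, "comments": 250},
    {"title": "Protest coverage from Ferguson", "likes": 30, "shares": 80, "comments": 12},
    {"title": "Another baby picture", "likes": 300, "shares": 20, "comments": 60},
]

def engagement_score(post: dict) -> float:
    # Toy objective: maximize interactions; there is no term for importance.
    return 1.0 * post["likes"] + 2.0 * post["shares"] + 1.5 * post["comments"]

for post in sorted(POSTS, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post['title']}")
```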
13:20
In August of 2014,
251
800168
2696
2014๋…„ 8์›”์—
13:22
protests broke out in Ferguson, Missouri,
252
802888
2662
๋ฐฑ์ธ ๊ฒฝ์ฐฐ๊ด€์ด ๋ฒ”ํ–‰์ด ๋ถˆํ™•์‹คํ•œ ์ƒํ™ฉ์—์„œ
13:25
after the killing of an African-American teenager by a white police officer,
253
805574
4417
์‹ญ๋Œ€ ํ‘์ธ์—๊ฒŒ ๋ฐœํฌํ•˜์—ฌ ์‚ดํ•ดํ•œ ์‚ฌ๊ฑด ํ›„, ๋ฏธ์ฃผ๋ฆฌ ์ฃผ ํผ๊ฑฐ์Šจ์—์„œ
13:30
under murky circumstances.
254
810015
1570
์‹œ์œ„๊ฐ€ ์ผ์–ด๋‚ฌ์Šต๋‹ˆ๋‹ค.
13:31
The news of the protests was all over
255
811974
2007
์‹œ์œ„ ๋‰ด์Šค๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜ ํ•„ํ„ฐ๊ฐ€ ์—†๋Š”
13:34
my algorithmically unfiltered Twitter feed,
256
814005
2685
ํŠธ์œ„ํ„ฐ ๊ธ€๋ชฉ๋ก์—๋Š” ๋‚˜ํƒ€๋‚ฌ์ง€๋งŒ
13:36
but nowhere on my Facebook.
257
816714
1950
ํŽ˜์ด์Šค๋ถ์—๋Š” ํ”์ ์ด ์—†์—ˆ์Šต๋‹ˆ๋‹ค.
13:39
Was it my Facebook friends?
258
819182
1734
ํŽ˜์ด์Šค๋ถ ์นœ๊ตฌ ๋•Œ๋ฌธ์ธ๊ฐ€ ์ƒ๊ฐํ•˜๊ณ  ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ํ•ด์ œํ•ด ๋ณด์•˜์Šต๋‹ˆ๋‹ค.
13:40
I disabled Facebook's algorithm,
259
820940
2032
13:43
which is hard because Facebook keeps wanting to make you
260
823472
2848
ํŽ˜์ด์Šค๋ถ์€ ๊ณ„์† ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ์ถ”์ฒœํ•˜๋Š” ๊ธ€์„ ๋ณด์—ฌ์ฃผ๋ ค๊ณ  ํ•ด์„œ ์ข€ ๊นŒ๋‹ค๋กœ์› ์ฃ .
13:46
come under the algorithm's control,
261
826344
2036
13:48
and saw that my friends were talking about it.
262
828404
2238
์นœ๊ตฌ๋“ค์ด ์‹œ์œ„ ์ด์•ผ๊ธฐ๋ฅผ ํ•˜์ง€ ์•Š๋˜ ๊ฒŒ ์•„๋‹ˆ์—ˆ์Šต๋‹ˆ๋‹ค.
13:50
It's just that the algorithm wasn't showing it to me.
263
830666
2509
์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ์ „๋‹ฌ์„ ๋ง‰๊ณ  ์žˆ์—ˆ๋˜ ๊ฒ๋‹ˆ๋‹ค.
13:53
I researched this and found this was a widespread problem.
264
833199
3042
์ „ ์ด๊ฑธ ์กฐ์‚ฌํ•˜๊ณ ๋Š” ๊ด‘๋ฒ”์œ„ํ•œ ๋ฌธ์ œ์ž„์„ ์•Œ์•˜์Šต๋‹ˆ๋‹ค.
13:56
The story of Ferguson wasn't algorithm-friendly.
265
836265
3813
ํผ๊ฑฐ์Šจ ์‚ฌ๊ฑด์€ ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ์„ ํ˜ธํ•  ๋งŒํ•œ ๊ฒŒ ์•„๋‹ˆ์ฃ .
14:00
It's not "likable."
266
840102
1171
์ข‹์•„์š”๊ฐ€ ์ ์Šต๋‹ˆ๋‹ค.
14:01
Who's going to click on "like?"
267
841297
1552
๋Œ“๊ธ€ ๋‚จ๊ธฐ๊ธฐ๋„ ๊ป„๋„๋Ÿฌ์šด๋ฐ
14:03
It's not even easy to comment on.
268
843500
2206
๋ˆ„๊ฐ€ ์ข‹์•„์š”๋ฅผ ๋ˆ„๋ฅด๊ฒ ์–ด์š”?
14:05
Without likes and comments,
269
845730
1371
์ข‹์•„์š”์™€ ๋Œ“๊ธ€์ด ์ ์–ด์„œ
14:07
the algorithm was likely showing it to even fewer people,
270
847125
3292
์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๋” ๋ณด์—ฌ์ฃผ๊ณ  ์‹ถ์ง€ ์•Š์•„ํ–ˆ๊ณ 
14:10
so we didn't get to see this.
271
850441
1542
๊ฒฐ๊ตญ ์šฐ๋ฆฌ๊ฐ€ ๋ณด์ง€ ๋ชปํ•œ ๊ฒ๋‹ˆ๋‹ค.
14:12
Instead, that week,
272
852946
1228
๋Œ€์‹  ๊ทธ ์ฃผ์— ํŽ˜์ด์Šค๋ถ ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ์„ ํ˜ธํ•œ ๊ฒƒ์€
14:14
Facebook's algorithm highlighted this,
273
854198
2298
14:16
which is the ALS Ice Bucket Challenge.
274
856520
2226
๋ฃจ๊ฒŒ๋ฆญ ๋ณ‘ ๋ชจ๊ธˆ์„ ์œ„ํ•œ ์•„์ด์Šค ๋ฒ„ํ‚ท ์ฑŒ๋ฆฐ์ง€์˜€์Šต๋‹ˆ๋‹ค.
14:18
Worthy cause; dump ice water, donate to charity, fine.
275
858770
3742
์–ผ์Œ๋ฌผ ์„ธ๋ก€๋ฅผ ๋งž๊ณ  ๊ธฐ๋ถ€๋ฅผ ํ•œ๋‹ค๋Š” ์ทจ์ง€ ์ž์ฒด๋Š” ๊ดœ์ฐฎ์ฃ .
14:22
But it was super algorithm-friendly.
276
862536
1904
ํ•˜์ง€๋งŒ ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๊ณผํ•˜๊ฒŒ ์ข‹์•„ํ•  ๋งŒํ•œ ๊ฒƒ์ด์—ˆ์Šต๋‹ˆ๋‹ค.
14:25
The machine made this decision for us.
277
865219
2613
๊ธฐ๊ณ„๊ฐ€ ๊ฒฐ์ •์„ ๋‚ด๋ ค ๋ฒ„๋ฆฐ ๊ฑฐ์ฃ .
14:27
A very important but difficult conversation
278
867856
3497
ํŽ˜์ด์Šค๋ถ์ด ์œ ์ผํ•œ ์†Œํ†ต ์ฐฝ๊ตฌ์˜€๋‹ค๋ฉด
14:31
might have been smothered,
279
871377
1555
์ค‘์š”ํ•˜์ง€๋งŒ ๊นŒ๋‹ค๋กœ์šด ์Ÿ์ ์ด ๋ฌปํž ๋ป”ํ–ˆ์Šต๋‹ˆ๋‹ค.
14:32
had Facebook been the only channel.
280
872956
2696
14:36
Now, finally, these systems can also be wrong
281
876117
3797
๋งˆ์ง€๋ง‰์œผ๋กœ, ์ด ์‹œ์Šคํ…œ์€ ์ธ๊ฐ„๊ณผ๋Š” ๋‹ค๋ฅธ ๋ฐฉ์‹์œผ๋กœ
14:39
in ways that don't resemble human systems.
282
879938
2736
์˜ค๋ฅ˜๋ฅผ ๋ฒ”ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
14:42
Do you guys remember Watson, IBM's machine-intelligence system
283
882698
2922
ํ€ด์ฆˆ ํ”„๋กœ๊ทธ๋žจ์—์„œ ์ธ๊ฐ„ ์ฐธ๊ฐ€์ž๋ฅผ ๋ˆ„๋ฅด๊ณ  ์šฐ์Šน์„ ์ฐจ์ง€ํ•œ
14:45
that wiped the floor with human contestants on Jeopardy?
284
885644
3128
IBM์˜ ์ธ๊ณต์ง€๋Šฅ ์™“์Šจ์„ ๊ธฐ์–ตํ•˜์‹œ๋‚˜์š”?
14:49
It was a great player.
285
889131
1428
๋Œ€๋‹จํ•œ ์‹ค๋ ฅ์ด์—ˆ์ฃ .
14:50
But then, for Final Jeopardy, Watson was asked this question:
286
890583
3569
ํ•˜์ง€๋งŒ ์™“์Šจ์ด ๋งž์ดํ•œ ๋งˆ์ง€๋ง‰ ๋ฌธ์ œ๋ฅผ ๋ณด์‹œ๋ฉด
14:54
"Its largest airport is named for a World War II hero,
287
894659
2932
"์ด ๋„์‹œ ์ตœ๋Œ€ ๊ณตํ•ญ ์ด๋ฆ„์€ 2์ฐจ ๋Œ€์ „ ์˜์›…์„,
14:57
its second-largest for a World War II battle."
288
897615
2252
๋‘ ๋ฒˆ์งธ๋กœ ํฐ ๊ณตํ•ญ์€ 2์ฐจ ๋Œ€์ „ ์ „ํˆฌ๋ฅผ ๋”ฐ์„œ ์ง€์–ด์กŒ๋‹ค.
14:59
(Hums Final Jeopardy music)
289
899891
1378
(Jeopardy ๋Œ€๊ธฐ ์Œ์•…)
15:01
Chicago.
290
901582
1182
๋‹ต์€ ์‹œ์นด๊ณ ์ฃ .
15:02
The two humans got it right.
291
902788
1370
์ธ๊ฐ„ ์ฐธ๊ฐ€์ž๋Š” ๋ชจ๋‘ ๋งž์ท„์Šต๋‹ˆ๋‹ค.
15:04
Watson, on the other hand, answered "Toronto" --
292
904697
4348
ํ•˜์ง€๋งŒ ์™“์Šจ์€ 'ํ† ๋ก ํ† '๋ผ๊ณ  ์ผ์ฃ .
15:09
for a US city category!
293
909069
1818
์ฃผ์ œ๊ฐ€ ๋ฏธ๊ตญ ๋„์‹œ์˜€๋Š”๋ฐ๋„ ๋ง์ด์ฃ .
15:11
The impressive system also made an error
294
911596
2901
๋˜ํ•œ ์ด ๋›ฐ์–ด๋‚œ ์‹œ์Šคํ…œ์€
15:14
that a human would never make, a second-grader wouldn't make.
295
914521
3651
์ดˆ๋“ฑํ•™๊ต 2ํ•™๋…„์ƒ๋„ ํ•˜์ง€ ์•Š์„ ์‹ค์ˆ˜๋ฅผ ์ €์งˆ๋ €์ฃ .
15:18
Our machine intelligence can fail
296
918823
3109
์ธ๊ณต์ง€๋Šฅ์€ ์ธ๊ฐ„์˜ ์˜ค๋ฅ˜์™€๋Š” ๋‹ค๋ฅธ ๋ฐฉ์‹์œผ๋กœ ์˜ค์ž‘๋™ํ•  ์ˆ˜ ์žˆ๊ณ 
15:21
in ways that don't fit error patterns of humans,
297
921956
3100
15:25
in ways we won't expect and be prepared for.
298
925080
2950
๊ทธ๋ž˜์„œ ์šฐ๋ฆฌ๊ฐ€ ์˜ˆ์ƒํ•˜๊ฑฐ๋‚˜ ๋Œ€๋น„ํ•˜๊ธฐ ์–ด๋ ต์Šต๋‹ˆ๋‹ค.
15:28
It'd be lousy not to get a job one is qualified for,
299
928054
3638
๋Šฅ๋ ฅ์— ๋งž๋Š” ์ง์—…์„ ๊ฐ–์ง€ ๋ชปํ•œ๋‹ค๋ฉด ๊ธฐ๋ถ„ ๋‚˜์  ๊ฑฐ์˜ˆ์š”.
15:31
but it would triple suck if it was because of stack overflow
300
931716
3727
๊ทธ๋Ÿฐ๋ฐ ๊ทธ ์ด์œ ๊ฐ€ ํ”„๋กœ๊ทธ๋žจ ํ•จ์ˆ˜์˜ ๊ณผ๋ถ€ํ•˜ ์˜ค๋ฅ˜ ๋•Œ๋ฌธ์ด๋ผ๋ฉด
15:35
in some subroutine.
301
935467
1432
๋ช‡ ๋ฐฐ๋Š” ๋” ๊ธฐ๋ถ„ ๋‚˜์˜๊ฒ ์ฃ .
15:36
(Laughter)
302
936923
1579
(์›ƒ์Œ)
15:38
In May of 2010,
303
938526
2786
2010๋…„ 5์›”์—
15:41
a flash crash on Wall Street fueled by a feedback loop
304
941336
4044
์›”๊ฐ€์˜ ๋งค๋„ ์•Œ๊ณ ๋ฆฌ์ฆ˜์˜ ํ”ผ๋“œ๋ฐฑ ๋ฐ˜๋ณต๋ฌธ ์˜ค๋ฅ˜๋กœ ์ฃผ๊ฐ€๊ฐ€ ํญ๋ฝํ–ˆ๊ณ 
15:45
in Wall Street's "sell" algorithm
305
945404
3028
15:48
wiped a trillion dollars of value in 36 minutes.
306
948456
4184
36๋ถ„ ๋งŒ์— 1์กฐ ๋‹ฌ๋Ÿฌ์–ด์น˜์˜ ๊ฐ€์น˜๊ฐ€ ์‚ฌ๋ผ์ง„ ์ผ์ด ์žˆ์—ˆ์ฃ .
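The self-reinforcing loop she describes can be illustrated with a tiny, hypothetical simulation. This is not a model of the actual May 2010 event; the starting price and reaction sizes are invented. It only shows the mechanism: an algorithm that sells because the price just fell pushes the price down further, which triggers still more selling.

```python
# Hypothetical sketch of a sell-side feedback loop (parameters are invented).
price = 100.0
history = [price]

for minute in range(10):
    just_fell = len(history) > 1 and history[-1] < history[-2]
    # The algorithm reacts to the very move it is helping to cause.
    sell_pressure = 0.08 if just_fell else 0.02
    price *= (1.0 - sell_pressure)
    history.append(price)
    print(f"minute {minute:2d}: price {price:6.2f}")
```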
15:53
I don't even want to think what "error" means
307
953722
2187
์‚ด์ƒ ๋ฌด๊ธฐ์˜ ๊ฒฝ์šฐ์— '์˜ค๋ฅ˜'๊ฐ€ ์ผ์–ด๋‚˜๋ฉด ์–ด๋–ป๊ฒŒ ๋ ์ง€
15:55
in the context of lethal autonomous weapons.
308
955933
3589
์ƒ๊ฐํ•˜๊ณ  ์‹ถ์ง€๋„ ์•Š์Šต๋‹ˆ๋‹ค.
16:01
So yes, humans have always made biases.
309
961894
3790
์ธ๊ฐ„์˜ ๊ฒฐ์ •์—๋Š” ๊ฒฐํ•จ์ด ๋งŽ์ฃ .
16:05
Decision makers and gatekeepers,
310
965708
2176
์˜์‚ฌ๊ฒฐ์ •์ž์™€ ๋ฌธ์ง€๊ธฐ ์—ญํ• ์„ ํ•˜๋Š” ์‚ฌ๋žŒ๋“ค์€
16:07
in courts, in news, in war ...
311
967908
3493
๋ฒ•์ •, ์–ธ๋ก , ์ „์Ÿ์—์„œ
16:11
they make mistakes; but that's exactly my point.
312
971425
3038
๋ชจ๋‘ ์‹ค์ˆ˜๊ฐ€ ์ผ์–ด๋‚˜์ง€๋งŒ ์ €๋Š” ๊ทธ๋ž˜์•ผ ํ•œ๋‹ค๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
16:14
We cannot escape these difficult questions.
313
974487
3521
์šฐ๋ฆฌ๋Š” ์–ด๋ ค์šด ๋ฌธ์ œ๋ฅผ ํ”ผํ•  ์ˆ˜ ์—†์Šต๋‹ˆ๋‹ค.
16:18
We cannot outsource our responsibilities to machines.
314
978596
3516
๊ธฐ๊ณ„์—๊ฒŒ ์ฑ…์ž„์„ ๋– ๋„˜๊ฒจ์„œ๋Š” ์•ˆ ๋ฉ๋‹ˆ๋‹ค.
16:22
(Applause)
315
982676
4208
(๋ฐ•์ˆ˜)
16:29
Artificial intelligence does not give us a "Get out of ethics free" card.
316
989089
4447
์ธ๊ณต์ง€๋Šฅ์ด '์œค๋ฆฌ์  ๋ฌธ์ œ์˜ ๋ฉด์ฃ„๋ถ€'๋ฅผ ์ฃผ์ง€๋Š” ์•Š์Šต๋‹ˆ๋‹ค.
16:34
Data scientist Fred Benenson calls this math-washing.
317
994742
3381
๋ฐ์ดํ„ฐ ๊ณผํ•™์ž์ธ ํ”„๋ ˆ๋“œ ๋ฒ ๋„จ์Šจ์€ ์ด๋ฅผ ๋‘๊ณ  '๋…ผ๋ฆฌ ์„ธํƒ'์ด๋ผ ํ‘œํ˜„ํ–ˆ์ฃ .
16:38
We need the opposite.
318
998147
1389
์ •๋ฐ˜๋Œ€ ํƒœ๋„๊ฐ€ ํ•„์š”ํ•ฉ๋‹ˆ๋‹ค.
16:39
We need to cultivate algorithm suspicion, scrutiny and investigation.
319
999560
5388
์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์˜์‹ฌํ•˜๊ณ  ์กฐ์‚ฌํ•˜๊ณ  ๊ฒ€์ˆ˜ํ•˜๋Š” ๋Šฅ๋ ฅ์„ ๊ธธ๋Ÿฌ์•ผ ํ•ฉ๋‹ˆ๋‹ค.
16:45
We need to make sure we have algorithmic accountability,
320
1005380
3198
์•Œ๊ณ ๋ฆฌ์ฆ˜์— ๋Œ€ํ•œ ํšŒ๊ณ„์™€ ๊ฐ์‚ฌ ๊ทธ๋ฆฌ๊ณ  ํˆฌ๋ช…์„ฑ ์ œ๊ณ  ๋ฐฉ๋ฒ•์„
16:48
auditing and meaningful transparency.
321
1008602
2445
๊ตฌ์ถ•ํ•ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.
16:51
We need to accept that bringing math and computation
322
1011380
3234
์ธ๊ฐ„ ์‚ฌํšŒ์˜ ๊ฐ€์น˜ ํŒ๋‹จ ๋ฌธ์ œ์— ์ˆ˜ํ•™๊ณผ ์ปดํ“จํ„ฐ๋ฅผ ๋„์ž…ํ•œ๋‹ค๊ณ  ํ•ด์„œ
16:54
to messy, value-laden human affairs
323
1014638
2970
๊ฐ๊ด€์ ์ธ ์ผ์ด ๋˜์ง€๋Š” ์•Š๋Š”๋‹ค๋Š” ๊ฑธ
16:57
does not bring objectivity;
324
1017632
2384
๋ฐ›์•„๋“ค์—ฌ์•ผ ํ•ฉ๋‹ˆ๋‹ค.
17:00
rather, the complexity of human affairs invades the algorithms.
325
1020040
3633
์˜คํžˆ๋ ค ์ธ๊ฐ„ ๋ฌธ์ œ์˜ ๋ณต์žก์„ฑ์ด ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์ฃผ๊ด€์ ์œผ๋กœ ๋งŒ๋“ค์ฃ .
17:04
Yes, we can and we should use computation
326
1024148
3487
๋ฌผ๋ก  ๋” ๋‚˜์€ ์˜์‚ฌ๊ฒฐ์ •์„ ์œ„ํ•ด์„œ๋ผ๋ฉด ์ปดํ“จํ„ฐ๋ฅผ ์ด์šฉํ•  ์ˆ˜๋„ ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
17:07
to help us make better decisions.
327
1027659
2014
17:09
But we have to own up to our moral responsibility to judgment,
328
1029697
5332
ํ•˜์ง€๋งŒ ์šฐ๋ฆฌ ํŒ๋‹จ์˜ ๋„๋•์  ์ฑ…์ž„์€ ์šฐ๋ฆฌ ์Šค์Šค๋กœ๊ฐ€ ์งŠ์–ด์ ธ์•ผ ํ•ฉ๋‹ˆ๋‹ค.
17:15
and use algorithms within that framework,
329
1035053
2818
์•Œ๊ณ ๋ฆฌ์ฆ˜์€ ๊ทธ ํ‹€ ์•ˆ์—์„œ๋งŒ ์ด์šฉ๋˜์–ด์•ผ ํ•  ๋ฟ์ด๊ณ 
17:17
not as a means to abdicate and outsource our responsibilities
330
1037895
4935
์šฐ๋ฆฌ์˜ ๋„๋•์  ์ฑ…์ž„์„ ๋‹ค๋ฅธ ์ชฝ์— ์ „๊ฐ€ํ•˜๋Š” ์ˆ˜๋‹จ์ด ๋˜์–ด์„œ๋Š” ์•ˆ๋˜์ฃ .
17:22
to one another as human to human.
331
1042854
2454
17:25
Machine intelligence is here.
332
1045807
2609
์ธ๊ณต์ง€๋Šฅ์˜ ์‹œ๋Œ€์—๋Š”
17:28
That means we must hold on ever tighter
333
1048440
3421
์ธ๊ฐ„ ๊ฐ€์น˜์™€ ์œค๋ฆฌ๊ฐ€
17:31
to human values and human ethics.
334
1051885
2147
๋”์šฑ๋” ์ค‘์š”ํ•ฉ๋‹ˆ๋‹ค.
17:34
Thank you.
335
1054056
1154
๊ฐ์‚ฌํ•ฉ๋‹ˆ๋‹ค.
17:35
(Applause)
336
1055234
5020
(๋ฐ•์ˆ˜)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7