The era of blind faith in big data must end | Cathy O'Neil

232,156 views ใƒป 2017-09-07

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translation: SeungGyu Min    Review: Tae-Hoon Chung
00:12
Algorithms are everywhere.
00:15
They sort and separate the winners from the losers.
00:19
The winners get the job
00:22
or a good credit card offer.
00:23
The losers don't even get an interview
00:27
or they pay more for insurance.
00:30
We're being scored with secret formulas that we don't understand
00:34
that often don't have systems of appeal.
00:39
That begs the question:
00:40
What if the algorithms are wrong?
00:44
To build an algorithm you need two things:
00:46
you need data, what happened in the past,
00:48
and a definition of success,
00:50
the thing you're looking for and often hoping for.
00:53
You train an algorithm by looking, figuring out.
00:58
The algorithm figures out what is associated with success.
01:01
What situation leads to success?
01:04
Actually, everyone uses algorithms.
01:06
They just don't formalize them in written code.
01:09
Let me give you an example.
01:10
I use an algorithm every day to make a meal for my family.
01:13
The data I use
01:16
is the ingredients in my kitchen,
01:17
the time I have,
01:19
the ambition I have,
01:20
and I curate that data.
01:22
I don't count those little packages of ramen noodles as food.
01:26
(Laughter)
01:28
My definition of success is:
01:30
a meal is successful if my kids eat vegetables.
01:34
It's very different from if my youngest son were in charge.
01:36
He'd say success is if he gets to eat lots of Nutella.
01:40
But I get to choose success.
01:43
I am in charge. My opinion matters.
01:45
That's the first rule of algorithms.
01:48
Algorithms are opinions embedded in code.
01:53
It's really different from what you think most people think of algorithms.
01:57
They think algorithms are objective and true and scientific.
02:02
That's a marketing trick.
02:05
It's also a marketing trick
02:07
to intimidate you with algorithms,
02:10
to make you trust and fear algorithms
02:14
because you trust and fear mathematics.
02:17
A lot can go wrong when we put blind faith in big data.
02:23
This is Kiri Soares. She's a high school principal in Brooklyn.
02:26
In 2011, she told me her teachers were being scored
02:29
with a complex, secret algorithm
02:32
called the "value-added model."
02:34
I told her, "Well, figure out what the formula is, show it to me.
02:37
I'm going to explain it to you."
02:39
She said, "Well, I tried to get the formula,
02:41
but my Department of Education contact told me it was math
02:43
and I wouldn't understand it."
02:47
It gets worse.
02:48
The New York Post filed a Freedom of Information Act request,
02:52
got all the teachers' names and all their scores
02:54
and they published them as an act of teacher-shaming.
02:58
When I tried to get the formulas, the source code, through the same means,
03:02
I was told I couldn't.
03:04
I was denied.
03:06
I later found out
03:07
that nobody in New York City had access to that formula.
03:10
No one understood it.
03:13
Then someone really smart got involved, Gary Rubinstein.
03:16
He found 665 teachers from that New York Post data
03:20
that actually had two scores.
03:22
That could happen if they were teaching
03:24
seventh grade math and eighth grade math.
03:26
He decided to plot them.
03:28
Each dot represents a teacher.
03:30
(Laughter)
03:33
What is that?
03:34
(Laughter)
03:36
That should never have been used for individual assessment.
03:39
It's almost a random number generator.
03:41
(Applause)
03:44
But it was.
03:45
This is Sarah Wysocki.
03:46
She got fired, along with 205 other teachers,
03:49
from the Washington, DC school district,
03:51
even though she had great recommendations from her principal
03:54
and the parents of her kids.
03:57
I know what a lot of you guys are thinking,
03:59
especially the data scientists, the AI experts here.
04:01
You're thinking, "Well, I would never make an algorithm that inconsistent."
04:06
But algorithms can go wrong,
04:08
even have deeply destructive effects with good intentions.
04:14
And whereas an airplane that's designed badly
04:16
crashes to the earth and everyone sees it,
04:18
an algorithm designed badly
04:22
can go on for a long time, silently wreaking havoc.
04:27
This is Roger Ailes.
04:29
(Laughter)
04:32
He founded Fox News in 1996.
04:35
More than 20 women complained about sexual harassment.
04:37
They said they weren't allowed to succeed at Fox News.
04:41
He was ousted last year, but we've seen recently
04:43
that the problems have persisted.
04:47
That begs the question:
04:48
What should Fox News do to turn over another leaf?
04:53
Well, what if they replaced their hiring process
04:56
with a machine-learning algorithm?
04:57
That sounds good, right?
04:59
Think about it.
05:00
The data, what would the data be?
05:02
A reasonable choice would be the last 21 years of applications to Fox News.
05:07
Reasonable.
05:09
What about the definition of success?
05:11
Reasonable choice would be,
05:13
well, who is successful at Fox News?
05:14
I guess someone who, say, stayed there for four years
05:18
and was promoted at least once.
05:20
Sounds reasonable.
05:22
And then the algorithm would be trained.
05:24
It would be trained to look for people to learn what led to success,
05:29
what kind of applications historically led to success
05:33
by that definition.
05:36
Now think about what would happen
05:37
if we applied that to a current pool of applicants.
05:40
It would filter out women
05:43
because they do not look like people who were successful in the past.
05:51
Algorithms don't make things fair
05:54
if you just blithely, blindly apply algorithms.
05:56
They don't make things fair.
05:58
They repeat our past practices,
06:00
our patterns.
06:01
They automate the status quo.
06:04
That would be great if we had a perfect world,
06:07
but we don't.
06:09
And I'll add that most companies don't have embarrassing lawsuits,
06:14
but the data scientists in those companies
06:16
are told to follow the data,
06:19
to focus on accuracy.
06:22
Think about what that means.
06:23
Because we all have bias, it means they could be codifying sexism
06:27
or any other kind of bigotry.
06:31
Thought experiment,
06:32
because I like them:
06:35
an entirely segregated society --
06:40
racially segregated, all towns, all neighborhoods
06:43
and where we send the police only to the minority neighborhoods
06:46
to look for crime.
06:48
The arrest data would be very biased.
06:51
What if, on top of that, we found the data scientists
06:54
and paid the data scientists to predict where the next crime would occur?
06:59
Minority neighborhood.
07:01
Or to predict who the next criminal would be?
07:04
A minority.
07:07
The data scientists would brag about how great and how accurate
07:11
their model would be,
07:12
and they'd be right.
07:15
Now, reality isn't that drastic, but we do have severe segregations
07:20
in many cities and towns,
07:21
and we have plenty of evidence
07:23
of biased policing and justice system data.
07:27
And we actually do predict hotspots,
07:30
places where crimes will occur.
07:32
And we do predict, in fact, the individual criminality,
07:36
the criminality of individuals.
07:38
The news organization ProPublica recently looked into
07:42
one of those "recidivism risk" algorithms,
07:44
as they're called,
07:46
being used in Florida during sentencing by judges.
07:50
Bernard, on the left, the black man, was scored a 10 out of 10.
07:54
Dylan, on the right, 3 out of 10.
07:57
10 out of 10, high risk. 3 out of 10, low risk.
08:00
They were both brought in for drug possession.
08:02
They both had records,
08:04
but Dylan had a felony
08:06
but Bernard didn't.
08:09
This matters, because the higher score you are,
08:12
the more likely you're being given a longer sentence.
08:18
What's going on?
08:20
Data laundering.
08:22
It's a process by which technologists hide ugly truths
08:27
inside black box algorithms
08:29
and call them objective;
08:31
call them meritocratic.
08:34
When they're secret, important and destructive,
08:37
I've coined a term for these algorithms:
08:39
"weapons of math destruction."
08:41
(Laughter)
08:43
(Applause)
08:46
They're everywhere, and it's not a mistake.
08:49
These are private companies building private algorithms
08:53
for private ends.
08:55
Even the ones I talked about for teachers and the public police,
08:58
those were built by private companies
09:00
and sold to the government institutions.
09:02
They call it their "secret sauce" --
09:04
that's why they can't tell us about it.
09:06
It's also private power.
09:09
They are profiting for wielding the authority of the inscrutable.
09:16
Now you might think, since all this stuff is private
09:19
and there's competition,
09:21
maybe the free market will solve this problem.
09:23
It won't.
09:24
There's a lot of money to be made in unfairness.
09:28
Also, we're not economic rational agents.
09:32
We all are biased.
09:34
We're all racist and bigoted in ways that we wish we weren't,
09:38
in ways that we don't even know.
09:41
We know this, though, in aggregate,
09:44
because sociologists have consistently demonstrated this
09:47
with these experiments they build,
09:49
where they send a bunch of applications to jobs out,
09:51
equally qualified but some have white-sounding names
09:54
and some have black-sounding names,
09:56
and it's always disappointing, the results -- always.
09:59
So we are the ones that are biased,
10:01
and we are injecting those biases into the algorithms
10:04
by choosing what data to collect,
10:06
like I chose not to think about ramen noodles --
10:09
I decided it was irrelevant.
10:10
But by trusting the data that's actually picking up on past practices
10:16
and by choosing the definition of success,
10:18
how can we expect the algorithms to emerge unscathed?
10:22
We can't. We have to check them.
10:25
We have to check them for fairness.
10:27
The good news is, we can check them for fairness.
10:30
Algorithms can be interrogated,
10:33
and they will tell us the truth every time.
10:35
And we can fix them. We can make them better.
10:38
I call this an algorithmic audit,
10:40
and I'll walk you through it.
10:42
First, data integrity check.
10:45
For the recidivism risk algorithm I talked about,
10:49
a data integrity check would mean we'd have to come to terms with the fact
10:52
that in the US, whites and blacks smoke pot at the same rate
10:56
but blacks are far more likely to be arrested --
10:59
four or five times more likely, depending on the area.
11:03
What is that bias looking like in other crime categories,
11:05
and how do we account for it?
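(Illustrative aside, not part of the talk: a rough sketch of what such a data integrity check could look like for the marijuana example above. The usage and arrest figures are placeholders, not real statistics; only the shape of the comparison matters.)

    # Rough sketch of a data integrity check. The usage and arrest figures are
    # placeholders, not real statistics; the point is the comparison itself.

    usage_rate = {"white": 0.12, "black": 0.12}      # roughly equal rates of use
    arrests_per_1000 = {"white": 2.0, "black": 9.0}  # what the training data records

    def arrest_bias(group, reference="white"):
        # Arrests per unit of actual offending, relative to the reference group.
        exposure = arrests_per_1000[group] / usage_rate[group]
        baseline = arrests_per_1000[reference] / usage_rate[reference]
        return exposure / baseline

    for group in ("white", "black"):
        print(group, round(arrest_bias(group), 1))
    # white 1.0
    # black 4.5  -> the "crime" labels fed to a risk model overstate offending here

(If the labels encode enforcement patterns rather than behavior, any model trained on them inherits that skew, and the audit has to account for it.)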
11:07
Second, we should think about the definition of success,
11:11
audit that.
11:12
Remember -- with the hiring algorithm? We talked about it.
11:15
Someone who stays for four years and is promoted once?
11:18
Well, that is a successful employee,
11:20
but it's also an employee that is supported by their culture.
11:23
That said, also it can be quite biased.
11:25
We need to separate those two things.
11:27
We should look to the blind orchestra audition
11:30
as an example.
11:31
That's where the people auditioning are behind a sheet.
11:34
What I want to think about there
11:36
is the people who are listening have decided what's important
11:40
and they've decided what's not important,
11:42
and they're not getting distracted by that.
11:44
When the blind orchestra auditions started,
11:47
the number of women in orchestras went up by a factor of five.
11:52
Next, we have to consider accuracy.
11:55
This is where the value-added model for teachers would fail immediately.
11:59
No algorithm is perfect, of course,
12:02
so we have to consider the errors of every algorithm.
12:06
How often are there errors, and for whom does this model fail?
12:11
What is the cost of that failure?
12:14
And finally, we have to consider
12:17
the long-term effects of algorithms,
12:20
the feedback loops that are engendering.
12:23
That sounds abstract,
12:24
but imagine if Facebook engineers had considered that
12:28
before they decided to show us only things that our friends had posted.
12:33
I have two more messages, one for the data scientists out there.
12:37
Data scientists: we should not be the arbiters of truth.
12:41
We should be translators of ethical discussions that happen
12:45
in larger society.
12:47
(Applause)
12:49
And the rest of you,
12:51
the non-data scientists:
12:53
this is not a math test.
12:55
This is a political fight.
12:58
We need to demand accountability for our algorithmic overlords.
13:03
(Applause)
13:05
The era of blind faith in big data must end.
13:09
Thank you very much.
13:10
(Applause)