How humans and AI can work together to create better businesses | Sylvain Duranton

28,502 views

2020-02-14 ใƒป TED



00:00
Translator: Ivana Korom Reviewer: Krystian Aparta
Korean translation: Keun Jeong Mo Reviewer: Theo Kim
00:12
Let me share a paradox.

00:16
For the last 10 years,

00:17
many companies have been trying to become less bureaucratic,

00:21
to have fewer central rules and procedures,

00:24
more autonomy for their local teams to be more agile.

00:28
And now they are pushing artificial intelligence, AI,

00:32
unaware that cool technology

00:35
might make them more bureaucratic than ever.

00:39
Why?

00:40
Because AI operates just like bureaucracies.

00:44
The essence of bureaucracy

00:46
is to favor rules and procedures over human judgment.

00:51
And AI decides solely based on rules.

00:56
Many rules inferred from past data

00:58
but only rules.

01:01
And if human judgment is not kept in the loop,

01:04
AI will bring a terrifying form of new bureaucracy --

01:09
I call it "algocracy" --

01:12
where AI will take more and more critical decisions by the rules

01:17
outside of any human control.

01:20
Is there a real risk?

01:22
Yes.

01:23
I'm leading a team of 800 AI specialists.

01:26
We have deployed over 100 customized AI solutions

01:30
for large companies around the world.

01:33
And I see too many corporate executives behaving like bureaucrats from the past.

01:39
They want to take costly, old-fashioned humans out of the loop

01:44
and rely only upon AI to take decisions.

01:49
I call this the "human-zero mindset."

01:54
And why is it so tempting?

01:56
Because the other route, "Human plus AI," is long,

02:02
costly and difficult.

02:04
Business teams, tech teams, data-science teams

02:08
have to iterate for months

02:10
to craft exactly how humans and AI can best work together.

02:16
Long, costly and difficult.

02:19
But the reward is huge.

02:22
A recent survey from BCG and MIT

02:25
shows that 18 percent of companies in the world

02:30
are pioneering AI,

02:32
making money with it.

02:35
Those companies focus 80 percent of their AI initiatives

02:40
on effectiveness and growth,

02:42
taking better decisions --

02:44
not replacing humans with AI to save costs.

02:50
Why is it important to keep humans in the loop?

02:54
Simply because, left alone, AI can do very dumb things.

02:59
Sometimes with no consequences, like in this tweet.

03:03
"Dear Amazon,

03:04
I bought a toilet seat.

03:06
Necessity, not desire.

03:07
I do not collect them,

03:09
I'm not a toilet-seat addict.

03:11
No matter how temptingly you email me,

03:13
I am not going to think, 'Oh, go on, then,

03:16
one more toilet seat, I'll treat myself.' "

03:18
(Laughter)

03:19
Sometimes, with more consequence, like in this other tweet.

03:24
"Had the same situation

03:26
with my mother's burial urn."

03:29
(Laughter)

03:30
"For months after her death,

03:31
I got messages from Amazon, saying, 'If you liked that ...' "

03:35
(Laughter)

03:37
Sometimes with worse consequences.

03:39
Take an AI engine rejecting a student application for university.

03:44
Why?

03:45
Because it has "learned," on past data,

03:48
characteristics of students that will pass and fail.

03:51
Some are obvious, like GPAs.

03:54
But if, in the past, all students from a given postal code have failed,

03:59
it is very likely that AI will make this a rule

04:02
and will reject every student with this postal code,

04:06
not giving anyone the opportunity to prove the rule wrong.
04:11
And no one can check all the rules,

04:14
because advanced AI is constantly learning.

04:18
And if humans are kept out of the room,

04:20
there comes the algocratic nightmare.

04:24
Who is accountable for rejecting the student?

04:27
No one, AI did.

04:29
Is it fair? Yes.

04:30
The same set of objective rules has been applied to everyone.

04:34
Could we reconsider for this bright kid with the wrong postal code?

04:38
No, algos don't change their mind.

04:42
We have a choice here.

04:45
Carry on with algocracy

04:48
or decide to go to "Human plus AI."

04:51
And to do this,

04:52
we need to stop thinking tech first,

04:56
and we need to start applying the secret formula.

05:00
To deploy "Human plus AI,"

05:02
10 percent of the effort is to code algos;

05:05
20 percent to build tech around the algos,

05:09
collecting data, building UI, integrating into legacy systems;

05:13
But 70 percent, the bulk of the effort,

05:16
is about weaving together AI with people and processes

05:20
to maximize real outcome.

05:24
AI fails when cutting short on the 70 percent.

05:28
The price tag for that can be small,

05:31
wasting many, many millions of dollars on useless technology.

05:35
Anyone cares?

05:38
Or real tragedies:

05:41
346 casualties in the recent crashes of two B-737 aircraft

05:48
when pilots could not interact properly

05:52
with a computerized command system.

05:55
For a successful 70 percent,

05:57
the first step is to make sure that algos are coded by data scientists

06:02
and domain experts together.

06:05
Take health care for example.

06:07
One of our teams worked on a new drug with a slight problem.

06:12
When taking their first dose,

06:14
some patients, very few, have heart attacks.

06:18
So, all patients, when taking their first dose,

06:21
have to spend one day in hospital,

06:23
for monitoring, just in case.

06:26
Our objective was to identify patients who were at zero risk of heart attacks,

06:32
who could skip the day in hospital.

06:34
We used AI to analyze data from clinical trials,

06:40
to correlate ECG signal, blood composition, biomarkers,

06:44
with the risk of heart attack.

06:47
In one month,

06:48
our model could flag 62 percent of patients at zero risk.

06:54
They could skip the day in hospital.

06:57
Would you be comfortable staying at home for your first dose

07:01
if the algo said so?

07:02
(Laughter)

07:03
Doctors were not.

07:05
What if we had false negatives,

07:08
meaning people who are told by AI they can stay at home, and die?

07:13
(Laughter)
07:14
There started our 70 percent.

07:17
We worked with a team of doctors

07:19
to check the medical logic of each variable in our model.

07:23
For instance, we were using the concentration of a liver enzyme

07:28
as a predictor,

07:29
for which the medical logic was not obvious.

07:33
The statistical signal was quite strong.

07:36
But what if it was a bias in our sample?

07:39
That predictor was taken out of the model.

07:42
We also took out predictors for which experts told us

07:45
they cannot be rigorously measured by doctors in real life.
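The vetting step described above can be sketched as a simple filter applied before (re)training: every candidate predictor must pass the joint review with doctors. The feature names and review outcomes below are invented for the example; the talk does not specify the actual variables beyond the liver enzyme.

```python
# Hypothetical feature-vetting step: keep a predictor only if domain
# experts confirm both its medical plausibility and that it can be
# measured rigorously in real life. Names and outcomes are invented.

candidate_features = ["ecg_qt_interval", "biomarker_x",
                      "liver_enzyme_conc", "blood_sodium"]

expert_review = {
    "ecg_qt_interval":   {"medically_plausible": True,  "measurable_in_practice": True},
    "biomarker_x":       {"medically_plausible": True,  "measurable_in_practice": False},
    "liver_enzyme_conc": {"medically_plausible": False, "measurable_in_practice": True},
    "blood_sodium":      {"medically_plausible": True,  "measurable_in_practice": True},
}

def vetted(features, review):
    # A strong statistical signal alone is not enough to keep a feature.
    return [f for f in features
            if review[f]["medically_plausible"]
            and review[f]["measurable_in_practice"]]

print(vetted(candidate_features, expert_review))
# ['ecg_qt_interval', 'blood_sodium']
```

The design choice is that human experts veto features, rather than the model's fit statistics having the last word.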
07:50
After four months,

07:52
we had a model and a medical protocol.

07:55
They both got approved

07:57
by medical authorities in the US last spring,

08:00
resulting in far less stress for half of the patients

08:04
and better quality of life.

08:06
And an expected upside on sales over 100 million for that drug.

08:11
Seventy percent weaving AI with team and processes

08:15
also means building powerful interfaces

08:19
for humans and AI to solve the most difficult problems together.

08:25
Once, we got challenged by a fashion retailer.

08:31
"We have the best buyers in the world.

08:33
Could you build an AI engine that would beat them at forecasting sales?

08:38
At telling how many high-end, light-green, men XL shirts

08:42
we need to buy for next year?

08:45
At predicting better what will sell or not

08:47
than our designers."

08:50
Our team trained a model in a few weeks, on past sales data,

08:54
and the competition was organized with human buyers.

08:58
Result?

09:00
AI wins, reducing forecasting errors by 25 percent.

09:05
Human-zero champions could have tried to implement this initial model

09:10
and create a fight with all human buyers.

09:13
Have fun.

09:15
But we knew that human buyers had insights on fashion trends

09:20
that could not be found in past data.

09:23
There started our 70 percent.

09:26
We went for a second test,

09:28
where human buyers were reviewing quantities

09:31
suggested by AI

09:33
and could correct them if needed.

09:36
Result?

09:37
Humans using AI ...

09:39
lose.

09:41
Seventy-five percent of the corrections made by a human

09:45
were reducing accuracy.

09:49
Was it time to get rid of human buyers?

09:52
No.

09:53
It was time to recreate a model

09:56
where humans would not try to guess when AI is wrong,

10:01
but where AI would take real input from human buyers.

10:06
We fully rebuilt the model

10:08
and went away from our initial interface, which was, more or less,

10:14
"Hey, human! This is what I forecast,

10:17
correct whatever you want,"

10:18
and moved to a much richer one, more like,

10:22
"Hey, humans!

10:24
I don't know the trends for next year.

10:26
Could you share with me your top creative bets?"

10:30
"Hey, humans!

10:31
Could you help me quantify those few big items?

10:34
I cannot find any good comparables in the past for them."
10:38
Result?

10:40
"Human plus AI" wins,

10:42
reducing forecast errors by 50 percent.

10:47
It took one year to finalize the tool.

10:51
Long, costly and difficult.

10:55
But profits and benefits

10:57
were in excess of 100 million of savings per year for that retailer.

11:03
Seventy percent on very sensitive topics

11:06
also means humans have to decide what is right or wrong

11:10
and define rules for what AI can do or not,

11:14
like setting caps on prices to prevent pricing engines

11:17
[from charging] outrageously high prices to uneducated customers

11:22
who would accept them.

11:24
Only humans can define those boundaries --

11:27
there is no way AI can find them in past data.
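A price cap is the simplest kind of human-defined boundary: a hard rule that sits outside the learned model and has the last word. A minimal sketch, with an invented cap value:

```python
# Minimal guardrail of the kind described: humans set a hard cap the
# pricing engine cannot exceed, whatever its optimizer proposes.
# The cap value is invented for illustration.

HUMAN_PRICE_CAP = 199.0  # boundary set by people, not learned from data

def capped_price(engine_price, cap=HUMAN_PRICE_CAP):
    # The engine proposes; the human-defined rule decides.
    return min(engine_price, cap)

print(capped_price(950.0))  # 199.0: the cap overrides the engine
print(capped_price(149.0))  # 149.0: below the cap, the engine's price stands
```

Nothing in past transaction data tells the engine that 950 is "outrageous" if customers historically accepted it; the boundary exists only because people put it there.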
11:31
Some situations are in the gray zone.

11:34
We worked with a health insurer.

11:36
They developed an AI engine to identify, among their clients,

11:41
people who are just about to go to hospital

11:44
to sell them premium services.

11:46
And the problem is,

11:48
some prospects were called by the commercial team

11:51
while they did not know yet

11:53
they would have to go to hospital very soon.

11:57
You are the CEO of this company.

12:00
Do you stop that program?

12:02
Not an easy question.

12:04
And to tackle this question, some companies are building teams,

12:08
defining ethical rules and standards to help business and tech teams set limits

12:13
between personalization and manipulation,

12:17
customization of offers and discrimination,

12:20
targeting and intrusion.

12:24
I am convinced that in every company,

12:28
applying AI where it really matters has massive payback.

12:33
Business leaders need to be bold

12:35
and select a few topics,

12:37
and for each of them, mobilize 10, 20, 30 people from their best teams --

12:42
tech, AI, data science, ethics --

12:45
and go through the full 10-, 20-, 70-percent cycle

12:50
of "Human plus AI,"

12:52
if they want to land AI effectively in their teams and processes.

12:57
There is no other way.

12:58
Citizens in developed economies already fear algocracy.

13:04
Seven thousand were interviewed in a recent survey.

13:08
More than 75 percent expressed real concerns

13:11
on the impact of AI on the workforce, on privacy,

13:15
on the risk of a dehumanized society.

13:19
Pushing algocracy creates a real risk of severe backlash against AI

13:24
within companies or in society at large.

13:29
"Human plus AI" is our only option

13:32
to bring the benefits of AI to the real world.

13:36
And in the end,

13:37
winning organizations will invest in human knowledge,

13:41
not just AI and data.

13:44
Recruiting, training, rewarding human experts.

13:48
Data is said to be the new oil,

13:51
but believe me, human knowledge will make the difference,

13:56
because it is the only derrick available

13:59
to pump the oil hidden in the data.

14:04
Thank you.

14:05
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7