Why People and AI Make Good Business Partners | Shervin Khodabandeh | TED

53,067 views ・ 2022-05-22

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: Yeonji Seo κ²€ν† : Hyeryung Kim
00:04
I've been working in AI for most of my career,
00:07
helping companies build artificial intelligence capabilities
00:10
to improve their business,
00:12
which is why I think what I'm about to tell you
00:15
is quite shocking.
00:16
Every year, thousands of companies across the world
00:20
spend collectively tens of billions of dollars to build AI capabilities.
00:26
But according to research my colleagues and I have done,
00:28
only about 10 percent of these companies get any meaningful financial impact
00:33
from their investments.
00:35
These 10 percent winners with AI have a secret.
00:38
And their secret is not about fancy algorithms or sophisticated technology.
00:44
It's something far more basic.
00:46
It's how they get their people and AI to work together.
00:50
Together, not against each other,
00:53
not instead of each other.
00:54
Together in a mutually beneficial relationship.
00:58
Unfortunately, when most people think about AI,
01:01
they think about the most extreme cases.
01:04
That AI is here only to replace us
01:06
or overtake our intelligence and make us unnecessary.
01:09
But what I'm saying
01:11
is that we don't seem to quite appreciate the huge opportunity that exists
01:15
in the middle ground,
01:16
where humans and AI come together
01:19
to achieve outcomes that neither one could do on their own.
01:24
Consider the game of chess.
01:26
You probably knew that AI today can beat any human grandmaster.
01:32
But did you know that the combination of a human chess player and AI
01:36
can beat not only any human but also any machine?
01:40
The combination is much more powerful than the sum of its parts.
01:45
In a perfect combination, AI will do what it does best,
01:49
which is dealing with massive amounts of data and solving complex problems.
01:53
And humans do what we do best,
01:56
using our creativity, our judgment, our empathy, our ethics
02:00
and our ability to compromise.
02:03
For several years,
02:04
my colleagues and I have studied
02:06
and worked with hundreds of winning companies
02:09
who are successfully building these human-AI relationships.
02:13
And what we've seen is quite interesting.
02:15
First of all, these companies get five times more financial value
02:20
than companies who use AI only to replace people.
02:24
Most importantly, they have a happier workforce.
02:27
Their employees are more proud, more fulfilled,
02:29
they collaborate better with each other, and they're more effective.
02:33
Five times more value and a happier workforce.
02:37
So the question is, how do these companies do it?
02:40
How do they achieve these symbiotic human-AI relationships?
02:44
I have some answers.
02:46
First of all, they don't think of AI in the most extreme case,
02:49
only to replace humans.
02:51
Instead, they look deep inside their organizations
02:54
and at the various roles their people play.
02:56
And they ask:
02:58
How can AI make our people more fulfilled, more effective,
03:02
more amplified?
03:04
Let me give you an example.
03:06
Humana is a health care company here in the US.
03:10
It has pharmacy call centers where pharmacists work with patients
03:13
over the phone.
03:14
It's a job that requires a fair amount of empathy and humanity.
03:19
Humana has developed an AI system
03:21
that listens to the pharmacists' conversations
03:24
and picks up emotional and tone signals
03:27
and then gives real-time suggestions to the pharmacists
03:30
on how to improve the quality of that conversation.
03:33
For example, it might say "Slow down" or "Pause"
03:37
or "Hey, consider how the other person is feeling right now."
03:41
All to improve the quality of that conversation.
03:45
I'm pretty sure my wife would buy me one of these if she could,
03:49
just to help me in some of my conversations with her.
03:52
(Laughter)
03:53
Turns out the pharmacists like it quite a lot, too.
03:56
They're more effective in their jobs,
03:57
but they also learn something about themselves,
04:00
their own behaviors and biases.
04:02
The result has been more effective pharmacists
04:05
and much higher customer satisfaction scores.
04:09
Now, this is just one example of many possibilities where humans and AI collaborate.
04:15
In this example, AI was a recommender.
04:17
It didn't replace the human or make any decisions of its own.
04:21
It simply made suggestions,
04:23
and it was up to the person to decide and act.
04:27
And at the heart of it is a feedback loop,
04:30
which, by the way, is very critical for any human-AI relationship.
04:35
By that I mean that in this example,
04:37
first AI had to learn from humans the qualities that would make up a good
04:42
or not so good conversation.
04:44
And then over time, as AI built more intelligence,
04:48
it would be able to make suggestions,
04:50
but it would be up to the person to decide and act.
04:54
And if they didn't agree with the recommendation
04:57
because it might have not made sense to them,
04:59
they didn't have to.
05:01
In which case AI might learn something and adapt for the future.
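As a sketch of that loop, with names of my own invention rather than any vendor's API: the AI proposes, the human decides and acts, and every accept or reject flows back to the model as a learning signal.

```python
# Minimal sketch of the human-AI feedback loop described above.
# All names are illustrative. The point is the division of labor:
# the AI recommends, the human decides and acts, and the human's
# accept/reject choices feed back into the model.
import random

class ToyRecommender:
    """Stand-in model: learns to prefer suggestions humans accept."""
    def __init__(self, options):
        self.scores = {option: 0.0 for option in options}

    def recommend(self):
        # Slight randomness so every option still gets explored.
        return max(self.scores, key=lambda o: self.scores[o] + random.random())

    def learn(self, suggestion, accepted):
        self.scores[suggestion] += 1.0 if accepted else -1.0

def pharmacist_decides(suggestion):
    # Stand-in for the human: accepts only suggestions that make sense to them.
    return suggestion in {"Pause", "Empathize"}

model = ToyRecommender(["Slow down", "Pause", "Empathize"])
for _ in range(50):
    suggestion = model.recommend()             # AI proposes
    accepted = pharmacist_decides(suggestion)  # human decides and acts
    model.learn(suggestion, accepted)          # AI adapts for the future

print(model.scores)  # rejected options drift down, accepted ones drift up
```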
05:05
It's basically open, frequent, two-way communication,
05:09
which, as any couples therapist will tell you,
05:11
is very important for any good relationship.
05:15
Now the key word here is relationship.
05:18
Think about your own personal relationships with other people.
05:22
You don't have the same kind of relationship with your accountant
05:26
or your boss or your spouse, do you?
05:28
Well, I certainly hope not.
05:30
And just like that,
05:32
the right relationship between human and AI in a company
05:36
is not one-size-fits-all.
05:38
So in the case of Humana, AI was a recommender
05:42
and the human was the decision-maker and actor.
05:45
In some other examples, AI might be an evaluator,
05:49
where a human comes up with ideas or scenarios,
05:52
and AI evaluates the complex implications and tradeoffs of those ideas
05:57
and makes it easy for humans to decide the best course of action.
06:02
In some other examples, AI might take a more creative role.
06:06
It could be an illuminator, where it can take a complex problem
06:10
and come up with potential solutions to that problem
06:13
and illuminate some options
06:15
that might have been impossible for humans to see.
06:18
Let me give you another example.
06:21
During the COVID pandemic,
06:22
if you walked into a retail or grocery store,
06:25
you saw that many retailers were struggling.
06:29
Their shelves were empty,
06:30
their suppliers were not able to fulfill the orders,
06:33
and with all the uncertainties of the pandemic,
06:36
they simply had no idea how many people would be walking into what stores,
06:41
demanding what products.
06:43
Now, to put this in perspective,
06:45
this is a problem that's already quite hard when things are normal.
06:49
Retailers have to predict demand
06:52
for tens of thousands of products across thousands of locations
06:56
and thousands of suppliers every day
06:59
to manage and optimize their inventory.
07:02
Add to that the uncertainties of COVID and the global supply chain disruptions,
07:07
and this became 100 times more difficult.
07:10
And many retailers were simply paralyzed.
07:13
But there were a few who had built strong foundations with AI
07:17
and the human-AI feedback loop that we talked about.
07:20
And these guys were able to navigate all this uncertainty
07:23
much better than others.
07:26
They used AI to analyze tens of billions of data points
07:29
on consumer behavior and global supply chain disruptions
07:33
and local government closures and mandates
07:36
and traffic on highways
07:38
and ocean freight lanes and many, many other factors,
07:40
and get a pretty good handle on what consumers in each unique area
07:45
wanted the most,
07:47
what would have been feasible,
07:48
and for items that were not available,
07:50
what substitutions could be made.
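As a rough illustration of that kind of demand-prediction-plus-substitution logic, here is a toy sketch; the data, features, and substitution table are invented for the example, not any retailer's actual system.

```python
# Illustrative sketch of demand forecasting with a substitution fallback.
# The features, data, and substitution table are invented for this example.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical history: one row per (store, product, day).
history = pd.DataFrame({
    "store_traffic":       [120, 80, 200, 150, 90, 180],
    "supplier_delay_days": [0,   5,  2,   7,   1,  3],
    "local_closure":       [0,   1,  0,   1,   0,  0],
    "units_sold":          [95,  30, 170, 60,  70, 150],
})
features = ["store_traffic", "supplier_delay_days", "local_closure"]
model = GradientBoostingRegressor().fit(history[features], history["units_sold"])

# Forecast tomorrow's demand, then fall back to a substitute when the
# forecast exceeds what suppliers can actually deliver.
tomorrow = pd.DataFrame(
    {"store_traffic": [140], "supplier_delay_days": [6], "local_closure": [1]}
)
forecast = model.predict(tomorrow)[0]
available = 40
substitutes = {"toilet paper": "paper towels"}

print(f"Forecast: {forecast:.0f} units; {available} available")
if forecast > available:
    print(f"Shortfall: suggest substitute {substitutes['toilet paper']!r}")
```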
07:53
But AI alone without the human touch wouldn't work either.
07:57
There were ethical and economic tradeoffs that had to be considered.
08:00
For example, deciding to bring in a product
08:03
that didn't have a good margin for the retailer
08:06
but would really help support the local community
08:09
at their time of need.
08:11
After all, AI couldn't quite understand
08:13
the uniquely human behavior of panic-buying toilet paper
08:17
or tens of gallons of liquor,
08:19
only to be used as hand sanitizer.
08:22
It was the combination that was the key.
08:25
And the winning companies know this.
08:28
They also know that inside their companies,
08:30
there are literally hundreds of these opportunities for human-AI combination,
08:35
and they actively identify and pursue them.
08:38
They think of AI much more broadly than as a means to replace people.
08:44
They look inside their organizations
08:46
and re-imagine how the biggest challenges and opportunities of their company
08:51
can be addressed
08:52
by the combination of human and AI.
08:55
And they put in place the right combination for each unique situation,
09:00
whether it's the recommender or the evaluator
09:03
or the illuminator or the optimizer or many, many other ones.
09:07
They build and evolve the feedback loops that we talked about.
09:11
And finally and most importantly, they don't just throw technology at it.
09:16
In fact, this has been the biggest pitfall of companies
09:20
who don't get a return from their AI investments:
09:23
they overinvest in technology,
09:25
expecting a piece of tech to solve all their problems.
09:29
But there is no silver bullet.
09:30
Technology and automation can only go so far,
09:33
and for every one automation opportunity inside a company,
09:37
there are literally ten for collaboration.
09:40
But collaboration's hard.
09:42
It requires a new mindset
09:44
and doing things differently than how we've always done it.
09:48
And the winning companies know this, too,
09:50
which is why they don't just invest in technology,
09:52
but so much more on human factors,
09:55
on their people, on training and reskilling,
09:58
and reimagining how their people and AI work together in new ways.
10:03
Inside these companies, it's not just machines replacing humans.
10:07
It's machines and humans working together,
10:10
learning from each other.
10:12
And when that happens,
10:14
the organization's overall rate of learning increases,
10:17
which in turn makes the company much more agile,
10:20
much more resilient,
10:21
ready to adapt and take on any challenge.
10:25
It is the human touch that will bring the best out of AI.
10:29
Thank you.
10:31
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7