AI and the Paradox of Self-Replacing Workers | Madison Mohns | TED

67,036 views ・ 2024-03-22

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: Sarah Hong κ²€ν† : DK Kim
00:04
I'm going about my day, a normal Tuesday of meetings,
00:06
when I get a ping from my manager's manager's manager.
00:12
It says: β€œGet me a document by the end of the day
00:14
that records everything your team has been working on related to AI.”
00:18
As it turns out, the board of directors of my large company
00:21
had been hearing buzz about this new thing called ChatGPT,
00:24
and they wanted to know what we were doing about it.
00:27
They are freaking out about the future,
00:29
I'm freaking out about this measly document.
00:32
It sounds like the perfect start
00:33
to solving the next hottest problem in tech, right?
00:36
As someone who works with machine-learning models
00:38
every single day,
00:39
I know firsthand that the rapid development of these technologies
00:43
presents endless opportunities for innovation.
00:46
However, the same exponential improvement in AI systems
00:50
is becoming a looming existential threat to the team I manage.
00:54
With increasing accessibility
00:55
and creepily human-like results coming out of the field of AI research,
00:59
companies like my own are turning toward automation to make things more efficient.
01:04
Now on the surface, this seems like a pretty great vision.
01:07
But as we start to dig deeper, we uncover an uncomfortable paradox.
01:11
Let's break this down.
01:13
In order to harness the power of AI systems,
01:16
these systems must be trained and fine-tuned
01:18
to match a high-quality standard.
01:20
But who defines quality,
01:23
and who trains these systems in the first place?
01:26
As you may have guessed, real-life subject matter experts,
01:30
oftentimes the same exact people who are currently doing the job.
01:34
Imagine my predicament here.
01:37
I get to go to my trusted team, whom I've worked with for years,
01:40
look them in the eyes
01:41
and pitch them on training the very systems that might displace them.
01:46
This paradox has led me to rely on three ethical principles
01:50
that can help managers grapple with the implications
01:53
of a self-replacing workforce.
01:56
One, transformational transparency.
01:58
Two, collaborative AI augmentation.
02:01
And three, reskilling to realize potential.
02:05
Now before we get into solutions, let’s zoom out a little bit.
02:09
How deep is this problem of self-replacing workers, really?
02:13
Research from this year coming out of OpenAI
02:15
indicates that approximately 80 percent of the US workforce
02:19
could see up to 10 percent of their tasks impacted
02:21
by the introduction of AI,
02:23
while around 19 percent of the workforce
02:26
could see up to 50 percent of their tasks impacted.
02:29
The craziest thing about all of this
02:31
is that these technologies do not discriminate.
02:35
Occupations that have historically required an immense amount of training
02:39
or education are equally vulnerable to being outsourced to AI.
02:44
Now before we throw our hands up and let the robots take over,
02:48
let's put this all into perspective.
02:50
Fortunately for us,
02:52
this is not the first time in history that this has happened.
02:54
Let's go back to the Industrial Revolution.
02:57
Picture Henry Ford’s iconic Model T automobile production line.
03:01
In this remarkable setup,
03:03
workers and machines engaged in a synchronous dance.
03:06
They were assigned specific, repetitive tasks,
03:09
such as tightening bolts or fitting components
03:11
as the product moved down the line.
03:14
Ironically, and not dissimilar to my current predicament,
03:17
the humans themselves played a crucial role in training the systems
03:21
that would eventually replace their once multi-skilled roles.
03:24
They were the ones who honed their craft, perfected the techniques,
03:28
and ultimately handed off the knowledge to the technicians
03:31
and engineers involved in automating their entire process.
03:35
Now at the outset, this situation seems pretty dire.
03:40
Yet despite initial fears and hesitations
03:43
involved in these technological advancements,
03:45
history has proven that humans have continuously found ways
03:49
to adapt and innovate.
03:51
While some roles were indeed replaced, new roles emerged,
03:54
requiring higher-level skills like creativity
03:57
and creative problem solving that machines simply couldn't replicate.
04:02
Reflecting on this historical example
04:04
reminds us that the relationship between humans and machines
04:07
has always been a delicate balancing act.
04:10
We are the architects of our own progress,
04:13
often training machines to replace us
04:16
while simultaneously carving out unique roles for ourselves
04:19
and discovering new possibilities.
04:22
Now coming back to the present day, we are on the cusp of the AI revolution.
04:26
For someone responsible for moving that revolution forward,
04:29
the tension becomes omnipresent.
04:31
Option one, I can innovate quickly and risk displacing my team.
04:36
Or option two, I can refuse to innovate in an effort to protect my team,
04:41
but ultimately still lose people because the company falls behind.
04:45
So what am I supposed to do
04:47
as a mere middle manager in this situation?
04:50
Knowingly introducing this complex paradox to your team
04:53
presents real challenges for people management.
04:56
Luckily, we can refer back to those three ethical principles
04:59
I addressed at the beginning of the talk
05:01
to ensure that you can continue to move ahead
05:03
without leaving your people behind.
05:06
First and foremost,
05:07
AI transformation needs to be transparent.
05:11
As leaders, it is imperative to foster dialogue,
05:13
address key concerns,
05:15
and offer concise explanations regarding the purpose
05:17
and potential challenges entailed in implementing AI.
05:21
This requires actively involving your employees
05:24
in the decision-making process
05:25
and valuing their autonomy.
05:28
By introducing the concept of consent,
05:30
especially for employees who are tasked
05:32
with automating their core responsibilities,
05:35
we can ensure that they maintain a strong voice
05:37
in carving out their professional destiny.
05:41
Next, now that we've gotten folks bought into this grandiose vision
05:44
while acknowledging the journey that lies ahead,
05:47
let's talk about how to use AI as an augmentation device.
05:51
Picture the worst part of your job today.
05:54
What if you could delegate it?
05:56
And no, not hand it off to some other sad soul at work,
05:59
but hand it to a system that can do your rote tasks for you.
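[Editor's illustration: a minimal sketch of what delegating a rote drafting task to a language model might look like. It assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment; the function name, model choice, and prompt are illustrative stand-ins, not a workflow described in the talk.]

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_status_report(bullet_notes):
    # Turn rough bullet notes into a first-draft status report,
    # the kind of rote writing task the talk suggests handing off.
    notes = "\n".join("- " + n for n in bullet_notes)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model would do
        messages=[
            {"role": "system",
             "content": "You draft concise internal status reports."},
            {"role": "user",
             "content": "Draft a short status report from these notes:\n" + notes},
        ],
    )
    return response.choices[0].message.content

print(draft_status_report([
    "automated the weekly data pull",
    "freed up two days for analysis work",
]))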
06:03
Instead of perceiving AI as a complete replacement,
06:06
identify opportunities where you can use it
06:09
to enhance your employees' potential and productivity.
06:13
Collaboratively with your team,
06:14
identify areas and tasks that can be automated,
06:18
carving out more room for higher-value activities
06:20
requiring critical thinking that machines just aren't very good at doing.
06:26
Let's put this into an example.
06:28
Recently, I completed a project with my team at work
06:30
that's going to save our company over 12,000 working hours.
06:35
The folks involved in training this algorithm
06:37
are the same subject matter experts who worked tirelessly last year
06:40
to hand-curate and research data to optimize segmented experiences
06:45
across our website.
06:47
Now because of the sheer amount of time spent and the level of detail involved,
06:52
I would have expected
06:53
that there was an immense amount of pride behind this workflow.
06:57
But to my surprise, as it turns out,
06:59
the subject matter experts who built this model
07:02
were actually excited to hand these tasks off to automation.
07:05
There were things that they would have much rather spent their time on,
07:09
like optimizing existing data to perform better on product surfaces
07:12
or even researching and developing new insights to augment
07:15
where the model simply doesn't do as well.
07:19
Lastly, we must reskill in order to avoid replacement.
07:24
Knowingly investing in the professional development
07:27
and well-being of our workforce
07:28
ensures that they are equipped with the skills and knowledge
07:31
needed to thrive in an AI-powered future.
07:34
By providing opportunities for upskilling and reskilling,
07:38
we can empower our employees to rethink their roles as they exist today
07:42
and carve out new possibilities that align with their evolving expertise
07:46
and interests.
07:47
So how does this work in practice?
07:50
When I started introducing AI as a way to accelerate my team's workflows,
07:54
I used it as an opportunity to improve my team's technical literacy.
07:58
I worked with my team of engineers on a tool
08:01
that could transparently identify
08:03
the impact of data on a model's outcomes.
08:06
I then went to my operations analyst,
08:08
who didn't have technical training at the time,
08:10
and they were able to quickly identify areas where the model was underperforming
08:15
and hand off direct suggestions to my data science team
08:18
to make those models do better next time.
08:21
Fostering a culture of continuous learning
08:24
and reskilling is paramount.
08:26
It makes AI transformation a lot more exciting and a lot less scary.
08:31
We have reached a critical juncture
08:34
where the rapid development of AI technology
08:36
poses both opportunities and challenges.
08:39
As managers and leaders,
08:41
it is imperative that we navigate this terrain
08:43
with both sensitivity and foresight.
08:45
By embracing innovation,
08:47
fostering a culture of adaptation,
08:49
and ultimately intentionally investing in the professional development
08:54
and well-being of our workforce,
08:55
we can ensure that we are preparing our team
08:58
for the challenges that lie ahead
08:59
while addressing the complexities of introducing AI.
09:03
Together, let's forge a future that harmoniously combines human ingenuity
09:08
and technological progress,
09:10
where AI enhances human potential
09:12
rather than replacing it.
09:14
Thank you.
09:15
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7