Leaders and machines

14,145 views ใƒป 2022-03-15

BBC Learning English


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋ฒˆ์—ญ๋œ ์ž๋ง‰์€ ๊ธฐ๊ณ„ ๋ฒˆ์—ญ๋ฉ๋‹ˆ๋‹ค.

00:01
Robots might be on the factory floor now, but could they one day be your boss?

00:07
Robots are not perfect and they are not perfect for everything, but they do have a capability to do some things that humans can only dream of doing.

00:17
We are going to look at leadership in a future working alongside robots, and what this could mean for you.

00:27
A global tech giant, Alibaba rivals Amazon for the title of world's largest online retailer. It has some 800 million users. It doesn't just help you shop; it can also bank your money and store your data. Alibaba has got so big, the Chinese government wants new rules to curb its power.

00:50
The company has long embraced artificial intelligence, or AI. It uses algorithms to provide a personalised service for its customers and robots to process and pack goods in its warehouses.

01:04
Jack Ma, who founded the firm, believes robots could one day run companies. He says, 'In 30 years, a robot will likely be on the cover of Time magazine as the best CEO.'

01:17
However, another pioneer of the tech world, the American Elon Musk, is more worried. He fears robots could one day get rid of us entirely.

01:26
So, how will leadership look in a world of AI?

01:31
So, a big advantage for human beings, in having more robots and AI in the workplace, is clearly that these technologies can perhaps in the future do a lot of the dirty work for us, and by dirty work I think I mean things like heavy lifting, cleaning, moving goods from A to B, but it can also mean repetitive, computer-based tasks, and it's not very healthy for human beings to be in front of a computer for extended periods of time.

01:57
And that can free up human beings to do a lot more thinking: big thoughts about the future, about what a carbon-neutral planet looks like, about the kinds of communities we want to develop.

02:11
So, robots and AI could free us from the dull, repetitive work we don't want to do. But aren't there dangers with that?

02:20
So, the big danger essentially is that, if our workplace has become more automated and busier with robotics, we'll have to have something to do, and governments will have to find solutions for ever greater numbers of people, who might not be out of work, but sort of hopping from one insecure temporary job to another. And that presents really big social challenges, actually.

02:44
Giving more jobs to robots and AI is going to present huge social challenges to humans.

02:51
Where does leadership fit into this?

02:54
A key part of leadership, as opposed to management, is how central care is to leadership: care, understanding and empathy. And so, in its most obvious guises, we can think of caring for others, the people who are more vulnerable than ourselves – and this is just really something where robots, no matter how sophisticated, can't replace human beings.

03:14
But the central task of leadership, which is walking with people, responding to their needs, caring for them – on the big issues as well as the small issues – is something that robots will probably never be able to compensate for.

03:27
So, qualities such as empathy, care and understanding in leadership will be very important – human skills that robots will probably never acquire.

03:37
There are loads of ethical responsibilities for people who are creating AI, but also people who are in charge of implementing it and seeing how it progresses through organisations and society.

03:48
The main ethical issue here, I think, is that AI, in some senses, is a kind of automation of a human will, which can unfortunately include lots of human prejudices.

03:58
So, for example, there have been problems with policing algorithms, in the sense that they have reflected, maybe, some underlying racial biases of certain police forces, in terms of where they deploy resources.

04:10
So, we have to keep a really close eye, and it's a really big ethical leadership responsibility to keep a close eye on how artificial intelligence is deployed – that it doesn't get out of hand and doesn't actually automate really serious ethical problems.

04:25
We need our leaders to be ethical and responsible when implementing AI, so that we don't automate and repeat our own human prejudices.

04:40
Could you one day have a boss like this? Meet Ai-Da, the first human-like robot artist. Ai-Da can draw and recite poetry.

04:51
'The eerie sounds, which echoed throughout with the weight of the world itself.'

04:59
But what her creators really want her to do is to get people thinking about a world with AI, and that includes thinking about the impact it will have on leadership.

05:13
Robots are great, but not human. They are not perfect and they are not perfect for everything, but they do have a capability to do some things that humans can only dream of doing. More challenging is how to direct this capability for a sustainable environment and our future. This is difficult.

05:37
Robots will bring opportunities and challenges.

05:41
Would robots make better leaders than humans?

05:44
They work together. At the end of the day, the state of the game is up to how we design and use technology/robots.

05:55
So, humans need to be more conscious of how much we are doing and what we are doing, when we create and use new technology. So we, as humans, and robots need to work together.

06:07
We should understand the power of new technologies. In order to truly appreciate the opportunities that are emerging, we need to first understand the past. With this, we can be more aware and flexible to deal with an unpredictable future.

06:27
The world is changing. The future is not what we consider it to be. It is requiring from us a willingness to give up our comfort zone.

06:39
Giving up your comfort zone means opening yourself up to new possibilities. To do this, it helps to understand the past.

06:48
So, what does Ai-Da think is a key quality for leadership?

06:52
Humility: we are to be humble in every action, and this includes what we are willing to do and say to help others. Leadership is not the same thing as success or failure. We have all failed, but we can recognise the mistakes that we made and learn from them. Humility is when you put yourself in someone else's shoes.

07:17
Showing humility, recognising mistakes and learning from them are qualities this robot wants to see in leaders.

07:31
It's hard to know exactly how leadership will look in the future, but it is clear that human qualities of care and empathy will be vital. Having a detailed knowledge of the technology and its potential is important, as is being ethical and responsible in how we use it.

Original video on YouTube.com
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7