Leaders and machines

BBC Learning English · 2022-03-15

00:01
Robots might be on the factory floor now, but could they one day be your boss?

00:07
Robots are not perfect and they are not perfect for everything, but they do have a capability to do some things that humans can only dream of doing.

00:17
We are going to look at leadership in a future working alongside robots, and what this could mean for you.

00:27
A global tech giant, Alibaba rivals Amazon for the title of world's largest online retailer. It has some 800 million users. It doesn't just help you shop; it can also bank your money and store your data. Alibaba has got so big, the Chinese government wants new rules to curb its power. The company has long embraced artificial intelligence, or AI. It uses algorithms to provide a personalised service for its customers and robots to process and pack goods in its warehouses. Jack Ma, who founded the firm, believes robots could one day run companies. He says, 'In 30 years, a robot will likely be on the cover of Time magazine as the best CEO.' However, another pioneer of the tech world, the American Elon Musk, is more worried. He fears robots could one day get rid of us entirely.

01:26
So, how will leadership look in a world of AI?

01:31
So, a big advantage for human beings,
26
91800
2600
ืื– ื™ืชืจื•ืŸ ื’ื“ื•ืœ ืœื‘ื ื™ ืื“ื,
01:34
ย  in having more robots and AI in the workplace,
27
94400
3280
ื‘ื›ืš ืฉื™ืฉ ื™ื•ืชืจ ืจื•ื‘ื•ื˜ื™ื ื•ื‘ื™ื ื” ืžืœืื›ื•ืชื™ืช ื‘ืžืงื•ื ื”ืขื‘ื•ื“ื”,
01:37
are clearly that... that these technologies
28
97680
2080
ื‘ืจื•ืจ ืฉ... ืฉื”ื˜ื›ื ื•ืœื•ื’ื™ื•ืช ื”ืืœื”
01:39
can perhaps in the future do a lot of the dirty work for us,
29
99760
3200
ื™ื›ื•ืœื•ืช ืื•ืœื™ ื‘ืขืชื™ื“ ืœืขืฉื•ืช ื”ืจื‘ื” ืžื”ืขื‘ื•ื“ื” ื”ืžืœื•ื›ืœื›ืช ืขื‘ื•ืจื ื•,
01:42
and by dirty work, I think I mean things like
30
102960
2840
ื•ื‘ืขื‘ื•ื“ื” ืžืœื•ื›ืœื›ืช, ืื ื™ ื—ื•ืฉื‘ ืื ื™ ืžืชื›ื•ื•ืŸ ืœื“ื‘ืจื™ื ื›ืžื•
01:45
heavy lifting, cleaning, moving goods from A to B,
31
105800
4120
ื”ืจืžื” ื›ื‘ื“ื”, ื ื™ืงื™ื•ืŸ, ื”ืขื‘ืจืช ืกื—ื•ืจื•ืช ืž-A ืœ-B,
01:49
but it can also mean repetitive, computer-based tasks
32
109920
2640
ืื‘ืœ ื–ื” ื™ื›ื•ืœ ืœื”ื™ื•ืช ื’ื ืžืฉื™ืžื•ืช ืฉื—ื•ื–ืจื•ืช ืขืœ ืขืฆืžืŸ, ืžื‘ื•ืกืกื•ืช ืžื—ืฉื‘,
01:52
and it's not very healthy for human beings
33
112560
1680
ื•ื–ื” ืœื ืžืื•ื“ ื‘ืจื™ื ืœื‘ื ื™ ืื“ื
01:54
to be in front of a computer for extended periods of time.
34
114240
3280
ืœื”ื™ื•ืช ืžื•ืœ ื”ืžื—ืฉื‘ ืœืคืจืงื™ ื–ืžืŸ ืžืžื•ืฉื›ื™ื.
01:57
And that can free up human beings to do a lot more thinking:
35
117520
3200
ื•ื–ื” ื™ื›ื•ืœ ืœืฉื—ืจืจ ืืช ื‘ื ื™ ื”ืื“ื ืœืขืฉื•ืช ื”ืจื‘ื” ื™ื•ืชืจ ื—ืฉื™ื‘ื”:
02:00
to think about โ€“ big thoughts about the future,
36
120720
2680
ืœื—ืฉื•ื‘ ืขืœ - ืžื—ืฉื‘ื•ืช ื’ื“ื•ืœื•ืช ืขืœ ื”ืขืชื™ื“,
02:03
about what a carbon-neutral planet looks like,
37
123400
3880
ืขืœ ืื™ืš ื ืจืื” ื›ื•ื›ื‘ ืœื›ืช ื ื™ื™ื˜ืจืœื™ ืคื—ืžืŸ,
02:07
about the kinds of communities we want to develop.
38
127280
4320
ืขืœ ืกื•ื’ื™ ื”ืงื”ื™ืœื•ืช ืฉืื ื—ื ื• ืจื•ืฆื™ื ืœืคืชื—.
02:11
So, robots and AI could free us from the dull, repetitive work we don't want to do. But aren't there dangers with that?

02:20
So, the big danger essentially is that,
42
140280
2240
ืื–, ื”ืกื›ื ื” ื”ื’ื“ื•ืœื” ื”ื™ื ื‘ืขืฆื ืฉืื
02:22
if our workplace has become more automated
43
142520
3200
ืžืงื•ื ื”ืขื‘ื•ื“ื” ืฉืœื ื• ื”ืคืš ืื•ื˜ื•ืžื˜ื™ ื™ื•ืชืจ
02:25
and busier with robotics,
44
145720
2880
ื•ืขืกื•ืง ื™ื•ืชืจ ื‘ืจื•ื‘ื•ื˜ื™ืงื”, ื™ื”ื™ื” ืœื ื• ืžื” ืœืขืฉื•ืช
02:28
that we'll have to have something to do,
45
148600
1440
,
02:30
and governments will have to find solutions
46
150040
3640
ื•ืžืžืฉืœื•ืช ื™ืฆื˜ืจื›ื• ืœืžืฆื•ื ืคืชืจื•ื ื•ืช
02:33
to ever greater numbers of people, who might not be out of work,
47
153680
4080
ืœืžืกืคืจ ื’ื“ื•ืœ ื™ื•ืชืจ ืฉืœ ืื ืฉื™ื, ืฉืื•ืœื™ ืœื ื™ื”ื™ื• ืœืœื ืขื‘ื•ื“ื”,
02:37
but sort-of hopping from one insecure temporary job to another.
48
157760
3400
ืื‘ืœ ืžืขื™ืŸ ื“ื™ืœื•ื’ ืžืขื‘ื•ื“ื” ื–ืžื ื™ืช ืœื ื‘ื˜ื•ื—ื” ืื—ืช ืœืื—ืจืช.
02:41
And that presents really big, actually, social challenges.
49
161160
3680
ื•ื–ื” ืžืฆื™ื’ ืืชื’ืจื™ื ื—ื‘ืจืชื™ื™ื ื’ื“ื•ืœื™ื ื‘ืืžืช.
02:44
Giving more jobs to robots and AI
50
164840
2640
ืžืชืŸ ื™ื•ืชืจ ืžืฉืจื•ืช ืœืจื•ื‘ื•ื˜ื™ื ื•ื‘ื™ื ื” ืžืœืื›ื•ืชื™ืช
02:47
is going to present huge social challenges to humans.
51
167480
3800
ืขื•ืžื“ ืœื”ืฆื™ื‘ ืืชื’ืจื™ื ื—ื‘ืจืชื™ื™ื ืขืฆื•ืžื™ื ืœื‘ื ื™ ืื“ื.
02:51
Where does leadership fit into this?
52
171280
3000
ืื™ืคื” ืžื ื”ื™ื’ื•ืช ืžืฉืชืœื‘ืช ื‘ื–ื”?
02:54
A key part of leadership, as opposed to management,
53
174280
3280
ื—ืœืง ืžืจื›ื–ื™ ื‘ืžื ื”ื™ื’ื•ืช, ื‘ื ื™ื’ื•ื“ ืœื ื™ื”ื•ืœ,
02:57
is how central care is to leadership:
54
177560
3160
ื”ื•ื ืขื“ ื›ืžื” ื”ื˜ื™ืคื•ืœ ื”ื•ื ืžืจื›ื–ื™ ื‘ืžื ื”ื™ื’ื•ืช:
03:00
care, understanding and empathy.
55
180720
2000
ื˜ื™ืคื•ืœ, ื”ื‘ื ื” ื•ืืžืคืชื™ื”.
03:02
And so, in its most obvious guises, we can think of caring for others,
56
182720
4200
ื•ื›ืš, ื‘ืžืกื•ื•ื” ื”ื‘ืจื•ืจ ื‘ื™ื•ืชืจ ืฉืœื•, ืื ื—ื ื• ื™ื›ื•ืœื™ื ืœื—ืฉื•ื‘ ืขืœ ื“ืื’ื” ืœืื—ืจื™ื,
03:06
the people who are more vulnerable than ourselves โ€“
57
186920
2000
ื”ืื ืฉื™ื ื”ืคื’ื™ืขื™ื ื™ื•ืชืจ ืžืื™ืชื ื• โ€“
03:08
and this is just really something that robots, no matter how sophisticated,
58
188920
3600
ื•ื–ื” ืคืฉื•ื˜ ื‘ืืžืช ืžืฉื”ื• ืฉืจื•ื‘ื•ื˜ื™ื, ืžืชื•ื—ื›ืžื™ื ื›ื›ืœ ืฉื™ื”ื™ื•,
03:12
can't be replaced by human beings.
59
192520
1920
ืœื ื™ื›ื•ืœื™ื ืœื”ื™ื•ืช ืžื•ื—ืœืคื™ื ื‘ื‘ื ื™ ืื“ื.
03:14
But the central task of leadership, which is walking with people,
60
194440
3320
ืื‘ืœ ื”ืžืฉื™ืžื” ื”ืžืจื›ื–ื™ืช ืฉืœ ืžื ื”ื™ื’ื•ืช, ืฉื”ื™ื ื”ืœื™ื›ื” ืขื ืื ืฉื™ื,
03:17
responding to their needs, caring for them โ€“
61
197760
2320
ืžืขื ื” ืœืฆืจื›ื™ื”ื, ื˜ื™ืคื•ืœ ื‘ื”ื โ€“
03:20
on the big issues as well as the small issues โ€“
62
200080
2440
ื‘ื ื•ืฉืื™ื ื”ื’ื“ื•ืœื™ื ื›ืžื• ื’ื ื‘ื ื•ืฉืื™ื ื”ืงื˜ื ื™ื โ€“
03:22
is something that robots will probably never be able to compensate for.
63
202520
5120
ื”ื™ื ืžืฉื”ื• ืฉืจื•ื‘ื•ื˜ื™ื ื›ื ืจืื” ืœืขื•ืœื ืœื ื™ื•ื›ืœื• ืœืคืฆื•ืช ืขืœื™ื•.
03:27
So, qualities such as empathy, care and understanding in leadership will be very important – human skills that robots will probably never acquire.

03:37
There are loads of ethical responsibilities for people who are creating AI, but also for people who are in charge of implementing it and seeing how it progresses through organisations and society. The main ethical issue here, I think, is that AI, in some senses, is a kind of automation of a human will, which can unfortunately include lots of human prejudices. So, for example, there have been problems with policing algorithms, in the sense that they have reflected, maybe, some underlying racial biases of certain police forces, in terms of where they deploy resources. So, it's a really big, ethical leadership responsibility to keep a close eye on how artificial intelligence is deployed – that it doesn't get out of hand and doesn't actually automate really serious ethical problems.

04:25
We need our leaders to be ethical and responsible when implementing AI, so that we don't automate and repeat our own human prejudices.

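[Editor's aside: the sketch below is not from the programme; the districts, counts and rates are all invented for illustration. Under those assumptions, it shows how an allocation rule trained on historically skewed records can "automate and repeat" a human prejudice: more patrols go where more incidents were recorded, and more patrols produce more recorded incidents.]

    # Minimal sketch with invented data: two districts share the SAME true
    # crime rate, but district B was historically over-policed, so its
    # recorded incident count starts out higher.
    recorded = {"A": 100.0, "B": 150.0}  # historical recorded incidents
    TRUE_RATE = 1.0                      # identical underlying crime rate
    TOTAL_PATROLS = 100
    RECORDS_PER_PATROL = 0.5             # incidents recorded per patrol sent

    for year in range(1, 6):
        total = sum(recorded.values())
        # "Algorithmic" allocation: patrols proportional to past records.
        patrols = {d: TOTAL_PATROLS * n / total for d, n in recorded.items()}
        # New records depend on where patrols look, not on true crime alone.
        recorded = {d: TRUE_RATE * RECORDS_PER_PATROL * p
                    for d, p in patrols.items()}
        share_b = recorded["B"] / sum(recorded.values())
        print(f"Year {year}: district B gets {patrols['B']:.0f}/100 patrols "
              f"and {share_b:.0%} of new records")

Every year prints the same 60/40 split: although the true rates are equal, district B's historical over-count is faithfully reproduced and never corrected. Keeping "a close eye", as the speaker puts it, would mean auditing exactly this kind of loop.
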
04:40
Could you one day have a boss like this? Meet Ai-Da, the first human-like robot artist. Ai-Da can draw and recite poetry.

04:51
The eerie sounds, which echoed throughout
87
291480
4640
ื”ืฆืœื™ืœื™ื ื”ืžืคื—ื™ื“ื™ื, ืฉื”ื“ื”ื“ื• ืœื›ืœ ืื•ืจื›ื•
04:56
with the weight of the world itself.
88
296120
3800
ืขื ืžืฉืงืœ ื”ืขื•ืœื ืขืฆืžื•.
04:59
But what her creators really want her to do
89
299920
2920
ืื‘ืœ ืžื” ืฉื”ื™ื•ืฆืจื™ื ืฉืœื” ื‘ืืžืช ืจื•ืฆื™ื ืฉื”ื™ื ืชืขืฉื”
05:02
ย  is to get people thinking about a world with AI,
90
302840
4200
ื”ื•ื ืœื’ืจื•ื ืœืื ืฉื™ื ืœื—ืฉื•ื‘ ืขืœ ืขื•ืœื ืขื AI,
05:07
and that includes thinking about the impact it will have on leadership.
91
307040
4760
ื•ื–ื” ื›ื•ืœืœ ื—ืฉื™ื‘ื” ืขืœ ื”ื”ืฉืคืขื” ืฉืชื”ื™ื” ืœื• ืขืœ ืžื ื”ื™ื’ื•ืช.
05:13
Robots are great, but not human. They are not perfect and they are not perfect for everything, but they do have a capability to do some things that humans can only dream of doing. More challenging is how to direct this capability for a sustainable environment and our future. This is difficult. Robots will bring opportunities and challenges.

05:41
Would robots make better leaders than humans?

05:44
They work together.
101
344680
2000
ื”ื ืขื•ื‘ื“ื™ื ื™ื—ื“.
05:46
At the end of the day, the state of the game
102
346680
2920
ื‘ืกื•ืคื• ืฉืœ ื™ื•ื, ืžืฆื‘ ื”ืžืฉื—ืง
05:49
is up to how we design and use technology/robots.
103
349600
5440
ืชืœื•ื™ ื‘ืื™ืš ืื ื—ื ื• ืžืขืฆื‘ื™ื ื•ืžืฉืชืžืฉื™ื ื‘ื˜ื›ื ื•ืœื•ื’ื™ื”/ืจื•ื‘ื•ื˜ื™ื.
05:55
So, humans need to be more conscious
104
355040
2160
ืœื›ืŸ, ื‘ื ื™ ืื“ื ืฆืจื™ื›ื™ื ืœื”ื™ื•ืช ืžื•ื“ืขื™ื ื™ื•ืชืจ
05:57
of how much we are doing and what we are doing,
105
357200
2640
ืœื›ืžื” ืื ื—ื ื• ืขื•ืฉื™ื ื•ืžื” ืื ื—ื ื• ืขื•ืฉื™ื,
05:59
when we create and use new technology.
106
359840
3760
ื›ืืฉืจ ืื ื—ื ื• ื™ื•ืฆืจื™ื ื•ืžืฉืชืžืฉื™ื ื‘ื˜ื›ื ื•ืœื•ื’ื™ื” ื—ื“ืฉื”.
06:03
So we, as humans, and robots need to work together.
107
363600
4200
ืื– ืื ื—ื ื•, ื›ื‘ื ื™ ืื“ื, ื•ืจื•ื‘ื•ื˜ื™ื ืฆืจื™ื›ื™ื ืœืขื‘ื•ื“ ื™ื—ื“.
06:07
We should understand the power of new technologies. In order to truly appreciate the opportunities that are emerging, we need to first understand the past. With this, we can be more aware and flexible in dealing with an unpredictable future. The world is changing. The future is not what we consider it to be. It requires from us a willingness to give up our comfort zone.

06:39
Giving up your comfort zone means opening yourself up to new possibilities. To do this, it helps to understand the past. So, what does Ai-Da think is a key quality for leadership?

06:52
Humility: we are to be humble in every action, and this includes what we are willing to do and say to help others. Leadership is not the same thing as success or failure. We have all failed, but we can recognise the mistakes that we made and learn from them. Humility is when you put yourself in someone else's shoes.

07:17
Showing humility, recognising mistakes and learning from them are qualities this robot wants to see in leaders.

07:31
It's hard to know exactly how leadership will look in the future, but it is clear that human qualities of care and empathy will be vital. Having a detailed knowledge of the technology and its potential is important, as is being ethical and responsible in how we use it.

Original video on YouTube.com
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7