Leaders and machines

14,145 views ใƒป 2022-03-15

BBC Learning English


ไธ‹ใฎ่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจๅ‹•็”ปใ‚’ๅ†็”Ÿใงใใพใ™ใ€‚ ็ฟป่จณใ•ใ‚ŒใŸๅญ—ๅน•ใฏๆฉŸๆขฐ็ฟป่จณใงใ™ใ€‚

00:01
Robots might be on the factory floor now,
00:04
but could they one day be your boss?
00:07
Robots are not perfect and they are not perfect for everything,
00:11
but they do have a capability to do some things
00:15
that humans can only dream of doing.
00:17
We are going to look at leadership in a future working alongside robots,
00:22
and what this could mean for you.
00:27
A global tech giant, Alibaba rivals Amazon
00:31
for the title of world's largest online retailer.
00:35
It has some 800 million users.
00:38
It doesn't just help you shop;
00:40
it can also bank your money and store your data.
00:45
Alibaba has got so big,
00:46
the Chinese government wants new rules to curb its power.
00:50
The company has long embraced artificial intelligence, or AI.
00:56
It uses algorithms to provide a personalised service for its customers
01:00
and robots to process and pack goods in its warehouses.
01:04
Jack Ma, who founded the firm,
01:06
believes robots could one day run companies.
01:09
He says, 'In 30 years,
01:11
a robot will likely be on the cover of Time magazine
01:14
as the best CEO.'
01:17
However, another pioneer of the tech world,
01:20
the American Elon Musk, is more worried.
01:23
He fears robots could one day get rid of us entirely.
01:26
So, how will leadership look in a world of AI?
01:31
So, a big advantage for human beings,
01:34
in having more robots and AI in the workplace,
01:37
are clearly that... that these technologies
01:39
can perhaps in the future do a lot of the dirty work for us,
01:42
and by dirty work, I think I mean things like
01:45
heavy lifting, cleaning, moving goods from A to B,
01:49
but it can also mean repetitive, computer-based tasks
01:52
and it's not very healthy for human beings
01:54
to be in front of a computer for extended periods of time.
01:57
And that can free up human beings to do a lot more thinking:
02:00
to think about – big thoughts about the future,
02:03
about what a carbon-neutral planet looks like,
02:07
about the kinds of communities we want to develop.
02:11
So, robots and AI could free us
02:14
from the dull, repetitive work we don't want to do.
02:17
But aren't there dangers with that?
02:20
So, the big danger essentially is that,
02:22
if our workplace has become more automated
02:25
and busier with robotics,
02:28
that we'll have to have something to do,
02:30
and governments will have to find solutions
02:33
to ever greater numbers of people, who might not be out of work,
02:37
but sort-of hopping from one insecure temporary job to another.
02:41
And that presents really big, actually, social challenges.
02:44
Giving more jobs to robots and AI
02:47
is going to present huge social challenges to humans.
02:51
Where does leadership fit into this?
02:54
A key part of leadership, as opposed to management,
02:57
is how central care is to leadership:
03:00
care, understanding and empathy.
03:02
And so, in its most obvious guises, we can think of caring for others,
03:06
the people who are more vulnerable than ourselves –
03:08
and this is just really something where robots, no matter how sophisticated,
03:12
can't replace human beings.
03:14
But the central task of leadership, which is walking with people,
03:17
responding to their needs, caring for them –
03:20
on the big issues as well as the small issues –
03:22
is something that robots will probably never be able to compensate for.
03:27
So, qualities such as empathy, care
03:30
and understanding in leadership will be very important –
03:33
human skills that robots will probably never acquire.
03:37
There are loads of ethical responsibilities
03:39
for people who are creating AI,
03:41
but also people who are in charge of implementing it
03:44
and seeing how it progresses through organisations and society.
03:48
The main ethical issue, I think, here is that AI,
03:51
in some senses, is a kind of automation of a human will,
03:55
which can unfortunately include lots of human prejudices.
03:58
So, for example, there have been problems with policing algorithms,
04:03
in the sense that they have reflected, maybe, some underlying racial biases
04:07
of certain police forces, in terms of where they deploy resources.
04:10
So, we have to keep a really close eye
04:12
and it's a really big, ethical leadership responsibility
04:15
to keep a close eye on how artificial intelligence is deployed –
04:19
that it doesn't get out of hand and doesn't actually automate
04:22
really serious ethical problems.
04:25
We need our leaders to be ethical and responsible when implementing AI,
04:30
so that we don't automate and repeat our own human prejudices.
04:40
Could you one day have a boss like this?
04:43
Meet Ai-Da, the first human-like robot artist.
04:48
Ai-Da can draw and recite poetry.
04:51
The eerie sounds, which echoed throughout
04:56
with the weight of the world itself.
04:59
But what her creators really want her to do
05:02
is to get people thinking about a world with AI,
05:07
and that includes thinking about the impact it will have on leadership.
05:13
Robots are great, but not human.
05:16
They are not perfect and they are not perfect for everything,
05:20
but they do have a capability to do some things
05:24
that humans can only dream of doing.
05:27
More challenging is how to direct this capability
05:31
for a sustainable environment and our future.
05:35
This is difficult.
05:37
Robots will bring opportunities and challenges.
05:41
Would robots make better leaders than humans?
05:44
They work together.
05:46
At the end of the day, the state of the game
05:49
is up to how we design and use technology/robots.
05:55
So, humans need to be more conscious
05:57
of how much we are doing and what we are doing,
05:59
when we create and use new technology.
06:03
So we, as humans, and robots need to work together.
06:07
We should understand the power of new technologies.
06:11
In order to truly appreciate
06:13
the opportunities that are emerging,
06:16
we need to first understand the past.
06:20
With this, we can be more aware and flexible
06:23
to deal with an unpredictable future.
06:27
The world is changing.
06:29
The future is not what we consider it to be.
06:33
It is requiring from us
06:35
a willingness to give up our comfort zone.
06:39
Giving up your comfort zone means opening yourself up
06:42
to new possibilities.
06:44
To do this, it helps to understand the past.
06:48
So, what does Ai-Da think is a key quality for leadership?
06:52
Humility: we are to be humble in every action
06:57
and this includes what we are willing to do
06:59
and say to help others.
07:02
Leadership is not the same thing as success or failure.
07:06
We have all failed,
07:08
but we can recognise the mistakes that we made
07:10
and learn from them.
07:12
Humility is when you put yourself in someone else's shoes.
07:17
Showing humility, recognising mistakes and learning from them,
07:22
are qualities this robot wants to see in leaders.
07:31
It's hard to know exactly how leadership will look in the future,
07:35
but it is clear that human qualities of care and empathy will be vital.
07:39
Having a detailed knowledge of the technology
07:41
and its potential is important,
07:43
as is being ethical and responsible in how we use it.
