BOX SET: 6 Minute English - 'Artificial intelligence' English mega-class! 30 minutes of new vocab!

BBC Learning English

6 Minute English. From BBC Learning English.

Hello. This is 6 Minute English from BBC Learning English. I'm Neil.
And I'm Rob.
Now, I'm sure most of us have interacted with a chatbot. These are bits of computer technology that respond to text with text or respond to your voice. You ask it a question and usually it comes up with an answer.
Yes, it's almost like talking to another human, but of course it's not, it's just a clever piece of technology. It is becoming more 'sophisticated' - more 'advanced and complex' - but could they replace real human interaction altogether?
We'll discuss that more in a moment and find out if chatbots really think for themselves. But first I have a question for you, Rob. The first computer program that allowed some kind of plausible conversation between humans and machines was invented in 1966, but what was it called? Was it a) Alexa? b) ELIZA? Or c) PARRY?
Ah, well, it's not Alexa, that's too new, so I'll guess c) PARRY.
I'll reveal the answer at the end of the programme. Now, the old chatbots of the 1960s and '70s were quite basic, but more recently, the technology is able to predict the next word that is likely to be used in a sentence, and it learns words and sentence structures.
Mm, it's clever stuff. I've experienced using them when talking to my bank, or when I have problems trying to book a ticket on a website. I no longer phone a human, I speak to a virtual assistant instead.
Probably the most well-known chatbot at the moment is ChatGPT.
It is. The claim is that it's able to answer anything you ask it. This includes writing students' essays.
Now, this is something that was discussed on the BBC Radio 4 programme, Word of Mouth. Emily M Bender, Professor of Computational Linguistics at the University of Washington, explained why it's dangerous to always trust what a chatbot is telling us.
We tend to react to grammatical, fluent, coherent-seeming text as authoritative and reliable and valuable and we need to be on guard against that, because what's coming out of ChatGPT is none of that.
So, Professor Bender says that well-written text that is 'coherent' - that means it's 'clear, carefully considered and sensible' - makes us think what we are reading is reliable and 'authoritative'. So it's 'respected, accurate and important sounding'.
Yes, chatbots might appear to write in this way, but really, they are just predicting one word after another, based on what they have learnt.
We should, therefore, be 'on guard' - be 'careful and alert' - about the accuracy of what we are being told.
One concern is that chatbots - a form of artificial intelligence - work a bit like a human brain in the way it can learn and process information. They are able to learn from experience, something called deep learning. A cognitive psychologist and computer scientist called Geoffrey Hinton recently said he feared that chatbots could soon overtake the level of information that a human brain holds.
That's a bit scary, isn't it?
Mm, but for now, chatbots can be useful for practical information, but sometimes we start to believe they are human and we interact with them in a human-like way. This can make us believe them even more.
Professor Emily Bender, speaking on the BBC's Word of Mouth programme, explains why we might feel like that.
I think what's going on there is the kinds of answers you get depend on the questions you put in, because it's doing likely next word, likely next word, and so if, as the human interacting with this machine, you start asking it questions about how do you feel, you know, Chatbot? And "What do you think of this?" And, "What are your goals?" You can provoke it to say things that sound like what a sentient entity would say. We are really primed to imagine a mind behind language whenever we encounter language and so we really have to account for that when we're making decisions about these.
So, although a chatbot might sound human, we really just ask it things to get a reaction - we 'provoke' it - and it answers only with words it's learned to use before, not because it has come up with a clever answer.
But it does sound like a sentient entity - 'sentient' describes 'a living thing that experiences feelings'.
As Professor Bender says, we imagine that when something speaks, there is a mind behind it. But sorry, Neil, they are not your friend, they're just machines!
Yes, it's strange then that we sometimes give chatbots names. Alexa, Siri, and earlier I asked you what the name was for the first ever chatbot.
And I guessed it was PARRY. Was I right?
You guessed wrong, I'm afraid. PARRY was an early form of chatbot from 1972, but the correct answer was ELIZA. It was considered to be the first 'chatterbot' - as it was called then - and was developed by Joseph Weizenbaum at Massachusetts Institute of Technology.
Fascinating stuff.
OK, now let's recap some of the vocabulary we highlighted in this programme. Starting with 'sophisticated', which can describe technology that is 'advanced and complex'.
Something that is 'coherent' is 'clear, carefully considered and sensible'.
'Authoritative' means 'respected, accurate and important sounding'.
When you are 'on guard' you must be 'careful and alert' about something - it could be accuracy of what you see or hear, or just being aware of the dangers around you.
To 'provoke' means to 'do something that causes a reaction from someone'.
'Sentient' describes 'something that experiences feelings' - so it's 'something that is living'.
Once again, our six minutes are up. Goodbye.
Bye for now.

6 Minute English. From BBC Learning English.

Hello. This is 6 Minute English from BBC Learning English.
- I'm Sam. - And I'm Neil.
In the autumn of 2021, something strange happened at the Google headquarters in California's Silicon Valley. A software engineer called Blake Lemoine was working on the artificial intelligence project Language Models for Dialogue Applications, or LaMDA for short.
LaMDA is a 'chatbot' - a 'computer programme designed to have conversations with humans over the internet'.
After months talking with LaMDA on topics ranging from movies to the meaning of life, Blake came to a surprising conclusion - the chatbot was an intelligent person with wishes and rights that should be respected. For Blake, LaMDA was a Google employee, not a machine. He also called it his friend.
Google quickly reassigned Blake from the project, announcing that his ideas were not supported by the evidence. But what exactly was going on?
In this programme, we'll be discussing whether artificial intelligence is capable of consciousness. We'll hear from one expert who thinks AI is not as intelligent as we sometimes think and, as usual, we'll be learning some new vocabulary as well.
But before that, I have a question for you, Neil. What happened to Blake Lemoine is strangely similar to the 2013 Hollywood movie, Her, starring Joaquin Phoenix as a lonely writer who talks with his computer, voiced by Scarlett Johansson. But what happens at the end of the movie? Is it a) The computer comes to life? b) The computer dreams about the writer? Or c) The writer falls in love with the computer?
C) The writer falls in love with the computer.
OK, Neil, I'll reveal the answer at the end of the programme.
Although Hollywood is full of movies about robots coming to life, Emily Bender, Professor of Linguistics and Computing at the University of Washington, thinks AI isn't that smart. She thinks the words we use to talk about technology - phrases like 'machine learning' - give a false impression about what computers can and can't do.
Here is Professor Bender discussing another misleading phrase - 'speech recognition' - with BBC World Service programme The Inquiry.
If you talk about 'automatic speech recognition', the term 'recognition' suggests that there's something cognitive going on, where I think a better term would be automatic transcription. That just describes the input-output relation, and not any theory or wishful thinking about what the computer is doing to be able to achieve that.
Using words like 'recognition' in relation to computers gives the idea that something 'cognitive' is happening - something 'related to the mental processes of thinking, knowing, learning and understanding'.
But thinking and knowing are human, not machine, activities. Professor Bender says that talking about them in connection with computers is 'wishful thinking' - 'something which is unlikely to happen'.
The problem with using words in this way is that it reinforces what Professor Bender calls 'technical bias' - 'the assumption that the computer is always right'. When we encounter language that sounds natural, but is coming from a computer, humans can't help but imagine a mind behind the language, even when there isn't one.
In other words, we 'anthropomorphise' computers - we 'treat them as if they were human'.
Here's Professor Bender again, discussing this idea with Charmaine Cozier, the presenter of BBC World Service's The Inquiry.
So 'ism' means system, 'anthro' or 'anthropo' means human, and 'morph' means shape. And so this is a system that puts the shape of a human on something, and, in this case, the something is a computer. We anthropomorphise animals all the time, but we also anthropomorphise action figures, or dolls, or companies when we talk about companies having intentions and so on. We very much are in the habit of seeing ourselves in the world around us. And while we're busy seeing ourselves by assigning human traits to things that are not, we risk being blindsided. The more fluent that text is, the more different topics it can converse on, the more chances there are to get taken in.
If we treat computers as if they could think, we might get 'blindsided', or 'unpleasantly surprised'. Artificial intelligence works by finding patterns in massive amounts of data, so it can seem like we're talking with a human, instead of a machine doing data analysis.
As a result, we 'get taken in' - we're 'tricked or deceived' into thinking we're dealing with a human, or with something intelligent.
Powerful AI can make machines appear conscious, but even tech giants like Google are years away from building computers that can dream or fall in love. Speaking of which, Sam, what was the answer to your question?
I asked what happened in the 2013 movie, Her. Neil thought that the main character falls in love with his computer,
- which was the correct answer! - OK.
Right, it's time to recap the vocabulary we've learned from this programme about AI, including 'chatbots' - 'computer programmes designed to interact with humans over the internet'.
The adjective 'cognitive' describes anything connected with 'the mental processes of knowing, learning and understanding'.
'Wishful thinking' means 'thinking that something which is very unlikely to happen might happen one day in the future'.
To 'anthropomorphise' an object means 'to treat it as if it were human, even though it's not'.
When you're 'blindsided', you're 'surprised in a negative way'.
And finally, to 'get taken in' by someone means to be 'deceived or tricked' by them.
My computer tells me that our six minutes are up! Join us again soon, for now it's goodbye from us.
Bye!

6 Minute English. From BBC Learning English.

Hello, I'm Rob. Welcome to 6 Minute English and with me in the studio is Neil.
- Hello, Rob. - Hello.
Feeling clever today, Neil?
I am feeling quite bright and clever, yes!
That's good to hear.
Well, 'you'll need your wits about you' - meaning 'you'll need to think very quickly' in this programme, because we're talking about intelligence, or to be more accurate, artificial intelligence, and we'll learn some vocabulary related to the topic, so that you can have your own discussion about it. Neil, now, you know who Professor Stephen Hawking is, right?
Well, of course! Yes. Many people say that he's a 'genius' - in other words, he is 'very, very intelligent'. Professor Hawking is one of the most famous scientists in the world and people remember him for his brilliance and also because he communicates using a synthetic voice generated by a computer - 'synthetic' means it's 'made from something non-natural'.
'Artificial' is similar in meaning - we use it when something is 'man-made to look or behave like something natural'.
Well, Professor Hawking has said recently that efforts to create thinking machines are a threat to our existence. A 'threat' means 'something which can put us in danger'.
Now, can you imagine that, Neil?!
Well, there's no denying that good things can come from the creation of artificial intelligence. Computers which can think for themselves might be able to find solutions to problems we haven't been able to solve. But technology is developing quickly and maybe we should consider the consequences. Some of these very clever robots are already surpassing us, Rob.
'To surpass' means 'to have abilities superior to our own'.
Yes. Maybe you can remember the headlines when a supercomputer defeated the World Chess Champion, Gary Kasparov, to everyone's astonishment. It was in 1997. What was the computer called though, Neil? Was it a) Red Menace? b) Deep Blue? Or c) Silver Surfer?
Erm, I don't know. I think c) is probably not right. Erm... I think Deep Blue. That's b) Deep Blue.
OK. Well, you'll know if you got the answer right at the end of the programme.
Well, our theme is artificial intelligence and when we talk about this, we have to mention the movies.
Mm, many science fiction movies have explored the idea of bad computers who want to harm us. One example is 2001: A Space Odyssey.
Yes, a good film. And another is The Terminator, a movie in which actor Arnold Schwarzenegger played an android from the future. An 'android' is 'a robot that looks like a human'. Have you watched that one, Neil?
Yes, I have and that android is not very friendly.
No, it's not! In many movies and books about robots that think, the robots end up rebelling against their creators. But some experts say the risk posed by artificial intelligence is not that computers attack us because they hate us. Their problem is related to their efficiency.
What do you mean?
Well, let's listen to what philosopher Nick Bostrom has to say. He's the founder of the Future of Humanity Institute at Oxford University. He uses three words when describing what's inside the mind of a thinking computer. This phrase means 'to meet their objectives'. What's the phrase he uses?
The bulk of the risk is not in machines being evil or hating humans, but rather that they are indifferent to humans and that, in pursuit of their own goals, we humans would suffer as a side effect. Suppose you had a super intelligent AI whose only goal was to make as many paperclips as possible. Human bodies consist of atoms and those atoms could be used to make a lot of really nice paperclips. If you want paperclips, it turns out that in the pursuit of this, you would have instrumental reasons to do things that would be harmful to humanity.
A world in which humans become paperclips - wow, that's scary!
But the phrase which means 'meet their objectives' is to 'pursue their goals'.
Yes, it is. So the academic explains that if you're a computer responsible for producing paperclips, you will pursue your objective at any cost.
And even use atoms from human bodies to turn them into paperclips!
- Now that's a horror story, Rob. - Mm.
If Stephen Hawking is worried, I think I might be too! How can we be sure that artificial intelligence - be it a device or software - will have a moral compass?
Ah, a good expression - a 'moral compass' - in other words, 'an understanding of what is right and what is wrong'.
Artificial intelligence is an interesting topic, Rob. I hope we can chat about it again in the future.
But now I'm looking at the clock and we're running out of time, I'm afraid, and I'd like to know if I got the answer to the quiz question right?
Well, my question was about a supercomputer which defeated the World Chess Champion, Gary Kasparov, in 1997. What was the machine's name? Was it Red Menace, Deep Blue or Silver Surfer?
And I think it's Deep Blue.
Well, it sounds like you are more intelligent than a computer, because you got the answer right. Yes, it was Deep Blue. The 1997 match was actually the second one between Kasparov and Deep Blue, a supercomputer designed by the company IBM and it was specialised in chess-playing.
Well, I think I might challenge Deep Blue to a game! Obviously, I'm a bit, I'm a bit of a genius myself.
Very good! Good to hear! Anyway, we've just got time to remember some of the words and expressions that we've used today. Neil?
They were... you'll need your wits about you, artificial, genius, synthetic, threat, to surpass, to pursue their goals, moral compass.
Thank you. Well, that's it for this programme. Do visit BBC Learning English dot com to find more 6 Minute English programmes.
- Until next time, goodbye! - Goodbye!

6 Minute English. From BBC Learning English.

Hello. This is 6 Minute English. I'm Rob. And joining me to do this is Sam.
Hello.
In this programme, we're talking about robots. Robots can perform many tasks, but they're now being introduced in social care to operate as carers, to look after the sick and elderly.
We'll be discussing the positive and negative issues around this, but first, let's set you a question to answer, Sam. Are you ready for this?
Fire away!
Do you know in which year was the first commercial robot built? Was it in a) 1944? b) 1954? Or c) 1964?
They're not a brand-new invention, so I'll go for 1954.
OK, well, I'll tell you if you're right or wrong at the end of the programme. So, let's talk more about robots, and specifically ones that are designed to care for people. Traditionally, it's humans working as nurses or carers who take care of elderly people - those people who are too old or too unwell to look after themselves.
But finding enough carers to look after people is a problem - there are more people needing care than there are people who can help. And recently in the UK, the government announced a £34 million fund to help develop robots to look after us in our later years.
Well, robot carers are being developed, but can they really learn enough empathy to take care of the elderly and unwell? 'Empathy' is 'the ability to understand how someone feels by imagining what it would be like to be in that person's situation'.
Well, let's hear about one of those new robots now, called Pepper. Abbey Hearn-Nagaf is a research assistant at the University of Bedfordshire. She spoke to BBC Radio 4's You and Yours programme and explained how Pepper is first introduced to someone in a care home.
We just bring the robot to their room and we talk about what Pepper can't do, which is important, so it can't provide physical assistance in any way. It does have hands, it can wave. When you ask for privacy, it does turn around and sort of cover its eyes with its hands, but that's the most it does. It doesn't grip anything, it doesn't move anything, because we're more interested to see how it works as a companion, having something there to talk to, to converse with, to interact with.
So, Abbey described how the robot is introduced to someone. She was keen to point out that this robot has 'limitations' - 'things it can't do'. It can wave or turn round when a person needs 'privacy' - 'to be private' - but it can't provide 'physical assistance'. This means it can't help someone by 'touching or feeling' them.
But that's OK, Abbey says. This robot is designed to be a 'companion' - 'someone who is with you to keep you company' - a friend, in other words, that you can converse or talk with.
21:16
Well, having a companion is a good way to stop people getting lonely,
375
1276160
4400
ใใ†ใงใ™ใญใ€ไปฒ้–“ใŒใ„ใ‚‹ใ“ใจใฏ ๅญค็‹ฌใซใชใ‚‰ใชใ„ใŸใ‚ใฎ่‰ฏใ„ๆ–นๆณ•ใงใ™
21:20
but surely a human is better for that?
376
1280680
2960
ใŒใ€ใใฎใŸใ‚ใซใฏไบบ้–“ใฎๆ–นใŒ่‰ฏใ„ใฎใงใฏใชใ„ใงใ—ใ‚‡ใ†ใ‹๏ผŸ
21:23
Surely they understand you better than a robot ever can?
377
1283760
3680
ๅฝผใ‚‰ใฏ ใใฃใจใƒญใƒœใƒƒใƒˆใ‚ˆใ‚Šใ‚‚ใ‚ใชใŸใ‚’ใ‚ˆใ็†่งฃใ—ใฆใ„ใ‚‹ใฎใงใฏใชใ„ใงใ—ใ‚‡ใ†ใ‹?
21:27
Well, innovation means that robots are becoming cleverer all the time.
378
1287560
4640
ใใ†ใงใ™ใญใ€ใ‚คใƒŽใƒ™ใƒผใ‚ทใƒงใƒณใจใฏใ€ใƒญใƒœใƒƒใƒˆใŒ ๅธธใซ่ณขใใชใฃใฆใ„ใใ“ใจใ‚’ๆ„ๅ‘ณใ—ใพใ™ใ€‚
21:32
And, as we've mentioned, in the UK alone there is a growing elderly population
379
1292320
4720
ใใ—ใฆใ€ใ™ใงใซ่ฟฐในใŸใ‚ˆใ†ใซใ€่‹ฑๅ›ฝใ ใ‘ใงใ‚‚ ้ซ˜้ฝข่€…ไบบๅฃใŒๅข—ๅŠ ใ—ใฆใŠใ‚Šใ€
21:37
and more than 100,000 care assistant vacancies.
380
1297160
3360
ไป‹่ญทๅŠฉๆ‰‹ใฎๆฑ‚ไบบใฏ 10 ไธ‡ไบบไปฅไธŠใ‚ใ‚Šใพใ™ใ€‚
21:40
Who's going to do all the work?
381
1300640
1800
่ชฐใŒใ™ในใฆใฎไป•ไบ‹ใ‚’ใ™ใ‚‹ใฎใงใ—ใ‚‡ใ†ใ‹?
21:42
I think we should hear from Dr Sarah Woodin,
382
1302560
2640
21:45
a health researcher in independent living from Leeds University,
383
1305320
4040
21:49
who also spoke to the BBC's You and Yours programme.
384
1309480
3960
BBCใฎใ€ŒYou and Yoursใ€็•ช็ต„ใงใ‚‚่ฌ›ๆผ”ใ—ใŸใ€ใƒชใƒผใ‚บๅคงๅญฆใฎ่‡ช็ซ‹็”Ÿๆดปใซ้–ขใ™ใ‚‹ๅฅๅบท็ ”็ฉถ่€…ใ‚ตใƒฉใƒปใ‚ฆใƒƒใƒ‡ใ‚ฃใƒณๅšๅฃซใฎ่ฉฑใ‚’่ดใในใใ ใจๆ€ใ† ใ€‚
21:53
She seems more realistic about the introduction of robot carers.
385
1313560
5120
ๅฝผๅฅณใฏ ใƒญใƒœใƒƒใƒˆไป‹่ญท่€…ใฎๅฐŽๅ…ฅใซใคใ„ใฆใฏใ‚ˆใ‚Š็พๅฎŸ็š„ใงใ‚ใ‚‹ใ‚ˆใ†ใ ใ€‚
21:59
I think there are problems if we consider robots as replacement for people.
386
1319200
4600
ใƒญใƒœใƒƒใƒˆใ‚’ไบบ้–“ใฎไปฃใ‚ใ‚Šใจ่€ƒใˆใ‚‹ใจๅ•้กŒใŒใ‚ใ‚‹ใจๆ€ใ„ใพใ™ใ€‚
22:03
We know that money is tight โ€” if robots become mass-produced,
387
1323920
4680
่ณ‡้‡‘ใŒไธ่ถณใ—ใฆใ„ใ‚‹ใ“ใจใฏใ‚ใ‹ใฃใฆใ„ใพใ™ใ€‚ ใƒญใƒœใƒƒใƒˆใŒๅคง้‡็”Ÿ็”ฃใ•ใ‚Œใ‚Œใฐใ€
22:08
there could be large institutions where people might be housed
388
1328720
4200
ไบบ้–“ใ‚’ๅŽๅฎนใ—
22:13
and abandoned to robots.
389
1333040
2800
ใ€ใƒญใƒœใƒƒใƒˆใซไปปใ›ใ‚‹ๅคง่ฆๆจกใชๆ–ฝ่จญใŒใงใใ‚‹ๅฏ่ƒฝๆ€งใŒใ‚ใ‚Šใพใ™ใ€‚
22:15
I do think questions of ethics
390
1335960
1480
ๅ€ซ็†ใฎๅ•้กŒใ‚‚ๆˆ้•ทใจ้›‡็”จใฎ่ชฒ้กŒใซ็ต„ใฟ่พผใ‚€ๅฟ…่ฆใŒใ‚ใ‚‹ใจ็งใฏ่€ƒใˆใฆใ„ใพใ™ใ€‚ใชใœใชใ‚‰ใ€ๅ€ซ็†ใฎๅ•้กŒใจ
22:17
need to come into the growth and jobs agenda as well,
391
1337560
3600
ๆˆ้•ทใจ้›‡็”จใฎๅ•้กŒใฏ
22:21
because, sometimes, they're treated very separately.
392
1341280
2440
ใ€ๆ™‚ใจใ—ใฆ ใพใฃใŸใๅˆฅใ€…ใซๆ‰ฑใ‚ใ‚Œใ‚‹ใ“ใจใŒใ‚ใ‚‹ใ‹ใ‚‰ใงใ™ใ€‚
22:23
OK, so Sarah Woodin suggests that when money is 'tight' โ€”
393
1343840
3440
ใ•ใฆใ€ใ‚ตใƒฉใƒปใ‚ฆใƒƒใƒ‡ใ‚ฃใƒณใฏใ€ ่ณ‡้‡‘ใŒใ€ŒๅŽณใ—ใ„ใ€ๅ ดๅˆใ€
22:27
meaning there is 'only just enough' โ€”
394
1347400
1680
ใคใพใ‚Šใ€ŒใŽใ‚ŠใŽใ‚Šๅๅˆ†ใชใ€ๅ ดๅˆใ€
22:29
making robots in large quantities โ€” or mass-produced โ€”
395
1349200
3320
ใƒญใƒœใƒƒใƒˆใ‚’ๅคง้‡ใซไฝœใ‚‹๏ผˆ ๅคง้‡็”Ÿ็”ฃใ™ใ‚‹๏ผ‰ใ“ใจใฏใ€ไบบ้–“ใ‚’ไฝฟใ†ใ‚ˆใ‚Šใ‚‚
22:32
might be a cheaper option than using humans.
396
1352640
2720
ๅฎ‰ไพกใช้ธๆŠž่‚ขใซใชใ‚‹ใ‹ใ‚‚ใ—ใ‚Œใชใ„ใจ็คบๅ”†ใ—ใฆใ„ใ‚‹ ใ€‚
22:35
And she says people might be abandoned to robots.
397
1355480
3160
ใใ—ใฆใ€ไบบ้–“ใฏ ใƒญใƒœใƒƒใƒˆใซๆจใฆใ‚‰ใ‚Œใ‚‹ใ‹ใ‚‚ใ—ใ‚Œใชใ„ใจๅฝผๅฅณใฏ่จ€ใ†ใ€‚
22:38
Yes, 'abandoned' means 'left alone in a place, usually forever'.
398
1358760
5840
ใฏใ„ใ€ใ€Œๆ”พๆฃ„ใ€ใจใฏใ€ใ€Œ ้€šๅธธใฏๆฐธไน…ใซใ€ใ‚ใ‚‹ๅ ดๆ‰€ใซๆ”พ็ฝฎใ•ใ‚Œใ‚‹ใ€ใ“ใจใ‚’ๆ„ๅ‘ณใ—ใพใ™ใ€‚
22:44
So she says it might be possible that someone ends up being forgotten
399
1364720
4360
ใใฎใŸใ‚ใ€ ่ชฐใ‹ใŒๅฟ˜ใ‚ŒๅŽปใ‚‰ใ‚Œ
22:49
and only having a robot to care for them. So is this right, ethically?
400
1369200
5640
ใ€ใƒญใƒœใƒƒใƒˆใ ใ‘ใŒไธ–่ฉฑใ‚’ใ™ใ‚‹ใ“ใจใซใชใ‚‹ๅฏ่ƒฝๆ€งใ‚‚ใ‚ใ‚‹ใ€ใจๅฝผๅฅณใฏ่จ€ใ†ใ€‚ ใใ‚Œใงใ€ใ“ใ‚Œใฏๅ€ซ็†็š„ใซๆญฃใ—ใ„ใฎใงใ—ใ‚‡ใ†ใ‹?
22:54
Yes, well, she mentions 'ethics' โ€” that's 'what is morally right' โ€”
401
1374960
3920
ใใ†ใงใ™ใญใ€ๅฝผๅฅณใฏใ€Œๅ€ซ็†ใ€ใซใคใ„ใฆ่จ€ๅŠใ—ใฆใ„ใพใ™ใ€‚ ใใ‚Œใฏใ€Œ้“ๅพณ็š„ใซๆญฃใ—ใ„ใ“ใจใ€ใงใ‚ใ‚Š
22:59
and that needs to be considered as part of the jobs agenda.
402
1379000
3440
ใ€้›‡็”จ่จˆ็”ปใฎไธ€้ƒจใจใ—ใฆ่€ƒๆ…ฎใ•ใ‚Œใ‚‹ๅฟ…่ฆใŒใ‚ใ‚Šใพใ™ ใ€‚
23:02
So, we shouldn't just consider what job vacancies need filling,
403
1382560
3160
ใ—ใŸใŒใฃใฆใ€ ใฉใฎใ‚ˆใ†ใชๆฑ‚ไบบใ‚’ๅŸ‹ใ‚ใ‚‹ๅฟ…่ฆใŒใ‚ใ‚‹ใ‹ใ ใ‘
23:05
but who and how it should be done.
404
1385840
2360
ใงใชใใ€่ชฐใŒใฉใฎใ‚ˆใ†ใซๅŸ‹ใ‚ใ‚‹ในใใ‹ใ‚’ๆคœ่จŽใ™ใ‚‹ๅฟ…่ฆใŒใ‚ใ‚Šใพใ™ใ€‚
23:08
And earlier I asked you, Sam,
405
1388320
1400
ๅ…ˆใปใฉใ‚ตใƒ ใ•ใ‚“ใซๅฐ‹ใญใพใ—ใŸใŒใ€
23:09
did you know in which year was the first commercial robot built? And you said?
406
1389840
4440
ๆœ€ๅˆใฎ ๅ•†็”จใƒญใƒœใƒƒใƒˆใŒไฝœใ‚‰ใ‚ŒใŸใฎใฏไฝ•ๅนดใ‹ใ”ๅญ˜ใ˜ใงใ™ใ‹? ใใ—ใฆใ‚ใชใŸใฏ่จ€ใ„ใพใ—ใŸใ‹๏ผŸ
23:14
I said 1954.
407
1394400
1640
็งใฏ 1954 ๅนดใจ่จ€ใ„ใพใ—ใŸใ€‚
23:16
Well, you didn't need a robot to help you there because you are right.
408
1396160
3320
ใใ†ใงใ™ใญใ€ใ‚ใชใŸใŒๆญฃใ—ใ„ใฎใงใ€ใใ“ใงใฏใƒญใƒœใƒƒใƒˆใฎๅŠฉใ‘ใฏๅฟ…่ฆใ‚ใ‚Šใพใ›ใ‚“ใงใ—ใŸ ใ€‚
23:19
โ€” Ah, yay! โ€” Well done!
409
1399600
1760
โ€” ใ‚ใ‚ใ€ใ‚„ใฃใŸใƒผ๏ผ - ใ‚ˆใใ‚„ใฃใŸ๏ผ
23:21
Now let's do something a robot can't do yet,
410
1401480
2720
ใงใฏใ€ใƒญใƒœใƒƒใƒˆใŒใพใ ใงใใชใ„ใ“ใจใ‚’ใ‚„ใฃใฆใฟใพใ—ใ‚‡ใ†ใ€‚ใใ‚Œใฏใ€
23:24
and that's recap the vocabulary we've highlighted today, starting with empathy.
411
1404320
5280
ๅ…ฑๆ„Ÿใ‹ใ‚‰ๅง‹ใ‚ใฆใ€ไปŠๆ—ฅๅผท่ชฟใ—ใŸ่ชžๅฝ™ใ‚’่ฆ็ด„ใ™ใ‚‹ใ“ใจใงใ™ใ€‚
23:29
'Empathy' is 'the ability to understand how someone feels
412
1409720
3440
ใ€Œๅ…ฑๆ„Ÿใ€ใจใฏใ€ใ€Œใ‚ใ‚‹ไบบใฎ็ซ‹ๅ ดใ ใฃใŸใ‚‰
23:33
by imagining what it would be like to be in that person's situation'.
413
1413280
3920
ใฉใ†ใชใ‚‹ใ‹ใ‚’ๆƒณๅƒใ™ใ‚‹ใ“ใจใงใ€ใใฎไบบใฎๆฐ—ๆŒใกใ‚’็†่งฃใ™ใ‚‹่ƒฝๅŠ›ใ€ใงใ™ ใ€‚
23:37
'Physical assistance' describes 'helping someone by touching them'.
414
1417320
4280
ใ€Œ่บซไฝ“็š„ใชๆดๅŠฉใ€ใจใฏใ€ ใ€Œ่งฆใ‚Œใ‚‹ใ“ใจใง่ชฐใ‹ใ‚’ๅŠฉใ‘ใ‚‹ใ“ใจใ€ใ‚’ๆŒ‡ใ—ใพใ™ใ€‚
23:41
We also mentioned a 'companion' โ€”
415
1421720
1920
ใพใŸใ€ใ€Œไปฒ้–“ใ€ใซใคใ„ใฆใ‚‚่งฆใ‚Œใพใ—ใŸใ€‚ใ“ใ‚Œใฏ
23:43
that's 'someone who is with you and keeps you company'.
416
1423760
2680
ใ€Œใ‚ใชใŸใจไธ€็ท’ใซใ„ใฆ ใ€ใ‚ใชใŸใซไป˜ใๆทปใฃใฆใใ‚Œใ‚‹ไบบใ€ใฎใ“ใจใงใ™ใ€‚
23:46
Our next word was 'tight' โ€” in the context of money,
417
1426560
3280
ๆฌกใฎๅ˜่ชžใฏใ€Œtightใ€ใงใ™ใ€‚ ใŠ้‡‘ใซ้–ขใ—ใฆ่จ€ใˆใฐใ€
23:49
when money is tight, it means there's 'not enough'.
418
1429960
3120
ใŠ้‡‘ใŒ่ถณใ‚Šใชใ„ใจใ„ใ†ใ“ใจใฏ ใ€Œๅๅˆ†ใงใฏใชใ„ใ€ใจใ„ใ†ๆ„ๅ‘ณใงใ™ใ€‚
23:53
'Abandoned' means 'left alone in a place, usually forever'.
419
1433200
3400
ใ€Œๆ”พๆฃ„ใ•ใ‚Œใ‚‹ใ€ใจใฏใ€ ใ€Œ้€šๅธธใฏๆฐธไน…ใซใ€ใ‚ใ‚‹ๅ ดๆ‰€ใซๆ”พ็ฝฎใ•ใ‚Œใ‚‹ใ€ใ“ใจใ‚’ๆ„ๅ‘ณใ—ใพใ™ใ€‚
23:56
And finally, we discussed the word 'ethics' โ€”
420
1436720
2440
ใใ—ใฆๆœ€ๅพŒใซใ€ ็งใŸใกใฏใ€Œๅ€ซ็†ใ€ใจใ„ใ†่จ€่‘‰ใซใคใ„ใฆ่ฉฑใ—ๅˆใ„ใพใ—ใŸใ€‚
23:59
we hear a lot about business ethics or medical ethics โ€”
421
1439280
3760
ใƒ“ใ‚ธใƒใ‚นๅ€ซ็†ใ‚„ๅŒป็™‚ๅ€ซ็†ใซใคใ„ใฆใฏใ‚ˆใ่€ณใซใ—ใพใ™ใŒ
24:03
and it means 'the study of what is morally right'.
422
1443160
3280
ใ€ใใ‚Œใฏ ใ€Œ้“ๅพณ็š„ใซๆญฃใ—ใ„ใ“ใจใซใคใ„ใฆใฎ็ ”็ฉถใ€ใ‚’ๆ„ๅ‘ณใ—ใพใ™ใ€‚
24:06
OK, thank you, Sam.
423
1446560
1560
ใ‚ใ‹ใ‚Šใพใ—ใŸใ€ใ‚ใ‚ŠใŒใจใ†ใ€ใ‚ตใƒ ใ€‚
24:08
Well, we've managed to get through 6 Minute English without the aid of a robot.
424
1448240
4520
ใ•ใฆใ€็งใŸใกใฏ ใƒญใƒœใƒƒใƒˆใฎๅŠฉใ‘ใ‚’ๅ€Ÿใ‚Šใšใซ 6 Minute English ใ‚’ใ‚„ใ‚Š้‚ใ’ใ‚‹ใ“ใจใŒใงใใพใ—ใŸใ€‚
24:12
That's all for now, but please join us again soon. Goodbye!
425
1452880
2920
ไปŠๅ›žใฏใ“ใ“ใพใงใงใ™ ใŒใ€ใพใŸๆฌกๅ›žใ‚‚ใœใฒใ”ๅ‚ๅŠ ใใ ใ•ใ„ใ€‚ ใ•ใ‚ˆใ†ใชใ‚‰๏ผ
24:15
Bye-bye, everyone!
426
1455920
1000
ใฟใชใ•ใ‚“ใ€ใ•ใ‚ˆใ†ใชใ‚‰๏ผ
24:17
6 Minute English.
427
1457720
1680
6ๅˆ†้–“่‹ฑ่ชžใ€‚
24:19
From BBC Learning English.
428
1459520
2480
BBC Learning Englishใ‚ˆใ‚Šใ€‚
24:22
Hello. This is 6 Minute English from BBC Learning English. I'm Phil.
429
1462840
3800
ใ“ใ‚“ใซใกใฏใ€‚ ใ“ใ‚Œใฏ BBC Learning English ใฎ 6 Minute English ใงใ™ใ€‚ ็งใฏใƒ•ใ‚ฃใƒซใงใ™ใ€‚
24:26
And I'm Georgie.
430
1466760
1240
ใใ—ใฆ็งใฏใ‚ธใƒงใƒผใ‚ธใƒผใงใ™ใ€‚
24:28
Animal testing is when living animals are used in scientific research
431
1468120
4320
ๅ‹•็‰ฉๅฎŸ้จ“ใจใฏใ€
24:32
to find out how effective a new medicine is, or how safe a product is for humans.
432
1472560
5880
ๆ–ฐใ—ใ„่–ฌใŒใฉใ‚Œใ ใ‘ๅŠนๆžœ็š„ใ‹ใ€ ใ‚ใ‚‹ใ„ใฏ่ฃฝๅ“ใŒไบบ้–“ใซใจใฃใฆใฉใ‚Œใ ใ‘ๅฎ‰ๅ…จใ‹ใ‚’่ชฟในใ‚‹ใŸใ‚ใซใ€็”ŸใใŸๅ‹•็‰ฉใ‚’็ง‘ๅญฆ็š„็ ”็ฉถใซไฝฟใ†ใ“ใจใงใ™ใ€‚ ๅ‹•็‰ฉๅฎŸ้จ“ใ‚’
24:38
Scientists in favour of it argue that animal testing
433
1478560
3360
ๆ”ฏๆŒใ™ใ‚‹็ง‘ๅญฆ่€…ใฏใ€ ๅ‹•็‰ฉๅฎŸ้จ“ใฏ
24:42
shows whether medicines are safe or dangerous for humans,
434
1482040
3840
ๅŒป่–ฌๅ“ ใŒไบบ้–“ใซใจใฃใฆๅฎ‰ๅ…จใ‹ๅฑ้™บใ‹ใ‚’ๆ˜Žใ‚‰ใ‹ใซใ—
24:46
and has saved many lives.
435
1486000
1760
ใ€ๅคšใใฎๅ‘ฝใ‚’ๆ•‘ใฃใฆใใŸใจไธปๅผตใ™ใ‚‹ใ€‚
24:47
But animal rights campaigners say it's cruel,
436
1487880
2720
ใ—ใ‹ใ—ใ€ๅ‹•็‰ฉๆ„›่ญทๆดปๅ‹•ๅฎถใ‚‰ใฏใ€ ใ“ใ‚Œใฏๆฎ‹้…ทใงใ‚ใ‚Š
24:50
and also ineffective because animals and humans are so different.
437
1490720
4800
ใ€ๅ‹•็‰ฉใจไบบ้–“ใฏ้žๅธธใซ็•ฐใชใ‚‹ใŸใ‚ๅŠนๆžœใŒใชใ„ใจไธปๅผตใ—ใฆใ„ใ‚‹ ใ€‚
24:55
Under British law, medicines must be tested on two different types of animals,
438
1495640
5800
่‹ฑๅ›ฝใฎๆณ•ๅพ‹ใงใฏใ€ๅŒป่–ฌๅ“ใฏ้€šๅธธใ€ใƒใ‚บใƒŸใ€ใƒžใ‚ฆใ‚นใ€ใƒขใƒซใƒขใƒƒใƒˆใชใฉใ‹ใ‚‰ๅง‹ใ‚ใฆใ€ 2็จฎ้กžใฎ็•ฐใชใ‚‹ๅ‹•็‰ฉใงใƒ†ใ‚นใƒˆใ™ใ‚‹ๅฟ…่ฆใŒใ‚ใ‚‹
25:01
usually starting with rats, mice or guinea pigs.
439
1501560
4280
ใ€‚
25:05
And in everyday English, the term 'human guinea pig'
440
1505960
3680
ใพใŸใ€ๆ—ฅๅธธ่‹ฑ่ชžใงใฏใ€ ใ€Œไบบ้–“ใƒขใƒซใƒขใƒƒใƒˆใ€ใจใ„ใ†็”จ่ชžใฏใ€ใ€Œไฝ•ใ‹ใฎใƒ†ใ‚นใƒˆใ‚’ๅ—ใ‘ใ‚‹
25:09
can be used to mean 'the first people to have something tested on them'.
441
1509760
4600
ๆœ€ๅˆใฎไบบใ€…ใ€ใจใ„ใ†ๆ„ๅ‘ณใงไฝฟ็”จใ•ใ‚Œใพใ™ ใ€‚
25:14
But now, groups both for and against animal testing are thinking again,
442
1514480
4960
ใ—ใ‹ใ—็พๅœจใ€ๅ‹•็‰ฉๅฎŸ้จ“่ณ›ๆˆๆดพใจๅๅฏพๆดพใฎไธกๆ–นใฎใ‚ฐใƒซใƒผใƒ—ใŒใ€่ญฐ่ซ–ใซใŠใ‘ใ‚‹
25:19
thanks to a recent development in the debate โ€” AI.
443
1519560
3640
ๆœ€่ฟ‘ใฎ้€ฒๅฑ•ใงใ‚ใ‚‹ AI ใฎใŠใ‹ใ’ใงใ€่€ƒใˆ็›ดใ—ๅง‹ใ‚ใฆใ„ใพใ™ ใ€‚
25:23
In this programme, we'll be hearing how artificial intelligence
444
1523320
3160
ใ“ใฎ็•ช็ต„ใงใฏใ€ ไบบๅทฅ็Ÿฅ่ƒฝใŒ
25:26
could help reduce the need for scientific testing on animals.
445
1526600
3960
ๅ‹•็‰ฉใซๅฏพใ™ใ‚‹็ง‘ๅญฆ็š„ๅฎŸ้จ“ใฎๅฟ…่ฆๆ€งใ‚’ๆธ›ใ‚‰ใ™ใฎใซใฉใฎใ‚ˆใ†ใซๅฝน็ซ‹ใคใ‹ใซใคใ„ใฆใŠ่ฉฑใ—ใพใ™ใ€‚
25:30
But first, I have a question for you, Georgie.
446
1530680
3400
ใ—ใ‹ใ—ใ€ใพใšใ‚ธใƒงใƒผใ‚ธใƒผใ•ใ‚“ใซ่ณชๅ•ใŒใ‚ใ‚Šใพใ™ ใ€‚
25:34
There's one commonly used medicine in particular
447
1534200
2880
็‰นใซใ€
25:37
which is harmful for animals but safe for humans, but what?
448
1537200
5040
ๅ‹•็‰ฉใซใฏๆœ‰ๅฎณใ  ใŒไบบ้–“ใซใฏๅฎ‰ๅ…จใชใ€ใ‚ˆใไฝฟใ‚ใ‚Œใ‚‹่–ฌใŒ 1 ใคใ‚ใ‚Šใพใ™ใ€‚ใใ‚Œใฏไฝ•ใงใ—ใ‚‡ใ†ใ‹?
25:43
Is it a) Antibiotics?
449
1543240
3280
ใใ‚Œใฏ a) ๆŠ—็”Ÿ็‰ฉ่ณชใงใ™ใ‹?
25:46
b) Aspirin?
450
1546640
2080
b) ใ‚ขใ‚นใƒ”ใƒชใƒณ?
25:48
Or c) Paracetamol?
451
1548840
2480
ใใ‚Œใจใ‚‚c)ใƒ‘ใƒฉใ‚ปใ‚ฟใƒขใƒผใƒซใงใ—ใ‚‡ใ†ใ‹?
25:51
Hmm, I guess it's aspirin.
452
1551440
2880
ใ†ใƒผใ‚“ใ€ใ‚ขใ‚นใƒ”ใƒชใƒณใ‹ใชใ€‚
25:54
OK, Georgie, I'll reveal the answer at the end of the programme.
453
1554440
4080
OKใ€ใ‚ธใƒงใƒผใ‚ธใƒผใ€ ็•ช็ต„ใฎๆœ€ๅพŒใซ็ญ”ใˆใ‚’ๆ˜Žใ‹ใ—ใพใ™ใ€‚
25:58
Christine Ro is a science journalist who's interested in the animal testing debate.
454
1558640
5360
ใ‚ฏใƒชใ‚นใƒ†ใ‚ฃใƒณใƒปใƒญใƒผใฏ ๅ‹•็‰ฉๅฎŸ้จ“่ซ–ไบ‰ใซ้–ขๅฟƒใ‚’ๆŒใค็ง‘ๅญฆใ‚ธใƒฃใƒผใƒŠใƒชใ‚นใƒˆใงใ™ใ€‚
26:04
Here, she explains to BBC World Service programme Tech Life
455
1564120
4600
ใ“ใ“ใงๅฝผๅฅณใฏใ€BBCใƒฏใƒผใƒซใƒ‰ใ‚ตใƒผใƒ“ใ‚นใฎ ็•ช็ต„ใ€Œใƒ†ใƒƒใ‚ฏใƒปใƒฉใ‚คใƒ•ใ€ใงใ€
26:08
some of the limitations of testing medicines on animals.
456
1568840
3640
ๅ‹•็‰ฉใ‚’ไฝฟใฃใŸๅŒป่–ฌๅ“ใฎ่ฉฆ้จ“ใฎ้™็•Œใฎใ„ใใคใ‹ใซใคใ„ใฆ่ชฌๆ˜Žใ—ใฆใ„ใพใ™ใ€‚
26:12
Of course, you can't necessarily predict from a mouse or a dog
457
1572600
2640
ใ‚‚ใกใ‚ใ‚“ใ€ ใƒžใ‚ฆใ‚นใ‚„็Šฌใ‹ใ‚‰
26:15
what's going to happen in a human, and there have been a number of cases
458
1575360
3760
ไบบ้–“ใซไฝ•ใŒ่ตทใ“ใ‚‹ใ‹ใ‚’ๅฟ…ใšใ—ใ‚‚ไบˆๆธฌใ™ใ‚‹ใ“ใจใฏใงใใพใ›ใ‚“ใ— ใ€
26:19
where substances that have proven to be toxic in animals
459
1579240
3320
ๅ‹•็‰ฉใซๆœ‰ๆฏ’ใงใ‚ใ‚‹ใ“ใจใŒ่จผๆ˜Žใ•ใ‚ŒใŸ็‰ฉ่ณชใŒ
26:22
have been proven to be safe in humans, and vice versa.
460
1582680
3200
ไบบ้–“ใซใฏๅฎ‰ๅ…จใงใ‚ใ‚‹ใ“ใจใŒ่จผๆ˜Žใ•ใ‚ŒใŸไพ‹ใ‚‚ๆ•ฐๅคšใใ‚ใ‚Šใพใ™ใ€‚ ้€†ใ‚‚ใพใŸๅŒๆง˜ใงใ™ใ€‚
26:27
There are also, of course, animal welfare limitations to animal testing.
461
1587200
4040
ใ‚‚ใกใ‚ใ‚“ใ€ ๅ‹•็‰ฉๅฎŸ้จ“ใซใฏๅ‹•็‰ฉ็ฆ็ฅ‰ไธŠใฎๅˆถ้™ใ‚‚ใ‚ใ‚Šใพใ™ใ€‚
26:31
Most people, I think, if they had the choice,
462
1591360
2280
ๅคšใใฎไบบใฏใ€ใ‚‚ใ— ้ธๆŠžๆจฉใŒใ‚ใ‚‹ใชใ‚‰ใ€ๅฎ‰ๅ…จๆ€งใ‚’็ขบไฟใ—ใชใŒใ‚‰ใ€
26:33
would want their substances to be used on as few animals or no animals as possible,
463
1593760
5160
ใใฎ็‰ฉ่ณชใŒ ใงใใ‚‹ใ ใ‘ๅฐ‘ใชใ„ๅ‹•็‰ฉใ€ใ‚ใ‚‹ใ„ใฏๅ‹•็‰ฉใซใพใฃใŸใไฝฟใ‚ใ‚Œใชใ„ใ“ใจใ‚’ๆœ›ใ‚€ใ ใ‚ใ†ใจๆ€ใ„ใพใ™
26:39
while still ensuring safety.
464
1599040
1840
ใ€‚
26:41
Now, that's been a really difficult needle to thread,
465
1601000
2280
ใ“ใ‚Œใฏๆœฌๅฝ“ใซ้›ฃใ—ใ„ๅ•้กŒใงใ™
26:43
but AI might help to make that more possible.
466
1603400
2440
ใŒใ€AI ใซใ‚ˆใฃใฆ ใใ‚ŒใŒๅฎŸ็พใ—ใ‚„ใ™ใใชใ‚‹ใ‹ใ‚‚ใ—ใ‚Œใพใ›ใ‚“ใ€‚
26:45
Christine says that medicines which are safe for animals
467
1605960
3280
ใ‚ฏใƒชใ‚นใƒ†ใ‚ฃใƒณใฏใ€ ๅ‹•็‰ฉใซใจใฃใฆๅฎ‰ๅ…จใช่–ฌใŒ
26:49
might not be safe for humans.
468
1609360
2320
ไบบ้–“ใซใจใฃใฆใ‚‚ๅฎ‰ๅ…จใงใฏใชใ„ใ‹ใ‚‚ใ—ใ‚Œใชใ„ใจ่จ€ใ„ใพใ™ใ€‚
26:51
But the opposite is also true โ€”
469
1611800
1760
ใ—ใ‹ใ—ใ€ใใฎ้€†ใ‚‚ใพใŸ็œŸใชใ‚Šใงใ™ใ€‚
26:53
what's safe for humans might not be safe for animals.
470
1613680
3760
ไบบ้–“ใซใจใฃใฆๅฎ‰ๅ…จใชใ‚‚ใฎใŒใ€ ๅ‹•็‰ฉใซใจใฃใฆใ‚‚ๅฎ‰ๅ…จใงใฏใชใ„ๅฏ่ƒฝๆ€งใŒใ‚ใ‚Šใพใ™ใ€‚
26:57
Christine uses the phrase 'vice versa'
471
1617560
2600
ใ‚ฏใƒชใ‚นใƒ†ใ‚ฃใƒณใฏใ€Œvice versaใ€ใจใ„ใ†ใƒ•ใƒฌใƒผใ‚บใ‚’ไฝฟใฃใฆใ€่‡ชๅˆ†ใŒ่จ€ใฃใฆใ„ใ‚‹ใ“ใจใฎ
27:00
to show that 'the opposite' of what she says is also true.
472
1620280
3920
ใ€Œๅๅฏพใ€ใ‚‚็œŸๅฎŸใงใ‚ใ‚‹ใ“ใจใ‚’็คบใ—ใฆใ„ใพใ™ ใ€‚
27:05
Christine also uses the idiom to 'thread the needle'
473
1625320
3200
ใ‚ฏใƒชใ‚นใƒ†ใ‚ฃใƒณใฏใ€ ใ€Œ้‡ใซ็ณธใ‚’้€šใ™ใ€ใจใ„ใ†ๆ…ฃ็”จๅฅใ‚’ใ€ใ€Œ็‰นใซๅฏพ็ซ‹ใ‚’ไผดใ†ใ‚ˆใ†ใชใ€้ซ˜ๅบฆใชๆŠ€่ก“ใจๆญฃ็ขบใ•ใ‚’
27:08
to describe 'a task which requires a lot of skill and precision,
474
1628640
3800
ๅฟ…่ฆใจใ™ใ‚‹ไป•ไบ‹ใ€ใ‚’่กจใ™ใŸใ‚ใซใ‚‚ไฝฟใ„ใพใ™
27:12
'especially one involving a conflict'.
475
1632560
3480
ใ€‚
27:16
Yes, medical animal testing may save human lives,
476
1636160
4400
็ขบใ‹ใซใ€ๅŒป็™‚ๅ‹•็‰ฉๅฎŸ้จ“ใฏ ไบบ้–“ใฎๅ‘ฝใ‚’ๆ•‘ใ†ใ‹ใ‚‚ใ—ใ‚Œใชใ„
27:20
but many people see it as cruel and distressing for the animal โ€”
477
1640680
3920
ใŒใ€ๅคšใใฎไบบใ€…ใฏใใ‚Œใ‚’ ๅ‹•็‰ฉใซใจใฃใฆๆฎ‹้…ทใง่‹ฆ็—›ใชใ“ใจใ ใจ่€ƒใˆใฆใ„ใ‚‹ใ€‚
27:24
it's a difficult needle to thread.
478
1644720
2840
ใใ‚Œใฏ้›ฃใ—ใ„ๅ•้กŒใ ใ€‚
27:27
But now, the challenge of threading that needle has got a little easier
479
1647680
3760
ใ—ใ‹ใ—ไปŠใงใฏใ€ไบบๅทฅ็Ÿฅ่ƒฝใฎใŠใ‹ใ’ใงใ€้‡ใซ็ณธใ‚’้€šใ™ใจใ„ใ†่ชฒ้กŒใŒ ๅฐ‘ใ—ๆฅฝใซใชใ‚Šใพใ—ใŸ
27:31
because of artificial intelligence.
480
1651560
2080
ใ€‚
27:33
Predicting how likely a new medicine is to harm humans
481
1653760
3680
ๆ–ฐใ—ใ„่–ฌใŒไบบ้–“ใซๅฎณใ‚’ๅŠใผใ™ๅฏ่ƒฝๆ€งใ‚’ไบˆๆธฌใ™ใ‚‹ใซใฏใ€ไฝ•ๅƒใ‚‚ใฎๅฎŸ้จ“ใฎ
27:37
involves analysing the results of thousands of experiments.
482
1657560
3960
็ตๆžœใ‚’ๅˆ†ๆžใ™ใ‚‹ๅฟ…่ฆใŒใ‚ใ‚Šใพใ™ ใ€‚
27:41
And one thing AI is really good at is analysing mountains and mountains of data.
483
1661640
6280
AI ใŒๆœฌๅฝ“ใซๅพ—ๆ„ใจใ™ใ‚‹ใฎใฏใ€ ๅฑฑใฎใ‚ˆใ†ใชใƒ‡ใƒผใ‚ฟใฎๅˆ†ๆžใงใ™ใ€‚
27:48
Here's Christine Ro again, speaking with BBC World Service's Tech Life.
484
1668040
4440
ๅ†ใณใ‚ฏใƒชใ‚นใƒ†ใ‚ฃใƒณใƒปใƒญใƒผใŒ BBC ใƒฏใƒผใƒซใƒ‰ ใ‚ตใƒผใƒ“ใ‚นใฎ Tech Life ใซๅ‡บๆผ”ใ—ใพใ™ใ€‚
27:52
So, AI isn't the whole picture, of course,
485
1672600
1880
ใ‚‚ใกใ‚ใ‚“ใ€AIใŒๅ…จไฝ“ๅƒใงใฏใ‚ใ‚Šใพใ›ใ‚“ใŒใ€AIใฏ
27:54
but it's an increasingly important part of the picture and one reason for that
486
1674600
4240
ใพใ™ใพใ™้‡่ฆใช้ƒจๅˆ†ใ‚’ๅ ใ‚ใฆใ„ใพใ™ใ€‚ ใใฎ็†็”ฑใฎ1ใคใฏใ€ๅŒ–ๅญฆ็‰ฉ่ณชใฎๅฎ‰ๅ…จๆ€งใ‚’ๅˆคๆ–ญใ™ใ‚‹้š›ใซใ€
27:58
is that there is a huge amount of toxicology data to wade through
487
1678960
3320
่†จๅคงใช้‡ ใฎๆฏ’็‰ฉๅญฆใƒ‡ใƒผใ‚ฟใ‚’็ฒพๆŸปใ™ใ‚‹ๅฟ…่ฆใŒใ‚ใ‚‹ใ“ใจใงใ™
28:02
when it comes to determining chemical safety, and, on top of that,
488
1682400
3720
ใ€‚ ใใ‚ŒใซๅŠ ใˆใฆใ€
28:06
there's the staggering number of chemicals being invented all of the time.
489
1686240
4120
้ฉšใใปใฉๅคšใใฎๅŒ–ๅญฆ็‰ฉ่ณชใŒ ๅธธใซ็™บๆ˜Žใ•ใ‚Œใฆใ„ใพใ™ใ€‚
28:10
AI helps scientists wade through huge amounts of data.
490
1690480
4280
AI ใฏ็ง‘ๅญฆ่€…ใŒ ่†จๅคงใช้‡ใฎใƒ‡ใƒผใ‚ฟใ‚’ๅ‡ฆ็†ใ™ใ‚‹ใฎใ‚’ๆ”ฏๆดใ—ใพใ™ใ€‚
28:14
If you 'wade through' something,
491
1694880
2200
ไฝ•ใ‹ใ‚’ใ€Œใ‚„ใ‚Š้‚ใ’ใ‚‹ใ€ใจใ„ใ†ใ“ใจใฏใ€
28:17
you 'spend a lot of time and effort doing something boring or difficult,
492
1697200
4920
28:22
'especially reading a lot of information'.
493
1702240
2960
ใ€Œ็‰นใซๅคง้‡ใฎๆƒ…ๅ ฑใ‚’่ชญใ‚€ใ“ใจใ€ใชใฉใ€้€€ๅฑˆใชใ“ใจใ‚„้›ฃใ—ใ„ใ“ใจใซๅคšใใฎๆ™‚้–“ใจๅŠดๅŠ›ใ‚’่ฒปใ‚„ใ™ใ“ใจใ‚’ๆ„ๅ‘ณใ—ใพใ™ใ€‚
28:25
AI can process huge amounts of data,
494
1705320
2600
AIใฏ่†จๅคงใช้‡ใฎใƒ‡ใƒผใ‚ฟใ‚’ๅ‡ฆ็†ใ™ใ‚‹ใ“ใจใŒใงใใ€
28:28
and what's more, that amount keeps growing as new chemicals are invented.
495
1708040
5360
ใ•ใ‚‰ใซใ€ ๆ–ฐใ—ใ„ๅŒ–ๅญฆ็‰ฉ่ณชใŒ็™บๆ˜Žใ•ใ‚Œใ‚‹ใซใคใ‚Œใฆใใฎ้‡ใฏๅข—ใˆ็ถšใ‘ใพใ™ใ€‚
28:33
Christine uses the phrase 'on top of that', meaning 'in addition to something'.
496
1713520
4960
ใ‚ฏใƒชใ‚นใƒ†ใ‚ฃใƒณใฏใ€Œon top of thatใ€ใจใ„ใ†ใƒ•ใƒฌใƒผใ‚บใ‚’ไฝฟใฃใฆใ„ใพใ™ใŒ ใ€ใ“ใ‚Œใฏใ€Œไฝ•ใ‹ใซๅŠ ใˆใฆใ€ใจใ„ใ†ๆ„ๅ‘ณใงใ™ใ€‚
28:38
Often this extra thing is negative.
497
1718600
2440
ๅคšใใฎๅ ดๅˆใ€ใ“ใฎไฝ™ๅˆ†ใชใ‚‚ใฎใฏใƒžใ‚คใƒŠใ‚นใซใชใ‚Šใพใ™ใ€‚
28:41
She means there's already so much data to understand
498
1721160
3360
ๅฝผๅฅณใฎ่จ€ใ†ใ“ใจใฏใ€็†่งฃใ™ในใใƒ‡ใƒผใ‚ฟใŒใ™ใงใซใŸใใ•ใ‚“ใ‚ใ‚Š
28:44
and additionally, there's even more to be understood about these new chemicals.
499
1724640
5000
ใ€ใ•ใ‚‰ใซใ€ ใ“ใ‚Œใ‚‰ใฎๆ–ฐใ—ใ„ๅŒ–ๅญฆ็‰ฉ่ณชใซใคใ„ใฆใฏ็†่งฃใ™ในใใ“ใจใŒใ•ใ‚‰ใซใŸใใ•ใ‚“ใ‚ใ‚‹ใจใ„ใ†ใ“ใจใงใ™ใ€‚
28:49
Of course, the good news is that with AI, testing on animals could one day stop,
500
1729760
6160
ใ‚‚ใกใ‚ใ‚“ใ€AI ใŒใ‚ใ‚Œใฐใ€ ใ„ใคใ‹ๅ‹•็‰ฉๅฎŸ้จ“ใŒใชใใชใ‚‹ใ‹ใ‚‚ใ—ใ‚Œใชใ„ใจใ„ใ†ใฎใฏๆœ—ๅ ฑใ 
28:56
although Christine warns that AI is not the 'whole picture',
501
1736040
3840
ใŒใ€ใ‚ฏใƒชใ‚นใƒ†ใ‚ฃใƒผใƒณใฏ AI ใฏ ใ€Œๅ…จไฝ“ๅƒใ€ใงใฏใชใใ€ใ€Œ
29:00
it's not 'a complete description of something
502
1740000
2120
29:02
'which includes all the relevant information'.
503
1742240
2920
ใ™ในใฆใฎ้–ข้€ฃๆƒ…ๅ ฑใ‚’ๅซใ‚€ไฝ•ใ‹ใฎๅฎŒๅ…จใช่ชฌๆ˜Žใ€ใงใฏใชใ„ใจ่ญฆๅ‘Šใ—ใฆใ„ใ‚‹ใ€‚
29:05
Nevertheless, the news is a step forward
504
1745280
2640
ใใ‚Œใงใ‚‚ใ€ใ“ใฎใƒ‹ใƒฅใƒผใ‚นใฏ
29:08
for both animal welfare and for modern medicine.
505
1748040
4040
ๅ‹•็‰ฉ็ฆ็ฅ‰ใซใจใฃใฆ ใ‚‚็พไปฃๅŒปๅญฆใซใจใฃใฆใ‚‚ๅ‰้€ฒใจใชใ‚‹ใ€‚
29:12
Speaking of which, what was the answer to your question, Phil?
506
1752200
3720
ใใ†ใ„ใˆใฐใ€ ใƒ•ใ‚ฃใƒซใ•ใ‚“ใ€ใ‚ใชใŸใฎ่ณชๅ•ใฎ็ญ”ใˆใฏไฝ•ใงใ—ใŸใ‹๏ผŸ
29:16
What is a commonly used medicine which is safe for humans, but harmful to animals?
507
1756040
5200
ไบบ้–“ใซใฏๅฎ‰ๅ…จใ ใŒใ€ๅ‹•็‰ฉใซใฏๆœ‰ๅฎณใชใ€ใ‚ˆใไฝฟใ‚ใ‚Œใ‚‹่–ฌใฏไฝ•ใงใ™ใ‹?
29:21
I guessed it was aspirin.
508
1761360
1920
ใใ‚Œใฏใ‚ขใ‚นใƒ”ใƒชใƒณใ ใฃใŸใจๆ€ใ„ใพใ™ใ€‚
29:23
Which was the correct answer!
509
1763400
3080
ๆญฃ่งฃใฏใฉใ‚Œใงใ—ใ‚‡ใ†ใ‹๏ผ
29:26
Right, let's recap the vocabulary we've discussed,
510
1766600
3640
ใ•ใฆใ€ใ“ใ‚Œใพใง่ญฐ่ซ–ใ—ใฆใใŸ่ชžๅฝ™ใ‚’ใพใจใ‚ใฆใฟใพใ—ใ‚‡ใ†ใ€‚ใพใšใฏใ€Œไฝ•ใ‹
29:30
starting with 'human guinea pigs'
511
1770360
2600
29:33
meaning 'the first people to have something new tested on them'.
512
1773080
3880
ๆ–ฐใ—ใ„ใ‚‚ใฎใ‚’ใƒ†ใ‚นใƒˆใ•ใ‚Œใ‚‹ๆœ€ๅˆใฎไบบใ€…ใ€ใ‚’ๆ„ๅ‘ณใ™ใ‚‹ใ€Œไบบ้–“ใƒขใƒซใƒขใƒƒใƒˆใ€ใ‹ใ‚‰ๅง‹ใ‚ใพใ—ใ‚‡ใ†ใ€‚
29:37
The phrase 'vice versa' is used to indicate
513
1777080
2560
ใ€Œvice versaใ€ใจใ„ใ†ใƒ•ใƒฌใƒผใ‚บใฏใ€
29:39
that 'the opposite of what you have just said is also true'.
514
1779760
4280
ใ€Œ ใ‚ใชใŸใŒไปŠ่จ€ใฃใŸใ“ใจใฎๅๅฏพใ‚‚็œŸๅฎŸใงใ‚ใ‚‹ใ€ใ“ใจใ‚’็คบใ™ใŸใ‚ใซไฝฟ็”จใ•ใ‚Œใพใ™ใ€‚
29:44
To 'thread the needle'
515
1784160
1320
ใ€Œ้‡ใซ็ณธใ‚’้€šใ™ใ€ใจใฏใ€ใ€Œ
29:45
describes 'a task which requires extreme skill and precision to do successfully'.
516
1785600
6120
ๆˆๅŠŸใ™ใ‚‹ใŸใ‚ใซ้ซ˜ๅบฆใชๆŠ€่ก“ใจๆญฃ็ขบใ•ใ‚’ๅฟ…่ฆใจใ™ใ‚‹ไฝœๆฅญใ€ใ‚’่กจใ—ใพใ™ใ€‚
29:51
The 'whole picture' means 'a complete description of something
517
1791840
3360
ใ€Œๅ…จไฝ“ๅƒใ€ใจใฏใ€ใ€Œใ‚ใ‚‹ไบ‹ๆŸ„ใซ้–ขใ™ใ‚‹
29:55
'which includes all the relevant information and opinions about it'.
518
1795320
4440
ใ™ในใฆใฎ้–ข้€ฃ ๆƒ…ๅ ฑใจๆ„่ฆ‹ใ‚’ๅซใ‚€ใ€ใใฎไบ‹ๆŸ„ใฎๅฎŒๅ…จใช่ชฌๆ˜Žใ€ใ‚’ๆ„ๅ‘ณใ—ใพใ™ใ€‚
29:59
If you 'wade through' something,
519
1799880
2000
ไฝ•ใ‹ใ‚’ใ€Œใ‚„ใ‚Š้‚ใ’ใ‚‹ใ€ใจใ„ใ†ใ“ใจใฏใ€
30:02
you 'spend a lot of time and effort doing something boring or difficult,
520
1802000
4840
30:06
'especially reading a lot of information'.
521
1806960
2800
ใ€Œ็‰นใซๅคง้‡ใฎๆƒ…ๅ ฑใ‚’่ชญใ‚€ใ“ใจใ€ใชใฉใ€้€€ๅฑˆใชใ“ใจใ‚„้›ฃใ—ใ„ใ“ใจใซๅคšใใฎๆ™‚้–“ใจๅŠดๅŠ›ใ‚’่ฒปใ‚„ใ™ใ“ใจใ‚’ๆ„ๅ‘ณใ—ใพใ™ใ€‚
30:09
And finally, the phrase 'on top of something'
522
1809880
2960
ใใ—ใฆๆœ€ๅพŒใซใ€ ใ€Œon top of somethingใ€ใจใ„ใ†ใƒ•ใƒฌใƒผใ‚บใฏ
30:12
means 'in addition to something', and that extra thing is often negative.
523
1812960
5000
ใ€Œไฝ•ใ‹ใซๅŠ ใˆใฆใ€ใจใ„ใ†ๆ„ๅ‘ณใงใ€ ใใฎไฝ™ๅˆ†ใชใ‚‚ใฎใฏๅคšใใฎๅ ดๅˆๅฆๅฎš็š„ใชๆ„ๅ‘ณใซใชใ‚Šใพใ™ใ€‚
30:18
That's all for this week. Goodbye for now!
524
1818080
2080
ไปŠ้€ฑใฏใ“ใ‚Œใง็ต‚ใ‚ใ‚Šใงใ™ใ€‚ ใใ‚ŒใงใฏใพใŸ๏ผ
30:20
Bye!
525
1820280
1360
ใ•ใ‚ˆใชใ‚‰๏ผ
30:21
6 Minute English.
526
1821760
1320
6ๅˆ†้–“่‹ฑ่ชžใ€‚
30:23
From BBC Learning English.
527
1823200
2800
BBC Learning Englishใ‚ˆใ‚Šใ€‚
ใ“ใฎใ‚ฆใ‚งใƒ–ใ‚ตใ‚คใƒˆใซใคใ„ใฆ

ใ“ใฎใ‚ตใ‚คใƒˆใงใฏ่‹ฑ่ชžๅญฆ็ฟ’ใซๅฝน็ซ‹ใคYouTubeๅ‹•็”ปใ‚’็ดนไป‹ใ—ใพใ™ใ€‚ไธ–็•Œไธญใฎไธ€ๆต่ฌ›ๅธซใซใ‚ˆใ‚‹่‹ฑ่ชžใƒฌใƒƒใ‚นใƒณใ‚’่ฆ‹ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅ„ใƒ“ใƒ‡ใ‚ชใฎใƒšใƒผใ‚ธใซ่กจ็คบใ•ใ‚Œใ‚‹่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจใ€ใใ“ใ‹ใ‚‰ใƒ“ใƒ‡ใ‚ชใ‚’ๅ†็”Ÿใ™ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅญ—ๅน•ใฏใƒ“ใƒ‡ใ‚ชใฎๅ†็”ŸใจๅŒๆœŸใ—ใฆใ‚นใ‚ฏใƒญใƒผใƒซใ—ใพใ™ใ€‚ใ”ๆ„่ฆ‹ใƒปใ”่ฆๆœ›ใŒใ”ใ–ใ„ใพใ—ใŸใ‚‰ใ€ใ“ใกใ‚‰ใฎใŠๅ•ใ„ๅˆใ‚ใ›ใƒ•ใ‚ฉใƒผใƒ ใ‚ˆใ‚Šใ”้€ฃ็ตกใใ ใ•ใ„ใ€‚

https://forms.gle/WvT1wiN1qDtmnspy7