Will robots out-think humans? 6 Minute English

2018-01-25

BBC Learning English

Dan: Hello and welcome to 6 Minute English. I'm Dan and joining me today is Neil. Hi Neil.

Neil: Hi Dan. What’s with the protective gear and helmet?

Dan: I’m just getting ready for the inevitable rise of the machines. That’s the takeover of the world by artificial intelligence, or AI, which some people predict will happen.

Neil: Inevitable means cannot be avoided or stopped. Rise of the machines? What do you mean?
Dan: It’s our topic in this 6 Minute English. We’ll be talking about that, giving you six related pieces of vocabulary and, of course, our regular quiz question.

Neil: That’s the first thing you’ve said that makes any sense. What’s the question?
Dan: The word ‘robot’ as we use it today was first used in a 1920s Czech play, ‘Rossum’s Universal Robots’. But before this, what was its original meaning? a) forced labour, b) metal man, c) heartless thing

Neil: I will go for a) forced labour.

Dan: We’ll find out if you were right or not later in the show.

Neil: OK Dan. Tell me what’s going on.
Dan: I saw a news article written by BBC technology correspondent Rory Cellan-Jones about the recent CES technology show in Las Vegas. He interviewed David Hanson, founder of Hanson Robotics, who said it was his ambition to achieve an AI that can beat humans at any intellectual task.
Neil: Surely it’s a good thing! Better AI and robotics could take over many of the jobs that we don’t want to do, or that are so important to get 100% right… like air traffic control. We’d never have another plane crash. It would be infallible because it would be so clever.

Dan: Infallible means never failing. And that’s what bothers me. What happens when its intelligence surpasses ours? Why should it do what we want it to do?
Neil: To surpass something is to do or be better than it. Dan, you’ve been watching too many movies. Robots fighting humanity is a popular theme. Guess what… humanity often wins. And besides, we would programme the computer to be benevolent.

Dan: Benevolent means kind and helpful. But that’s just it: once the intelligence becomes sentient, or able to think for itself, who knows what it will do. We humans are not exactly perfect, you know. What happens if it decides that it is better than us and wants us out of the way?
Neil: Don’t worry. Asimov thought of that. Isaac Asimov was an American science fiction writer who, among other things, wrote about robots. He came up with three laws that every robot would have to follow to stop it from acting against humanity. So we’re safe!

Dan: I’m not so sure. A sentient robot could make up its own mind about how to interpret the laws. For example, imagine if we created an AI system to protect all of humanity.
Neil: Well, that’s great! No more war. No more murder. No more fighting.

Dan: Do you really think that humans can stop fighting? What if the AI decides that the only way to stop us from hurting ourselves and each other is to control everything we do, so it takes over to protect us? Then we would lose our freedom to a thing that we created that is infallible and more intelligent than we are! That’s the end, Neil!

Neil: I think that’s a little far-fetched, which means difficult to believe. I’m sure others don’t think that way.
Dan: OK. Let’s hear what the Learning English team say when I ask them if they are worried that AI and robots could take over the world.

Phil: Well, it’s possible, but unlikely. There will come a point where our technology will be limited – probably before real AI is achieved.

Sam: Never in a million years. First of all we’d programme them so that they couldn’t, and secondly we’d beat them anyway. Haven’t you ever seen a movie?

Kee: I totally think it could happen. We only have to make a robot that’s smart enough to start thinking for itself. After that, who knows what it might do.
Neil: A mixed bag of opinions there, Dan. It seems you aren’t alone.

Dan: Nope. But I don’t exactly have an army of support either. I guess we’ll just have to wait and see.

Neil: Speak for yourself. I’ve waited long enough – for our quiz question, that is.

Dan: Oh yeah! I asked you what the original meaning of the word ‘robot’ was before it was used in its modern form: a) forced labour, b) metal man, c) heartless thing.

Neil: And I said a) forced labour.

Dan: And you were… right!

Neil: Shall we take a look at the vocabulary then?
Dan: OK. First we had inevitable. If something is inevitable then it cannot be avoided or stopped. Can you think of something inevitable, Neil?

Neil: It is inevitable that one day the Sun will stop burning. Then we had infallible, which means never failing. Give us an example, Dan.

Dan: The vaccine for smallpox is infallible. The natural spread of that disease has been completely stopped. After that was surpasses. If something surpasses something else then it becomes better than it.
Neil: Many parents across the world hope that their children will surpass them in wealth, status or achievement. After that we heard benevolent, which means kind and helpful. Name a person famous for being benevolent, Dan.

Dan: Father Christmas is a benevolent character. After that we heard sentient. If something is sentient, it is able to think for itself.

Neil: Indeed. Many people wonder about the possibility of sentient life on other planets. Finally we heard far-fetched, which means difficult to believe. Like that far-fetched story you told me the other day about being late because of a dragon, Dan.

Dan: I swear it was real! It had big sharp teeth and everything!

Neil: Yeah, yeah, yeah. And that’s the end of this 6 Minute English. Don’t forget to check out our Facebook, Twitter, and YouTube pages. See you next time!

Dan: Bye!

Neil: Bye.