Can AI have a mind of its own? ⏲️ 6 Minute English

2023-01-26

BBC Learning English



Hello. This is 6 Minute English from BBC Learning English. I'm Sam.

And I'm Neil.

In the autumn of 2021, something strange happened at the Google headquarters in California's Silicon Valley. A software engineer called Blake Lemoine was working on the artificial intelligence project 'Language Models for Dialogue Applications', or LaMDA for short. LaMDA is a chatbot – a computer programme designed to have conversations with humans over the internet.

After months talking with LaMDA on topics ranging from movies to the meaning of life, Blake came to a surprising conclusion: the chatbot was an intelligent person with wishes and rights that should be respected. For Blake, LaMDA was a Google employee, not a machine. He also called it his 'friend'.

Google quickly reassigned Blake from the project, announcing that his ideas were not supported by the evidence. But what exactly was going on?

In this programme, we'll be discussing whether artificial intelligence is capable of consciousness. We'll hear from one expert who thinks AI is not as intelligent as we sometimes think, and as usual, we'll be learning some new vocabulary as well.

But before that, I have a question for you, Neil. What happened to Blake Lemoine is strangely similar to the 2013 Hollywood movie Her, starring Joaquin Phoenix as a lonely writer who talks with his computer, voiced by Scarlett Johansson. But what happens at the end of the movie? Is it:

a) the computer comes to life?
b) the computer dreams about the writer? or,
c) the writer falls in love with the computer?

... c) the writer falls in love with the computer.

OK, Neil, I'll reveal the answer at the end of the programme. Although Hollywood is full of movies about robots coming to life, Emily Bender, a professor of linguistics and computing at the University of Washington, thinks AI isn't that smart. She thinks the words we use to talk about technology, phrases like 'machine learning', give a false impression about what computers can and can't do.

Here is Professor Bender discussing another misleading phrase, 'speech recognition', with the BBC World Service programme The Inquiry:

If you talk about 'automatic speech recognition', the term 'recognition' suggests that there's something cognitive going on, where I think a better term would be automatic transcription. That just describes the input–output relation, and not any theory or wishful thinking about what the computer is doing to be able to achieve that.

Using words like 'recognition' in relation to computers gives the idea that something cognitive is happening – something related to the mental processes of thinking, knowing, learning and understanding.

But thinking and knowing are human, not machine, activities. Professor Bender says that talking about them in connection with computers is wishful thinking – something which is unlikely to happen.

The problem with using words in this way is that it reinforces what Professor Bender calls technical bias – the assumption that the computer is always right. When we encounter language that sounds natural but is coming from a computer, humans can't help but imagine a mind behind the language, even when there isn't one. In other words, we anthropomorphise computers – we treat them as if they were human. Here's Professor Bender again, discussing this idea with Charmaine Cozier, presenter of the BBC World Service's The Inquiry:

So 'ism' means system, 'anthro' or 'anthropo' means human, and 'morph' means shape... And so this is a system that puts the shape of a human on something, and in this case the something is a computer. We anthropomorphise animals all the time, but we also anthropomorphise action figures, or dolls, or companies when we talk about companies having intentions and so on. We very much are in the habit of seeing ourselves in the world around us. And while we're busy seeing ourselves by assigning human traits to things that are not, we risk being blindsided. The more fluent that text is, the more different topics it can converse on, the more chances there are to get taken in.

If we treat computers as if they could think, we might get blindsided, or unpleasantly surprised. Artificial intelligence works by finding patterns in massive amounts of data, so it can seem like we're talking with a human, instead of a machine doing data analysis. As a result, we get taken in – we're tricked or deceived into thinking we're dealing with a human, or with something intelligent.

Powerful AI can make machines appear conscious, but even tech giants like Google are years away from building computers that can dream or fall in love. Speaking of which, Sam, what was the answer to your question?

I asked what happens at the end of the 2013 movie Her. Neil thought that the main character falls in love with his computer, which was the correct answer!

OK. Right, it's time to recap the vocabulary we've learned from this programme about AI, including chatbots – computer programmes designed to interact with humans over the internet.

The adjective cognitive describes anything connected with the mental processes of knowing, learning and understanding.

Wishful thinking means thinking that something which is very unlikely to happen might happen one day in the future.

To anthropomorphise an object means to treat it as if it were human, even though it's not.

When you're blindsided, you're surprised in a negative way.

And finally, to get taken in by someone means to be deceived or tricked by them.

My computer tells me that our six minutes are up! Join us again soon. For now, it's goodbye from us.

Bye!