Can AI have a mind of its own? ⏲️ 6 Minute English

238,455 views ・ 2023-01-26

BBC Learning English



00:08
Hello. This is 6 Minute English from BBC Learning English. I'm Sam.
00:12
And I'm Neil.
00:13
In the autumn of 2021, something strange happened at the Google headquarters in California's Silicon Valley. A software engineer called Blake Lemoine was working on the artificial intelligence project, 'Language Models for Dialogue Applications', or LaMDA for short. LaMDA is a chatbot – a computer programme designed to have conversations with humans over the internet.
00:37
After months talking with LaMDA on topics ranging from movies to the meaning of life, Blake came to a surprising conclusion: the chatbot was an intelligent person with wishes and rights that should be respected. For Blake, LaMDA was a Google employee, not a machine. He also called it his 'friend'.
00:58
Google quickly reassigned Blake from the project, announcing that his ideas were not supported by the evidence. But what exactly was going on?
01:08
In this programme, we'll be discussing whether artificial intelligence is capable of consciousness. We'll hear from one expert who thinks AI is not as intelligent as we sometimes think, and as usual, we'll be learning some new vocabulary as well.
01:23
But before that, I have a question for you, Neil. What happened to Blake Lemoine is strangely similar to the 2013 Hollywood movie, Her, starring Joaquin Phoenix as a lonely writer who talks with his computer, voiced by Scarlett Johansson. But what happens at the end of the movie? Is it:
a) the computer comes to life?
b) the computer dreams about the writer? or,
c) the writer falls in love with the computer?
01:48
... c) the writer falls in love with the computer.
01:52
OK, Neil, I'll reveal the answer at the end of the programme. Although Hollywood is full of movies about robots coming to life, Emily Bender, a professor of linguistics and computing at the University of Washington, thinks AI isn't that smart. She thinks the words we use to talk about technology, phrases like 'machine learning', give a false impression about what computers can and can't do.
02:17
Here is Professor Bender discussing another misleading phrase, 'speech recognition', with BBC World Service programme, The Inquiry:
02:27
If you talk about 'automatic speech recognition', the term 'recognition' suggests that there's something cognitive going on, where I think a better term would be automatic transcription. That just describes the input-output relation, and not any theory or wishful thinking about what the computer is doing to be able to achieve that.
02:47
Using words like 'recognition' in relation to computers gives the idea that something cognitive is happening – something related to the mental processes of thinking, knowing, learning and understanding.
03:00
But thinking and knowing are human, not machine, activities. Professor Bender says that talking about them in connection with computers is wishful thinking - something which is unlikely to happen.
03:14
The problem with using words in this way is that it reinforces what Professor Bender calls technical bias – the assumption that the computer is always right. When we encounter language that sounds natural, but is coming from a computer, humans can't help but imagine a mind behind the language, even when there isn't one.
03:34
In other words, we anthropomorphise computers – we treat them as if they were human. Here's Professor Bender again, discussing this idea with Charmaine Cozier, presenter of BBC World Service's The Inquiry.
03:48
So 'ism' means system, 'anthro' or 'anthropo' means human, and 'morph' means shape... And so this is a system that puts the shape of a human on something, and in this case the something is a computer.
04:01
We anthropomorphise animals all the time, but we also anthropomorphise action figures, or dolls, or companies when we talk about companies having intentions and so on. We very much are in the habit of seeing ourselves in the world around us. And while we're busy seeing ourselves by assigning human traits to things that are not, we risk being blindsided. The more fluent that text is, the more different topics it can converse on, the more chances there are to get taken in.
04:31
If we treat computers as if they could think, we might get blindsided, or unpleasantly surprised. Artificial intelligence works by finding patterns in massive amounts of data, so it can seem like we're talking with a human, instead of a machine doing data analysis. As a result, we get taken in – we're tricked or deceived into thinking we're dealing with a human, or with something intelligent.
04:58
Powerful AI can make machines appear conscious, but even tech giants like Google are years away from building computers that can dream or fall in love. Speaking of which, Sam, what was the answer to your question?
05:11
I asked what happened in the 2013 movie, Her. Neil thought that the main character falls in love with his computer, which was the correct answer!
05:20
OK. Right, it's time to recap the vocabulary we've learned from this programme about AI, including chatbots - computer programmes designed to interact with humans over the internet.
05:31
The adjective cognitive describes anything connected with the mental processes of knowing, learning and understanding.
05:39
Wishful thinking means thinking that something which is very unlikely to happen might happen one day in the future.
05:45
To anthropomorphise an object means to treat it as if it were human, even though it's not.
05:51
When you're blindsided, you're surprised in a negative way.
05:55
And finally, to get taken in by someone means to be deceived or tricked by them.
05:59
My computer tells me that our six minutes are up! Join us again soon, for now it's goodbye from us.
06:06
Bye!
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7