Can AI have a mind of its own? ⏲️ 6 Minute English

251,550 views · 2023-01-26

BBC Learning English



00:08
Hello. This is 6 Minute English from BBC Learning English. I'm Sam.

00:12
And I'm Neil.

00:13
In the autumn of 2021, something strange happened at the Google headquarters in California's Silicon Valley. A software engineer called Blake Lemoine was working on the artificial intelligence project 'Language Models for Dialogue Applications', or LaMDA for short. LaMDA is a chatbot – a computer programme designed to have conversations with humans over the internet.
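As a rough illustration of that definition, here is a minimal rule-based chatbot in Python. It is only a sketch of the conversational loop, nothing like the large neural language model behind LaMDA, and the RULES table and reply function are invented for this example.

```python
# A minimal sketch of the idea behind a chatbot: a programme that reads a
# person's message and replies in turn. Real systems such as LaMDA use large
# neural language models; this keyword-matching loop is only illustrative.

RULES = {
    "hello": "Hello! What would you like to talk about?",
    "movies": "I enjoy talking about films. Which one is your favourite?",
    "meaning of life": "That is a big question. What do you think it is?",
}

def reply(message: str) -> str:
    """Pick a canned answer whose keyword appears in the message."""
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return "Interesting. Tell me more."

if __name__ == "__main__":
    while True:
        user = input("You: ")
        if user.strip().lower() in {"bye", "quit"}:
            print("Bot: Bye!")
            break
        print("Bot:", reply(user))
```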
00:37
After months talking with LaMDA on topics ranging from movies to the meaning of life, Blake came to a surprising conclusion: the chatbot was an intelligent person with wishes and rights that should be respected. For Blake, LaMDA was a Google employee, not a machine. He also called it his 'friend'.

00:58
Google quickly reassigned Blake from the project, announcing that his ideas were not supported by the evidence. But what exactly was going on?
01:08
In this programme, we'll be discussing whether artificial intelligence is capable of consciousness. We'll hear from one expert who thinks AI is not as intelligent as we sometimes think, and as usual, we'll be learning some new vocabulary as well.

01:23
But before that, I have a question for you, Neil. What happened to Blake Lemoine is strangely similar to the 2013 Hollywood movie, Her, starring Joaquin Phoenix as a lonely writer who talks with his computer, voiced by Scarlett Johansson. But what happens at the end of the movie? Is it:

a) the computer comes to life?
b) the computer dreams about the writer? or,
c) the writer falls in love with the computer?

01:48
... c) the writer falls in love with the computer.
01:52
OK, Neil, I'll reveal the answer at the end of the programme. Although Hollywood is full of movies about robots coming to life, Emily Bender, a professor of linguistics and computing at the University of Washington, thinks AI isn't that smart. She thinks the words we use to talk about technology, phrases like 'machine learning', give a false impression about what computers can and can't do.

02:17
Here is Professor Bender discussing another misleading phrase, 'speech recognition', with BBC World Service programme, The Inquiry:
02:27
If you talk about 'automatic speech recognition', the term 'recognition' suggests that there's something cognitive going on, where I think a better term would be automatic transcription. That just describes the input-output relation, and not any theory or wishful thinking about what the computer is doing to be able to achieve that.
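One way to see Professor Bender's point concretely: in code, an input-output relation is just a function's signature, audio in, text out, with nothing said about how the mapping is computed. The sketch below is hypothetical; the tiny lookup table stands in for a real speech-to-text model.

```python
# "Automatic transcription" names only an input-output relation
# (audio in, text out) and claims nothing about cognition. The type
# signature below carries the whole meaning of the term; the fake
# lookup "implementation" is invented purely for illustration.

FAKE_RECORDINGS = {
    b"\x01\x02": "hello world",
    b"\x03\x04": "six minute english",
}

def automatic_transcription(audio: bytes) -> str:
    """Map an audio recording to the text spoken in it.

    Nothing in the name or signature says HOW the mapping happens,
    which is exactly why Bender prefers this term to 'recognition'.
    """
    return FAKE_RECORDINGS.get(audio, "")

print(automatic_transcription(b"\x01\x02"))  # -> "hello world"
```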
02:47
Using words like 'recognition' in relation to computers gives the idea that something cognitive is happening – something related to the mental processes of thinking, knowing, learning and understanding.

03:00
But thinking and knowing are human, not machine, activities. Professor Bender says that talking about them in connection with computers is wishful thinking - something which is unlikely to happen.
03:14
The problem with using words in this way is that it reinforces what Professor Bender calls technical bias – the assumption that the computer is always right. When we encounter language that sounds natural, but is coming from a computer, humans can't help but imagine a mind behind the language, even when there isn't one. In other words, we anthropomorphise computers – we treat them as if they were human.

03:39
Here's Professor Bender again, discussing this idea with Charmaine Cozier, presenter of BBC World Service's The Inquiry.
03:48
So 'ism' means system, 'anthro' or 'anthropo' means human, and 'morph' means shape... And so this is a system that puts the shape of a human on something, and in this case the something is a computer. We anthropomorphise animals all the time, but we also anthropomorphise action figures, or dolls, or companies when we talk about companies having intentions and so on. We very much are in the habit of seeing ourselves in the world around us.

04:17
And while we're busy seeing ourselves by assigning human traits to things that are not, we risk being blindsided. The more fluent that text is, the more different topics it can converse on, the more chances there are to get taken in.
04:31
If we treat computers as if they could think, we might get blindsided, or unpleasantly surprised. Artificial intelligence works by finding patterns in massive amounts of data, so it can seem like we're talking with a human, instead of a machine doing data analysis. As a result, we get taken in – we're tricked or deceived into thinking we're dealing with a human, or with something intelligent.
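As a small illustration of "finding patterns in massive amounts of data", here is a toy bigram model in Python: it counts which word follows which in a tiny invented corpus, then generates new text from those counts. A vastly scaled-up statistical relative of this idea underlies modern chatbots; the corpus and names here are made up for the sketch.

```python
# A toy illustration of "finding patterns in data": a bigram model counts
# which word tends to follow which in a corpus, then generates text from
# those counts - statistics, not thinking.

import random
from collections import defaultdict

corpus = (
    "the chatbot talks with humans and the chatbot learns patterns "
    "and humans imagine a mind behind the language"
).split()

# Record, for each word, the words observed to follow it.
following = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    following[current].append(nxt)

def generate(start: str, length: int = 8) -> str:
    """Walk the pattern table, picking a random observed successor each step."""
    word, output = start, [start]
    for _ in range(length):
        if word not in following:
            break
        word = random.choice(following[word])
        output.append(word)
    return " ".join(output)

print(generate("the"))  # e.g. "the chatbot learns patterns and humans imagine a"
```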
04:58
Powerful AI can make machines appear conscious, but even tech giants like Google are years away from building computers that can dream or fall in love. Speaking of which, Sam, what was the answer to your question?

05:11
I asked what happened in the 2013 movie, Her. Neil thought that the main character falls in love with his computer, which was the correct answer!
05:20
OK. Right, it's time to recap the vocabulary we've learned from this programme about AI, including chatbots - computer programmes designed to interact with humans over the internet.

05:31
The adjective cognitive describes anything connected with the mental processes of knowing, learning and understanding.

05:39
Wishful thinking means thinking that something which is very unlikely to happen might happen one day in the future.

05:45
To anthropomorphise an object means to treat it as if it were human, even though it's not.

05:51
When you're blindsided, you're surprised in a negative way.

05:55
And finally, to get taken in by someone means to be deceived or tricked by them.

05:59
My computer tells me that our six minutes are up! Join us again soon, for now it's goodbye from us.

06:06
Bye!