Will robots out-think humans? 6 Minute English

81,614 views ใƒป 2018-01-25

BBC Learning English

00:07
Dan: Hello and welcome to 6 Minute English. I'm Dan and joining me today is Neil. Hi Neil.

00:11
Neil: Hi Dan. What's with the protective gear and helmet?

00:14
Dan: I'm just getting ready for the inevitable rise of the machines. That's the takeover of the world by artificial intelligence, or AI, which some people predict will happen.

00:26
Neil: Inevitable means cannot be avoided or stopped. Rise of the machines? What do you mean?

00:32
Dan: It's our topic in this 6 Minute English. We'll be talking about that, giving you six related pieces of vocabulary and, of course, our regular quiz question.

00:41
Neil: That's the first thing you've said that makes any sense. What's the question?

00:45
Dan: The word 'robot' as we use it today was first used in a 1920s Czech play 'Rossum's Universal Robots'. But before this, what was its original meaning?
a) forced labour
b) metal man
c) heartless thing

01:02
Neil: I will go for a) forced labour.

01:06
Dan: We'll find out if you were right or not later in the show.

01:09
Neil: OK Dan. Tell me what's going on.

01:12
Dan: I saw a news article written by BBC technology correspondent Rory Cellan-Jones about the recent CES technology show in Las Vegas. He interviewed David Hanson, founder of Hanson Robotics, who said it was his ambition to achieve an AI that can beat humans at any intellectual task.

01:34
Neil: Surely it's a good thing! Better AI and robotics could take over many of the jobs that we don't want to do, or that are so important to get 100% right… like air traffic control. We'd never have another plane crash. It would be infallible because it would be so clever.

01:51
Dan: Infallible means never failing. And that's what bothers me. What happens when its intelligence surpasses ours? Why should it do what we want it to do?

02:01
Neil: To surpass something is to do or be better than it. Dan, you've been watching too many movies. Robots fighting humanity is a popular theme. Guess what… humanity often wins. And besides, we would programme the computer to be benevolent.

02:15
Dan: Benevolent means kind and helpful. But that's just it, once the intelligence becomes sentient, or able to think for itself, who knows what it will do. We humans are not exactly perfect, you know. What happens if it decides that it is better than us and wants us out of the way?

02:33
Neil: Don't worry. Asimov thought of that. Isaac Asimov was an American science fiction writer who, among other things, wrote about robots. He came up with three laws that every robot would have to follow to stop it from acting against humanity. So we're safe!

02:49
Dan: I'm not so sure. A sentient robot could make up its own mind about how to interpret the laws. For example, imagine if we created an AI system to protect all of humanity.

03:01
Neil: Well, that's great! No more war. No more murder. No more fighting.

03:05
Dan: Do you really think that humans can stop fighting? What if the AI decides that the only way to stop us from hurting ourselves and each other is to control everything we do, so it takes over to protect us? Then we would lose our freedom to a thing that we created that is infallible and more intelligent than we are! That's the end, Neil!

03:29
Neil: I think that's a little far-fetched, which means difficult to believe. I'm sure others don't think that way.

03:35
Dan: OK. Let's hear what the Learning English team say when I ask them if they are worried that AI and robots could take over the world.

03:44
Phil: Well, it's possible, but unlikely. There will come a point where our technology will be limited – probably before real AI is achieved.

03:54
Sam: Never in a million years. First of all we'd programme them so that they couldn't, and secondly we'd beat them anyway. Haven't you ever seen a movie?

04:05
Kee: I totally think it could happen. We only have to make a robot that's smart enough to start thinking for itself. After that, who knows what it might do.

04:16
Neil: A mixed bag of opinions there, Dan. It seems you aren't alone.

04:19
Dan: Nope. But I don't exactly have an army of support either. I guess we'll just have to wait and see.

04:25
Neil: Speak for yourself. I've waited long enough – for our quiz question, that is.

04:29
Dan: Oh yeah! I asked you what the original meaning of the word 'robot' was before it was used in its modern form.
a) forced labour
b) metal man
c) heartless thing

04:41
Neil: And I said a) forced labour.

04:43
Dan: And you were… right!

04:44
Neil: Shall we take a look at the vocabulary then?

04:47
Dan: OK. First we had inevitable. If something is inevitable then it cannot be avoided or stopped. Can you think of something inevitable, Neil?

04:56
Neil: It is inevitable that one day the Sun will stop burning. Then we had infallible, which means never failing. Give us an example, Dan.

05:05
Dan: The vaccine for smallpox is infallible. The natural spread of that disease has been completely stopped. After that was surpasses. If something surpasses something else then it becomes better than it.

05:18
Neil: Many parents across the world hope that their children will surpass them in wealth, status or achievement. After that we heard benevolent, which means kind and helpful. Name a person famous for being benevolent, Dan.

05:31
Dan: Father Christmas is a benevolent character. After that we heard sentient. If something is sentient, it is able to think for itself.

05:40
Neil: Indeed. Many people wonder about the possibility of sentient life on other planets. Finally we heard far-fetched, which means difficult to believe. Like that far-fetched story you told me the other day about being late because of a dragon, Dan.

05:53
Dan: I swear it was real! It had big sharp teeth and everything!

05:57
Neil: Yeah, yeah, yeah. And that's the end of this 6 Minute English. Don't forget to check out our Facebook, Twitter, and YouTube pages. See you next time!

06:04
Dan: Bye!

06:05
Neil: Bye.
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7