Fake smiles and the computers that can spot them - 6 Minute English

65,823 views ・ 2019-09-19

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

Neil: Hello. This is 6 Minute English, I'm Neil.

Sam: And I'm Sam.

Neil: It's good to see you again, Sam.

Sam: Really?

Neil: Yes, of course, can't you tell by the way I'm smiling?
Sam: Ah well, I find it difficult to tell if someone is really smiling or if it's a fake smile.

Neil: Well, that's a coincidence because this programme is all about how computers may be able to tell real smiles from fake smiles better than humans can. Before we get into that though, a question. The expressions we can make with our face are controlled by muscles. How many muscles do we have in our face? Is it: A: 26, B: 43 or C: 62? What do you think, Sam?

Sam: No idea! But a lot, I'd guess, so I'm going with 62.
Neil: OK. Well, we'll see if you'll be smiling or crying later in the programme. Hassan Ugail is a professor of visual computing at the University of Bradford. He's been working on getting computers to be able to recognise human emotions from the expressions on our face. Here he is speaking on the BBC Inside Science radio programme – how successful does he say they have been?
Professor Hassan Ugail: We've been working quite a lot on the human emotions, so the idea is how the facial muscle movement, which is reflected on the face, through obviously a computer through video frames and trying to understand how these muscle movements actually relate to facial expressions and then from facial expressions trying to understand the emotions or to infer the emotions. And they have been quite successful in doing that. We have software that can actually look at somebody's face in real time and then identify the series of emotions that person is expressing in real time as well.
Neil: So, have they been successful in getting computers to identify emotions?

Sam: Yes, he says they've been quite successful, and what's interesting is that he says that the computers can do it in 'real time'. This means that there's no delay. They don't have to stop and analyse the data, or crunch the numbers, they can do it as the person is talking.
Neil: The system uses video to analyse a person's expressions and can then infer the emotions. 'To infer something' means to get an understanding of something without actually being told directly. So, you look at available information and use your understanding and knowledge to work out the meaning.
Sam: It's a bit like being a detective, isn't it? You look at the clues and infer what happened even if you don't have all the details.
Neil: Yes, and in this case the computer looks at how the movement of muscles in the face, or 'facial muscles', shows different emotions. Here's Professor Ugail again.
Professor Hassan Ugail: We've been working quite a lot on the human emotions, so the idea is how the facial muscle movement, which is reflected on the face, through obviously a computer through video frames and trying to understand how these muscle movements actually relate to facial expressions and then from facial expressions trying to understand the emotions or to infer the emotions. And they have been quite successful in doing that. We have software that can actually look at somebody's face in real time and then identify the series of emotions that person is expressing in real time as well.
Neil: So, how do the computers know what is a real or a fake smile? The computers have to learn that first. Here's Professor Ugail again talking about how they do that.
Professor Hassan Ugail: We have a data set of real smiles and we have a data set of fake smiles. These real smiles are induced smiles in a lab. So, you put somebody on a chair and then show some funny movies and we expect the smiles are genuine smiles. And similarly we ask them to pretend to smile. So, these are what you'd call fake smiles. So, what we do is we throw these into the machine and then the machine figures out what are the characteristics of a real smile and what are the characteristics of a fake smile.
Neil: So, how do they get the data that the computers use to see if your smile is fake or 'genuine' – which is another word which means real?
Sam: They induce real smiles in the lab by showing people funny films. This means that they make the smiles come naturally. They assume that the smiles while watching the funny films are genuine.
Neil: And then they ask the people to pretend to smile and the computer programme now has a database of real and fake smiles and is able to figure out which is which.
Sam: 'Figure out' means to calculate and come to an answer.
Neil: Yes, and apparently the system gets it right 90% of the time, which is much higher than we humans can. Right, well before we remind ourselves of our vocabulary, let's get the answer to the question. How many muscles do we have in our face? Is it: A: 26, B: 43 or C: 62? Sam, are you going to be smiling? What did you say?
Sam: So I thought 62! Am I smiling, Neil?

Neil: Sadly you are not, you are using different muscles for that sort of sad look! Actually the answer is 43. Congratulations to anyone who got that right. Now our vocabulary.
Sam: Yes – 'facial' is the adjective relating to face.

Neil: Then we had 'infer'. This verb means to understand something even when you don't have all the information, and you come to this understanding based on your experience and knowledge, or in the case of a computer, the programming.
Sam: And these computers work in 'real time', which means that there's no delay and they can tell a fake smile from a 'genuine' one, which means a real one, as the person is speaking.
Neil: They made people smile, or as the Professor said, they 'induced' smiles by showing funny films.

Sam: And the computer is able to 'figure out', or calculate, whether the smile is fake or genuine.
Neil: OK, thank you, Sam. That's all from 6 Minute English today. We look forward to your company next time and if you can't wait you can find lots more from bbclearningenglish online, on social media and on our app. Goodbye!

Sam: Bye!
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7