Call centres: Are you talking to AI? ⏲️ 6 Minute English

8,067 views ・ 2024-12-19

BBC Learning English


00:07
Hello, this is 6 Minute English from BBC Learning English.
00:11
I'm Phil, and I'm Pippa.
00:14
Have you ever phoned up a company and had your call held in a queue?
00:18
If you have, then you've probably heard messages like this:
00:23
Hello. Your call is important to us.
00:25
You are number 89 in the queue.
00:28
If you'd like to continue to hold, press one.
00:30
If you'd like to return to the main menu, press zero.
00:33
How do you feel when your phone call is put on hold, Phil?
00:37
Oh, frustrated usually, although I do like it
00:41
when there's a number saying how many people are in front of you.
00:44
That's good.
00:45
Well, when your call is finally connected, it's usually a call centre worker
00:49
you'll speak to - a real live person who'll hopefully fix your problem.
00:54
But increasingly, this work is now done using artificial intelligence, or AI.
01:00
And this is causing problems in countries like India
01:03
and the Philippines, where call centre jobs make up a big part of the economy.
01:08
In this programme, we'll be asking
01:10
who's really in charge at the call centre: humans or AI?
01:15
As always, we'll be learning some useful new vocabulary.
01:19
And remember, you can read along with the programme
01:22
and find the list of new words and phrases by visiting our website,
01:26
bbclearningenglish.com.
01:30
But now I have a question for you, Phil.
01:33
Around the world,
01:34
numbers of call centre workers have grown rapidly in recent years.
01:38
If you're listening to this, maybe you're a call centre worker yourself.
01:42
So, approximately how many people work in call centres globally, Phil? Is it
01:48
a) seven million? b) 17 million? or c) 27 million?
01:53
I think b) 17 million.
01:58
OK, well you'll have to listen to the end to find out the answer.
02:02
Now, one worker worried about the impact of AI on jobs in the Philippines
02:07
is Mylene Cabalona, president of the call centre workers union B.I.E.N.
02:14
Here, she tells BBC World Service programme 'Tech Life'
02:17
about some of the difficulties of her job and why she fears for the future:
02:22
So, and this person is quite already aggravated and he keeps on yelling
02:27
and that's the, you know, that's the difficult part
02:30
because the mental stress also, you know, you have to pacify the client
02:36
and you have to make sure you're able to resolve the concern.
02:40
And then the difficult part on that conversation is
02:44
that you're being monitored by an AI.
02:48
I mean, eventually AI would replace us.
02:52
Um, it's going to displace workers in the, in, you know,
02:55
eventually, in the near future, even as a matter of fact, there's been a,
03:00
a study that says that about 300,000 workers,
03:04
or around 27% of workers that's going to be displaced because of AI,
03:11
and that's slowly happening.
03:12
Call centre work involves dealing with customers who phoned up to complain.
03:16
They're often angry, aggravated and yelling or shouting down the phone.
03:22
It's Mylene's job to pacify them, to calm them down.
03:26
If that wasn't stressful enough,
03:28
Mylene's conversations are monitored by AI systems to see how well she fixes
03:34
her client's problems.
03:35
You might think AI was built to support workers like Mylene,
03:39
but she fears AI will replace her in the near future,
03:43
a phrase meaning 'very soon' or 'within a short time'.
03:47
Mylene emphasises her fears about being replaced by giving details
03:51
about a study she read,
03:53
which claimed that 27% of workers will be displaced by AI.
03:57
She uses the phrase 'as a matter of fact' to emphasise what she's saying
04:02
and to give more detail as evidence to support it.
04:06
But Mylene thinks AI will never fully replace humans. She says
04:11
AI lacks one important quality, empathy.
04:14
Here, she explains more to BBC World Service's 'Tech Life':
04:19
Well, I don't think AI could... is, you know, empathetic or whenever they talk.
04:25
I mean, if ever a machine or a robot talks to them,
04:29
you know, um, people are more compassionate than, you know,
04:34
when you talk to a robot.
04:36
Mylene says that AI is not empathetic. Unlike humans,
04:40
it can't put itself in someone else's place and
04:43
share their feelings or experiences.
04:46
If you listen carefully to Mylene's speech,
04:49
you'll notice she says, "you know" a lot.
04:51
Phrases like 'you know', 'um', and 'ah' are called 'filler words'
04:56
and are used to give the speaker time to think or to express uncertainty.
05:01
Right, I think it's time I revealed the answer to my question, Phil.
05:05
I asked you how many call centre workers are there, globally?
05:10
And I said 17 million.
05:13
Which is the right answer.
05:15
OK, let's recap the vocabulary we've learned in this programme,
05:19
starting with 'yelling', another word for shouting.
05:23
'To pacify someone' means 'to calm them down when they're angry'.
05:27
'In the near future' means very soon or within a short time.
05:31
The phrase, 'as a matter of fact', is used to add emphasis to what you're saying,
05:36
to give more detail about what you've just said,
05:39
or to introduce something that contrasts with it.
05:42
If you're empathetic,
05:44
you're able to put yourself in someone else's position and
05:47
share their feelings or experiences.
05:50
And finally, filler words like 'um', 'ah' and,
05:55
'you know', give the speaker more time to think or to express uncertainty.
06:01
Once again, our six minutes are up. Bye!
06:05
Bye!