Robot therapists - 6 Minute English

2018-03-08

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:07
Catherine: Welcome to 6 Minute English, the programme where we explore an interesting topic and bring you six items of useful vocabulary. I'm Catherine.

00:16
Rob: And I'm Rob.

00:17
Catherine: I have a question for you, Rob: how would you feel about having therapy from a robot?

00:23
Rob: I'm not too sure about that - you'll need to tell me more! But first things first, the word therapy refers to a kind of treatment that helps someone feel better - including treatment for mental health issues. Someone who delivers therapy is called a therapist.

00:39
Catherine: We'll find out more about this robot therapist in just a moment, but first, Rob, I've got a question for you about the scale of mental health issues globally. So roughly how many people do you think experience mental health issues at some point during their lifetime? Is it... a) one in ten people, b) one in four, or c) one in three?

01:05
Rob: I'll go for one in four, but I know whichever answer is right - it's a big issue. How might a robot therapist help?

01:11
Catherine: We're not talking about a robot in the Star Wars sense - so there's no flashing lights and mechanical arms, Rob! It's actually an app in your smartphone that talks to you - and it's called Woebot.

01:25
Rob: So - it has a sense of humour. Woe means 'sadness'; so this is a 'woe' bot, not a robot.

01:33
Catherine: And it was developed by psychologist Dr Alison Darcy from Stanford University in the US. Here she is talking to the BBC radio programme All in the Mind.

01:44
Dr Alison Darcy: Well, after you start an initial conversation with the Woebot, and he'll take you through sort of what he can do and what he can't do, he'll just essentially check in with you every day and just give you a sort of figurative tap on the shoulder and say: "Hey Claudia, how are you doing? What's going on in your day? How do you feel?" So if you say, like "I'm really, really stressed out", Woebot might offer to help talk you through something.

02:06
Catherine: Woebot checks in with you every day and asks how you are.

02:10
Rob: So here, to check in with someone doesn't mean to register at a hotel with that person! It's an informal way of saying you talk to someone in order to report or find out information.

02:20
Catherine: And this usage is more common in the US. So for example: "I can't meet you today, Rob, but I'll check in with you tomorrow to see how the project is getting on."

02:30
Rob: So, this robot checks in with you every day. It tracks your mood and talks to you about your emotions, using a technique called cognitive behavioural therapy.

02:40
Catherine: Cognitive behavioural therapy is a common therapeutic technique that helps people deal with problems by changing the way they think.

02:49
Rob: That all sounds great, but does Woebot actually work?

02:52
Catherine: They've done trials which show that it can be more effective than simply reading information about mental health. But they haven't compared Woebot to a real therapist due to ethical concerns.

03:05
Rob: Yes, it could be unethical to deny a real patient access to a human therapist for the sake of a trial. Ethical basically means morally right.

03:14
Catherine: And another concern is privacy. People who use apps like this are not protected by strong privacy laws.

03:22
Rob: Despite these fears, digital therapy is booming - and Woebot is just one of an increasing number of electronic services. One reason for this could be that using an app carries less stigma than maybe seeing a human therapist.

03:35
Catherine: And stigma refers to the negative associations that people have about something, especially when these associations are not fair. Even though mental health is now being talked about more openly than before, some people do still see mental health issues and therapy negatively.

03:53
Rob: Whatever you think of robot therapy, Dr Darcy believes that in the modern world people need to self-reflect more - which means thinking deeply about yourself, in order to understand the reasons behind your feelings.

04:06
Dr Alison Darcy: The world that we live in right now is very noisy. Particularly digitally. You know, since we've had these little computers in our pockets with us everywhere we go, there aren't that many opportunities for real silence or self-reflection. You know, even a commute on the tube might have been a moment to just take a second to yourself, but now that void can be filled always with super engaging content by looking at your phone.

04:31
Catherine: Darcy believes that we don't have much time for self-reflection because there are so many distractions in life - especially smartphones!

04:39
Rob: After discussing all this - would you actually try a therapy app like this?

04:43
Catherine: Yes I would, actually - I think it might be quite helpful.

04:46
Rob: And how about the question you asked me at the beginning of the programme: how many people experience mental health issues?

04:52
Catherine: The answer was: one in four, according to the World Health Organisation and the World Federation for Mental Health. But the WHO say that as many as two-thirds of people never seek help from a health professional - with stigma being one of the main reasons.

05:09
Rob: And just there we had stigma again - let's now run through the other words we learned today.

05:14
Catherine: So we had woe meaning sadness. I'm full of woe. Woe is me!

05:21
Rob: Maybe you need some therapy - that's the process of receiving treatment for a particular health issue, especially mental health illness.

05:28
Catherine: And we had - to check in with someone. After we finish this programme, I need to check in with the boss about our new project.

05:36
Rob: We also had self-reflection - that's the process of thinking deeply about yourself.

05:41
Catherine: And finally we had ethical. If you describe something as ethical, you mean it's morally right.

05:47
Rob: So woe, stigma, therapy, check in with, self-reflection and ethical. That's it for this edition of 6 Minute English. We'll leave you to self-reflect - and after you've done that do visit our Facebook, Twitter, Instagram and YouTube pages, and of course our website!

06:04
Catherine: Bye for now.

06:05
Rob: Bye bye!
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7