AI to reduce animal testing ⏲️ 6 Minute English

106,439 views ・ 2024-07-18

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:07
Hello, this is 6 Minute English from BBC Learning English.
00:10
I'm Phil. And I'm Georgie.
00:12
Animal testing is when living animals are used in scientific research to find out how effective a new medicine is, or how safe a product is for humans.
00:23
Scientists in favour of it argue that animal testing shows whether medicines are safe or dangerous for humans, and has saved many lives.
00:32
But animal rights campaigners say it's cruel, and also ineffective because animals and humans are so different.
00:40
Under British law, medicines must be tested on two different types of animals, usually starting with rats, mice or guinea pigs.
00:50
And in everyday English, the term 'human guinea pig' can be used to mean the first people to have something tested on them.
00:59
But now, groups both for and against animal testing are thinking again, thanks to a recent development in the debate: AI.
01:08
In this programme, we'll be hearing how artificial intelligence could help reduce the need for scientific testing on animals.
01:15
But first, I have a question for you, Georgie. There's one commonly used medicine in particular which is harmful for animals but safe for humans. But what is it: a) antibiotics, b) aspirin, or c) paracetamol?
01:37
Hmm, I guess it's aspirin.
01:39
OK, Georgie, I'll reveal the answer at the end of the programme.
01:43
Christine Ro is a science journalist who's interested in the animal testing debate. Here, she explains to BBC World Service programme 'Tech Life' some of the limitations of testing medicines on animals.
01:57
Of course, you can't necessarily predict from a mouse or a dog what's going to happen in a human, and there have been a number of cases where substances that have proven to be toxic in animals have been proven to be safe in humans, and vice versa. There are also, of course, animal welfare limitations to animal testing. Most people, I think, if they had the choice, would want their substances to be used on as few animals, or no animals, as possible, while still ensuring safety. Now that's been a really difficult needle to thread, but AI might help to make that more possible.
02:31
Christine says that medicines which are safe for animals might not be safe for humans. But the opposite is also true – what's safe for humans might not be safe for animals.
02:42
Christine uses the phrase 'vice versa' to show that the opposite of what she says is also true.
02:50
Christine also uses the idiom 'to thread the needle' to describe a task which requires a lot of skill and precision, especially one involving a conflict.
03:00
Yes, medical animal testing may save human lives, but many people see it as cruel and distressing for the animal – it's a difficult needle to thread.
03:12
But now, the challenge of threading that needle has got a little easier because of artificial intelligence.
03:19
Predicting how likely a new medicine is to harm humans involves analysing the results of thousands of experiments. And one thing AI is really good at is analysing mountains and mountains of data.
03:32
Here's Christine Ro again, speaking with BBC World Service's 'Tech Life'.
03:37
So, AI isn't the whole picture, of course, but it's an increasingly important part of the picture. And one reason for that is that there is a huge amount of toxicology data to wade through when it comes to determining chemical safety. And on top of that, there's this staggering number of chemicals being invented all of the time.
03:55
AI helps scientists wade through huge amounts of data. If you wade through something, you spend a lot of time and effort doing something boring or difficult, especially reading a lot of information.
04:10
AI can process huge amounts of data, and what's more, that amount keeps growing as new chemicals are invented.
04:18
Christine uses the phrase 'on top of that', meaning in addition to something; often this extra thing is negative.
04:25
She means there's already so much data to understand, and additionally, there's even more to be understood about these new chemicals.
04:35
Of course, the good news is that with AI, testing on animals could one day stop, although Christine warns that AI is not the whole picture – a complete description of something which includes all the relevant information.
04:49
Nevertheless, the news is a step forward both for animal welfare and for modern medicine.
04:57
Speaking of which, what was the answer to your question, Phil? What is a commonly used medicine which is safe for humans but harmful to animals?
05:06
I guessed it was aspirin.
05:08
Which was the correct answer.
05:11
Right, let's recap the vocabulary we've discussed, starting with 'human guinea pigs', meaning the first people to have something new tested on them.
05:21
The phrase 'vice versa' is used to indicate that the opposite of what you have just said is also true.
05:28
'To thread the needle' describes a task which requires extreme skill and precision to do successfully.
05:36
'The whole picture' means a complete description of something, which includes all the relevant information and opinions about it.
05:44
If you wade through something, you spend a lot of time and effort doing something boring or difficult, especially reading a lot of information.
05:54
And finally, the phrase 'on top of something' means 'in addition to something', and that extra thing is often negative.
06:02
That's all for this week. Goodbye for now. Bye.
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7