IELTS Speaking Idioms: AI

18,446 views ・ 2024-05-16

English Speaking Success


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:00
(gentle upbeat music)
00:06
Idioms, idioms.
00:08
And I've got another song about idioms.
00:11
Here we go.
00:13
β™ͺ I want some idioms β™ͺ
00:14
(gentle upbeat music)
00:17
β™ͺ And I want them now β™ͺ
00:18
(gentle upbeat music)
00:21
β™ͺ I want some idioms β™ͺ
00:23
(gentle upbeat music)
00:24
β™ͺ But I don't know how, how to learn them β™ͺ
00:27
β™ͺ How to remember them β™ͺ
00:29
β™ͺ All that I know and I really think so β™ͺ
00:33
β™ͺ Is that I want some idioms now β™ͺ
00:36
(gentle upbeat music)
00:37
β™ͺ How to learn them β™ͺ
00:39
β™ͺ How to remember them β™ͺ
00:41
β™ͺ All I know and I really think so β™ͺ
00:45
β™ͺ Is that I want β™ͺ
00:46
- I love AI.
00:47
β™ͺ Now β™ͺ
00:48
(gentle upbeat music)
00:50
β™ͺ How to learn them β™ͺ
00:52
β™ͺ How to remember them β™ͺ
00:54
β™ͺ All I know β™ͺ
00:55
β™ͺ And I really think so β™ͺ β™ͺ I want some idioms now β™ͺ
00:57
β™ͺ Is that I want some idioms now β™ͺ
01:01
(gentle upbeat music)
01:02
β™ͺ How to learn them β™ͺ
01:04
(gentle upbeat music)
01:06
- And it goes on and on and on, right?
01:07
It's idioms, indeed.
01:09
It's time for some idioms like this one over here.
01:13
(Keith clears throat)
01:14
But listen, let me switch around.
01:16
I'm gonna show you some idioms
01:18
if I can find my right picture.
01:21
Where are we? We're over here.
01:23
I'm gonna show you a picture
01:25
and I'd like you to tell me what you think the idiom is
01:29
connected to AI.
01:32
Okay?
01:33
Here's the first picture.
01:35
What do you think the idiom is?
01:38
So, as Lucia says,
01:39
"Idioms are phrases that convey a different message
01:42
despite their own literal meaning."
01:44
That's right. Very good, Lucia.
01:46
Thank you for that.
01:47
That one.
01:48
So let's see.
01:50
"On cloud nine." Lovely idea.
01:52
Lovely.
01:54
"Felt over the moon."
01:55
Not a bad one, not a bad one.
01:58
"AI pilot."
02:00
Almost.
02:02
Ah, Dina, I think you are almost there.
02:07
"Spaceship."
02:09
But what's the idiom?
02:11
Ernesto, very good, very good.
02:14
I think you are there.
02:17
Uh-huh.
02:19
Anybody else? Ege, Ege, Ege?
02:22
Yeah, absolutely, yes.
02:25
It could be over the moon,
02:26
but actually it's not quite there.
02:28
Not quite there.
02:30
"I'm over the cloud." Hmm.
02:32
"Over the moon."
02:34
Ah.
02:35
"On cloud nine means very happy."
02:37
You are right there, you are right.
02:41
Anything else?
02:42
Something connected with autopilot.
02:44
Yeah.
02:45
You're right.
02:46
So this one, right?
02:48
This particular one is the following.
02:51
I can do it now. I can show you this one.
02:53
"To be on autopilot
02:55
means to use a system that operates automatically."
03:00
Autopilot.
03:02
Notice the stress?
03:04
Autopilot, autopilot, autopilot.
03:07
Autopilot. To be on autopilot.
03:10
Stress should be there.
03:15
"To use a system that operates automatically."
03:19
I'm gonna try and help you with the stress.
03:22
For example,
03:23
"I use Google Maps so whenever I travel,
03:24
my navigation is on autopilot."
03:28
Okay.
03:29
Sometimes when I'm driving, right?
03:31
I'm not thinking.
03:33
I'm just driving.
03:34
Listening to the music
03:36
and then suddenly I've arrived.
03:38
And I think, "How did I do that?"
03:41
I was on autopilot, right?
03:44
I'm not thinking about it.
03:46
I can do it on autopilot.
03:49
Sometimes I can almost teach English on autopilot.
03:54
I'm not thinking about it.
03:55
It's just natural, it's automatic.
03:58
Sometimes. (laughs)
04:00
Google Maps whenever I travel,
04:02
my navigation is on autopilot, right? (clears throat)
04:06
Okay, that's the first one.
04:08
Let's have a look at the next one,
04:11
which is not this one,
04:13
but number two.
04:14
What do you think?
04:19
And I'm gonna put a bit of background music on.
04:23
β™ͺ I want some idioms β™ͺ
04:24
(gentle upbeat music)
04:27
β™ͺ And I want them now β™ͺ
04:28
(gentle upbeat music)
04:31
β™ͺ I want some idioms β™ͺ
04:34
β™ͺ But I don't know how, how to learn them β™ͺ
04:37
β™ͺ How to remember them β™ͺ
04:39
β™ͺ All that I know and I really think so β™ͺ
04:42
- [Keith] What do you think this one is?
04:44
Look carefully at the name.
04:49
β™ͺ How to remember them β™ͺ
04:51
- [Keith] Ooh, nice.
04:52
β™ͺ Really think so β™ͺ
04:55
β™ͺ Is that I want some β™ͺ
04:56
- [Keith] Interesting.
04:57
β™ͺ Now β™ͺ
04:59
- [Keith] It's a good expression.
05:00
β™ͺ How to learn them β™ͺ
05:01
β™ͺ How to remember β™ͺ
05:02
- [Keith] Almost.
05:04
β™ͺ All I know β™ͺ
05:05
- [Keith] Ooh, Dina.
05:06
"Out of the box" like thinking out of the box.
05:08
Nice idea.
05:09
β™ͺ Now β™ͺ
05:10
- Yuwei, you are on form today.
05:13
You're on a roll.
05:14
(gentle upbeat music)
05:16
Ernesto as well, guys, you are fantastic.
05:18
(gentle upbeat music)
05:21
ANRN as well is talking about Pandora's magic.
05:24
"Think out of the box" is a very good expression,
05:27
but a different one here.
05:31
Ah, Zlatiborka, you're actually getting it.
05:35
Yeah, you're there, you're there.
05:36
Luis as well. Absolutely, yes.
05:41
"Pandora was opened."
05:42
Almost, almost. Yeah.
05:45
You're very, very close.
05:46
I think you've...
05:47
Most of you, some of you have got it spot on.
05:49
So the expression is,
05:51
if I just bring myself back in,
05:54
is to open Pandora's box.
05:58
You can say, "To open a Pandora's box."
06:01
Sometimes we have "a," sometimes we don't.
06:04
And it means "To create a lot of unexpected problems."
06:08
It's nothing to do with opening a box,
06:11
but the idea is that you open a box
06:13
and all of these spirits come out
06:16
and problems come out that you did not expect.
06:19
Unexpected, right?
06:22
For example, "I'm unsure about using AI in healthcare.
06:29
I think it's opening a Pandora's box
06:33
and will lead to ethical problems we're not ready for."
06:38
Okay.
06:39
So I think it's gonna create unexpected problems.
06:44
And this is very common, I think, in many areas of AI.
06:48
On the news this morning I was listening to about the big...
06:55
Not the problem, but the worry politicians have.
06:58
In the UK, we've got the elections coming
07:02
and in the US the elections this year,
07:04
a big worry about AI creating deep fakes,
07:09
fake videos that influence the voter
07:12
and we get the wrong result or a different result.
07:18
So people think, "Yeah, AI good.
07:21
But it's opening a Pandora's box."
07:25
Because all of these unexpected problems
07:27
are going to appear.
07:28
Very, very true.
07:30
Even if we get regulation.
07:32
So it's a good expression, right?
07:34
To open a Pandora's box.
07:41
Good.
07:42
So it's more negative.
07:44
Yes, Kristen, yes.
07:46
Yeah, I think it's negative 'cause...
07:48
Yeah, Abdula, you were saying that it's a happy event.
07:51
Not really.
07:53
I think Pandora's box is a very...
07:55
It's more negative where people are worried.
07:58
So very often we say, "Don't do that
08:01
because you'll open a Pandora's box." Right?
08:03
It can lead to a negative consequence.
08:08
Excellent. Good.
08:10
Christian, hello.
08:11
Happy to see you again as well.
08:13
Hi. Yes. (chuckles)
08:20
(Keith laughing)
08:21
Salman, I like your comment. That's nice.
08:27
And FinOsin says, interestingly,
08:29
"In fact, lots of graduate programs
08:32
they teach about using the combination
08:34
between data and AI to improve the health system."
08:37
Interesting. Great.
08:39
Excellent. Thank you for sharing.
08:40
That's lovely.
08:41
Okay.
08:43
We're going to go onto...
08:44
I'm looking for number, I've lost it.
08:46
We're on idioms, right?
08:47
But where is the next one?
08:49
Number three.
08:50
Here, idiom number three.
08:51
Here we go.
08:54
What's this one?
08:59
(gentle piano music)
09:01
Pablo says the vocabulary is mind-blowing.
09:03
Great, good.
09:04
(gentle piano music)
09:06
"To leg it." Dhoni, that's amazing.
09:08
I love that expression.
09:09
To leg it, to run away.
09:11
(gentle piano music)
09:14
Not quite.
09:15
(gentle piano music)
09:16
Lovely idea, ERFAN.
09:18
(gentle piano music)
09:20
Krithika, good idea.
09:21
(gentle piano music)
09:25
"Last straw." Maybe?
09:28
(gentle piano music)
09:29
(Keith vocalizing)
09:34
It's all downhill now. Good.
09:35
That's a good one, actually. Yes.
09:37
(gentle piano music)
09:41
Alex, you're almost there.
09:43
(gentle piano music)
09:46
DW, very, very good.
09:48
(gentle piano music)
09:54
Yeah.
09:55
Ernesto and Zlatiborka are on the ball today.
09:58
As is Hiyan Let.
10:00
(gentle piano music)
10:03
Yeah. Excellent, excellent.
10:04
You're all very, very, very, very, very close, okay?
10:07
Whoa, (laughs) come down.
10:12
Very, very, very close.
10:13
So this one is,
10:17
slippery slope.
10:19
Let me show you.
10:21
A slippery slope.
10:23
A slippery slope is "A bad situation that will get worse."
10:29
Right?
10:30
You can imagine you're starting to slide.
10:34
Whoa, that's bad,
10:35
but it's gonna get worse
10:36
'cause you're going down and down and down
10:37
and down. (imitating explosion sound)
10:38
Until you explode.
10:40
So a slippery slope is idiomatic
10:43
and it just means a bad situation that will get worse.
10:46
For example,
10:47
"Relying on AI for hiring decisions..."
10:52
Well, let's call it hiring new recruits, a bit easier.
10:58
"Is a slippery slope that could lead to more people
11:01
being discriminated against."
11:05
Okay?
11:06
So it is a slippery slope.
11:08
It's a bad situation that will get worse
11:11
because more people can be discriminated against.
11:15
Now, I don't know if you know this,
11:19
I'm sure most of you do,
11:20
but AI is based on language models.
11:26
And so some companies use AI to filter candidates
11:31
who want to join their company.
11:33
But of course, if AI is trained on a language model
11:37
that is biased or that discriminates against people,
11:42
then the results will be discriminating against people.
11:46
And so there's a big controversy,
11:48
a big worry,
11:50
that using AI for recruitment
11:54
will discriminate against people.
11:56
In fact, AI in any role might discriminate against people,
12:02
especially minority groups.
12:04
And that's a slippery slope.
12:06
'Cause you start going down,
12:08
it gets worse, it gets worse, it gets worse,
12:09
and the problem will just keep on getting worse.
12:12
So this expression, a slippery slope,
12:14
it's a nice one that we can use for this context, right?
12:19
(indistinct) Thank you.
12:20
Saima Jahan, "Relying on AI for writing a scientific paper
12:25
is a slippery slope."
12:27
Also, relying on AI to create IELTS answers
12:33
is a slippery slope.
12:35
I think it can be good,
12:36
but to rely too much is a slippery slope.
12:42
What's discrimination?
12:43
To discriminate is to do something in favor of
12:48
or against a group of people.
12:53
To discriminate against
12:54
is to do something either in favor of some people
12:58
or against a group of people.
13:00
Yeah.
13:03
Okay.
13:05
(laughs) It's my first time joining
13:07
and my mind just feels like gotten awakened. (laughs)
13:11
You've just woken up.
13:11
Good, good.
13:15
Here's another one from Salman,
13:16
"Using AI-driven decision..."
13:19
Oh, beautiful English.
13:21
"Using AI-driven decision in laws and judgments
13:25
might be a slippery slope."
13:27
Oh, Salman Abrar, your English is beautiful
13:30
and the idea is phenomenal.
13:33
Fantastic.
13:35
Like it very, very much. Lovely one to share.
13:37
Okay.
13:38
Excellent.
13:39
Number four, idiom number four.
13:41
Let's move on.
13:42
We've had...
13:43
Where are we?
13:45
Here's number four.
13:46
I'm just gonna take that off for a moment.
13:49
Right.
13:50
Now, I think a lot of you know this one.
13:54
Looking at the comments earlier on,
13:56
I think a lot of you know this.
13:57
Let me know. Let's see your comments, what you think.
14:01
(gentle piano music)
14:05
Hmm.
14:06
(gentle piano music)
14:15
Eshan, yeah, very good comment.
14:17
Absolutely.
14:18
(gentle piano music)
14:24
It's a nice idiom, jump on the bandwagon.
14:26
(gentle piano music)
14:31
Mohamed Shamo Hamedee, lovely comment. Thank you.
14:35
(gentle piano music)
14:38
Rue, you're almost there.
14:40
Almost.
14:41
(gentle piano music)
14:42
Elvira, spot on.
14:44
(gentle piano music)
14:45
Well done, Julia. Nice to see you.
14:48
(gentle piano music)
14:51
Alex, this is a really good one.
14:53
"Hold your horses" is another expression.
14:55
Actually, it's not the same one, but it's a different idiom.
14:59
Yes.
15:00
(gentle piano music)
15:02
Alberto's got the same here.
15:04
(gentle piano music)
15:05
(Keith laughing)
15:06
Nelson, "To put the cart before the horse." Yes.
15:09
"Don't put your cart before the horses."
15:11
Horses or horse?
15:12
(gentle piano music)
15:18
Yeah.
15:19
(gentle piano music)
15:20
Sourav, very, very good. Excellent.
15:22
(gentle piano music)
15:25
Excellent. Very, very nice.
15:27
Good.
15:27
So let me bring this one back in
15:30
because you're absolutely right
15:32
to put the cart before the horse
15:35
which means to do things in the wrong order.
15:41
For example, "I think using AI-driven..."
15:45
Nice, right?
15:46
"Using AI-driven teaching assistants
15:49
before we understand their impact on learning is a mistake.
15:55
It's putting the cart before the horse."
15:59
I think using AI-driven teaching assistants is a mistake.
16:05
Well, before we understand their impact
16:07
on learning is a mistake.
16:09
It's putting the cart before the horse,
16:12
meaning that we should understand their impact
16:16
on learning first,
16:19
and then use AI-driven teaching assistants.
16:24
If we use them first
16:26
and then later discover their impact,
16:29
it's too late.
16:30
It's putting the cart before the horse.
16:33
So very often with AI,
16:35
there are things that we are doing in the wrong order,
16:39
right?
16:41
Letting everybody use AI
16:43
before having rules and regulation
16:47
is putting the cart before the horse.
16:49
We should have the rules first and then everybody using AI.
16:54
So I'm just gonna add here
16:57
the expression that a few of you said,
17:00
hold your horses,
17:01
which is an interesting one,
17:02
which just means wait.
17:06
It is also a...
17:08
It's an idiom.
17:10
It's not used as a verb.
17:12
It's used as an imperative.
17:14
When you say, "Wait, just hold your horses."
17:17
So whenever somebody...
17:19
You're in a meeting
17:20
and somebody has an idea and says, "Let's do this."
17:23
And you say, "Hold your horses.
17:27
We need to think carefully first."
17:29
Wait, right?
17:30
Hold your horses.
17:32
It's another nice...
17:33
Oops.
17:34
Another nice idiom, subliminal advertising. (laughs)
17:37
Excuse me.
17:38
Hold your horses, wait.
17:41
The other one, put the cart before the horse.
17:44
So that's it.
17:45
So excellent, we've got all of them there.
17:47
We've got totally bum, bum, bum, bum.
17:50
We've got "to be on autopilot," number one.
17:53
To open a Pandora's box, number two.
17:56
A slippery slope, number three.
17:59
And to put the cart before the horse, number four.
18:03
And these are your idioms for today.
18:10
(gentle upbeat music)

Original video on YouTube.com
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7