AI critic wins Nobel Prize… for AI: BBC Learning English from the News


BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:00
From BBC Learning English.
00:02
This is Learning English from the News. Our podcast
00:05
about the news headlines. In
00:07
this programme: AI critic wins Nobel Prize... for AI.
00:14
Hello, I'm Georgie.
00:15
And I'm Pippa.
00:17
In this programme, we look at one big news story
00:20
and the vocabulary in the headlines that will help you understand it.
00:23
You can find all the vocabulary and headlines from this episode,
00:27
as well as a worksheet on our website, bbclearningenglish.com.
00:31
So, let's hear more about this story.
00:38
The Nobel Prize in Physics, a famous science award,
00:41
has been given to two scientists, Geoffrey Hinton and John Hopfield,
00:46
for their work on machine learning.
00:48
Machine learning is a kind of technology which is used by a lot
00:52
of artificial intelligence software.
00:54
And so the scientists' work is very important
00:57
for a lot of modern technology.
00:59
But one of the winners of the award, Geoffrey Hinton, now believes
01:03
that AI technology could be dangerous in the future.
01:08
Let's have our first headline.
01:10
This one's from the Evening Standard in the UK.
01:13
Godfather of AI who warned technology could end humanity wins Nobel Prize.
01:19
OK, let's hear that again.
01:21
Godfather of AI who warned technology could end
01:25
humanity wins Nobel Prize.
01:27
And that's from the Evening Standard in the UK.
01:30
This headline is talking about Geoffrey Hinton, who's won the Nobel Prize
01:34
for his work, even though he now thinks it could be dangerous.
01:38
Now the headline calls him The Godfather of AI.
01:41
Now, godfather, I have one of those.
01:44
Isn't it supposed to be your spiritual guide?
01:47
Yes. Godfather in some cultures is chosen by parents when they have a baby.
01:53
But the headline isn't talking about this kind of godfather,
01:56
and it isn't talking about the kind of godfather who leads the gang
02:00
in the violent film, either.
02:01
OK, so, Pippa, what does it mean?
02:04
Well, when we call someone The Godfather of something,
02:07
we mean that they started or developed a new idea or thing.
02:11
That's right. So we hear this a lot in art and music.
02:14
So someone might be The Godfather of Soul
02:17
or The Godfather of Impressionist Art, for example.
02:20
Yes. So in the headline, Geoffrey Hinton is called The Godfather of AI.
02:25
It just means people think his work was the start of AI and is really important.
02:29
Without him, AI might not exist at all.
02:35
We've had Godfather of... someone who started
02:38
or developed a new idea or thing.
02:40
OK, for example, many people regard the director
02:43
as the Godfather of Modern Cinema.
02:50
This is Learning English from the News, our podcast about the news headlines.
02:55
Today we're talking about a Nobel Prize winner who is now scared
02:59
of the technology he worked on.
03:01
As we've discussed, Geoffrey Hinton, one
03:03
of the winners of the Nobel Prize,
03:05
is very famous for his work on AI.
03:08
Now, he said he was flabbergasted about the award.
03:11
That just means really shocked.
03:13
He's often called The Godfather of AI because his work was essential
03:17
for many people using AI today who believe it could change the world.
03:21
OK, let's have another headline.
03:23
This one is from The Mirror
60
203633
1450
이것은 영ꡭ의 The Mirrorμ—μ„œ λ‚˜μ˜¨ κ²ƒμž…λ‹ˆλ‹€
03:25
in the UK. Brit scientist Geoffrey Hinton, dubbed 'The Godfather of AI',
61
205083
5450
. 'AI의 λŒ€λΆ€'둜 λΆˆλ¦¬λŠ” 영ꡭ κ³Όν•™μž μ œν”„λ¦¬ 힌튼(Geoffrey Hinton)이
03:30
wins Nobel Prize for physics.
62
210533
2350
노벨 물리학상을 μˆ˜μƒν–ˆλ‹€.
03:32
That headline, again from The Mirror in the UK.
63
212883
3267
영ꡭ의 The Mirrorμ—μ„œ λ‚˜μ˜¨ ν—€λ“œλΌμΈμž…λ‹ˆλ‹€.
03:36
Brit scientist Geoffrey Hinton, dubbed 'The Godfather of AI',
64
216150
4283
'AI의 λŒ€λΆ€'둜 λΆˆλ¦¬λŠ” 영ꡭ κ³Όν•™μž μ œν”„λ¦¬ 힌튼(Geoffrey Hinton)이
03:40
wins Nobel Prize for physics.
65
220433
2200
노벨 물리학상을 μˆ˜μƒν–ˆλ‹€.
03:42
Now we learned what The Godfather of AI means in the last headline,
66
222633
3450
이제 μš°λ¦¬λŠ” λ§ˆμ§€λ§‰ ν—€λ“œλΌμΈμ—μ„œ The Godfather of AIκ°€ 무엇을 μ˜λ―Έν•˜λŠ”μ§€ λ°°μ› μŠ΅λ‹ˆλ‹€.
03:46
but what about this word dubbed?
67
226083
2250
ν•˜μ§€λ§Œ λ”λΉ™λœ 이 λ‹¨μ–΄λŠ” μ–΄λ–»μŠ΅λ‹ˆκΉŒ?
03:48
What does it mean to dub someone something?
68
228333
3150
λˆ„κ΅°κ°€μ—κ²Œ 무언가λ₯Ό λ”λΉ™ν•œλ‹€λŠ” 것은 무엇을 μ˜λ―Έν•©λ‹ˆκΉŒ?
03:51
Well, dub is a very old English word, and it has kind of a royal meaning.
69
231483
5217
음, 더빙(dub)은 μ•„μ£Ό 였래된 μ˜μ–΄ 단어 이고 μΌμ’…μ˜ μ™•μ‘±μ˜ 의미λ₯Ό κ°–κ³  μžˆμŠ΅λ‹ˆλ‹€.
03:56
Think of kings and queens.
70
236700
1883
μ™•κ³Ό μ™•λΉ„λ₯Ό 생각해 λ³΄μ‹­μ‹œμ˜€.
03:58
It means to give someone an honour.
71
238583
2234
λˆ„κ΅°κ°€μ—κ²Œ λͺ…μ˜ˆλ₯Ό μ€€λ‹€λŠ” λœ»μž…λ‹ˆλ‹€.
04:00
So if the King or Queen dubbed someone a knight,
72
240817
3083
λ”°λΌμ„œ μ™• μ΄λ‚˜ 여왕이 λˆ„κ΅°κ°€λ₯Ό 기사라고 λΆˆλ €λ‹€λ©΄
04:03
that meant they held a ceremony and gave them a name and sometimes money or land.
73
243900
4800
μ΄λŠ” 그듀이 μ˜μ‹μ„ μ—΄κ³  이름을 λΆ€μ—¬ν•˜κ³  λ•Œλ‘œλŠ” λˆμ΄λ‚˜ 땅을 μ£Όμ—ˆλ‹€λŠ” μ˜λ―Έμž…λ‹ˆλ‹€.
04:08
OK, so has the King called the Nobel Prize winner The Godfather of AI
74
248700
5817
κ·Έλ ‡λ‹€λ©΄ ꡭ왕은 νŠΉλ³„ ν–‰μ‚¬μ—μ„œ 노벨상 μˆ˜μƒμžλ₯Ό AI의 λŒ€λΆ€λΌκ³  λΆˆλ €μŠ΅λ‹ˆκΉŒ
04:14
in a special ceremony, then?
75
254517
1983
?
04:16
No. In the headline, dubbed is more metaphorical.
04:19
Here we use dubbed to mean
04:21
given a particular name describing their achievements or personality.
04:25
OK, so Geoffrey Hinton has been dubbed The Godfather of AI.
04:30
He's been given that name by lots of people. Exactly.
04:33
And there are a few common expressions we use with dubbed.
04:36
All of them are metaphorical,
04:38
just like The Godfather of AI. We often use royal titles.
04:42
Elvis is often dubbed The King of Rock and Roll.
04:45
Madonna is sometimes dubbed The Queen of Pop.
04:48
Yes, and we also hear father, godfather, like in the story,
04:52
and godmother and mother too.
04:54
For example, Florence Nightingale is dubbed The Mother of Modern Nursing
04:59
because of her work during the Crimean War.
05:03
We've had dubbed - given a name describing their achievements.
05:07
For example, I've been dubbed The Queen of Karaoke
05:10
after my performance at the work Christmas party.
05:13
You are The Queen of Karaoke, Pippa.
05:17
This is Learning English from the News from BBC Learning English.
05:21
We're talking about a Nobel Prize winner who has warned
05:25
about the impact of the technology that made him famous.
05:28
So, Geoffrey Hinton worked on technology which made him The Godfather of AI.
05:33
But now he often warns about the tech he helped to create.
05:37
Yes. In 2023, Professor Hinton quit his job at Google
05:41
and said that AI was dangerous.
05:44
He said that one day we might not be able to control AI technology
05:48
and that it could destroy humanity.
05:51
And we have a headline about his warnings.
05:54
This one's from Politico, a European news website.
05:58
AI doomsayer wins Nobel Prize for key research.
06:02
OK, let's hear that again.
06:04
AI doomsayer wins Nobel Prize for key research.
06:08
And that's from Politico.
06:09
We're interested in the word doomsayer.
06:12
Can you tell us more, Georgie?
06:13
Yes, let's split it up a bit.
06:15
So the first part of the word is doom,
06:18
and that means a bad situation that can't be avoided.
06:21
And the second part is sayer.
06:23
That's just someone who says something.
06:26
OK, so a doomsayer is someone that says bad things are going to happen.
06:30
Yes. So the headline describes Geoffrey Hinton as a doomsayer
06:35
because of his predictions about the dangers of artificial intelligence.
06:39
Right, so are there any other ways we can use this?
06:41
Yes. Well, we often talk about lots of doomsayers.
06:44
Lots of people that say something bad is going to happen or things won't go well,
06:49
and who turn out to be wrong.
06:50
So I could say that despite all the doomsayers,
06:53
my plan for a picnic in October went really well.
06:56
It didn't rain at all.
06:58
OK, doomsayer, Godfather of AI...
07:01
It feels like there are a lot of dramatic names in this story.
07:04
Yes, this language is used to make the story more exciting
07:08
and show that even though artificial intelligence is very complicated
07:12
and hard for us to understand,
07:14
there are a lot of opinions about how it will change the future.
07:20
We've had doomsayer - someone who says bad things will happen.
07:23
For example:
07:25
lots of people thought the CEO was a doomsayer,
07:28
but he turned out to be right about the company's financial situation.
07:32
That's it for this episode of Learning English from the News.
07:35
We'll be back next week with another news story.
07:38
If you like stories about AI, we have lots of episodes
07:41
of 6 Minute English all about how technology is changing the world.
07:45
Listen and learn useful vocabulary on our website bbclearningenglish.com.
07:51
And don't forget to check us out on social media.
07:53
Just search BBC Learning English on your favourite social media platform.
07:57
Goodbye for now. Bye.
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7