Are opinion polls accurate? - 6 Minute English

76,520 views ・ 2022-12-15

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:08
Hello. This is 6 Minute English from BBC Learning English. I'm Neil.
00:12
And I'm Sam. Predicting the future is not easy, but that's exactly the job of opinion pollsters - researchers who ask people questions to discover what they think about certain topics. Often their aim is predicting which political party will win in an election by asking members of the public how they intend to vote.
00:31
But predicting the future is never 100 per cent accurate, and opinion polls don't always get it right.
00:39
In 2016, few pollsters predicted a victory for Donald Trump over Hillary Clinton in the US presidential election. And in the 2020 US elections, most polls predicted Trump would lose to Joe Biden by a much larger amount than he actually did. These mistakes, sometimes called 'misfires' - when things do not work in the way intended - have damaged the reputation of opinion pollsters. In this programme, we'll be taking a look into the opinion polling industry and, of course, learning some useful vocabulary as well.
01:13
But first I have a question for you, Sam. It's about another time when the opinion polls got it wrong. Few pollsters predicted that Britain would vote to leave the European Union in the 2016 Brexit referendum, which, in the end, it did. But what was the final split between those who voted to leave and those who wanted to remain?
01:35
Was it:
a) 51 leave to 49 remain,
b) 52 leave to 48 remain, or
c) 52 remain to 48 leave?
01:46
I think it was b) - 52 per cent voted to leave and 48 per cent to remain.
01:52
OK, Sam, I'll reveal the answer at the end of the programme.
01:56
One of the biggest polling companies was founded by George Gallup, born in 1901 on a farm in Iowa. Gallup was a student of journalism. He wanted to know people's opinion on a range of subjects and came up with a simple idea - why not try asking them?
02:13
Here's G Elliott Morris, a data journalist from the Economist, explaining more to the BBC World Service programme, More or Less.
02:21
And he publishes his dissertation on this - how to measure what people want, basically. And he gets hired by a much bigger advertising agency in New York called Young and Rubicam, and they basically give him a blank cheque to do their research, to figure out how to call people, how to talk to them, to figure out if they remember or liked a certain product. Basically, to figure out early methodologies in advertising, and then by 1931 or so, he's wondering, well, if it works for toothpaste, why not politics?
02:53
George Gallup tried to figure out what customers wanted to buy. If you figure something out, you finally understand it or find a solution to a problem after thinking about it a lot.
03:05
Later, he was hired by a New York advertising agency to find out people's opinion of consumer products like toothpaste and soft drinks. George was given a 'blank cheque' - an unlimited amount of money and freedom to do his job.
03:20
At this time, polling was focused on consumer preferences, not politics. But asking people about their political views is a lot more complicated than asking them about toothpaste.
03:32
Making accurate election predictions depends on polling a sample group of people who accurately represent the population as a whole. One of the reasons for pollsters' failure to predict Trump's election in 2016 is that they didn't ask enough white, non-college-educated voters.
03:50
So, polling is a very complex process - one which is never totally reliable, according to G Elliott Morris, speaking again here to BBC World Service's More or Less.
04:02
If people were understanding this process that is generating all these polls, then they would understand polls as less, sort of, precise tools - tools they definitely can't offer the laser-like predictive accuracy we've come to expect from them - then the difference between polling's expectations and performance wouldn't be so stark.
04:22
Opinion polls can estimate the outcome of an election, but they can't give us laser-like accuracy. If you describe something as 'laser-like', you mean it's very accurate and focused, like a laser.
04:35
If people understand how hard it is to predict the future, they might be more realistic about how accurate opinion polls can be. Then differences between a prediction and the final result would not be so stark - obvious and easily visible, or harsh.
04:52
Predicting the future is difficult - otherwise everyone would be a lottery winner by now. Maybe it's not opinion polls that are broken, but our desire to know the future that's the problem.
05:05
OK, it's time to reveal the answer to my question about the Brexit referendum. I said the final result was 52 per cent for leave, and 48 per cent for remain. Which was the correct answer. And another example of an opinion poll 'misfire' - a situation where something does not work as intended.
05:25
OK, let's recap the rest of the vocabulary from this programme about opinion pollsters - people who conduct polls asking the public their opinion on particular subjects, especially politics.
05:36
If you figure something out, you finally understand it or find the solution to a problem after thinking long and hard about it.
05:45
If someone gives you a blank cheque, you have unlimited money and freedom to complete a task.
05:51
When you describe something as 'laser-like', you mean that it's very accurate and precise.
05:56
And finally, the adjective 'stark' has several meanings, including 'obvious', 'harsh' and 'plain'.
06:03
Once again, our six minutes are up. Bye for now.
06:05
Bye bye.
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7