Rhetoric: How persuasive are you? - 6 Minute English

78,984 views ・ 2022-04-14

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:08
Hello. This is 6 Minute English from BBC Learning English. I'm Neil. And I'm Sam.
00:13
'Friends, Romans, countrymen, lend me your ears!' Do you know where these famous words are from, Sam?

00:18
I think that's a speech by Marc Antony in William Shakespeare's play, Julius Caesar.

00:23
Wow, I'm impressed! Caesar has been assassinated and Marc Antony tries to persuade the crowd to find his killers. Using words to persuade people, giving them a good reason to do what you say, or to accept your argument, is known as 'rhetoric'. In this programme, we'll be hearing all about rhetoric and of course learning some related vocabulary as well.

00:48
The art of rhetoric started with the ancient Greek philosophers. Later, during the Roman republic, politicians and statesmen used rhetoric in speeches given to crowds in the public square.

01:00
Although technology has transformed the way we communicate since then, the art of rhetoric is still alive today. Modern politicians may prefer Twitter to the public square, but they still use persuasive language, including soundbites - short sentences or phrases giving a message in an easy-to-remember way.

01:21
We'll hear more soon but first I have a question for you, Sam. Roman politicians used many rhetorical tricks to persuade people, including the argumentum ad hominem, which was an attack on their opponent's moral character. Another was called the argumentum ad baculum - but what did it mean? Was it: a) an argument based on logic? b) an argument based on emotion? or c) an argument based on the stick?

01:53
Well, to persuade someone your argument needs to be logical, so I'll say a).

01:57
OK, we'll find out the answer later. Whether you want someone to vote for you, or to buy what you're selling, rhetoric can make your message persuasive. During his career in the advertising industry, Sam Tatum learned a lot about persuading people.
02:16
Here he explains the many uses of rhetoric to BBC World Service programme, The Why Factor.

02:21
Rhetoric is persuasive language. We use it to rally, to simplify the complex, to inspire and influence.
02:30
It's important, I think, to identify what strategies might be influencing us more than we think. By understanding the power of language in shaping perceptions, we can start to see, 'I'm wondering why people are looking to be so concrete. Are we trying to pull the wool over our eyes on something that's far more complex than we actually state?'

02:45
As well as persuading people, Sam Tatum says rhetoric can be used to rally - to bring people together in support of a common goal. A recent example of this is the way politicians called the coronavirus our 'enemy'.

03:00
The words politicians choose, and the way they use them, can influence us more than we think. Sam Tatum says we should question whether political rhetoric is trying to pull the wool over our eyes, an informal way of saying trick or deceive us.

03:15
But in the age of 24-hour news updates and non-stop Twitter, has the skill of making a thoughtful argument been lost?
03:26
Here's Kendal Phillips, professor of political philosophy at Syracuse University, speaking to BBC World Service's The Why Factor.

03:34
It's hard to analyse the argument or reasoning of a tweet, 'cos 280 characters is not a way for me to lay out a logical argument with a major premise, a minor premise and a conclusion; it's much easier to just use a two-word phrase or a hashtag that usually ends up adding to that kind of polemical division between my side and their side.

03:54
Global problems involve complex issues which cannot be solved in 280 characters, the maximum length of a tweet allowed by Twitter.
04:06
According to Professor Kendal, we need logical arguments containing a premise - something which you think is true and you use as the basis for developing your idea, and a conclusion - your decision or plan of action based on carefully considering all the relevant facts.

04:22
For example: climate change is damaging the planet - that's a premise; therefore, we should act to stop it - that's a conclusion.

04:32
Few issues are simply black and white though, and this is a problem because Twitter debates are often polemical - argued very strongly either for or against a particular opinion or idea.

04:45
If you believe passionately in something, you need to explain it to people in a way they understand, and in ancient times rhetoric also meant building bridges between people and finding common ground.

04:58
Like those Romans you mentioned, Neil.

05:00
Yes, in my question I asked Sam for the meaning of the term, argumentum ad baculum.

05:05
I guessed it was an argument based on logic.

05:10
Which was the wrong answer, I'm afraid. In fact, argumentum ad baculum means the argument with a stick, or in other words, hitting somebody with a stick until they agree with you!

05:23
I guess that's one way to win an argument.
05:26
OK, let's recap the vocabulary from the programme, starting with a soundbite - a short sentence or phrase designed to stick in the memory.

05:34
When people rally together, they unite to support a common goal.

05:38
To pull the wool over someone's eyes means to trick someone.

05:42
Logical arguments contain a premise - a truth used as the basis for developing an argument, and a conclusion - a decision based on carefully considering all the relevant facts.

05:54
And finally, polemical means strongly attacking or defending an opinion or idea.

05:59
But there's no arguing the fact that once again our six minutes are up!
06:04
Goodbye for now! Bye!
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7