Rage bait: How online anger makes money ⏲️ 6 Minute English

61,810 views ・ 2025-02-13

BBC Learning English


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λ²ˆμ—­λœ μžλ§‰μ€ 기계 λ²ˆμ—­λ©λ‹ˆλ‹€.

00:08
Hello, this is Six Minute English from BBC Learning English.
00:12
I'm Phil, and I'm Beth.
00:14
If you use the internet, and nearly everyone does,
00:17
you've probably read headlines like this:
00:20
You won't believe what plastic surgery this celebrity has had done.
00:25
Known as clickbait, headlines like these are used to grab your attention
00:30
and make you read more.
00:31
But now a new trend called 'rage bait' is spreading across social media.
00:36
Rage bait is online content designed to make you angry or outraged.
00:41
In this programme, we'll explore the trend of rage baiting.
00:45
And as usual, we'll learn some useful new vocabulary,
00:48
all of which you can find on our website bbclearningenglish.com.
00:53
Great. But first, it's time for my question,
00:56
Beth, which is about the two terms we've been using 'clickbait' and 'rage bait'.
01:02
Both contain the word 'bait'.
01:05
But what is its actual meaning?
01:07
Is 'bait': a) a strong feeling of anger?
01:11
b) food put on a hook to catch fish or animals.
01:15
Or c) a piece of computer software.
01:18
I am fairly confident that it is
01:20
b) food put on a hook.
01:22
OK. Well, we'll find out the correct answer later in the programme.
01:26
Here's how rage bait works,
01:29
a creator posts a provocative piece of content online, a message, maybe,
01:34
or a video. People see it, feel outraged, and comment angrily.
01:40
Others see it, like it, and share it around.
01:43
Either way, the content creates interest, increases internet traffic,
01:48
and makes money for the creator.
01:50
Here's marketing strategist Andrea Jones explaining more to Megan Lawton,
01:55
presenter of BBC World Service programme Business Daily.
01:59
The more content they create, the more engagement they get,
02:01
the more that they get paid.
02:03
And so they will do anything...
02:05
Some creators will do anything to get more views,
02:08
because the more views they get, the more that they get paid.
02:10
Even if, even if those views are negative or inciting rage and anger in people.
02:17
Andrea, how is rage bait different to clickbait or other online tactics?
02:22
As a marketer, I'm always, you know, coaching my clients
02:25
and talking to them about using hooks in their marketing. Right.
02:27
And I think the difference between a hook and rage bait
02:31
or even its long-lost cousin, clickbait content...
02:34
When we think about a hook,
02:36
to me a hook accurately reflects what's in that piece of content,
02:40
and it comes from a place of trust.
02:43
Whereas rage baiting content is designed to be manipulative.
02:48
Andrea says the more reaction a post gets, the more money the creator makes,
02:54
even if the reactions are hateful. She uses the structure,
02:58
the more one thing happens, the more something else happens,
03:02
to show that as one thing happens repeatedly,
03:05
another thing increases as a result.
03:08
For example, the more you practice English, the more you'll improve.
03:12
It doesn't matter that the content is designed to incite outrage - to encourage
03:17
unpleasant or violent reactions.
03:19
That's why Andrea thinks rage bait is worse than clickbait.
03:24
While clickbait is more truthful about its content,
03:27
rage bait is manipulative, meaning it tries to influence something
03:32
to its own advantage.
03:33
So, it's not hard to see why many people think rage bait is toxic.
03:38
So, if you're wondering why people react in the first place, listen
03:43
as Doctor William Brady explains the psychology behind rage bait
03:47
to BBC World Service's Business Daily.
03:50
If you are an influencer and you want to figure out,
03:54
'Well, how do I get more eyeballs on my content?'
03:57
Well, you need to exploit those biases we have in our psychology,
04:02
because that's the content we'll pay more attention to.
04:04
In turn, that's the content that algorithms will amplify,
04:07
which ultimately means more advertising revenue.
04:10
Influencers want more eyeballs on their content -
04:14
more people to view their website or social media posts
04:18
and use human psychology to do it.
04:21
Psychologically speaking, we all have biases -
04:24
feelings - which are often unconscious either for or against a certain idea
04:30
or group of people.
04:31
Biases are emotional, and since listening
04:34
to our emotions has been vital to the evolution of the human species,
04:38
creators know that provoking our emotions will grab our attention.
04:43
So, what can be done to combat rage bait?
04:46
Well, we could all take a moment to think before reacting and remember
04:50
that by sharing something outrageous, you might be making things worse
04:54
while also making the creator richer.
04:57
OK, Phil, let's reveal the answer to your question.
05:00
You asked me what 'bait' is.
05:02
I did, and the correct answer is b) food put on a hook
05:06
to catch a fish or an animal,
05:08
and isn't that what you said, Beth?
05:10
It is, yes. Right again!
05:12
OK, let's recap the vocabulary we've learnt in the programme,
05:16
starting with 'rage bait', social media content designed to provoke anger,
05:22
thereby encouraging people to engage with it.
05:25
The structure 'the more... the more...' describes the situation
05:29
when one thing happening repeatedly results in another thing happening too.
05:34
'To incite' someone means to encourage them to do
05:38
or feel something unpleasant or violent.
05:41
The adjective 'manipulative' describes trying to influence or control someone
05:46
to your own advantage.
05:48
'Eyeballs' is an informal word for the number
05:50
of people viewing a particular website or television programme.
05:54
And finally, biases are feelings, often unconscious, either for
05:59
or against an idea or a group of people.
06:02
Once again, our six minutes are up.
06:04
Goodbye for now. Bye.