Do politics make us irrational? - Jay Van Bavel

541,518 views ・ 2020-02-04

TED-Ed


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: κ²€ν† : DK Kim
00:06
In 2013, a team of researchers held a math test.
00:11
The exam was administered to over 1,100 American adults,
00:15
and designed, in part, to test their ability to evaluate sets of data.
00:21
Hidden among these math problems were two almost identical questions.
00:26
Both problems used the same difficult data set,
00:29
and each had one objectively correct answer.
00:33
The first asked about the correlation between rashes and a new skin cream.
00:38
The second asked about the correlation between crime rates
00:42
and gun control legislation.
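In covariance-detection tasks like these, the data are typically presented as a simple two-by-two table of counts, and getting the right answer means comparing the proportion of positive outcomes within each group rather than the raw counts. A minimal sketch of that comparison in Python, using made-up numbers for the skin-cream version rather than the study's actual figures:

```python
# Hypothetical counts for the skin-cream question (illustrative only;
# these are not the figures used in the 2013 study).
used_cream = {"improved": 200, "worsened": 100}
no_cream = {"improved": 80, "worsened": 20}

def improvement_rate(group):
    """Fraction of patients in the group whose rash improved."""
    return group["improved"] / (group["improved"] + group["worsened"])

cream_rate = improvement_rate(used_cream)   # 200 / 300, about 67%
control_rate = improvement_rate(no_cream)   # 80 / 100, exactly 80%

# The tempting shortcut compares raw counts (200 > 80) and concludes the
# cream works; the correct reading compares rates within each group.
print(f"Improved with cream:    {cream_rate:.0%}")
print(f"Improved without cream: {control_rate:.0%}")
print("Cream associated with improvement?", cream_rate > control_rate)
```

With these invented numbers the cream group has more recoveries in absolute terms but a lower recovery rate, which is the kind of trap that separates careful analysis from a quick intuitive read.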
00:45
Participants with strong math skills
00:47
were much more likely to get the first question correct.
00:51
But despite being mathematically identical,
00:54
the results for the second question looked totally different.
00:58
Here, math skills weren’t the best predictor
01:01
of which participants answered correctly.
01:04
Instead, another variable the researchers had been tracking came into play:
01:10
political identity.
01:12
Participants whose political beliefs
01:14
aligned with a correct interpretation of the data
01:17
were far more likely to answer the problem right.
01:20
Even the study’s top mathematicians
01:23
were 45% more likely to get the second question wrong
01:28
when the correct answer challenged their political beliefs.
01:32
What is it about politics that inspires this kind of illogical error?
01:38
Can someone’s political identity actually affect their ability
01:41
to process information?
01:43
The answer lies in a cognitive phenomenon
01:46
that has become increasingly visible in public life: partisanship.
01:51
While it’s often invoked in the context of politics,
01:55
partisanship is more broadly defined as a strong preference or bias
02:00
towards any particular group or idea.
02:03
Our political, ethnic, religious, and national identities
02:07
are all different forms of partisanship.
02:10
Of course, identifying with social groups
02:13
is an essential and healthy part of human life.
02:16
Our sense of self is defined not only by who we are as individuals,
02:21
but also by the groups we belong to.
02:23
As a result, we’re strongly motivated to defend our group identities,
02:28
protecting both our sense of self and our social communities.
02:32
But this becomes a problem when the group’s beliefs
02:35
are at odds with reality.
02:37
Imagine watching your favorite sports team commit a serious foul.
02:41
You know that’s against the rules,
02:43
but your fellow fans think it’s totally acceptable.
02:46
The tension between these two incompatible thoughts
02:49
is called cognitive dissonance,
02:52
and most people are driven to resolve this uncomfortable state of limbo.
02:57
You might start to blame the referee, complain that the other team started it,
03:01
or even convince yourself there was no foul in the first place.
03:06
In a case like this,
03:07
people are often more motivated to maintain a positive relationship
03:11
with their group than perceive the world accurately.
03:15
This behavior is especially dangerous in politics.
03:19
On an individual scale,
03:21
allegiance to a party allows people to create a political identity
03:25
and support policies they agree with.
03:27
But partisan-based cognitive dissonance can lead people to reject evidence
03:32
that’s inconsistent with the party line or discredits party leaders.
03:37
And when entire groups of people revise the facts in service of partisan beliefs,
03:42
it can lead to policies that aren’t grounded in truth or reason.
03:47
This problem isn’t newβ€”
03:49
political identities have been around for centuries.
03:52
But studies show that partisan polarization
03:55
has increased dramatically in the last few decades.
03:58
One theory explaining this increase
04:00
is the trend towards clustering geographically in like-minded communities.
04:05
Another is the growing tendency to rely on partisan news
04:09
or social media bubbles.
04:11
These often act like echo chambers,
04:14
delivering news and ideas from people with similar views.
04:18
Fortunately, cognitive scientists have uncovered some strategies
04:22
for resisting this distortion filter.
04:25
One is to remember that you’re probably more biased than you think.
04:30
So when you encounter new information,
04:32
make a deliberate effort to push through your initial intuition
04:36
and evaluate it analytically.
04:38
In your own groups, try to make fact-checking and questioning assumptions
04:42
a valued part of the culture.
04:45
Warning people that they might have been presented with misinformation
04:48
can also help.
04:50
And when you’re trying to persuade someone else,
04:52
affirming their values and framing the issue in their language
04:57
can help make people more receptive.
05:00
We still have a long way to go before solving the problem of partisanship.
05:05
But hopefully, these tools can help keep us better informed,
05:08
and capable of making evidence-based decisions about our shared reality.