Paul Bloom: Can prejudice ever be a good thing?

184,833 views ・ 2014-07-03

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translated by Gemma Lee, reviewed by Sunphil Ga
00:12

When we think about prejudice and bias, we tend to think about stupid and evil people doing stupid and evil things. And this idea is nicely summarized by the British critic William Hazlitt, who wrote, "Prejudice is the child of ignorance." I want to try to convince you here that this is mistaken. I want to try to convince you that prejudice and bias are natural, they're often rational, and they're often even moral, and I think that once we understand this, we're in a better position to make sense of them when they go wrong, when they have horrible consequences, and we're in a better position to know what to do when this happens.
00:51

So, start with stereotypes. You look at me, you know my name, you know certain facts about me, and you could make certain judgments. You could make guesses about my ethnicity, my political affiliation, my religious beliefs. And the thing is, these judgments tend to be accurate. We're very good at this sort of thing. And we're very good at this sort of thing because our ability to stereotype people is not some sort of arbitrary quirk of the mind, but rather it's a specific instance of a more general process, which is that we have experience with things and people in the world that fall into categories, and we can use our experience to make generalizations about novel instances of these categories. So everybody here has a lot of experience with chairs and apples and dogs, and based on this, you could see unfamiliar examples and you could guess: you could sit on the chair, you could eat the apple, the dog will bark. Now we might be wrong. The chair could collapse if you sit on it, the apple might be poison, the dog might not bark, and in fact, this is my dog Tessie, who doesn't bark. But for the most part, we're good at this. For the most part, we make good guesses both in the social domain and the non-social domain, and if we weren't able to do so, if we weren't able to make guesses about new instances that we encounter, we wouldn't survive. And in fact, Hazlitt later on in his wonderful essay concedes this. He writes, "Without the aid of prejudice and custom, I should not be able to find my way across the room; nor know how to conduct myself in any circumstances, nor what to feel in any relation of life."
02:19

Or take bias. Now sometimes, we break the world up into us versus them, into in-group versus out-group, and sometimes when we do this, we know we're doing something wrong, and we're kind of ashamed of it. But other times we're proud of it. We openly acknowledge it. And my favorite example of this is a question that came from the audience in a Republican debate prior to the last election.
02:39

(Video) Anderson Cooper: Gets to your question, the question in the hall, on foreign aid? Yes, ma'am.

Woman: The American people are suffering in our country right now. Why do we continue to send foreign aid to other countries when we need all the help we can get for ourselves?

AC: Governor Perry, what about that?

(Applause)

Rick Perry: Absolutely, I think it's—
03:05

Paul Bloom: Each of the people onstage agreed with the premise of her question, which is as Americans, we should care more about Americans than about other people. And in fact, in general, people are often swayed by feelings of solidarity, loyalty, pride, patriotism, towards their country or towards their ethnic group. Regardless of your politics, many people feel proud to be American, and they favor Americans over other countries. Residents of other countries feel the same about their nation, and we feel the same about our ethnicities. Now some of you may reject this. Some of you may be so cosmopolitan that you think that ethnicity and nationality should hold no moral sway. But even you sophisticates accept that there should be some pull towards the in-group in the domain of friends and family, of people you're close to, and so even you make a distinction between us versus them. Now, this distinction is natural enough and often moral enough, but it can go awry, and this was part of the research of the great social psychologist Henri Tajfel.
04:02

Tajfel was born in Poland in 1919. He left to go to university in France, because as a Jew, he couldn't go to university in Poland, and then he enlisted in the French military in World War II. He was captured and ended up in a prisoner of war camp, and it was a terrifying time for him, because if it was discovered that he was a Jew, he could have been moved to a concentration camp, where he most likely would not have survived. And in fact, when the war ended and he was released, most of his friends and family were dead. He got involved in different pursuits. He helped out the war orphans. But he had a long-lasting interest in the science of prejudice, and so when a prestigious British scholarship on stereotypes opened up, he applied for it, and he won it, and then he began this amazing career.
04:45

And what started his career is an insight that the way most people were thinking about the Holocaust was wrong. Many people, most people at the time, viewed the Holocaust as sort of representing some tragic flaw on the part of the Germans, some genetic taint, some authoritarian personality. And Tajfel rejected this. Tajfel said what we see in the Holocaust is just an exaggeration of normal psychological processes that exist in every one of us. And to explore this, he did a series of classic studies with British adolescents. And in one of his studies, what he did was he asked the British adolescents all sorts of questions, and then based on their answers, he said, "I've looked at your answers, and based on the answers, I have determined that you are either" (he told half of them) "a Kandinsky lover, you love the work of Kandinsky, or a Klee lover, you love the work of Klee." It was entirely bogus. Their answers had nothing to do with Kandinsky or Klee. They probably hadn't heard of the artists. He just arbitrarily divided them up. But what he found was, these categories mattered, so when he later gave the subjects money, they would prefer to give the money to members of their own group than members of the other group. Worse, they were actually most interested in establishing a difference between their group and other groups, so they would give up money for their own group if by doing so they could give the other group even less.
06:10

This bias seems to show up very early. So my colleague and wife, Karen Wynn, at Yale has done a series of studies with babies where she exposes babies to puppets, and the puppets have certain food preferences. So one of the puppets might like green beans. The other puppet might like graham crackers. They test the babies' own food preferences, and babies typically prefer the graham crackers. But the question is, does this matter to babies in how they treat the puppets? And it matters a lot. They tend to prefer the puppet who has the same food tastes that they have, and worse, they actually prefer puppets who punish the puppet with the different food taste.

(Laughter)
06:49

We see this sort of in-group, out-group psychology all the time. We see it in political clashes within groups with different ideologies. We see it in its extreme in cases of war, where the out-group isn't merely given less, but dehumanized, as in the Nazi perspective of Jews as vermin or lice, or the American perspective of Japanese as rats.
07:14

Stereotypes can also go awry. So often they're rational and useful, but sometimes they're irrational, they give the wrong answers, and other times they lead to plainly immoral consequences. And the case that's been most studied is the case of race. There was a fascinating study prior to the 2008 election where social psychologists looked at the extent to which the candidates were associated with America, as in an unconscious association with the American flag. And in one of their studies they compared Obama and McCain, and they found McCain is thought of as more American than Obama, and to some extent, people aren't that surprised by hearing that. McCain is a celebrated war hero, and many people would explicitly say he has more of an American story than Obama. But they also compared Obama to British Prime Minister Tony Blair, and they found that Blair was also thought of as more American than Obama, even though subjects explicitly understood that he's not American at all. But they were responding, of course, to the color of his skin.
08:17

These stereotypes and biases have real-world consequences, both subtle and very important. In one recent study, researchers put ads on eBay for the sale of baseball cards. Some of them were held by white hands, others by black hands. They were the same baseball cards. The ones held by black hands got substantially smaller bids than the ones held by white hands. In research done at Stanford, psychologists explored the case of people sentenced for the murder of a white person. It turns out, holding everything else constant, you are considerably more likely to be executed if you look like the man on the right than the man on the left, and this is in large part because the man on the right looks more prototypically black, more prototypically African-American, and this apparently influences people's decisions over what to do about him.
09:11

So now that we know about this, how do we combat it? And there are different avenues. One avenue is to appeal to people's emotional responses, to appeal to people's empathy, and we often do that through stories. So if you are a liberal parent and you want to encourage your children to believe in the merits of nontraditional families, you might give them a book like this. ["Heather Has Two Mommies"] If you are conservative and have a different attitude, you might give them a book like this. (Laughter) ["Help! Mom! There Are Liberals under My Bed!"] But in general, stories can turn anonymous strangers into people who matter, and the idea that we care about people when we focus on them as individuals is an idea which has shown up across history. So Stalin apocryphally said, "A single death is a tragedy, a million deaths is a statistic," and Mother Teresa said, "If I look at the mass, I will never act. If I look at the one, I will." Psychologists have explored this. For instance, in one study, people were given a list of facts about a crisis, and it was seen how much they would donate to solve this crisis, and another group was given no facts at all but they were told of an individual and given a name and given a face, and it turns out that they gave far more. None of this I think is a secret to the people who are engaged in charity work. People don't tend to deluge people with facts and statistics. Rather, you show them faces, you show them people. It's possible that by extending our sympathies to an individual, they can spread to the group that the individual belongs to.
10:42

This is Harriet Beecher Stowe. The story, perhaps apocryphal, is that President Lincoln invited her to the White House in the middle of the Civil War and said to her, "So you're the little lady who started this great war." And he was talking about "Uncle Tom's Cabin." "Uncle Tom's Cabin" is not a great book of philosophy or of theology or perhaps not even literature, but it does a great job of getting people to put themselves in the shoes of people they wouldn't otherwise be in the shoes of, put themselves in the shoes of slaves. And that could well have been a catalyst for great social change. More recently, looking at America in the last several decades, there's some reason to believe that shows like "The Cosby Show" radically changed American attitudes towards African-Americans, while shows like "Will and Grace" and "Modern Family" changed American attitudes towards gay men and women. I don't think it's an exaggeration to say that the major catalyst in America for moral change has been a situation comedy.
11:40
But it's not all emotions, and I want to end by appealing to the power of reason. At some point in his wonderful book "The Better Angels of Our Nature," Steven Pinker says, the Old Testament says love thy neighbor, and the New Testament says love thy enemy, but I don't love either one of them, not really, but I don't want to kill them. I know I have obligations to them, but my moral feelings to them, my moral beliefs about how I should behave towards them, aren't grounded in love. What they're grounded in is the understanding of human rights, a belief that their life is as valuable to them as my life is to me, and to support this, he tells a story by the great philosopher Adam Smith, and I want to tell this story too, though I'm going to modify it a little bit for modern times.
12:24
So Adam Smith starts by asking you to imagine the death of thousands of people, and imagine that the thousands of people are in a country you are not familiar with. It could be China or India or a country in Africa. And Smith says, how would you respond? And you would say, well, that's too bad, and you'd go on to the rest of your life. If you were to open up The New York Times online or something, and discover this, and in fact this happens to us all the time, we go about our lives. But imagine instead, Smith says, you were to learn that tomorrow you were to have your little finger chopped off. Smith says, that would matter a lot. You would not sleep that night wondering about that.
13:00
So this raises the question: Would you sacrifice thousands of lives to save your little finger? Now answer this in the privacy of your own head, but Smith says, absolutely not, what a horrid thought. And so this raises the question, as Smith puts it: "When our passive feelings are almost always so sordid and so selfish, how comes it that our active principles should often be so generous and so noble?" And Smith's answer is, "It is reason, principle, conscience. [This] calls to us, with a voice capable of astonishing the most presumptuous of our passions, that we are but one of the multitude, in no respect better than any other in it."
13:38
And this last part is what is often described as the principle of impartiality. And this principle of impartiality manifests itself in all of the world's religions, in all of the different versions of the golden rule, and in all of the world's moral philosophies, which differ in many ways but share the presupposition that we should judge morality from sort of an impartial point of view.
13:59
The best articulation of this view is actually, for me, not from a theologian or from a philosopher, but from Humphrey Bogart at the end of "Casablanca." So, spoiler alert: he's telling his lover that they have to separate for the more general good, and he says to her, and I won't do the accent, "It doesn't take much to see that the problems of three little people don't amount to a hill of beans in this crazy world."
14:22
Our reason could cause us to override our passions. Our reason could motivate us to extend our empathy, could motivate us to write a book like "Uncle Tom's Cabin," or read a book like "Uncle Tom's Cabin," and our reason can motivate us to create customs and taboos and laws that will constrain us from acting upon our impulses when, as rational beings, we feel we should be constrained.
14:43
This is what a constitution is. A constitution is something which was set up in the past that applies now in the present, and what it says is, no matter how much we might want to reelect a popular president for a third term, no matter how much white Americans might choose to feel that they want to reinstate the institution of slavery, we can't. We have bound ourselves.
15:03
And we bind ourselves in other ways as well. We know that when it comes to choosing somebody for a job, for an award, we are strongly biased by their race, we are biased by their gender, we are biased by how attractive they are, and sometimes we might say, "Well fine, that's the way it should be." But other times we say, "This is wrong." And so to combat this, we don't just try harder, but rather what we do is we set up situations where these other sources of information can't bias us, which is why many orchestras audition musicians behind screens, so the only information they have is the information they believe should matter.
15:42
I think prejudice and bias illustrate a fundamental duality of human nature. We have gut feelings, instincts, emotions, and they affect our judgments and our actions for good and for evil, but we are also capable of rational deliberation and intelligent planning, and we can use these to, in some cases, accelerate and nourish our emotions, and in other cases staunch them. And it's in this way that reason helps us create a better world.
16:12
Thank you.

(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7