Paul Bloom: Can prejudice ever be a good thing?

180,854 views ・ 2014-07-03

TED



Translator: Marssi Draw Reviewer: Coco Shen
00:12
When we think about prejudice and bias, we tend to think about stupid and evil people doing stupid and evil things. And this idea is nicely summarized by the British critic William Hazlitt, who wrote, "Prejudice is the child of ignorance." I want to try to convince you here that this is mistaken. I want to try to convince you that prejudice and bias are natural, they're often rational, and they're often even moral, and I think that once we understand this, we're in a better position to make sense of them when they go wrong, when they have horrible consequences, and we're in a better position to know what to do when this happens.
00:51
So, start with stereotypes. You look at me, you know my name, you know certain facts about me, and you could make certain judgments. You could make guesses about my ethnicity, my political affiliation, my religious beliefs. And the thing is, these judgments tend to be accurate. We're very good at this sort of thing. And we're very good at this sort of thing because our ability to stereotype people is not some sort of arbitrary quirk of the mind, but rather it's a specific instance of a more general process, which is that we have experience with things and people in the world that fall into categories, and we can use our experience to make generalizations about novel instances of these categories. So everybody here has a lot of experience with chairs and apples and dogs, and based on this, you could see unfamiliar examples and you could guess: you could sit on the chair, you could eat the apple, the dog will bark. Now we might be wrong. The chair could collapse if you sit on it, the apple might be poison, the dog might not bark, and in fact, this is my dog Tessie, who doesn't bark. But for the most part, we're good at this. For the most part, we make good guesses both in the social domain and the non-social domain, and if we weren't able to do so, if we weren't able to make guesses about new instances that we encounter, we wouldn't survive. And in fact, Hazlitt later on in his wonderful essay concedes this. He writes, "Without the aid of prejudice and custom, I should not be able to find my way across the room; nor know how to conduct myself in any circumstances, nor what to feel in any relation of life."
02:19
Or take bias. Now sometimes, we break the world up into us versus them, into in-group versus out-group, and sometimes when we do this, we know we're doing something wrong, and we're kind of ashamed of it. But other times we're proud of it. We openly acknowledge it. And my favorite example of this is a question that came from the audience in a Republican debate prior to the last election.

02:39
(Video) Anderson Cooper: Gets to your question, the question in the hall, on foreign aid? Yes, ma'am.

Woman: The American people are suffering in our country right now. Why do we continue to send foreign aid to other countries when we need all the help we can get for ourselves?

AC: Governor Perry, what about that?

(Applause)

Rick Perry: Absolutely, I think it's—
03:05
Paul Bloom: Each of the people onstage agreed with the premise of her question, which is as Americans, we should care more about Americans than about other people. And in fact, in general, people are often swayed by feelings of solidarity, loyalty, pride, patriotism, towards their country or towards their ethnic group. Regardless of your politics, many people feel proud to be American, and they favor Americans over other countries. Residents of other countries feel the same about their nation, and we feel the same about our ethnicities. Now some of you may reject this. Some of you may be so cosmopolitan that you think that ethnicity and nationality should hold no moral sway. But even you sophisticates accept that there should be some pull towards the in-group in the domain of friends and family, of people you're close to, and so even you make a distinction between us versus them. Now, this distinction is natural enough and often moral enough, but it can go awry, and this was part of the research of the great social psychologist Henri Tajfel.
04:02
Tajfel was born in Poland in 1919. He left to go to university in France, because as a Jew, he couldn't go to university in Poland, and then he enlisted in the French military in World War II. He was captured and ended up in a prisoner of war camp, and it was a terrifying time for him, because if it was discovered that he was a Jew, he could have been moved to a concentration camp, where he most likely would not have survived. And in fact, when the war ended and he was released, most of his friends and family were dead. He got involved in different pursuits. He helped out the war orphans. But he had a long-lasting interest in the science of prejudice, and so when a prestigious British scholarship on stereotypes opened up, he applied for it, and he won it, and then he began this amazing career.

04:45
And what started his career is an insight that the way most people were thinking about the Holocaust was wrong. Many people, most people at the time, viewed the Holocaust as sort of representing some tragic flaw on the part of the Germans, some genetic taint, some authoritarian personality. And Tajfel rejected this. Tajfel said what we see in the Holocaust is just an exaggeration of normal psychological processes that exist in every one of us. And to explore this, he did a series of classic studies with British adolescents. And in one of his studies, what he did was he asked the British adolescents all sorts of questions, and then based on their answers, he said, "I've looked at your answers, and based on the answers, I have determined that you are either" — he told half of them — "a Kandinsky lover, you love the work of Kandinsky, or a Klee lover, you love the work of Klee." It was entirely bogus. Their answers had nothing to do with Kandinsky or Klee. They probably hadn't heard of the artists. He just arbitrarily divided them up. But what he found was, these categories mattered, so when he later gave the subjects money, they would prefer to give the money to members of their own group than members of the other group. Worse, they were actually most interested in establishing a difference between their group and other groups, so they would give up money for their own group if by doing so they could give the other group even less.
06:10
This bias seems to show up very early. So my colleague and wife, Karen Wynn, at Yale has done a series of studies with babies where she exposes babies to puppets, and the puppets have certain food preferences. So one of the puppets might like green beans. The other puppet might like graham crackers. They test the babies' own food preferences, and babies typically prefer the graham crackers. But the question is, does this matter to babies in how they treat the puppets? And it matters a lot. They tend to prefer the puppet who has the same food tastes that they have, and worse, they actually prefer puppets who punish the puppet with the different food taste.

(Laughter)
06:49
We see this sort of in-group, out-group psychology all the time. We see it in political clashes within groups with different ideologies. We see it in its extreme in cases of war, where the out-group isn't merely given less, but dehumanized, as in the Nazi perspective of Jews as vermin or lice, or the American perspective of Japanese as rats.
07:14
Stereotypes can also go awry. So often they're rational and useful, but sometimes they're irrational, they give the wrong answers, and other times they lead to plainly immoral consequences. And the case that's been most studied is the case of race. There was a fascinating study prior to the 2008 election where social psychologists looked at the extent to which the candidates were associated with America, as in an unconscious association with the American flag. And in one of their studies they compared Obama and McCain, and they found McCain is thought of as more American than Obama, and to some extent, people aren't that surprised by hearing that. McCain is a celebrated war hero, and many people would explicitly say he has more of an American story than Obama. But they also compared Obama to British Prime Minister Tony Blair, and they found that Blair was also thought of as more American than Obama, even though subjects explicitly understood that he's not American at all. But they were responding, of course, to the color of his skin.
08:17
These stereotypes and biases have real-world consequences, both subtle and very important. In one recent study, researchers put ads on eBay for the sale of baseball cards. Some of them were held by white hands, others by black hands. They were the same baseball cards. The ones held by black hands got substantially smaller bids than the ones held by white hands. In research done at Stanford, psychologists explored the case of people sentenced for the murder of a white person. It turns out, holding everything else constant, you are considerably more likely to be executed if you look like the man on the right than the man on the left, and this is in large part because the man on the right looks more prototypically black, more prototypically African-American, and this apparently influences people's decisions over what to do about him.
09:11
So now that we know about this, how do we combat it? And there are different avenues. One avenue is to appeal to people's emotional responses, to appeal to people's empathy, and we often do that through stories. So if you are a liberal parent and you want to encourage your children to believe in the merits of nontraditional families, you might give them a book like this. ["Heather Has Two Mommies"] If you are conservative and have a different attitude, you might give them a book like this. (Laughter) ["Help! Mom! There Are Liberals under My Bed!"] But in general, stories can turn anonymous strangers into people who matter, and the idea that we care about people when we focus on them as individuals is an idea which has shown up across history. So Stalin apocryphally said, "A single death is a tragedy, a million deaths is a statistic," and Mother Teresa said, "If I look at the mass, I will never act. If I look at the one, I will."

10:01
Psychologists have explored this. For instance, in one study, people were given a list of facts about a crisis, and it was seen how much they would donate to solve this crisis, and another group was given no facts at all but they were told of an individual and given a name and given a face, and it turns out that they gave far more. None of this I think is a secret to the people who are engaged in charity work. People don't tend to deluge people with facts and statistics. Rather, you show them faces, you show them people. It's possible that by extending our sympathies to an individual, they can spread to the group that the individual belongs to.

10:42
This is Harriet Beecher Stowe. The story, perhaps apocryphal, is that President Lincoln invited her to the White House in the middle of the Civil War and said to her, "So you're the little lady who started this great war." And he was talking about "Uncle Tom's Cabin." "Uncle Tom's Cabin" is not a great book of philosophy or of theology or perhaps not even literature, but it does a great job of getting people to put themselves in the shoes of people they wouldn't otherwise be in the shoes of, put themselves in the shoes of slaves. And that could well have been a catalyst for great social change.

11:15
More recently, looking at America in the last several decades, there's some reason to believe that shows like "The Cosby Show" radically changed American attitudes towards African-Americans, while shows like "Will and Grace" and "Modern Family" changed American attitudes towards gay men and women. I don't think it's an exaggeration to say that the major catalyst in America for moral change has been a situation comedy.
11:40
But it's not all emotions, and I want to end by appealing to the power of reason. At some point in his wonderful book "The Better Angels of Our Nature," Steven Pinker says, the Old Testament says love thy neighbor, and the New Testament says love thy enemy, but I don't love either one of them, not really, but I don't want to kill them. I know I have obligations to them, but my moral feelings to them, my moral beliefs about how I should behave towards them, aren't grounded in love. What they're grounded in is the understanding of human rights, a belief that their life is as valuable to them as my life is to me, and to support this, he tells a story by the great philosopher Adam Smith, and I want to tell this story too, though I'm going to modify it a little bit for modern times.

12:24
So Adam Smith starts by asking you to imagine the death of thousands of people, and imagine that the thousands of people are in a country you are not familiar with. It could be China or India or a country in Africa. And Smith says, how would you respond? And you would say, well that's too bad, and you'd go on to the rest of your life. If you were to open up The New York Times online or something, and discover this, and in fact this happens to us all the time, we go about our lives. But imagine instead, Smith says, you were to learn that tomorrow you were to have your little finger chopped off. Smith says, that would matter a lot. You would not sleep that night wondering about that. So this raises the question: Would you sacrifice thousands of lives to save your little finger? Now answer this in the privacy of your own head, but Smith says, absolutely not, what a horrid thought. And so this raises the question, and so, as Smith puts it, "When our passive feelings are almost always so sordid and so selfish, how comes it that our active principles should often be so generous and so noble?" And Smith's answer is, "It is reason, principle, conscience. [This] calls to us, with a voice capable of astonishing the most presumptuous of our passions, that we are but one of the multitude, in no respect better than any other in it."

13:38
And this last part is what is often described as the principle of impartiality. And this principle of impartiality manifests itself in all of the world's religions, in all of the different versions of the golden rule, and in all of the world's moral philosophies, which differ in many ways but share the presupposition that we should judge morality from sort of an impartial point of view. The best articulation of this view is actually, for me, it's not from a theologian or from a philosopher, but from Humphrey Bogart at the end of "Casablanca." So, spoiler alert, he's telling his lover that they have to separate for the more general good, and he says to her, and I won't do the accent, but he says to her, "It doesn't take much to see that the problems of three little people don't amount to a hill of beans in this crazy world."
14:22
Our reason could cause us to override our passions. Our reason could motivate us to extend our empathy, could motivate us to write a book like "Uncle Tom's Cabin," or read a book like "Uncle Tom's Cabin," and our reason can motivate us to create customs and taboos and laws that will constrain us from acting upon our impulses when, as rational beings, we feel we should be constrained.

14:43
This is what a constitution is. A constitution is something which was set up in the past that applies now in the present, and what it says is, no matter how much we might want to reelect a popular president for a third term, no matter how much white Americans might choose to feel that they want to reinstate the institution of slavery, we can't. We have bound ourselves.

15:03
And we bind ourselves in other ways as well. We know that when it comes to choosing somebody for a job, for an award, we are strongly biased by their race, we are biased by their gender, we are biased by how attractive they are, and sometimes we might say, "Well fine, that's the way it should be." But other times we say, "This is wrong." And so to combat this, we don't just try harder, but rather what we do is we set up situations where these other sources of information can't bias us, which is why many orchestras audition musicians behind screens, so the only information they have is the information they believe should matter.

15:42
I think prejudice and bias illustrate a fundamental duality of human nature. We have gut feelings, instincts, emotions, and they affect our judgments and our actions for good and for evil, but we are also capable of rational deliberation and intelligent planning, and we can use these to, in some cases, accelerate and nourish our emotions, and in other cases staunch them. And it's in this way that reason helps us create a better world.

16:12
Thank you.

(Applause)