Paul Bloom: Can prejudice ever be a good thing?

180,880 views ・ 2014-07-03

TED



Translator: Zhiting Chen   Reviewer: Geoff Chen
00:12
When we think about prejudice and bias, we tend to think about stupid and evil people doing stupid and evil things. And this idea is nicely summarized by the British critic William Hazlitt, who wrote, "Prejudice is the child of ignorance." I want to try to convince you here that this is mistaken. I want to try to convince you that prejudice and bias are natural, they're often rational, and they're often even moral, and I think that once we understand this, we're in a better position to make sense of them when they go wrong, when they have horrible consequences, and we're in a better position to know what to do when this happens.
00:51
So, start with stereotypes. You look at me, you know my name, you know certain facts about me, and you could make certain judgments. You could make guesses about my ethnicity, my political affiliation, my religious beliefs. And the thing is, these judgments tend to be accurate. We're very good at this sort of thing. And we're very good at this sort of thing because our ability to stereotype people is not some sort of arbitrary quirk of the mind, but rather it's a specific instance of a more general process, which is that we have experience with things and people in the world that fall into categories, and we can use our experience to make generalizations about novel instances of these categories.
01:29
So everybody here has a lot of experience with chairs and apples and dogs, and based on this, you could see unfamiliar examples and you could guess, you could sit on the chair, you could eat the apple, the dog will bark. Now we might be wrong. The chair could collapse if you sit on it, the apple might be poison, the dog might not bark, and in fact, this is my dog Tessie, who doesn't bark. But for the most part, we're good at this. For the most part, we make good guesses both in the social domain and the non-social domain, and if we weren't able to do so, if we weren't able to make guesses about new instances that we encounter, we wouldn't survive.
02:03
And in fact, Hazlitt later on in his wonderful essay concedes this. He writes, "Without the aid of prejudice and custom, I should not be able to find my way across the room; nor know how to conduct myself in any circumstances, nor what to feel in any relation of life."
02:19
Or take bias. Now sometimes, we break the world up into us versus them, into in-group versus out-group, and sometimes when we do this, we know we're doing something wrong, and we're kind of ashamed of it. But other times we're proud of it. We openly acknowledge it. And my favorite example of this is a question that came from the audience in a Republican debate prior to the last election.
02:39
(Video) Anderson Cooper: Gets to your question, the question in the hall, on foreign aid? Yes, ma'am.

Woman: The American people are suffering in our country right now. Why do we continue to send foreign aid to other countries when we need all the help we can get for ourselves?

AC: Governor Perry, what about that?

(Applause)

Rick Perry: Absolutely, I think it's—
03:05
Paul Bloom: Each of the people onstage agreed with the premise of her question, which is as Americans, we should care more about Americans than about other people. And in fact, in general, people are often swayed by feelings of solidarity, loyalty, pride, patriotism, towards their country or towards their ethnic group. Regardless of your politics, many people feel proud to be American, and they favor Americans over other countries. Residents of other countries feel the same about their nation, and we feel the same about our ethnicities. Now some of you may reject this. Some of you may be so cosmopolitan that you think that ethnicity and nationality should hold no moral sway. But even you sophisticates accept that there should be some pull towards the in-group in the domain of friends and family, of people you're close to, and so even you make a distinction between us versus them.
03:52
Now, this distinction is natural enough and often moral enough, but it can go awry, and this was part of the research of the great social psychologist Henri Tajfel. Tajfel was born in Poland in 1919. He left to go to university in France, because as a Jew, he couldn't go to university in Poland, and then he enlisted in the French military in World War II. He was captured and ended up in a prisoner of war camp, and it was a terrifying time for him, because if it was discovered that he was a Jew, he could have been moved to a concentration camp, where he most likely would not have survived. And in fact, when the war ended and he was released, most of his friends and family were dead.

04:30
He got involved in different pursuits. He helped out the war orphans. But he had a long-lasting interest in the science of prejudice, and so when a prestigious British scholarship on stereotypes opened up, he applied for it, and he won it, and then he began this amazing career.
04:45
And what started his career is an insight that the way most people were thinking about the Holocaust was wrong. Many people, most people at the time, viewed the Holocaust as sort of representing some tragic flaw on the part of the Germans, some genetic taint, some authoritarian personality. And Tajfel rejected this. Tajfel said what we see in the Holocaust is just an exaggeration of normal psychological processes that exist in every one of us. And to explore this, he did a series of classic studies with British adolescents.
05:17
And in one of his studies, what he did was he asked the British adolescents all sorts of questions, and then based on their answers, he said, "I've looked at your answers, and based on the answers, I have determined that you are either" — he told half of them — "a Kandinsky lover, you love the work of Kandinsky, or a Klee lover, you love the work of Klee." It was entirely bogus. Their answers had nothing to do with Kandinsky or Klee. They probably hadn't heard of the artists. He just arbitrarily divided them up. But what he found was, these categories mattered, so when he later gave the subjects money, they would prefer to give the money to members of their own group than members of the other group. Worse, they were actually most interested in establishing a difference between their group and other groups, so they would give up money for their own group if by doing so they could give the other group even less.
06:10
This bias seems to show up very early. So my colleague and wife, Karen Wynn, at Yale has done a series of studies with babies where she exposes babies to puppets, and the puppets have certain food preferences. So one of the puppets might like green beans. The other puppet might like graham crackers. They test the babies' own food preferences, and babies typically prefer the graham crackers. But the question is, does this matter to babies in how they treat the puppets? And it matters a lot. They tend to prefer the puppet who has the same food tastes that they have, and worse, they actually prefer puppets who punish the puppet with the different food taste.

(Laughter)
06:49
We see this sort of in-group, out-group psychology all the time. We see it in political clashes within groups with different ideologies. We see it in its extreme in cases of war, where the out-group isn't merely given less, but dehumanized, as in the Nazi perspective of Jews as vermin or lice, or the American perspective of Japanese as rats.
07:14
Stereotypes can also go awry. So often they're rational and useful, but sometimes they're irrational, they give the wrong answers, and other times they lead to plainly immoral consequences. And the case that's been most studied is the case of race.
07:29
There was a fascinating study prior to the 2008 election where social psychologists looked at the extent to which the candidates were associated with America, as in an unconscious association with the American flag. And in one of their studies they compared Obama and McCain, and they found McCain is thought of as more American than Obama, and to some extent, people aren't that surprised by hearing that. McCain is a celebrated war hero, and many people would explicitly say he has more of an American story than Obama. But they also compared Obama to British Prime Minister Tony Blair, and they found that Blair was also thought of as more American than Obama, even though subjects explicitly understood that he's not American at all. But they were responding, of course, to the color of his skin.
08:17
These stereotypes and biases have real-world consequences, both subtle and very important. In one recent study, researchers put ads on eBay for the sale of baseball cards. Some of them were held by white hands, others by black hands. They were the same baseball cards. The ones held by black hands got substantially smaller bids than the ones held by white hands.
08:41
In research done at Stanford, psychologists explored the case of people sentenced for the murder of a white person. It turns out, holding everything else constant, you are considerably more likely to be executed if you look like the man on the right than the man on the left, and this is in large part because the man on the right looks more prototypically black, more prototypically African-American, and this apparently influences people's decisions over what to do about him.
09:11
So now that we know about this, how do we combat it? And there are different avenues. One avenue is to appeal to people's emotional responses, to appeal to people's empathy, and we often do that through stories. So if you are a liberal parent and you want to encourage your children to believe in the merits of nontraditional families, you might give them a book like this. ["Heather Has Two Mommies"] If you are conservative and have a different attitude, you might give them a book like this.

(Laughter) ["Help! Mom! There Are Liberals under My Bed!"]
09:37
But in general, stories can turn anonymous strangers into people who matter, and the idea that we care about people when we focus on them as individuals is an idea which has shown up across history. So Stalin apocryphally said, "A single death is a tragedy, a million deaths is a statistic," and Mother Teresa said, "If I look at the mass, I will never act. If I look at the one, I will."
10:01
Psychologists have explored this. For instance, in one study, people were given a list of facts about a crisis, and it was seen how much they would donate to solve this crisis, and another group was given no facts at all but they were told of an individual and given a name and given a face, and it turns out that they gave far more.
10:23
None of this I think is a secret to the people who are engaged in charity work. People don't tend to deluge people with facts and statistics. Rather, you show them faces, you show them people. It's possible that by extending our sympathies to an individual, they can spread to the group that the individual belongs to.
10:42
This is Harriet Beecher Stowe. The story, perhaps apocryphal, is that President Lincoln invited her to the White House in the middle of the Civil War and said to her, "So you're the little lady who started this great war." And he was talking about "Uncle Tom's Cabin." "Uncle Tom's Cabin" is not a great book of philosophy or of theology or perhaps not even literature, but it does a great job of getting people to put themselves in the shoes of people they wouldn't otherwise be in the shoes of, put themselves in the shoes of slaves. And that could well have been a catalyst for great social change.
11:15
More recently, looking at America in the last several decades, there's some reason to believe that shows like "The Cosby Show" radically changed American attitudes towards African-Americans, while shows like "Will and Grace" and "Modern Family" changed American attitudes towards gay men and women. I don't think it's an exaggeration to say that the major catalyst in America for moral change has been a situation comedy.
11:40
But it's not all emotions, and I want to end by appealing to the power of reason. At some point in his wonderful book "The Better Angels of Our Nature," Steven Pinker says, the Old Testament says love thy neighbor, and the New Testament says love thy enemy, but I don't love either one of them, not really, but I don't want to kill them. I know I have obligations to them, but my moral feelings to them, my moral beliefs about how I should behave towards them, aren't grounded in love. What they're grounded in is the understanding of human rights, a belief that their life is as valuable to them as my life is to me, and to support this, he tells a story by the great philosopher Adam Smith, and I want to tell this story too, though I'm going to modify it a little bit for modern times.
12:24
So Adam Smith starts by asking you to imagine the death of thousands of people, and imagine that the thousands of people are in a country you are not familiar with. It could be China or India or a country in Africa. And Smith says, how would you respond? And you would say, well that's too bad, and you'd go on to the rest of your life. If you were to open up The New York Times online or something, and discover this, and in fact this happens to us all the time, we go about our lives. But imagine instead, Smith says, you were to learn that tomorrow you were to have your little finger chopped off. Smith says, that would matter a lot. You would not sleep that night wondering about that.
13:00
So this raises the question: Would you sacrifice thousands of lives to save your little finger? Now answer this in the privacy of your own head, but Smith says, absolutely not, what a horrid thought. And so this raises the question, and so, as Smith puts it, "When our passive feelings are almost always so sordid and so selfish, how comes it that our active principles should often be so generous and so noble?" And Smith's answer is, "It is reason, principle, conscience. [This] calls to us, with a voice capable of astonishing the most presumptuous of our passions, that we are but one of the multitude, in no respect better than any other in it."
13:38
And this last part is what is often described as the principle of impartiality. And this principle of impartiality manifests itself in all of the world's religions, in all of the different versions of the golden rule, and in all of the world's moral philosophies, which differ in many ways but share the presupposition that we should judge morality from sort of an impartial point of view.
13:59
The best articulation of this view is actually, for me, it's not from a theologian or from a philosopher, but from Humphrey Bogart at the end of "Casablanca." So, spoiler alert, he's telling his lover that they have to separate for the more general good, and he says to her, and I won't do the accent, but he says to her, "It doesn't take much to see that the problems of three little people don't amount to a hill of beans in this crazy world."
14:22
Our reason could cause us to override our passions. Our reason could motivate us to extend our empathy, could motivate us to write a book like "Uncle Tom's Cabin," or read a book like "Uncle Tom's Cabin," and our reason can motivate us to create customs and taboos and laws that will constrain us from acting upon our impulses when, as rational beings, we feel we should be constrained.
14:43
This is what a constitution is. A constitution is something which was set up in the past that applies now in the present, and what it says is, no matter how much we might want to reelect a popular president for a third term, no matter how much white Americans might choose to feel that they want to reinstate the institution of slavery, we can't. We have bound ourselves.
15:03
And we bind ourselves in other ways as well. We know that when it comes to choosing somebody for a job, for an award, we are strongly biased by their race, we are biased by their gender, we are biased by how attractive they are, and sometimes we might say, "Well fine, that's the way it should be." But other times we say, "This is wrong." And so to combat this, we don't just try harder, but rather what we do is we set up situations where these other sources of information can't bias us, which is why many orchestras audition musicians behind screens, so the only information they have is the information they believe should matter.
15:42
I think prejudice and bias illustrate a fundamental duality of human nature. We have gut feelings, instincts, emotions, and they affect our judgments and our actions for good and for evil, but we are also capable of rational deliberation and intelligent planning, and we can use these to, in some cases, accelerate and nourish our emotions, and in other cases staunch them. And it's in this way that reason helps us create a better world.
16:12
Thank you.

(Applause)