Jonathan Haidt: The moral roots of liberals and conservatives

703,601 views ・ 2008-09-18

TED



Translator: Coco Shen   Reviewer: Zachary Lin Zhao
00:18
Suppose that two American friends are traveling together in Italy. They go to see Michelangelo's "David," and when they finally come face-to-face with the statue, they both freeze dead in their tracks. The first guy -- we'll call him Adam -- is transfixed by the beauty of the perfect human form. The second guy -- we'll call him Bill -- is transfixed by embarrassment, at staring at the thing there in the center.

00:40
So here's my question for you: Which one of these two guys was more likely to have voted for George Bush, which for Al Gore? I don't need a show of hands, because we all have the same political stereotypes. We all know that it's Bill. And in this case, the stereotype corresponds to reality. It really is a fact that liberals are much higher than conservatives on a major personality trait called openness to experience. People who are high in openness to experience just crave novelty, variety, diversity, new ideas, travel. People low on it like things that are familiar, that are safe and dependable.

01:16
If you know about this trait, you can understand a lot of puzzles about human behavior, like why artists are so different from accountants. You can predict what kinds of books they like to read, what kinds of places they like to travel to and what kinds of food they like to eat. Once you understand this trait, you can understand why anybody would eat at Applebee's, but not anybody that you know.

(Laughter)
01:40
This trait also tells us a lot about politics. The main researcher of this trait, Robert McCrae, says that "Open individuals have an affinity for liberal, progressive, left-wing political views ..." They like a society which is open and changing, "... whereas closed individuals prefer conservative, traditional, right-wing views."

01:57
This trait also tells us a lot about the kinds of groups people join. Here's the description of a group I found on the web. What kinds of people would join "a global community ... welcoming people from every discipline and culture, who seek a deeper understanding of the world, and who hope to turn that understanding into a better future for us all"? This is from some guy named Ted.

02:16
Well, let's see now. If openness predicts who becomes liberal, and openness predicts who becomes a TEDster, then might we predict that most TEDsters are liberal?
02:25
Let's find out. I'll ask you to raise your hand, whether you are liberal, left of center -- on social issues, primarily -- or conservative. And I'll give a third option, because I know there are libertarians in the audience. So please raise your hand -- in the simulcast rooms too. Let's let everybody see who's here. Please raise your hand if you'd say that you're liberal or left of center. Please raise your hand high right now. OK. Please raise your hand if you'd say you're libertarian. OK. About two dozen. And please raise your hand if you'd say you are right of center or conservative. One, two, three, four, five -- about eight or 10.

03:01
OK. This is a bit of a problem. Because if our goal is to seek a deeper understanding of the world, our general lack of moral diversity here is going to make it harder. Because when people all share values, when people all share morals, they become a team. And once you engage the psychology of teams, it shuts down open-minded thinking.
03:25
When the liberal team loses, [United States of Canada / Jesusland] as it did in 2004, and as it almost did in 2000, we comfort ourselves.

(Laughter)

03:34
We try to explain why half of America voted for the other team. We think they must be blinded by religion [Post-election US map: America / Dumbf*ckistan] or by simple stupidity.

(Laughter)

(Applause)

(Laughter)
03:56
So if you think that half of America votes Republican because they are blinded in this way, then my message to you is that you're trapped in a moral Matrix, in a particular moral Matrix. And by "the Matrix," I mean literally the Matrix, like the movie "The Matrix." But I'm here today to give you a choice. You can either take the blue pill and stick to your comforting delusions, or you can take the red pill, learn some moral psychology and step outside the moral Matrix. Now, because I know --

(Applause)

04:28
I assume that answers my question. I was going to ask which one you picked, but no need. You're all high in openness to experience, and it looks like it might even taste good, and you're all epicures. Anyway, let's go with the red pill, study some moral psychology and see where it takes us.
04:42
Let's start at the beginning: What is morality, where does it come from? The worst idea in all of psychology is the idea that the mind is a blank slate at birth. Developmental psychology has shown that kids come into the world already knowing so much about the physical and social worlds and programmed to make it really easy for them to learn certain things and hard to learn others.

05:01
The best definition of innateness I've seen, which clarifies so many things for me, is from the brain scientist Gary Marcus. He says, "The initial organization of the brain does not depend that much on experience. Nature provides a first draft, which experience then revises. 'Built-in' doesn't mean unmalleable; it means organized in advance of experience."
05:20
OK, so what's on the first draft of the moral mind? To find out, my colleague Craig Joseph and I read through the literature on anthropology, on culture variation in morality and also on evolutionary psychology, looking for matches: What sorts of things do people talk about across disciplines that you find across cultures and even species? We found five best matches, which we call the five foundations of morality.
05:40
The first one is harm/care. We're all mammals here, we all have a lot of neural and hormonal programming that makes us really bond with others, care for others, feel compassion for others, especially the weak and vulnerable. It gives us very strong feelings about those who cause harm. This moral foundation underlies about 70 percent of the moral statements I've heard here at TED.

05:59
The second foundation is fairness/reciprocity. There's actually ambiguous evidence as to whether you find reciprocity in other animals, but the evidence for people could not be clearer. This Norman Rockwell painting is called "The Golden Rule" -- as we heard from Karen Armstrong, it's the foundation of many religions. That second foundation underlies the other 30 percent of the moral statements I've heard here at TED.
06:19
The third foundation is in-group/loyalty. You do find cooperative groups in the animal kingdom, but these groups are always either very small or they're all siblings. It's only among humans that you find very large groups of people who are able to cooperate and join together into groups, but in this case, groups that are united to fight other groups. This probably comes from our long history of tribal living, of tribal psychology. And this tribal psychology is so deeply pleasurable that even when we don't have tribes, we go ahead and make them, because it's fun.

(Laughter)

06:52
Sports is to war as pornography is to sex. We get to exercise some ancient drives.
06:59
The fourth foundation is authority/respect. Here you see submissive gestures from two members of very closely related species. But authority in humans is not so closely based on power and brutality as it is in other primates. It's based on more voluntary deference and even elements of love, at times.
07:14
The fifth foundation is purity/sanctity. This painting is called "The Allegory of Chastity," but purity is not just about suppressing female sexuality. It's about any kind of ideology, any kind of idea that tells you that you can attain virtue by controlling what you do with your body and what you put into your body. And while the political right may moralize sex much more, the political left is doing a lot of it with food. Food is becoming extremely moralized nowadays. A lot of it is ideas about purity, about what you're willing to touch or put into your body.
07:43
I believe these are the five best candidates for what's written on the first draft of the moral mind. I think this is what we come with, a preparedness to learn all these things. But as my son Max grows up in a liberal college town, how is this first draft going to get revised? And how will it end up being different from a kid born 60 miles south of us, in Lynchburg, Virginia?

08:02
To think about culture variation, let's try a different metaphor. If there really are five systems at work in the mind, five sources of intuitions and emotions, then we can think of the moral mind as one of those audio equalizers that has five channels, where you can set it to a different setting on every channel.
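To make the equalizer metaphor concrete, here is a minimal Python sketch of a five-channel moral profile. The foundation names come from the talk; the numeric settings are purely illustrative guesses (liberals turned up on the first two channels and down on the other three, conservatives with all five switched on), not scores from the YourMorals.org data.

```python
# A sketch of the "five-channel equalizer": a moral profile is a setting on
# each of the five foundations. Values are illustrative only (0 = channel
# turned way down, 1 = turned way up), mirroring the talk's claim that
# liberals run high on harm and fairness and low on the other three, while
# conservatives keep all five channels switched on.
FOUNDATIONS = ["harm/care", "fairness/reciprocity", "in-group/loyalty",
               "authority/respect", "purity/sanctity"]

liberal = {"harm/care": 0.9, "fairness/reciprocity": 0.8,
           "in-group/loyalty": 0.2, "authority/respect": 0.2,
           "purity/sanctity": 0.1}

conservative = {f: 0.7 for f in FOUNDATIONS}   # roughly even across channels

for f in FOUNDATIONS:
    print(f"{f:22s} liberal={liberal[f]:.1f}  conservative={conservative[f]:.1f}")
```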
08:17
My colleagues Brian Nosek and Jesse Graham and I made a questionnaire, which we put up on the web at www.YourMorals.org. And so far, 30,000 people have taken this questionnaire, and you can, too.

08:29
Here are the results from about 23,000 American citizens. On the left are the scores for liberals; on the right, conservatives; in the middle, moderates. The blue line shows people's responses on the average of all the harm questions. So as you see, people care about harm and care issues. They highly endorse these sorts of statements all across the board, but as you also see, liberals care about it a little more than conservatives; the line slopes down. Same story for fairness. But look at the other three lines. For liberals, the scores are very low. They're basically saying, "This is not morality. In-group, authority, purity -- this has nothing to do with morality. I reject it." But as people get more conservative, the values rise. We can say liberals have a two-channel or two-foundation morality. Conservatives have more of a five-foundation, or five-channel morality.

09:12
We find this in every country we look at. Here's the data for 1,100 Canadians. I'll flip through a few other slides.
09:17
The UK, Australia, New Zealand, Western Europe, Eastern Europe, Latin America, the Middle East, East Asia and South Asia. Notice also that on all of these graphs, the slope is steeper on in-group, authority, purity, which shows that, within any country, the disagreement isn't over harm and fairness. I mean, we debate over what's fair, but everybody agrees that harm and fairness matter. Moral arguments within cultures are especially about issues of in-group, authority, purity. This effect is so robust, we find it no matter how we ask the question.
09:47
In a recent study, we asked people, suppose you're about to get a dog, you picked a particular breed, learned about the breed. Suppose you learn that this particular breed is independent-minded and relates to its owner as a friend and an equal. If you're a liberal, you say, "That's great!" because liberals like to say, "Fetch! Please."

(Laughter)

10:08
But if you're a conservative, that's not so attractive. If you're conservative and learn that a dog's extremely loyal to its home and family and doesn't warm up to strangers, for conservatives, loyalty is good; dogs ought to be loyal. But to a liberal, it sounds like this dog is running for the Republican nomination.

(Laughter)
10:25
You might say, OK, there are differences between liberals and conservatives, but what makes the three other foundations moral? Aren't they the foundations of xenophobia, authoritarianism and puritanism? What makes them moral? The answer, I think, is contained in this incredible triptych from Hieronymus Bosch, "The Garden of Earthly Delights."

10:41
In the first panel, we see the moment of creation. All is ordered, all is beautiful, all the people and animals are doing what they're supposed to be doing, are where they're supposed to be. But then, given the way of the world, things change. We get every person doing whatever he wants, with every aperture of every other person and every other animal. Some of you might recognize this as the '60s.

(Laughter)

11:02
But the '60s inevitably gives way to the '70s, where the cuttings of the apertures hurt a little bit more. Of course, Bosch called this hell.
11:11
So this triptych, these three panels, portray the timeless truth that order tends to decay. The truth of social entropy. But lest you think this is just some part of the Christian imagination where Christians have this weird problem with pleasure, here's the same story, the same progression, told in a paper that was published in "Nature" a few years ago, in which Ernst Fehr and Simon Gächter had people play a commons dilemma, a game in which you give people money, and then, on each round of the game, they can put money into a common pot, then the experimenter doubles what's there, and then it's all divided among the players.

11:46
So it's a nice analog for all sorts of environmental issues, where we're asking people to make a sacrifice and they don't really benefit from their own sacrifice. You really want everybody else to sacrifice, but everybody has a temptation to free ride.
11:58
What happens is that, at first, people start off reasonably cooperative. This is all played anonymously. On the first round, people give about half of the money that they can. But they quickly see other people aren't doing so much. "I don't want to be a sucker. I won't cooperate." So cooperation quickly decays from reasonably good down to close to zero. But then -- and here's the trick -- Fehr and Gächter, on the seventh round, told people, "You know what? New rule. If you want to give some of your own money to punish people who aren't contributing, you can do that." And as soon as people heard about the punishment issue going on, cooperation shoots up. It shoots up and it keeps going up.

12:33
Lots of research shows that to solve cooperative problems, it really helps. It's not enough to appeal to people's good motives. It helps to have some sort of punishment. Even if it's just shame or embarrassment or gossip, you need some sort of punishment to bring people, when they're in large groups, to cooperate.
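For readers who want the structure of that game spelled out, here is a minimal Python sketch of a public-goods round as described above: everyone can pay into a common pot, the pot is doubled and split evenly, and from round seven on players may pay to punish low contributors. The group size, endowment, punishment costs, and the simple behavioral rule are all assumptions made for illustration; this is not the actual Fehr and Gächter protocol or their data.

```python
# A minimal sketch of the public-goods game structure described above.
# Each round, players may put money into a common pot, the experimenter
# doubles the pot, and the total is split evenly among all players.
# From round 7 on, players may also pay to punish low contributors.
# Group size, endowment, punishment costs, and the behavioral rule are
# illustrative assumptions, not the actual Fehr & Gaechter protocol.

N_PLAYERS = 4
ENDOWMENT = 20        # money each player receives per round (assumed)
PUNISH_COST = 1       # cost paid by each punisher (assumed)
PUNISH_FINE = 3       # fine taken from the punished player (assumed)

def play_round(contribution_rates, punishment_on):
    contributions = [ENDOWMENT * r for r in contribution_rates]
    pot = 2 * sum(contributions)              # experimenter doubles the pot
    share = pot / N_PLAYERS                   # ... and it is split evenly
    payoffs = [ENDOWMENT - c + share for c in contributions]
    if punishment_on:
        avg = sum(contributions) / N_PLAYERS
        for i, c in enumerate(contributions):
            if c < 0.5 * avg:                 # everyone punishes free riders
                payoffs[i] -= PUNISH_FINE * (N_PLAYERS - 1)
                for j in range(N_PLAYERS):
                    if j != i:
                        payoffs[j] -= PUNISH_COST
    return payoffs

# Toy behavioral rule: cooperation decays without punishment, recovers with it.
rates = [0.5] * N_PLAYERS                     # start by giving about half
for rnd in range(1, 13):
    punishment_on = rnd >= 7                  # "new rule" from the 7th round
    payoffs = play_round(rates, punishment_on)
    step = 0.15 if punishment_on else -0.10
    rates = [min(1.0, max(0.0, r + step)) for r in rates]
    print(f"round {rnd:2d}  mean contribution rate = {sum(rates)/N_PLAYERS:.2f}"
          f"  mean payoff = {sum(payoffs)/N_PLAYERS:.1f}")
```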
12:48
There's even some recent research suggesting that religion -- priming God, making people think about God -- often, in some situations, leads to more cooperative, more pro-social behavior. Some people think that religion is an adaptation evolved both by cultural and biological evolution to make groups cohere, in part for the purpose of trusting each other and being more effective at competing with other groups. That's probably right, although this is a controversial issue.
13:12
But I'm particularly interested in religion and the origin of religion and in what it does to us and for us, because I think the greatest wonder in the world is not the Grand Canyon. The Grand Canyon is really simple -- a lot of rock and a lot of water and wind and a lot of time, and you get the Grand Canyon. It's not that complicated.

13:29
This is what's complicated: that people lived in places like the Grand Canyon, cooperating with each other, or on the savannahs of Africa or the frozen shores of Alaska. And some of these villages grew into the mighty cities of Babylon and Rome and Tenochtitlan. How did this happen? It's an absolute miracle, much harder to explain than the Grand Canyon. The answer, I think, is that they used every tool in the toolbox. It took all of our moral psychology to create these cooperative groups.
13:53
Yes, you need to be concerned about harm, you need a psychology of justice. But it helps to organize a group if you have subgroups, and if those subgroups have some internal structure, and if you have some ideology that tells people to suppress their carnality -- to pursue higher, nobler ends.

14:09
Now we get to the crux of the disagreement between liberals and conservatives: liberals reject three of these foundations. They say, "Let's celebrate diversity, not common in-group membership," and, "Let's question authority," and, "Keep your laws off my body." Liberals have very noble motives for doing this. Traditional authority and morality can be quite repressive and restrictive to those at the bottom, to women, to people who don't fit in. Liberals speak for the weak and oppressed. They want change and justice, even at the risk of chaos. This shirt says, "Stop bitching, start a revolution." If you're high in openness to experience, revolution is good; it's change, it's fun.
14:41
Conservatives, on the other hand, speak for institutions and traditions. They want order, even at some cost, to those at the bottom. The great conservative insight is that order is really hard to achieve. It's precious, and it's really easy to lose. So as Edmund Burke said, "The restraints on men, as well as their liberties, are to be reckoned among their rights." This was after the chaos of the French Revolution.

15:01
Once you see that liberals and conservatives both have something to contribute, that they form a balance on change versus stability, then I think the way is open to step outside the moral Matrix. This is the great insight that all the Asian religions have attained.
15:16
Think about yin and yang. Yin and yang aren't enemies; they don't hate each other. Yin and yang are both necessary, like night and day, for the functioning of the world. You find the same thing in Hinduism. There are many high gods in Hinduism. Two of them are Vishnu, the preserver, and Shiva, the destroyer. This image, actually, is both of those gods sharing the same body. You have the markings of Vishnu on the left, so we could think of Vishnu as the conservative god. You have the markings of Shiva on the right -- Shiva's the liberal god. And they work together.

15:43
You find the same thing in Buddhism. These two stanzas contain, I think, the deepest insights that have ever been attained into moral psychology. From the Zen master Sēngcàn: "If you want the truth to stand clear before you, never be 'for' or 'against.' The struggle between 'for' and 'against' is the mind's worst disease."
16:00
Unfortunately, it's a disease that has been caught by many of the world's leaders. But before you feel superior to George Bush, before you throw a stone, ask yourself: Do you accept this? Do you accept stepping out of the battle of good and evil? Can you be not for or against anything? So what's the point? What should you do?
16:20
Well, if you take the greatest insights from ancient Asian philosophies and religions and combine them with the latest research on moral psychology, I think you come to these conclusions: that our righteous minds were designed by evolution to unite us into teams, to divide us against other teams and then to blind us to the truth.

16:39
So what should you do? Am I telling you to not strive? Am I telling you to embrace Sēngcàn and stop, stop with the struggle of for and against? No, absolutely not. I'm not saying that. This is an amazing group of people who are doing so much, using so much of their talent, their brilliance, their energy, their money, to make the world a better place, to fight wrongs, to solve problems.
17:04
But as we learned from Samantha Power in her story about Sérgio Vieira de Mello, you can't just go charging in, saying, "You're wrong, and I'm right," because, as we just heard, everybody thinks they are right.

17:19
A lot of the problems we have to solve are problems that require us to change other people. And if you want to change other people, a much better way to do it is to first understand who we are -- understand our moral psychology, understand that we all think we're right -- and then step out, even if it's just for a moment, step out -- check in with Sēngcàn. Step out of the moral Matrix, just try to see it as a struggle playing out, in which everybody thinks they're right, and even if you disagree with them, everybody has some reasons for what they're doing. Step out. And if you do that, that's the essential move to cultivate moral humility, to get yourself out of this self-righteousness, which is the normal human condition.
17:57
Think about the Dalai Lama. Think about the enormous moral authority of the Dalai Lama. It comes from his moral humility.

18:05
So I think the point -- the point of my talk and, I think, the point of TED -- is that this is a group that is passionately engaged in the pursuit of changing the world for the better. People here are passionately engaged in trying to make the world a better place. But there is also a passionate commitment to the truth. And so I think the answer is to use that passionate commitment to the truth to try to turn it into a better future for us all. Thank you.

(Applause)