Jonathan Haidt: The moral roots of liberals and conservatives

716,737 views ・ 2008-09-18

TED



Translator: Coco Shen  Reviewer: Geoff Chen
Suppose that two American friends are traveling together in Italy. They go to see Michelangelo's "David," and when they finally come face-to-face with the statue, they both freeze dead in their tracks. The first guy -- we'll call him Adam -- is transfixed by the beauty of the perfect human form. The second guy -- we'll call him Bill -- is transfixed by embarrassment, at staring at the thing there in the center.
So here's my question for you: Which one of these two guys was more likely to have voted for George Bush, which for Al Gore? I don't need a show of hands, because we all have the same political stereotypes. We all know that it's Bill. And in this case, the stereotype corresponds to reality. It really is a fact that liberals are much higher than conservatives on a major personality trait called openness to experience. People who are high in openness to experience just crave novelty, variety, diversity, new ideas, travel. People low on it like things that are familiar, that are safe and dependable.
If you know about this trait, you can understand a lot of puzzles about human behavior, like why artists are so different from accountants. You can predict what kinds of books they like to read, what kinds of places they like to travel to and what kinds of food they like to eat. Once you understand this trait, you can understand why anybody would eat at Applebee's, but not anybody that you know.

(Laughter)
This trait also tells us a lot about politics. The main researcher of this trait, Robert McCrae, says that "Open individuals have an affinity for liberal, progressive, left-wing political views ..." They like a society which is open and changing, "... whereas closed individuals prefer conservative, traditional, right-wing views."

This trait also tells us a lot about the kinds of groups people join. Here's the description of a group I found on the web. What kinds of people would join "a global community ... welcoming people from every discipline and culture, who seek a deeper understanding of the world, and who hope to turn that understanding into a better future for us all"? This is from some guy named Ted.

(Laughter)
Well, let's see now. If openness predicts who becomes liberal, and openness predicts who becomes a TEDster, then might we predict that most TEDsters are liberal? Let's find out. I'll ask you to raise your hand, whether you are liberal, left of center -- on social issues, primarily -- or conservative. And I'll give a third option, because I know there are libertarians in the audience. So please raise your hand -- in the simulcast rooms too. Let's let everybody see who's here. Please raise your hand if you'd say that you're liberal or left of center. Please raise your hand high right now. OK. Please raise your hand if you'd say you're libertarian. OK. About two dozen. And please raise your hand if you'd say you are right of center or conservative. One, two, three, four, five -- about eight or 10. OK.
This is a bit of a problem. Because if our goal is to seek a deeper understanding of the world, our general lack of moral diversity here is going to make it harder. Because when people all share values, when people all share morals, they become a team. And once you engage the psychology of teams, it shuts down open-minded thinking.
When the liberal team loses,

[United States of Canada / Jesusland]

as it did in 2004, and as it almost did in 2000, we comfort ourselves.

(Laughter)

We try to explain why half of America voted for the other team. We think they must be blinded by religion

[Post-election US map: America / Dumbf*ckistan]

or by simple stupidity.

(Laughter)

(Applause)

(Laughter)
So if you think that half of America votes Republican because they are blinded in this way, then my message to you is that you're trapped in a moral Matrix, in a particular moral Matrix. And by "the Matrix," I mean literally the Matrix, like the movie "The Matrix." But I'm here today to give you a choice. You can either take the blue pill and stick to your comforting delusions, or you can take the red pill, learn some moral psychology and step outside the moral Matrix. Now, because I know --

(Applause)
I assume that answers my question. I was going to ask which one you picked, but no need. You're all high in openness to experience, and it looks like it might even taste good, and you're all epicures. Anyway, let's go with the red pill, study some moral psychology and see where it takes us.
Let's start at the beginning: What is morality, where does it come from? The worst idea in all of psychology is the idea that the mind is a blank slate at birth. Developmental psychology has shown that kids come into the world already knowing so much about the physical and social worlds and programmed to make it really easy for them to learn certain things and hard to learn others. The best definition of innateness I've seen, which clarifies so many things for me, is from the brain scientist Gary Marcus. He says, "The initial organization of the brain does not depend that much on experience. Nature provides a first draft, which experience then revises. 'Built-in' doesn't mean unmalleable; it means organized in advance of experience."

OK, so what's on the first draft of the moral mind? To find out, my colleague Craig Joseph and I read through the literature on anthropology, on culture variation in morality and also on evolutionary psychology, looking for matches: What sorts of things do people talk about across disciplines that you find across cultures and even species? We found five best matches, which we call the five foundations of morality.
The first one is harm/care. We're all mammals here, we all have a lot of neural and hormonal programming that makes us really bond with others, care for others, feel compassion for others, especially the weak and vulnerable. It gives us very strong feelings about those who cause harm. This moral foundation underlies about 70 percent of the moral statements I've heard here at TED.
The second foundation is fairness/reciprocity. There's actually ambiguous evidence as to whether you find reciprocity in other animals, but the evidence for people could not be clearer. This Norman Rockwell painting is called "The Golden Rule" -- as we heard from Karen Armstrong, it's the foundation of many religions. That second foundation underlies the other 30 percent of the moral statements I've heard here at TED.
The third foundation is in-group/loyalty. You do find cooperative groups in the animal kingdom, but these groups are always either very small or they're all siblings. It's only among humans that you find very large groups of people who are able to cooperate and join together into groups, but in this case, groups that are united to fight other groups. This probably comes from our long history of tribal living, of tribal psychology. And this tribal psychology is so deeply pleasurable that even when we don't have tribes, we go ahead and make them, because it's fun.

(Laughter)

Sports is to war as pornography is to sex. We get to exercise some ancient drives.
The fourth foundation is authority/respect. Here you see submissive gestures from two members of very closely related species. But authority in humans is not so closely based on power and brutality as it is in other primates. It's based on more voluntary deference and even elements of love, at times.
The fifth foundation is purity/sanctity. This painting is called "The Allegory Of Chastity," but purity is not just about suppressing female sexuality. It's about any kind of ideology, any kind of idea that tells you that you can attain virtue by controlling what you do with your body and what you put into your body. And while the political right may moralize sex much more, the political left is doing a lot of it with food. Food is becoming extremely moralized nowadays. A lot of it is ideas about purity, about what you're willing to touch or put into your body.
I believe these are the five best candidates for what's written on the first draft of the moral mind. I think this is what we come with, a preparedness to learn all these things. But as my son Max grows up in a liberal college town, how is this first draft going to get revised? And how will it end up being different from a kid born 60 miles south of us, in Lynchburg, Virginia? To think about culture variation, let's try a different metaphor. If there really are five systems at work in the mind, five sources of intuitions and emotions, then we can think of the moral mind as one of those audio equalizers that has five channels, where you can set it to a different setting on every channel. My colleagues Brian Nosek and Jesse Graham and I made a questionnaire, which we put up on the web at www.YourMorals.org. And so far, 30,000 people have taken this questionnaire, and you can, too.
Here are the results from about 23,000 American citizens. On the left are the scores for liberals; on the right, conservatives; in the middle, moderates. The blue line shows people's responses on the average of all the harm questions. So as you see, people care about harm and care issues. They highly endorse these sorts of statements all across the board, but as you also see, liberals care about it a little more than conservatives; the line slopes down. Same story for fairness. But look at the other three lines. For liberals, the scores are very low. They're basically saying, "This is not morality. In-group, authority, purity -- this has nothing to do with morality. I reject it." But as people get more conservative, the values rise. We can say liberals have a two-channel or two-foundation morality. Conservatives have more of a five-foundation, or five-channel morality.

We find this in every country we look at. Here's the data for 1,100 Canadians. I'll flip through a few other slides. The UK, Australia, New Zealand, Western Europe, Eastern Europe, Latin America, the Middle East, East Asia and South Asia. Notice also that on all of these graphs, the slope is steeper on in-group, authority, purity, which shows that, within any country, the disagreement isn't over harm and fairness. I mean, we debate over what's fair, but everybody agrees that harm and fairness matter. Moral arguments within cultures are especially about issues of in-group, authority, purity.
This effect is so robust, we find it no matter how we ask the question. In a recent study, we asked people, suppose you're about to get a dog, you picked a particular breed, learned about the breed. Suppose you learn that this particular breed is independent-minded and relates to its owner as a friend and an equal. If you're a liberal, you say, "That's great!" because liberals like to say, "Fetch! Please."

(Laughter)

But if you're a conservative, that's not so attractive. If you're conservative and learn that a dog's extremely loyal to its home and family and doesn't warm up to strangers, for conservatives, loyalty is good; dogs ought to be loyal. But to a liberal, it sounds like this dog is running for the Republican nomination.

(Laughter)
You might say, OK, there are differences between liberals and conservatives, but what makes the three other foundations moral? Aren't they the foundations of xenophobia, authoritarianism and puritanism? What makes them moral? The answer, I think, is contained in this incredible triptych from Hieronymus Bosch, "The Garden of Earthly Delights." In the first panel, we see the moment of creation. All is ordered, all is beautiful, all the people and animals are doing what they're supposed to be doing, are where they're supposed to be. But then, given the way of the world, things change. We get every person doing whatever he wants, with every aperture of every other person and every other animal. Some of you might recognize this as the '60s.

(Laughter)

But the '60s inevitably gives way to the '70s, where the cuttings of the apertures hurt a little bit more. Of course, Bosch called this hell. So this triptych, these three panels portray the timeless truth that order tends to decay. The truth of social entropy.
But lest you think this is just some part of the Christian imagination where Christians have this weird problem with pleasure, here's the same story, the same progression, told in a paper that was published in "Nature" a few years ago, in which Ernst Fehr and Simon Gächter had people play a commons dilemma, a game in which you give people money, and then, on each round of the game, they can put money into a common pot, then the experimenter doubles what's there, and then it's all divided among the players. So it's a nice analog for all sorts of environmental issues, where we're asking people to make a sacrifice and they don't really benefit from their own sacrifice. You really want everybody else to sacrifice, but everybody has a temptation to free ride.

What happens is that, at first, people start off reasonably cooperative. This is all played anonymously. On the first round, people give about half of the money that they can. But they quickly see other people aren't doing so much. "I don't want to be a sucker. I won't cooperate." So cooperation quickly decays from reasonably good down to close to zero. But then -- and here's the trick -- Fehr and Gächter, on the seventh round, told people, "You know what? New rule. If you want to give some of your own money to punish people who aren't contributing, you can do that." And as soon as people heard about the punishment issue going on, cooperation shoots up. It shoots up and it keeps going up. Lots of research shows that to solve cooperative problems, it really helps. It's not enough to appeal to people's good motives. It helps to have some sort of punishment. Even if it's just shame or embarrassment or gossip, you need some sort of punishment to bring people, when they're in large groups, to cooperate.
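The game mechanics described here (a common pot that is doubled and split, cooperation decaying under free riding, then recovering once punishment is allowed) can be sketched as a toy simulation. This is a hypothetical illustration: the function name and the decay and recovery rates are made-up assumptions, not Fehr and Gächter's actual experimental design or data.

```python
def simulate(n_players=4, n_rounds=12, endowment=20.0, punish_round=7):
    """Toy public-goods game in the spirit of the setup described above:
    each round, players pay into a common pot (which the experimenter
    doubles and splits evenly); from `punish_round` on, low contributors
    risk costly fines from their peers. The rates below are illustrative
    guesses, not the published experimental parameters."""
    contribs = [endowment / 2] * n_players  # round 1: people give about half
    avg_history = []
    for rnd in range(1, n_rounds + 1):
        avg = sum(contribs) / n_players
        avg_history.append(avg)
        updated = []
        for c in contribs:
            if rnd >= punish_round:
                # punishment threat: contributions climb back toward the endowment
                c = min(endowment, c + 0.4 * (endowment - c))
            else:
                # free-rider temptation: contributions drift toward zero
                c = max(0.0, c - 0.3 * max(1.0, avg - min(contribs)))
            updated.append(c)
        contribs = updated
    return avg_history
```

Calling `simulate()` yields average contributions that fall over the first six rounds and then rise once the punishment rule kicks in, mirroring the decay-then-shoot-up pattern the talk describes.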
There's even some recent research suggesting that religion -- priming God, making people think about God -- often, in some situations, leads to more cooperative, more pro-social behavior. Some people think that religion is an adaptation evolved both by cultural and biological evolution to make groups cohere, in part for the purpose of trusting each other and being more effective at competing with other groups. That's probably right, although this is a controversial issue. But I'm particularly interested in religion and the origin of religion and in what it does to us and for us, because I think the greatest wonder in the world is not the Grand Canyon. The Grand Canyon is really simple -- a lot of rock and a lot of water and wind and a lot of time, and you get the Grand Canyon. It's not that complicated. This is what's complicated: that people lived in places like the Grand Canyon, cooperating with each other, or on the savannahs of Africa or the frozen shores of Alaska. And some of these villages grew into the mighty cities of Babylon and Rome and Tenochtitlan.

How did this happen? It's an absolute miracle, much harder to explain than the Grand Canyon. The answer, I think, is that they used every tool in the toolbox. It took all of our moral psychology to create these cooperative groups. Yes, you need to be concerned about harm, you need a psychology of justice. But it helps to organize a group if you have subgroups, and if those subgroups have some internal structure, and if you have some ideology that tells people to suppress their carnality -- to pursue higher, nobler ends.
Now we get to the crux of the disagreement between liberals and conservatives: liberals reject three of these foundations. They say, "Let's celebrate diversity, not common in-group membership," and, "Let's question authority," and, "Keep your laws off my body." Liberals have very noble motives for doing this. Traditional authority and morality can be quite repressive and restrictive to those at the bottom, to women, to people who don't fit in. Liberals speak for the weak and oppressed. They want change and justice, even at the risk of chaos. This shirt says, "Stop bitching, start a revolution." If you're high in openness to experience, revolution is good; it's change, it's fun. Conservatives, on the other hand, speak for institutions and traditions. They want order, even at some cost, to those at the bottom. The great conservative insight is that order is really hard to achieve. It's precious, and it's really easy to lose. So as Edmund Burke said, "The restraints on men, as well as their liberties, are to be reckoned among their rights." This was after the chaos of the French Revolution.
Once you see that liberals and conservatives both have something to contribute, that they form a balance on change versus stability, then I think the way is open to step outside the moral Matrix. This is the great insight that all the Asian religions have attained. Think about yin and yang. Yin and yang aren't enemies; they don't hate each other. Yin and yang are both necessary, like night and day, for the functioning of the world. You find the same thing in Hinduism. There are many high gods in Hinduism. Two of them are Vishnu, the preserver, and Shiva, the destroyer. This image, actually, is both of those gods sharing the same body. You have the markings of Vishnu on the left, so we could think of Vishnu as the conservative god. You have the markings of Shiva on the right -- Shiva's the liberal god. And they work together. You find the same thing in Buddhism. These two stanzas contain, I think, the deepest insights that have ever been attained into moral psychology. From the Zen master Sēngcàn: "If you want the truth to stand clear before you, never be 'for' or 'against.' The struggle between 'for' and 'against' is the mind's worst disease."
Unfortunately, it's a disease that has been caught by many of the world's leaders. But before you feel superior to George Bush, before you throw a stone, ask yourself: Do you accept this? Do you accept stepping out of the battle of good and evil? Can you be not for or against anything?
So what's the point? What should you do? Well, if you take the greatest insights from ancient Asian philosophies and religions and combine them with the latest research on moral psychology, I think you come to these conclusions: that our righteous minds were designed by evolution to unite us into teams, to divide us against other teams and then to blind us to the truth.
16:39
So what should you do?
378
999649
1917
你該怎麼做﹖難道我要你放棄努力
16:41
Am I telling you to not strive?
379
1001590
1860
16:43
Am I telling you to embrace Sēngcàn and stop,
380
1003474
2704
我是要你擁抱僧璨
16:46
stop with the struggle of for and against?
381
1006202
2870
然後停止這些支持和反對的想法嗎﹖
16:49
No, absolutely not. I'm not saying that.
382
1009096
2210
絕對不是。這不是我要說的
16:51
This is an amazing group of people who are doing so much,
383
1011330
2976
有許多了不起的人做了許多事
16:54
using so much of their talent,
384
1014330
1738
用他們的才能﹐他們的技能 他們的精力和金錢
16:56
their brilliance, their energy, their money,
385
1016092
2058
16:58
to make the world a better place,
386
1018174
1596
讓世界變得更好﹐去爭取
16:59
to fight wrongs,
387
1019794
1731
打擊錯誤﹐解決問題
17:01
to solve problems.
388
1021549
1280
17:04
But as we learned from Samantha Power
389
1024186
2320
但就像我們在 Samantha Power 的故事裡學到的
17:06
in her story about Sérgio Vieira de Mello,
390
1026530
4808
像 Sergio Vieira de Mello﹐你不能直接殺進去
17:11
you can't just go charging in, saying, "You're wrong, and I'm right,"
391
1031362
4245
然後說”你錯了 我對了“
17:15
because, as we just heard, everybody thinks they are right.
392
1035631
3413
因為﹐就像我們剛剛聽到的 每個人都以為自己是對的
17:19
A lot of the problems we have to solve
393
1039068
1858
有太多我們需要解決的問題
17:20
are problems that require us to change other people.
394
1040950
2497
是那些需要我們去改變他人的問題
17:23
And if you want to change other people,
395
1043940
1943
如果你想要改變他人﹐一個比較好的方法是
17:25
a much better way to do it is to first understand who we are --
396
1045907
3158
先了解我們是誰 -- 了解我們自己的道德心理
17:29
understand our moral psychology,
397
1049089
1638
17:30
understand that we all think we're right --
398
1050751
2596
了解我們都認為自己是對的﹐然後跨出去
17:33
and then step out,
399
1053371
1263
17:34
even if it's just for a moment, step out -- check in with Sēngcàn.
400
1054658
3695
就算只是一下子﹐跨出去 想想僧璨
17:38
Step out of the moral Matrix,
401
1058898
1408
跨出你的道德框架
17:40
just try to see it as a struggle playing out,
402
1060330
2099
嘗試當做這只是每個人認為自己是對的人
17:42
in which everybody thinks they're right, and even if you disagree with them,
403
1062453
3618
的一種拔河
每個人﹐就算你不認同他們 都有自己的理由
17:46
everybody has some reasons for what they're doing.
404
1066095
2364
每個人做事都有自己的理由
17:48
Step out.
405
1068483
1151
跨出去
17:49
And if you do that, that's the essential move to cultivate moral humility,
406
1069658
3486
如果你這樣做﹐你便可以培養道德謙遜
17:53
to get yourself out of this self-righteousness,
407
1073168
2202
讓你自己離開這個自以為義
一種正常人類的心理
17:55
which is the normal human condition.
408
1075394
1719
想想達賴喇嘛
17:57
Think about the Dalai Lama.
409
1077137
1315
17:58
Think about the enormous moral authority of the Dalai Lama.
410
1078476
2830
想想達賴喇嘛巨大的道德權威
18:01
It comes from his moral humility.
411
1081330
1777
這是來自他的道德謙遜
18:05
So I think the point --
412
1085028
1334
我想我談話的重點是
18:06
the point of my talk and, I think, the point of TED --
413
1086386
4214
TED的重點是
18:10
is that this is a group that is passionately engaged
414
1090624
2682
這是一個熱情的想要
18:13
in the pursuit of changing the world for the better.
415
1093330
2477
讓世界變得更好的團體
18:15
People here are passionately engaged
416
1095831
2002
人們熱情的希望
18:17
in trying to make the world a better place.
417
1097857
2031
讓世界變得更好
18:19
But there is also a passionate commitment to the truth.
418
1099912
2844
同時也有一種接近真理的希望
18:23
And so I think the answer is to use that passionate commitment to the truth
419
1103329
4874
我想答案是保持你的熱情﹐尋找真理
然後把它變成更好的未來
18:28
to try to turn it into a better future for us all.
420
1108227
2843
18:31
Thank you.
421
1111094
1212
謝謝你。
18:32
(Applause)
422
1112330
5253
(掌聲)