The psychology of evil | Philip Zimbardo

2,752,134 views ・ 2008-09-23

TED



Translator: Coco Shen · Reviewer: Geoff Chen
00:12
Philosophers, dramatists, theologians
00:16
have grappled with this question for centuries:
00:18
what makes people go wrong?
00:20
Interestingly, I asked this question when I was a little kid.
00:23
I grew up in the South Bronx, inner-city ghetto in New York,
00:26
and I was surrounded by evil,
00:27
as all kids are who grew up in an inner city.
00:30
And I had friends who were really good kids,
00:32
who lived out the Dr. Jekyll Mr. Hyde scenario -- Robert Louis Stevenson.
00:36
That is, they took drugs, got in trouble, went to jail.
00:40
Some got killed, and some did it without drug assistance.
00:44
So when I read Robert Louis Stevenson, that wasn't fiction.
00:47
The only question is, what was in the juice?
00:49
And more importantly, that line between good and evil --
00:53
which privileged people like to think is fixed and impermeable,
00:56
with them on the good side, the others on the bad side --
00:59
I knew that line was movable, and it was permeable.
01:02
Good people could be seduced across that line,
01:04
and under good and some rare circumstances, bad kids could recover
01:09
with help, with reform, with rehabilitation.
01:12
So I want to begin with this wonderful illusion
01:14
by [Dutch] artist M.C. Escher.
01:16
If you look at it and focus on the white, what you see is a world full of angels.
01:20
But let's look more deeply, and as we do,
01:23
what appears is the demons, the devils in the world.
01:26
That tells us several things.
01:28
One, the world is, was, will always be filled with good and evil,
01:31
because good and evil is the yin and yang of the human condition.
01:34
It tells me something else.
01:35
If you remember, God's favorite angel was Lucifer.
01:39
Apparently, Lucifer means "the light."
01:41
It also means "the morning star," in some scripture.
01:44
And apparently, he disobeyed God,
01:47
and that's the ultimate disobedience to authority.
01:50
And when he did, Michael, the archangel, was sent to kick him out of heaven
01:55
along with the other fallen angels.
01:57
And so Lucifer descends into hell, becomes Satan,
02:01
becomes the devil, and the force of evil in the universe begins.
02:04
Paradoxically, it was God who created hell as a place to store evil.
02:09
He didn't do a good job of keeping it there though.
02:11
So, this arc of the cosmic transformation of God's favorite angel into the Devil,
02:16
for me, sets the context for understanding human beings
02:19
who are transformed from good, ordinary people into perpetrators of evil.
02:24
So the Lucifer effect, although it focuses on the negatives --
02:28
the negatives that people can become,
02:31
not the negatives that people are --
02:33
leads me to a psychological definition.
02:36
Evil is the exercise of power.
02:38
And that's the key: it's about power.
02:41
To intentionally harm people psychologically,
02:43
to hurt people physically, to destroy people mortally, or ideas,
02:47
and to commit crimes against humanity.
02:51
If you Google "evil," a word that should surely have withered by now,
02:54
you come up with 136 million hits in a third of a second.
02:58
A few years ago -- I am sure all of you were shocked, as I was,
03:02
with the revelation of American soldiers abusing prisoners in a strange place
03:08
in a controversial war, Abu Ghraib in Iraq.
03:11
And these were men and women
03:13
who were putting prisoners through unbelievable humiliation.
03:18
I was shocked, but I wasn't surprised,
03:19
because I had seen those same visual parallels
03:22
when I was the prison superintendent of the Stanford Prison Study.
03:25
Immediately the Bush administration military said what?
03:28
What all administrations say when there's a scandal:
03:31
"Don't blame us. It's not the system.
03:33
It's the few bad apples, the few rogue soldiers."
03:35
My hypothesis is, American soldiers are good, usually.
03:38
Maybe it was the barrel that was bad.
03:40
But how am I going to deal with that hypothesis?
03:43
I became an expert witness for one of the guards,
03:45
Sergeant Chip Frederick, and in that position,
03:47
I had access to the dozen investigative reports.
03:50
I had access to him.
03:52
I could study him, have him come to my home, get to know him,
03:55
do psychological analysis to see, was he a good apple or bad apple.
03:59
And thirdly, I had access to all of the 1,000 pictures
04:03
that these soldiers took.
04:05
These pictures are of a violent or sexual nature.
04:07
All of them come from the cameras of American soldiers.
04:10
Because everybody has a digital camera or cell phone camera,
04:13
they took pictures of everything, more than 1,000.
04:16
And what I've done is I organized them into various categories.
04:19
But these are by United States military police, army reservists.
04:24
They are not soldiers prepared for this mission at all.
04:28
And it all happened in a single place, Tier 1-A, on the night shift.
04:31
Why?
04:33
Tier 1-A was the center for military intelligence.
04:35
It was the interrogation hold.
04:37
The CIA was there.
04:38
Interrogators from Titan Corporation, all there,
04:42
and they're getting no information about the insurgency.
04:44
So they're going to put pressure on these soldiers,
04:47
military police, to cross the line,
04:49
give them permission to break the will of the enemy,
04:52
to prepare them for interrogation, to soften them up,
04:54
to take the gloves off.
04:55
Those are the euphemisms, and this is how it was interpreted.
04:59
Let's go down to that dungeon.
05:03
(Typewriting)
05:05
[Abu Ghraib Iraq Prison Abuses 2008 Military Police Guards' Photos]
05:11
[The following images include nudity and graphic depictions of violence]
05:18
(Camera shutter sounds)
05:39
(Thuds)
05:45
(Camera shutter)
05:59
(Camera shutter)
06:09
(Breathing)
06:17
(Bells)
06:47
(Bells end)
06:49
So, pretty horrific.
06:51
That's one of the visual illustrations of evil.
06:55
And it should not have escaped you
06:57
that the reason I paired the prisoner with his arms out
07:00
with Leonardo da Vinci's ode to humanity
07:03
is that that prisoner was mentally ill.
07:05
That prisoner covered himself with shit every day,
07:08
they had to roll him in dirt so he wouldn't stink.
07:10
But the guards ended up calling him "Shit Boy."
07:12
What was he doing in that prison rather than in some mental institution?
07:17
In any event, here's former Secretary of Defense Rumsfeld.
07:20
He comes down and says, "I want to know, who is responsible?
07:23
Who are the bad apples?"
07:24
Well, that's a bad question.
07:26
You have to reframe it and ask, "What is responsible?"
07:28
"What" could be the who of people,
07:30
but it could also be the what of the situation,
07:32
and obviously that's wrongheaded.
07:34
How do psychologists try to understand such transformations of human character,
07:38
if you believe that they were good soldiers
07:40
before they went down to that dungeon?
07:42
There are three ways. The main way is called dispositional.
07:45
We look at what's inside of the person, the bad apples.
07:48
This is the foundation of all of social science,
07:51
the foundation of religion, the foundation of war.
07:55
Social psychologists like me come along and say,
07:57
"Yeah, people are the actors on the stage,
07:59
but you'll have to be aware of the situation.
08:01
Who are the cast of characters? What's the costume?
08:04
Is there a stage director?"
08:05
And so we're interested in what are the external factors
08:08
around the individual -- the bad barrel?
08:11
Social scientists stop there and they miss the big point
08:13
that I discovered when I became an expert witness for Abu Ghraib.
08:16
The power is in the system.
08:18
The system creates the situation that corrupts the individuals,
08:21
and the system is the legal, political, economic, cultural background.
08:26
And this is where the power is of the bad-barrel makers.
08:30
If you want to change a person, change the situation.
08:32
And to change it, you've got to know where the power is, in the system.
08:36
So the Lucifer effect involves understanding
08:38
human character transformations with these three factors.
08:43
And it's a dynamic interplay.
08:44
What do the people bring into the situation?
08:46
What does the situation bring out of them?
08:48
And what is the system that creates and maintains that situation?
08:52
My recent book, "The Lucifer Effect," is about,
08:55
how do you understand how good people turn evil?
08:57
And it has a lot of detail about what I'm going to talk about today.
09:01
So Dr. Z's "Lucifer Effect," although it focuses on evil,
09:04
really is a celebration of the human mind's infinite capacity
09:08
to make any of us kind or cruel,
09:10
caring or indifferent, creative or destructive,
09:13
and it makes some of us villains.
09:15
And the good news that I'm going to hopefully come to at the end
09:18
is that it makes some of us heroes.
09:21
This wonderful cartoon in the New Yorker summarizes my whole talk:
09:25
"I'm neither a good cop nor a bad cop, Jerome.
09:27
Like yourself, I'm a complex amalgam
09:29
of positive and negative personality traits
09:32
that emerge or not, depending on the circumstances."
09:35
(Laughter)
09:37
There's a study some of you think you know about,
09:40
but very few people have ever read the story.
09:42
You watched the movie.
09:44
This is Stanley Milgram, little Jewish kid from the Bronx,
09:47
and he asked the question, "Could the Holocaust happen here, now?"
09:51
People say, "No, that's Nazi Germany, Hitler, you know, that's 1939."
09:54
He said, "Yeah, but suppose Hitler asked you,
09:57
'Would you electrocute a stranger?' 'No way, I'm a good person.'"
10:00
He said, "Why don't we put you in a situation
10:02
and give you a chance to see what you would do?"
10:04
And so what he did was he tested 1,000 ordinary people.
10:07
500 New Haven, Connecticut, 500 Bridgeport.
10:11
And the ad said, "Psychologists want to understand memory.
10:14
We want to improve people's memory, because it is the key to success."
10:17
OK?
10:20
"We're going to give you five bucks -- four dollars for your time.
10:24
We don't want college students. We want men between 20 and 50."
10:27
In the later studies, they ran women.
10:29
Ordinary people: barbers, clerks, white-collar people.
10:32
So, you go down,
10:34
one of you will be a learner, one will be a teacher.
10:36
The learner's a genial, middle-aged guy.
10:38
He gets tied up to the shock apparatus in another room.
10:41
The learner could be middle-aged, could be as young as 20.
10:45
And one of you is told by the authority, the guy in the lab coat,
"Your job as teacher is to give him material to learn.
195
648467
2615
"你作為教師的工作就是讓這個人學習材料。
10:51
Gets it right, reward.
10:52
Gets it wrong, you press a button on the shock box.
10:54
The first button is 15 volts. He doesn't even feel it."
10:58
That's the key.
10:59
All evil starts with 15 volts.
11:01
And then the next step is another 15 volts.
11:04
The problem is, at the end of the line, it's 450 volts.
11:07
And as you go along, the guy is screaming,
11:09
"I've got a heart condition! I'm out of here!"
11:11
You're a good person. You complain.
11:13
"Sir, who will be responsible if something happens to him?"
11:15
The experimenter says, "Don't worry, I will be responsible.
11:18
Continue, teacher."
11:20
And the question is, who would go all the way to 450 volts?
11:23
You should notice here, when it gets up to 375,
11:26
it says, "Danger. Severe Shock."
11:28
When it gets up to here, there's "XXX" -- the pornography of power.
11:31
So Milgram asks 40 psychiatrists,
(Laughter)
11:34
"What percent of American citizens would go to the end?"
11:37
They said only one percent.
11:38
Because that's sadistic behavior, and we know, psychiatry knows,
11:42
only one percent of Americans are sadistic.
11:44
OK.
11:46
Here's the data. They could not be more wrong.
11:48
Two thirds go all the way to 450 volts.
11:51
This was just one study.
11:53
Milgram did more than 16 studies.
11:55
And look at this.
11:57
In study 16, where you see somebody like you go all the way,
12:00
90 percent go all the way.
12:02
In study five, if you see people rebel,
12:04
90 percent rebel.
12:06
What about women? Study 13 -- no different than men.
12:09
So Milgram is quantifying evil as the willingness of people
12:13
to blindly obey authority, to go all the way to 450 volts.
12:16
And it's like a dial on human nature.
12:18
A dial in a sense that you can make almost everybody totally obedient,
12:23
down to the majority, down to none.
12:26
What are the external parallels? For all research is artificial.
12:29
What's the validity in the real world?
12:31
912 American citizens committed suicide or were murdered
12:34
by family and friends in Guyana jungle in 1978,
12:37
because they were blindly obedient to this guy, their pastor --
12:40
not their priest -- their pastor, Reverend Jim Jones.
12:43
He persuaded them to commit mass suicide.
12:46
And so, he's the modern Lucifer effect,
12:48
a man of God who becomes the Angel of Death.
12:52
Milgram's study is all about individual authority to control people.
12:57
Most of the time, we are in institutions,
13:00
so the Stanford Prison Study is a study of the power of institutions
13:03
to influence individual behavior.
13:05
Interestingly, Stanley Milgram and I were in the same high school class
13:09
in James Monroe in the Bronx, 1954.
13:13
I did this study with my graduate students,
13:15
especially Craig Haney -- and it also began work with an ad.
13:18
We had a cheap, little ad,
13:19
but we wanted college students for a study of prison life.
13:23
75 people volunteered, took personality tests.
13:25
We did interviews.
13:26
Picked two dozen: the most normal, the most healthy.
13:29
Randomly assigned them to be prisoner and guard.
13:31
So on day one, we knew we had good apples.
13:33
I'm going to put them in a bad situation.
13:35
And secondly, we know there's no difference
13:38
between the boys who will be guards and those who will be prisoners.
13:41
To the prisoners, we said,
13:42
"Wait at home. The study will begin Sunday."
13:45
We didn't tell them
13:46
that the city police were going to come and do realistic arrests.
13:49
(Video) (Music)
13:55
[Day 1]
14:23
Student: A police car pulls up in front, and a cop comes to the front door,
14:28
and knocks, and says he's looking for me.
14:30
So they, right there, you know, they took me out the door,
14:33
they put my hands against the car.
14:36
It was a real cop car, it was a real policeman,
14:39
and there were real neighbors in the street,
14:41
who didn't know that this was an experiment.
14:44
And there was cameras all around and neighbors all around.
14:47
They put me in the car, then they drove me around Palo Alto.
14:52
They took me to the basement of the police station.
15:04
Then they put me in a cell.
15:05
I was the first one to be picked up, so they put me in a cell,
15:08
which was just like a room with a door with bars on it.
15:12
You could tell it wasn't a real jail.
15:14
They locked me in there, in this degrading little outfit.
15:18
They were taking this experiment too seriously.
15:21
Here are the prisoners, who are going to be dehumanized, they'll become numbers.
15:25
Here are the guards with the symbols of power and anonymity.
15:28
Guards get prisoners to clean the toilet bowls out with their bare hands,
15:31
to do other humiliating tasks.
15:33
They strip them naked. They sexually taunt them.
15:35
They begin to do degrading activities, like having them simulate sodomy.
15:38
You saw simulating fellatio in soldiers in Abu Ghraib.
15:42
My guards did it in five days.
15:44
The stress reaction was so extreme
15:46
that normal kids we picked because they were healthy
15:48
had breakdowns within 36 hours.
15:50
The study ended after six days, because it was out of control.
15:54
Five kids had emotional breakdowns.
15:58
Does it make a difference
15:59
if warriors go to battle changing their appearance or not?
16:02
If they're anonymous, how do they treat their victims?
16:05
In some cultures, they go to war without changing their appearance.
16:08
In others, they paint themselves like "Lord of the Flies."
16:11
In some, they wear masks.
16:12
In many, soldiers are anonymous in uniform.
16:14
So this anthropologist, John Watson, found 23 cultures that had two bits of data.
16:19
Do they change their appearance? 15.
16:21
Do they kill, torture, mutilate? 13.
16:24
If they don't change their appearance,
16:26
only one of eight kills, tortures or mutilates.
16:28
The key is in the red zone.
16:29
If they change their appearance,
16:31
12 of 13 -- that's 90 percent -- kill, torture, mutilate.
16:35
And that's the power of anonymity.
16:36
So what are the seven social processes
16:38
that grease the slippery slope of evil?
16:40
Mindlessly taking the first small step.
16:42
Dehumanization of others. De-individuation of self.
16:45
Diffusion of personal responsibility.
16:47
Blind obedience to authority.
16:49
Uncritical conformity to group norms.
16:50
Passive tolerance of evil through inaction, or indifference.
16:54
And it happens when you're in a new or unfamiliar situation.
16:57
Your habitual response patterns don't work.
16:59
Your personality and morality are disengaged.
17:02
"Nothing is easier than to denounce the evildoer;
17:05
nothing more difficult than understanding him," Dostoyevsky.
17:08
Understanding is not excusing. Psychology is not excuse-ology.
17:12
So social and psychological research reveals
17:14
how ordinary, good people can be transformed without the drugs.
17:17
You don't need it. You just need the social-psychological processes.
17:20
Real world parallels?
17:22
Compare this with this.
17:26
James Schlesinger -- I'm going to end with this -- says,
17:28
"Psychologists have attempted to understand how and why
17:31
individuals and groups who usually act humanely
17:33
can sometimes act otherwise in certain circumstances."
17:37
That's the Lucifer effect.
17:38
And he goes on to say, "The landmark Stanford study
17:40
provides a cautionary tale for all military operations."
17:44
If you give people power without oversight,
17:47
it's a prescription for abuse.
17:48
They knew that, and let that happen.
17:51
So another report, an investigative report by General Fay,
17:54
says the system is guilty.
17:56
In this report, he says it was the environment that created Abu Ghraib,
18:00
by leadership failures that contributed to the occurrence of such abuse,
18:03
and because it remained undiscovered
18:05
by higher authorities for a long period of time.
18:08
Those abuses went on for three months.
18:09
Who was watching the store?
18:11
The answer is nobody, I think on purpose.
18:13
He gave the guards permission to do those things,
18:16
and they knew nobody was ever going to come down to that dungeon.
18:19
So you need a paradigm shift in all of these areas.
18:21
The shift is away from the medical model that focuses only on the individual.
18:25
The shift is toward a public health model
18:28
that recognizes situational and systemic vectors of disease.
18:31
Bullying is a disease. Prejudice is a disease.
18:34
Violence is a disease.
18:35
Since the Inquisition, we've been dealing with problems at the individual level.
18:39
It doesn't work.
18:40
Aleksandr Solzhenitsyn says, "The line between good and evil
18:43
cuts through the heart of every human being."
18:45
That means that line is not out there.
18:47
That's a decision that you have to make, a personal thing.
18:50
So I want to end very quickly on a positive note.
18:53
Heroism as the antidote to evil,
18:56
by promoting the heroic imagination,
18:58
especially in our kids, in our educational system.
19:00
We want kids to think, "I'm a hero in waiting,
19:03
waiting for the right situation to come along,
369
1143065
2143
19:05
and I will act heroically.
370
1145232
1543
屆時我會行英雄之事。
19:06
My whole life, I'm now going to focus away from evil --
19:09
that I've been in since I was a kid -- to understanding heroes.
19:12
Banality of heroism.
19:13
It's ordinary people who do heroic deeds.
19:15
It's the counterpoint to Hannah Arendt's "Banality of Evil."
19:18
Our traditional societal heroes are wrong, because they are the exceptions.
19:22
They organize their life around this. That's why we know their names.
19:26
Our kids' heroes are also wrong models for them,
19:28
because they have supernatural talents.
19:30
We want our kids to realize most heroes are everyday people,
19:33
and the heroic act is unusual.
19:35
This is Joe Darby.
19:36
He was the one that stopped those abuses you saw,
19:39
because when he saw those images,
19:40
he turned them over to a senior investigating officer.
19:43
He was a low-level private, and that stopped it.
19:45
Was he a hero? No.
19:47
They had to put him in hiding, because people wanted to kill him,
19:50
and then his mother and his wife.
19:52
For three years, they were in hiding.
19:53
This is the woman who stopped the Stanford Prison Study.
19:56
When I said it got out of control, I was the prison superintendent.
19:59
I didn't know it was out of control. I was totally indifferent.
20:02
She saw that madhouse and said,
20:04
"You know what, it's terrible what you're doing to those boys.
20:07
They're not prisoners nor guards, they're boys, and you are responsible."
20:11
And I ended the study the next day.
20:13
The good news is I married her the next year.
20:15
(Laughter)
20:18
(Applause)
20:25
I just came to my senses, obviously.
20:27
So situations have the power to do [three things].
20:30
But the point is, this is the same situation
20:33
that can inflame the hostile imagination in some of us,
20:36
that makes us perpetrators of evil,
20:38
can inspire the heroic imagination in others.
20:40
It's the same situation and you're on one side or the other.
20:43
Most people are guilty of the evil of inaction,
20:45
because your mother said, "Don't get involved. Mind your own business."
20:49
And you have to say, "Mama, humanity is my business."
20:51
So the psychology of heroism is -- we're going to end in a moment --
20:55
how do we encourage children in new hero courses,
20:57
that I'm working on with Matt Langdon -- he has a hero workshop --
21:00
to develop this heroic imagination, this self-labeling,
21:03
"I am a hero in waiting," and teach them skills.
21:06
To be a hero, you have to learn to be a deviant,
21:09
because you're always going against the conformity of the group.
21:12
Heroes are ordinary people whose social actions are extraordinary. Who act.
21:16
The key to heroism is two things.
21:18
A: You have to act when other people are passive.
21:20
B: You have to act socio-centrically, not egocentrically.
21:23
And I want to end with a known story about Wesley Autrey, New York subway hero.
21:27
Fifty-year-old African-American construction worker standing on a subway.
21:31
A white guy falls on the tracks.
21:32
The subway train is coming. There's 75 people there.
21:35
You know what? They freeze.
21:36
He's got a reason not to get involved.
21:38
He's black, the guy's white, and he's got two kids.
21:41
Instead, he gives his kids to a stranger,
21:43
jumps on the tracks, puts the guy between the tracks,
21:45
lays on him, the subway goes over him.
21:47
Wesley and the guy -- 20 and a half inches height.
21:51
The train clearance is 21 inches.
21:53
A half an inch would have taken his head off.
21:56
And he said, "I did what anyone could do," no big deal to jump on the tracks.
22:00
And the moral imperative is "I did what everyone should do."
22:04
And so one day, you will be in a new situation.
22:07
Take path one, you're going to be a perpetrator of evil.
22:09
Evil, meaning you're going to be Arthur Andersen.
22:12
You're going to cheat, or you're going to allow bullying.
22:14
Path two, you become guilty of the evil of passive inaction.
22:17
Path three, you become a hero.
22:19
The point is, are we ready to take the path to celebrating ordinary heroes,
22:23
waiting for the right situation to come along
22:26
to put heroic imagination into action?
22:28
Because it may only happen once in your life,
22:31
and when you pass it by, you'll always know,
22:33
I could have been a hero and I let it pass me by.
22:35
So the point is thinking it and then doing it.
22:38
So I want to thank you. Thank you.
22:40
Let's oppose the power of evil systems at home and abroad,
22:43
and let's focus on the positive.
22:45
Advocate for respect of personal dignity, for justice and peace,
22:48
which sadly our administration has not been doing.
22:51
Thanks so much.
22:52
(Applause)