The psychology of evil | Philip Zimbardo

2,752,134 views ・ 2008-09-23

TED



Translator: Yujian Li · Reviewer: Qian Yue
Philosophers, dramatists, theologians have grappled with this question for centuries: what makes people go wrong? Interestingly, I asked this question when I was a little kid. I grew up in the South Bronx, inner-city ghetto in New York, and I was surrounded by evil, as all kids are who grew up in an inner city. And I had friends who were really good kids, who lived out the Dr. Jekyll Mr. Hyde scenario -- Robert Louis Stevenson. That is, they took drugs, got in trouble, went to jail. Some got killed, and some did it without drug assistance. So when I read Robert Louis Stevenson, that wasn't fiction. The only question is, what was in the juice?

And more importantly, that line between good and evil -- which privileged people like to think is fixed and impermeable, with them on the good side, the others on the bad side -- I knew that line was movable, and it was permeable. Good people could be seduced across that line, and under good and some rare circumstances, bad kids could recover with help, with reform, with rehabilitation.
So I want to begin with this wonderful illusion by Dutch artist M.C. Escher. If you look at it and focus on the white, what you see is a world full of angels. But let's look more deeply, and as we do, what appears is the demons, the devils in the world. That tells us several things. One, the world is, was, will always be filled with good and evil, because good and evil is the yin and yang of the human condition.
It tells me something else. If you remember, God's favorite angel was Lucifer. Apparently, Lucifer means "the light." It also means "the morning star," in some scripture. And apparently, he disobeyed God, and that's the ultimate disobedience to authority. And when he did, Michael, the archangel, was sent to kick him out of heaven along with the other fallen angels. And so Lucifer descends into hell, becomes Satan, becomes the devil, and the force of evil in the universe begins. Paradoxically, it was God who created hell as a place to store evil. He didn't do a good job of keeping it there though.

So, this arc of the cosmic transformation of God's favorite angel into the Devil, for me, sets the context for understanding human beings who are transformed from good, ordinary people into perpetrators of evil.
So the Lucifer effect, although it focuses on the negatives -- the negatives that people can become, not the negatives that people are -- leads me to a psychological definition. Evil is the exercise of power. And that's the key: it's about power. To intentionally harm people psychologically, to hurt people physically, to destroy people mortally, or ideas, and to commit crimes against humanity.

If you Google "evil," a word that should surely have withered by now, you come up with 136 million hits in a third of a second.
A few years ago -- I am sure all of you were shocked, as I was, with the revelation of American soldiers abusing prisoners in a strange place in a controversial war: Abu Ghraib in Iraq. And these were men and women who were putting prisoners through unbelievable humiliation. I was shocked, but I wasn't surprised, because I had seen those same visual parallels when I was the prison superintendent of the Stanford Prison Study.

Immediately the Bush administration military said what? What all administrations say when there's a scandal: "Don't blame us. It's not the system. It's the few bad apples, the few rogue soldiers." My hypothesis is, American soldiers are good, usually. Maybe it was the barrel that was bad. But how am I going to deal with that hypothesis?
I became an expert witness for one of the guards, Sergeant Chip Frederick, and in that position, I had access to the dozen investigative reports. I had access to him. I could study him, have him come to my home, get to know him, do psychological analysis to see, was he a good apple or bad apple. And thirdly, I had access to all of the 1,000 pictures that these soldiers took. These pictures are of a violent or sexual nature. All of them come from the cameras of American soldiers. Because everybody has a digital camera or cell phone camera, they took pictures of everything, more than 1,000. And what I've done is I organized them into various categories.
But these are by United States military police, army reservists. They are not soldiers prepared for this mission at all. And it all happened in a single place, Tier 1-A, on the night shift. Why? Tier 1-A was the center for military intelligence. It was the interrogation hold. The CIA was there. Interrogators from Titan Corporation, all there, and they're getting no information about the insurgency. So they're going to put pressure on these soldiers, military police, to cross the line, give them permission to break the will of the enemy, to prepare them for interrogation, to soften them up, to take the gloves off. Those are the euphemisms, and this is how it was interpreted. Let's go down to that dungeon.
(Typewriting)

[Abu Ghraib Iraq Prison Abuses 2008 Military Police Guards' Photos]

[The following images include nudity and graphic depictions of violence]

(Camera shutter sounds)

(Thuds)

(Camera shutter)

(Camera shutter)

(Breathing)

(Bells)

(Bells end)
So, pretty horrific. That's one of the visual illustrations of evil. And it should not have escaped you that the reason I paired the prisoner with his arms out with Leonardo da Vinci's ode to humanity is that that prisoner was mentally ill. That prisoner covered himself with shit every day, they had to roll him in dirt so he wouldn't stink. But the guards ended up calling him "Shit Boy." What was he doing in that prison rather than in some mental institution?
In any event, here's former Secretary of Defense Rumsfeld. He comes down and says, "I want to know, who is responsible? Who are the bad apples?" Well, that's a bad question. You have to reframe it and ask, "What is responsible?" "What" could be the who of people, but it could also be the what of the situation, and obviously that's wrongheaded.
How do psychologists try to understand such transformations of human character, if you believe that they were good soldiers before they went down to that dungeon? There are three ways. The main way is called dispositional. We look at what's inside of the person, the bad apples. This is the foundation of all of social science, the foundation of religion, the foundation of war.

Social psychologists like me come along and say, "Yeah, people are the actors on the stage, but you'll have to be aware of the situation. Who are the cast of characters? What's the costume? Is there a stage director?" And so we're interested in what are the external factors around the individual -- the bad barrel?
Social scientists stop there and they miss the big point that I discovered when I became an expert witness for Abu Ghraib. The power is in the system. The system creates the situation that corrupts the individuals, and the system is the legal, political, economic, cultural background. And this is where the power is of the bad-barrel makers. If you want to change a person, change the situation. And to change it, you've got to know where the power is, in the system.

So the Lucifer effect involves understanding human character transformations with these three factors. And it's a dynamic interplay. What do the people bring into the situation? What does the situation bring out of them? And what is the system that creates and maintains that situation?
My recent book, "The Lucifer Effect," is about, how do you understand how good people turn evil? And it has a lot of detail about what I'm going to talk about today.

So Dr. Z's "Lucifer Effect," although it focuses on evil, really is a celebration of the human mind's infinite capacity to make any of us kind or cruel, caring or indifferent, creative or destructive, and it makes some of us villains. And the good news that I'm going to hopefully come to at the end is that it makes some of us heroes.

This wonderful cartoon in the New Yorker summarizes my whole talk: "I'm neither a good cop nor a bad cop, Jerome. Like yourself, I'm a complex amalgam of positive and negative personality traits that emerge or not, depending on the circumstances."

(Laughter)
There's a study some of you think you know about, but very few people have ever read the story. You watched the movie. This is Stanley Milgram, little Jewish kid from the Bronx, and he asked the question, "Could the Holocaust happen here, now?" People say, "No, that's Nazi Germany, Hitler, you know, that's 1939." He said, "Yeah, but suppose Hitler asked you, 'Would you electrocute a stranger?' 'No way, I'm a good person.'" He said, "Why don't we put you in a situation and give you a chance to see what you would do?" And so what he did was he tested 1,000 ordinary people: 500 New Haven, Connecticut, 500 Bridgeport.
And the ad said, "Psychologists want to understand memory. We want to improve people's memory, because it is the key to success." OK? "We're going to give you five bucks -- four dollars for your time. We don't want college students. We want men between 20 and 50." In the later studies, they ran women. Ordinary people: barbers, clerks, white-collar people.
So, you go down, one of you will be a learner, one will be a teacher. The learner's a genial, middle-aged guy. He gets tied up to the shock apparatus in another room. The learner could be middle-aged, could be as young as 20. And one of you is told by the authority, the guy in the lab coat, "Your job as teacher is to give him material to learn. Gets it right, reward. Gets it wrong, you press a button on the shock box. The first button is 15 volts. He doesn't even feel it." That's the key. All evil starts with 15 volts. And then the next step is another 15 volts. The problem is, at the end of the line, it's 450 volts.
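The escalation Zimbardo describes can be sketched in a few lines. This is a minimal illustration, assuming (as the historical record of Milgram's apparatus states) a panel of switches in 15-volt increments running from 15 V up to 450 V:

```python
# Milgram's shock generator: 15-volt steps from 15 V to 450 V.
voltages = list(range(15, 451, 15))

print(len(voltages))   # 30 switches on the panel
print(voltages[0])     # 15 -- the step "he doesn't even feel"
print(voltages[-1])    # 450 -- the end of the line
```

The point of the sketch is the gradient: each switch is only 15 volts beyond the one the teacher has already pressed, which is exactly why "all evil starts with 15 volts."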
And as you go along, the guy is screaming, "I've got a heart condition! I'm out of here!" You're a good person. You complain. "Sir, who will be responsible if something happens to him?" The experimenter says, "Don't worry, I will be responsible. Continue, teacher." And the question is, who would go all the way to 450 volts?
You should notice here, when it gets up to 375, it says, "Danger. Severe Shock." When it gets up to here, there's "XXX" -- the pornography of power.

(Laughter)

So Milgram asks 40 psychiatrists, "What percent of American citizens would go to the end?" They said only one percent. Because that's sadistic behavior, and we know, psychiatry knows, only one percent of Americans are sadistic. OK.
Here's the data. They could not be more wrong. Two thirds go all the way to 450 volts. This was just one study. Milgram did more than 16 studies. And look at this. In study 16, where you see somebody like you go all the way, 90 percent go all the way. In study five, if you see people rebel, 90 percent rebel. What about women? Study 13 -- no different than men.
So Milgram is quantifying evil as the willingness of people to blindly obey authority, to go all the way to 450 volts. And it's like a dial on human nature. A dial in a sense that you can make almost everybody totally obedient, down to the majority, down to none.
What are the external parallels? For all research is artificial. What's the validity in the real world? 912 American citizens committed suicide or were murdered by family and friends in Guyana jungle in 1978, because they were blindly obedient to this guy, their pastor -- not their priest -- their pastor, Reverend Jim Jones. He persuaded them to commit mass suicide. And so, he's the modern Lucifer effect, a man of God who becomes the Angel of Death.
Milgram's study is all about individual authority to control people. Most of the time, we are in institutions, so the Stanford Prison Study is a study of the power of institutions to influence individual behavior. Interestingly, Stanley Milgram and I were in the same high school class in James Monroe in the Bronx, 1954.
I did this study with my graduate students, especially Craig Haney -- and it also began work with an ad. We had a cheap, little ad, but we wanted college students for a study of prison life. 75 people volunteered, took personality tests. We did interviews. Picked two dozen: the most normal, the most healthy. Randomly assigned them to be prisoner and guard. So on day one, we knew we had good apples. I'm going to put them in a bad situation. And secondly, we know there's no difference between the boys who will be guards and those who will be prisoners. To the prisoners, we said, "Wait at home. The study will begin Sunday." We didn't tell them that the city police were going to come and do realistic arrests.
(Video) (Music)

[Day 1]
Student: A police car pulls up in front, and a cop comes to the front door, and knocks, and says he's looking for me. So they, right there, you know, they took me out the door, they put my hands against the car. It was a real cop car, it was a real policeman, and there were real neighbors in the street, who didn't know that this was an experiment. And there were cameras all around and neighbors all around. They put me in the car, then they drove me around Palo Alto. They took me to the basement of the police station. Then they put me in a cell. I was the first one to be picked up, so they put me in a cell, which was just like a room with a door with bars on it. You could tell it wasn't a real jail. They locked me in there, in this degrading little outfit. They were taking this experiment too seriously.
Here are the prisoners, who are going to be dehumanized, they'll become numbers. Here are the guards with the symbols of power and anonymity. Guards get prisoners to clean the toilet bowls out with their bare hands, to do other humiliating tasks. They strip them naked. They sexually taunt them. They begin to do degrading activities, like having them simulate sodomy. You saw simulating fellatio in soldiers in Abu Ghraib. My guards did it in five days. The stress reaction was so extreme that normal kids we picked because they were healthy had breakdowns within 36 hours. The study ended after six days, because it was out of control. Five kids had emotional breakdowns.
Does it make a difference if warriors go to battle changing their appearance or not? If they're anonymous, how do they treat their victims? In some cultures, they go to war without changing their appearance. In others, they paint themselves like "Lord of the Flies." In some, they wear masks. In many, soldiers are anonymous in uniform. So this anthropologist, John Watson, found 23 cultures that had two bits of data. Do they change their appearance? 15. Do they kill, torture, mutilate? 13. If they don't change their appearance, only one of eight kills, tortures or mutilates. The key is in the red zone. If they change their appearance, 12 of 13 -- that's 90 percent -- kill, torture, mutilate. And that's the power of anonymity.
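The contrast behind that anonymity result can be laid out as a tiny two-row table. This is a sketch only, with cell counts taken from the figures quoted in the talk ("12 of 13" versus "one of eight"; the exact ratio is 92.3 percent, rounded to 90 on stage):

```python
# Watson's cross-cultural data as quoted: (brutal cultures, total cultures)
brutality = {
    "changed appearance":   (12, 13),  # kill, torture, mutilate
    "unchanged appearance": (1, 8),
}

# Print the rate for each group of warrior cultures.
for group, (brutal, total) in brutality.items():
    print(f"{group}: {brutal}/{total} = {brutal / total:.1%}")
```

The roughly seven-fold gap between the two rates is the quantitative form of "the power of anonymity."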
So what are the seven social processes that grease the slippery slope of evil? Mindlessly taking the first small step. Dehumanization of others. De-individuation of self. Diffusion of personal responsibility. Blind obedience to authority. Uncritical conformity to group norms. Passive tolerance of evil through inaction, or indifference. And it happens when you're in a new or unfamiliar situation. Your habitual response patterns don't work. Your personality and morality are disengaged.

"Nothing is easier than to denounce the evildoer; nothing more difficult than understanding him," Dostoyevsky. Understanding is not excusing. Psychology is not excuse-ology.
So social and psychological research reveals how ordinary, good people can be transformed without the drugs. You don't need it. You just need the social-psychological processes. Real world parallels? Compare this with this.

James Schlesinger -- I'm going to end with this -- says, "Psychologists have attempted to understand how and why individuals and groups who usually act humanely can sometimes act otherwise in certain circumstances." That's the Lucifer effect. And he goes on to say, "The landmark Stanford study provides a cautionary tale for all military operations." If you give people power without oversight, it's a prescription for abuse. They knew that, and let that happen.
So another report, an investigative report by General Fay, says the system is guilty. In this report, he says it was the environment that created Abu Ghraib, by leadership failures that contributed to the occurrence of such abuse, and because it remained undiscovered by higher authorities for a long period of time. Those abuses went on for three months. Who was watching the store? The answer is nobody, I think on purpose. He gave the guards permission to do those things, and they knew nobody was ever going to come down to that dungeon.
So you need a paradigm shift in all of these areas. The shift is away from the medical model that focuses only on the individual. The shift is toward a public health model that recognizes situational and systemic vectors of disease. Bullying is a disease. Prejudice is a disease. Violence is a disease. Since the Inquisition, we've been dealing with problems at the individual level. It doesn't work.

Aleksandr Solzhenitsyn says, "The line between good and evil cuts through the heart of every human being." That means that line is not out there. That's a decision that you have to make, a personal thing.
So I want to end very quickly on a positive note. Heroism as the antidote to evil, by promoting the heroic imagination, especially in our kids, in our educational system. We want kids to think, "I'm a hero in waiting, waiting for the right situation to come along, and I will act heroically." My whole life, I'm now going to focus away from evil -- that I've been in since I was a kid -- to understanding heroes. Banality of heroism. It's ordinary people who do heroic deeds. It's the counterpoint to Hannah Arendt's "Banality of Evil." Our traditional societal heroes are wrong, because they are the exceptions. They organize their life around this. That's why we know their names. Our kids' heroes are also wrong models for them, because they have supernatural talents. We want our kids to realize most heroes are everyday people, and the heroic act is unusual.
19:35
This is Joe Darby.
382
1175392
1291
19:36
He was the one that stopped those abuses you saw,
383
1176707
2286
就是他阻止了你们前面所见的那些虐行,
因为当他看到那些图片时,
19:39
because when he saw those images,
384
1179017
1593
19:40
he turned them over to a senior investigating officer.
385
1180634
2571
他把它们交给了一位高级调查官。
19:43
He was a low-level private, and that stopped it.
386
1183229
2728
他是一个低级列兵但却阻止了此事。他是英雄吗?不是。
19:45
Was he a hero? No.
387
1185981
1178
19:47
They had to put him in hiding, because people wanted to kill him,
388
1187183
3150
他们不得不把他藏起来,因为有人想杀他,
19:50
and then his mother and his wife.
389
1190357
1652
还有他的母亲和妻子。
他们隐藏了三年。
19:52
For three years, they were in hiding.
390
1192033
1885
19:53
This is the woman who stopped the Stanford Prison Study.
391
1193942
2686
这个女人阻止了斯坦福监狱实验。
19:56
When I said it got out of control, I was the prison superintendent.
392
1196652
3249
当我说实验失控的时候,我当时是监狱实验负责人。
19:59
I didn't know it was out of control. I was totally indifferent.
393
1199925
2999
我不知道实验已经失控了。我完全无动于衷。
20:02
She saw that madhouse and said,
394
1202948
1591
她下来看到这疯人院一样的监狱说,
20:04
"You know what, it's terrible what you're doing to those boys.
395
1204563
2905
"你知道吗?你对这些男孩所做的一切实在是太可怕了。
20:07
They're not prisoners nor guards, they're boys, and you are responsible."
396
1207492
3644
他们不是囚犯,不是警卫,
他们只是孩子,你要为他们负责。"
20:11
And I ended the study the next day.
397
1211160
1976
我第二天就停止了这个实验。
20:13
The good news is I married her the next year.
398
1213699
2110
好消息是,我第二年就娶了她。
20:15
(Laughter)
399
1215833
2303
(笑声)
20:18
(Applause)
400
1218160
7000
(鼓掌)
20:25
I just came to my senses, obviously.
401
1225605
2011
显然,我醒悟了。
20:27
So situations have the power to do [three things].
402
1227640
2669
所以情境是有力量的——
20:30
But the point is, this is the same situation
403
1230333
2781
关键是,这个情境
可以刺激一些人内心的敌意想象,
20:33
that can inflame the hostile imagination in some of us,
404
1233138
3466
20:36
that makes us perpetrators of evil,
405
1236628
1737
使我们成为恶之犯人,
20:38
can inspire the heroic imagination in others.
406
1238389
2193
也可以激发另外一些人内心的英雄想象。情境是同样的情境。
20:40
It's the same situation and you're on one side or the other.
407
1240606
3114
而你二者必居其一。
20:43
Most people are guilty of the evil of inaction,
408
1243744
2191
大多数人对袖手旁观之恶感到内疚,
20:45
because your mother said, "Don't get involved. Mind your own business."
409
1245959
3348
因为你母亲会说,"别管闲事,先管好你自己的事。"
你一定得这么回答,"妈妈,人性就是我的事。"
20:49
And you have to say, "Mama, humanity is my business."
410
1249331
2477
20:51
So the psychology of heroism is -- we're going to end in a moment --
411
1251832
3191
英雄主义的心理学是——我们很快会结束——
我们如何在新的英雄课程里鼓励孩子们,
20:55
how do we encourage children in new hero courses,
412
1255047
2427
20:57
that I'm working on with Matt Langdon -- he has a hero workshop --
413
1257498
3292
我正与马特·郎登从事这项工作——他有一个英雄工作坊——
21:00
to develop this heroic imagination, this self-labeling,
414
1260814
3043
来培养这种英雄想象,这种自我标签,
21:03
"I am a hero in waiting," and teach them skills.
415
1263881
2763
"我是一个等待中的英雄",并且教会他们技能。
21:06
To be a hero, you have to learn to be a deviant,
416
1266668
2355
想成为英雄的话,你一定要学会成为一个"异类",
21:09
because you're always going against the conformity of the group.
417
1269047
3000
因为你得总是与群体规范相左。
英雄是那些在社会上行非凡之事的平凡人。那些有所为之人。
21:12
Heroes are ordinary people whose social actions are extraordinary. Who act.
418
1272697
3840
英雄主义之关键有二。
21:16
The key to heroism is two things.
419
1276561
1751
一:在众人消极冷漠之时有所作为。
21:18
You have to act when other people are passive.
420
1278336
2381
21:20
B: You have to act socio-centrically, not egocentrically.
421
1280741
2767
二:你的作为必须以社会为中心,而非以自我为中心。
21:23
And I want to end with a known story about Wesley Autrey, New York subway hero.
422
1283532
3783
我想以韦斯利·奥特里,纽约地铁英雄的故事来结尾,
你们其中一些人知道这个故事。
21:27
Fifty-year-old African-American construction worker standing on a subway.
423
1287339
3797
他是一个50岁的非裔美国人,是一个建筑工人。
他在纽约地铁等车的时候,
21:31
A white guy falls on the tracks.
424
1291160
1528
一个白人掉进地铁轨道里。
21:32
The subway train is coming. There's 75 people there.
425
1292712
2460
当时地铁正开过来。当时有75个人在那儿。
21:35
You know what? They freeze.
426
1295196
1496
你猜怎么着,他们全都僵住了。
21:36
He's got a reason not to get involved.
427
1296716
1866
他有理由袖手旁观。
21:38
He's black, the guy's white, and he's got two kids.
428
1298606
2417
他是黑人,那个人是白人,他还有两个小孩。
21:41
Instead, he gives his kids to a stranger,
429
1301047
1953
相反的是,他把两个孩子交给一个陌生人看管,
跳进铁轨里,把那男子压在铁轨之间,
21:43
jumps on the tracks, puts the guy between the tracks,
430
1303024
2477
21:45
lays on him, the subway goes over him.
431
1305525
2170
趴在他身上,地铁就从他身上开了过去。
21:47
Wesley and the guy -- 20 and a half inches height.
432
1307719
3258
韦斯利和那个男子摞起来高20.5英寸。
21:51
The train clearance is 21 inches.
433
1311001
2290
地铁列车下的空隙高21英寸。
21:53
A half an inch would have taken his head off.
434
1313840
2103
再低半英寸就会把他的脑袋铲去。
21:56
And he said, "I did what anyone could do," no big deal to jump on the tracks.
435
1316537
4069
而他却说"我做了任何人都会做的事",
跳下铁轨没什么大不了的。
22:00
And the moral imperative is "I did what everyone should do."
436
1320630
3648
从道德责任的角度说应该是"我做了任何人应该做的事"。
22:04
And so one day, you will be in a new situation.
437
1324302
2387
那么,将来有一天,你会遇到一个新的情境。
22:07
Take path one, you're going to be a perpetrator of evil.
438
1327160
2620
第一条路,你会成为恶之犯人。
22:09
Evil, meaning you're going to be Arthur Andersen.
439
1329804
2332
恶,即你将成为亚瑟·安德森。
22:12
You're going to cheat, or you're going to allow bullying.
440
1332160
2715
你将会欺骗,或允许欺侮。
22:14
Path two, you become guilty of the evil of passive inaction.
441
1334899
2814
第二条路:你将因漠不关心袖手旁观而内疚。
22:17
Path three, you become a hero.
442
1337737
1723
第三条路:你成为一个英雄。
关键是,我们是否做好准备来选择这条路
22:19
The point is, are we ready to take the path to celebrating ordinary heroes,
443
1339484
4418
以颂扬平凡的英雄,
22:23
waiting for the right situation to come along
444
1343926
2204
等待合适的情境出现,
将对于英雄的想象付诸于实施呢?
22:26
to put heroic imagination into action?
445
1346154
2066
因为这可能是你平生仅有的机会,
22:28
Because it may only happen once in your life,
446
1348244
2794
22:31
and when you pass it by, you'll always know,
447
1351062
2143
而当你错过的时候,你将永远记得,
我本可以成为一个英雄但我让这机会溜走了。
22:33
I could have been a hero and I let it pass me by.
448
1353229
2435
22:35
So the point is thinking it and then doing it.
449
1355688
2173
所以关键是先想再做。
所以我想谢谢你们。谢谢你们。谢谢。
22:38
So I want to thank you. Thank you.
450
1358559
1948
22:40
Let's oppose the power of evil systems at home and abroad,
451
1360531
3002
让我们反对国内外的恶之系统的力量,
22:43
and let's focus on the positive.
452
1363557
1579
并集中于积极的一面。
22:45
Advocate for respect of personal dignity, for justice and peace,
453
1365160
3521
倡导对个人高尚行为之尊敬,倡导正义与和平,
22:48
which sadly our administration has not been doing.
454
1368705
2338
遗憾的是,我们的当局并没有在做这些。
非常感谢。
22:51
Thanks so much.
455
1371067
1250
(掌声)
22:52
(Applause)
456
1372341
3109