Translator: Di SUN
Reviewer: Yulin Li
00:06
Imagine you're watching a runaway trolley
barreling down the tracks
00:11
straight towards five workers
who can't escape.
00:15
You happen to be standing next to a switch
00:18
that will divert the trolley
onto a second track.
00:21
Here's the problem.
00:22
That track has a worker on it, too,
but just one.
00:28
What do you do?
00:29
Do you sacrifice one person to save five?
00:32
This is the trolley problem,
00:35
a version of an ethical dilemma that
philosopher Philippa Foot devised in 1967.
00:42
It's popular because it forces us
to think about how to choose
00:45
when there are no good choices.
00:48
Do we pick the action
with the best outcome
00:50
or stick to a moral code that prohibits
causing someone's death?
00:55
In one survey, about 90% of respondents
said that it's okay to flip the switch,
01:00
letting one worker die to save five,
01:04
and other studies, including a virtual
reality simulation of the dilemma,
01:08
have found similar results.
01:11
These judgments are consistent with the
philosophical principle of utilitarianism
01:16
which argues that
the morally correct decision
01:18
is the one that maximizes well-being
for the greatest number of people.
01:23
The five lives outweigh one,
01:25
even if achieving that outcome requires
condemning someone to death.
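A minimal sketch of that utilitarian calculus, assuming each option is scored purely by how many people die; the function name, option names, and counts below are illustrative, not from the talk:

def utilitarian_choice(deaths_by_option):
    """Return the option that minimizes deaths, i.e. maximizes
    well-being for the greatest number of people."""
    return min(deaths_by_option, key=deaths_by_option.get)

# Classic version: do nothing (five die) vs. flip the switch (one dies).
classic = {"do nothing": 5, "flip the switch": 1}
print(utilitarian_choice(classic))  # -> flip the switch

To a pure utilitarian, the bridge variant introduced next scores identically ({"do nothing": 5, "push the man": 1}), which is what makes the survey results that follow so striking.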
01:30
But people don't always take
the utilitarian view,
01:33
which we can see by changing
the trolley problem a bit.
01:37
This time, you're standing on a bridge
over the track
01:40
as the runaway trolley approaches.
01:43
Now there's no second track,
01:44
but there is a very large man
on the bridge next to you.
01:48
If you push him over,
his body will stop the trolley,
01:52
saving the five workers,
01:54
but he'll die.
01:56
To utilitarians,
the decision is exactly the same,
01:59
lose one life to save five.
02:01
But in this case, only about 10% of people
02:04
say that it's OK to throw the man
onto the tracks.
02:08
Our instincts tell us that deliberately
causing someone's death
02:11
is different than allowing them to die
as collateral damage.
02:16
It just feels wrong for reasons
that are hard to explain.
02:20
This intersection between ethics
and psychology
02:23
is what's so interesting
about the trolley problem.
02:26
The dilemma in its many variations reveals
that what we think is right or wrong
02:30
depends on factors other than
a logical weighing of the pros and cons.
02:36
For example, men are more likely
than women
02:38
to say it's okay to push the man
over the bridge.
02:42
So are people who watch a comedy clip
before doing the thought experiment.
02:46
And in one virtual reality study,
02:49
people were more willing
to sacrifice men than women.
02:52
Researchers have studied
the brain activity
02:55
of people thinking through the classic
and bridge versions.
02:59
Both scenarios activate areas of the brain
involved in conscious decision-making
03:04
and emotional responses.
03:06
But in the bridge version,
the emotional response is much stronger.
03:10
So is activity in an area of the brain
03:13
associated with processing
internal conflict.
03:16
Why the difference?
03:18
One explanation is that pushing someone
to their death feels more personal,
03:22
activating an emotional aversion
to killing another person,
03:26
but we feel conflicted because we know
it's still the logical choice.
03:31
"Trolleyology" has been criticized by some
philosophers and psychologists.
03:36
They argue that it doesn't reveal anything
because its premise is so unrealistic
03:41
that study participants
don't take it seriously.
03:45
But new technology is making this kind
of ethical analysis
03:48
more important than ever.
03:50
For example, driverless cars
may have to handle choices
03:54
like causing a small accident
to prevent a larger one.
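One way such a trade-off might be encoded is as a cost comparison over predicted outcomes. This is only a sketch under invented harm scores; the class, function, and scenarios here are hypothetical, not taken from any real vehicle's software:

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_harm: float  # estimated casualties, weighted by severity

def least_harm(options):
    """Pick the maneuver with the lowest estimated harm."""
    return min(options, key=lambda m: m.expected_harm)

# Hypothetical emergency: a small accident versus a larger one.
choice = least_harm([
    Maneuver("brake and clip the guardrail", expected_harm=0.2),
    Maneuver("continue into the crowded crossing", expected_harm=3.0),
])
print(choice.name)  # -> brake and clip the guardrail

Everything contentious hides inside expected_harm: deciding how to score human lives is exactly the judgment the talk says has to be made in advance.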
03:58
Meanwhile, governments are researching
autonomous military drones
04:01
that could wind up making decisions about
whether to risk civilian casualties
04:05
to attack a high-value target.
04:09
If we want these actions to be ethical,
04:11
we have to decide in advance
how to value human life
04:15
and judge the greater good.
04:17
So researchers who study
autonomous systems
04:20
are collaborating with philosophers
04:22
to address the complex problem
of programming ethics into machines,
04:27
which goes to show that
even hypothetical dilemmas
04:30
can wind up on a collision course
with the real world.