Would you sacrifice one person to save five? - Eleanor Nelsen

5,161,800 views ・ 2017-01-12

TED-Ed



Translator: Songzhe Gao · Reviewer: yue chen
00:06
Imagine you're watching a runaway trolley barreling down the tracks straight towards five workers who can't escape. You happen to be standing next to a switch that will divert the trolley onto a second track. Here's the problem. That track has a worker on it, too, but just one. What do you do? Do you sacrifice one person to save five?

00:32
This is the trolley problem, a version of an ethical dilemma that philosopher Philippa Foot devised in 1967. It's popular because it forces us to think about how to choose when there are no good choices. Do we pick the action with the best outcome or stick to a moral code that prohibits causing someone's death?

00:55
In one survey, about 90% of respondents said that it's okay to flip the switch, letting one worker die to save five, and other studies, including a virtual reality simulation of the dilemma, have found similar results. These judgments are consistent with the philosophical principle of utilitarianism, which argues that the morally correct decision is the one that maximizes well-being for the greatest number of people. The five lives outweigh one, even if achieving that outcome requires condemning someone to death.

01:30
But people don't always take the utilitarian view, which we can see by changing the trolley problem a bit. This time, you're standing on a bridge over the track as the runaway trolley approaches. Now there's no second track, but there is a very large man on the bridge next to you. If you push him over, his body will stop the trolley, saving the five workers, but he'll die. To utilitarians, the decision is exactly the same: lose one life to save five. But in this case, only about 10% of people say that it's okay to throw the man onto the tracks. Our instincts tell us that deliberately causing someone's death is different than allowing them to die as collateral damage. It just feels wrong for reasons that are hard to explain.

02:20
This intersection between ethics and psychology is what's so interesting about the trolley problem. The dilemma in its many variations reveals that what we think is right or wrong depends on factors other than a logical weighing of the pros and cons. For example, men are more likely than women to say it's okay to push the man over the bridge. So are people who watch a comedy clip before doing the thought experiment. And in one virtual reality study, people were more willing to sacrifice men than women.

02:52
Researchers have studied the brain activity of people thinking through the classic and bridge versions. Both scenarios activate areas of the brain involved in conscious decision-making and emotional responses. But in the bridge version, the emotional response is much stronger. So is activity in an area of the brain associated with processing internal conflict. Why the difference? One explanation is that pushing someone to their death feels more personal, activating an emotional aversion to killing another person, but we feel conflicted because we know it's still the logical choice.

03:31
"Trolleyology" has been criticized by some philosophers and psychologists. They argue that it doesn't reveal anything because its premise is so unrealistic that study participants don't take it seriously. But new technology is making this kind of ethical analysis more important than ever. For example, driverless cars may have to handle choices like causing a small accident to prevent a larger one. Meanwhile, governments are researching autonomous military drones that could wind up making decisions about whether they'll risk civilian casualties to attack a high-value target.

04:09
If we want these actions to be ethical, we have to decide in advance how to value human life and judge the greater good. So researchers who study autonomous systems are collaborating with philosophers to address the complex problem of programming ethics into machines, which goes to show that even hypothetical dilemmas can wind up on a collision course with the real world.
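The utilitarian rule the transcript describes, and the challenge of "programming ethics into machines," can be made concrete with a toy sketch. This is purely illustrative, assuming a made-up `Action` type and `choose_utilitarian` function; it is not taken from any real autonomous-vehicle or drone codebase, and it shows exactly why the bridge variant is troubling: a pure outcome-counting rule cannot distinguish flipping the switch from pushing the man.

```python
# Toy illustration of the utilitarian calculus from the transcript:
# among the available actions, pick the one that preserves the most lives.
# The names Action and choose_utilitarian are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    lives_lost: int

def choose_utilitarian(actions):
    """Return the action that minimizes lives lost (i.e., maximizes well-being)."""
    return min(actions, key=lambda a: a.lives_lost)

# The classic trolley problem, reduced to data:
options = [
    Action("do nothing", lives_lost=5),
    Action("flip the switch", lives_lost=1),
]
print(choose_utilitarian(options).name)  # → flip the switch
```

Note that encoding the bridge version the same way ("push the man", lives_lost=1) yields the identical answer, even though only about 10% of people endorse it; capturing that moral distinction would require more than counting outcomes, which is the hard part of the programming problem the video points to.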