Would you sacrifice one person to save five? - Eleanor Nelsen

5,154,239 views

2017-01-12 ・ TED-Ed

Translator: Klavdija Cernilogar · Reviewer: Matej Divjak
00:06 Imagine you're watching a runaway trolley barreling down the tracks
00:11 straight towards five workers who can't escape.
00:15 You happen to be standing next to a switch
00:18 that will divert the trolley onto a second track.
00:21 Here's the problem.
00:22 That track has a worker on it, too, but just one.
00:28 What do you do?
00:29 Do you sacrifice one person to save five?
00:32 This is the trolley problem,
00:35 a version of an ethical dilemma that philosopher Philippa Foot devised in 1967.
00:42 It's popular because it forces us to think about how to choose
00:45 when there are no good choices.
00:48 Do we pick the action with the best outcome
00:50 or stick to a moral code that prohibits causing someone's death?
00:55 In one survey, about 90% of respondents said that it's okay to flip the switch,
01:00 letting one worker die to save five,
01:04 and other studies, including a virtual reality simulation of the dilemma,
01:08 have found similar results.
01:11 These judgments are consistent with the philosophical principle of utilitarianism
01:16 which argues that the morally correct decision
01:18 is the one that maximizes well-being for the greatest number of people.
01:23 The five lives outweigh one,
01:25 even if achieving that outcome requires condemning someone to death.
01:30 But people don't always take the utilitarian view,
01:33 which we can see by changing the trolley problem a bit.
01:37 This time, you're standing on a bridge over the track
01:40 as the runaway trolley approaches.
01:43 Now there's no second track,
01:44 but there is a very large man on the bridge next to you.
01:48 If you push him over, his body will stop the trolley,
01:52 saving the five workers,
01:54 but he'll die.
01:56 To utilitarians, the decision is exactly the same,
01:59 lose one life to save five.
02:01 But in this case, only about 10% of people
02:04 say that it's OK to throw the man onto the tracks.
02:08 Our instincts tell us that deliberately causing someone's death
02:11 is different than allowing them to die as collateral damage.
02:16 It just feels wrong for reasons that are hard to explain.
02:20 This intersection between ethics and psychology
02:23 is what's so interesting about the trolley problem.
02:26 The dilemma in its many variations reveal that what we think is right or wrong
02:30 depends on factors other than a logical weighing of the pros and cons.
02:36 For example, men are more likely than women
02:38 to say it's okay to push the man over the bridge.
02:42 So are people who watch a comedy clip before doing the thought experiment.
02:46 And in one virtual reality study,
02:49 people were more willing to sacrifice men than women.
02:52 Researchers have studied the brain activity
02:55 of people thinking through the classic and bridge versions.
02:59 Both scenarios activate areas of the brain involved in conscious decision-making
03:04 and emotional responses.
03:06 But in the bridge version, the emotional response is much stronger.
03:10 So is activity in an area of the brain
03:13 associated with processing internal conflict.
03:16 Why the difference?
03:18 One explanation is that pushing someone to their death feels more personal,
03:22 activating an emotional aversion to killing another person,
03:26 but we feel conflicted because we know it's still the logical choice.
03:31 "Trolleyology" has been criticized by some philosophers and psychologists.
03:36 They argue that it doesn't reveal anything because its premise is so unrealistic
03:41 that study participants don't take it seriously.
03:45 But new technology is making this kind of ethical analysis
03:48 more important than ever.
03:50 For example, driver-less cars may have to handle choices
03:54 like causing a small accident to prevent a larger one.
03:58 Meanwhile, governments are researching autonomous military drones
04:01 that could wind up making decisions of whether they'll risk civilian casualties
04:05 to attack a high-value target.
04:09 If we want these actions to be ethical,
04:11 we have to decide in advance how to value human life
04:15 and judge the greater good.
04:17 So researchers who study autonomous systems
04:20 are collaborating with philosophers
04:22 to address the complex problem of programming ethics into machines,
04:27 which goes to show that even hypothetical dilemmas
04:30 can wind up on a collision course with the real world.