Would you sacrifice one person to save five? - Eleanor Nelsen

5,161,800 views ・ 2017-01-12 ・ TED-Ed

Translator: Ivana Krivokuća ・ Reviewer: Tijana Mihajlović
00:06
Imagine you're watching a runaway trolley barreling down the tracks
00:11
straight towards five workers who can't escape.
00:15
You happen to be standing next to a switch
00:18
that will divert the trolley onto a second track.
00:21
Here's the problem.
00:22
That track has a worker on it, too, but just one.
00:28
What do you do?
00:29
Do you sacrifice one person to save five?
00:32
This is the trolley problem,
00:35
a version of an ethical dilemma that philosopher Philippa Foot devised in 1967.
00:42
It's popular because it forces us to think about how to choose
00:45
when there are no good choices.
00:48
Do we pick the action with the best outcome
00:50
or stick to a moral code that prohibits causing someone's death?
00:55
In one survey, about 90% of respondents said that it's okay to flip the switch,
01:00
letting one worker die to save five,
01:04
and other studies, including a virtual reality simulation of the dilemma,
01:08
have found similar results.
01:11
These judgments are consistent with the philosophical principle of utilitarianism
01:16
which argues that the morally correct decision
01:18
is the one that maximizes well-being for the greatest number of people.
01:23
The five lives outweigh one,
01:25
even if achieving that outcome requires condemning someone to death.
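To make the utilitarian rule concrete, here is a minimal sketch (not from the talk; the encoding, the function name, and the scenario labels are hypothetical): the decision reduces to comparing body counts.

```python
# Hypothetical sketch of the utilitarian rule described above:
# choose whichever action leaves the fewest people dead.

def utilitarian_choice(actions):
    """Return the action whose outcome costs the fewest lives."""
    return min(actions, key=lambda action: action["deaths"])

switch_case = [
    {"name": "do nothing", "deaths": 5},       # trolley hits the five workers
    {"name": "flip the switch", "deaths": 1},  # diverted onto the lone worker
]

print(utilitarian_choice(switch_case)["name"])  # -> "flip the switch"
```

Only the body count enters the comparison; how a death is brought about is not represented at all, and that omission is where human judgment starts to diverge.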
01:30
But people don't always take the utilitarian view,
01:33
which we can see by changing the trolley problem a bit.
01:37
This time, you're standing on a bridge over the track
01:40
as the runaway trolley approaches.
01:43
Now there's no second track,
01:44
but there is a very large man on the bridge next to you.
01:48
If you push him over, his body will stop the trolley,
01:52
saving the five workers,
01:54
but he'll die.
01:56
To utilitarians, the decision is exactly the same,
01:59
lose one life to save five.
02:01
But in this case, only about 10% of people
02:04
say that it's OK to throw the man onto the tracks.
02:08
Our instincts tell us that deliberately causing someone's death
02:11
is different than allowing them to die as collateral damage.
02:16
It just feels wrong for reasons that are hard to explain.
02:20
This intersection between ethics and psychology
02:23
is what's so interesting about the trolley problem.
02:26
The dilemma in its many variations reveals that what we think is right or wrong
02:30
depends on factors other than a logical weighing of the pros and cons.
02:36
For example, men are more likely than women
02:38
to say it's okay to push the man over the bridge.
02:42
So are people who watch a comedy clip before doing the thought experiment.
02:46
And in one virtual reality study,
02:49
people were more willing to sacrifice men than women.
02:52
Researchers have studied the brain activity
02:55
of people thinking through the classic and bridge versions.
02:59
Both scenarios activate areas of the brain involved in conscious decision-making
03:04
and emotional responses.
03:06
But in the bridge version, the emotional response is much stronger.
03:10
So is activity in an area of the brain
03:13
associated with processing internal conflict.
03:16
Why the difference?
03:18
One explanation is that pushing someone to their death feels more personal,
03:22
activating an emotional aversion to killing another person,
03:26
but we feel conflicted because we know it's still the logical choice.
03:31
"Trolleyology" has been criticized by some philosophers and psychologists.
03:36
They argue that it doesn't reveal anything because its premise is so unrealistic
03:41
that study participants don't take it seriously.
03:45
But new technology is making this kind of ethical analysis
03:48
more important than ever.
03:50
For example, driverless cars may have to handle choices
03:54
like causing a small accident to prevent a larger one.
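As a concrete illustration of the kind of choice just described, here is a minimal, hypothetical sketch of a vehicle ranking maneuvers by expected harm. The maneuver names, probabilities, and severity scores are invented; deciding how to assign such numbers in practice is exactly the ethical problem in question.

```python
# Hypothetical sketch: rank evasive maneuvers by expected harm,
# i.e. probability of a crash times its severity. All numbers invented.

maneuvers = {
    "brake in lane":      {"p_crash": 0.9, "severity": 2},  # likely, but minor
    "swerve to shoulder": {"p_crash": 0.3, "severity": 8},  # unlikely, but severe
}

def expected_harm(m):
    return m["p_crash"] * m["severity"]

best = min(maneuvers, key=lambda name: expected_harm(maneuvers[name]))
print(best)  # -> "brake in lane" (expected harm 1.8 vs 2.4)
```

As with the trolley problem itself, the hard part is not the comparison but who sets the scores and whether a single scale of harm is even coherent.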
03:58
Meanwhile, governments are researching autonomous military drones
04:01
that could wind up making decisions of whether they'll risk civilian casualties
04:05
to attack a high-value target.
04:09
If we want these actions to be ethical,
04:11
we have to decide in advance how to value human life
04:15
and judge the greater good.
04:17
So researchers who study autonomous systems
04:20
are collaborating with philosophers
04:22
to address the complex problem of programming ethics into machines,
04:27
which goes to show that even hypothetical dilemmas
04:30
can wind up on a collision course with the real world.