The ethical dilemma of self-driving cars - Patrick Lin

TED-Ed


00:07
This is a thought experiment. Let's say at some point in the not so distant future, you're barreling down the highway in your self-driving car, and you find yourself boxed in on all sides by other cars. Suddenly, a large, heavy object falls off the truck in front of you. Your car can't stop in time to avoid the collision, so it needs to make a decision: go straight and hit the object, swerve left into an SUV, or swerve right into a motorcycle. Should it prioritize your safety by hitting the motorcycle, minimize danger to others by not swerving, even if it means hitting the large object and sacrificing your life, or take the middle ground by hitting the SUV, which has a high passenger safety rating? So what should the self-driving car do?
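
A minimal sketch, in Python, of the point the scenario is driving at: whichever option the car takes, some objective has to be written down before the crash ever happens. The options, risk numbers, and the occupant_weight knob below are all invented for illustration; nothing here comes from the talk or from any real vehicle's software.

```python
# Hypothetical sketch: the swerve choice as an explicit, pre-programmed ranking.
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    risk_to_occupant: float  # invented 0..1 estimate of harm to you
    risk_to_others: float    # invented 0..1 estimate of harm to others

OPTIONS = [
    Option("go straight into the object",      0.9, 0.0),
    Option("swerve left into the SUV",         0.3, 0.4),
    Option("swerve right into the motorcycle", 0.1, 0.9),
]

def choose(options, occupant_weight):
    """Pick the option with the lowest weighted harm.

    occupant_weight is the policy knob: 1.0 means protect the occupant at
    any cost, 0.5 means the occupant and everyone else count equally.
    Someone has to choose this number in advance.
    """
    def weighted_harm(o):
        return (occupant_weight * o.risk_to_occupant
                + (1 - occupant_weight) * o.risk_to_others)
    return min(options, key=weighted_harm)

print(choose(OPTIONS, 1.0).name)  # with these made-up numbers: the motorcycle
print(choose(OPTIONS, 0.5).name)  # with these made-up numbers: the SUV
```

Whatever the "right" answer is, the sketch makes the talk's point concrete: the preference is not an accident of the moment, it is a parameter someone set ahead of time.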

00:56
If we were driving that boxed-in car in manual mode, whichever way we'd react would be understood as just that, a reaction, not a deliberate decision. It would be an instinctual, panicked move with no forethought or malice. But if a programmer were to instruct the car to make the same move, given conditions it may sense in the future, well, that looks more like premeditated homicide.

01:21
Now, to be fair, self-driving cars are predicted to dramatically reduce traffic accidents and fatalities by removing human error from the driving equation. Plus, there may be all sorts of other benefits: eased road congestion, decreased harmful emissions, and minimized unproductive and stressful driving time. But accidents can and will still happen, and when they do, their outcomes may be determined months or years in advance by programmers or policy makers. And they'll have some difficult decisions to make.

01:54
It's tempting to offer up general decision-making principles, like minimize harm, but even that quickly leads to morally murky decisions. For example, let's say we have the same initial setup, but now there's a motorcyclist wearing a helmet to your left and another one without a helmet to your right. Which one should your robot car crash into? If you say the biker with the helmet because she's more likely to survive, then aren't you penalizing the responsible motorist? If, instead, you save the biker without the helmet because he's acting irresponsibly, then you've gone way beyond the initial design principle about minimizing harm, and the robot car is now meting out street justice.
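
To make the "street justice" worry concrete, here is a tiny, purely hypothetical sketch of a bare minimize-expected-harm rule applied to the two-motorcyclist setup. The survival probabilities are invented for the example and are not from the talk.

```python
# Hypothetical sketch: a naive "minimize expected harm" target selection.
riders = {
    "left, wearing a helmet": 0.85,  # invented probability of surviving a hit
    "right, no helmet":       0.40,
}

def least_harmful_target(riders):
    # Expected harm of hitting a rider = chance they do not survive.
    return min(riders, key=lambda r: 1 - riders[r])

print(least_harmful_target(riders))
# -> "left, wearing a helmet": the rule singles out the rider who took the
#    precaution, which is exactly the penalty described above. Swapping in a
#    "punish the irresponsible rider" rule instead goes beyond harm
#    minimization entirely, which is the street-justice problem.
```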

02:34
The ethical considerations get more complicated here. In both of our scenarios, the underlying design is functioning as a targeting algorithm of sorts. In other words, it's systematically favoring or discriminating against a certain type of object to crash into. And the owners of the target vehicles will suffer the negative consequences of this algorithm through no fault of their own.

02:58
Our new technologies are opening up many other novel ethical dilemmas. For instance, if you had to choose between a car that would always save as many lives as possible in an accident, or one that would save you at any cost, which would you buy? What happens if the cars start analyzing and factoring in the passengers of the cars and the particulars of their lives? Could it be the case that a random decision is still better than a predetermined one designed to minimize harm? And who should be making all of these decisions anyhow? Programmers? Companies? Governments?

03:34
Reality may not play out exactly like our thought experiments, but that's not the point. They're designed to isolate and stress test our intuitions on ethics, just like science experiments do for the physical world. Spotting these moral hairpin turns now will help us maneuver the unfamiliar road of technology ethics, and allow us to cruise confidently and conscientiously into our brave new future.