What moral decisions should driverless cars make? | Iyad Rahwan

108,849 views ・ 2017-09-08

TED



00:12
Today I'm going to talk about technology and society.

00:18
The Department of Transport estimated that last year 35,000 people died from traffic crashes in the US alone. Worldwide, 1.2 million people die every year in traffic accidents. If there was a way we could eliminate 90 percent of those accidents, would you support it? Of course you would. This is what driverless car technology promises to achieve by eliminating the main source of accidents -- human error.

00:49
Now picture yourself in a driverless car in the year 2030, sitting back and watching this vintage TEDxCambridge video. (Laughter) All of a sudden, the car experiences mechanical failure and is unable to stop. If the car continues, it will crash into a bunch of pedestrians crossing the street, but the car may swerve, hitting one bystander, killing them to save the pedestrians. What should the car do, and who should decide? What if instead the car could swerve into a wall, crashing and killing you, the passenger, in order to save those pedestrians? This scenario is inspired by the trolley problem, which was invented by philosophers a few decades ago to think about ethics.

01:45
Now, the way we think about this problem matters. We may for example not think about it at all. We may say this scenario is unrealistic, incredibly unlikely, or just silly. But I think this criticism misses the point because it takes the scenario too literally. Of course no accident is going to look like this; no accident has two or three options where everybody dies somehow. Instead, the car is going to calculate something like the probability of hitting a certain group of people, if you swerve one direction versus another direction, you might slightly increase the risk to passengers or other drivers versus pedestrians. It's going to be a more complex calculation, but it's still going to involve trade-offs, and trade-offs often require ethics.
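
To make that kind of calculation concrete, here is a minimal sketch of a probability-weighted harm comparison; the maneuvers, probabilities, and group sizes are invented for illustration and are not taken from any real system.

```python
# A minimal sketch of a probability-weighted harm comparison.
# The maneuvers, probabilities, and group sizes are invented for illustration.

def expected_harm(outcomes):
    """Sum of (probability of impact) x (number of people at risk)."""
    return sum(p_hit * group_size for p_hit, group_size in outcomes)

# Each maneuver maps to a list of (probability of impact, people at risk).
maneuvers = {
    "stay_course":  [(0.8, 3)],            # likely hits three pedestrians
    "swerve_left":  [(0.5, 1), (0.1, 1)],  # may hit one bystander, small risk to the passenger
    "swerve_right": [(0.9, 1)],            # almost certainly harms the passenger
}

for name, outcomes in maneuvers.items():
    print(name, round(expected_harm(outcomes), 2))

# Picking the smallest number already encodes an ethical judgement:
# it trades one group's risk against another's.
```
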
02:39
We might say then, "Well, let's not worry about this. Let's wait until technology is fully ready and 100 percent safe." Suppose that we can indeed eliminate 90 percent of those accidents, or even 99 percent in the next 10 years. What if eliminating the last one percent of accidents requires 50 more years of research? Should we not adopt the technology? That's 60 million people dead in car accidents if we maintain the current rate.
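
A quick back-of-the-envelope check of that figure, using the worldwide toll quoted at the start of the talk: 1.2 million deaths per year × 50 years = 60 million deaths.
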
03:14
So the point is, waiting for full safety is also a choice, and it also involves trade-offs.

03:23
People online on social media have been coming up with all sorts of ways to not think about this problem. One person suggested the car should just swerve somehow in between the passengers -- (Laughter) and the bystander. Of course if that's what the car can do, that's what the car should do. We're interested in scenarios in which this is not possible. And my personal favorite was a suggestion by a blogger to have an eject button in the car that you press -- (Laughter) just before the car self-destructs. (Laughter)

03:59
So if we acknowledge that cars will have to make trade-offs on the road, how do we think about those trade-offs, and how do we decide?

04:10
Well, maybe we should run a survey to find out what society wants, because ultimately, regulations and the law are a reflection of societal values. So this is what we did. With my collaborators, Jean-François Bonnefon and Azim Shariff, we ran a survey in which we presented people with these types of scenarios. We gave them two options inspired by two philosophers: Jeremy Bentham and Immanuel Kant.

04:37
Bentham says the car should follow utilitarian ethics: it should take the action that will minimize total harm -- even if that action will kill a bystander and even if that action will kill the passenger. Immanuel Kant says the car should follow duty-bound principles, like "Thou shalt not kill." So you should not take an action that explicitly harms a human being, and you should let the car take its course even if that's going to harm more people.
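
Read as decision rules for the car, the two positions can be sketched roughly as follows; the options and casualty counts are hypothetical, and a real system would only have uncertain estimates to work with.

```python
# The two rules as a rough sketch. Options and casualty counts are hypothetical.

options = [
    {"name": "stay course",         "expected_deaths": 3, "car_actively_strikes": False},
    {"name": "swerve into wall",    "expected_deaths": 1, "car_actively_strikes": True},  # kills the passenger
    {"name": "swerve to bystander", "expected_deaths": 1, "car_actively_strikes": True},
]

def bentham_choice(options):
    """Utilitarian rule: take whichever action minimizes total harm."""
    return min(options, key=lambda o: o["expected_deaths"])

def kant_choice(options):
    """Duty-bound rule: never take an action that explicitly harms someone;
    otherwise let the car take its course."""
    passive = [o for o in options if not o["car_actively_strikes"]]
    return passive[0] if passive else options[0]

print("Bentham picks:", bentham_choice(options)["name"])  # swerve into wall (1 expected death vs 3)
print("Kant picks:   ", kant_choice(options)["name"])     # stay course, even though more people are harmed
```
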
05:07
What do you think? Bentham or Kant? Here's what we found. Most people sided with Bentham. So it seems that people want cars to be utilitarian, minimize total harm, and that's what we should all do. Problem solved.

05:25
But there is a little catch. When we asked people whether they would purchase such cars, they said, "Absolutely not." (Laughter) They would like to buy cars that protect them at all costs, but they want everybody else to buy cars that minimize harm. (Laughter)

05:46
We've seen this problem before. It's called a social dilemma. And to understand the social dilemma, we have to go a little bit back in history.

05:55
In the 1800s, English economist William Forster Lloyd published a pamphlet which describes the following scenario. You have a group of farmers -- English farmers -- who are sharing a common land for their sheep to graze. Now, if each farmer brings a certain number of sheep -- let's say three sheep -- the land will be rejuvenated, the farmers are happy, the sheep are happy, everything is good. Now, if one farmer brings one extra sheep, that farmer will do slightly better, and no one else will be harmed. But if every farmer made that individually rational decision, the land will be overrun, and it will be depleted to the detriment of all the farmers, and of course, to the detriment of the sheep.
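
A toy version of Lloyd's scenario shows the incentive structure; the herd sizes, capacity, and payoff function below are invented purely for illustration.

```python
# A toy version of Lloyd's grazing commons. Herd sizes, capacity and the
# payoff function are invented purely to show the incentive structure.

NUM_FARMERS = 10
CAPACITY = 30  # the commons comfortably supports 3 sheep per farmer

def payoff_per_sheep(total_sheep):
    """Grass per sheep falls off sharply once the commons is over capacity."""
    return 1.0 if total_sheep <= CAPACITY else (CAPACITY / total_sheep) ** 2

def farmer_income(own_sheep, total_sheep):
    return own_sheep * payoff_per_sheep(total_sheep)

# Everyone keeps three sheep: each farmer earns 3.0.
print(farmer_income(3, 3 * NUM_FARMERS))

# One farmer sneaks in a fourth sheep: that farmer earns ~3.75
# while the others lose only a few percent.
print(farmer_income(4, 3 * NUM_FARMERS + 1))

# Every farmer makes the same individually rational choice: each now earns
# only 2.25, and the land carries 40 sheep it cannot sustain.
print(farmer_income(4, 4 * NUM_FARMERS))
```
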
06:44
We see this problem in many places: in the difficulty of managing overfishing, or in reducing carbon emissions to mitigate climate change.

06:58
When it comes to the regulation of driverless cars, the common land now is basically public safety -- that's the common good -- and the farmers are the passengers or the car owners who are choosing to ride in those cars. And by making the individually rational choice of prioritizing their own safety, they may collectively be diminishing the common good, which is minimizing total harm.

07:30
It's called the tragedy of the commons, traditionally, but I think in the case of driverless cars, the problem may be a little bit more insidious because there is not necessarily an individual human being making those decisions. So car manufacturers may simply program cars that will maximize safety for their clients, and those cars may learn automatically on their own that doing so requires slightly increasing risk for pedestrians.
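
One way to picture how that drift could happen without anyone writing an explicit rule: if the objective a car optimizes weights its own occupants more heavily than other road users, then minimizing that objective quietly shifts risk outward. The weights and risk numbers below are invented, schematic values.

```python
# Schematic only: an objective that weights the client's safety more heavily
# can shift risk toward pedestrians without any explicit rule saying so.
# Weights and risk numbers are invented.

PASSENGER_WEIGHT = 2.0   # the manufacturer optimizes hardest for its own clients
PEDESTRIAN_WEIGHT = 1.0

def weighted_risk(passenger_risk, pedestrian_risk):
    return PASSENGER_WEIGHT * passenger_risk + PEDESTRIAN_WEIGHT * pedestrian_risk

# Two candidate driving policies as (passenger risk, pedestrian risk):
policies = {
    "cautious":   (0.04, 0.02),  # the car keeps slightly more of the risk
    "aggressive": (0.02, 0.05),  # the risk is pushed onto pedestrians
}

for name, (p_risk, ped_risk) in policies.items():
    print(name, round(weighted_risk(p_risk, ped_risk), 2),
          "total harm:", round(p_risk + ped_risk, 2))

# The weighted objective prefers "aggressive" (0.09 < 0.10) even though its
# total harm is higher (0.07 vs 0.06) -- risk has quietly moved to pedestrians.
```
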
07:59
So to use the sheep metaphor, it's like we now have electric sheep that have a mind of their own. (Laughter) And they may go and graze even if the farmer doesn't know it. So this is what we may call the tragedy of the algorithmic commons, and it offers new types of challenges.

08:22
Typically, traditionally, we solve these types of social dilemmas using regulation, so either governments or communities get together, and they decide collectively what kind of outcome they want and what sort of constraints on individual behavior they need to implement. And then using monitoring and enforcement, they can make sure that the public good is preserved. So why don't we just, as regulators, require that all cars minimize harm? After all, this is what people say they want. And more importantly, I can be sure that as an individual, if I buy a car that may sacrifice me in a very rare case, I'm not the only sucker doing that while everybody else enjoys unconditional protection.

09:08
In our survey, we did ask people whether they would support regulation and here's what we found. First of all, people said no to regulation; and second, they said, "Well if you regulate cars to do this and to minimize total harm, I will not buy those cars." So ironically, by regulating cars to minimize harm, we may actually end up with more harm because people may not opt into the safer technology even if it's much safer than human drivers.

09:42
I don't have the final answer to this riddle, but I think as a starting point, we need society to come together to decide what trade-offs we are comfortable with and to come up with ways in which we can enforce those trade-offs.

09:58
As a starting point, my brilliant students, Edmond Awad and Sohan Dsouza, built the Moral Machine website, which generates random scenarios at you -- basically a bunch of random dilemmas in a sequence where you have to choose what the car should do in a given scenario. And we vary the ages and even the species of the different victims.
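
The actual Moral Machine implementation is not shown in the talk; the sketch below is only meant to suggest what generating a sequence of randomized dilemmas with varying victims might look like, with made-up categories.

```python
# Not the Moral Machine's actual code -- just a sketch of what generating a
# sequence of randomized dilemmas might look like. The categories are made up.

import random

AGES = ["child", "adult", "elderly"]
SPECIES = ["human", "dog", "cat"]

def random_group(max_size=4):
    """A random set of potential victims, varying age and species."""
    return [{"age": random.choice(AGES), "species": random.choice(SPECIES)}
            for _ in range(random.randint(1, max_size))]

def random_dilemma():
    """One forced choice: which group does the car put at risk?"""
    return {"stay_course": random_group(), "swerve": random_group()}

# A session is a sequence of dilemmas; the site records one choice per dilemma.
session = [random_dilemma() for _ in range(13)]
for dilemma in session[:2]:  # peek at the first two
    print(dilemma)
```
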
10:22
So far we've collected over five million decisions by over one million people worldwide from the website. And this is helping us form an early picture of what trade-offs people are comfortable with and what matters to them -- even across cultures. But more importantly, doing this exercise is helping people recognize the difficulty of making those choices and that the regulators are tasked with impossible choices. And maybe this will help us as a society understand the kinds of trade-offs that will be implemented ultimately in regulation.

11:01
And indeed, I was very happy to hear that the first set of regulations that came from the Department of Transport -- announced last week -- included a 15-point checklist for all carmakers to provide, and number 14 was ethical consideration -- how are you going to deal with that.

11:23
We also have people reflect on their own decisions by giving them summaries of what they chose.
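
Such a summary could be as simple as tallying which character types a user chose to save or sacrifice most often; the sketch below uses invented decisions, not real Moral Machine data.

```python
# A sketch of a per-user summary: tally which character types this user
# saved and sacrificed most often. The decisions below are invented.

from collections import Counter

decisions = [
    {"saved": ["adult", "child"], "sacrificed": ["cat"]},
    {"saved": ["child"],          "sacrificed": ["elderly", "dog"]},
    {"saved": ["adult"],          "sacrificed": ["cat", "cat"]},
]

saved = Counter(c for d in decisions for c in d["saved"])
sacrificed = Counter(c for d in decisions for c in d["sacrificed"])

print("most saved character:     ", saved.most_common(1)[0])       # ('adult', 2)
print("most sacrificed character:", sacrificed.most_common(1)[0])  # ('cat', 3)
```
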
11:30
I'll give you one example -- I'm just going to warn you that this is not your typical example, your typical user. This is the most sacrificed and the most saved character for this person. (Laughter) Some of you may agree with him, or her, we don't know. But this person also seems to slightly prefer passengers over pedestrians in their choices and is very happy to punish jaywalking. (Laughter)

12:09
So let's wrap up. We started with the question -- let's call it the ethical dilemma -- of what the car should do in a specific scenario: swerve or stay? But then we realized that the problem was a different one. It was the problem of how to get society to agree on and enforce the trade-offs they're comfortable with. It's a social dilemma.

12:29
In the 1940s, Isaac Asimov wrote his famous laws of robotics -- the three laws of robotics. A robot may not harm a human being, a robot may not disobey a human being, and a robot may not allow itself to come to harm -- in this order of importance. But after 40 years or so and after so many stories pushing these laws to the limit, Asimov introduced the zeroth law which takes precedence above all, and it's that a robot may not harm humanity as a whole. I don't know what this means in the context of driverless cars or any specific situation, and I don't know how we can implement it, but I think that by recognizing that the regulation of driverless cars is not only a technological problem but also a societal cooperation problem, I hope that we can at least begin to ask the right questions.

13:29
Thank you. (Applause)