The ethical dilemma of self-driving cars - Patrick Lin

2,093,653 views · 2015-12-08

TED-Ed



ืžืชืจื’ื: Ido Dekkers ืžื‘ืงืจ: Tal Dekkers
00:07
This is a thought experiment.
00:09
Let's say at some point in the not so distant future,
00:11
you're barreling down the highway in your self-driving car,
00:15
and you find yourself boxed in on all sides by other cars.
00:19
Suddenly, a large, heavy object falls off the truck in front of you.
00:24
Your car can't stop in time to avoid the collision,
00:27
so it needs to make a decision:
00:29
go straight and hit the object,
00:31
swerve left into an SUV,
00:33
or swerve right into a motorcycle.
00:36
Should it prioritize your safety by hitting the motorcycle,
00:40
minimize danger to others by not swerving,
00:43
even if it means hitting the large object and sacrificing your life,
00:47
or take the middle ground by hitting the SUV,
00:50
which has a high passenger safety rating?
00:53
So what should the self-driving car do?
00:56
If we were driving that boxed-in car in manual mode,
00:59
whichever way we'd react would be understood as just that,
01:03
a reaction,
01:04
not a deliberate decision.
01:06
It would be an instinctual panicked move with no forethought or malice.
01:10
But if a programmer were to instruct the car to make the same move,
01:14
given conditions it may sense in the future,
01:17
well, that looks more like premeditated homicide.
01:21
Now, to be fair,
01:22
self-driving cars are predicted to dramatically reduce traffic accidents
01:26
and fatalities
01:27
by removing human error from the driving equation.
01:31
Plus, there may be all sorts of other benefits:
01:33
eased road congestion,
01:35
decreased harmful emissions,
01:36
and minimized unproductive and stressful driving time.
01:41
But accidents can and will still happen,
01:43
and when they do,
01:44
their outcomes may be determined months or years in advance
01:49
by programmers or policy makers.
01:51
And they'll have some difficult decisions to make.
01:54
It's tempting to offer up general decision-making principles,
01:57
like minimize harm,
01:59
but even that quickly leads to morally murky decisions.
02:02
For example,
02:03
let's say we have the same initial setup,
02:05
but now there's a motorcyclist wearing a helmet to your left
02:08
and another one without a helmet to your right.
02:11
Which one should your robot car crash into?
02:14
If you say the biker with the helmet because she's more likely to survive,
02:18
then aren't you penalizing the responsible motorist?
02:21
If, instead, you save the biker without the helmet
02:24
because he's acting irresponsibly,
02:26
then you've gone way beyond the initial design principle about minimizing harm,
02:30
and the robot car is now meting out street justice.
02:34
The ethical considerations get more complicated here.
02:38
In both of our scenarios,
02:39
the underlying design is functioning as a targeting algorithm of sorts.
02:44
In other words,
02:45
it's systematically favoring or discriminating
02:47
against a certain type of object to crash into.
02:51
And the owners of the target vehicles
02:53
will suffer the negative consequences of this algorithm
02:56
through no fault of their own.
02:58
Our new technologies are opening up many other novel ethical dilemmas.
03:03
For instance, if you had to choose between
03:05
a car that would always save as many lives as possible in an accident,
03:09
or one that would save you at any cost,
03:12
which would you buy?
03:14
What happens if the cars start analyzing and factoring in
03:17
the passengers of the cars and the particulars of their lives?
03:21
Could it be the case that a random decision
03:23
is still better than a predetermined one designed to minimize harm?
03:28
And who should be making all of these decisions anyhow?
03:30
Programmers? Companies? Governments?
03:34
Reality may not play out exactly like our thought experiments,
03:37
but that's not the point.
03:39
They're designed to isolate and stress test our intuitions on ethics,
03:43
just like science experiments do for the physical world.
03:46
Spotting these moral hairpin turns now
03:49
will help us maneuver the unfamiliar road of technology ethics,
03:53
and allow us to cruise confidently and conscientiously
03:57
into our brave new future.
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7