What moral decisions should driverless cars make? | Iyad Rahwan

110,192 views · 2017-09-08

TED


ืžืชืจื’ื: Zeeva Livshitz ืžื‘ืงืจ: hila scherba
00:12
Today I'm going to talk about technology and society.
00:18
The Department of Transport estimated that last year
00:22
35,000 people died from traffic crashes in the US alone.
00:27
Worldwide, 1.2 million people die every year in traffic accidents.
00:33
If there was a way we could eliminate 90 percent of those accidents,
00:37
would you support it?
00:39
Of course you would.
00:40
This is what driverless car technology promises to achieve
00:44
by eliminating the main source of accidents --
00:47
human error.
00:49
Now picture yourself in a driverless car in the year 2030,
00:55
sitting back and watching this vintage TEDxCambridge video.
00:58
(Laughter)
01:01
All of a sudden,
01:02
the car experiences mechanical failure and is unable to stop.
01:07
If the car continues,
01:09
it will crash into a bunch of pedestrians crossing the street,
01:14
but the car may swerve,
01:17
hitting one bystander,
01:18
killing them to save the pedestrians.
01:21
What should the car do, and who should decide?
01:25
What if instead the car could swerve into a wall,
01:28
crashing and killing you, the passenger,
01:32
in order to save those pedestrians?
01:35
This scenario is inspired by the trolley problem,
01:38
which was invented by philosophers a few decades ago
01:42
to think about ethics.
01:45
Now, the way we think about this problem matters.
01:48
We may for example not think about it at all.
01:51
We may say this scenario is unrealistic,
01:54
incredibly unlikely, or just silly.
01:57
But I think this criticism misses the point
02:00
because it takes the scenario too literally.
02:03
Of course no accident is going to look like this;
02:06
no accident has two or three options
02:09
where everybody dies somehow.
02:13
Instead, the car is going to calculate something
02:15
like the probability of hitting a certain group of people,
02:20
if you swerve one direction versus another direction,
02:24
you might slightly increase the risk to passengers or other drivers
02:27
versus pedestrians.
02:29
It's going to be a more complex calculation,
02:32
but it's still going to involve trade-offs,
02:35
and trade-offs often require ethics.
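The calculation the speaker describes can be made concrete with a small sketch. The Python snippet below is purely illustrative and not from the talk: the maneuver names, collision probabilities, and group sizes are hypothetical stand-ins for what a real perception system would estimate.

```python
# Illustrative sketch only (not from the talk): scoring the expected-harm
# trade-off between maneuvers. All numbers are made up.

def expected_harm(p_collision: float, people_at_risk: int) -> float:
    """Expected number of people harmed by a maneuver."""
    return p_collision * people_at_risk

# Hypothetical estimates a car's perception stack might produce.
maneuvers = {
    "stay_in_lane": expected_harm(p_collision=0.9, people_at_risk=3),  # pedestrians ahead
    "swerve_left":  expected_harm(p_collision=0.5, people_at_risk=1),  # single bystander
    "swerve_right": expected_harm(p_collision=0.2, people_at_risk=1),  # risk to the passenger
}

# A purely harm-minimizing controller would pick the smallest score;
# the point is that some weighting of these risks is unavoidable.
best = min(maneuvers, key=maneuvers.get)
print(best, maneuvers)
```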
02:39
We might say then, "Well, let's not worry about this.
02:42
Let's wait until technology is fully ready and 100 percent safe."
02:48
Suppose that we can indeed eliminate 90 percent of those accidents,
02:52
or even 99 percent in the next 10 years.
02:56
What if eliminating the last one percent of accidents
02:59
requires 50 more years of research?
03:04
Should we not adopt the technology?
03:06
That's 60 million people dead in car accidents
03:11
if we maintain the current rate.
03:14
So the point is,
03:15
waiting for full safety is also a choice,
03:19
and it also involves trade-offs.
03:23
People online on social media have been coming up with all sorts of ways
03:27
to not think about this problem.
03:29
One person suggested the car should just swerve somehow
03:33
in between the passengers --
03:35
(Laughter)
03:36
and the bystander.
03:37
Of course if that's what the car can do, that's what the car should do.
03:41
We're interested in scenarios in which this is not possible.
03:45
And my personal favorite was a suggestion by a blogger
03:50
to have an eject button in the car that you press --
03:53
(Laughter)
03:54
just before the car self-destructs.
03:56
(Laughter)
03:59
So if we acknowledge that cars will have to make trade-offs on the road,
04:06
how do we think about those trade-offs,
04:09
and how do we decide?
04:10
Well, maybe we should run a survey to find out what society wants,
04:13
because ultimately,
04:15
regulations and the law are a reflection of societal values.
04:19
So this is what we did.
04:21
With my collaborators,
04:23
Jean-François Bonnefon and Azim Shariff,
04:25
we ran a survey
04:27
in which we presented people with these types of scenarios.
04:30
We gave them two options inspired by two philosophers:
04:34
Jeremy Bentham and Immanuel Kant.
04:37
Bentham says the car should follow utilitarian ethics:
04:40
it should take the action that will minimize total harm --
04:43
even if that action will kill a bystander
04:46
and even if that action will kill the passenger.
04:49
Immanuel Kant says the car should follow duty-bound principles,
04:54
like "Thou shalt not kill."
04:57
So you should not take an action that explicitly harms a human being,
05:01
and you should let the car take its course
05:04
even if that's going to harm more people.
05:07
What do you think?
05:09
Bentham or Kant?
05:11
Here's what we found.
05:12
Most people sided with Bentham.
05:15
So it seems that people want cars to be utilitarian,
05:19
minimize total harm,
05:21
and that's what we should all do.
05:22
Problem solved.
05:25
But there is a little catch.
05:27
When we asked people whether they would purchase such cars,
05:31
they said, "Absolutely not."
05:33
(Laughter)
05:35
They would like to buy cars that protect them at all costs,
05:39
but they want everybody else to buy cars that minimize harm.
05:43
(Laughter)
05:46
We've seen this problem before.
05:48
It's called a social dilemma.
05:50
And to understand the social dilemma,
05:52
we have to go a little bit back in history.
05:55
In the 1800s,
05:58
English economist William Forster Lloyd published a pamphlet
06:02
which describes the following scenario.
06:04
You have a group of farmers --
06:06
English farmers --
06:07
who are sharing a common land for their sheep to graze.
06:11
Now, if each farmer brings a certain number of sheep --
06:13
let's say three sheep --
06:15
the land will be rejuvenated,
06:17
the farmers are happy,
06:18
the sheep are happy,
06:20
everything is good.
06:22
Now, if one farmer brings one extra sheep,
06:25
that farmer will do slightly better, and no one else will be harmed.
06:30
But if every farmer made that individually rational decision,
06:35
the land will be overrun, and it will be depleted
06:39
to the detriment of all the farmers,
06:41
and of course, to the detriment of the sheep.
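The logic of Lloyd's example can be seen in a tiny payoff model. Everything in the sketch below is hypothetical (the yield function, the numbers, the five farmers); it only shows that adding one sheep is individually rational while universal defection leaves everyone worse off.

```python
# A tiny numeric sketch of Lloyd's commons (all numbers hypothetical):
# grass available per sheep falls as the total flock grows.

def yield_per_sheep(total_sheep: int) -> float:
    return max(0.0, 10 - 0.4 * total_sheep)

def payoff(my_sheep: int, total_sheep: int) -> float:
    return my_sheep * yield_per_sheep(total_sheep)

farmers = 5
print(payoff(3, farmers * 3))      # everyone grazes 3 sheep -> 12.0 each
print(payoff(4, farmers * 3 + 1))  # I alone add one sheep   -> 14.4 (I gain)
print(payoff(4, farmers * 4))      # everyone adds one sheep -> 8.0 each (all lose)
```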
06:44
We see this problem in many places:
06:48
in the difficulty of managing overfishing,
06:52
or in reducing carbon emissions to mitigate climate change.
06:58
When it comes to the regulation of driverless cars,
07:02
the common land now is basically public safety --
07:07
that's the common good --
07:09
and the farmers are the passengers
07:11
or the car owners who are choosing to ride in those cars.
07:16
And by making the individually rational choice
07:19
of prioritizing their own safety,
07:22
they may collectively be diminishing the common good,
07:25
which is minimizing total harm.
07:30
It's called the tragedy of the commons,
07:32
traditionally,
07:33
but I think in the case of driverless cars,
07:36
the problem may be a little bit more insidious
07:39
because there is not necessarily an individual human being
07:43
making those decisions.
07:44
So car manufacturers may simply program cars
07:48
that will maximize safety for their clients,
07:51
and those cars may learn automatically on their own
07:54
that doing so requires slightly increasing risk for pedestrians.
07:59
So to use the sheep metaphor,
08:00
it's like we now have electric sheep that have a mind of their own.
08:04
(Laughter)
08:05
And they may go and graze even if the farmer doesn't know it.
08:10
So this is what we may call the tragedy of the algorithmic commons,
08:14
and it offers new types of challenges.
08:22
Typically, traditionally,
08:24
we solve these types of social dilemmas using regulation,
08:27
so either governments or communities get together,
08:30
and they decide collectively what kind of outcome they want
08:34
and what sort of constraints on individual behavior
08:36
they need to implement.
08:39
And then using monitoring and enforcement,
08:42
they can make sure that the public good is preserved.
08:45
So why don't we just,
08:46
as regulators,
08:48
require that all cars minimize harm?
08:51
After all, this is what people say they want.
08:55
And more importantly,
08:56
I can be sure that as an individual,
08:59
if I buy a car that may sacrifice me in a very rare case,
09:03
I'm not the only sucker doing that
09:05
while everybody else enjoys unconditional protection.
09:08
In our survey, we did ask people whether they would support regulation
09:12
and here's what we found.
09:14
First of all, people said no to regulation;
09:19
and second, they said,
09:20
"Well if you regulate cars to do this and to minimize total harm,
09:24
I will not buy those cars."
09:27
So ironically,
09:28
by regulating cars to minimize harm,
09:32
we may actually end up with more harm
09:34
because people may not opt into the safer technology
09:38
even if it's much safer than human drivers.
09:42
I don't have the final answer to this riddle,
09:45
but I think as a starting point,
09:47
we need society to come together
09:50
to decide what trade-offs we are comfortable with
09:54
and to come up with ways in which we can enforce those trade-offs.
09:58
As a starting point, my brilliant students,
10:00
Edmond Awad and Sohan Dsouza,
10:03
built the Moral Machine website,
10:06
which generates random scenarios at you --
10:09
basically a bunch of random dilemmas in a sequence
10:12
where you have to choose what the car should do in a given scenario.
10:16
And we vary the ages and even the species of the different victims.
10:22
So far we've collected over five million decisions
10:26
by over one million people worldwide
10:30
from the website.
10:32
And this is helping us form an early picture
10:34
of what trade-offs people are comfortable with
10:37
and what matters to them --
10:39
even across cultures.
10:42
But more importantly,
10:43
doing this exercise is helping people recognize
10:46
the difficulty of making those choices
10:49
and that the regulators are tasked with impossible choices.
10:55
And maybe this will help us as a society understand the kinds of trade-offs
10:58
that will be implemented ultimately in regulation.
11:01
And indeed, I was very happy to hear
11:03
that the first set of regulations
11:05
that came from the Department of Transport --
11:07
announced last week --
11:09
included a 15-point checklist for all carmakers to provide,
11:15
and number 14 was ethical consideration --
11:19
how are you going to deal with that.
11:23
We also have people reflect on their own decisions
11:26
by giving them summaries of what they chose.
11:30
I'll give you one example --
11:31
I'm just going to warn you that this is not your typical example,
11:35
your typical user.
11:36
This is the most sacrificed and the most saved character for this person.
11:40
(Laughter)
11:46
Some of you may agree with him,
11:48
or her, we don't know.
11:52
But this person also seems to slightly prefer passengers over pedestrians
11:58
in their choices
12:00
and is very happy to punish jaywalking.
12:03
(Laughter)
12:09
So let's wrap up.
12:10
We started with the question -- let's call it the ethical dilemma --
12:13
of what the car should do in a specific scenario:
12:16
swerve or stay?
12:19
But then we realized that the problem was a different one.
12:21
It was the problem of how to get society to agree on and enforce
12:26
the trade-offs they're comfortable with.
12:28
It's a social dilemma.
12:29
In the 1940s, Isaac Asimov wrote his famous laws of robotics --
12:34
the three laws of robotics.
12:37
A robot may not harm a human being,
12:39
a robot may not disobey a human being,
12:42
and a robot may not allow itself to come to harm --
12:45
in this order of importance.
12:48
But after 40 years or so
12:50
and after so many stories pushing these laws to the limit,
12:54
Asimov introduced the zeroth law
12:57
which takes precedence above all,
13:00
and it's that a robot may not harm humanity as a whole.
13:04
I don't know what this means in the context of driverless cars
13:08
or any specific situation,
13:11
and I don't know how we can implement it,
13:13
but I think that by recognizing
13:15
that the regulation of driverless cars is not only a technological problem
13:21
but also a societal cooperation problem,
13:25
I hope that we can at least begin to ask the right questions.
13:29
Thank you.
13:30
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7