What moral decisions should driverless cars make? | Iyad Rahwan

104,087 views ・ 2017-09-08

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Moonjeong Kang   Reviewer: Jihyeon J. Kim
00:12 Today I'm going to talk about technology and society.
00:18 The Department of Transport estimated that last year
00:22 35,000 people died from traffic crashes in the US alone.
00:27 Worldwide, 1.2 million people die every year in traffic accidents.
00:33 If there was a way we could eliminate 90 percent of those accidents,
00:37 would you support it?
00:39 Of course you would.
00:40 This is what driverless car technology promises to achieve
00:44 by eliminating the main source of accidents --
00:47 human error.
00:49 Now picture yourself in a driverless car in the year 2030,
00:55 sitting back and watching this vintage TEDxCambridge video.
00:58 (Laughter)
01:01 All of a sudden,
01:02 the car experiences mechanical failure and is unable to stop.
01:07 If the car continues,
01:09 it will crash into a bunch of pedestrians crossing the street,
01:14 but the car may swerve,
01:17 hitting one bystander,
01:18 killing them to save the pedestrians.
01:21 What should the car do, and who should decide?
01:25 What if instead the car could swerve into a wall,
01:28 crashing and killing you, the passenger,
01:32 in order to save those pedestrians?
01:35 This scenario is inspired by the trolley problem,
01:38 which was invented by philosophers a few decades ago
01:42 to think about ethics.
01:45 Now, the way we think about this problem matters.
01:48 We may for example not think about it at all.
01:51 We may say this scenario is unrealistic,
01:54 incredibly unlikely, or just silly.
01:57 But I think this criticism misses the point
02:00 because it takes the scenario too literally.
02:03 Of course no accident is going to look like this;
02:06 no accident has two or three options
02:09 where everybody dies somehow.
02:13 Instead, the car is going to calculate something
02:15 like the probability of hitting a certain group of people,
02:20 if you swerve one direction versus another direction,
02:24 you might slightly increase the risk to passengers or other drivers
02:27 versus pedestrians.
02:29 It's going to be a more complex calculation,
02:32 but it's still going to involve trade-offs,
02:35 and trade-offs often require ethics.
02:39 We might say then, "Well, let's not worry about this.
02:42 Let's wait until technology is fully ready and 100 percent safe."
02:48 Suppose that we can indeed eliminate 90 percent of those accidents,
02:52 or even 99 percent in the next 10 years.
02:56 What if eliminating the last one percent of accidents
02:59 requires 50 more years of research?
03:04 Should we not adopt the technology?
03:06 That's 60 million people dead in car accidents
03:11 if we maintain the current rate.
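
The 60 million figure appears to follow from the worldwide toll quoted earlier, projected over that hypothetical 50-year wait for the last one percent:

    1.2 million deaths/year × 50 years = 60 million deaths
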
03:14 So the point is,
03:15 waiting for full safety is also a choice,
03:19 and it also involves trade-offs.
03:23 People online on social media have been coming up with all sorts of ways
03:27 to not think about this problem.
03:29 One person suggested the car should just swerve somehow
03:33 in between the passengers --
03:35 (Laughter)
03:36 and the bystander.
03:37 Of course if that's what the car can do, that's what the car should do.
03:41 We're interested in scenarios in which this is not possible.
03:45 And my personal favorite was a suggestion by a blogger
03:50 to have an eject button in the car that you press --
03:53 (Laughter)
03:54 just before the car self-destructs.
03:56 (Laughter)
03:59 So if we acknowledge that cars will have to make trade-offs on the road,
04:06 how do we think about those trade-offs,
04:09 and how do we decide?
04:10 Well, maybe we should run a survey to find out what society wants,
04:13 because ultimately,
04:15 regulations and the law are a reflection of societal values.
04:19 So this is what we did.
04:21 With my collaborators,
04:23 Jean-François Bonnefon and Azim Shariff,
04:25 we ran a survey
04:27 in which we presented people with these types of scenarios.
04:30 We gave them two options inspired by two philosophers:
04:34 Jeremy Bentham and Immanuel Kant.
04:37 Bentham says the car should follow utilitarian ethics:
04:40 it should take the action that will minimize total harm --
04:43 even if that action will kill a bystander
04:46 and even if that action will kill the passenger.
04:49 Immanuel Kant says the car should follow duty-bound principles,
04:54 like "Thou shalt not kill."
04:57 So you should not take an action that explicitly harms a human being,
05:01 and you should let the car take its course
05:04 even if that's going to harm more people.
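
One way to see the contrast is as two toy decision rules. The sketch below is only a loose paraphrase of how the talk frames Bentham and Kant; the scenario fields (`expected_deaths`, `actively_harms_someone`) are invented for illustration and follow the talk's framing that letting the car take its course does not count as actively harming anyone.

```python
# Two toy decision rules for the same dilemma -- purely illustrative.
actions = [
    {"name": "stay course", "expected_deaths": 4, "actively_harms_someone": False},
    {"name": "swerve",      "expected_deaths": 1, "actively_harms_someone": True},
]

def bentham_choice(actions):
    """Utilitarian rule: take the action that minimizes expected total harm."""
    return min(actions, key=lambda a: a["expected_deaths"])

def kant_choice(actions):
    """Duty-bound rule: never actively choose to harm someone; let the car
    take its course even if the aggregate outcome is worse."""
    return next(a for a in actions if not a["actively_harms_someone"])

print(bentham_choice(actions)["name"])  # swerve
print(kant_choice(actions)["name"])     # stay course
```
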
05:07 What do you think?
05:09 Bentham or Kant?
05:11 Here's what we found.
05:12 Most people sided with Bentham.
05:15 So it seems that people want cars to be utilitarian,
05:19 minimize total harm,
05:21 and that's what we should all do.
05:22 Problem solved.
05:25 But there is a little catch.
05:27 When we asked people whether they would purchase such cars,
05:31 they said, "Absolutely not."
05:33 (Laughter)
05:35 They would like to buy cars that protect them at all costs,
05:39 but they want everybody else to buy cars that minimize harm.
05:43 (Laughter)
05:46 We've seen this problem before.
05:48 It's called a social dilemma.
05:50 And to understand the social dilemma,
05:52 we have to go a little bit back in history.
05:55 In the 1800s,
05:58 English economist William Forster Lloyd published a pamphlet
06:02 which describes the following scenario.
06:04 You have a group of farmers --
06:06 English farmers --
06:07 who are sharing a common land for their sheep to graze.
06:11 Now, if each farmer brings a certain number of sheep --
06:13 let's say three sheep --
06:15 the land will be rejuvenated,
06:17 the farmers are happy,
06:18 the sheep are happy,
06:20 everything is good.
06:22 Now, if one farmer brings one extra sheep,
06:25 that farmer will do slightly better, and no one else will be harmed.
06:30 But if every farmer made that individually rational decision,
06:35 the land will be overrun, and it will be depleted
06:39 to the detriment of all the farmers,
06:41 and of course, to the detriment of the sheep.
06:44 We see this problem in many places:
06:48 in the difficulty of managing overfishing,
06:52 or in reducing carbon emissions to mitigate climate change.
06:58 When it comes to the regulation of driverless cars,
07:02 the common land now is basically public safety --
07:07 that's the common good --
07:09 and the farmers are the passengers
07:11 or the car owners who are choosing to ride in those cars.
07:16 And by making the individually rational choice
07:19 of prioritizing their own safety,
07:22 they may collectively be diminishing the common good,
07:25 which is minimizing total harm.
07:30 It's called the tragedy of the commons,
07:32 traditionally,
07:33 but I think in the case of driverless cars,
07:36 the problem may be a little bit more insidious
07:39 because there is not necessarily an individual human being
07:43 making those decisions.
07:44 So car manufacturers may simply program cars
07:48 that will maximize safety for their clients,
07:51 and those cars may learn automatically on their own
07:54 that doing so requires slightly increasing risk for pedestrians.
07:59 So to use the sheep metaphor,
08:00 it's like we now have electric sheep that have a mind of their own.
08:04 (Laughter)
08:05 And they may go and graze even if the farmer doesn't know it.
08:10 So this is what we may call the tragedy of the algorithmic commons,
08:14 and it offers new types of challenges.
08:22 Typically, traditionally,
08:24 we solve these types of social dilemmas using regulation,
08:27 so either governments or communities get together,
08:30 and they decide collectively what kind of outcome they want
08:34 and what sort of constraints on individual behavior
08:36 they need to implement.
08:39 And then using monitoring and enforcement,
08:42 they can make sure that the public good is preserved.
08:45 So why don't we just,
08:46 as regulators,
08:48 require that all cars minimize harm?
08:51 After all, this is what people say they want.
08:55 And more importantly,
08:56 I can be sure that as an individual,
08:59 if I buy a car that may sacrifice me in a very rare case,
09:03 I'm not the only sucker doing that
09:05 while everybody else enjoys unconditional protection.
09:08 In our survey, we did ask people whether they would support regulation
09:12 and here's what we found.
09:14 First of all, people said no to regulation;
09:19 and second, they said,
09:20 "Well if you regulate cars to do this and to minimize total harm,
09:24 I will not buy those cars."
09:27 So ironically,
09:28 by regulating cars to minimize harm,
09:32 we may actually end up with more harm
09:34 because people may not opt into the safer technology
09:38 even if it's much safer than human drivers.
09:42 I don't have the final answer to this riddle,
09:45 but I think as a starting point,
09:47 we need society to come together
09:50 to decide what trade-offs we are comfortable with
09:54 and to come up with ways in which we can enforce those trade-offs.
09:58 As a starting point, my brilliant students,
10:00 Edmond Awad and Sohan Dsouza,
10:03 built the Moral Machine website,
10:06 which generates random scenarios at you --
10:09 basically a bunch of random dilemmas in a sequence
10:12 where you have to choose what the car should do in a given scenario.
10:16 And we vary the ages and even the species of the different victims.
10:22 So far we've collected over five million decisions
10:26 by over one million people worldwide
10:30 from the website.
10:32 And this is helping us form an early picture
10:34 of what trade-offs people are comfortable with
10:37 and what matters to them --
10:39 even across cultures.
10:42 But more importantly,
10:43 doing this exercise is helping people recognize
10:46 the difficulty of making those choices
10:49 and that the regulators are tasked with impossible choices.
10:55 And maybe this will help us as a society understand the kinds of trade-offs
10:58 that will be implemented ultimately in regulation.
11:01 And indeed, I was very happy to hear
11:03 that the first set of regulations
11:05 that came from the Department of Transport --
11:07 announced last week --
11:09 included a 15-point checklist for all carmakers to provide,
11:15 and number 14 was ethical consideration --
11:19 how are you going to deal with that.
11:23 We also have people reflect on their own decisions
11:26 by giving them summaries of what they chose.
11:30 I'll give you one example --
11:31 I'm just going to warn you that this is not your typical example,
11:35 your typical user.
11:36 This is the most sacrificed and the most saved character for this person.
11:40 (Laughter)
11:46 Some of you may agree with him,
11:48 or her, we don't know.
11:52 But this person also seems to slightly prefer passengers over pedestrians
11:58 in their choices
12:00 and is very happy to punish jaywalking.
12:03 (Laughter)
12:09 So let's wrap up.
12:10 We started with the question -- let's call it the ethical dilemma --
12:13 of what the car should do in a specific scenario:
12:16 swerve or stay?
12:19 But then we realized that the problem was a different one.
12:21 It was the problem of how to get society to agree on and enforce
12:26 the trade-offs they're comfortable with.
12:28 It's a social dilemma.
12:29 In the 1940s, Isaac Asimov wrote his famous laws of robotics --
12:34 the three laws of robotics.
12:37 A robot may not harm a human being,
12:39 a robot may not disobey a human being,
12:42 and a robot may not allow itself to come to harm --
12:45 in this order of importance.
12:48 But after 40 years or so
12:50 and after so many stories pushing these laws to the limit,
12:54 Asimov introduced the zeroth law
12:57 which takes precedence above all,
13:00 and it's that a robot may not harm humanity as a whole.
13:04 I don't know what this means in the context of driverless cars
13:08 or any specific situation,
13:11 and I don't know how we can implement it,
13:13 but I think that by recognizing
13:15 that the regulation of driverless cars is not only a technological problem
13:21 but also a societal cooperation problem,
13:25 I hope that we can at least begin to ask the right questions.
13:29 Thank you.
13:30 (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7