Damon Horowitz calls for a "moral operating system"

95,064 views ・ 2011-06-06

TED


00:15
Power. That is the word that comes to mind. We're the new technologists. We have a lot of data, so we have a lot of power.

00:24
How much power do we have? Scene from a movie: "Apocalypse Now" -- great movie. We've got to get our hero, Captain Willard, to the mouth of the Nung River so he can go pursue Colonel Kurtz. The way we're going to do this is fly him in and drop him off. So the scene: the sky is filled with this fleet of helicopters carrying him in. And there's this loud, thrilling music in the background, this wild music.

00:45
♫ Dum da ta da dum ♫ ♫ Dum da ta da dum ♫ ♫ Da ta da da ♫

00:52
That's a lot of power. That's the kind of power I feel in this room. That's the kind of power we have because of all of the data that we have.

01:00
Let's take an example. What can we do with just one person's data? What can we do with that guy's data? I can look at your financial records. I can tell if you pay your bills on time. I know if you're good to give a loan to. I can look at your medical records; I can see if your pump is still pumping -- see if you're good to offer insurance to. I can look at your clicking patterns. When you come to my website, I actually know what you're going to do already, because I've seen you visit millions of websites before. And I'm sorry to tell you, you're like a poker player, you have a tell. I can tell with data analysis what you're going to do before you even do it. I know what you like. I know who you are, and that's even before I look at your mail or your phone.

01:45
Those are the kinds of things we can do with the data that we have. But I'm not actually here to talk about what we can do. I'm here to talk about what we should do. What's the right thing to do?
02:04
Now I see some puzzled looks, like, "Why are you asking us what's the right thing to do? We're just building this stuff. Somebody else is using it." Fair enough.

02:15
But it brings me back. I think about World War II -- some of our great technologists then, some of our great physicists, studying nuclear fission and fusion -- just nuclear stuff. We gather together these physicists in Los Alamos to see what they'll build. We want the people building the technology thinking about what we should be doing with the technology.

02:41
So what should we be doing with that guy's data? Should we be collecting it, gathering it, so we can make his online experience better? So we can make money? So we can protect ourselves if he was up to no good? Or should we respect his privacy, protect his dignity and leave him alone? Which one is it? How should we figure it out? I know: crowdsource. Let's crowdsource this.

03:11
So to get people warmed up, let's start with an easy question -- something I'm sure everybody here has an opinion about: iPhone versus Android. Let's do a show of hands -- iPhone. Uh huh. Android. You'd think with a bunch of smart people we wouldn't be such suckers just for the pretty phones. (Laughter)

03:35
Next question, a little bit harder. Should we be collecting all of that guy's data to make his experiences better and to protect ourselves in case he's up to no good? Or should we leave him alone? Collect his data. Leave him alone. You're safe. It's fine. (Laughter)

04:00
Okay, last question -- harder question -- when trying to evaluate what we should do in this case, should we use a Kantian deontological moral framework, or should we use a Millian consequentialist one? Kant. Mill. Not as many votes. (Laughter)
04:30
Yeah, that's a terrifying result. Terrifying, because we have stronger opinions about our hand-held devices than about the moral framework we should use to guide our decisions. How do we know what to do with all the power we have if we don't have a moral framework? We know more about mobile operating systems, but what we really need is a moral operating system.
04:58
What's a moral operating system? We all know right and wrong, right? You feel good when you do something right, you feel bad when you do something wrong. Our parents teach us that: praise with the good, scold with the bad. But how do we figure out what's right and wrong? And from day to day, we have the techniques that we use. Maybe we just follow our gut. Maybe we take a vote -- we crowdsource. Or maybe we punt -- ask the legal department, see what they say. In other words, it's kind of random, kind of ad hoc, how we figure out what we should do. And maybe, if we want to be on surer footing, what we really want is a moral framework that will help guide us there, that will tell us what kinds of things are right and wrong in the first place, and how we would know in a given situation what to do.
05:46
So let's get a moral framework. We're numbers people, living by numbers. How can we use numbers as the basis for a moral framework? I know a guy who did exactly that. A brilliant guy -- he's been dead 2,500 years. Plato, that's right. Remember him -- old philosopher? You were sleeping during that class.

06:12
And Plato, he had a lot of the same concerns that we did. He was worried about right and wrong. He wanted to know what is just. But he was worried that all we seem to be doing is trading opinions about this. He says something's just. She says something else is just. It's kind of convincing when he talks and when she talks too. I'm just going back and forth; I'm not getting anywhere. I don't want opinions; I want knowledge. I want to know the truth about justice -- like we have truths in math.

06:38
In math, we know the objective facts. Take a number, any number -- two. Favorite number. I love that number. There are truths about two. If you've got two of something, you add two more, you get four. That's true no matter what thing you're talking about. It's an objective truth about the form of two, the abstract form. When you have two of anything -- two eyes, two ears, two noses, just two protrusions -- those all partake of the form of two. They all participate in the truths that two has. They all have two-ness in them. And therefore, it's not a matter of opinion.
07:13
What if, Plato thought, ethics was like math? What if there were a pure form of justice? What if there are truths about justice, and you could just look around in this world and see which things participated, partook of that form of justice? Then you would know what was really just and what wasn't. It wouldn't be a matter of just opinion or just appearances.

07:37
That's a stunning vision. I mean, think about that. How grand. How ambitious. That's as ambitious as we are. He wants to solve ethics. He wants objective truths. If you think that way, you have a Platonist moral framework.

07:54
If you don't think that way, well, you have a lot of company in the history of Western philosophy, because the tidy idea, you know, people criticized it. Aristotle, in particular, he was not amused. He thought it was impractical. Aristotle said, "We should seek only so much precision in each subject as that subject allows." Aristotle thought ethics wasn't a lot like math. He thought ethics was a matter of making decisions in the here-and-now, using our best judgment to find the right path.
08:23
If you think that, Plato's not your guy. But don't give up. Maybe there's another way that we can use numbers as the basis of our moral framework.
08:33
How about this: What if in any situation you could just calculate, look at the choices, measure out which one's better and know what to do? That sound familiar? That's a utilitarian moral framework. John Stuart Mill was a great advocate of this -- nice guy besides -- and only been dead 200 years. So the basis of utilitarianism -- I'm sure you're familiar at least. The three people who voted for Mill before are familiar with this. But here's the way it works. What if morals, what if what makes something moral, is just a matter of whether it maximizes pleasure and minimizes pain? It's nothing intrinsic to the act, nothing about its relation to some abstract form -- it's just a matter of the consequences. You just look at the consequences and see if, overall, it's for the good or for the worse. That would be simple. Then we know what to do.
09:22
Let's take an example. Suppose I go up and I say, "I'm going to take your phone." Not just because it rang earlier, but I'm going to take it because I made a little calculation. I thought, that guy looks suspicious. And what if he's been sending little messages to Bin Laden's hideout -- or whoever took over after Bin Laden -- and he's actually like a terrorist, a sleeper cell. I'm going to find that out, and when I find that out, I'm going to prevent a huge amount of damage that he could cause. That has a very high utility to prevent that damage. And compared to the little pain that it's going to cause -- because it's going to be embarrassing when I'm looking on his phone and seeing that he has a Farmville problem and that whole bit -- that's overwhelmed by the value of looking at the phone. If you feel that way, that's a utilitarian choice.
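That "little calculation" can be written down. Below is a minimal sketch in Python of the expected-utility comparison the example describes; every probability and utility value is a made-up assumption for illustration, not anything from the talk.

# A toy utilitarian calculation for the phone example.
# Every number below is an illustrative assumption, not data from the talk.

def expected_utility(outcomes):
    # Sum of probability * value over the possible outcomes of a choice.
    return sum(p * v for p, v in outcomes)

# Choice 1: take the phone and look through it.
take_phone = expected_utility([
    (0.001, 10000),  # tiny chance of preventing huge damage
    (0.999, -1),     # almost certainly just an embarrassing privacy invasion
])

# Choice 2: leave him alone.
leave_alone = expected_utility([
    (1.0, 0),        # nothing gained, nothing lost
])

print("take the phone" if take_phone > leave_alone else "leave him alone")

Of course, the Kantian reply that comes next is precisely that no choice of numbers in a sketch like this settles the question.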
10:10
But maybe you don't feel that way either. Maybe you think, it's his phone. It's wrong to take his phone, because he's a person and he has rights and he has dignity, and we can't just interfere with that. He has autonomy. It doesn't matter what the calculations are. There are things that are intrinsically wrong -- like lying is wrong, like torturing innocent children is wrong. Kant was very good on this point, and he said it a little better than I'll say it. He said we should use our reason to figure out the rules by which we should guide our conduct, and then it is our duty to follow those rules. It's not a matter of calculation.

10:51
So let's stop. We're right in the thick of it, this philosophical thicket. And this goes on for thousands of years, because these are hard questions, and I've only got 15 minutes. So let's cut to the chase. How should we be making our decisions? Is it Plato, is it Aristotle, is it Kant, is it Mill? What should we be doing? What's the answer? What's the formula that we can use in any situation to determine what we should do, whether we should use that guy's data or not? What's the formula?
11:25
There's not a formula. There's not a simple answer. Ethics is hard. Ethics requires thinking.

11:38
And that's uncomfortable. I know; I spent a lot of my career in artificial intelligence, trying to build machines that could do some of this thinking for us, that could give us answers. But they can't. You can't just take human thinking and put it into a machine. We're the ones who have to do it. Happily, we're not machines, and we can do it. Not only can we think, we must.

12:05
Hannah Arendt said, "The sad truth is that most evil done in this world is not done by people who choose to be evil. It arises from not thinking." That's what she called the "banality of evil." And the response to that is that we demand the exercise of thinking from every sane person.
12:29
So let's do that. Let's think. In fact, let's start right now. Every person in this room do this: think of the last time you had a decision to make where you were worried to do the right thing, where you wondered, "What should I be doing?" Bring that to mind, and now reflect on that and say, "How did I come up with that decision? What did I do? Did I follow my gut? Did I have somebody vote on it? Or did I punt to legal?" Or now we have a few more choices. "Did I evaluate what would be the highest pleasure, like Mill would? Or like Kant, did I use reason to figure out what was intrinsically right?" Think about it. Really bring it to mind. This is important. It is so important we are going to spend 30 seconds of valuable TEDTalk time doing nothing but thinking about this.
13:15
Are you ready? Go.

13:33
Stop. Good work. What you just did, that's the first step towards taking responsibility for what we should do with all of our power.

13:45
Now the next step -- try this. Go find a friend and explain to them how you made that decision. Not right now. Wait till I finish talking. Do it over lunch. And don't just find another technologist friend; find somebody different than you. Find an artist or a writer -- or, heaven forbid, find a philosopher and talk to them. In fact, find somebody from the humanities. Why? Because they think about problems differently than we do as technologists.
14:13
Just a few days ago, right across the street from here, there were hundreds of people gathered together. It was technologists and humanists at that big BiblioTech Conference. And they gathered together because the technologists wanted to learn what it would be like to think from a humanities perspective. You have someone from Google talking to someone who does comparative literature. You're thinking about the relevance of 17th century French theater -- how does that bear upon venture capital? Well, that's interesting. That's a different way of thinking. And when you think in that way, you become more sensitive to the human considerations, which are crucial to making ethical decisions.
14:49
So imagine that right now you went and you found your musician friend. And you're telling him what we're talking about, about our whole data revolution and all this -- maybe even hum a few bars of our theme music. ♫ Dum ta da da dum dum ta da da dum ♫

15:03
Well, your musician friend will stop you and say, "You know, the theme music for your data revolution, that's an opera, that's Wagner. It's based on Norse legend. It's gods and mythical creatures fighting over magical jewelry."

15:19
That's interesting. Now it's also a beautiful opera, and we're moved by that opera. We're moved because it's about the battle between good and evil, about right and wrong. And we care about right and wrong. We care what happens in that opera. We care what happens in "Apocalypse Now." And we certainly care what happens with our technologies.
15:46
We have so much power today, it is up to us to figure out what to do, and that's the good news. We're the ones writing this opera. This is our movie. We figure out what will happen with this technology. We determine how this will all end.

16:04
Thank you. (Applause)