Damon Horowitz calls for a "moral operating system"

95,064 views · 2011-06-06

TED



ืžืชืจื’ื: Yubal Masalker ืžื‘ืงืจ: Ido Dekkers
00:15
Power. That is the word that comes to mind. We're the new technologists. We have a lot of data, so we have a lot of power.

00:24
How much power do we have? Scene from a movie: "Apocalypse Now" -- great movie. We've got to get our hero, Captain Willard, to the mouth of the Nung River so he can go pursue Colonel Kurtz. The way we're going to do this is fly him in and drop him off. So the scene: the sky is filled with this fleet of helicopters carrying him in. And there's this loud, thrilling music in the background, this wild music.

00:45
♫ Dum da ta da dum ♫ ♫ Dum da ta da dum ♫ ♫ Da ta da da ♫

00:52
That's a lot of power. That's the kind of power I feel in this room. That's the kind of power we have because of all of the data that we have.
01:00
Let's take an example. What can we do with just one person's data? What can we do with that guy's data? I can look at your financial records. I can tell if you pay your bills on time. I know if you're good to give a loan to. I can look at your medical records; I can see if your pump is still pumping -- see if you're good to offer insurance to.

01:23
I can look at your clicking patterns. When you come to my website, I actually know what you're going to do already, because I've seen you visit millions of websites before. And I'm sorry to tell you, you're like a poker player, you have a tell. I can tell with data analysis what you're going to do before you even do it. I know what you like. I know who you are, and that's even before I look at your mail or your phone.
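The prediction the speaker describes can be made concrete with a toy sketch. Nothing below comes from the talk: the click log, the page names, and the predict_next helper are invented for illustration, and real systems use far richer models, but the idea of guessing a visitor's next move from their past behavior is the same.

```python
from collections import Counter, defaultdict

# Hypothetical click log for one visitor: the pages they opened, in order.
# The pages and the log itself are invented purely for illustration.
click_log = ["home", "pricing", "checkout", "home", "pricing", "blog",
             "home", "pricing", "checkout"]

# For each page, count which page this visitor usually opens next.
transitions = defaultdict(Counter)
for current_page, next_page in zip(click_log, click_log[1:]):
    transitions[current_page][next_page] += 1

def predict_next(page):
    """Guess the most likely next click from this visitor's own history."""
    seen = transitions.get(page)
    return seen.most_common(1)[0][0] if seen else None

# "You're like a poker player, you have a tell."
print(predict_next("pricing"))  # -> 'checkout'
```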
01:45
Those are the kinds of things we can do with the data that we have. But I'm not actually here to talk about what we can do. I'm here to talk about what we should do. What's the right thing to do?
02:04
Now I see some puzzled looks, like, "Why are you asking us what's the right thing to do? We're just building this stuff. Somebody else is using it." Fair enough.

02:15
But it brings me back. I think about World War II -- some of our great technologists then, some of our great physicists, studying nuclear fission and fusion -- just nuclear stuff. We gather together these physicists in Los Alamos to see what they'll build. We want the people building the technology thinking about what we should be doing with the technology.

02:41
So what should we be doing with that guy's data? Should we be collecting it, gathering it, so we can make his online experience better? So we can make money? So we can protect ourselves if he was up to no good? Or should we respect his privacy, protect his dignity and leave him alone? Which one is it? How should we figure it out? I know: crowdsource. Let's crowdsource this.
03:11
So to get people warmed up, let's start with an easy question -- something I'm sure everybody here has an opinion about: iPhone versus Android. Let's do a show of hands -- iPhone. Uh huh. Android. You'd think with a bunch of smart people we wouldn't be such suckers just for the pretty phones. (Laughter)

03:35
Next question, a little bit harder. Should we be collecting all of that guy's data to make his experiences better and to protect ourselves in case he's up to no good? Or should we leave him alone? Collect his data. Leave him alone. You're safe. It's fine. (Laughter)

04:00
Okay, last question -- harder question -- when trying to evaluate what we should do in this case, should we use a Kantian deontological moral framework, or should we use a Millian consequentialist one? Kant. Mill. Not as many votes. (Laughter)
04:30
Yeah, that's a terrifying result. Terrifying, because we have stronger opinions about our hand-held devices than about the moral framework we should use to guide our decisions. How do we know what to do with all the power we have if we don't have a moral framework? We know more about mobile operating systems, but what we really need is a moral operating system.

04:58
What's a moral operating system? We all know right and wrong, right? You feel good when you do something right, you feel bad when you do something wrong. Our parents teach us that: praise with the good, scold with the bad. But how do we figure out what's right and wrong?

05:12
And from day to day, we have the techniques that we use. Maybe we just follow our gut. Maybe we take a vote -- we crowdsource. Or maybe we punt -- ask the legal department, see what they say. In other words, it's kind of random, kind of ad hoc, how we figure out what we should do.
05:33
And maybe, if we want to be on surer footing, what we really want is a moral framework that will help guide us there, that will tell us what kinds of things are right and wrong in the first place, and how we would know in a given situation what to do. So let's get a moral framework. We're numbers people, living by numbers. How can we use numbers as the basis for a moral framework?

05:56
I know a guy who did exactly that. A brilliant guy -- he's been dead 2,500 years. Plato, that's right. Remember him -- old philosopher? You were sleeping during that class.

06:12
And Plato, he had a lot of the same concerns that we did. He was worried about right and wrong. He wanted to know what is just. But he was worried that all we seem to be doing is trading opinions about this. He says something's just. She says something else is just. It's kind of convincing when he talks and when she talks too. I'm just going back and forth; I'm not getting anywhere. I don't want opinions; I want knowledge. I want to know the truth about justice -- like we have truths in math.
06:38
In math, we know the objective facts. Take a number, any number -- two. Favorite number. I love that number. There are truths about two. If you've got two of something, you add two more, you get four. That's true no matter what thing you're talking about. It's an objective truth about the form of two, the abstract form. When you have two of anything -- two eyes, two ears, two noses, just two protrusions -- those all partake of the form of two. They all participate in the truths that two has. They all have two-ness in them. And therefore, it's not a matter of opinion.

07:13
What if, Plato thought, ethics was like math? What if there were a pure form of justice? What if there are truths about justice, and you could just look around in this world and see which things participated, partook of that form of justice? Then you would know what was really just and what wasn't. It wouldn't be a matter of just opinion or just appearances.

07:37
That's a stunning vision. I mean, think about that. How grand. How ambitious. That's as ambitious as we are. He wants to solve ethics. He wants objective truths. If you think that way, you have a Platonist moral framework. If you don't think that way, well, you have a lot of company in the history of Western philosophy, because the tidy idea, you know, people criticized it.

08:01
Aristotle, in particular, he was not amused. He thought it was impractical. Aristotle said, "We should seek only so much precision in each subject as that subject allows." Aristotle thought ethics wasn't a lot like math. He thought ethics was a matter of making decisions in the here-and-now using our best judgment to find the right path.
08:23
If you think that, Plato's not your guy. But don't give up. Maybe there's another way that we can use numbers as the basis of our moral framework. How about this: What if in any situation you could just calculate, look at the choices, measure out which one's better and know what to do? That sound familiar? That's a utilitarian moral framework.

08:48
John Stuart Mill was a great advocate of this -- nice guy besides -- and only been dead 200 years. So basis of utilitarianism -- I'm sure you're familiar at least. The three people who voted for Mill before are familiar with this. But here's the way it works. What if morals, what if what makes something moral is just a matter of if it maximizes pleasure and minimizes pain? It's something intrinsic to the act. It's not like its relation to some abstract form. It's just a matter of the consequences. You just look at the consequences and see if, overall, it's for the good or for the worse. That would be simple. Then we know what to do.
09:22
Let's take an example. Suppose I go up and I say, "I'm going to take your phone." Not just because it rang earlier, but I'm going to take it because I made a little calculation. I thought, that guy looks suspicious. And what if he's been sending little messages to Bin Laden's hideout -- or whoever took over after Bin Laden -- and he's actually like a terrorist, a sleeper cell. I'm going to find that out, and when I find that out, I'm going to prevent a huge amount of damage that he could cause. That has a very high utility to prevent that damage. And compared to the little pain that it's going to cause -- because it's going to be embarrassing when I'm looking on his phone and seeing that he has a Farmville problem and that whole bit -- that's overwhelmed by the value of looking at the phone. If you feel that way, that's a utilitarian choice.
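Treated as a procedure, the calculation just described is simple to write down: score the consequences of each option and pick whichever nets out best. A minimal sketch follows; every number in it is invented for illustration, since the talk gives no figures, and choosing those numbers is exactly the contested part.

```python
# A utilitarian reading of the phone decision as a calculation.
# All utility numbers are hypothetical, made up purely to show the arithmetic.
options = {
    "take the phone":  {"benefit": 1000,  # damage prevented if he really is a sleeper cell
                        "pain": 5},       # embarrassment of reading his Farmville messages
    "leave him alone": {"benefit": 0,
                        "pain": 0},
}

def net_utility(consequences):
    """Millian scoring: total pleasure or benefit minus total pain."""
    return consequences["benefit"] - consequences["pain"]

# "You just look at the consequences and see if, overall, it's for the good or for the worse."
best = max(options, key=lambda name: net_utility(options[name]))
print(best)  # -> 'take the phone' under these made-up numbers
```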
10:10
But maybe you don't feel that way either. Maybe you think, it's his phone. It's wrong to take his phone because he's a person and he has rights and he has dignity, and we can't just interfere with that. He has autonomy. It doesn't matter what the calculations are. There are things that are intrinsically wrong -- like lying is wrong, like torturing innocent children is wrong.

10:35
Kant was very good on this point, and he said it a little better than I'll say it. He said we should use our reason to figure out the rules by which we should guide our conduct, and then it is our duty to follow those rules. It's not a matter of calculation.
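For contrast, the Kantian framework just sketched can be caricatured the same way: instead of summing consequences, you check the act against rules derived in advance and treat following them as a duty. This is a hedged sketch only; the rule list below is invented for illustration and is neither Kant's nor the speaker's.

```python
# A Kantian-style check, caricatured: rules are worked out in advance by reason,
# and a forbidden act stays forbidden no matter how favorable the consequences look.
forbidden_acts = {
    "lie",
    "torture innocent children",
    "take someone's phone without consent",
}

def permitted(act):
    """Deontological test: judge the act itself, not its payoff."""
    return act not in forbidden_acts

# "It doesn't matter what the calculations are."
print(permitted("take someone's phone without consent"))  # -> False, whatever the utility
```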
10:51
So let's stop. We're right in the thick of it, this philosophical thicket. And this goes on for thousands of years, because these are hard questions, and I've only got 15 minutes. So let's cut to the chase. How should we be making our decisions? Is it Plato, is it Aristotle, is it Kant, is it Mill? What should we be doing? What's the answer? What's the formula that we can use in any situation to determine what we should do, whether we should use that guy's data or not? What's the formula?

11:25
There's not a formula. There's not a simple answer. Ethics is hard. Ethics requires thinking. And that's uncomfortable. I know; I spent a lot of my career in artificial intelligence, trying to build machines that could do some of this thinking for us, that could give us answers. But they can't. You can't just take human thinking and put it into a machine. We're the ones who have to do it. Happily, we're not machines, and we can do it. Not only can we think, we must.

12:05
Hannah Arendt said, "The sad truth is that most evil done in this world is not done by people who choose to be evil. It arises from not thinking." That's what she called the "banality of evil." And the response to that is that we demand the exercise of thinking from every sane person.
12:29
So let's do that. Let's think. In fact, let's start right now. Every person in this room do this: think of the last time you had a decision to make where you were worried to do the right thing, where you wondered, "What should I be doing?" Bring that to mind, and now reflect on that and say, "How did I come up with that decision? What did I do? Did I follow my gut? Did I have somebody vote on it? Or did I punt to legal?" Or now we have a few more choices. "Did I evaluate what would be the highest pleasure like Mill would? Or like Kant, did I use reason to figure out what was intrinsically right?"

13:06
Think about it. Really bring it to mind. This is important. It is so important we are going to spend 30 seconds of valuable TEDTalk time doing nothing but thinking about this. Are you ready? Go.

13:33
Stop. Good work. What you just did, that's the first step towards taking responsibility for what we should do with all of our power.
13:45
Now the next step -- try this. Go find a friend and explain to them how you made that decision. Not right now. Wait till I finish talking. Do it over lunch. And don't just find another technologist friend; find somebody different than you. Find an artist or a writer -- or, heaven forbid, find a philosopher and talk to them. In fact, find somebody from the humanities. Why? Because they think about problems differently than we do as technologists.

14:13
Just a few days ago, right across the street from here, there were hundreds of people gathered together. It was technologists and humanists at that big BiblioTech Conference. And they gathered together because the technologists wanted to learn what it would be like to think from a humanities perspective. You have someone from Google talking to someone who does comparative literature. You're thinking about the relevance of 17th century French theater -- how does that bear upon venture capital? Well, that's interesting. That's a different way of thinking. And when you think in that way, you become more sensitive to the human considerations, which are crucial to making ethical decisions.
14:49
So imagine that right now you went and you found your musician friend. And you're telling him what we're talking about, about our whole data revolution and all this -- maybe even hum a few bars of our theme music. ♫ Dum ta da da dum dum ta da da dum ♫ Well, your musician friend will stop you and say, "You know, the theme music for your data revolution, that's an opera, that's Wagner. It's based on Norse legend. It's Gods and mythical creatures fighting over magical jewelry."

15:19
That's interesting. Now it's also a beautiful opera, and we're moved by that opera. We're moved because it's about the battle between good and evil, about right and wrong. And we care about right and wrong. We care what happens in that opera. We care what happens in "Apocalypse Now." And we certainly care what happens with our technologies. We have so much power today, it is up to us to figure out what to do, and that's the good news. We're the ones writing this opera. This is our movie. We figure out what will happen with this technology. We determine how this will all end.

16:04
Thank you.

16:06
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7