Damon Horowitz calls for a "moral operating system"

96,197 views ・ 2011-06-06

TED


00:15
Power. That is the word that comes to mind. We're the new technologists. We have a lot of data, so we have a lot of power. How much power do we have? Scene from a movie: "Apocalypse Now" -- great movie. We've got to get our hero, Captain Willard, to the mouth of the Nung River so he can go pursue Colonel Kurtz. The way we're going to do this is fly him in and drop him off. So the scene: the sky is filled with this fleet of helicopters carrying him in. And there's this loud, thrilling music in the background, this wild music. ♫ Dum da ta da dum ♫ ♫ Dum da ta da dum ♫ ♫ Da ta da da ♫ That's a lot of power. That's the kind of power I feel in this room. That's the kind of power we have because of all of the data that we have.

01:00
Let's take an example. What can we do with just one person's data? What can we do with that guy's data? I can look at your financial records. I can tell if you pay your bills on time. I know if you're good to give a loan to. I can look at your medical records; I can see if your pump is still pumping -- see if you're good to offer insurance to. I can look at your clicking patterns. When you come to my website, I actually know what you're going to do already, because I've seen you visit millions of websites before. And I'm sorry to tell you, you're like a poker player, you have a tell. I can tell with data analysis what you're going to do before you even do it. I know what you like. I know who you are, and that's even before I look at your mail or your phone.
01:50
But I'm not actually here to talk about what we can do. I'm here to talk about what we should do. What's the right thing to do?
02:04
Now I see some puzzled looks like, "Why are you asking us what's the right thing to do? We're just building this stuff. Somebody else is using it." Fair enough. But it brings me back. I think about World War II -- some of our great technologists then, some of our great physicists, studying nuclear fission and fusion -- just nuclear stuff. We gather together these physicists in Los Alamos to see what they'll build. We want the people building the technology thinking about what we should be doing with the technology.
02:41
So what should we be doing with that guy's data? Should we be collecting it, gathering it, so we can make his online experience better? So we can make money? So we can protect ourselves if he was up to no good? Or should we respect his privacy, protect his dignity and leave him alone? Which one is it? How should we figure it out? I know: crowdsource. Let's crowdsource this.
03:11
So to get people warmed up, let's start with an easy question -- something I'm sure everybody here has an opinion about: iPhone versus Android. Let's do a show of hands -- iPhone. Uh huh. Android. You'd think with a bunch of smart people we wouldn't be such suckers just for the pretty phones. (Laughter)
03:35
Next question, a little bit harder. Should we be collecting all of that guy's data to make his experiences better and to protect ourselves in case he's up to no good? Or should we leave him alone? Collect his data. Leave him alone. You're safe. It's fine. (Laughter)
04:00
Okay, last question -- harder question -- when trying to evaluate what we should do in this case, should we use a Kantian deontological moral framework, or should we use a Millian consequentialist one? Kant. Mill. Not as many votes. (Laughter)
04:30
Yeah, that's a terrifying result. Terrifying, because we have stronger opinions about our hand-held devices than about the moral framework we should use to guide our decisions. How do we know what to do with all the power we have if we don't have a moral framework? We know more about mobile operating systems, but what we really need is a moral operating system.
04:58
What's a moral operating system? We all know right and wrong, right? You feel good when you do something right, you feel bad when you do something wrong. Our parents teach us that: praise with the good, scold with the bad. But how do we figure out what's right and wrong? And from day to day, we have the techniques that we use. Maybe we just follow our gut. Maybe we take a vote -- we crowdsource. Or maybe we punt -- ask the legal department, see what they say. In other words, it's kind of random, kind of ad hoc, how we figure out what we should do. And maybe, if we want to be on surer footing, what we really want is a moral framework that will help guide us there, that will tell us what kinds of things are right and wrong in the first place, and how we would know in a given situation what to do.
05:46
So let's get a moral framework. We're numbers people, living by numbers. How can we use numbers as the basis for a moral framework? I know a guy who did exactly that. A brilliant guy -- he's been dead 2,500 years. Plato, that's right. Remember him -- old philosopher? You were sleeping during that class. And Plato, he had a lot of the same concerns that we did. He was worried about right and wrong. He wanted to know what is just. But he was worried that all we seem to be doing is trading opinions about this. He says something's just. She says something else is just. It's kind of convincing when he talks and when she talks too. I'm just going back and forth; I'm not getting anywhere. I don't want opinions; I want knowledge. I want to know the truth about justice -- like we have truths in math.
06:38
In math, we know the objective facts. Take a number, any number -- two. Favorite number. I love that number. There are truths about two. If you've got two of something, you add two more, you get four. That's true no matter what thing you're talking about. It's an objective truth about the form of two, the abstract form. When you have two of anything -- two eyes, two ears, two noses, just two protrusions -- those all partake of the form of two. They all participate in the truths that two has. They all have two-ness in them. And therefore, it's not a matter of opinion.

07:13
What if, Plato thought, ethics was like math? What if there were a pure form of justice? What if there are truths about justice, and you could just look around in this world and see which things participated, partook of that form of justice? Then you would know what was really just and what wasn't. It wouldn't be a matter of just opinion or just appearances.
07:37
That's a stunning vision. I mean, think about that. How grand. How ambitious. That's as ambitious as we are. He wants to solve ethics. He wants objective truths. If you think that way, you have a Platonist moral framework. If you don't think that way, well, you have a lot of company in the history of Western philosophy, because the tidy idea, you know, people criticized it. Aristotle, in particular, he was not amused. He thought it was impractical. Aristotle said, "We should seek only so much precision in each subject as that subject allows." Aristotle thought ethics wasn't a lot like math. He thought ethics was a matter of making decisions in the here-and-now using our best judgment to find the right path. If you think that, Plato's not your guy. But don't give up.
08:27
Maybe there's another way that we can use numbers as the basis of our moral framework. How about this: What if in any situation you could just calculate, look at the choices, measure out which one's better and know what to do? That sound familiar? That's a utilitarian moral framework. John Stuart Mill was a great advocate of this -- nice guy besides -- and only been dead 200 years. So basis of utilitarianism -- I'm sure you're familiar at least. The three people who voted for Mill before are familiar with this. But here's the way it works. What if morals, what if what makes something moral is just a matter of if it maximizes pleasure and minimizes pain? It does something intrinsic to the act. It's not like its relation to some abstract form. It's just a matter of the consequences. You just look at the consequences and see if, overall, it's for the good or for the worse. That would be simple. Then we know what to do.

09:22
Let's take an example. Suppose I go up and I say, "I'm going to take your phone." Not just because it rang earlier, but I'm going to take it because I made a little calculation. I thought, that guy looks suspicious. And what if he's been sending little messages to Bin Laden's hideout -- or whoever took over after Bin Laden -- and he's actually like a terrorist, a sleeper cell. I'm going to find that out, and when I find that out, I'm going to prevent a huge amount of damage that he could cause. That has a very high utility to prevent that damage. And compared to the little pain that it's going to cause -- because it's going to be embarrassing when I'm looking on his phone and seeing that he has a Farmville problem and that whole bit -- that's overwhelmed by the value of looking at the phone. If you feel that way, that's a utilitarian choice.
10:10
But maybe you don't feel that way either. Maybe you think, it's his phone. It's wrong to take his phone because he's a person and he has rights and he has dignity, and we can't just interfere with that. He has autonomy. It doesn't matter what the calculations are. There are things that are intrinsically wrong -- like lying is wrong, like torturing innocent children is wrong. Kant was very good on this point, and he said it a little better than I'll say it. He said we should use our reason to figure out the rules by which we should guide our conduct, and then it is our duty to follow those rules. It's not a matter of calculation.
10:51
So let's stop. We're right in the thick of it, this philosophical thicket. And this goes on for thousands of years, because these are hard questions, and I've only got 15 minutes. So let's cut to the chase. How should we be making our decisions? Is it Plato, is it Aristotle, is it Kant, is it Mill? What should we be doing? What's the answer? What's the formula that we can use in any situation to determine what we should do, whether we should use that guy's data or not? What's the formula? There's not a formula. There's not a simple answer. Ethics is hard. Ethics requires thinking.
11:38
And that's uncomfortable. I know; I spent a lot of my career in artificial intelligence, trying to build machines that could do some of this thinking for us, that could give us answers. But they can't. You can't just take human thinking and put it into a machine. We're the ones who have to do it. Happily, we're not machines, and we can do it. Not only can we think, we must. Hannah Arendt said, "The sad truth is that most evil done in this world is not done by people who choose to be evil. It arises from not thinking." That's what she called the "banality of evil." And the response to that is that we demand the exercise of thinking from every sane person.
12:29
So let's do that. Let's think. In fact, let's start right now. Every person in this room do this: think of the last time you had a decision to make where you were worried to do the right thing, where you wondered, "What should I be doing?" Bring that to mind, and now reflect on that and say, "How did I come to that decision? What did I do? Did I follow my gut? Did I have somebody vote on it? Or did I punt to legal?" Or now we have a few more choices. "Did I evaluate what would be the highest pleasure like Mill would? Or like Kant, did I use reason to figure out what was intrinsically right?" Think about it. Really bring it to mind. This is important. It is so important we are going to spend 30 seconds of valuable TEDTalk time doing nothing but thinking about this. Are you ready? Go. Stop. Good work.
13:36
What you just did, that's the first step towards taking responsibility for what we should do with all of our power.
13:45
Now the next step -- try this. Go find a friend and explain to them how you made that decision. Not right now. Wait till I finish talking. Do it over lunch. And don't just find another technologist friend; find somebody different than you. Find an artist or a writer -- or, heaven forbid, find a philosopher and talk to them. In fact, find somebody from the humanities. Why? Because they think about problems differently than we do as technologists. Just a few days ago, right across the street from here, there were hundreds of people gathered together. It was technologists and humanists at that big BiblioTech Conference. And they gathered together because the technologists wanted to learn what it would be like to think from a humanities perspective. You have someone from Google talking to someone who does comparative literature. You're thinking about the relevance of 17th century French theater -- how does that bear upon venture capital? Well, that's interesting. That's a different way of thinking. And when you think in that way, you become more sensitive to the human considerations, which are crucial to making ethical decisions.
14:49
So imagine that right now you went and you found your musician friend. And you're telling him what we're talking about, about our whole data revolution and all this -- maybe even hum a few bars of our theme music. ♫ Dum ta da da dum dum ta da da dum ♫ Well, your musician friend will stop you and say, "You know, the theme music for your data revolution, that's an opera, that's Wagner. It's based on Norse legend. It's Gods and mythical creatures fighting over magical jewelry."

15:19
That's interesting. Now it's also a beautiful opera, and we're moved by that opera. We're moved because it's about the battle between good and evil, about right and wrong. And we care about right and wrong. We care what happens in that opera. We care what happens in "Apocalypse Now." And we certainly care what happens with our technologies. We have so much power today, it is up to us to figure out what to do, and that's the good news. We're the ones writing this opera. This is our movie. We figure out what will happen with this technology. We determine how this will all end.

16:04
Thank you. (Applause)