Damon Horowitz calls for a "moral operating system"

96,197 views ・ 2011-06-06

TED


Translator: Ana Choi    Reviewer: Shelley Krishna Tsang
00:15 Power. That is the word that comes to mind. We're the new technologists. We have a lot of data, so we have a lot of power. How much power do we have?

00:26 Scene from a movie: "Apocalypse Now" -- great movie. We've got to get our hero, Captain Willard, to the mouth of the Nung River so he can go pursue Colonel Kurtz. The way we're going to do this is fly him in and drop him off. So the scene: the sky is filled with this fleet of helicopters carrying him in. And there's this loud, thrilling music in the background, this wild music. ♫ Dum da ta da dum ♫ ♫ Dum da ta da dum ♫ ♫ Da ta da da ♫
00:52 That's a lot of power. That's the kind of power I feel in this room. That's the kind of power we have because of all of the data that we have.

01:00 Let's take an example. What can we do with just one person's data? What can we do with that guy's data?

01:11 I can look at your financial records. I can tell if you pay your bills on time. I know if you're good to give a loan to. I can look at your medical records; I can see if your pump is still pumping -- see if you're good to offer insurance to. I can look at your clicking patterns. When you come to my website, I actually know what you're going to do already because I've seen you visit millions of websites before. And I'm sorry to tell you, you're like a poker player, you have a tell. I can tell with data analysis what you're going to do before you even do it. I know what you like. I know who you are, and that's even before I look at your mail or your phone.

01:45 Those are the kinds of things we can do with the data that we have.
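To make the "clicking patterns" and the poker-player "tell" concrete, here is a minimal sketch of the kind of prediction being described: a toy model that guesses a visitor's next click from clicks seen before. The approach (simple bigram counts) and the page names are assumptions chosen purely for illustration, not anything described in the talk or any real product.

    from collections import Counter, defaultdict

    # Toy illustration: read a visitor's "tell" from their clicking patterns
    # by counting which page tends to follow which.
    def build_model(click_history):
        """Count how often each page is followed by each other page."""
        transitions = defaultdict(Counter)
        for prev_page, next_page in zip(click_history, click_history[1:]):
            transitions[prev_page][next_page] += 1
        return transitions

    def predict_next(transitions, current_page):
        """Return the most frequently observed follow-up page, if any."""
        followers = transitions.get(current_page)
        return followers.most_common(1)[0][0] if followers else None

    # Hypothetical browsing log -- the page names are invented for the example.
    history = ["home", "pricing", "checkout", "home", "pricing", "blog",
               "home", "pricing", "checkout"]
    model = build_model(history)
    print(predict_next(model, "pricing"))  # prints "checkout"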
01:50 But I'm not actually here to talk about what we can do. I'm here to talk about what we should do. What's the right thing to do?

02:04 Now I see some puzzled looks like, "Why are you asking us what's the right thing to do? We're just building this stuff. Somebody else is using it." Fair enough.

02:15 But it brings me back. I think about World War II -- some of our great technologists then, some of our great physicists, studying nuclear fission and fusion -- just nuclear stuff. We gather together these physicists in Los Alamos to see what they'll build. We want the people building the technology thinking about what we should be doing with the technology.

02:41 So what should we be doing with that guy's data? Should we be collecting it, gathering it, so we can make his online experience better? So we can make money? So we can protect ourselves if he was up to no good? Or should we respect his privacy, protect his dignity and leave him alone? Which one is it? How should we figure it out? I know: crowdsource. Let's crowdsource this.

03:11 So to get people warmed up, let's start with an easy question -- something I'm sure everybody here has an opinion about: iPhone versus Android. Let's do a show of hands -- iPhone. Uh huh. Android. You'd think with a bunch of smart people we wouldn't be such suckers just for the pretty phones. (Laughter)

03:35 Next question, a little bit harder. Should we be collecting all of that guy's data to make his experiences better and to protect ourselves in case he's up to no good? Or should we leave him alone? Collect his data. Leave him alone. You're safe. It's fine. (Laughter)
04:00 Okay, last question -- harder question -- when trying to evaluate what we should do in this case, should we use a Kantian deontological moral framework, or should we use a Millian consequentialist one? Kant. Mill. Not as many votes. (Laughter)

04:30 Yeah, that's a terrifying result. Terrifying, because we have stronger opinions about our hand-held devices than about the moral framework we should use to guide our decisions. How do we know what to do with all the power we have if we don't have a moral framework? We know more about mobile operating systems, but what we really need is a moral operating system.

04:58 What's a moral operating system? We all know right and wrong, right? You feel good when you do something right, you feel bad when you do something wrong. Our parents teach us that: praise with the good, scold with the bad. But how do we figure out what's right and wrong? And from day to day, we have the techniques that we use. Maybe we just follow our gut. Maybe we take a vote -- we crowdsource. Or maybe we punt -- ask the legal department, see what they say. In other words, it's kind of random, kind of ad hoc, how we figure out what we should do.

05:33 And maybe, if we want to be on surer footing, what we really want is a moral framework that will help guide us there, that will tell us what kinds of things are right and wrong in the first place, and how we would know in a given situation what to do.

05:46 So let's get a moral framework. We're numbers people, living by numbers. How can we use numbers as the basis for a moral framework?
05:56 I know a guy who did exactly that. A brilliant guy -- he's been dead 2,500 years. Plato, that's right. Remember him -- old philosopher? You were sleeping during that class.

06:12 And Plato, he had a lot of the same concerns that we did. He was worried about right and wrong. He wanted to know what is just. But he was worried that all we seem to be doing is trading opinions about this. He says something's just. She says something else is just. It's kind of convincing when he talks and when she talks too. I'm just going back and forth; I'm not getting anywhere. I don't want opinions; I want knowledge. I want to know the truth about justice -- like we have truths in math.

06:38 In math, we know the objective facts. Take a number, any number -- two. Favorite number. I love that number. There are truths about two. If you've got two of something, you add two more, you get four. That's true no matter what thing you're talking about. It's an objective truth about the form of two, the abstract form. When you have two of anything -- two eyes, two ears, two noses, just two protrusions -- those all partake of the form of two. They all participate in the truths that two has. They all have two-ness in them. And therefore, it's not a matter of opinion.

07:13 What if, Plato thought, ethics was like math? What if there were a pure form of justice? What if there are truths about justice, and you could just look around in this world and see which things participated, partook of that form of justice? Then you would know what was really just and what wasn't. It wouldn't be a matter of just opinion or just appearances.

07:37 That's a stunning vision. I mean, think about that. How grand. How ambitious. That's as ambitious as we are. He wants to solve ethics. He wants objective truths. If you think that way, you have a Platonist moral framework.

07:54 If you don't think that way, well, you have a lot of company in the history of Western philosophy, because the tidy idea, you know, people criticized it. Aristotle, in particular, he was not amused. He thought it was impractical.
08:07 Aristotle said, "We should seek only so much precision in each subject as that subject allows." Aristotle thought ethics wasn't a lot like math. He thought ethics was a matter of making decisions in the here-and-now using our best judgment to find the right path. If you think that, Plato's not your guy. But don't give up.

08:27 Maybe there's another way that we can use numbers as the basis of our moral framework. How about this: What if in any situation you could just calculate, look at the choices, measure out which one's better and know what to do? That sound familiar? That's a utilitarian moral framework. John Stuart Mill was a great advocate of this -- nice guy besides -- and only been dead 200 years. So basis of utilitarianism -- I'm sure you're familiar at least. The three people who voted for Mill before are familiar with this.

09:00 But here's the way it works. What if morals, what if what makes something moral is just a matter of if it maximizes pleasure and minimizes pain? It does something intrinsic to the act. It's not like its relation to some abstract form. It's just a matter of the consequences. You just look at the consequences and see if, overall, it's for the good or for the worse. That would be simple. Then we know what to do.
09:22 Let's take an example. Suppose I go up and I say, "I'm going to take your phone." Not just because it rang earlier, but I'm going to take it because I made a little calculation. I thought, that guy looks suspicious. And what if he's been sending little messages to Bin Laden's hideout -- or whoever took over after Bin Laden -- and he's actually like a terrorist, a sleeper cell. I'm going to find that out, and when I find that out, I'm going to prevent a huge amount of damage that he could cause. That has a very high utility to prevent that damage. And compared to the little pain that it's going to cause -- because it's going to be embarrassing when I'm looking on his phone and seeing that he has a Farmville problem and that whole bit -- that's overwhelmed by the value of looking at the phone. If you feel that way, that's a utilitarian choice.
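Here is a minimal sketch of the calculation the utilitarian framework invites for the phone example above. Every number (the probabilities and the utilities) is an assumption invented for illustration; the only point is that the answer falls out of arithmetic over consequences.

    # Toy utilitarian calculus for the "take his phone" scenario.
    # All probabilities and utilities below are made-up illustrative values.
    options = {
        "search_his_phone": [
            # (probability, utility) pairs for possible consequences
            (0.001, 10_000),   # he really is a sleeper cell and we prevent huge damage
            (0.999, -5),       # far more likely we just embarrass him (the Farmville bit)
        ],
        "leave_him_alone": [
            (0.001, -10_000),  # rare catastrophe we failed to prevent
            (0.999, 0),        # nothing happens; his privacy stays intact
        ],
    }

    def expected_utility(outcomes):
        """Sum of probability-weighted utilities: the Millian 'measure it out' step."""
        return sum(p * u for p, u in outcomes)

    for name, outcomes in options.items():
        print(name, expected_utility(outcomes))
    best = max(options, key=lambda name: expected_utility(options[name]))
    print("Utilitarian choice:", best)   # with these invented numbers: search_his_phone

Change the assumed numbers and the recommendation flips, which is part of why this kind of calculation feels decisive to some people and arbitrary to others.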
10:10 But maybe you don't feel that way either. Maybe you think, it's his phone. It's wrong to take his phone because he's a person and he has rights and he has dignity, and we can't just interfere with that. He has autonomy. It doesn't matter what the calculations are. There are things that are intrinsically wrong -- like lying is wrong, like torturing innocent children is wrong. Kant was very good on this point, and he said it a little better than I'll say it. He said we should use our reason to figure out the rules by which we should guide our conduct, and then it is our duty to follow those rules. It's not a matter of calculation.
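By contrast, a Kantian-style check on the same scenario never looks at the numbers at all: it asks only whether the act breaks a rule we have a duty to follow. This is again only an illustrative sketch, and the rule list is an assumption, not anything stated in the talk.

    # Toy deontological check: the expected-utility numbers above are ignored entirely.
    # The forbidden-act list is an assumption chosen for illustration.
    FORBIDDEN_ACTS = {
        "lie",
        "torture_the_innocent",
        "search_property_without_consent",  # covers taking and reading his phone
    }

    def permissible(action):
        """An act that violates a rule is impermissible, whatever its consequences."""
        return action not in FORBIDDEN_ACTS

    print(permissible("search_property_without_consent"))  # False -- leave the phone alone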
10:51 So let's stop. We're right in the thick of it, this philosophical thicket. And this goes on for thousands of years, because these are hard questions, and I've only got 15 minutes. So let's cut to the chase.

11:05 How should we be making our decisions? Is it Plato, is it Aristotle, is it Kant, is it Mill? What should we be doing? What's the answer? What's the formula that we can use in any situation to determine what we should do, whether we should use that guy's data or not? What's the formula?

11:25 There's not a formula. There's not a simple answer. Ethics is hard. Ethics requires thinking. And that's uncomfortable.

11:40 I know; I spent a lot of my career in artificial intelligence, trying to build machines that could do some of this thinking for us, that could give us answers. But they can't. You can't just take human thinking and put it into a machine. We're the ones who have to do it. Happily, we're not machines, and we can do it. Not only can we think, we must.

12:05 Hannah Arendt said, "The sad truth is that most evil done in this world is not done by people who choose to be evil. It arises from not thinking." That's what she called the "banality of evil." And the response to that is that we demand the exercise of thinking from every sane person.
12:29 So let's do that. Let's think. In fact, let's start right now. Every person in this room do this: think of the last time you had a decision to make where you were worried to do the right thing, where you wondered, "What should I be doing?" Bring that to mind, and now reflect on that and say, "How did I come up with that decision? What did I do? Did I follow my gut? Did I have somebody vote on it? Or did I punt to legal?" Or now we have a few more choices. "Did I evaluate what would be the highest pleasure like Mill would? Or like Kant, did I use reason to figure out what was intrinsically right?" Think about it. Really bring it to mind. This is important.

13:09 It is so important we are going to spend 30 seconds of valuable TEDTalk time doing nothing but thinking about this. Are you ready? Go.

13:33 Stop. Good work. What you just did, that's the first step towards taking responsibility for what we should do with all of our power.
13:45 Now the next step -- try this. Go find a friend and explain to them how you made that decision. Not right now. Wait till I finish talking. Do it over lunch. And don't just find another technologist friend; find somebody different than you. Find an artist or a writer -- or, heaven forbid, find a philosopher and talk to them. In fact, find somebody from the humanities. Why? Because they think about problems differently than we do as technologists.

14:13 Just a few days ago, right across the street from here, there were hundreds of people gathered together. It was technologists and humanists at that big BiblioTech Conference. And they gathered together because the technologists wanted to learn what it would be like to think from a humanities perspective. You have someone from Google talking to someone who does comparative literature. You're thinking about the relevance of 17th century French theater -- how does that bear upon venture capital? Well that's interesting. That's a different way of thinking. And when you think in that way, you become more sensitive to the human considerations, which are crucial to making ethical decisions.
14:49 So imagine that right now you went and you found your musician friend. And you're telling him what we're talking about, about our whole data revolution and all this -- maybe even hum a few bars of our theme music. ♫ Dum ta da da dum dum ta da da dum ♫

15:03 Well, your musician friend will stop you and say, "You know, the theme music for your data revolution, that's an opera, that's Wagner. It's based on Norse legend. It's Gods and mythical creatures fighting over magical jewelry."

15:19 That's interesting. Now it's also a beautiful opera, and we're moved by that opera. We're moved because it's about the battle between good and evil, about right and wrong. And we care about right and wrong. We care what happens in that opera. We care what happens in "Apocalypse Now." And we certainly care what happens with our technologies.

15:46 We have so much power today, it is up to us to figure out what to do, and that's the good news. We're the ones writing this opera. This is our movie. We figure out what will happen with this technology. We determine how this will all end.

16:04 Thank you. (Applause)