What would happen if we upload our brains to computers? | Robin Hanson
00:13
Someday, we may have robots as smart as people, artificial intelligence, AI.

00:20
How could that happen?

00:22
One route is that we'll just keep accumulating better software, like we've been doing for 70 years. At past rates of progress, that may take centuries.

00:31
Some say it'll happen a lot faster as we discover grand new powerful theories of intelligence. I'm skeptical.

00:40
But a third scenario is what I'm going to talk about today. The idea is to port the software from the human brain.

00:48
To do this, we're going to need three technologies to be good enough, and none of them are there yet.
00:54
First, we're going to need lots of cheap, fast, parallel computers.

01:00
Second, we're going to need to scan individual human brains in fine spatial and chemical detail, to see exactly what cells are where, connected to what, of what type.

01:11
And third, we're going to need computer models of how each kind of brain cell works -- taking input signals, changing internal state and sending output signals.

01:22
If we have good enough models of all the kinds of brain cells and a good enough model of the brain, we can put it together to make a good enough model of an entire brain, and that model would have the same input-output behavior as the original.
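To make the cell-model idea concrete, here is a minimal sketch in Python, my own illustration rather than anything from the talk: each cell is a unit that takes input signals, updates internal state, and emits output signals, and a scanned wiring map decides which outputs feed which inputs. All class and function names, and the toy dynamics, are hypothetical.

```python
# Minimal sketch of the emulation idea described above: per-cell-type models
# plus a scanned connection map. Names and dynamics are illustrative only.

class CellModel:
    """A brain cell as a unit with inputs, internal state, and outputs."""
    def __init__(self, cell_type: str):
        self.cell_type = cell_type
        self.state = 0.0  # stand-in for whatever internal variables a real model needs

    def step(self, input_signal: float) -> float:
        # Toy dynamics: accumulate input with decay, then pass on an output signal.
        self.state = 0.9 * self.state + input_signal
        return self.state

def run_emulation(cells: dict, connections: list, external_input: dict, steps: int):
    """connections: (source_id, target_id) pairs taken from the brain scan."""
    outputs = {cid: 0.0 for cid in cells}
    for _ in range(steps):
        # Each cell receives the summed outputs of the cells wired into it.
        incoming = {cid: external_input.get(cid, 0.0) for cid in cells}
        for src, dst in connections:
            incoming[dst] += outputs[src]
        outputs = {cid: cell.step(incoming[cid]) for cid, cell in cells.items()}
    return outputs

# Usage: two cells wired in a loop, driven by one external signal.
cells = {"a": CellModel("pyramidal"), "b": CellModel("interneuron")}
print(run_emulation(cells, [("a", "b"), ("b", "a")], {"a": 1.0}, steps=10))
```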
01:35
So if you talk to it, it might talk back. If you ask it to do things, it might do them. And if we could do that, everything would change.

01:43
People have been talking about this idea for decades, under the name of "uploads." I'm going to call them "ems."

01:50
When they talk about it, they say, "Is this even possible? If you made one, would it be conscious? Or is it just an empty machine? If you made one of me, is that me or someone else?"

02:00
These are all fascinating questions that I'm going to ignore ...

02:03
(Laughter)

02:06
because I see a neglected question: What would actually happen?

02:12
I became obsessed with this question. I spent four years trying to analyze it, using standard academic tools, to guess what would happen, and I'm here to tell you what I found.

02:24
But be warned -- I'm not offering inspiration, I'm offering analysis. I see my job as telling you what's most likely to happen if we did the least to avoid it.

02:35
If you aren't at least a bit disturbed by something I tell you here, you're just not paying attention.

02:40
(Laughter)
02:42
OK, the first thing I can tell you is that ems spend most of their life in virtual reality.

02:48
This is what you might look like if you were using virtual reality.

02:53
And this is what you might see: sunlight glinting off of water, you might hear gulls flying above, you might even feel the wind on your cheeks or smell seawater, with advanced hardware.

03:05
Now, if you were to spend a lot of time here, you might want a dashboard where you could do things like make a phone call, move to a new virtual world, check your bank account.

03:15
Now, while this is what you would look like in virtual reality, this is what an em would look like in virtual reality. It's computer hardware sitting in a server rack somewhere. But still, it could see and experience the same thing.

03:29
But some things are different for ems.

03:31
First, while you'll probably always notice that virtual reality isn't entirely real, to an em, it can feel as real to them as this room feels to you now or as anything ever feels.

03:41
And ems also have some more action possibilities. For example, your mind just always runs at the same speed, but an em can add more or less computer hardware to run faster or slower, and therefore, if the world around them seems to be going too fast, they can just speed up their mind, and the world around them would seem to slow down.

03:58
In addition, an em can make a copy of itself at that moment. This copy would remember everything the same, and if it starts out with the same speed, looking at the same speed, it might even need to be told, "You are the copy."

04:12
An em could make archive copies, and with enough archives, an em can be immortal -- in principle, though not usually in practice.
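As a way to keep these action possibilities straight, here is a minimal sketch, in Python of my own devising, of an em as running state with an adjustable speed, a copy operation, and archive snapshots. The talk does not specify any such interface, so every name here is hypothetical.

```python
# Illustrative toy model of the em actions just described: change speed,
# spawn a copy, and save archive snapshots. Not from the talk itself.
import copy

class Em:
    def __init__(self, memories, speed=1.0):
        self.memories = list(memories)   # everything the em remembers
        self.speed = speed               # multiple of human speed
        self.archives = []               # saved snapshots

    def set_speed(self, speed: float):
        # More hardware -> faster subjective time; less hardware -> slower.
        self.speed = speed

    def subjective_seconds(self, wall_clock_seconds: float) -> float:
        # At 1000x speed, one wall-clock second feels like ~17 minutes.
        return wall_clock_seconds * self.speed

    def make_copy(self) -> "Em":
        # The copy starts with identical memories and speed,
        # so it may need to be told "you are the copy."
        twin = copy.deepcopy(self)
        twin.memories.append("You are the copy.")
        return twin

    def archive(self):
        # With enough archives, an em is immortal in principle.
        self.archives.append(copy.deepcopy(self.memories))

em = Em(memories=["gave a TED talk"], speed=1000.0)
print(em.subjective_seconds(60))   # one wall-clock minute -> 60000 subjective seconds
worker = em.make_copy()
em.archive()
```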
04:21
And an em can move its brain, the computer that represents its brain, from one physical location to another.

04:27
Ems can actually move around the world at the speed of light, and by moving to a new location, they can interact more quickly with ems near that new location.

04:36
So far, I've been talking about what ems can do. What do ems choose to do?

04:44
To understand that, we'll need to understand three key facts.

04:47
First, ems by definition do what the human they emulate would do in the same situation. So their lives and behavior are very human. They're mainly different because they're living in a different world.

05:02
Second, ems need real resources to survive. You need food and shelter or you'll die. Also, ems need computer hardware, energy, cooling, or they can't exist. For every subjective minute that an em experiences, someone, usually that em, had to work to pay for it.

05:19
Third, ems are poor.

05:21
(Laughter)

05:23
The em population can grow quicker than the em economy, so that means wages fall down to em subsistence levels. That means ems have to be working most of the time.

05:32
So that means this is what ems usually see: beautiful and luxurious, but desks -- they're working most of the time.
05:40
Now, a subsistence wage scenario, you might think, is exotic and strange, but it's actually the usual case in human history, and it's how pretty much all wild animals have ever lived, so we know what humans do in this situation.

05:51
Humans basically do what it takes to survive, and this is what lets me say so much about the em world.

05:58
When creatures are rich, like you, you have to know a lot about what they want to figure out what they do. When creatures are poor, you know that they mostly do what it takes to survive.

06:08
So we've been talking about the em world from the point of view of the ems -- now, let's step back and look at their whole world.

06:14
First, the em world grows much faster than ours, roughly a hundred times faster. So the amount of change we would experience in a century or two, they would experience in a year or two. And I'm not really willing to project this age much beyond that, because plausibly by then something else will happen, I don't know what.

06:31
Second, the typical emulation runs even faster, roughly a thousand times human speed. So for them, they experience thousands of years in this year or two, and for them, the world around them is actually changing more slowly than your world seems to change for you.
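The two speedup figures compose in a simple way; a quick back-of-the-envelope check, using only the numbers given in the talk and my own arithmetic:

```python
# Back-of-the-envelope check of the figures above (talk's numbers, my arithmetic).
economy_speedup = 100      # em economy grows ~100x faster than ours
mind_speedup = 1_000       # a typical em runs ~1000x human speed
era_years = 2              # the em era lasts roughly a year or two (take two)

change_equivalent = economy_speedup * era_years   # ~200 years of our kind of change
subjective_years = mind_speedup * era_years       # ~2000 subjective years for a fast em
print(change_equivalent, subjective_years)        # 200 2000
```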
06:48
Third, ems are crammed together in a small number of very dense cities. This is not only how they see themselves in virtual reality, it's also how they actually are physically crammed together.

06:59
So at em speeds, physical travel feels really painfully slow, so most em cities are self-sufficient, most war is cyber war, and most of the rest of the earth away from the em cities is left to the humans, because the ems really aren't that interested in it.

07:14
Speaking of humans, you were wanting to hear about that. Humans must retire, at once, for good. They just can't compete.

07:24
Now, humans start out owning all of the capital in this world. The economy grows very fast, their wealth grows very fast. Humans get rich, collectively.

07:34
As you may know, most humans today don't actually own that much besides their ability to work, so between now and then, they need to acquire sufficient assets, insurance or sharing arrangements, or they may starve.

07:46
I highly recommend avoiding this outcome.

07:48
(Laughter)

07:50
Now, you might wonder, why would ems let humans exist? Why not kill them, take their stuff?

07:55
But notice we have many unproductive retirees around us today, and we don't kill them and take their stuff.

08:00
(Laughter)

08:02
In part, that's because it would disrupt the institutions we share with them. Other groups would wonder who's next, so plausibly, ems may well let humans retire in peace during the age of em.

08:14
You should worry more that the age of em only lasts a year or two and you don't know what happens next.
08:21
Ems are very much like humans, but they are not like the typical human.

08:26
The typical em is a copy of the few hundred most productive humans. So in fact, they are as elite, compared to the typical human, as the typical billionaire, Nobel Prize winner, Olympic gold medalist, head of state.

08:42
Ems look on humans perhaps with nostalgia and gratitude, but not so much respect, which is, if you think about it, how you think about your ancestors.

08:51
We know many things about how humans differ in terms of productivity. We can just use those to predict features of ems -- for example, they tend to be smart, conscientious, hard-working, married, religious, middle-aged. These are features of ems.

09:03
Em world also contains enormous variety. Not only does it continue on with most of the kinds of variety that humans do, including variety of industry and profession, they also have many new kinds of variety, and one of the most important is mind speed.

09:17
Ems can plausibly go from human speed up to a million times faster than human speed, and down to a billion times slower than human speed.

09:28
Faster ems tend to have markers of high status. They embody more wealth. They win arguments. They sit at premium locations.

09:34
Slower ems are mostly retirees, and they are like the ghosts of our literature. If you recall, ghosts are all around us -- you can interact with them if you pay the price. But they don't know much, they can't influence much, and they're obsessed with the past, so what's the point?

09:48
(Laughter)
09:49
Ems also have more variety in the structure of their lives. This is your life: you start and you end, really simple.

09:55
This is the life of an em, who every day splits off some short-term copies to do short-term tasks and then end.

10:01
We'll talk more about those short-term versions in a moment, but they are much more efficient because they don't have to rest for the next day.

10:09
This em is more opportunistic. They make more copies of themselves when there's more demand for that. They don't know which way the future's going.

10:16
This is an em designer, who conceives of a large system and then breaks recursively into copies who elaborate that, so ems can implement larger, more coherent designs.

10:26
This is an emulation plumber who remembers that every day, for the last 20 years, they only ever worked two hours a day, a life of leisure. But what really happened is, every day they had a thousand copies, each of whom did a two-hour plumbing job, and only one of them went on to the next day.

10:40
Objectively, they're working well over 99 percent of the time. Subjectively, they remember a life of leisure.

10:45
(Laughter)
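A rough way to put numbers on the plumber example: the talk gives only the headline figures, so the split between work and leisure for the one continuing copy is my assumption.

```python
# Rough arithmetic on the plumber example above; the leisure hours of the one
# continuing copy are my assumption, not the talk's.
copies = 1000                 # two-hour copies spawned each day
work_hours = copies * 2       # total experienced working hours per day
leisure_hours = 22            # at most, for the single copy that goes on

share = work_hours / (work_hours + leisure_hours)
print(f"{share:.1%}")  # ~98.9% here; it climbs toward 100% with more copies,
                       # or if the continuing copy idles at reduced speed
```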
10:46
This, again, is you. You start and you end. This could be you if, at the start of a party, you took a drug that meant you would not remember that party ever after that day. Some people do this, I'm told.

10:57
Toward the end of the party, will you say to yourself, "I'm about to die, this is terrible. That person tomorrow isn't me, because they won't remember what I do." Or you could say, "I will go on tomorrow. I just won't remember what I did."

11:09
This is an em who splits off a short-term copy to do a short-term task and then end. They have the same two attitude possibilities. They can say, "I'm a new short-term creature with a short life. I hate this." Or "I'm a part of a larger creature who won't remember this part."

11:22
I predict they'll have that second attitude, not because it's philosophically correct, but because it helps them get along.

11:28
Today, if the president says we must invade Iraq, and you say, "Why?" and they say, "State secret," you're not sure if you can trust them, but for ems, a copy of the president and a copy of you can go inside a safe, explain all their secret reasons, and then one bit comes out from your copy to yourself, telling you if you were convinced.

11:45
So now you can know there is a good reason.
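The "safe" here behaves like a one-bit disclosure protocol: two copies deliberate in an isolated environment, everything inside is then discarded, and only a single yes/no answer leaves. A minimal sketch under those assumptions; every function and class name is my own invention, not anything specified in the talk.

```python
# Toy sketch of the one-bit "safe" described above. The isolation here is only
# notional (local values that are discarded); real ems would need hardware
# guarantees that nothing but the single bit ever leaves.

def inside_the_safe(president_copy, your_copy, secret_reasons: str) -> bool:
    """Run the private deliberation and return exactly one bit."""
    convinced = your_copy.evaluate(secret_reasons)   # full secrets seen only in here
    # president_copy and your_copy end when this function returns;
    # no transcript and no memories escape, just the verdict.
    return bool(convinced)

class EmCopy:
    def __init__(self, standards):
        self.standards = standards   # what this copy counts as a good reason

    def evaluate(self, reasons: str) -> bool:
        # Placeholder judgment: the original you never sees 'reasons',
        # only whether your copy was convinced by them.
        return "credible threat" in reasons

you = EmCopy(standards="needs concrete evidence")
president = EmCopy(standards="knows the classified briefing")
verdict = inside_the_safe(president, you, secret_reasons="credible threat, sources classified")
print(verdict)   # True or False: the one bit that comes back out to you
```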
11:47
I know you guys are all eager to evaluate this world. You're eager to decide if you love it or hate it.

11:52
But think: your ancestors from thousands of years ago would have loved or hated your world based on the first few things they heard about it, because your world is really just weird.

12:01
So before judging a strange future world, you should really learn a lot about it, maybe read a whole book about it, and then, if you don't like it, work to change it.

12:09
Thank you.

12:10
(Applause)