Susan Blackmore: Memes and "temes"

156,625 views ・ 2008-06-04

TED



Translator: Manlai YOU　Reviewer: Chun-wen Chen
Cultural evolution is a dangerous child for any species to let loose on its planet. By the time you realize what's happening, the child is a toddler, up and causing havoc, and it's too late to put it back. We humans are Earth's Pandoran species. We're the ones who let the second replicator out of its box, and we can't push it back in. We're seeing the consequences all around us.

Now that, I suggest, is the view that comes out of taking memetics seriously. And it gives us a new way of thinking about not only what's going on on our planet, but what might be going on elsewhere in the cosmos. So first of all, I'd like to say something about memetics and the theory of memes, and secondly, how this might answer questions about who's out there, if indeed anyone is.

So, memetics: memetics is founded on the principle of Universal Darwinism. Darwin had this amazing idea. Indeed, some people say it's the best idea anybody ever had. Isn't that a wonderful thought, that there could be such a thing as a best idea anybody ever had? Do you think there could?

Audience: No.

(Laughter)

Susan Blackmore: Someone says no, very loudly, from over there. Well, I say yes, and if there is, I give the prize to Darwin. Why? Because the idea was so simple, and yet it explains all design in the universe. I would say not just biological design, but all of the design that we think of as human design. It's all just the same thing happening.

What did Darwin say? I know you know the idea, natural selection, but let me just paraphrase "The Origin of Species," 1859, in a few sentences. What Darwin said was something like this: if you have creatures that vary, and that can't be doubted -- I've been to the Galapagos, and I've measured the size of the beaks and the size of the turtle shells and so on, and so on. And 100 pages later.

(Laughter)

And if there is a struggle for life, such that nearly all of these creatures die -- and this can't be doubted, I've read Malthus and I've calculated how long it would take for elephants to cover the whole world if they bred unrestricted, and so on and so on. And another 100 pages later. And if the very few that survive pass onto their offspring whatever it was that helped them survive, then those offspring must be better adapted to the circumstances in which all this happened than their parents were.

You see the idea? If, if, if, then. He had no concept of the idea of an algorithm, but that's what he described in that book, and this is what we now know as the evolutionary algorithm. The principle is you just need those three things -- variation, selection and heredity. And as Dan Dennett puts it, if you have those, then you must get evolution. Or design out of chaos, without the aid of mind.
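That "if, if, if, then" structure is literally executable. As a rough illustration (not anything from the talk itself), here is a minimal evolutionary-algorithm sketch in Python with the three ingredients Blackmore names: variation, selection, and heredity. The toy fitness function and all parameter values are invented for illustration.

```python
import random

def evolve(population, fitness, generations=100, mutation_rate=0.1):
    """Minimal evolutionary algorithm: variation, selection, heredity."""
    for _ in range(generations):
        # Selection: the fitter half survives ("nearly all of these creatures die").
        survivors = sorted(population, key=fitness, reverse=True)[:len(population) // 2]
        # Heredity with variation: each offspring copies a survivor,
        # with an occasional random mutation per gene.
        offspring = [
            [gene + random.gauss(0, 1) if random.random() < mutation_rate else gene
             for gene in parent]
            for parent in survivors
        ]
        population = survivors + offspring
    return max(population, key=fitness)

# Toy problem, invented for illustration: evolve five numbers toward 10.
fitness = lambda individual: -sum(abs(gene - 10.0) for gene in individual)
population = [[random.uniform(0, 1) for _ in range(5)] for _ in range(20)]
best = evolve(population, fitness)
```

No designer, plan, or foresight appears anywhere in the loop; design-like results fall out of copying with variation and selection alone.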
There's one word I love on that slide. What do you think my favorite word is?

Audience: Chaos.

SB: Chaos? No. What? Mind? No.

Audience: Without.

SB: No, not without.

(Laughter)

You try them all in order: Mmm...?

Audience: Must.

SB: Must, at must. Must, must. This is what makes it so amazing. You don't need a designer, or a plan, or foresight, or anything else. If there's something that is copied with variation and it's selected, then you must get design appearing out of nowhere. You can't stop it. Must is my favorite word there.

Now, what's this to do with memes? Well, the principle here applies to anything that is copied with variation and selection. We're so used to thinking in terms of biology, we think about genes this way. Darwin didn't, of course; he didn't know about genes. He talked mostly about animals and plants, but also about languages evolving and becoming extinct. But the principle of Universal Darwinism is that any information that is varied and selected will produce design.

And this is what Richard Dawkins was on about in his 1976 bestseller, "The Selfish Gene." The information that is copied, he called the replicator. It selfishly copies. Not meaning it kind of sits around inside cells going, "I want to get copied." But that it will get copied if it can, regardless of the consequences. It doesn't care about the consequences because it can't, because it's just information being copied.

And he wanted to get away from everybody thinking all the time about genes, and so he said, "Is there another replicator out there on the planet?" Ah, yes, there is. Look around you -- here will do, in this room. All around us, still clumsily drifting about in its primeval soup of culture, is another replicator. Information that we copy from person to person, by imitation, by language, by talking, by telling stories, by wearing clothes, by doing things. This is information copied with variation and selection. This is design process going on.

He wanted a name for the new replicator. So, he took the Greek word "mimeme," which means that which is imitated. Remember that, that's the core definition: that which is imitated. And abbreviated it to meme, just because it sounds good and made a good meme, an effective spreading meme. So that's how the idea came about.

It's important to stick with that definition. The whole science of memetics is much maligned, much misunderstood, much feared. But a lot of these problems can be avoided by remembering the definition. A meme is not equivalent to an idea. It's not an idea. It's not equivalent to anything else, really. Stick with the definition. It's that which is imitated, or information which is copied from person to person.

So, let's see some memes. Well, you sir, you've got those glasses hung around your neck in that particularly fetching way. I wonder whether you invented that idea for yourself, or copied it from someone else? If you copied it from someone else, it's a meme. And what about, oh, I can't see any interesting memes here. All right everyone, who's got some interesting memes for me? Oh, well, your earrings, I don't suppose you invented the idea of earrings. You probably went out and bought them. There are plenty more in the shops. That's something that's passed on from person to person. All the stories that we're telling -- well, of course, TED is a great meme-fest, masses of memes.

The way to think about memes, though, is to think, why do they spread? They're selfish information, they will get copied, if they can. But some of them will be copied because they're good, or true, or useful, or beautiful. Some of them will be copied even though they're not. Some, it's quite hard to tell why.

There's one particular curious meme which I rather enjoy. And I'm glad to say, as I expected, I found it when I came here, and I'm sure all of you found it, too. You go to your nice, posh, international hotel somewhere, and you come in and you put down your clothes and you go to the bathroom, and what do you see?

Audience: Bathroom soap.

SB: Pardon?

Audience: Soap.

SB: Soap, yeah. What else do you see?

Audience: (Inaudible)

SB: Mmm mmm.

Audience: Sink, toilet!

SB: Sink, toilet, yes, these are all memes, they're all memes, but they're sort of useful ones, and then there's this one.

(Laughter)

What is this one doing?

(Laughter)

This has spread all over the world. It's not surprising that you all found it when you arrived in your bathrooms here. But I took this photograph in a toilet at the back of a tent in the eco-camp in the jungle in Assam.

(Laughter)

Who folded that thing up there, and why?

(Laughter)

Some people get carried away.

(Laughter)

Other people are just lazy and make mistakes. Some hotels exploit the opportunity to put even more memes with a little sticker.

(Laughter)

What is this all about? I suppose it's there to tell you that somebody's cleaned the place, and it's all lovely. And you know, actually, all it tells you is that another person has potentially spread germs from place to place.

(Laughter)

So, think of it this way. Imagine a world full of brains and far more memes than can possibly find homes. The memes are all trying to get copied -- trying, in inverted commas -- i.e., that's the shorthand for, if they can get copied, they will. They're using you and me as their propagating, copying machinery, and we are the meme machines.

Now, why is this important? Why is this useful, or what does it tell us? It gives us a completely new view of human origins and what it means to be human, all conventional theories of cultural evolution, of the origin of humans, and what makes us so different from other species. All other theories explaining the big brain, and language, and tool use and all these things that make us unique, are based upon genes. Language must have been useful for the genes. Tool use must have enhanced our survival, mating and so on. It always comes back, as Richard Dawkins complained all that long time ago, it always comes back to genes. The point of memetics is to say, "Oh no, it doesn't."

There are two replicators now on this planet. From the moment that our ancestors, perhaps two and a half million years ago or so, began imitating, there was a new copying process. Copying with variation and selection. A new replicator was let loose, and it could never be -- right from the start -- it could never be that human beings who let loose this new creature, could just copy the useful, beautiful, true things, and not copy the other things. While their brains were having an advantage from being able to copy -- lighting fires, keeping fires going, new techniques of hunting, these kinds of things -- inevitably they were also copying putting feathers in their hair, or wearing strange clothes, or painting their faces, or whatever.

So, you get an arms race between the genes, which are trying to get the humans to have small economical brains and not waste their time copying all this stuff, and the memes themselves, like the sounds that people made and copied -- in other words, what turned out to be language -- competing to get the brains to get bigger and bigger. So, the big brain, on this theory, is driven by the memes. This is why, in "The Meme Machine," I called it memetic drive. As the memes evolve, as they inevitably must, they drive a bigger brain that is better at copying the memes that are doing the driving. This is why we've ended up with such peculiar brains, that we like religion, and music, and art.

Language is a parasite that we've adapted to, not something that was there originally for our genes, on this view. And like most parasites, it can begin dangerous, but then it coevolves and adapts, and we end up with a symbiotic relationship with this new parasite. And so, from our perspective, we don't realize that that's how it began.

So, this is a view of what humans are. All other species on this planet are gene machines only; they don't imitate at all well, hardly at all. We alone are gene machines and meme machines as well. The memes took a gene machine and turned it into a meme machine.

But that's not all. We have a new kind of memes now. I've been wondering for a long time, since I've been thinking about memes a lot, is there a difference between the memes that we copy -- the words we speak to each other, the gestures we copy, the human things -- and all these technological things around us? I have always, until now, called them all memes, but I do honestly think now we need a new word for technological memes. Let's call them techno-memes or temes. Because the processes are getting different.

We began, perhaps 5,000 years ago, with writing. We put the storage of memes out there on a clay tablet, but in order to get true temes and true teme machines, you need to get the variation, the selection and the copying, all done outside of humans. And we're getting there. We're at this extraordinary point where we're nearly there, that there are machines like that. And indeed, in the short time I've already been at TED, I see we're even closer than I thought we were before. So actually, now the temes are forcing our brains to become more like teme machines. Our children are growing up very quickly learning to read, learning to use machinery. We're going to have all kinds of implants, drugs that force us to stay awake all the time. We'll think we're choosing these things, but the temes are making us do it.

So, we're at this cusp now of having a third replicator on our planet.

Now, what about what else is going on out there in the universe? Is there anyone else out there? People have been asking this question for a long time. We've been asking it here at TED already. In 1961, Frank Drake made his famous equation, but I think he concentrated on the wrong things. It's been very productive, that equation. He wanted to estimate N, the number of communicative civilizations out there in our galaxy, and he included in there the rate of star formation, the rate of planets, but crucially, intelligence.
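For reference, the equation Drake proposed is conventionally written as follows; the factor names below are the standard textbook ones, not Blackmore's wording:

```latex
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
```

where $R_{*}$ is the rate of star formation in the galaxy, $f_{p}$ the fraction of stars with planets, $n_{e}$ the number of habitable planets per planetary system, $f_{l}$, $f_{i}$, and $f_{c}$ the fractions of those that develop life, intelligence, and communication technology, and $L$ the lifetime over which a civilization communicates.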
I think that's the wrong way to think about it. Intelligence appears all over the place, in all kinds of guises. Human intelligence is only one kind of a thing. But what's really important is the replicators you have and the levels of replicators, one feeding on the one before. So, I would suggest that we don't think intelligence, we think replicators. And on that basis, I've suggested a different kind of equation. A very simple equation: N, the same thing, the number of communicative civilizations out there that we might expect in our galaxy. Just start with the number of planets there are in our galaxy. The fraction of those which get a first replicator. The fraction of those that get the second replicator. The fraction of those that get the third replicator.
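The replacement equation she describes is simply a product of the quantities she lists. A minimal sketch (the values in the example call are placeholders for illustration, not estimates from the talk):

```python
def communicative_civilizations(planets, f_first, f_second, f_third):
    """Blackmore's proposed estimate: start with the number of planets
    in the galaxy, then multiply by the fractions that get a first,
    a second, and finally a third replicator."""
    return planets * f_first * f_second * f_third

# Placeholder values, chosen only to show the shape of the calculation.
n = communicative_civilizations(planets=1e11, f_first=1e-3,
                                f_second=1e-4, f_third=1e-2)
```

Each factor conditions on the one before it, which is why, on her argument, every level of replicator an evolving biosphere survives cuts the expected count down further.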
Because it's only the third replicator that's going to reach out -- sending information, sending probes, getting out there, and communicating with anywhere else.

OK, so if we take that equation, why haven't we heard from anybody out there? Because every step is dangerous. Getting a new replicator is dangerous. You can pull through, we have pulled through, but it's dangerous.

Take the first step, as soon as life appeared on this earth. We may take the Gaian view. I loved Peter Ward's talk yesterday -- it's not Gaian all the time. Actually, life forms produce things that kill themselves. Well, we did pull through on this planet. But then, a long time later, billions of years later, we got the second replicator, the memes. That was dangerous, all right. Think of the big brain. How many mothers do we have here? You know all about big brains. They are dangerous to give birth to, are agonizing to give birth to.

(Laughter)

My cat gave birth to four kittens, purring all the time. Ah, mm -- slightly different.

(Laughter)

But not only is it painful, it kills lots of babies, it kills lots of mothers, and it's very expensive to produce. The genes are forced into producing all this myelin, all the fat to myelinate the brain. Do you know, sitting here, your brain is using about 20 percent of your body's energy output for two percent of your body weight? It's a really expensive organ to run. Why? Because it's producing the memes.

Now, it could have killed us off. It could have killed us off, and maybe it nearly did, but you see, we don't know. But maybe it nearly did. Has it been tried before? What about all those other species? Louise Leakey talked yesterday about how we're the only one in this branch left. What happened to the others? Could it be that this experiment in imitation, this experiment in a second replicator, is dangerous enough to kill people off?

Well, we did pull through, and we adapted. But now, we're hitting, as I've just described, we're hitting the third replicator point. And this is even more dangerous -- well, it's dangerous again. Why? Because the temes are selfish replicators and they don't care about us, or our planet, or anything else. They're just information, why would they? They are using us to suck up the planet's resources to produce more computers, and more of all these amazing things we're hearing about here at TED. Don't think, "Oh, we created the Internet for our own benefit." That's how it seems to us. Think, temes spreading because they must. We are the old machines.

Now, are we going to pull through? What's going to happen? What does it mean to pull through? Well, there are kind of two ways of pulling through. One that is obviously happening all around us now, is that the temes turn us into teme machines, with these implants, with the drugs, with us merging with the technology. And why would they do that? Because we are self-replicating. We have babies. We make new ones, and so it's convenient to piggyback on us, because we're not yet at the stage on this planet where the other option is viable. Although it's closer, I heard this morning, it's closer than I thought it was. Where the teme machines themselves will replicate themselves. That way, it wouldn't matter if the planet's climate was utterly destabilized, and it was no longer possible for humans to live here. Because those teme machines, they wouldn't need -- they're not squishy, wet, oxygen-breathing, warmth-requiring creatures. They could carry on without us.

So, those are the two possibilities. The second, I don't think we're that close. It's coming, but we're not there yet. The first, it's coming too. But the damage that is already being done to the planet is showing us how dangerous the third point is, that third danger point, getting a third replicator. And will we get through this third danger point, like we got through the second and like we got through the first? Maybe we will, maybe we won't. I have no idea.

(Applause)

Chris Anderson: That was an incredible talk.

SB: Thank you. I scared myself.

(Laughter)