Susan Blackmore: Memes and "temes"

TED · 2008-06-04
00:18
Cultural evolution is a dangerous child for any species to let loose on its planet. By the time you realize what's happening, the child is a toddler, up and causing havoc, and it's too late to put it back. We humans are Earth's Pandoran species. We're the ones who let the second replicator out of its box, and we can't push it back in. We're seeing the consequences all around us. Now that, I suggest, is the view that comes out of taking memetics seriously. And it gives us a new way of thinking about not only what's going on on our planet, but what might be going on elsewhere in the cosmos.
01:01
So first of all, I'd like to say something about memetics and the theory of memes, and secondly, how this might answer questions about who's out there, if indeed anyone is.
01:14
So, memetics: memetics is founded on the principle of Universal Darwinism. Darwin had this amazing idea. Indeed, some people say it's the best idea anybody ever had. Isn't that a wonderful thought, that there could be such a thing as a best idea anybody ever had? Do you think there could?
01:35
Audience: No.

(Laughter)

01:37
Susan Blackmore: Someone says no, very loudly, from over there. Well, I say yes, and if there is, I give the prize to Darwin. Why?
01:45
Because the idea was so simple, and yet it explains all design in the universe. I would say not just biological design, but all of the design that we think of as human design. It's all just the same thing happening.

02:00
What did Darwin say? I know you know the idea, natural selection, but let me just paraphrase "The Origin of Species," 1859, in a few sentences.
02:11
What Darwin said was something like this: if you have creatures that vary, and that can't be doubted -- I've been to the Galapagos, and I've measured the size of the beaks and the size of the turtle shells and so on, and so on. And 100 pages later.

(Laughter)

02:27
And if there is a struggle for life, such that nearly all of these creatures die -- and this can't be doubted, I've read Malthus and I've calculated how long it would take for elephants to cover the whole world if they bred unrestricted, and so on and so on. And another 100 pages later.

02:46
And if the very few that survive pass on to their offspring whatever it was that helped them survive, then those offspring must be better adapted to the circumstances in which all this happened than their parents were.
03:01
You see the idea? If, if, if, then. He had no concept of the idea of an algorithm, but that's what he described in that book, and this is what we now know as the evolutionary algorithm. The principle is you just need those three things -- variation, selection and heredity. And as Dan Dennett puts it, if you have those, then you must get evolution. Or design out of chaos, without the aid of mind.
03:31
There's one word I love on that slide. What do you think my favorite word is?

Audience: Chaos.

SB: Chaos? No. What? Mind? No.

Audience: Without.

SB: No, not without.

(Laughter)

03:42
You try them all in order: Mmm...?

Audience: Must.

SB: Must, at must. Must, must. This is what makes it so amazing. You don't need a designer, or a plan, or foresight, or anything else. If there's something that is copied with variation and it's selected, then you must get design appearing out of nowhere. You can't stop it. Must is my favorite word there.
04:11
Now, what's this to do with memes? Well, the principle here applies to anything that is copied with variation and selection. We're so used to thinking in terms of biology, we think about genes this way. Darwin didn't, of course; he didn't know about genes. He talked mostly about animals and plants, but also about languages evolving and becoming extinct. But the principle of Universal Darwinism is that any information that is varied and selected will produce design.
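That "if, if, if, then" recipe is easy to write out as a program. Below is a minimal sketch in Python, in the spirit of the well-known "weasel" demonstration; the target string, mutation rate, and population sizes are illustrative assumptions, not figures from the talk. Copying (heredity), occasional miscopying (variation), and culling (selection) are the only ingredients, and design-like output still appears.

```python
import random

# Toy environment: fitness is closeness to a fixed target string.
# (Target, alphabet, rates and sizes are all illustrative choices.)
TARGET = "methinks it is like a weasel"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(creature: str) -> int:
    # Selection criterion: how many characters match the target.
    return sum(a == b for a, b in zip(creature, TARGET))

def copy_with_variation(creature: str, rate: float = 0.02) -> str:
    # Heredity plus variation: each character is occasionally miscopied.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in creature)

# Random initial population of "creatures".
population = ["".join(random.choice(ALPHABET) for _ in TARGET)
              for _ in range(100)]

for generation in range(1000):
    population.sort(key=fitness, reverse=True)
    if population[0] == TARGET:
        break
    survivors = population[:20]  # "nearly all of these creatures die"
    # The few that survive pass on imperfect copies of what they are.
    population = [copy_with_variation(random.choice(survivors))
                  for _ in range(100)]

print(generation, max(population, key=fitness))
```

No designer, plan, or foresight appears anywhere in the loop; the target string merely stands in for whatever the environment happens to reward.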
04:40
And this is what Richard Dawkins was on about in his 1976 bestseller, "The Selfish Gene." The information that is copied, he called the replicator. It selfishly copies. Not meaning it kind of sits around inside cells going, "I want to get copied." But that it will get copied if it can, regardless of the consequences. It doesn't care about the consequences because it can't, because it's just information being copied.

05:06
And he wanted to get away from everybody thinking all the time about genes, and so he said, "Is there another replicator out there on the planet?" Ah, yes, there is.
05:15
Look around you -- here will do, in this room. All around us, still clumsily drifting about in its primeval soup of culture, is another replicator. Information that we copy from person to person, by imitation, by language, by talking, by telling stories, by wearing clothes, by doing things. This is information copied with variation and selection. This is design process going on.

05:42
He wanted a name for the new replicator. So, he took the Greek word "mimeme," which means that which is imitated. Remember that, that's the core definition: that which is imitated. And abbreviated it to meme, just because it sounds good and made a good meme, an effective spreading meme. So that's how the idea came about.
06:03
It's important to stick with that definition. The whole science of memetics is much maligned, much misunderstood, much feared. But a lot of these problems can be avoided by remembering the definition. A meme is not equivalent to an idea. It's not an idea. It's not equivalent to anything else, really. Stick with the definition. It's that which is imitated, or information which is copied from person to person.
06:30
So, let's see some memes. Well, you sir, you've got those glasses hung around your neck in that particularly fetching way. I wonder whether you invented that idea for yourself, or copied it from someone else? If you copied it from someone else, it's a meme. And what about, oh, I can't see any interesting memes here. All right everyone, who's got some interesting memes for me? Oh, well, your earrings, I don't suppose you invented the idea of earrings. You probably went out and bought them. There are plenty more in the shops. That's something that's passed on from person to person. All the stories that we're telling -- well, of course, TED is a great meme-fest, masses of memes.
07:06
The way to think about memes, though, is to think, why do they spread? They're selfish information, they will get copied, if they can. But some of them will be copied because they're good, or true, or useful, or beautiful. Some of them will be copied even though they're not. Some, it's quite hard to tell why.
07:24
There's one particular curious meme which I rather enjoy. And I'm glad to say, as I expected, I found it when I came here, and I'm sure all of you found it, too. You go to your nice, posh, international hotel somewhere, and you come in and you put down your clothes and you go to the bathroom, and what do you see?

Audience: Bathroom soap.

SB: Pardon?

Audience: Soap.

SB: Soap, yeah. What else do you see?

Audience: (Inaudible)

SB: Mmm mmm.

Audience: Sink, toilet!

SB: Sink, toilet, yes, these are all memes, they're all memes, but they're sort of useful ones, and then there's this one.

(Laughter)

07:58
What is this one doing?

(Laughter)

08:01
This has spread all over the world. It's not surprising that you all found it when you arrived in your bathrooms here. But I took this photograph in a toilet at the back of a tent in the eco-camp in the jungle in Assam.

(Laughter)

08:16
Who folded that thing up there, and why?

(Laughter)

08:20
Some people get carried away.

(Laughter)

08:26
Other people are just lazy and make mistakes. Some hotels exploit the opportunity to put even more memes with a little sticker.

(Laughter)

08:35
What is this all about? I suppose it's there to tell you that somebody's cleaned the place, and it's all lovely. And you know, actually, all it tells you is that another person has potentially spread germs from place to place.

(Laughter)
08:48
So, think of it this way. Imagine a world full of brains and far more memes than can possibly find homes. The memes are all trying to get copied -- trying, in inverted commas -- i.e., that's the shorthand for, if they can get copied, they will. They're using you and me as their propagating, copying machinery, and we are the meme machines.
09:13
Now, why is this important? Why is this useful, or what does it tell us? It gives us a completely new view of human origins and what it means to be human. All conventional theories of cultural evolution, of the origin of humans, and of what makes us so different from other species -- all the theories explaining the big brain, and language, and tool use and all these things that make us unique -- are based upon genes. Language must have been useful for the genes. Tool use must have enhanced our survival, mating and so on. It always comes back, as Richard Dawkins complained all that long time ago, it always comes back to genes.
09:51
The point of memetics is to say, "Oh no, it doesn't." There are two replicators now on this planet. From the moment that our ancestors, perhaps two and a half million years ago or so, began imitating, there was a new copying process. Copying with variation and selection.
10:09
A new replicator was let loose, and it could never be -- right from the start -- it could never be that human beings who let loose this new creature could just copy the useful, beautiful, true things, and not copy the other things. While their brains were having an advantage from being able to copy -- lighting fires, keeping fires going, new techniques of hunting, these kinds of things -- inevitably they were also copying putting feathers in their hair, or wearing strange clothes, or painting their faces, or whatever.

10:41
So, you get an arms race between the genes, which are trying to get the humans to have small economical brains and not waste their time copying all this stuff, and the memes themselves, like the sounds that people made and copied -- in other words, what turned out to be language -- competing to get the brains to get bigger and bigger.
11:01
So, the big brain, on this theory, is driven by the memes. This is why, in "The Meme Machine," I called it memetic drive. As the memes evolve, as they inevitably must, they drive a bigger brain that is better at copying the memes that are doing the driving. This is why we've ended up with such peculiar brains, that we like religion, and music, and art. Language is a parasite that we've adapted to, not something that was there originally for our genes, on this view. And like most parasites, it can begin dangerous, but then it coevolves and adapts, and we end up with a symbiotic relationship with this new parasite. And so, from our perspective, we don't realize that that's how it began.
11:46
So, this is a view of what humans are. All other species on this planet are gene machines only; they don't imitate at all well, hardly at all. We alone are gene machines and meme machines as well. The memes took a gene machine and turned it into a meme machine. But that's not all.
12:06
We have a new kind of memes now. I've been wondering for a long time, since I've been thinking about memes a lot, is there a difference between the memes that we copy -- the words we speak to each other, the gestures we copy, the human things -- and all these technological things around us? I have always, until now, called them all memes, but I do honestly think now we need a new word for technological memes. Let's call them techno-memes or temes. Because the processes are getting different.
12:37
We began, perhaps 5,000 years ago, with writing. We put the storage of memes out there on a clay tablet, but in order to get true temes and true teme machines, you need to get the variation, the selection and the copying, all done outside of humans. And we're getting there. We're at this extraordinary point where we're nearly there, that there are machines like that. And indeed, in the short time I've already been at TED, I see we're even closer than I thought we were before.

13:05
So actually, now the temes are forcing our brains to become more like teme machines. Our children are growing up very quickly learning to read, learning to use machinery. We're going to have all kinds of implants, drugs that force us to stay awake all the time. We'll think we're choosing these things, but the temes are making us do it. So, we're at this cusp now of having a third replicator on our planet.
13:34
Now, what about what else is going on out there in the universe? Is there anyone else out there? People have been asking this question for a long time. We've been asking it here at TED already. In 1961, Frank Drake made his famous equation, but I think he concentrated on the wrong things. It's been very productive, that equation. He wanted to estimate N, the number of communicative civilizations out there in our galaxy, and he included in there the rate of star formation, the rate of planets, but crucially, intelligence.
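For reference, the equation Drake wrote down is usually given as

$$N = R_{*} \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L$$

where $R_{*}$ is the rate of star formation in the galaxy, $f_p$ the fraction of stars with planets, $n_e$ the number of potentially habitable planets per star with planets, $f_l$ the fraction of those on which life appears, $f_i$ the fraction of those that develop intelligence, $f_c$ the fraction of those that become communicative, and $L$ the lifetime of a communicative civilization.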
14:08
I think that's the wrong way to think about it. Intelligence appears all over the place, in all kinds of guises. Human intelligence is only one kind of a thing. But what's really important is the replicators you have and the levels of replicators, one feeding on the one before. So, I would suggest that we don't think intelligence, we think replicators.
14:31
And on that basis, I've suggested a different kind of equation. A very simple equation. N, the same thing, the number of communicative civilizations out there [that] we might expect in our galaxy. Just start with the number of planets there are in our galaxy. The fraction of those which get a first replicator. The fraction of those that get the second replicator. The fraction of those that get the third replicator. Because it's only the third replicator that's going to reach out -- sending information, sending probes, getting out there, and communicating with anywhere else.
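Written out in symbols, the equation she is describing takes the form

$$N = N_p \cdot f_1 \cdot f_2 \cdot f_3$$

where $N_p$ is the number of planets in our galaxy and $f_1$, $f_2$, $f_3$ are the fractions of those that get a first, a second, and a third replicator; the notation here is a transcription of her wording, not taken from the talk's slides.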
15:06
OK, so if we take that equation, why haven't we heard from anybody out there? Because every step is dangerous. Getting a new replicator is dangerous. You can pull through, we have pulled through, but it's dangerous.
15:25
Take the first step, as soon as life appeared on this earth. We may take the Gaian view. I loved Peter Ward's talk yesterday -- it's not Gaian all the time. Actually, life forms produce things that kill themselves. Well, we did pull through on this planet. But then, a long time later, billions of years later, we got the second replicator, the memes. That was dangerous, all right.
15:46
Think of the big brain. How many mothers do we have here? You know all about big brains. They are dangerous to give birth to, are agonizing to give birth to.

(Laughter)

15:59
My cat gave birth to four kittens, purring all the time. Ah, mm -- slightly different.

(Laughter)

16:05
But not only is it painful, it kills lots of babies, it kills lots of mothers, and it's very expensive to produce. The genes are forced into producing all this myelin, all the fat to myelinate the brain. Do you know, sitting here, your brain is using about 20 percent of your body's energy output for two percent of your body weight? It's a really expensive organ to run. Why? Because it's producing the memes.
16:28
Now, it could have killed us off. It could have killed us off, and maybe it nearly did, but you see, we don't know. But maybe it nearly did. Has it been tried before? What about all those other species? Louise Leakey talked yesterday about how we're the only one in this branch left. What happened to the others? Could it be that this experiment in imitation, this experiment in a second replicator, is dangerous enough to kill people off?
16:54
Well, we did pull through, and we adapted. But now, we're hitting, as I've just described, we're hitting the third replicator point. And this is even more dangerous -- well, it's dangerous again. Why? Because the temes are selfish replicators and they don't care about us, or our planet, or anything else.
17:13
They're just information, why would they? They are using us to suck up the planet's resources to produce more computers, and more of all these amazing things we're hearing about here at TED. Don't think, "Oh, we created the Internet for our own benefit." That's how it seems to us. Think, temes spreading because they must. We are the old machines.
17:36
Now, are we going to pull through? What's going to happen? What does it mean to pull through? Well, there are kind of two ways of pulling through. One that is obviously happening all around us now is that the temes turn us into teme machines, with these implants, with the drugs, with us merging with the technology. And why would they do that? Because we are self-replicating. We have babies. We make new ones, and so it's convenient to piggyback on us, because we're not yet at the stage on this planet where the other option is viable.
18:11
Although it's closer, I heard this morning, it's closer than I thought it was. Where the teme machines themselves will replicate themselves. That way, it wouldn't matter if the planet's climate was utterly destabilized, and it was no longer possible for humans to live here. Because those teme machines, they wouldn't need -- they're not squishy, wet, oxygen-breathing, warmth-requiring creatures. They could carry on without us. So, those are the two possibilities.
18:38
The second, I don't think we're that close. It's coming, but we're not there yet. The first, it's coming too. But the damage that is already being done to the planet is showing us how dangerous the third point is, that third danger point, getting a third replicator. And will we get through this third danger point, like we got through the second and like we got through the first? Maybe we will, maybe we won't. I have no idea.
19:13
(Applause)

19:24
Chris Anderson: That was an incredible talk.

SB: Thank you. I scared myself.

CA: (Laughter)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7