Susan Blackmore: Memes and "temes"

TED, 2008-06-04

00:18
Cultural evolution is a dangerous child for any species to let loose on its planet. By the time you realize what's happening, the child is a toddler, up and causing havoc, and it's too late to put it back. We humans are Earth's Pandoran species. We're the ones who let the second replicator out of its box, and we can't push it back in. We're seeing the consequences all around us.

00:48
Now that, I suggest, is the view that comes out of taking memetics seriously. And it gives us a new way of thinking about not only what's going on on our planet, but what might be going on elsewhere in the cosmos. So first of all, I'd like to say something about memetics and the theory of memes, and secondly, how this might answer questions about who's out there, if indeed anyone is.

01:14
So, memetics: memetics is founded on the principle of Universal Darwinism. Darwin had this amazing idea. Indeed, some people say it's the best idea anybody ever had. Isn't that a wonderful thought, that there could be such a thing as a best idea anybody ever had? Do you think there could?

Audience: No.

(Laughter)

01:37
Susan Blackmore: Someone says no, very loudly, from over there. Well, I say yes, and if there is, I give the prize to Darwin. Why? Because the idea was so simple, and yet it explains all design in the universe. I would say not just biological design, but all of the design that we think of as human design. It's all just the same thing happening.

02:00
What did Darwin say? I know you know the idea, natural selection, but let me just paraphrase "The Origin of Species," 1859, in a few sentences. What Darwin said was something like this: if you have creatures that vary, and that can't be doubted -- I've been to the Galapagos, and I've measured the size of the beaks and the size of the turtle shells and so on, and so on. And 100 pages later.

(Laughter)

02:27
And if there is a struggle for life, such that nearly all of these creatures die -- and this can't be doubted, I've read Malthus and I've calculated how long it would take for elephants to cover the whole world if they bred unrestricted, and so on and so on. And another 100 pages later. And if the very few that survive pass on to their offspring whatever it was that helped them survive, then those offspring must be better adapted to the circumstances in which all this happened than their parents were.

03:01
You see the idea? If, if, if, then. He had no concept of the idea of an algorithm, but that's what he described in that book, and this is what we now know as the evolutionary algorithm. The principle is you just need those three things: variation, selection and heredity. And as Dan Dennett puts it, if you have those, then you must get evolution. Or design out of chaos, without the aid of mind.

03:31
There's one word I love on that slide. What do you think my favorite word is?

Audience: Chaos.

SB: Chaos? No. What? Mind? No.

Audience: Without.

SB: No, not without.

(Laughter)

You try them all in order: Mmm...?

Audience: Must.

SB: Must, at must. Must, must.

03:49
This is what makes it so amazing. You don't need a designer, or a plan, or foresight, or anything else. If there's something that is copied with variation and it's selected, then you must get design appearing out of nowhere. You can't stop it. Must is my favorite word there.
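That three-part recipe (variation, selection, heredity) really is an algorithm, and a few lines of code make the "must" concrete. Here is a minimal sketch in the spirit of Dawkins' "weasel" demonstration; the target string, population size and mutation rate are illustrative choices, not anything from the talk:

```python
import random

# The evolutionary algorithm in miniature: heredity (copying),
# variation (occasional miscopying), selection (keep the best fit).
# The fixed target string stands in for "fit to the environment".
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate: str) -> int:
    # Selection criterion: how many characters match the environment.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.02) -> str:
    # Heredity with variation: copy the parent, occasionally miscopy.
    return "".join(random.choice(ALPHABET) if random.random() < rate else ch
                   for ch in parent)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while fitness(parent) < len(TARGET):
    # Keeping the parent among the offspring guarantees fitness never drops.
    offspring = [parent] + [mutate(parent) for _ in range(99)]
    parent = max(offspring, key=fitness)  # selection among the variants
    generation += 1

print(f"Design, with no designer, after {generation} generations: {parent}")
```

No step in that loop plans ahead or knows where it is going; given copying, variation and selection, improvement is forced, which is exactly the force of Dennett's "must."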
04:11
Now, what's this to do with memes? Well, the principle here applies to anything that is copied with variation and selection. We're so used to thinking in terms of biology, we think about genes this way. Darwin didn't, of course; he didn't know about genes. He talked mostly about animals and plants, but also about languages evolving and becoming extinct. But the principle of Universal Darwinism is that any information that is varied and selected will produce design.

04:40
And this is what Richard Dawkins was on about in his 1976 bestseller, "The Selfish Gene." The information that is copied, he called the replicator. It selfishly copies. Not meaning it kind of sits around inside cells going, "I want to get copied." But that it will get copied if it can, regardless of the consequences. It doesn't care about the consequences because it can't, because it's just information being copied.

05:06
And he wanted to get away from everybody thinking all the time about genes, and so he said, "Is there another replicator out there on the planet?" Ah, yes, there is. Look around you -- here will do, in this room. All around us, still clumsily drifting about in its primeval soup of culture, is another replicator. Information that we copy from person to person, by imitation, by language, by talking, by telling stories, by wearing clothes, by doing things. This is information copied with variation and selection. This is a design process going on.

05:42
He wanted a name for the new replicator. So, he took the Greek word "mimeme," which means that which is imitated. Remember that, that's the core definition: that which is imitated. And he abbreviated it to "meme," just because it sounds good and made a good meme, an effective spreading meme. So that's how the idea came about.

06:03
It's important to stick with that definition. The whole science of memetics is much maligned, much misunderstood, much feared. But a lot of these problems can be avoided by remembering the definition. A meme is not equivalent to an idea. It's not an idea. It's not equivalent to anything else, really. Stick with the definition: it's that which is imitated, or information which is copied from person to person.

06:30
So, let's see some memes. Well, you sir, you've got those glasses hung around your neck in that particularly fetching way. I wonder whether you invented that idea for yourself, or copied it from someone else? If you copied it from someone else, it's a meme. And what about, oh, I can't see any interesting memes here. All right everyone, who's got some interesting memes for me? Oh, well, your earrings -- I don't suppose you invented the idea of earrings. You probably went out and bought them. There are plenty more in the shops. That's something that's passed on from person to person. All the stories that we're telling -- well, of course, TED is a great meme-fest, masses of memes.

07:06
The way to think about memes, though, is to think, why do they spread? They're selfish information; they will get copied, if they can. But some of them will be copied because they're good, or true, or useful, or beautiful. Some of them will be copied even though they're not. Some, it's quite hard to tell why.

07:24
There's one particular curious meme which I rather enjoy. And I'm glad to say, as I expected, I found it when I came here, and I'm sure all of you found it, too. You go to your nice, posh, international hotel somewhere, and you come in and you put down your clothes and you go to the bathroom, and what do you see?

Audience: Bathroom soap.

SB: Pardon?

Audience: Soap.

SB: Soap, yeah. What else do you see?

Audience: (Inaudible)

SB: Mmm mmm.

Audience: Sink, toilet!

SB: Sink, toilet, yes, these are all memes, they're all memes, but they're sort of useful ones, and then there's this one.

(Laughter)

07:58
What is this one doing?

(Laughter)

This has spread all over the world. It's not surprising that you all found it when you arrived in your bathrooms here. But I took this photograph in a toilet at the back of a tent in the eco-camp in the jungle in Assam.

(Laughter)

08:16
Who folded that thing up there, and why? Some people get carried away.

(Laughter)

Other people are just lazy and make mistakes. Some hotels exploit the opportunity to put even more memes with a little sticker.

(Laughter)

08:35
What is this all about? I suppose it's there to tell you that somebody's cleaned the place, and it's all lovely. And you know, actually, all it tells you is that another person has potentially spread germs from place to place.

(Laughter)

08:48
So, think of it this way. Imagine a world full of brains and far more memes than can possibly find homes. The memes are all trying to get copied -- "trying," in inverted commas, i.e., that's the shorthand for: if they can get copied, they will. They're using you and me as their propagating, copying machinery, and we are the meme machines.
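That picture -- more memes than brains to hold them, so only some get passed on -- is itself a selection process, and it can be sketched in code. This is a toy simulation of my own, not a model from the talk; the numbers and the "appeal" score are invented for illustration, and appeal need not mean true or useful, only catchy:

```python
import random

# Toy of the "more memes than homes" picture: brains have finite capacity,
# memes compete for it, and adoption evicts something else.
random.seed(1)
N_BRAINS, CAPACITY, N_MEMES, ENCOUNTERS = 500, 10, 5000, 20000

appeal = {m: random.random() for m in range(N_MEMES)}  # meme id -> catchiness
brains = [random.sample(range(N_MEMES), CAPACITY) for _ in range(N_BRAINS)]

for _ in range(ENCOUNTERS):
    speaker, listener = random.sample(range(N_BRAINS), 2)
    # The speaker voices one of a few memes that come to mind; catchier wins.
    meme = max(random.sample(brains[speaker], 3), key=appeal.get)
    if meme not in brains[listener]:
        # Finite capacity: adopting this meme evicts a random resident meme.
        brains[listener][random.randrange(CAPACITY)] = meme

alive = {m for b in brains for m in b}
print(f"{len(alive)} of {N_MEMES} memes still have a home; the rest died out.")
```

Run it and most memes go extinct while the catchy ones saturate the population, with no one deciding anything: the brains are just the copying machinery.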
09:13
Now, why is this important? Why is this useful, or what does it tell us? It gives us a completely new view of human origins and what it means to be human, of all conventional theories of cultural evolution, of the origin of humans, and of what makes us so different from other species. All other theories explaining the big brain, and language, and tool use, and all these things that make us unique are based upon genes. Language must have been useful for the genes. Tool use must have enhanced our survival, mating and so on. It always comes back, as Richard Dawkins complained all that long time ago, it always comes back to genes. The point of memetics is to say, "Oh no, it doesn't." There are two replicators now on this planet.

09:58
From the moment that our ancestors, perhaps two and a half million years ago or so, began imitating, there was a new copying process: copying with variation and selection. A new replicator was let loose, and it could never be -- right from the start -- it could never be that the human beings who let loose this new creature could just copy the useful, beautiful, true things, and not copy the other things. While their brains were having an advantage from being able to copy -- lighting fires, keeping fires going, new techniques of hunting, these kinds of things -- inevitably they were also copying putting feathers in their hair, or wearing strange clothes, or painting their faces, or whatever.

10:41
So, you get an arms race between the genes, which are trying to get the humans to have small, economical brains and not waste their time copying all this stuff, and the memes themselves, like the sounds that people made and copied -- in other words, what turned out to be language -- competing to get the brains to get bigger and bigger. So, the big brain, on this theory, is driven by the memes. This is why, in "The Meme Machine," I called it memetic drive. As the memes evolve, as they inevitably must, they drive a bigger brain that is better at copying the memes that are doing the driving. This is why we've ended up with such peculiar brains, that we like religion, and music, and art. Language is a parasite that we've adapted to, not something that was there originally for our genes, on this view. And like most parasites, it can start out dangerous, but then it coevolves and adapts, and we end up with a symbiotic relationship with this new parasite. And so, from our perspective, we don't realize that that's how it began.

11:46
So, this is a view of what humans are. All other species on this planet are gene machines only; they don't imitate at all well, hardly at all. We alone are gene machines and meme machines as well. The memes took a gene machine and turned it into a meme machine.

12:04
But that's not all. We have a new kind of memes now. I've been wondering for a long time, since I've been thinking about memes a lot, is there a difference between the memes that we copy -- the words we speak to each other, the gestures we copy, the human things -- and all these technological things around us? I have always, until now, called them all memes, but I do honestly think now we need a new word for technological memes. Let's call them techno-memes, or temes. Because the processes are getting different.

12:37
We began, perhaps 5,000 years ago, with writing. We put the storage of memes out there on a clay tablet. But in order to get true temes and true teme machines, you need to get the variation, the selection and the copying all done outside of humans. And we're getting there. We're at this extraordinary point where we're nearly there, that there are machines like that. And indeed, in the short time I've already been at TED, I see we're even closer than I thought we were before. So actually, now the temes are forcing our brains to become more like teme machines. Our children are growing up very quickly learning to read, learning to use machinery. We're going to have all kinds of implants, drugs that force us to stay awake all the time. We'll think we're choosing these things, but the temes are making us do it. So, we're at this cusp now of having a third replicator on our planet.

13:34
Now, what about what else is going on out there in the universe? Is there anyone else out there? People have been asking this question for a long time. We've been asking it here at TED already. In 1961, Frank Drake made his famous equation, but I think he concentrated on the wrong things. It's been very productive, that equation. He wanted to estimate N, the number of communicative civilizations out there in our galaxy, and he included in there the rate of star formation, the rate of planets, but crucially, intelligence.
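For reference, Drake's equation is usually written as follows; this standard form is background knowledge, not a slide from the talk:

$$N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L$$

where $R_{*}$ is the rate of star formation in the galaxy, $f_{p}$ the fraction of stars with planets, $n_{e}$ the number of life-capable planets per planetary system, $f_{l}$, $f_{i}$ and $f_{c}$ the fractions of those that go on to develop life, intelligence, and detectable communication, and $L$ the lifetime of a communicating civilization. The intelligence-centered factors in the middle are exactly what she questions next.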
14:08
I think that's the wrong way to think about it. Intelligence appears all over the place, in all kinds of guises. Human intelligence is only one kind of thing. But what's really important is the replicators you have, and the levels of replicators, one feeding on the one before. So, I would suggest that we don't think intelligence, we think replicators. And on that basis, I've suggested a different kind of equation. A very simple equation. N, the same thing: the number of communicative civilizations that we might expect in our galaxy. Just start with the number of planets there are in our galaxy. The fraction of those which get a first replicator. The fraction of those that get the second replicator. The fraction of those that get the third replicator. Because it's only the third replicator that's going to reach out -- sending information, sending probes, getting out there, and communicating with anywhere else.
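Spelled out, her replacement equation is a straight product of the planet count and three fractions. The symbols below are my shorthand for the quantities she lists; the slide's own notation isn't preserved in this transcript:

$$N = N_{p} \cdot f_{1} \cdot f_{2} \cdot f_{3}$$

where $N_{p}$ is the number of planets in our galaxy and $f_{1}$, $f_{2}$, $f_{3}$ are the fractions of those that acquire a first, second and third replicator. Each factor multiplies away most of the candidates, which sets up her next point: every transition is a filter.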
15:06
OK, so if we take that equation, why haven't we heard from anybody out there? Because every step is dangerous. Getting a new replicator is dangerous. You can pull through -- we have pulled through -- but it's dangerous. Take the first step, as soon as life appeared on this earth. We may take the Gaian view, but -- I loved Peter Ward's talk yesterday -- it's not Gaian all the time. Actually, life forms produce things that kill themselves. Well, we did pull through on this planet. But then, a long time later, billions of years later, we got the second replicator, the memes. That was dangerous, all right.

15:46
Think of the big brain. How many mothers do we have here? You know all about big brains. They are dangerous to give birth to, agonizing to give birth to.

(Laughter)

15:59
My cat gave birth to four kittens, purring all the time. Ah, mm -- slightly different.

(Laughter)

16:05
But not only is it painful, it kills lots of babies, it kills lots of mothers, and it's very expensive to produce. The genes are forced into producing all this myelin, all the fat to myelinate the brain. Do you know, sitting here, your brain is using about 20 percent of your body's energy output for two percent of your body weight? It's a really expensive organ to run. Why? Because it's producing the memes.
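To put a rough number on "expensive," assume a typical adult budget of about 2,000 kcal per day and a 70 kg body (my assumptions; the talk gives only the percentages):

$$0.20 \times 2000\ \text{kcal/day} = 400\ \text{kcal/day} \approx 19\ \text{W}$$

That is a continuous light-bulb's worth of power for an organ of roughly 1.4 kg, which is the talk's two percent of a 70 kg body.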
16:28
Now, it could have killed us off. It could have killed us off, and maybe it nearly did, but you see, we don't know. But maybe it nearly did. Has it been tried before? What about all those other species? Louise Leakey talked yesterday about how we're the only one in this branch left. What happened to the others? Could it be that this experiment in imitation, this experiment in a second replicator, is dangerous enough to kill people off? Well, we did pull through, and we adapted.

16:56
But now, we're hitting, as I've just described, we're hitting the third replicator point. And this is even more dangerous -- well, it's dangerous again. Why? Because the temes are selfish replicators, and they don't care about us, or our planet, or anything else. They're just information -- why would they? They are using us to suck up the planet's resources to produce more computers, and more of all these amazing things we're hearing about here at TED. Don't think, "Oh, we created the Internet for our own benefit." That's how it seems to us. Think: temes spreading because they must. We are the old machines.

17:36
Now, are we going to pull through? What's going to happen? What does it mean to pull through? Well, there are kind of two ways of pulling through. One, which is obviously happening all around us now, is that the temes turn us into teme machines, with these implants, with the drugs, with us merging with the technology. And why would they do that? Because we are self-replicating. We have babies. We make new ones, and so it's convenient to piggyback on us, because we're not yet at the stage on this planet where the other option is viable. Although it's closer -- I heard this morning, it's closer than I thought it was -- the stage where the teme machines themselves will replicate themselves. That way, it wouldn't matter if the planet's climate was utterly destabilized, and it was no longer possible for humans to live here. Because those teme machines, they wouldn't need -- they're not squishy, wet, oxygen-breathing, warmth-requiring creatures. They could carry on without us.

18:35
So, those are the two possibilities. The second, I don't think we're that close to. It's coming, but we're not there yet. The first, it's coming too. But the damage that is already being done to the planet is showing us how dangerous the third point is, that third danger point, getting a third replicator. And will we get through this third danger point, like we got through the second and like we got through the first? Maybe we will, maybe we won't. I have no idea.

(Applause)

19:24
Chris Anderson: That was an incredible talk.

SB: Thank you. I scared myself.

(Laughter)