Susan Blackmore: Memes and "temes"

156,625 views ・ 2008-06-04

TED


00:18
Cultural evolution is a dangerous child for any species to let loose on its planet. By the time you realize what's happening, the child is a toddler, up and causing havoc, and it's too late to put it back. We humans are Earth's Pandoran species. We're the ones who let the second replicator out of its box, and we can't push it back in. We're seeing the consequences all around us.

00:48
Now that, I suggest, is the view that comes out of taking memetics seriously. And it gives us a new way of thinking about not only what's going on on our planet, but what might be going on elsewhere in the cosmos. So first of all, I'd like to say something about memetics and the theory of memes, and secondly, how this might answer questions about who's out there, if indeed anyone is.

01:14
So, memetics: memetics is founded on the principle of Universal Darwinism. Darwin had this amazing idea. Indeed, some people say it's the best idea anybody ever had. Isn't that a wonderful thought, that there could be such a thing as a best idea anybody ever had? Do you think there could? Audience: No. (Laughter) Susan Blackmore: Someone says no, very loudly, from over there. Well, I say yes, and if there is, I give the prize to Darwin.

01:43
Why? Because the idea was so simple, and yet it explains all design in the universe. I would say not just biological design, but all of the design that we think of as human design. It's all just the same thing happening.

02:00
What did Darwin say? I know you know the idea, natural selection, but let me just paraphrase "The Origin of Species," 1859, in a few sentences. What Darwin said was something like this: if you have creatures that vary, and that can't be doubted -- I've been to the Galapagos, and I've measured the size of the beaks and the size of the turtle shells and so on, and so on. And 100 pages later. (Laughter)

02:27
And if there is a struggle for life, such that nearly all of these creatures die -- and this can't be doubted, I've read Malthus and I've calculated how long it would take for elephants to cover the whole world if they bred unrestricted, and so on and so on. And another 100 pages later. And if the very few that survive pass onto their offspring whatever it was that helped them survive, then those offspring must be better adapted to the circumstances in which all this happened than their parents were.

03:01
You see the idea? If, if, if, then. He had no concept of the idea of an algorithm, but that's what he described in that book, and this is what we now know as the evolutionary algorithm. The principle is you just need those three things -- variation, selection and heredity. And as Dan Dennett puts it, if you have those, then you must get evolution. Or design out of chaos, without the aid of mind.

03:31
There's one word I love on that slide. What do you think my favorite word is? Audience: Chaos. SB: Chaos? No. What? Mind? No. Audience: Without. SB: No, not without. (Laughter) You try them all in order: Mmm...? Audience: Must. SB: Must, at last. Must, must. This is what makes it so amazing. You don't need a designer, or a plan, or foresight, or anything else. If there's something that is copied with variation and it's selected, then you must get design appearing out of nowhere. You can't stop it. Must is my favorite word there.

04:11
Now, what's this to do with memes? Well, the principle here applies to anything that is copied with variation and selection. We're so used to thinking in terms of biology, we think about genes this way. Darwin didn't, of course; he didn't know about genes. He talked mostly about animals and plants, but also about languages evolving and becoming extinct. But the principle of Universal Darwinism is that any information that is varied and selected will produce design.

04:40
And this is what Richard Dawkins was on about in his 1976 bestseller, "The Selfish Gene." The information that is copied, he called the replicator. It selfishly copies. Not meaning it kind of sits around inside cells going, "I want to get copied." But that it will get copied if it can, regardless of the consequences. It doesn't care about the consequences because it can't, because it's just information being copied.

05:06
And he wanted to get away from everybody thinking all the time about genes, and so he said, "Is there another replicator out there on the planet?" Ah, yes, there is. Look around you -- here will do, in this room. All around us, still clumsily drifting about in its primeval soup of culture, is another replicator. Information that we copy from person to person, by imitation, by language, by talking, by telling stories, by wearing clothes, by doing things. This is information copied with variation and selection. This is a design process going on.

05:42
He wanted a name for the new replicator. So, he took the Greek word "mimeme," which means that which is imitated. Remember that, that's the core definition: that which is imitated. And abbreviated it to meme, just because it sounds good and made a good meme, an effective spreading meme. So that's how the idea came about.

06:03
It's important to stick with that definition. The whole science of memetics is much maligned, much misunderstood, much feared. But a lot of these problems can be avoided by remembering the definition. A meme is not equivalent to an idea. It's not an idea. It's not equivalent to anything else, really. Stick with the definition. It's that which is imitated, or information which is copied from person to person.

06:30
So, let's see some memes. Well, you sir, you've got those glasses hung around your neck in that particularly fetching way. I wonder whether you invented that idea for yourself, or copied it from someone else? If you copied it from someone else, it's a meme. And what about, oh, I can't see any interesting memes here. All right everyone, who's got some interesting memes for me? Oh, well, your earrings, I don't suppose you invented the idea of earrings. You probably went out and bought them. There are plenty more in the shops. That's something that's passed on from person to person. All the stories that we're telling -- well, of course, TED is a great meme-fest, masses of memes.

07:06
The way to think about memes, though, is to think, why do they spread? They're selfish information, they will get copied, if they can. But some of them will be copied because they're good, or true, or useful, or beautiful. Some of them will be copied even though they're not. Some, it's quite hard to tell why.

07:24
There's one particular curious meme which I rather enjoy. And I'm glad to say, as I expected, I found it when I came here, and I'm sure all of you found it, too. You go to your nice, posh, international hotel somewhere, and you come in and you put down your clothes and you go to the bathroom, and what do you see?

07:41
Audience: Bathroom soap. SB: Pardon? Audience: Soap. SB: Soap, yeah. What else do you see? Audience: (Inaudible) SB: Mmm mmm. Audience: Sink, toilet! SB: Sink, toilet, yes, these are all memes, they're all memes, but they're sort of useful ones, and then there's this one. (Laughter) What is this one doing? (Laughter) This has spread all over the world. It's not surprising that you all found it when you arrived in your bathrooms here.

08:07
But I took this photograph in a toilet at the back of a tent in the eco-camp in the jungle in Assam. (Laughter) Who folded that thing up there, and why? (Laughter) Some people get carried away. (Laughter) Other people are just lazy and make mistakes. Some hotels exploit the opportunity to put even more memes with a little sticker. (Laughter) What is this all about? I suppose it's there to tell you that somebody's cleaned the place, and it's all lovely. And you know, actually, all it tells you is that another person has potentially spread germs from place to place. (Laughter)

08:48
So, think of it this way. Imagine a world full of brains and far more memes than can possibly find homes. The memes are all trying to get copied -- trying, in inverted commas -- i.e., that's the shorthand for, if they can get copied, they will. They're using you and me as their propagating, copying machinery, and we are the meme machines.

09:13
Now, why is this important? Why is this useful, or what does it tell us? It gives us a completely new view of human origins and what it means to be human, all conventional theories of cultural evolution, of the origin of humans, and what makes us so different from other species. All other theories explaining the big brain, and language, and tool use and all these things that make us unique, are based upon genes.

09:39
Language must have been useful for the genes. Tool use must have enhanced our survival, mating and so on. It always comes back, as Richard Dawkins complained all that long time ago, it always comes back to genes. The point of memetics is to say, "Oh no, it doesn't."

09:55
There are two replicators now on this planet. From the moment that our ancestors, perhaps two and a half million years ago or so, began imitating, there was a new copying process. Copying with variation and selection. A new replicator was let loose, and it could never be -- right from the start -- it could never be that human beings who let loose this new creature, could just copy the useful, beautiful, true things, and not copy the other things.

10:25
While their brains were having an advantage from being able to copy -- lighting fires, keeping fires going, new techniques of hunting, these kinds of things -- inevitably they were also copying putting feathers in their hair, or wearing strange clothes, or painting their faces, or whatever. So, you get an arms race between the genes which are trying to get the humans to have small economical brains and not waste their time copying all this stuff, and the memes themselves, like the sounds that people made and copied -- in other words, what turned out to be language -- competing to get the brains to get bigger and bigger.

11:01
So, the big brain, on this theory, is driven by the memes. This is why, in "The Meme Machine," I called it memetic drive. As the memes evolve, as they inevitably must, they drive a bigger brain that is better at copying the memes that are doing the driving. This is why we've ended up with such peculiar brains, that we like religion, and music, and art.

11:25
Language is a parasite that we've adapted to, not something that was there originally for our genes, on this view. And like most parasites, it can begin dangerous, but then it coevolves and adapts, and we end up with a symbiotic relationship with this new parasite. And so, from our perspective, we don't realize that that's how it began.

11:46
So, this is a view of what humans are. All other species on this planet are gene machines only, they don't imitate at all well, hardly at all. We alone are gene machines and meme machines as well. The memes took a gene machine and turned it into a meme machine. But that's not all.

12:06
We have a new kind of memes now. I've been wondering for a long time, since I've been thinking about memes a lot, is there a difference between the memes that we copy -- the words we speak to each other, the gestures we copy, the human things -- and all these technological things around us? I have always, until now, called them all memes, but I do honestly think now we need a new word for technological memes. Let's call them techno-memes or temes. Because the processes are getting different.

12:37
We began, perhaps 5,000 years ago, with writing. We put the storage of memes out there on a clay tablet, but in order to get true temes and true teme machines, you need to get the variation, the selection and the copying, all done outside of humans. And we're getting there. We're at this extraordinary point where we're nearly there, that there are machines like that. And indeed, in the short time I've already been at TED, I see we're even closer than I thought we were before.

13:05
So actually, now the temes are forcing our brains to become more like teme machines. Our children are growing up very quickly learning to read, learning to use machinery. We're going to have all kinds of implants, drugs that force us to stay awake all the time. We'll think we're choosing these things, but the temes are making us do it. So, we're at this cusp now of having a third replicator on our planet.

13:34
Now, what about what else is going on out there in the universe? Is there anyone else out there? People have been asking this question for a long time. We've been asking it here at TED already.

13:46
In 1961, Frank Drake made his famous equation, but I think he concentrated on the wrong things. It's been very productive, that equation. He wanted to estimate N, the number of communicative civilizations out there in our galaxy, and he included in there the rate of star formation, the rate of planets, but crucially, intelligence.
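
For reference, the Drake equation in its usual textbook form (standard notation, not taken from the talk):

    N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L

where R_{*} is the rate of star formation in the galaxy, f_{p} the fraction of stars with planets, n_{e} the number of potentially habitable planets per star with planets, f_{l} the fraction of those on which life appears, f_{i} the fraction of those that develop intelligence, f_{c} the fraction of those that become communicative, and L the lifetime of a communicative civilization.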

14:08
I think that's the wrong way to think about it. Intelligence appears all over the place, in all kinds of guises. Human intelligence is only one kind of a thing. But what's really important is the replicators you have and the levels of replicators, one feeding on the one before. So, I would suggest that we don't think intelligence, we think replicators.

14:31
And on that basis, I've suggested a different kind of equation. A very simple equation. N, the same thing, the number of communicative civilizations out there [that] we might expect in our galaxy. Just start with the number of planets there are in our galaxy. The fraction of those which get a first replicator. The fraction of those that get the second replicator. The fraction of those that get the third replicator. Because it's only the third replicator that's going to reach out -- sending information, sending probes, getting out there, and communicating with anywhere else.
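
Written in the same style, the replicator-based equation she describes might look like this -- the symbols here are one possible notation for the factors she lists, not notation used in the talk:

    N = N_{p} \cdot f_{1} \cdot f_{2} \cdot f_{3}

where N_{p} is the number of planets in our galaxy, f_{1} the fraction of those that get a first replicator, f_{2} the fraction of those that then get a second replicator, and f_{3} the fraction of those that go on to get a third replicator -- the only level that reaches out and communicates.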

15:06
OK, so if we take that equation, why haven't we heard from anybody out there? Because every step is dangerous. Getting a new replicator is dangerous. You can pull through, we have pulled through, but it's dangerous.

15:25
Take the first step, as soon as life appeared on this earth. We may take the Gaian view. I loved Peter Ward's talk yesterday -- it's not Gaian all the time. Actually, life forms produce things that kill themselves. Well, we did pull through on this planet.

15:39
But then, a long time later, billions of years later, we got the second replicator, the memes. That was dangerous, all right.

15:46
Think of the big brain. How many mothers do we have here? You know all about big brains. They are dangerous to give birth to, are agonizing to give birth to. (Laughter) My cat gave birth to four kittens, purring all the time. Ah, mm -- slightly different. (Laughter) But not only is it painful, it kills lots of babies, it kills lots of mothers, and it's very expensive to produce.

16:12
The genes are forced into producing all this myelin, all the fat to myelinate the brain. Do you know, sitting here, your brain is using about 20 percent of your body's energy output for two percent of your body weight? It's a really expensive organ to run. Why? Because it's producing the memes.

16:28
Now, it could have killed us off. It could have killed us off, and maybe it nearly did, but you see, we don't know. But maybe it nearly did. Has it been tried before? What about all those other species? Louise Leakey talked yesterday about how we're the only one in this branch left. What happened to the others? Could it be that this experiment in imitation, this experiment in a second replicator, is dangerous enough to kill people off?

16:54
Well, we did pull through, and we adapted. But now, we're hitting, as I've just described, we're hitting the third replicator point. And this is even more dangerous -- well, it's dangerous again.

17:06
Why? Because the temes are selfish replicators and they don't care about us, or our planet, or anything else. They're just information, why would they? They are using us to suck up the planet's resources to produce more computers, and more of all these amazing things we're hearing about here at TED. Don't think, "Oh, we created the Internet for our own benefit." That's how it seems to us. Think, temes spreading because they must. We are the old machines.

17:36
Now, are we going to pull through? What's going to happen? What does it mean to pull through? Well, there are kind of two ways of pulling through. One that is obviously happening all around us now, is that the temes turn us into teme machines, with these implants, with the drugs, with us merging with the technology.

17:56
And why would they do that? Because we are self-replicating. We have babies. We make new ones, and so it's convenient to piggyback on us, because we're not yet at the stage on this planet where the other option is viable. Although it's closer, I heard this morning, it's closer than I thought it was. Where the teme machines themselves will replicate themselves.

18:18
That way, it wouldn't matter if the planet's climate was utterly destabilized, and it was no longer possible for humans to live here. Because those teme machines, they wouldn't need -- they're not squishy, wet, oxygen-breathing, warmth-requiring creatures. They could carry on without us.

18:38
So, those are the two possibilities. The second, I don't think we're that close. It's coming, but we're not there yet. The first, it's coming too. But the damage that is already being done to the planet is showing us how dangerous the third point is, that third danger point, getting a third replicator. And will we get through this third danger point, like we got through the second and like we got through the first? Maybe we will, maybe we won't. I have no idea.

19:13
(Applause)

19:24
Chris Anderson: That was an incredible talk. SB: Thank you. I scared myself. CA: (Laughter)