Ray Kurzweil: Get ready for hybrid thinking

521,726 views ・ 2014-06-02

TED



Translator: Marina Maras Reviewer: Ivan Stamenković
00:12
Let me tell you a story. It goes back 200 million years. It's a story of the neocortex, which means "new rind." So in these early mammals, because only mammals have a neocortex, rodent-like creatures, it was the size of a postage stamp and just as thin, and was a thin covering around their walnut-sized brain, but it was capable of a new type of thinking. Rather than the fixed behaviors that non-mammalian animals have, it could invent new behaviors.

00:44
So a mouse is escaping a predator, its path is blocked, it'll try to invent a new solution. That may work, it may not, but if it does, it will remember that and have a new behavior, and that can actually spread virally through the rest of the community. Another mouse watching this could say, "Hey, that was pretty clever, going around that rock," and it could adopt a new behavior as well.

01:06
Non-mammalian animals couldn't do any of those things. They had fixed behaviors. Now they could learn a new behavior, but not in the course of one lifetime. In the course of maybe a thousand lifetimes, it could evolve a new fixed behavior. That was perfectly okay 200 million years ago. The environment changed very slowly. It could take 10,000 years for there to be a significant environmental change, and during that period of time it would evolve a new behavior.

01:33
Now that went along fine, but then something happened. Sixty-five million years ago, there was a sudden, violent change to the environment. We call it the Cretaceous extinction event. That's when the dinosaurs went extinct, that's when 75 percent of the animal and plant species went extinct, and that's when mammals overtook their ecological niche, and to anthropomorphize, biological evolution said, "Hmm, this neocortex is pretty good stuff," and it began to grow it. And mammals got bigger, their brains got bigger at an even faster pace, and the neocortex got bigger even faster than that and developed these distinctive ridges and folds, basically to increase its surface area.

02:19
If you took the human neocortex and stretched it out, it's about the size of a table napkin, and it's still a thin structure. It's about the thickness of a table napkin. But it has so many convolutions and ridges, it's now 80 percent of our brain, and that's where we do our thinking, and it's the great sublimator. We still have that old brain that provides our basic drives and motivations, but I may have a drive for conquest, and that'll be sublimated by the neocortex into writing a poem or inventing an app or giving a TED Talk, and it's really the neocortex that's where the action is.

02:56
Fifty years ago, I wrote a paper describing how I thought the brain worked, and I described it as a series of modules. Each module could do things with a pattern. It could learn a pattern. It could remember a pattern. It could implement a pattern. And these modules were organized in hierarchies, and we created that hierarchy with our own thinking. And there was actually very little to go on 50 years ago. It led me to meet President Johnson. I've been thinking about this for 50 years, and a year and a half ago I came out with the book "How To Create A Mind," which has the same thesis, but now there's a plethora of evidence. The amount of data we're getting about the brain from neuroscience is doubling every year. Spatial resolution of brain scanning of all types is doubling every year. We can now see inside a living brain and see individual interneural connections connecting in real time, firing in real time. We can see your brain create your thoughts. We can see your thoughts create your brain, which is really key to how it works.

03:55
So let me describe briefly how it works. I've actually counted these modules. We have about 300 million of them, and we create them in these hierarchies. I'll give you a simple example. I've got a bunch of modules that can recognize the crossbar to a capital A, and that's all they care about. A beautiful song can play, a pretty girl could walk by, they don't care, but they see a crossbar to a capital A, they get very excited and they say "crossbar," and they put out a high probability on their output axon. That goes to the next level, and these layers are organized in conceptual levels. Each is more abstract than the next one, so the next one might say "capital A." That goes up to a higher level that might say "Apple." Information flows down also. If the apple recognizer has seen A-P-P-L, it'll think to itself, "Hmm, I think an E is probably likely," and it'll send a signal down to all the E recognizers saying, "Be on the lookout for an E, I think one might be coming." The E recognizers will lower their threshold and they see some sloppy thing, could be an E. Ordinarily you wouldn't think so, but we're expecting an E, it's good enough, and yeah, I've seen an E, and then apple says, "Yeah, I've seen an Apple."
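The A-P-P-L-E example above can be sketched as toy code: letter recognizers feed a word recognizer, and the word recognizer primes a lower-level recognizer by lowering its threshold, so sloppy input is accepted once it is expected. This is a minimal illustration of the top-down/bottom-up flow described in the talk, not Kurzweil's actual model; all class names and threshold values here are hypothetical.

```python
# Toy sketch of the talk's hierarchy: bottom-up recognition plus
# top-down expectation ("Be on the lookout for an E").
# All names and numbers are made up for illustration.

class LetterRecognizer:
    def __init__(self, letter, threshold=0.9):
        self.letter = letter
        self.threshold = threshold  # how clean the input must be to fire

    def fire(self, evidence):
        """evidence: 0..1 score that the input looks like this letter."""
        return evidence >= self.threshold

    def prime(self, new_threshold=0.5):
        # Top-down signal: accept sloppier input because one is expected.
        self.threshold = new_threshold


class WordRecognizer:
    def __init__(self, word, letters):
        self.word = word
        self.letters = letters  # dict: letter -> LetterRecognizer

    def read(self, stream):
        """stream: list of (letter, evidence) pairs from the input."""
        seen = ""
        for letter, evidence in stream:
            if self.letters[letter].fire(evidence):
                seen += letter
            # Once A-P-P-L is seen, prime the E recognizer downstream.
            if seen == self.word[:-1]:
                self.letters[self.word[-1]].prime()
        return seen == self.word


letters = {c: LetterRecognizer(c) for c in "APLE"}
apple = WordRecognizer("APPLE", letters)
# The final E is sloppy (0.6): below the default 0.9 threshold, but
# good enough once the word recognizer has primed the E recognizer.
result = apple.read([("A", 0.95), ("P", 0.95), ("P", 0.95),
                     ("L", 0.95), ("E", 0.6)])
print(result)  # True: "Yeah, I've seen an Apple."
```

Without the `prime` call, the sloppy E would be rejected and the word would never complete, which is the point of the bidirectional information flow.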
05:03
Go up another five levels, and you're now at a pretty high level of this hierarchy, and stretch down into the different senses, and you may have a module that sees a certain fabric, hears a certain voice quality, smells a certain perfume, and will say, "My wife has entered the room." Go up another 10 levels, and now you're at a very high level. You're probably in the frontal cortex, and you'll have modules that say, "That was ironic. That's funny. She's pretty." You might think that those are more sophisticated, but actually what's more complicated is the hierarchy beneath them.

05:36
There was a 16-year-old girl, she had brain surgery, and she was conscious because the surgeons wanted to talk to her. You can do that because there's no pain receptors in the brain. And whenever they stimulated particular, very small points on her neocortex, shown here in red, she would laugh. So at first they thought they were triggering some kind of laugh reflex, but no, they quickly realized they had found the points in her neocortex that detect humor, and she just found everything hilarious whenever they stimulated these points. "You guys are so funny just standing around," was the typical comment, and they weren't funny, not while doing surgery.

06:14
So how are we doing today? Well, computers are actually beginning to master human language with techniques that are similar to the neocortex. I actually described the algorithm, which is similar to something called a hierarchical hidden Markov model, something I've worked on since the '90s.
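The talk names hierarchical hidden Markov models without elaborating. As a minimal flavor of the non-hierarchical core of that family, here is the standard forward algorithm, which computes the probability of an observation sequence under an HMM by summing over all hidden-state paths. The two-state weather model below is a common textbook toy; all probabilities are made-up illustration values, not anything from Kurzweil's work.

```python
# Minimal (non-hierarchical) hidden Markov model: the forward algorithm
# computes P(observation sequence) by dynamic programming over hidden states.

def forward(obs, states, start_p, trans_p, emit_p):
    """Return the total probability of the observation sequence."""
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[prev] * trans_p[prev][s] for prev in states)
                    * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

p = forward(("walk", "shop"), states, start_p, trans_p, emit_p)
print(round(p, 4))  # 0.1038
```

A hierarchical HMM stacks such models so that each hidden state can itself emit a sub-sequence generated by a lower-level model, which is the rough analogue of the module hierarchy described above.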
06:36
"Jeopardy" is a very broad natural language game, and Watson got a higher score than the best two players combined. It got this query correct: "A long, tiresome speech delivered by a frothy pie topping," and it quickly responded, "What is a meringue harangue?" And Jennings and the other guy didn't get that. It's a pretty sophisticated example of computers actually understanding human language, and it actually got its knowledge by reading Wikipedia and several other encyclopedias.

07:04
Five to 10 years from now, search engines will actually be based on not just looking for combinations of words and links but actually understanding, reading for understanding the billions of pages on the web and in books. So you'll be walking along, and Google will pop up and say, "You know, Mary, you expressed concern to me a month ago that your glutathione supplement wasn't getting past the blood-brain barrier. Well, new research just came out 13 seconds ago that shows a whole new approach to that and a new way to take glutathione. Let me summarize it for you."

07:38
Twenty years from now, we'll have nanobots, because another exponential trend is the shrinking of technology. They'll go into our brain through the capillaries and basically connect our neocortex to a synthetic neocortex in the cloud, providing an extension of our neocortex. Now today, I mean, you have a computer in your phone, but if you need 10,000 computers for a few seconds to do a complex search, you can access that for a second or two in the cloud. In the 2030s, if you need some extra neocortex, you'll be able to connect to that in the cloud directly from your brain. So I'm walking along and I say, "Oh, there's Chris Anderson. He's coming my way. I'd better think of something clever to say. I've got three seconds. My 300 million modules in my neocortex isn't going to cut it. I need a billion more." I'll be able to access that in the cloud.

08:34
And our thinking, then, will be a hybrid of biological and non-biological thinking, but the non-biological portion is subject to my law of accelerating returns. It will grow exponentially. And remember what happens the last time we expanded our neocortex? That was two million years ago, when we became humanoids and developed these large foreheads. Other primates have a slanted brow. They don't have the frontal cortex. But the frontal cortex is not really qualitatively different. It's a quantitative expansion of neocortex, but that additional quantity of thinking was the enabling factor for us to take a qualitative leap and invent language and art and science and technology and TED conferences. No other species has done that.

09:20
And so, over the next few decades, we're going to do it again. We're going to again expand our neocortex, only this time we won't be limited by a fixed architecture of enclosure. It'll be expanded without limit. That additional quantity will again be the enabling factor for another qualitative leap in culture and technology.

09:42
Thank you very much.

09:44
(Applause)