Alex Wissner-Gross: A new equation for intelligence

198,699 views ・ 2014-02-06

TED



Translator: Miloš Milosavljević Reviewer: Mile Živković
00:12
Intelligence -- what is it? If we take a look back at the history of how intelligence has been viewed, one seminal example has been Edsger Dijkstra's famous quote that "the question of whether a machine can think is about as interesting as the question of whether a submarine can swim." Now, Edsger Dijkstra, when he wrote this, intended it as a criticism of the early pioneers of computer science, like Alan Turing.

00:48
However, if you take a look back and think about what have been the most empowering innovations that enabled us to build artificial machines that swim and artificial machines that [fly], you find that it was only through understanding the underlying physical mechanisms of swimming and flight that we were able to build these machines.

01:13
And so, several years ago, I undertook a program to try to understand the fundamental physical mechanisms underlying intelligence.

01:24
Let's take a step back. Let's first begin with a thought experiment. Pretend that you're an alien race that doesn't know anything about Earth biology or Earth neuroscience or Earth intelligence, but you have amazing telescopes and you're able to watch the Earth, and you have amazingly long lives, so you're able to watch the Earth over millions, even billions of years. And you observe a really strange effect. You observe that, over the course of the millennia, Earth is continually bombarded with asteroids up until a point, and that at some point, corresponding roughly to our year, 2000 AD, asteroids that are on a collision course with the Earth that otherwise would have collided mysteriously get deflected or they detonate before they can hit the Earth.

02:20
Now of course, as earthlings, we know the reason would be that we're trying to save ourselves. We're trying to prevent an impact. But if you're an alien race who doesn't know any of this, doesn't have any concept of Earth intelligence, you'd be forced to put together a physical theory that explains how, up until a certain point in time, asteroids that would demolish the surface of a planet mysteriously stop doing that.

02:49
And so I claim that this is the same question as understanding the physical nature of intelligence.

02:57
So in this program that I undertook several years ago, I looked at a variety of different threads across science, across a variety of disciplines, that were pointing, I think, towards a single, underlying mechanism for intelligence. In cosmology, for example, there have been a variety of different threads of evidence that our universe appears to be finely tuned for the development of intelligence, and, in particular, for the development of universal states that maximize the diversity of possible futures.

03:32
In game play, for example, in Go -- everyone remembers in 1997 when IBM's Deep Blue beat Garry Kasparov at chess -- fewer people are aware that in the past 10 years or so, the game of Go, arguably a much more challenging game because it has a much higher branching factor, has also started to succumb to computer game players for the same reason: the best techniques right now for computers playing Go are techniques that try to maximize future options during game play.
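The option-maximizing idea can be sketched in miniature. The toy example below is my own illustration, not actual computer-Go code (the programs of that era used Monte Carlo tree search): a token on a small grid with a few blocked cells scores each legal move by how many distinct positions remain reachable within a fixed horizon, then picks the move that keeps the most futures open.

```python
from collections import deque

# Toy "maximize future options" move chooser on a 5x5 grid.
SIZE = 5
WALLS = {(1, 1), (1, 2), (1, 3), (2, 3), (3, 3)}  # blocked cells
MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def legal(pos):
    r, c = pos
    return 0 <= r < SIZE and 0 <= c < SIZE and pos not in WALLS

def reachable_within(pos, horizon):
    """Count positions reachable from `pos` in at most `horizon` moves (BFS)."""
    seen, frontier = {pos}, deque([(pos, 0)])
    while frontier:
        p, d = frontier.popleft()
        if d == horizon:
            continue
        for dr, dc in MOVES:
            q = (p[0] + dr, p[1] + dc)
            if legal(q) and q not in seen:
                seen.add(q)
                frontier.append((q, d + 1))
    return len(seen)

def best_move(pos, horizon=4):
    """Pick the legal move whose successor keeps the most futures open."""
    options = [q for q in ((pos[0] + dr, pos[1] + dc) for dr, dc in MOVES)
               if legal(q)]
    return max(options, key=lambda q: reachable_within(q, horizon))

print(best_move((0, 0)))  # -> (1, 0): heading away from the wall cluster
```

Real Go engines evaluate positions far more cleverly, but the selection principle is the same: prefer moves that leave the richest set of continuations.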
04:04
Finally, in robotic motion planning, there have been a variety of recent techniques that have tried to take advantage of abilities of robots to maximize future freedom of action in order to accomplish complex tasks.

04:20
And so, taking all of these different threads and putting them together, I asked, starting several years ago, is there an underlying mechanism for intelligence that we can factor out of all of these different threads? Is there a single equation for intelligence?

04:37
And the answer, I believe, is yes. ["F = T ∇ Sτ"] What you're seeing is probably the closest equivalent to an E = mc² for intelligence that I've seen. So what you're seeing here is a statement of correspondence that intelligence is a force, F, that acts so as to maximize future freedom of action. It acts to maximize future freedom of action, or keep options open, with some strength T, with the diversity of possible accessible futures, S, up to some future time horizon, tau.

05:16
In short, intelligence doesn't like to get trapped. Intelligence tries to maximize future freedom of action and keep options open.
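The equation can be made concrete with a small numerical sketch. This is my own illustration under toy assumptions, not the speaker's Entropica code: take a random walker confined between two walls, use the entropy of its sampled future endpoints as a stand-in for the diversity S of accessible futures over the horizon tau, and estimate F = T ∇S with a finite difference. The resulting force pushes the walker away from the walls, toward the positions with the most open futures.

```python
import math
import random

def endpoint_entropy(x0, steps=25, n_samples=4000, lo=0, hi=20):
    """Entropy of the endpoint distribution of clipped random walks
    started at x0 -- a crude stand-in for the diversity S of accessible
    futures over the horizon `steps`."""
    rng = random.Random(0)  # common random numbers across calls
    counts = {}
    for _ in range(n_samples):
        x = x0
        for _ in range(steps):
            x = min(hi, max(lo, x + rng.choice((-1, 1))))
        counts[x] = counts.get(x, 0) + 1
    return -sum((c / n_samples) * math.log(c / n_samples)
                for c in counts.values())

def entropic_force(x0, T=1.0):
    """F = T * dS/dx, estimated with a central finite difference."""
    return T * (endpoint_entropy(x0 + 1) - endpoint_entropy(x0 - 1)) / 2.0

# Near the left wall the force should point right, toward the open
# interior; near the right wall it should point left.
print(entropic_force(3), entropic_force(17))
```

This only caricatures the idea; the actual Entropica system is not public, so the sampling scheme and the endpoint-entropy estimate here are assumptions made for illustration.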
05:25
And so, given this one equation, it's natural to ask, so what can you do with this? How predictive is it? Does it predict human-level intelligence? Does it predict artificial intelligence? So I'm going to show you now a video that will, I think, demonstrate some of the amazing applications of just this single equation.

05:46
(Video) Narrator: Recent research in cosmology has suggested that universes that produce more disorder, or "entropy," over their lifetimes should tend to have more favorable conditions for the existence of intelligent beings such as ourselves. But what if that tentative cosmological connection between entropy and intelligence hints at a deeper relationship? What if intelligent behavior doesn't just correlate with the production of long-term entropy, but actually emerges directly from it? To find out, we developed a software engine called Entropica, designed to maximize the production of long-term entropy of any system that it finds itself in. Amazingly, Entropica was able to pass multiple animal intelligence tests, play human games, and even earn money trading stocks, all without being instructed to do so. Here are some examples of Entropica in action.

06:34
Just like a human standing upright without falling over, here we see Entropica automatically balancing a pole using a cart. This behavior is remarkable in part because we never gave Entropica a goal. It simply decided on its own to balance the pole. This balancing ability will have applications for humanoid robotics and human assistive technologies.

06:55
Just as some animals can use objects in their environments as tools to reach into narrow spaces, here we see that Entropica, again on its own initiative, was able to move a large disk representing an animal around so as to cause a small disk, representing a tool, to reach into a confined space holding a third disk and release the third disk from its initially fixed position. This tool use ability will have applications for smart manufacturing and agriculture.

07:21
In addition, just as some other animals are able to cooperate by pulling opposite ends of a rope at the same time to release food, here we see that Entropica is able to accomplish a model version of that task. This cooperative ability has interesting implications for economic planning and a variety of other fields.

07:38
Entropica is broadly applicable to a variety of domains. For example, here we see it successfully playing a game of pong against itself, illustrating its potential for gaming. Here we see Entropica orchestrating new connections on a social network where friends are constantly falling out of touch and successfully keeping the network well connected. This same network orchestration ability also has applications in health care, energy, and intelligence.

08:06
Here we see Entropica directing the paths of a fleet of ships, successfully discovering and utilizing the Panama Canal to globally extend its reach from the Atlantic to the Pacific. By the same token, Entropica is broadly applicable to problems in autonomous defense, logistics and transportation.

08:26
Finally, here we see Entropica spontaneously discovering and executing a buy-low, sell-high strategy on a simulated range traded stock, successfully growing assets under management exponentially. This risk management ability will have broad applications in finance and insurance.

08:46
Alex Wissner-Gross: So what you've just seen is that a variety of signature human intelligent cognitive behaviors such as tool use and walking upright and social cooperation all follow from a single equation, which drives a system to maximize its future freedom of action.

09:07
Now, there's a profound irony here. Going back to the beginning of the usage of the term robot, the play "RUR," there was always a concept that if we developed machine intelligence, there would be a cybernetic revolt. The machines would rise up against us. One major consequence of this work is that maybe all of these decades, we've had the whole concept of cybernetic revolt in reverse. It's not that machines first become intelligent and then megalomaniacal and try to take over the world. It's quite the opposite, that the urge to take control of all possible futures is a more fundamental principle than that of intelligence, that general intelligence may in fact emerge directly from this sort of control-grabbing, rather than vice versa.

10:10
Another important consequence is goal seeking. I'm often asked, how does the ability to seek goals follow from this sort of framework? And the answer is, the ability to seek goals will follow directly from this in the following sense: just like you would travel through a tunnel, a bottleneck in your future path space, in order to achieve many other diverse objectives later on, or just like you would invest in a financial security, reducing your short-term liquidity in order to increase your wealth over the long term, goal seeking emerges directly from a long-term drive to increase future freedom of action.
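The tunnel picture can be made concrete with a toy simulation of my own (hypothetical, not from the talk): split a grid into a small room and a large room joined by a single one-cell door, and let an agent greedily step to whichever neighbor leaves the most positions reachable within a fixed horizon. With no goal ever specified, the agent funnels itself through the bottleneck.

```python
from collections import deque

SIZE = 7
DOOR = (3, 2)
# A vertical wall at column 2 with one door: a small room (columns 0-1)
# joined to a large room (columns 3-6) only via DOOR.
WALLS = {(r, 2) for r in range(SIZE)} - {DOOR}
MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def legal(pos):
    r, c = pos
    return 0 <= r < SIZE and 0 <= c < SIZE and pos not in WALLS

def reachable_within(pos, horizon):
    """Count cells reachable from `pos` in at most `horizon` moves (BFS)."""
    seen, frontier = {pos}, deque([(pos, 0)])
    while frontier:
        p, d = frontier.popleft()
        if d == horizon:
            continue
        for dr, dc in MOVES:
            q = (p[0] + dr, p[1] + dc)
            if legal(q) and q not in seen:
                seen.add(q)
                frontier.append((q, d + 1))
    return len(seen)

def greedy_open_futures(start, horizon=5, steps=12):
    """Always step to the legal neighbor with the most open futures."""
    path = [start]
    for _ in range(steps):
        r, c = path[-1]
        options = [q for q in ((r + dr, c + dc) for dr, dc in MOVES)
                   if legal(q)]
        path.append(max(options, key=lambda q: reachable_within(q, horizon)))
    return path

path = greedy_open_futures((0, 0))
# With no goal specified, the agent should pass through the door into
# the large room, where more futures stay open.
print(any(c >= 3 for r, c in path))
```

The door is never rewarded; it is simply the bottleneck through which the largest set of future positions lies, so an option-maximizing agent "seeks" it.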
10:52
Finally, Richard Feynman, famous physicist, once wrote that if human civilization were destroyed and you could pass only a single concept on to our descendants to help them rebuild civilization, that concept should be that all matter around us is made out of tiny elements that attract each other when they're far apart but repel each other when they're close together. My equivalent of that statement to pass on to descendants to help them build artificial intelligences or to help them understand human intelligence, is the following: Intelligence should be viewed as a physical process that tries to maximize future freedom of action and avoid constraints in its own future.

11:37
Thank you very much.

11:38
(Applause)