Don't fear superintelligent AI | Grady Booch

269,003 views ・ 2017-03-13

TED


00:12
When I was a kid, I was the quintessential nerd. I think some of you were, too. (Laughter) And you, sir, who laughed the loudest, you probably still are. (Laughter) I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun. (Laughter) You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea. (Laughter)

00:57
Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life. Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

01:37
I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.

01:59
After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars. Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars. If there's trouble, there's not enough time. And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft. Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team. Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies. (Laughter)
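
[Editor's note: as a rough check of the delay figures above, the Python sketch below computes one-way light travel time. The distances are representative assumptions, not mission data; the actual Earth-Mars distance varies from roughly 55 to 400 million km.]

    # Rough check of the signal-delay figures quoted in the talk.
    # Distances are representative assumptions, not mission data.
    C_KM_PER_S = 299_792.458  # speed of light in vacuum

    distances_km = {
        "Moon (mean)": 384_400,
        "Mars (rough time-average)": 225_000_000,
    }

    for body, km in distances_km.items():
        delay_s = km / C_KM_PER_S
        print(f"{body}: {delay_s:,.0f} s one way ({delay_s / 60:.1f} min)")

    # Moon: ~1 s one way, so Houston can watch over the flight live.
    # Mars: ~750 s (~12.5 min) one way, consistent with the "on average
    # 13 minutes" above -- hence mission control on board Orion.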

03:14
Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.

03:44
The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes. Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.
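
[Editor's note: the "predict their failures and act in advance" question is the classic predictive-maintenance problem. The Python sketch below is a deliberately minimal illustration using a fixed statistical rule over one device's stream; a real system would learn per-device models across millions of streams, and every name and value here is invented for the example.]

    # Minimal sketch of "read in their data streams, predict their failures
    # and act in advance": flag a device whose telemetry drifts far from its
    # own recent history. Purely illustrative, not a production design.
    from collections import deque
    from statistics import mean, stdev

    WINDOW = 50          # samples of recent history to keep
    THRESHOLD = 4.0      # standard deviations that count as anomalous

    history = deque(maxlen=WINDOW)

    def looks_like_impending_failure(reading: float) -> bool:
        """Return True when a reading is a strong outlier vs. recent history."""
        if len(history) >= 10:                     # need a baseline first
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(reading - mu) / sigma > THRESHOLD:
                return True
        history.append(reading)
        return False

    # Simulated telemetry: steady around 40.0, then a spike before failure.
    stream = [40.0 + 0.1 * (i % 5) for i in range(60)] + [55.0]
    for t, value in enumerate(stream):
        if looks_like_impending_failure(value):
            print(f"t={t}: reading {value} is anomalous; schedule maintenance")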

04:37
So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried it would destroy all civil conversation. At a point in time we saw the written word become pervasive, people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.

05:21
So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game -- Well, I would. You would, too. I like flowers. Come on. To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game. If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law but at the same time I am fusing with it the sense of mercy and justice that is part of that law. In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values. To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained.
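
[Editor's note: a minimal sketch of "we don't program them, we teach them." It uses scikit-learn's small iris flower dataset, 150 labeled examples standing in for the "thousands of flowers" in the talk; the labels are exactly the ground truth described above.]

    # Minimal sketch of teaching rather than programming: a classifier learns
    # to recognize flowers from labeled examples instead of hand-written rules.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)   # measurements + ground-truth labels

    # The labeled examples we "show" the system are its ground truth.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)         # teaching, not programming

    print(f"accuracy on unseen flowers: {model.score(X_test, y_test):.2f}")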

06:35
But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial training and subtle training far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.

07:14
Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

08:06
With all due respect to these brilliant minds, I believe that they are fundamentally wrong. Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator" in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us. And in the end -- don't tell Siri this -- we can always unplug them. (Laughter)

09:17
We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction because the rise of computing itself brings to us a number of human and societal issues to which we must now attend. How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

10:01
And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning. Thank you very much. (Applause)