Can a computer write poetry? | Oscar Schwartz

89,886 views ・ 2016-02-10

TED


00:12
I have a question. Can a computer write poetry? This is a provocative question. You think about it for a minute, and you suddenly have a bunch of other questions, like: What is a computer? What is poetry? What is creativity?

00:33
But these are questions that people spend their entire lifetime trying to answer, not in a single TED Talk. So we're going to have to try a different approach. So up here, we have two poems. One of them is written by a human, and the other one's written by a computer. I'm going to ask you to tell me which one's which.

00:53
Have a go:
00:55
Poem 1: Little Fly / Thy summer's play, / My thoughtless hand / Has brush'd away. Am I not / A fly like thee? / Or art not thou / A man like me?

01:02
Poem 2: We can feel / Activist through your life's / morning / Pauses to see, pope I hate the / Non all the night to start a / great otherwise (...)
01:10
Alright, time's up. Hands up if you think Poem 1 was written by a human. OK, most of you. Hands up if you think Poem 2 was written by a human. Very brave of you, because the first one was written by the human poet William Blake. The second one was written by an algorithm that took all the language from my Facebook feed on one day and then regenerated it algorithmically, according to methods that I'll describe a little bit later on.
01:43
So let's try another test. Again, you haven't got ages to read this, so just trust your gut.
01:50
Poem 1: A lion roars and a dog barks. It is interesting / and fascinating that a bird will fly and not / roar or bark. Enthralling stories about animals are in my dreams and I will sing them all if I / am not exhausted or weary.

02:02
Poem 2: Oh! kangaroos, sequins, chocolate sodas! / You are really beautiful! Pearls, / harmonicas, jujubes, aspirins! All / the stuff they've always talked about (...)
02:11
Alright, time's up. So if you think the first poem was written by a human, put your hand up. OK. And if you think the second poem was written by a human, put your hand up. We have, more or less, a 50/50 split here. It was much harder. The answer is, the first poem was generated by an algorithm called Racter, which was created back in the 1970s, and the second poem was written by a guy called Frank O'Hara, who happens to be one of my favorite human poets.

02:44
(Laughter)
02:48
So what we've just done now is a Turing test for poetry. The Turing test was first proposed by this guy, Alan Turing, in 1950, in order to answer the question: can computers think? Alan Turing believed that if a computer was able to have a text-based conversation with a human, with such proficiency that the human couldn't tell whether they were talking to a computer or a human, then the computer could be said to have intelligence.
03:15
So in 2013, my friend Benjamin Laird and I created a Turing test for poetry online. It's called bot or not, and you can go and play it for yourselves. But basically, it's the game we just played. You're presented with a poem, you don't know whether it was written by a human or a computer, and you have to guess. So thousands and thousands of people have taken this test online, so we have results.
03:37
And what are the results? Well, Turing said that if a computer could fool a human 30 percent of the time into thinking it was a human, then it passes the Turing test for intelligence. We have poems in the bot or not database that have fooled 65 percent of human readers into thinking they were written by a human.
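To make that pass criterion concrete: a poem "passes" when the share of readers it fools meets Turing's 30 percent benchmark, which the best bot or not poems clear comfortably at 65 percent. A minimal sketch in Python; the function name and the 65-out-of-100 counts are illustrative, simply reusing the percentages above.

```python
TURING_THRESHOLD = 0.30  # Turing's benchmark: fool judges 30% of the time

def passes_turing_test(fooled: int, total: int) -> bool:
    """True if the share of judges fooled meets Turing's 30% benchmark."""
    return fooled / total >= TURING_THRESHOLD

# Best-performing bot or not poems: roughly 65 of every 100 readers fooled.
print(passes_turing_test(fooled=65, total=100))  # True: 0.65 >= 0.30
```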
03:55
So, I think we have an answer to our question. According to the logic of the Turing test, can a computer write poetry? Well, yes, absolutely it can. But if you're feeling a little bit uncomfortable with this answer, that's OK. If you're having a bunch of gut reactions to it, that's also OK, because this isn't the end of the story.
04:18
Let's play our third and final test. Again, you're going to have to read and tell me which you think is human.
04:25
Poem 1: Red flags the reason for pretty flags. / And ribbons. Ribbons of flags / And wearing material / Reasons for wearing material. (...)

04:33
Poem 2: A wounded deer leaps highest, / I've heard the daffodil / I've heard the flag to-day / I've heard the hunter tell; / 'Tis but the ecstasy of death, / And then the brake is almost done (...)
04:44
OK, time is up. So hands up if you think Poem 1 was written by a human. Hands up if you think Poem 2 was written by a human. Whoa, that's a lot more people. So you'd be surprised to find that Poem 1 was written by the very human poet Gertrude Stein. And Poem 2 was generated by an algorithm called RKCP.
05:11
Now, before we go on, let me describe very quickly and simply how RKCP works. RKCP is an algorithm designed by Ray Kurzweil, who's a director of engineering at Google and a firm believer in artificial intelligence. You give RKCP a source text; it analyzes the source text in order to find out how it uses language, and then it regenerates language that emulates that first text. So in the poem we just saw before, Poem 2, the one that you all thought was human, it was fed a bunch of poems by a poet called Emily Dickinson. It looked at the way she used language, learned the model, and then it regenerated a model according to that same structure.
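The talk describes RKCP only at this level: analyze a source text, then regenerate language that emulates it. RKCP's actual internals aren't given here, so as a rough illustration of that analyze-then-regenerate idea, here is a minimal Markov-chain sketch in Python; the function names, the two-word context window, and the tiny sample source are illustrative choices, not RKCP itself.

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """Analyze the source text: map each `order`-word sequence to the words seen after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])].append(words[i + order])
    return model

def generate(model, order=2, length=20, seed=None):
    """Regenerate language: walk the model from a random observed starting sequence."""
    rng = random.Random(seed)
    out = list(rng.choice(list(model)))        # random starting sequence from the source
    while len(out) < length:
        followers = model.get(tuple(out[-order:]))
        if not followers:                      # sequence never seen mid-text: stop early
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Sample source: a fragment of the Dickinson-styled poem shown earlier in the talk.
source = ("A wounded deer leaps highest, I've heard the daffodil "
          "I've heard the flag to-day I've heard the hunter tell;")
model = build_model(source, order=2)
print(generate(model, order=2, length=15, seed=7))
```

The output simply recombines word sequences found in the source, with no notion of what any of them mean, which is exactly the point made next.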
05:56
But the important thing to know about RKCP is that it doesn't know the meaning of the words it's using. The language is just raw material: it could be Chinese, it could be Swedish, it could be the collected language from your Facebook feed for one day. It's just raw material. And nevertheless, it's able to create a poem that seems more human than Gertrude Stein's poem, and Gertrude Stein is a human.
06:22
So what we've done here is, more or less, a reverse Turing test. So Gertrude Stein, who's a human, is able to write a poem that fools a majority of human judges into thinking that it was written by a computer. Therefore, according to the logic of the reverse Turing test, Gertrude Stein is a computer.

06:45
(Laughter)
06:47
Feeling confused? I think that's fair enough. So far we've had humans that write like humans, we have computers that write like computers, we have computers that write like humans, but we also have, perhaps most confusingly, humans that write like computers. So what do we take from all of this? Do we take it that William Blake is somehow more of a human than Gertrude Stein? Or that Gertrude Stein is more of a computer than William Blake?

07:19
(Laughter)
07:20
These are questions I've been asking myself for around two years now, and I don't have any answers. But what I do have are a bunch of insights about our relationship with technology.
07:32
So my first insight is that, for some reason, we associate poetry with being human. So that when we ask, "Can a computer write poetry?" we're also asking, "What does it mean to be human, and how do we put boundaries around this category? How do we say who or what can be part of this category?" This is an essentially philosophical question, I believe, and it can't be answered with a yes-or-no test, like the Turing test. I also believe that Alan Turing understood this, and that when he devised his test back in 1950, he was doing it as a philosophical provocation.
08:13
So my second insight is that, when we take the Turing test for poetry, we're not really testing the capacity of the computers, because poetry-generating algorithms are pretty simple and have existed, more or less, since the 1950s. What we are doing with the Turing test for poetry, rather, is collecting opinions about what constitutes humanness.
08:40
So, what I've figured out, and we saw this earlier today, is that we say William Blake is more of a human than Gertrude Stein. Of course, this doesn't mean that William Blake was actually more human, or that Gertrude Stein was more of a computer. It simply means that the category of the human is unstable.
09:03
This has led me to understand that the human is not a cold, hard fact. Rather, it is something that's constructed with our opinions, and something that changes over time.
09:16
So my final insight is that the computer, more or less, works like a mirror that reflects any idea of a human that we show it. We show it Emily Dickinson, it gives Emily Dickinson back to us. We show it William Blake, that's what it reflects back to us. We show it Gertrude Stein, what we get back is Gertrude Stein. More than any other bit of technology, the computer is a mirror that reflects any idea of the human we teach it.
09:50
So I'm sure a lot of you have been hearing a lot about artificial intelligence recently. And much of the conversation is: can we build it? Can we build an intelligent computer? Can we build a creative computer? What we seem to be asking over and over is: can we build a human-like computer? But what we've seen just now is that the human is not a scientific fact, that it's an ever-shifting, concatenating idea, and one that changes over time. So when we begin to grapple with the ideas of artificial intelligence in the future, we shouldn't only be asking ourselves, "Can we build it?" but we should also be asking ourselves, "What idea of the human do we want to have reflected back to us?" This is an essentially philosophical idea, and it's one that can't be answered with software alone, but I think requires a moment of species-wide, existential reflection.
10:51
Thank you.

10:52
(Applause)