The Inside Story of ChatGPT’s Astonishing Potential | Greg Brockman | TED

1,798,027 views ・ 2023-04-20

TED


Translator: Strahinja Tomic Reviewer: Milenka Okuka

00:03
We started OpenAI seven years ago
00:06
because we felt like something really interesting was happening in AI
00:10
and we wanted to help steer it in a positive direction.
00:15
It's honestly just really amazing to see
00:17
how far this whole field has come since then.
00:20
And it's really gratifying to hear from people like Raymond
00:24
who are using the technology we are building, and others,
00:26
for so many wonderful things.
00:29
We hear from people who are excited,
00:31
we hear from people who are concerned,
00:33
we hear from people who feel both those emotions at once.
00:36
And honestly, that's how we feel.
00:40
Above all, it feels like we're entering an historic period right now
00:44
where we as a world are going to define a technology
00:48
that will be so important for our society going forward.
00:52
And I believe that we can manage this for good.
00:56
So today, I want to show you the current state of that technology
01:01
and some of the underlying design principles that we hold dear.
01:09
So the first thing I'm going to show you
01:11
is what it's like to build a tool for an AI
01:14
rather than building it for a human.
01:17
So we have a new DALL-E model, which generates images,
01:21
and we are exposing it as an app for ChatGPT to use on your behalf.
01:25
And you can do things like ask, you know,
01:27
suggest a nice post-TED meal and draw a picture of it.
01:35
(Laughter)
01:38
Now you get all of the, sort of, ideation and creative back-and-forth
01:43
and taking care of the details for you that you get out of ChatGPT.
01:47
And here we go, it's not just the idea for the meal,
01:49
but a very, very detailed spread.
01:54
So let's see what we're going to get.
01:56
But ChatGPT doesn't just generate images in this case --
01:59
sorry, it doesn't generate text, it also generates an image.
02:02
And that is something that really expands the power
02:05
of what it can do on your behalf in terms of carrying out your intent.
02:08
And I'll point out, this is all a live demo.
02:10
This is all generated by the AI as we speak.
02:13
So I actually don't even know what we're going to see.
02:16
This looks wonderful.
02:18
(Applause)
02:22
I'm getting hungry just looking at it.
02:24
Now we've extended ChatGPT with other tools too,
02:27
for example, memory.
02:28
You can say "save this for later."
02:33
And the interesting thing about these tools
02:35
is they're very inspectable.
02:36
So you get this little pop up here that says "use the DALL-E app."
02:39
And by the way, this is coming to you, all ChatGPT users, over upcoming months.
02:43
And you can look under the hood and see that what it actually did
02:46
was write a prompt just like a human could.
02:48
And so you sort of have this ability to inspect
02:51
how the machine is using these tools,
02:53
which allows us to provide feedback to them.
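
The "little pop up" and the inspectable prompt are easy to picture in code. Here is a minimal sketch of that pattern, with hypothetical tool names and a toy dispatcher; nothing here is OpenAI's actual plugin API:

```python
# A sketch of the pattern described above: the model emits a tool call as
# plain text, and the harness logs it before executing, so a human can
# inspect every step. All names here are illustrative assumptions.
import json

def use_dalle(prompt: str) -> str:
    # Stand-in for an image model; a real system would call one here.
    return f"<image generated from prompt: {prompt!r}>"

def save_for_later(note: str) -> str:
    MEMORY.append(note)
    return "saved"

MEMORY: list[str] = []
TOOLS = {"dalle": use_dalle, "memory.save": save_for_later}

def dispatch(model_output: str) -> str:
    """Parse a model-emitted action like {"tool": ..., "input": ...}."""
    action = json.loads(model_output)
    tool, arg = action["tool"], action["input"]
    # This log line plays the role of the inspectable pop-up.
    print(f"[inspectable log] model chose {tool} with input {arg!r}")
    return TOOLS[tool](arg)

# The model writes a prompt "just like a human could":
print(dispatch('{"tool": "dalle", "input": "a detailed post-TED meal spread"}'))
print(dispatch('{"tool": "memory.save", "input": "suggested meal saved for later"}'))
```
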
02:55
Now it's saved for later,
02:56
and let me show you what it's like to use that information
02:59
and to integrate with other applications too.
03:02
You can say,
03:04
“Now make a shopping list for the tasty thing
03:10
I was suggesting earlier.”
03:12
And make it a little tricky for the AI.
03:16
"And tweet it out for all the TED viewers out there."
03:20
(Laughter)
03:22
So if you do make this wonderful, wonderful meal,
03:25
I definitely want to know how it tastes.
03:28
But you can see that ChatGPT is selecting all these different tools
03:32
without me having to tell it explicitly which ones to use in any situation.
03:37
And this, I think, shows a new way of thinking about the user interface.
03:40
Like, we are so used to thinking of, well, we have these apps,
03:44
we click between them, we copy/paste between them,
03:47
and usually it's a great experience within an app
03:49
as long as you kind of know the menus and know all the options.
03:52
Yes, I would like you to.
03:53
Yes, please.
03:54
Always good to be polite.
03:56
(Laughter)
04:00
And by having this unified language interface on top of tools,
04:05
the AI is able to sort of take away all those details from you.
04:10
So you don't have to be the one
04:12
who spells out every single sort of little piece
04:14
of what's supposed to happen.
04:16
And as I said, this is a live demo,
04:18
so sometimes the unexpected will happen to us.
04:21
But let's take a look at the Instacart shopping list while we're at it.
04:25
And you can see we sent a list of ingredients to Instacart.
04:29
Here's everything you need.
04:30
And the thing that's really interesting
04:32
is that the traditional UI is still very valuable, right?
04:35
If you look at this,
04:37
you still can click through it and sort of modify the actual quantities.
04:41
And that's something that I think shows
04:43
that they're not going away, traditional UIs.
04:47
It's just we have a new, augmented way to build them.
04:49
And now we have a tweet that's been drafted for our review,
04:52
which is also a very important thing.
04:54
We can click “run,” and there we are, we’re the manager, we’re able to inspect,
04:58
we're able to change the work of the AI if we want to.
05:02
And so after this talk, you will be able to access this yourself.
05:17
And there we go.
05:19
Cool.
05:22
Thank you, everyone.
05:23
(Applause)
05:29
So we’ll cut back to the slides.
05:32
Now, the important thing about how we build this,
05:36
it's not just about building these tools.
05:38
It's about teaching the AI how to use them.
05:41
Like, what do we even want it to do
05:42
when we ask these very high-level questions?
05:45
And to do this, we use an old idea.
05:48
If you go back to Alan Turing's 1950 paper on the Turing test, he says,
05:51
you'll never program an answer to this.
05:53
Instead, you can learn it.
05:55
You could build a machine, like a human child,
05:57
and then teach it through feedback.
05:59
Have a human teacher who provides rewards and punishments
06:02
as it tries things out and does things that are either good or bad.
06:06
And this is exactly how we train ChatGPT.
06:08
It's a two-step process.
06:09
First, we produce what Turing would have called a child machine
06:12
through an unsupervised learning process.
06:14
We just show it the whole world, the whole internet
06:16
and say, “Predict what comes next in text you’ve never seen before.”
06:20
And this process imbues it with all sorts of wonderful skills.
06:23
For example, if you're shown a math problem,
06:25
the only way to actually complete that math problem,
06:27
to say what comes next,
06:29
that green nine up there,
06:30
is to actually solve the math problem.
06:34
But we actually have to do a second step, too,
06:36
which is to teach the AI what to do with those skills.
06:39
And for this, we provide feedback.
06:40
We have the AI try out multiple things, give us multiple suggestions,
06:44
and then a human rates them, says “This one’s better than that one.”
06:47
And this reinforces not just the specific thing that the AI said,
06:50
but very importantly, the whole process that the AI used to produce that answer.
06:54
And this allows it to generalize.
06:55
It allows it to teach, to sort of infer your intent
06:58
and apply it in scenarios that it hasn't seen before,
07:00
that it hasn't received feedback.
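
Mechanically, "this one's better than that one" is a pairwise preference signal. Here is a toy, self-contained sketch of that idea, a linear reward model fit on made-up feature vectors with the pairwise logistic (Bradley-Terry) loss used in RLHF-style training; this is an illustration, not OpenAI's training code:

```python
# Given pairs where a human marked one answer better, fit a scalar
# "reward" per answer by maximizing the pairwise log-likelihood.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))   # stand-in feature vectors for 6 candidate answers
prefs = [(0, 1), (2, 1), (2, 3), (4, 5)]  # (better index, worse index) pairs

w = np.zeros(4)               # linear reward model: r(x) = w @ x
lr = 0.5
for _ in range(200):
    grad = np.zeros_like(w)
    for better, worse in prefs:
        # P(better preferred) = sigmoid(r_better - r_worse)
        margin = X[better] @ w - X[worse] @ w
        p = 1.0 / (1.0 + np.exp(-margin))
        grad += (1.0 - p) * (X[better] - X[worse])  # ascend log-likelihood
    w += lr * grad / len(prefs)

print("learned rewards:", np.round(X @ w, 2))
# The reward model then scores new answers it never received feedback on,
# which is what lets the feedback generalize.
```
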
07:02
Now, sometimes the things we have to teach the AI
07:05
are not what you'd expect.
07:06
For example, when we first showed GPT-4 to Khan Academy,
07:09
they said, "Wow, this is so great,
07:11
we're going to be able to teach students wonderful things.
07:14
Only one problem, it doesn't double-check students' math.
07:17
If there's some bad math in there,
07:19
it will happily pretend that one plus one equals three and run with it."
07:23
So we had to collect some feedback data.
07:25
Sal Khan himself was very kind
07:27
and offered 20 hours of his own time to provide feedback to the machine
07:30
alongside our team.
07:32
And over the course of a couple of months we were able to teach the AI that,
07:35
"Hey, you really should push back on humans
07:37
in this specific kind of scenario."
07:41
And we've actually made lots and lots of improvements to the models this way.
07:46
And when you push that thumbs down in ChatGPT,
07:48
that actually is kind of like sending up a bat signal to our team to say,
07:52
“Here’s an area of weakness where you should gather feedback.”
07:55
And so when you do that,
07:56
that's one way that we really listen to our users
07:58
and make sure we're building something that's more useful for everyone.
08:02
Now, providing high-quality feedback is a hard thing.
08:07
If you think about asking a kid to clean their room,
08:09
if all you're doing is inspecting the floor,
08:12
you don't know if you're just teaching them to stuff all the toys in the closet.
08:15
This is a nice DALL-E-generated image, by the way.
08:19
And the same sort of reasoning applies to AI.
08:24
As we move to harder tasks,
08:26
we will have to scale our ability to provide high-quality feedback.
08:30
But for this, the AI itself is happy to help.
08:34
It's happy to help us provide even better feedback
08:37
and to scale our ability to supervise the machine as time goes on.
08:40
And let me show you what I mean.
08:42
For example, you can ask GPT-4 a question like this,
08:47
of how much time passed between these two foundational blogs
08:50
on unsupervised learning
08:52
and learning from human feedback.
08:54
And the model says two months passed.
08:57
But is it true?
08:58
Like, these models are not 100-percent reliable,
09:00
although they’re getting better every time we provide some feedback.
09:04
But we can actually use the AI to fact-check.
09:07
And it can actually check its own work.
09:09
You can say, fact-check this for me.
09:12
Now, in this case, I've actually given the AI a new tool.
09:16
This one is a browsing tool
09:18
where the model can issue search queries and click into web pages.
09:22
And it actually writes out its whole chain of thought as it does it.
09:25
It says, I’m just going to search for this and it actually does the search.
09:28
It then finds the publication date and the search results.
09:32
It then is issuing another search query.
09:33
It's going to click into the blog post.
09:35
And all of this you could do, but it’s a very tedious task.
09:38
It's not a thing that humans really want to do.
09:40
It's much more fun to be in the driver's seat,
09:43
to be in this manager's position where you can, if you want,
09:45
triple-check the work.
09:47
And out come citations
09:48
so you can actually go
09:49
and very easily verify any piece of this whole chain of reasoning.
09:53
And it actually turns out two months was wrong.
09:55
Two months and one week,
09:58
that was correct.
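
You can redo the final arithmetic yourself once the browsing tool has surfaced the two dates. A minimal sketch, with the publication dates filled in as assumptions for illustration (the talk never shows them on screen):

```python
# Checking the "two months and one week" answer by hand. The two dates
# below are assumptions for illustration, not taken from the talk.
from datetime import date

unsupervised_post = date(2017, 4, 6)     # assumed publication date
human_feedback_post = date(2017, 6, 13)  # assumed publication date

gap = human_feedback_post - unsupervised_post
print(gap.days, "days")  # 68 days with these assumed dates

# Two months (April 6 -> June 6) plus one week (June 6 -> June 13):
two_months_later = date(2017, 6, 6)
print((two_months_later - unsupervised_post).days, "+",
      (human_feedback_post - two_months_later).days)  # 61 + 7
```
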
10:00
(Applause)
10:07
And we'll cut back to the slide.
10:09
And so the thing that's so interesting to me about this whole process
10:13
is that it’s this many-step collaboration between a human and an AI.
10:17
Because a human, using this fact-checking tool
10:19
is doing it in order to produce data
10:21
for another AI to become more useful to a human.
10:25
And I think this really shows the shape of something
10:28
that we should expect to be much more common in the future,
10:31
where we have humans and machines kind of very carefully
10:33
and delicately designed in how they fit into a problem
10:37
and how we want to solve that problem.
10:39
We make sure that the humans are providing the management, the oversight,
10:42
the feedback,
10:44
and the machines are operating in a way that's inspectable
10:46
and trustworthy.
10:47
And together we're able to actually create even more trustworthy machines.
10:51
And I think that over time, if we get this process right,
10:54
we will be able to solve impossible problems.
10:56
And to give you a sense of just how impossible I'm talking,
11:00
I think we're going to be able to rethink almost every aspect
11:03
of how we interact with computers.
11:05
For example, think about spreadsheets.
11:08
They've been around in some form since, we'll say, 40 years ago with VisiCalc.
11:12
I don't think they've really changed that much in that time.
11:16
And here is a specific spreadsheet of all the AI papers on the arXiv
11:22
for the past 30 years.
11:23
There's about 167,000 of them.
11:25
And you can see there the data right here.
11:28
But let me show you the ChatGPT take on how to analyze a data set like this.
11:37
So we can give ChatGPT access to yet another tool,
11:41
this one a Python interpreter,
11:42
so it’s able to run code, just like a data scientist would.
11:46
And so you can just literally upload a file
11:48
and ask questions about it.
11:50
And very helpfully, you know, it knows the name of the file and it's like,
11:53
"Oh, this is CSV," comma-separated value file,
11:56
"I'll parse it for you."
11:57
The only information here is the name of the file,
12:00
the column names like you saw and then the actual data.
12:04
And from that it's able to infer what these columns actually mean.
12:08
Like, that semantic information wasn't in there.
12:11
It has to sort of, put together its world knowledge of knowing that,
12:14
“Oh yeah, arXiv is a site that people submit papers
12:16
and therefore that's what these things are and that these are integer values
12:20
and so therefore it's a number of authors in the paper,"
12:23
like all of that, that’s work for a human to do,
12:25
and the AI is happy to help with it.
12:27
Now I don't even know what I want to ask.
12:29
So fortunately, you can ask the machine,
12:32
"Can you make some exploratory graphs?"
12:37
And once again, this is a super high-level instruction with lots of intent behind it.
12:41
But I don't even know what I want.
12:43
And the AI kind of has to infer what I might be interested in.
12:46
And so it comes up with some good ideas, I think.
12:48
So a histogram of the number of authors per paper,
12:50
time series of papers per year, word cloud of the paper titles.
12:53
All of that, I think, will be pretty interesting to see.
12:56
And the great thing is, it can actually do it.
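
The code ChatGPT writes at this point is ordinary data-science Python. A sketch of what it might look like, assuming a hypothetical file name and columns like "title", "year" and "num_authors" (the real column names are not shown in full in the talk):

```python
# A sketch of the exploratory analysis, under the column-name assumptions above.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("arxiv_ai_papers.csv")  # hypothetical file name

fig, axes = plt.subplots(1, 2, figsize=(10, 4))

# Histogram of authors per paper (the "nice bell curve").
df["num_authors"].hist(bins=range(1, 15), ax=axes[0])
axes[0].set_title("Authors per paper")

# Time series of papers per year.
df.groupby("year").size().plot(ax=axes[1])
axes[1].set_title("Papers per year")

plt.tight_layout()
plt.show()
# (A word cloud of titles would use a library such as wordcloud on df["title"].)
```
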
12:58
Here we go, a nice bell curve.
13:00
You see that three is kind of the most common.
13:02
It's going to then make this nice plot of the papers per year.
13:08
Something crazy is happening in 2023, though.
13:10
Looks like we were on an exponential and it dropped off the cliff.
13:13
What could be going on there?
13:14
By the way, all this is Python code, you can inspect.
13:17
And then we'll see word cloud.
13:19
So you can see all these wonderful things that appear in these titles.
13:23
But I'm pretty unhappy about this 2023 thing.
13:25
It makes this year look really bad.
13:27
Of course, the problem is that the year is not over.
13:30
So I'm going to push back on the machine.
13:33
[Waitttt that's not fair!!!
13:34
2023 isn't over.
13:38
What percentage of papers in 2022 were even posted by April 13?]
13:44
So April 13 was the cut-off date I believe.
13:47
Can you use that to make a fair projection?
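
The "fair projection" itself is one line of arithmetic: scale the partial-year count by the fraction of a full year's papers that had appeared by the same cut-off date. A sketch with placeholder counts (not the real arXiv numbers):

```python
# Project a full-year 2023 total from a partial count, using 2022 as the
# reference year. All counts below are made-up placeholders.
papers_2022_total = 10000        # placeholder
papers_2022_by_apr13 = 2800      # placeholder
papers_2023_by_apr13 = 3100      # placeholder

f = papers_2022_by_apr13 / papers_2022_total   # fraction of a year seen by Apr 13
projected_2023 = papers_2023_by_apr13 / f
print(f"projected 2023 total: {projected_2023:.0f}")  # ~11071 with these numbers
```
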
13:54
So we'll see, this is the kind of ambitious one.
13:57
(Laughter)
13:59
So you know,
14:01
again, I feel like there was more I wanted out of the machine here.
14:05
I really wanted it to notice this thing,
14:07
maybe it's a little bit of an overreach for it
14:10
to have sort of, inferred magically that this is what I wanted.
14:14
But I inject my intent,
14:15
I provide this additional piece of, you know, guidance.
14:20
And under the hood,
14:21
the AI is just writing code again, so if you want to inspect what it's doing,
14:25
it's very possible.
14:26
And now, it does the correct projection.
14:30
(Applause)
14:35
If you noticed, it even updates the title.
14:37
I didn't ask for that, but it knows what I want.
14:41
Now we'll cut back to the slide again.
14:45
This slide shows a parable of how I think we ...
14:51
A vision of how we may end up using this technology in the future.
14:54
A person brought his very sick dog to the vet,
14:58
and the veterinarian made a bad call to say, “Let’s just wait and see.”
15:01
And the dog would not be here today had he listened.
15:05
In the meanwhile, he provided the blood test,
15:07
like, the full medical records, to GPT-4,
15:10
which said, "I am not a vet, you need to talk to a professional,
15:13
here are some hypotheses."
15:15
He brought that information to a second vet
15:17
who used it to save the dog's life.
15:21
Now, these systems, they're not perfect.
15:23
You cannot overly rely on them.
15:25
But this story, I think, shows
15:29
that a human with a medical professional
15:32
and with ChatGPT as a brainstorming partner
15:35
was able to achieve an outcome that would not have happened otherwise.
15:38
I think this is something we should all reflect on,
15:40
think about as we consider how to integrate these systems
15:43
into our world.
15:44
And one thing I believe really deeply,
15:46
is that getting AI right is going to require participation from everyone.
15:50
And that's for deciding how we want it to slot in,
15:53
that's for setting the rules of the road,
15:55
for what an AI will and won't do.
15:57
And if there's one thing to take away from this talk,
15:59
it's that this technology just looks different.
16:02
Just different from anything people had anticipated.
16:04
And so we all have to become literate.
16:06
And that's, honestly, one of the reasons we released ChatGPT.
16:09
Together, I believe that we can achieve the OpenAI mission
16:12
of ensuring that artificial general intelligence
16:14
benefits all of humanity.
16:16
Thank you.
16:18
(Applause)
16:33
(Applause ends)
16:34
Chris Anderson: Greg.
16:36
Wow.
16:37
I mean ...
16:39
I suspect that within every mind out here
16:43
there's a feeling of reeling.
16:46
Like, I suspect that a very large number of people viewing this,
16:49
you look at that and you think, “Oh my goodness,
16:51
pretty much every single thing about the way I work, I need to rethink."
16:55
Like, there's just new possibilities there.
16:57
Am I right?
16:58
Who thinks that they're having to rethink the way that we do things?
17:01
Yeah, I mean, it's amazing,
17:03
but it's also really scary.
17:05
So let's talk, Greg, let's talk.
GB: Sure.
17:08
CA: I mean, I guess my first question actually is just
17:10
how the hell have you done this?
17:12
(Laughter)
17:13
OpenAI has a few hundred employees.
17:16
Google has thousands of employees working on artificial intelligence.
17:21
Why is it you who's come up with this technology
17:25
that shocked the world?
17:26
Greg Brockman: I mean, the truth is,
17:28
we're all building on shoulders of giants, right, there's no question.
17:31
If you look at the compute progress,
17:33
the algorithmic progress, the data progress,
17:35
all of those are really industry-wide.
17:37
But I think within OpenAI,
17:38
we made a lot of very deliberate choices from the early days.
17:41
And the first one was just to confront reality as it lays.
17:44
And that we just thought really hard about like:
17:46
What is it going to take to make progress here?
17:48
We tried a lot of things that didn't work, so you only see the things that did.
17:52
And I think that the most important thing has been to get teams of people
17:56
who are very different from each other to work together harmoniously.
17:59
CA: Can we have the water, by the way, just brought here?
18:02
I think we're going to need it, it's a dry-mouth topic.
18:06
But isn't there something also just about the fact
18:09
that you saw something in these language models
18:14
that meant that if you continue to invest in them and grow them,
18:18
that something at some point might emerge?
18:21
GB: Yes.
18:23
And I think that, I mean, honestly,
18:25
I think the story there is pretty illustrative, right?
18:28
I think that high level, deep learning,
18:30
like we always knew that was what we wanted to be,
18:32
was a deep learning lab, and exactly how to do it?
18:35
I think that in the early days, we didn't know.
18:37
We tried a lot of things,
18:38
and one person was working on training a model
18:41
to predict the next character in Amazon reviews,
18:43
and he got a result where -- this is a syntactic process,
18:48
you expect, you know, the model will predict where the commas go,
18:51
where the nouns and verbs are.
18:53
But he actually got a state-of-the-art sentiment analysis classifier out of it.
18:57
This model could tell you if a review was positive or negative.
19:00
I mean, today we are just like, come on, anyone can do that.
19:04
But this was the first time that you saw this emergence,
19:07
this sort of semantics that emerged from this underlying syntactic process.
19:12
And there we knew, you've got to scale this thing,
19:14
you've got to see where it goes.
19:16
CA: So I think this helps explain
19:18
the riddle that baffles everyone looking at this,
19:20
because these things are described as prediction machines.
19:23
And yet, what we're seeing out of them feels ...
19:26
it just feels impossible that that could come from a prediction machine.
19:29
Just the stuff you showed us just now.
19:31
And the key idea of emergence is that when you get more of a thing,
19:35
suddenly different things emerge.
19:37
It happens all the time, ant colonies, single ants run around,
19:40
when you bring enough of them together,
19:42
you get these ant colonies that show completely emergent, different behavior.
19:45
Or a city where a few houses together, it's just houses together.
19:49
But as you grow the number of houses,
19:50
things emerge, like suburbs and cultural centers and traffic jams.
19:57
Give me one moment for you when you saw just something pop
20:00
that just blew your mind
20:02
that you just did not see coming.
20:03
GB: Yeah, well,
20:05
so you can try this in ChatGPT, if you add 40-digit numbers --
20:08
CA: 40-digit?
20:09
GB: 40-digit numbers, the model will do it,
20:11
which means it's really learned an internal circuit for how to do it.
20:15
And the really interesting thing is actually,
20:17
if you have it add like a 40-digit number plus a 35-digit number,
20:20
it'll often get it wrong.
20:22
And so you can see that it's really learning the process,
20:25
but it hasn't fully generalized, right?
20:27
It's like you can't memorize the 40-digit addition table,
20:30
that's more atoms than there are in the universe.
20:32
So it had to have learned something general,
20:34
but that it hasn't really fully yet learned that,
20:36
Oh, I can sort of generalize this to adding arbitrary numbers
20:39
of arbitrary lengths.
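
If you want to reproduce this probe, the harness is tiny. A sketch: `ask_model` below is a hypothetical stand-in for whatever chat interface you use, not a real API call:

```python
# Generate addition problems of two digit-lengths and grade the model's reply.
import random

def ask_model(prompt: str) -> str:
    raise NotImplementedError("hypothetical: route this to a chat model")

def probe(digits_a: int, digits_b: int) -> bool:
    a = random.randint(10**(digits_a - 1), 10**digits_a - 1)
    b = random.randint(10**(digits_b - 1), 10**digits_b - 1)
    reply = ask_model(f"What is {a} + {b}? Answer with the number only.")
    return reply.strip() == str(a + b)

# Per the talk: probe(40, 40) tends to pass, probe(40, 35) fails more often.
```
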
20:41
CA: So what's happened here
20:42
is that you've allowed it to scale up
20:44
and look at an incredible number of pieces of text.
20:46
And it is learning things
20:47
that you didn't know that it was going to be capable of learning.
20:51
GB: Well, yeah, and it’s more nuanced, too.
20:53
So one science that we’re starting to really get good at
20:56
is predicting some of these emergent capabilities.
20:58
And to do that actually,
21:00
one of the things I think is very undersung in this field
21:03
is sort of engineering quality.
21:04
Like, we had to rebuild our entire stack.
21:06
When you think about building a rocket,
21:08
every tolerance has to be incredibly tiny.
21:10
Same is true in machine learning.
21:12
You have to get every single piece of the stack engineered properly,
21:15
and then you can start doing these predictions.
21:17
There are all these incredibly smooth scaling curves.
21:20
They tell you something deeply fundamental about intelligence.
21:23
If you look at our GPT-4 blog post,
21:25
you can see all of these curves in there.
21:26
And now we're starting to be able to predict.
21:29
So we were able to predict, for example, the performance on coding problems.
21:32
We basically look at some models
21:34
that are 10,000 times or 1,000 times smaller.
21:36
And so there's something about this that is actually smooth scaling,
21:40
even though it's still early days.
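
The prediction being described amounts to fitting a smooth curve to small training runs and extrapolating. A toy sketch with synthetic numbers, a power-law fit in log-log space rather than the actual GPT-4 methodology:

```python
# Fit loss = a * compute^(-b) on small runs, i.e. a line in log-log space,
# then extrapolate to a much larger run. All data points are synthetic.
import numpy as np

compute = np.array([1e15, 1e16, 1e17, 1e18])  # small training runs (FLOPs)
loss = np.array([4.0, 3.3, 2.7, 2.2])         # measured losses (synthetic)

slope, intercept = np.polyfit(np.log(compute), np.log(loss), 1)

def predict(c: float) -> float:
    return float(np.exp(intercept + slope * np.log(c)))

print(f"predicted loss at 1000x more compute: {predict(1e21):.2f}")
```
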
21:42
CA: So here is, one of the big fears then,
452
1302756
2544
KA: I tako dolazimo do jednog od najvećih strahova koji proizilazi iz ovoga.
21:45
that arises from this.
453
1305300
1252
21:46
If it’s fundamental to what’s happening here,
454
1306593
2127
Ako je u osnovi ovoga što se dešava, to da kako povećavamo razmeru,
21:48
that as you scale up,
455
1308720
1210
21:49
things emerge that
456
1309930
2419
pojavljuju se stvari koje se mogu predvideti sa određenom dozom sigurnosti,
21:52
you can maybe predict in some level of confidence,
457
1312349
4171
21:56
but it's capable of surprising you.
458
1316562
2544
ali i dalje mogu da iznenade.
22:00
Why isn't there just a huge risk of something truly terrible emerging?
459
1320816
4463
Zar ne postoji opasnost da se nešto stvarno strašno pojavi?
22:05
GB: Well, I think all of these are questions of degree
460
1325320
2545
GB: Mislim da je sve to pitanje razmere i vremena.
22:07
and scale and timing.
461
1327865
1209
I ono što ljudima takođe promiče je da je i integracija
22:09
And I think one thing people miss, too,
462
1329116
1877
22:10
is sort of the integration with the world is also this incredibly emergent,
463
1330993
3587
ove tehnologije takođe nova, a pri tome i izuzetno moćna pojava.
22:14
sort of, very powerful thing too.
464
1334621
1585
To je jedan od razloga što mislimo da se ova tehnologija treba uvoditi postepeno.
22:16
And so that's one of the reasons that we think it's so important
465
1336248
3045
22:19
to deploy incrementally.
466
1339293
1167
Ono što mislim da sada vidimo, gledajući ovaj govor,
22:20
And so I think that what we kind of see right now, if you look at this talk,
467
1340502
3629
je da se ja većinom bavim davanjem kvalitetne povratne informacije.
22:24
a lot of what I focus on is providing really high-quality feedback.
468
1344131
3170
Danas se u to može ostvariti uvid, zar ne?
22:27
Today, the tasks that we do, you can inspect them, right?
469
1347301
2711
Lako se može videti matematički problem
22:30
It's very easy to look at that math problem and be like, no, no, no,
470
1350012
3211
i reći: “Ne, mašino, sedam je ispravan odgovor.”
22:33
machine, seven was the correct answer.
471
1353265
1835
Ali čak i za rezimiranje knjige, to se teško nadgleda.
22:35
But even summarizing a book, like, that's a hard thing to supervise.
472
1355100
3212
22:38
Like, how do you know if this book summary is any good?
473
1358312
2586
Kako da znaš da ovaj siže ičemu valja? Moraš pročitati knjigu, a to niko ne želi.
22:40
You have to read the whole book.
474
1360939
1543
22:42
No one wants to do that.
475
1362482
1168
(Smeh)
22:43
(Laughter)
476
1363692
1293
22:44
And so I think that the important thing will be that we take this step by step.
477
1364985
4296
I zato mislim da će biti bitno da se stvari rade korak po korak.
22:49
And that we say, OK, as we move on to book summaries,
478
1369323
2544
I da kažemo, u redu, kad pređemo na sažimanje knjiga,
22:51
we have to supervise this task properly.
479
1371867
1960
moraćemo to propisno nadgledati.
22:53
We have to build up a track record with these machines
480
1373827
2586
Moramo prvo izgraditi istorijat uspeha sa ovim mašinama
22:56
that they're able to actually carry out our intent.
481
1376413
2586
da bi one uopšte bile u stanju da iznesu našu nameru.
22:59
And I think we're going to have to produce even better, more efficient,
482
1379041
3336
Moraćmo pronaći još bolje, efikasnije i pouzdanije načine skaliranja
23:02
more reliable ways of scaling this,
483
1382419
1710
23:04
sort of like making the machine be aligned with you.
484
1384129
2878
toga da mašina bude usklađena sa vama.
23:07
CA: So we're going to hear later in this session,
485
1387049
2294
KA: Čućemo kasnije govore gde kritičari kažu
23:09
there are critics who say that,
486
1389343
1543
23:10
you know, there's no real understanding inside,
487
1390928
4587
da nema istinskog razumevanja unutrašnjosti sistema, da će on uvek --
23:15
the system is going to always --
488
1395557
1627
23:17
we're never going to know that it's not generating errors,
489
1397225
3212
da nikada nećemo znati da ne pravi greške, da neće imati zdrav razum i tome slično.
23:20
that it doesn't have common sense and so forth.
490
1400479
2210
23:22
Is it your belief, Greg, that it is true at any one moment,
491
1402689
4088
Da li ti, Greg, veruješ da je to sada tačno,
23:26
but that the expansion of the scale and the human feedback
492
1406818
3629
ali da će je povećanje razmere i povratne informacije od ljudi,
23:30
that you talked about is basically going to take it on that journey
493
1410489
4963
o čemu si govorio, povesti u tom smeru
23:35
of actually getting to things like truth and wisdom and so forth,
494
1415494
3837
i da će vremenom doći do istine, mudrosti,
i tome slično, sa većom sigurnošću? Kako možeš biti siguran u to?
23:39
with a high degree of confidence.
495
1419331
1627
23:40
Can you be sure of that?
496
1420999
1335
23:42
GB: Yeah, well, I think that the OpenAI, I mean, the short answer is yes,
497
1422334
3462
GB: Pa, kratak odgovor je da. Mislim da se OpenAI kreće u tom smeru.
23:45
I believe that is where we're headed.
498
1425796
1793
23:47
And I think that the OpenAI approach here has always been just like,
499
1427631
3211
A naš pristup je uvek bio da pustimo realnost da nas ošine po faci, zar ne?
23:50
let reality hit you in the face, right?
500
1430842
1877
23:52
It's like this field is the field of broken promises,
501
1432719
2503
Ovo je oblast prekršenih obećanja,
gde stručnjaci govore da će se X desiti, a na način Y.
23:55
of all these experts saying X is going to happen, Y is how it works.
502
1435263
3212
Ljudi su govorili da neuronske mreže neće raditi još 70 godina.
23:58
People have been saying neural nets aren't going to work for 70 years.
503
1438475
3337
Još uvek nisu u pravu.
24:01
They haven't been right yet.
504
1441812
1376
Možda će trebati vremenski period od 70 plus jednu godinu, ili nešto slično.
24:03
They might be right maybe 70 years plus one
505
1443188
2044
24:05
or something like that is what you need.
506
1445232
1918
Mislm da je naš pristup uvek bio da se granice ove tehnologije moraju gurati
24:07
But I think that our approach has always been,
507
1447192
2169
24:09
you've got to push to the limits of this technology
508
1449361
2419
da se vidi za šta je ona sposobna.
24:11
to really see it in action,
509
1451822
1293
jer to nam govori kako i kada ćemo moći preći na novu paradigmu.
24:13
because that tells you then, oh, here's how we can move on to a new paradigm.
510
1453115
3670
I mislim da ovde nismo još iscrpli sve.
24:16
And we just haven't exhausted the fruit here.
511
1456785
2127
KA: Mislim da si zauzeo prilično kontroverzno stanovište,
24:18
CA: I mean, it's quite a controversial stance you've taken,
512
1458954
2794
a to je da je pravi način da sve ovo staviš u javnost
24:21
that the right way to do this is to put it out there in public
513
1461748
2920
i onda to sve iskoristiš,
24:24
and then harness all this, you know,
514
1464710
1751
umesto da samo tvoj tim daje povratne informacije,
24:26
instead of just your team giving feedback,
515
1466461
2002
24:28
the world is now giving feedback.
516
1468463
2461
čitav svet ti daje povratne informacije.
24:30
But ...
517
1470924
1168
Ali...
24:33
If, you know, bad things are going to emerge,
518
1473135
3753
Ako se recimo pojave loše stvari, prosto će biti puštene u etar.
24:36
it is out there.
519
1476930
1168
24:38
So, you know, the original story that I heard on OpenAI
520
1478140
2919
Izvorna priča o OpenAI, kada ste osnovani kao neprofitna organizacija, je bila
24:41
when you were founded as a nonprofit,
521
1481101
1793
24:42
well you were there as the great sort of check on the big companies
522
1482894
4463
da ćete biti brana od velikih korporacija
24:47
doing their unknown, possibly evil thing with AI.
523
1487399
3837
koje rade nepoznate, možda i zle stvari sa VI.
24:51
And you were going to build models that sort of, you know,
524
1491278
4755
I da ćete graditi modele koji će ih nekako držati odgovornim
24:56
somehow held them accountable
525
1496033
1418
24:57
and was capable of slowing the field down, if need be.
526
1497492
4380
i biti u stanju da uspore oblast, ukoliko bude potrebno.
25:01
Or at least that's kind of what I heard.
527
1501872
1960
Ili sam nešto slično tome čuo.
25:03
And yet, what's happened, arguably, is the opposite.
528
1503832
2461
A zapravo se desilo upravo suprotno.
25:06
That your release of GPT, especially ChatGPT,
529
1506334
5673
Puštanje GPT-a, a pogotovo ChatGPT-a
25:12
sent such shockwaves through the tech world
530
1512049
2002
je toliko odjeknulo u svetu tehnologije
25:14
that now Google and Meta and so forth are all scrambling to catch up.
531
1514051
3795
da se sada Gugl, Meta i drugi jagme da vas sustignu.
25:17
And some of their criticisms have been,
532
1517888
2085
Neke od njihovih kritika su
25:20
you are forcing us to put this out here without proper guardrails or we die.
533
1520015
4963
da ih silite da puste stvari bez propisnih mehanizama zaštite ili će propasti.
25:25
You know, how do you, like,
534
1525020
2794
Kako opravdavate to da je ovo urađeno na odgovoran, a ne nesmotren način?
25:27
make the case that what you have done is responsible here and not reckless.
535
1527814
3754
25:31
GB: Yeah, we think about these questions all the time.
536
1531568
3128
GB: Da, ova pitanja su nam stalno na pameti.
25:34
Like, seriously all the time.
537
1534738
1418
Ne, ozbiljno. Stalno.
25:36
And I don't think we're always going to get it right.
538
1536198
2711
I ne mislim da ćemo uvek sve uraditi kako treba.
25:38
But one thing I think has been incredibly important,
539
1538909
2460
Jedna stvar nam je uvek bila važna, i od samog početka smo razmišljali
25:41
from the very beginning, when we were thinking
540
1541411
2169
kako da izgradimo veštačku opštu inteligenciju za dobrobit čovečanstva.
25:43
about how to build artificial general intelligence,
541
1543580
2419
25:45
actually have it benefit all of humanity,
542
1545999
2002
Mislim, kako da to uradimo?
25:48
like, how are you supposed to do that, right?
543
1548001
2127
Podrazumevani plan je da je izgradite u tajnosti, tu jako moćnu stvar,
25:50
And that default plan of being, well, you build in secret,
544
1550170
2711
25:52
you get this super powerful thing,
545
1552923
1626
i onda osmislite način sigurne upotrebe, pokrenete je i nadate se najboljem.
25:54
and then you figure out the safety of it and then you push “go,”
546
1554549
3003
25:57
and you hope you got it right.
547
1557552
1460
Ali ja ne znam kako da takav plan sprovedem. Neko drugi možda zna.
25:59
I don't know how to execute that plan.
548
1559012
1835
26:00
Maybe someone else does.
549
1560889
1168
Ali meni je to zastrašujuće, ne čini mi se ispravnim.
26:02
But for me, that was always terrifying, it didn't feel right.
550
1562099
2877
Mislim da je ovaj alternativni pristup jedini drugi put koji ja vidim,
26:04
And so I think that this alternative approach
551
1564976
2128
26:07
is the only other path that I see,
552
1567104
2043
a to je da pustite da vas realnost ošine po licu.
26:09
which is that you do let reality hit you in the face.
553
1569147
2503
I mislim da ljudima treba dati vremena da daju svoje mišljenje,
26:11
And I think you do give people time to give input.
554
1571691
2336
da pre nego što ove mašine postanu savršene,
26:14
You do have, before these machines are perfect,
555
1574027
2211
26:16
before they are super powerful, that you actually have the ability
556
1576279
3128
postanu premoćne, da zapravo imamo uvid u to šta u praksi mogu da urade.
26:19
to see them in action.
557
1579407
1168
26:20
And we've seen it from GPT-3, right?
558
1580617
1752
A to smo videli kroz GPT-3, zar ne?
Kod GPT-3, plašili smo se
26:22
GPT-3, we really were afraid
559
1582369
1376
26:23
that the number one thing people were going to do with it
560
1583745
2711
da će glavna stvar za koju će se model koristiti biti
stvaranje dezinformacija i pokušaj uticanja na izbore.
26:26
was generate misinformation, try to tip elections.
561
1586456
2336
26:28
Instead, the number one thing was generating Viagra spam.
562
1588834
2711
Umesto toga, korišćen je za stvaranje neželjene pošte za vijagru.
26:31
(Laughter)
563
1591545
3169
(Smeh)
KA: Da, neželjena pošta za vijagru je loša, ali ima mnogo gorih stvari.
26:36
CA: So Viagra spam is bad, but there are things that are much worse.
564
1596007
3212
Evo misaonog eksperimenta za tebe.
26:39
Here's a thought experiment for you.
565
1599219
1752
26:40
Suppose you're sitting in a room,
566
1600971
1710
Recimo da sediš u sobi. U sobi je sto, a na njemu kutija.
26:42
there's a box on the table.
567
1602681
1668
26:44
You believe that in that box is something that,
568
1604349
3003
Veruješ da je u toj kutiji nešto za šta
26:47
there's a very strong chance it's something absolutely glorious
569
1607394
2961
postoji velika šansa da je predivno
26:50
that's going to give beautiful gifts to your family and to everyone.
570
1610397
3920
što će tebi, tvojoj porodici i svima dati divne darove.
26:54
But there's actually also a one percent thing in the small print there
571
1614359
3629
Ali postoji i jedan posto šanse da negde sitnim slovima piše: “Pandora.”
26:58
that says: “Pandora.”
572
1618029
1877
26:59
And there's a chance
573
1619906
1669
I postoji šansa da se time na svet oslobode neslućena zla.
27:01
that this actually could unleash unimaginable evils on the world.
574
1621616
4088
27:06
Do you open that box?
575
1626538
1543
Hoćeš li otvoriti tu kutiju?
GB: Pa, naravno da ne.
27:08
GB: Well, so, absolutely not.
576
1628123
1460
27:09
I think you don't do it that way.
577
1629624
1919
Ne mislim da tako treba uraditi.
27:12
And honestly, like, I'll tell you a story that I haven't actually told before,
578
1632210
3796
Iskreno, ispričaću ti priču koju još nikom nisam ispričao,
27:16
which is that shortly after we started OpenAI,
579
1636006
2586
a to je da kad smo pokrenuli OpenAI
27:18
I remember I was in Puerto Rico for an AI conference.
580
1638592
2711
sećam se da sam bio u Portoriku na konferenciji o VI.
27:21
I'm sitting in the hotel room just looking out over this wonderful water,
581
1641344
3462
Sedeo sam u hotelskoj sobi i gledao divnu vodu i ljude koji se zabavljaju.
27:24
all these people having a good time.
582
1644806
1752
I čovek se zapita: ako biste mogli da birate da ta Pandorina kutija bude
27:26
And you think about it for a moment,
583
1646558
1752
27:28
if you could choose for basically that Pandora’s box
584
1648310
4504
27:32
to be five years away
585
1652814
2711
za pet ili 500 godina, šta biste izabrali?
27:35
or 500 years away,
586
1655567
1585
27:37
which would you pick, right?
587
1657194
1501
27:38
On the one hand you're like, well, maybe for you personally,
588
1658737
2836
S jedne strane, vama lično, možda je bolje za pet godina.
27:41
it's better to have it be five years away.
589
1661573
2002
27:43
But if it gets to be 500 years away and people get more time to get it right,
590
1663617
3628
Ali ako bi to bilo za 500 godina i ljudi bi imali vremena da to urade kako treba,
27:47
which do you pick?
591
1667287
1168
šta onda izabrati?
27:48
And you know, I just really felt it in the moment.
592
1668496
2336
I u tom trenutku sam to zaista osetio: naravno da biste izabrali 500 godina.
27:50
I was like, of course you do the 500 years.
593
1670874
2002
Brat mi je u to vreme bio u vojsci i rizikovao je život na način
27:53
My brother was in the military at the time
594
1673293
2002
27:55
and like, he puts his life on the line in a much more real way
595
1675295
2961
mnogo stvarniji nego bilo ko od nas koji smo tada kucali po računarima
27:58
than any of us typing things in computers
596
1678256
2628
28:00
and developing this technology at the time.
597
1680926
2585
i razvijali ovu tehnologiju.
28:03
And so, yeah, I'm really sold on the you've got to approach this right.
598
1683511
4547
Tako da sam pristalica ideje da se ovome mora pristupiti kako treba.
28:08
But I don't think that's quite playing the field as it truly lies.
599
1688058
3628
Ali ne mislim da je tu sve onako kako se na prvi pogled čini.
28:11
Like, if you look at the whole history of computing,
600
1691686
2670
Ako se pogleda istorijat računarstva kao celine,
28:14
I really mean it when I say that this is an industry-wide
601
1694397
4463
stvarno mislim kada kažem da je ovo što se sada dešava pomak na nivou industrije,
28:18
or even just almost like
602
1698902
1543
ako ne čak i na nivou razvoja celokupne ljudske tehnologije.
28:20
a human-development- of-technology-wide shift.
603
1700487
3336
28:23
And the more that you sort of, don't put together the pieces
604
1703865
4088
Što duže ne sastavljamo delove koji su već tu,
28:27
that are there, right,
605
1707994
1293
28:29
we're still making faster computers,
606
1709329
1752
jer i dalje pravimo brže računare,
28:31
we're still improving the algorithms, all of these things, they are happening.
607
1711081
3670
i dalje poboljšavamo algoritme, sve se to i dalje dešava,
28:34
And if you don't put them together, you get an overhang,
608
1714793
2627
što duže to ne uvezujete, nagomilava se neiskorišćeni potencijal,
što znači da ako neko, ili onog momenta kada neko uspe da to sve poveže,
28:37
which means that if someone does,
609
1717420
1627
28:39
or the moment that someone does manage to connect to the circuit,
610
1719089
3086
odjednom ćete imati jako moćnu stvar, a da niko nije imao vremena da se prilagodi.
28:42
then you suddenly have this very powerful thing,
611
1722175
2252
28:44
no one's had any time to adjust,
612
1724427
1544
Ko zna kakve ćete imati mere bezbednosti?
28:46
who knows what kind of safety precautions you get.
613
1726012
2336
Moj zaključak je da, kada razmišljate o razvoju drugih tehnologija,
28:48
And so I think that one thing I take away
614
1728390
1918
28:50
is like, even you think about development of other sort of technologies,
615
1730308
3837
28:54
think about nuclear weapons,
616
1734187
1376
npr., nuklearnog oružja,
28:55
people talk about being like a zero to one,
617
1735563
2002
ljudi govore da je to bila promena iz nula u jedan
28:57
sort of, change in what humans could do.
618
1737565
2628
u ljudskim mogućnostima.
29:00
But I actually think that if you look at capability,
619
1740235
2461
Ali ako pogledate sposobnosti, one su zapravo rasle prilično glatko tokom vremena.
29:02
it's been quite smooth over time.
620
1742696
1585
29:04
And so the history, I think, of every technology we've developed
621
1744281
3670
I istorijski gledano, svaka tehnologija koju smo razvili, razvijala se postepeno
29:07
has been, you've got to do it incrementally
622
1747993
2002
29:10
and you've got to figure out how to manage it
623
1750036
2127
i upravljali smo njome postepeno kako se povećavala njena upotreba.
29:12
for each moment that you're increasing it.
624
1752163
2461
29:14
CA: So what I'm hearing is that you ...
625
1754666
2252
KA: Dakle, ako sam dobro razumeo, model koji želiš da imamo
29:16
the model you want us to have
626
1756918
1668
29:18
is that we have birthed this extraordinary child
627
1758628
2795
jeste da smo rodili jedno izuzetno dete, koje možda ima supermoći
29:21
that may have superpowers
628
1761423
2544
koje mogu da povedu čovečanstvo do potpuno novog mesta.
29:24
that take humanity to a whole new place.
629
1764009
2544
29:26
It is our collective responsibility to provide the guardrails
630
1766594
5005
Da je naša kolektivna odgovornost da ovom detetu pružimo smernice, zaštitnu ogradu,
29:31
for this child
631
1771641
1210
29:32
to collectively teach it to be wise and not to tear us all down.
632
1772892
5047
da ga naučimo mudrosti i da nas ne uništi. Jesam li dobro shvatio tvoj model?
29:37
Is that basically the model?
633
1777939
1377
29:39
GB: I think it's true.
634
1779357
1168
GB: Mislim da si u pravu.
29:40
And I think it's also important to say this may shift, right?
635
1780567
2878
Važno je napomenuti da se i to može promeniti.
29:43
We've got to take each step as we encounter it.
636
1783445
3253
Svakom koraku moramo pristupiti onda kada se s njim susretnemo.
29:46
And I think it's incredibly important today
637
1786740
2002
Izuzetno je važno da danas svi postanemo pismeni u pogledu ove tehnologije,
29:48
that we all do get literate in this technology,
638
1788783
2878
29:51
figure out how to provide the feedback,
639
1791661
1919
shvatimo kako da pružimo povratne informacije i odlučimo šta želimo od nje.
29:53
decide what we want from it.
640
1793621
1377
29:54
And my hope is that that will continue to be the best path,
641
1794998
3128
Nadam se da će ovo ostati najbolji put,
29:58
but it's so good we're honestly having this debate
642
1798168
2377
i drago mi je da iskreno vodimo ovu raspravu
30:00
because we wouldn't otherwise if it weren't out there.
643
1800545
2628
jer da model nije pušten, ne bismo je ni vodili.
30:03
CA: Greg Brockman, thank you so much for coming to TED and blowing our minds.
644
1803631
3629
KA: Greg Brokman, hvala ti što si došao na TED i što si nas raspametio.
30:07
(Applause)
645
1807302
1626
(Aplauz)