Can we build AI without losing control over it? | Sam Harris

3,798,930 views ・ 2016-10-19

TED



Translator: Vesna Radovic   Reviewer: Mile Živković

00:13
I'm going to talk about a failure of intuition that many of us suffer from. It's really a failure to detect a certain kind of danger. I'm going to describe a scenario that I think is both terrifying and likely to occur, and that's not a good combination, as it turns out. And yet rather than be scared, most of you will feel that what I'm talking about is kind of cool. I'm going to describe how the gains we make in artificial intelligence could ultimately destroy us. And in fact, I think it's very difficult to see how they won't destroy us or inspire us to destroy ourselves. And yet if you're anything like me, you'll find that it's fun to think about these things. And that response is part of the problem. OK? That response should worry you.

00:59
And if I were to convince you in this talk that we were likely to suffer a global famine, either because of climate change or some other catastrophe, and that your grandchildren, or their grandchildren, are very likely to live like this, you wouldn't think, "Interesting. I like this TED Talk."
01:21
Famine isn't fun. Death by science fiction, on the other hand, is fun, and one of the things that worries me most about the development of AI at this point is that we seem unable to marshal an appropriate emotional response to the dangers that lie ahead. I am unable to marshal this response, and I'm giving this talk.
01:42
It's as though we stand before two doors. Behind door number one, we stop making progress in building intelligent machines. Our computer hardware and software just stops getting better for some reason. Now take a moment to consider why this might happen. I mean, given how valuable intelligence and automation are, we will continue to improve our technology if we are at all able to. What could stop us from doing this? A full-scale nuclear war? A global pandemic? An asteroid impact? Justin Bieber becoming president of the United States? (Laughter)
02:24
The point is, something would have to destroy civilization as we know it. You have to imagine how bad it would have to be to prevent us from making improvements in our technology permanently, generation after generation. Almost by definition, this is the worst thing that's ever happened in human history. So the only alternative, and this is what lies behind door number two, is that we continue to improve our intelligent machines year after year after year. At a certain point, we will build machines that are smarter than we are, and once we have machines that are smarter than we are, they will begin to improve themselves. And then we risk what the mathematician IJ Good called an "intelligence explosion," that the process could get away from us.
03:10
Now, this is often caricatured, as I have here, as a fear that armies of malicious robots will attack us. But that isn't the most likely scenario. It's not that our machines will become spontaneously malevolent. The concern is really that we will build machines that are so much more competent than we are that the slightest divergence between their goals and our own could destroy us.
03:35
Just think about how we relate to ants. We don't hate them. We don't go out of our way to harm them. In fact, sometimes we take pains not to harm them. We step over them on the sidewalk. But whenever their presence seriously conflicts with one of our goals, let's say when constructing a building like this one, we annihilate them without a qualm. The concern is that we will one day build machines that, whether they're conscious or not, could treat us with similar disregard.
04:05
Now, I suspect this seems far-fetched to many of you. I bet there are those of you who doubt that superintelligent AI is possible, much less inevitable. But then you must find something wrong with one of the following assumptions. And there are only three of them.
04:23
Intelligence is a matter of information processing in physical systems. Actually, this is a little bit more than an assumption. We have already built narrow intelligence into our machines, and many of these machines perform at a level of superhuman intelligence already. And we know that mere matter can give rise to what is called "general intelligence," an ability to think flexibly across multiple domains, because our brains have managed it. Right? I mean, there's just atoms in here, and as long as we continue to build systems of atoms that display more and more intelligent behavior, we will eventually, unless we are interrupted, build general intelligence into our machines.
05:11
It's crucial to realize that the rate of progress doesn't matter, because any progress is enough to get us into the end zone. We don't need Moore's law to continue. We don't need exponential progress. We just need to keep going.
05:25
The second assumption is that we will keep going. We will continue to improve our intelligent machines.
05:33
And given the value of intelligence -- I mean, intelligence is either the source of everything we value or we need it to safeguard everything we value. It is our most valuable resource. So we want to do this. We have problems that we desperately need to solve. We want to cure diseases like Alzheimer's and cancer. We want to understand economic systems. We want to improve our climate science. So we will do this, if we can. The train is already out of the station, and there's no brake to pull.
06:05
Finally, we don't stand on a peak of intelligence, or anywhere near it, likely. And this really is the crucial insight. This is what makes our situation so precarious, and this is what makes our intuitions about risk so unreliable.
06:23
Now, just consider the smartest person who has ever lived. On almost everyone's shortlist here is John von Neumann. I mean, the impression that von Neumann made on the people around him, and this included the greatest mathematicians and physicists of his time, is fairly well-documented. If only half the stories about him are half true, there's no question he's one of the smartest people who has ever lived. So consider the spectrum of intelligence. Here we have John von Neumann. And then we have you and me. And then we have a chicken. (Laughter) Sorry, a chicken. (Laughter) There's no reason for me to make this talk more depressing than it needs to be. (Laughter)
07:08
It seems overwhelmingly likely, however, that the spectrum of intelligence extends much further than we currently conceive, and if we build machines that are more intelligent than we are, they will very likely explore this spectrum in ways that we can't imagine, and exceed us in ways that we can't imagine.
07:27
And it's important to recognize that this is true by virtue of speed alone. Right? So imagine if we just built a superintelligent AI that was no smarter than your average team of researchers at Stanford or MIT. Well, electronic circuits function about a million times faster than biochemical ones, so this machine should think about a million times faster than the minds that built it. So you set it running for a week, and it will perform 20,000 years of human-level intellectual work, week after week after week. How could we even understand, much less constrain, a mind making this sort of progress?
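That "20,000 years" figure follows from nothing more than the stated million-fold speed advantage; a minimal back-of-the-envelope sketch, where the million-fold speedup is the only assumption:

```python
# Back-of-the-envelope check of the "20,000 years per week" figure,
# assuming only a flat million-fold speed advantage of electronic
# over biochemical signaling (the talk's stated order of magnitude).
SPEEDUP = 1_000_000
WEEKS_PER_YEAR = 52

wall_clock_weeks = 1
human_equivalent_years = wall_clock_weeks * SPEEDUP / WEEKS_PER_YEAR
print(f"{human_equivalent_years:,.0f} human-equivalent years")  # ~19,231, i.e. roughly 20,000
```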
08:08
The other thing that's worrying, frankly, is this: imagine the best case scenario. So imagine we hit upon a design of superintelligent AI that has no safety concerns. We have the perfect design the first time around. It's as though we've been handed an oracle that behaves exactly as intended. Well, this machine would be the perfect labor-saving device. It can design the machine that can build the machine that can do any physical work, powered by sunlight, more or less for the cost of raw materials. So we're talking about the end of human drudgery. We're also talking about the end of most intellectual work.
08:49
So what would apes like ourselves do in this circumstance? Well, we'd be free to play Frisbee and give each other massages. Add some LSD and some questionable wardrobe choices, and the whole world could be like Burning Man. (Laughter)
09:06
Now, that might sound pretty good, but ask yourself what would happen under our current economic and political order? It seems likely that we would witness a level of wealth inequality and unemployment that we have never seen before. Absent a willingness to immediately put this new wealth to the service of all humanity, a few trillionaires could grace the covers of our business magazines while the rest of the world would be free to starve.
09:34
And what would the Russians or the Chinese do if they heard that some company in Silicon Valley was about to deploy a superintelligent AI? This machine would be capable of waging war, whether terrestrial or cyber, with unprecedented power. This is a winner-take-all scenario. To be six months ahead of the competition here is to be 500,000 years ahead, at a minimum. So it seems that even mere rumors of this kind of breakthrough could cause our species to go berserk.
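The 500,000-year figure appears to rest on the same assumption; a minimal sketch, again taking only the million-fold speedup as given:

```python
# Six months of wall-clock lead, translated through the same assumed
# million-fold speedup: half a year of machine time corresponds to
# roughly half a million years of human-equivalent work.
SPEEDUP = 1_000_000
lead_in_years = 0.5 * SPEEDUP
print(f"{lead_in_years:,.0f} years ahead")  # 500,000
```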
10:06
Now, one of the most frightening things, in my view, at this moment, are the kinds of things that AI researchers say when they want to be reassuring. And the most common reason we're told not to worry is time. This is all a long way off, don't you know. This is probably 50 or 100 years away. One researcher has said, "Worrying about AI safety is like worrying about overpopulation on Mars." This is the Silicon Valley version of "don't worry your pretty little head about it." (Laughter)
10:39
No one seems to notice that referencing the time horizon is a total non sequitur. If intelligence is just a matter of information processing, and we continue to improve our machines, we will produce some form of superintelligence. And we have no idea how long it will take us to create the conditions to do that safely.
11:04
Let me say that again. We have no idea how long it will take us to create the conditions to do that safely.
11:12
And if you haven't noticed, 50 years is not what it used to be. This is 50 years in months. This is how long we've had the iPhone. This is how long "The Simpsons" has been on television. Fifty years is not that much time to meet one of the greatest challenges our species will ever face.
11:31
Once again, we seem to be failing to have an appropriate emotional response to what we have every reason to believe is coming. The computer scientist Stuart Russell has a nice analogy here. He said, imagine that we received a message from an alien civilization, which read: "People of Earth, we will arrive on your planet in 50 years. Get ready." And now we're just counting down the months until the mothership lands? We would feel a little more urgency than we do.
12:04
Another reason we're told not to worry is that these machines can't help but share our values because they will be literally extensions of ourselves. They'll be grafted onto our brains, and we'll essentially become their limbic systems. Now take a moment to consider that the safest and only prudent path forward, recommended, is to implant this technology directly into our brains. Now, this may in fact be the safest and only prudent path forward, but usually one's safety concerns about a technology have to be pretty much worked out before you stick it inside your head. (Laughter)
12:38
The deeper problem is that building superintelligent AI on its own seems likely to be easier than building superintelligent AI and having the completed neuroscience that allows us to seamlessly integrate our minds with it. And given that the companies and governments doing this work are likely to perceive themselves as being in a race against all others, given that to win this race is to win the world, provided you don't destroy it in the next moment, then it seems likely that whatever is easier to do will get done first.
13:10
Now, unfortunately, I don't have a solution to this problem, apart from recommending that more of us think about it. I think we need something like a Manhattan Project on the topic of artificial intelligence. Not to build it, because I think we'll inevitably do that, but to understand how to avoid an arms race and to build it in a way that is aligned with our interests. When you're talking about superintelligent AI that can make changes to itself, it seems that we only have one chance to get the initial conditions right, and even then we will need to absorb the economic and political consequences of getting them right.
13:45
But the moment we admit that information processing is the source of intelligence, that some appropriate computational system is what the basis of intelligence is, and we admit that we will improve these systems continuously, and we admit that the horizon of cognition very likely far exceeds what we currently know, then we have to admit that we are in the process of building some sort of god. Now would be a good time to make sure it's a god we can live with.
14:20
Thank you very much.

(Applause)