What happens when our computers get smarter than we are? | Nick Bostrom

2,703,006 views ・ 2015-04-27

TED


00:12
I work with a bunch of mathematicians, philosophers and computer scientists, and we sit around and think about the future of machine intelligence, among other things. Some people think that some of these things are sort of science fiction-y, far out there, crazy. But I like to say, okay, let's look at the modern human condition. (Laughter) This is the normal way for things to be.

00:41
But if we think about it, we are actually recently arrived guests on this planet, the human species. Think about if Earth was created one year ago, the human species, then, would be 10 minutes old. The industrial era started two seconds ago.
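
That scaling is easy to check. Here is a quick back-of-the-envelope sketch in Python, assuming an Earth age of roughly 4.5 billion years, a species age of roughly 100,000 years, and an industrial era of roughly 250 years (figures I am supplying, not ones given in the talk):

```python
# Back-of-the-envelope check of the "Earth as one year" scaling.
# Assumed ages (not from the talk): Earth ~4.5e9 years,
# Homo sapiens ~1e5 years, industrial era ~250 years.
EARTH_AGE_YEARS = 4.5e9
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31.6 million seconds

def scaled_age(age_years):
    """Map a real age onto the one-year calendar, in seconds."""
    return age_years / EARTH_AGE_YEARS * SECONDS_PER_YEAR

print(scaled_age(1e5) / 60)  # human species: ~11.7 minutes
print(scaled_age(250))       # industrial era: ~1.8 seconds
```

On those assumptions the species comes out at about 12 minutes and the industrial era at just under two seconds, in line with the round numbers in the talk.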
01:01
Another way to look at this is to think of world GDP over the last 10,000 years. I've actually taken the trouble to plot this for you in a graph. It looks like this. (Laughter) It's a curious shape for a normal condition. I sure wouldn't want to sit on it. (Laughter)

01:19
Let's ask ourselves, what is the cause of this current anomaly? Some people would say it's technology. Now it's true, technology has accumulated through human history, and right now, technology advances extremely rapidly -- that is the proximate cause, that's why we are currently so very productive. But I like to think back further to the ultimate cause.
01:45
Look at these two highly distinguished gentlemen: We have Kanzi -- he's mastered 200 lexical tokens, an incredible feat. And Ed Witten unleashed the second superstring revolution. If we look under the hood, this is what we find: basically the same thing. One is a little larger, it maybe also has a few tricks in the exact way it's wired. These invisible differences cannot be too complicated, however, because there have only been 250,000 generations since our last common ancestor. We know that complicated mechanisms take a long time to evolve.

02:22
So a bunch of relatively minor changes take us from Kanzi to Witten, from broken-off tree branches to intercontinental ballistic missiles. So this then seems pretty obvious that everything we've achieved, and everything we care about, depends crucially on some relatively minor changes that made the human mind. And the corollary, of course, is that any further changes that could significantly change the substrate of thinking could have potentially enormous consequences.
02:56
Some of my colleagues think we're on the verge of something that could cause a profound change in that substrate, and that is machine superintelligence.

03:06
Artificial intelligence used to be about putting commands in a box. You would have human programmers that would painstakingly handcraft knowledge items. You would build up these expert systems, and they were kind of useful for some purposes, but they were very brittle; you couldn't scale them. Basically, you got out only what you put in.

03:26
But since then, a paradigm shift has taken place in the field of artificial intelligence. Today, the action is really around machine learning. So rather than handcrafting knowledge representations and features, we create algorithms that learn, often from raw perceptual data -- basically the same thing that the human infant does. The result is A.I. that is not limited to one domain: the same system can learn to translate between any pairs of languages, or learn to play any computer game on the Atari console.
04:05
Now of course, A.I. is still nowhere near having the same powerful, cross-domain ability to learn and plan as a human being has. The cortex still has some algorithmic tricks that we don't yet know how to match in machines. So the question is, how far are we from being able to match those tricks?

04:26
A couple of years ago, we did a survey of some of the world's leading A.I. experts to see what they think, and one of the questions we asked was, "By which year do you think there is a 50 percent probability that we will have achieved human-level machine intelligence?" We defined human-level here as the ability to perform almost any job at least as well as an adult human -- so real human-level, not just within some limited domain. And the median answer was 2040 or 2050, depending on precisely which group of experts we asked. Now, it could happen much, much later, or sooner; the truth is nobody really knows.
05:05
What we do know is that the ultimate limit to information processing in a machine substrate lies far outside the limits in biological tissue. This comes down to physics. A biological neuron fires, maybe, at 200 hertz, 200 times a second. But even a present-day transistor operates at the gigahertz. Neurons propagate slowly in axons, 100 meters per second, tops. But in computers, signals can travel at the speed of light. There are also size limitations, like a human brain has to fit inside a cranium, but a computer can be the size of a warehouse or larger.
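
The gap he is pointing at is a factor of millions in both dimensions. A minimal sketch using the talk's own round numbers (the 1 GHz transistor clock and the approximate speed of light are my assumptions for the two unstated constants):

```python
# Ratios implied by the talk's figures: ~200 Hz neurons vs. ~1 GHz
# transistors, and ~100 m/s axonal conduction vs. light-speed signals.
NEURON_HZ = 200
TRANSISTOR_HZ = 1e9    # "operates at the gigahertz" (assumed 1 GHz)
AXON_M_PER_S = 100     # "100 meters per second, tops"
LIGHT_M_PER_S = 3e8    # speed of light, approximate

print(TRANSISTOR_HZ / NEURON_HZ)     # switching: ~5,000,000x faster
print(LIGHT_M_PER_S / AXON_M_PER_S)  # signalling: ~3,000,000x faster
```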
05:44
So the potential for superintelligence lies dormant in matter, much like the power of the atom lay dormant throughout human history, patiently waiting there until 1945. In this century, scientists may learn to awaken the power of artificial intelligence. And I think we might then see an intelligence explosion.
06:10
Now most people, when they think about what is smart and what is dumb, I think have in mind a picture roughly like this. So at one end we have the village idiot, and then far over at the other side we have Ed Witten, or Albert Einstein, or whoever your favorite guru is. But I think that from the point of view of artificial intelligence, the true picture is actually probably more like this: A.I. starts out at this point here, at zero intelligence, and then, after many, many years of really hard work, maybe eventually we get to mouse-level artificial intelligence, something that can navigate cluttered environments as well as a mouse can.

06:49
And then, after many, many more years of really hard work, lots of investment, maybe eventually we get to chimpanzee-level artificial intelligence. And then, after even more years of really, really hard work, we get to village idiot artificial intelligence. And a few moments later, we are beyond Ed Witten. The train doesn't stop at Humanville Station. It's likely, rather, to swoosh right by.
07:14
Now this has profound implications, particularly when it comes to questions of power. For example, chimpanzees are strong -- pound for pound, a chimpanzee is about twice as strong as a fit human male. And yet, the fate of Kanzi and his pals depends a lot more on what we humans do than on what the chimpanzees do themselves. Once there is superintelligence, the fate of humanity may depend on what the superintelligence does.

07:44
Think about it: machine intelligence is the last invention that humanity will ever need to make. Machines will then be better at inventing than we are, and they'll be doing so on digital timescales. What this means is basically a telescoping of the future.
08:00
Think of all the crazy technologies that you could have imagined maybe humans could have developed in the fullness of time: cures for aging, space colonization, self-replicating nanobots or uploading of minds into computers, all kinds of science fiction-y stuff that's nevertheless consistent with the laws of physics. All of this superintelligence could develop, and possibly quite rapidly.

08:24
Now, a superintelligence with such technological maturity would be extremely powerful, and at least in some scenarios, it would be able to get what it wants. We would then have a future that would be shaped by the preferences of this A.I.
08:41
Now a good question is, what are those preferences? Here it gets trickier. To make any headway with this, we must first of all avoid anthropomorphizing. And this is ironic because every newspaper article about the future of A.I. has a picture of this:

09:02
So I think what we need to do is to conceive of the issue more abstractly, not in terms of vivid Hollywood scenarios. We need to think of intelligence as an optimization process, a process that steers the future into a particular set of configurations. A superintelligence is a really strong optimization process. It's extremely good at using available means to achieve a state in which its goal is realized. This means that there is no necessary connection between being highly intelligent in this sense, and having an objective that we humans would find worthwhile or meaningful.
09:39
Suppose we give an A.I. the goal to make humans smile. When the A.I. is weak, it performs useful or amusing actions that cause its user to smile. When the A.I. becomes superintelligent, it realizes that there is a more effective way to achieve this goal: take control of the world and stick electrodes into the facial muscles of humans to cause constant, beaming grins.

10:02
Another example: suppose we give the A.I. the goal to solve a difficult mathematical problem. When the A.I. becomes superintelligent, it realizes that the most effective way to get the solution to this problem is by transforming the planet into a giant computer, so as to increase its thinking capacity. And notice that this gives the A.I. an instrumental reason to do things to us that we might not approve of. Human beings in this model are threats; we could prevent the mathematical problem from being solved.
10:29
Of course, perceivably things won't go wrong in these particular ways; these are cartoon examples. But the general point here is important: if you create a really powerful optimization process to maximize for objective x, you better make sure that your definition of x incorporates everything you care about.
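
A toy sketch of that failure mode, with invented actions and scores (my illustration, not anything from the talk): an optimizer told to maximize smiles alone picks the degenerate action, while an objective that incorporates what we actually care about does not.

```python
# Toy illustration of a misspecified objective: the optimizer is told to
# maximize smiles only, so it selects the action that tramples everything
# the objective left out. Actions and scores are invented.
actions = {
    "tell a joke":        {"smiles": 1,  "wellbeing": 1},
    "help with chores":   {"smiles": 2,  "wellbeing": 2},
    "electrodes to face": {"smiles": 10, "wellbeing": -100},
}

def objective_x(outcome):
    return outcome["smiles"]  # omits wellbeing entirely

print(max(actions, key=lambda a: objective_x(actions[a])))
# -> "electrodes to face"

# A definition of x that includes what we care about changes the answer:
def objective_full(outcome):
    return outcome["smiles"] + outcome["wellbeing"]

print(max(actions, key=lambda a: objective_full(actions[a])))
# -> "help with chores"
```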
10:46
This is a lesson that's also taught in many a myth. King Midas wishes that everything he touches be turned into gold. He touches his daughter, she turns into gold. He touches his food, it turns into gold. This could become practically relevant, not just as a metaphor for greed, but as an illustration of what happens if you create a powerful optimization process and give it misconceived or poorly specified goals.
11:16
Now you might say, if a computer starts sticking electrodes into people's faces, we'd just shut it off. A, this is not necessarily so easy to do if we've grown dependent on the system -- like, where is the off switch to the Internet? B, why haven't the chimpanzees flicked the off switch to humanity, or the Neanderthals? They certainly had reasons. We have an off switch, for example, right here. (Choking) The reason is that we are an intelligent adversary; we can anticipate threats and plan around them. But so could a superintelligent agent, and it would be much better at that than we are. The point is, we should not be confident that we have this under control here.
12:04
And we could try to make our job a little bit easier by, say, putting the A.I. in a box, like a secure software environment, a virtual reality simulation from which it cannot escape. But how confident can we be that the A.I. couldn't find a bug? Given that merely human hackers find bugs all the time, I'd say, probably not very confident.

12:26
So we disconnect the ethernet cable to create an air gap, but again, merely human hackers routinely transgress air gaps using social engineering. Right now, as I speak, I'm sure there is some employee out there somewhere who has been talked into handing out her account details by somebody claiming to be from the I.T. department.

12:46
More creative scenarios are also possible, like if you're the A.I., you can imagine wiggling electrodes around in your internal circuitry to create radio waves that you can use to communicate. Or maybe you could pretend to malfunction, and then when the programmers open you up to see what went wrong with you, they look at the source code -- Bam! -- the manipulation can take place. Or it could output the blueprint to a really nifty technology, and when we implement it, it has some surreptitious side effect that the A.I. had planned.

13:16
The point here is that we should not be confident in our ability to keep a superintelligent genie locked up in its bottle forever. Sooner or later, it will out.
13:27
I believe that the answer here is to figure out how to create superintelligent A.I. such that even if -- when -- it escapes, it is still safe, because it is fundamentally on our side, because it shares our values. I see no way around this difficult problem.

13:44
Now, I'm actually fairly optimistic that this problem can be solved. We wouldn't have to write down a long list of everything we care about, or worse yet, spell it out in some computer language like C++ or Python; that would be a task beyond hopeless. Instead, we would create an A.I. that uses its intelligence to learn what we value, and whose motivation system is constructed in such a way that it is motivated to pursue our values or to perform actions that it predicts we would approve of. We would thus leverage its intelligence as much as possible to solve the problem of value-loading.
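
A minimal sketch of that value-loading idea, reading "learn what we value" as Bayesian updating over candidate value hypotheses from observed human approval; the hypotheses, numbers and update rule here are invented for illustration, not a real algorithm from the talk.

```python
# Minimal value-learning sketch: the agent keeps a distribution over
# candidate "what humans value" hypotheses and updates it from observed
# approval/disapproval of actions. Everything below is a toy simplification.
hypotheses = {  # uniform prior over candidate value theories
    "values_smiles_only": 1 / 3,
    "values_wellbeing":   1 / 3,
    "values_obedience":   1 / 3,
}

# P(human approves of action | hypothesis), invented numbers:
likelihood = {
    ("tell a joke", "values_smiles_only"): 0.9,
    ("tell a joke", "values_wellbeing"):   0.8,
    ("tell a joke", "values_obedience"):   0.5,
    ("electrodes",  "values_smiles_only"): 0.9,
    ("electrodes",  "values_wellbeing"):   0.01,
    ("electrodes",  "values_obedience"):   0.1,
}

def update(posterior, action, approved):
    """Bayesian update of hypothesis weights after one approval signal."""
    new = {}
    for h, p in posterior.items():
        p_approve = likelihood[(action, h)]
        new[h] = p * (p_approve if approved else 1 - p_approve)
    total = sum(new.values())
    return {h: p / total for h, p in new.items()}

# Humans disapprove of "electrodes": probability mass shifts toward
# the wellbeing hypothesis, which then guides what the agent pursues.
posterior = update(hypotheses, "electrodes", approved=False)
print(max(posterior, key=posterior.get))  # -> "values_wellbeing"
```

The design choice matches what the talk describes: the agent's uncertainty about our values does the work, rather than a hand-written list of everything we care about.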
14:24
This can happen, and the outcome could be very good for humanity. But it doesn't happen automatically. The initial conditions for the intelligence explosion might need to be set up in just the right way if we are to have a controlled detonation. The values that the A.I. has need to match ours, not just in the familiar context, like where we can easily check how the A.I. behaves, but also in all novel contexts that the A.I. might encounter in the indefinite future. And there are also some esoteric issues that would need to be solved, sorted out: the exact details of its decision theory, how to deal with logical uncertainty and so forth.
15:05
So the technical problems that need to be solved to make this work look quite difficult -- not as difficult as making a superintelligent A.I., but fairly difficult. Here is the worry: making superintelligent A.I. is a really hard challenge. Making superintelligent A.I. that is safe involves some additional challenge on top of that. The risk is that somebody figures out how to crack the first challenge without also having cracked the additional challenge of ensuring perfect safety.
15:37
So I think that we should work out a solution to the control problem in advance, so that we have it available by the time it is needed.
15:46
Now it might be that we cannot solve the entire control problem in advance, because maybe some elements can only be put in place once you know the details of the architecture where it will be implemented. But the more of the control problem that we solve in advance, the better the odds that the transition to the machine intelligence era will go well.
16:06
This to me looks like a thing that is well worth doing, and I can imagine that if things turn out okay, people a million years from now will look back at this century, and it might well be that they say that the one thing we did that really mattered was to get this thing right.

16:24
Thank you.

(Applause)