How will AI change the world?


TED-Ed



In the coming years, artificial intelligence is probably going to change your life, and likely the entire world. But people have a hard time agreeing on exactly how. The following are excerpts from an interview where renowned computer science professor and AI expert Stuart Russell helps separate the sense from the nonsense.

There’s a big difference between asking a human to do something and giving that as the objective to an AI system. When you ask a human to get you a cup of coffee, you don’t mean this should be their life’s mission, and nothing else in the universe matters. Even if they have to kill everybody else in Starbucks to get you the coffee before it closes— they should do that. No, that’s not what you mean. All the other things that we mutually care about, they should factor into your behavior as well.

And the problem with the way we build AI systems now is we give them a fixed objective. The algorithms require us to specify everything in the objective. And if you say, can we fix the acidification of the oceans? Yeah, you could have a catalytic reaction that does that extremely efficiently, but it consumes a quarter of the oxygen in the atmosphere, which would apparently cause us to die fairly slowly and unpleasantly over the course of several hours.

So, how do we avoid this problem? You might say, okay, well, just be more careful about specifying the objective— don’t forget the atmospheric oxygen. And then, of course, some side effect of the reaction in the ocean poisons all the fish. Okay, well I meant don’t kill the fish either. And then, well, what about the seaweed? Don’t do anything that’s going to cause all the seaweed to die. And on and on and on.

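That patching loop is easy to caricature in code. The sketch below is purely illustrative (the plans, scores, and side effects are made up; this is not any real planner): a fixed-objective optimizer is re-run each time we remember another constraint, and each run surfaces the next loophole.

```python
# Toy illustration of the patching loop described above. Hypothetical
# plans and constraints only.

# Each candidate plan: (name, objective_score, side_effects)
PLANS = [
    ("catalytic_reaction",   1.0, {"burns_oxygen"}),
    ("oxygen_safe_variant",  0.9, {"poisons_fish"}),
    ("fish_safe_variant",    0.8, {"kills_seaweed"}),
    ("seaweed_safe_variant", 0.7, {"unknown_side_effect"}),
]

def optimize(plans, forbidden):
    # A fixed-objective optimizer: maximize the score, subject only to
    # the constraints we have remembered to write down so far.
    allowed = [p for p in plans if not (p[2] & forbidden)]
    return max(allowed, key=lambda p: p[1])

forbidden = set()
for _ in range(3):
    name, score, effects = optimize(PLANS, forbidden)
    print(f"optimizer picks: {name} (side effects: {effects})")
    forbidden |= effects  # patch the objective... and around we go again
```

The point of the toy: the loop never terminates on its own, because the optimizer only ever respects the constraints we have already thought to write down.
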
And the reason that we don’t have to do that with humans is that humans often know that they don’t know all the things that we care about. If you ask a human to get you a cup of coffee, and you happen to be in the Hotel George Sand in Paris, where the coffee is 13 euros a cup, it’s entirely reasonable to come back and say, well, it’s 13 euros, are you sure you want it, or I could go next door and get one? And it’s a perfectly normal thing for a person to do: to ask, I’m going to repaint your house— is it okay if I take off the drainpipes and then put them back? We don’t think of this as a terribly sophisticated capability, but AI systems don’t have it, because the way we build them now, they have to know the full objective.

If we build systems that know that they don’t know what the objective is, then they start to exhibit these behaviors, like asking permission before getting rid of all the oxygen in the atmosphere. In all these senses, control over the AI system comes from the machine’s uncertainty about what the true objective is. And it’s when you build machines that believe with certainty that they have the objective, that’s when you get this sort of psychopathic behavior. And I think we see the same thing in humans.

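The contrast Russell is drawing can be sketched the same way. The toy below is a hypothetical illustration, not his actual formalism (he and his colleagues study this under the name "assistance games"): an agent that is certain of its objective commits to the highest-scoring action, while an agent that keeps several hypotheses about the human’s true preferences asks before taking an action that some hypothesis rates as catastrophic.

```python
# Toy contrast between a fixed-objective agent and an objective-uncertain
# agent. All names and numbers are illustrative, not from Russell's work.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    goal_progress: float   # progress on the stated objective
    oxygen_lost: float     # unstated side effect

ACTIONS = [
    Action("catalytic_reaction", goal_progress=1.0, oxygen_lost=0.25),
    Action("slow_safe_process",  goal_progress=0.3, oxygen_lost=0.0),
]

def certain_agent(actions):
    # Believes the stated objective is the whole objective.
    return max(actions, key=lambda a: a.goal_progress)

def uncertain_agent(actions, ask_human):
    # Entertains several hypotheses about how much the human values oxygen.
    oxygen_weights = [0.0, 10.0, 1000.0]
    def worst_case(a):
        return min(a.goal_progress - w * a.oxygen_lost
                   for w in oxygen_weights)
    best = max(actions, key=lambda a: a.goal_progress)
    if worst_case(best) < 0:
        # Some hypothesis says this action is catastrophic: ask first.
        return best if ask_human(best) else max(actions, key=worst_case)
    return best

print(certain_agent(ACTIONS).name)                     # catalytic_reaction
print(uncertain_agent(ACTIONS, lambda a: False).name)  # slow_safe_process
```

The mechanism mirrors the line above: control comes from the machine’s uncertainty, because under uncertainty the cheap, high-value move is to defer to the human rather than gamble on a hypothesis that might be catastrophically wrong.
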
What happens when general purpose AI hits the real economy? How do things change? Can we adapt? This is a very old point. Amazingly, Aristotle actually has a passage where he says, look, if we had fully automated weaving machines and plectrums that could pluck the lyre and produce music without any humans, then we wouldn’t need any workers. That idea, which I think it was Keynes who called technological unemployment in 1930, is very obvious to people. They think, yeah, of course, if the machine does the work, then I’m going to be unemployed.

You can think about the warehouses that companies are currently operating for e-commerce; they are half automated. The way it works is that in an old warehouse— where you’ve got tons of stuff piled up all over the place and humans go and rummage around and then bring it back and send it off— there’s a robot who goes and gets the shelving unit that contains the thing that you need, but the human has to pick the object out of the bin or off the shelf, because that’s still too difficult. But, at the same time, if you could make a robot that is accurate enough to be able to pick pretty much any object within a very wide variety of objects that you can buy, that would, at a stroke, eliminate 3 or 4 million jobs.

There’s an interesting story that E.M. Forster wrote, where everyone is entirely machine dependent. The story is really about the fact that if you hand over the management of your civilization to machines, you then lose the incentive to understand it yourself or to teach the next generation how to understand it. You can see “WALL-E” actually as a modern version, where everyone is enfeebled and infantilized by the machine, and that hasn’t been possible up to now. We put a lot of our civilization into books, but the books can’t run it for us. And so we always have to teach the next generation. If you work it out, it’s about a trillion person-years of teaching and learning, and an unbroken chain that goes back tens of thousands of generations. What happens if that chain breaks? I think that’s something we have to understand as AI moves forward.

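The trillion person-years figure holds up as a back-of-envelope estimate: assuming the common demographic estimate of roughly 100 billion humans ever born, and on the order of 10 years per person spent teaching and being taught, that gives 10^11 people × 10 years ≈ 10^12 person-years, which is about a trillion.
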
The actual date of arrival of general purpose AI— you’re not going to be able to pinpoint it; it isn’t a single day. It’s also not the case that it’s all or nothing. The impact is going to be increasing. So with every advance in AI, it significantly expands the range of tasks. So in that sense, I think most experts say that by the end of the century we’re very, very likely to have general purpose AI. The median is something around 2045. I’m a little more on the conservative side; I think the problem is harder than we think. I like what John McCarthy, one of the founders of AI, said when he was asked this question: somewhere between five and 500 years. And we’re going to need, I think, several Einsteins to make it happen.
