Is AI the most important technology of the century?

TED-Ed
00:07
What's the most important century in human history?

00:11
Some might argue it's a period of extensive military campaigning, like Alexander the Great's in the 300s BCE, which reshaped political and cultural borders.

00:23
Others might cite the emergence of a major religion, such as Islam in the 7th century, which codified and spread values across such borders.

00:35
Or perhaps it's the Industrial Revolution of the 1700s that transformed global commerce and redefined humanity's relationship with labor.

00:44
Whatever the answer, it seems like any century vying for that top spot is at a moment of great change, when the actions of our ancestors shifted humanity's trajectory for centuries to come.

00:57
So if this is our metric, is it possible that right now, this century, is the most important one yet?
01:05
The 21st century has already proven to be a period of rapid technological growth. Phones and computers have accelerated the pace of life. And we're likely on the cusp of developing new transformative technologies, like advanced artificial intelligence, that could entirely change the way people live.

01:25
Meanwhile, many technologies we already have contribute to humanity's unprecedented levels of existential risk: the risk of our species going extinct, or experiencing some kind of disaster that permanently limits humanity's ability to grow and thrive.
01:43
The invention of the atomic bomb marked a major rise in existential risk, and since then we've only increased the odds against us.

01:52
It's profoundly difficult to estimate the odds of an existential collapse occurring this century. Very rough guesses put the risk of existential catastrophe due to nuclear winter and climate change at around 0.1%, with the odds of a pandemic causing the same kind of collapse at a frightening 3%.

02:14
Given that any of these disasters could mean the end of life as we know it, these aren't exactly small figures.
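A quick back-of-the-envelope combination of those guesses shows why. The Python sketch below is purely illustrative: it assumes the hazards are independent, and it reads the quoted figures as roughly 0.1% each for nuclear winter and climate change and 3% for a pandemic. Both readings are assumptions for the sake of the arithmetic, not claims made in the talk.

    # Purely illustrative: combines the talk's very rough per-hazard
    # guesses, assuming (unrealistically) that the hazards are independent
    # and that the 0.1% figure applies to each of the first two hazards.
    risks = {
        "nuclear winter": 0.001,  # ~0.1% (assumed reading)
        "climate change": 0.001,  # ~0.1% (assumed reading)
        "pandemic": 0.03,         # ~3%
    }

    # P(at least one catastrophe) = 1 - P(no catastrophe from any hazard)
    p_none = 1.0
    for p in risks.values():
        p_none *= 1.0 - p

    print(f"Combined existential risk this century: {1 - p_none:.2%}")
    # Prints roughly 3.19%

Even in this toy model, the total stays close to the largest single term, which is why the 3% pandemic estimate is the headline figure.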
02:21
And it's possible this century could see the rise of new technologies that introduce more existential risks.

02:29
AI experts have a wide range of estimates regarding when artificial general intelligence will emerge, but according to some surveys, many believe it could happen this century.

02:39
Currently, we have relatively narrow forms of artificial intelligence, which are designed to do specific tasks like play chess or recognize faces. Even narrow AIs that do creative work are limited to their singular specialty.

02:54
But artificial general intelligences, or AGIs, would be able to adapt to and perform any number of tasks, quickly outpacing their human counterparts.

03:06
There are a huge variety of guesses about what AGI could look like, and what it would mean for humanity to share the Earth with another sentient entity.
03:18
AGIs might help us achieve our goals, they might regard us as inconsequential, or they might see us as an obstacle to swiftly remove.

03:26
So in terms of existential risk, it's imperative the values of this new technology align with our own. This is an incredibly difficult philosophical and engineering challenge that will require a lot of delicate, thoughtful work.

03:40
Yet, even if we succeed, AGI could still lead to another complicated outcome.
03:46
Let's imagine an AGI emerges with deep respect for human life and a desire to solve all humanity's troubles. But to avoid becoming misaligned, it's been developed to be incredibly rigid about its beliefs.

04:01
If these machines became the dominant power on Earth, their strict values might become hegemonic, locking humanity into one ideology that would be incredibly resistant to change.

04:14
History has taught us that no matter how enlightened a civilization thinks it is, it rarely lives up to the moral standards of later generations.

04:23
And this kind of value lock-in could permanently distort or constrain humanity's moral growth.
04:30
There's a ton of uncertainty around AGI, and it's profoundly difficult to predict how any existential risks will play out over the next century. It's also possible that new, more pressing concerns might render these risks moot.

04:43
But even if we can't definitively say that ours is the most important century, it still seems like the decisions we make might have a major impact on humanity's future.

04:54
So maybe we should all live like the future depends on us, because actually, it just might.