Daniel Suarez: The kill decision shouldn't belong to a robot

76,064 views ・ 2013-06-13

TED



00:00
Translator: Joseph Geni Reviewer: Morton Bast
00:12
I write sci-fi thrillers, so if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots: autonomous combat drones.
00:29
Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy.
00:48
Now, lethally autonomous killer robots would take many forms -- flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality. These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it, the one on the left at a distance of over a kilometer.
01:16
Now, in both cases, there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice. And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield. That's because the way humans resolve conflict shapes our social landscape.
01:52
And this has always been the case, throughout history. For example, these were state-of-the-art weapons systems in 1400 A.D. Now, they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top. And what changed? Technological innovation. Gunpowder, cannon. And pretty soon, armor and castles were obsolete, and it mattered less who you brought to the battlefield versus how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form.
02:42
So again, the tools we use to resolve conflict shape our social landscape. Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy.
03:09
Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor. Seventy nations are developing remotely-piloted combat drones of their own, and as you'll see, remotely-piloted combat drones are the precursors to autonomous robotic weapons. That's because once you've deployed remotely-piloted drones, there are three powerful factors pushing decision-making away from humans and onto the weapon platform itself.
03:44
The first of these is the deluge of video that drones produce. For example, in 2004, the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011, this had gone up to 300,000 hours, outstripping human ability to review it all, but even that number is about to go up drastically. The Pentagon's Gorgon Stare and Argus programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.
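To see why automated triage becomes unavoidable, here is a rough back-of-envelope sketch in Python. Only the 71-hour and 300,000-hour totals come from the talk; the per-analyst review capacity is an invented, illustrative assumption.

    # Back-of-envelope: can humans review the drone video deluge?
    HOURS_2004 = 71          # U.S. drone fleet video, 2004 (from the talk)
    HOURS_2011 = 300_000     # U.S. drone fleet video, 2011 (from the talk)

    # Assumption for illustration: one analyst reviews footage in real
    # time, roughly 6 hours a day for 250 working days a year.
    REVIEW_HOURS_PER_ANALYST_PER_YEAR = 6 * 250

    growth = HOURS_2011 / HOURS_2004
    analysts_needed = HOURS_2011 / REVIEW_HOURS_PER_ANALYST_PER_YEAR

    print(f"Growth 2004 -> 2011: {growth:,.0f}x")
    print(f"Analysts for a single pass over 2011 footage: {analysts_needed:,.0f}")
    # ~4,225x growth and ~200 analysts for one viewing pass -- before
    # Gorgon Stare multiplies each platform's output by up to 65 cameras.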
04:33
But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now, we saw an example of this in 2011, when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack, but any remotely-piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making. They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.
05:21
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability. Now, we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown, and in that environment, it is very likely that a successful drone design will be knocked off in contract factories and proliferate in the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon.
06:04
This raises the very real possibility of anonymous war. This could tilt the geopolitical balance on its head, make it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but for criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society. Now, if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.
06:50
Now, you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite. I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data. Data powers high-tech societies. Cell phone geolocation, telecom metadata, social media, email, text, financial transaction data, transportation data -- it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.
07:47
What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind. Now, it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context. Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people also can be automatically identified from their communication patterns.
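To make that concrete, here is a minimal sketch of how hub identification on such a map can work: build a graph from observed communication pairs and rank people by degree centrality, that is, by how many direct links each one has. The names and edges below are invented for illustration; real link analysis draws on far richer signals than raw link counts.

    from collections import Counter

    # Toy link-analysis map: each pair is one observed communication
    # link between two people (names and edges invented for illustration).
    edges = [
        ("ana", "bo"), ("ana", "cy"), ("ana", "dee"), ("ana", "eli"),
        ("bo", "cy"), ("dee", "eli"), ("eli", "fay"), ("fay", "gus"),
    ]

    # Degree centrality: count how many links touch each person.
    degree = Counter()
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1

    # The highest-degree nodes are the "hubs" -- the organizers,
    # opinion-makers, and leaders the talk describes.
    for person, links in degree.most_common(3):
        print(f"{person}: {links} links")
    # "ana" tops the list: remove her and the group loses most of its
    # connective tissue -- exactly the dual-use risk described above.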
08:28
Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.
08:51
Now, in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or transnational criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about. Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.
09:36
And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now, we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.
10:07
Now, in November 2012, the U.S. Department of Defense issued a directive requiring that a human being be present in all lethal decisions. This effectively, if temporarily, banned autonomous weapons in the U.S. military, but that directive needs to be made permanent. And it could set the stage for global action, because we need an international legal framework for robotic weapons. And we need it now, before there's a devastating attack or a terrorist incident that causes nations of the world to rush to adopt these weapons before thinking through the consequences. Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.
10:58
Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics. If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons? I think the secret will be transparency. No robot should have an expectation of privacy in a public place.

(Applause)
11:36
Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different. And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically.
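As a sketch of what such a factory-signed I.D. could look like, here is a minimal Python example using Ed25519 signatures from the third-party "cryptography" package. The key handling, the I.D. format, and the verification flow are illustrative assumptions; a real scheme would involve certificate chains, revocation lists, and tamper-resistant hardware.

    # Minimal sketch of a factory-signed drone I.D. (illustrative only).
    # Requires the third-party "cryptography" package.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )
    from cryptography.exceptions import InvalidSignature

    # At the factory: a signing key (in reality kept in secure hardware)
    # signs each airframe's serial record before it ships.
    factory_key = Ed25519PrivateKey.generate()
    factory_pub = factory_key.public_key()

    drone_id = b"SN-000123|MAKER-X|2013-06"   # hypothetical I.D. format
    signature = factory_key.sign(drone_id)    # burned in beside the I.D.

    # In the field: a sensor hears a drone broadcast its I.D. plus
    # signature and checks them against the maker's public key.
    def is_registered(broadcast_id: bytes, sig: bytes) -> bool:
        try:
            factory_pub.verify(sig, broadcast_id)
            return True
        except InvalidSignature:
            return False

    print(is_registered(drone_id, signature))          # True: known drone
    print(is_registered(b"SN-999999|???", signature))  # False: forged I.D.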
12:00
And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending up killer drones of their own to shoot them down, they should notify humans of their presence. And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility. But notice, this is more an immune system than a weapons system. It would allow us to avail ourselves of the use of autonomous vehicles and drones while still preserving our open, civil society.
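Continuing the sketch above, the "immune system" behavior is essentially detect, classify, and notify -- never engage. Assuming a verifier like the hypothetical is_registered() from the previous example, passed in as a callable, the response logic might look like this:

    # Immune-system response: detect and notify, never shoot down.
    # The verifier callable is assumed to behave like the hypothetical
    # is_registered() sketched above.
    def handle_broadcast(drone_id: bytes, sig: bytes, location: str,
                         is_registered) -> str:
        if is_registered(drone_id, sig):
            # Known drone: log the sighting for the public tracking app.
            return f"LOG: {drone_id!r} seen at {location}"
        # Unknown or forged I.D.: alert people nearby instead of engaging.
        return f"ALERT: unverified drone at {location} -- notifying residents"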
12:32
We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war. Autocratic governments and criminal organizations undoubtedly will, but let's not join them. Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government. Let's make sure, for democracies at least, killer robots remain fiction.

13:03
Thank you.

(Applause)

Thank you.

(Applause)