Daniel Suarez: The kill decision shouldn't belong to a robot

76,207 views · 2013-06-13

TED


ืื ื ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ืœืžื˜ื” ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ.

00:00
Translator: Joseph Geni, Reviewer: Morton Bast
00:12
I write fiction, sci-fi thrillers, so if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots, autonomous combat drones.
00:29
Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy.
00:48
Now, lethally autonomous killer robots would take many forms -- flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality. These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it, the one on the left at a distance of over a kilometer. Now, in both cases, there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice.
01:27
And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield. That's because the way humans resolve conflict shapes our social landscape. And this has always been the case, throughout history.
01:54
For example, these were state-of-the-art weapons systems in 1400 A.D. Now they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top. And what changed? Technological innovation. Gunpowder, cannon. And pretty soon, armor and castles were obsolete, and it mattered less who you brought to the battlefield versus how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form. So again, the tools we use to resolve conflict shape our social landscape.
02:48
Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy. Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor.
03:21
Seventy nations are developing remotely piloted combat drones of their own, and as you'll see, remotely piloted combat drones are the precursors to autonomous robotic weapons. That's because once you've deployed remotely piloted drones, there are three powerful factors pushing decision-making away from humans and onto the weapon platform itself.
03:44
The first of these is the deluge of video that drones produce. For example, in 2004, the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011, this had gone up to 300,000 hours, outstripping human ability to review it all, and even that number is about to go up drastically. The Pentagon's Gorgon Stare and Argus programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.
04:33
But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now we saw an example of this in 2011 when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack, but any remotely piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making. They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.
05:21
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability. Now we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown, and in that environment, it is very likely that a successful drone design will be knocked off in contract factories, proliferate in the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon. This raises the very real possibility of anonymous war.
06:10
This could tilt the geopolitical balance on its head, make it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society. Now if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.
06:50
Now you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite. I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data. Data powers high-tech societies. Cell phone geolocation, telecom metadata, social media, email, text, financial transaction data, transportation data, it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.
07:47
What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind. Now it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context. Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people also can be automatically identified from their communication patterns.
08:28
Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.
08:51
Now in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or transnational criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about. Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.
09:36
And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.
10:07
Now in November 2012 the U.S. Department of Defense issued a directive requiring a human being be present in all lethal decisions. This effectively, if temporarily, banned autonomous weapons in the U.S. military, but that directive needs to be made permanent. And it could set the stage for global action.
10:31
Because we need an international legal framework for robotic weapons. And we need it now, before there's a devastating attack or a terrorist incident that causes nations of the world to rush to adopt these weapons before thinking through the consequences. Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.
10:58
Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics. If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons? I think the secret will be transparency. No robot should have an expectation of privacy in a public place.
11:31
(Applause)
11:36
Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different.
11:49
And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically.
12:00
And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending killer drones of their own up to shoot them down, they should notify humans of their presence. And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility. But notice, this is more an immune system than a weapons system.
12:23
It would allow us to avail ourselves of the use of autonomous vehicles and drones while still preserving our open, civil society. We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war.
12:42
Autocratic governments and criminal organizations undoubtedly will, but let's not join them. Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government. Let's make sure, for democracies at least, killer robots remain fiction.

13:03
Thank you.

(Applause)

Thank you. (Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7