The AI Arsenal That Could Stop World War III | Palmer Luckey | TED

5,253 views ・ 2025-04-25

TED



I want you to imagine something. In the early hours of a massive surprise invasion of Taiwan, China unleashes its full arsenal. Ballistic missiles rain down on key military installations, neutralizing air bases and command centers before Taiwan can fire a single shot. The People's Liberation Army Navy moves in with overwhelming force, deploying amphibious assault ships and aircraft carriers while cyberattacks cripple Taiwan's infrastructure and prevent emergency response. The Chinese Rocket Force's long-range missiles shred through our defenses. Ships, command-and-control nodes and critical assets are destroyed before they can even engage.

The United States attempts to respond, but it quickly becomes clear: we don't have enough. Not enough weapons, not enough platforms to carry those weapons. American warships, too slow and too few, sink to the bottom of the Pacific under anti-ship missile swarms. Our fighter jets, piloted by brave but outnumbered human pilots, are shot down one by one. The United States exhausts its shallow arsenal of precision munitions in a mere eight days. Taiwan falls within weeks. And the world wakes up to a new reality, one where the world's dominant power is no longer a democracy.

This is the war US military analysts fear most. Not just because of outdated technology or slow decision-making, but because our lack of capacity, our sheer shortage of tools and platforms, means we can't even get into the fight.

When China invades Taiwan, the consequences will be global. Taiwan is the undisputed epicenter of the world's chip supply, producing over 90 percent of the most advanced semiconductors: the high-performance chips that power today's AI, GPUs and robotics. These are also the chips that power your phones, computers, cars and medical devices. If those factories are seized or destroyed, the global economy will crash overnight. Tens of trillions of dollars in losses, supply chains in chaos, the worst economic depression in a century.

And the danger is more than economic. It's ideological. China is an autocracy. And a world where China dictates the terms of international order is a world where individual freedoms erode, authoritarianism spreads and smaller nations are forced into submission. And before anyone shrugs this off as the plot of Michael Bay's latest movie, we've seen this film before. Just ask Ukraine.
02:32
At this point, you might be wondering
54
152216
1768
02:33
why a guy in a Hawaiian shirt and flip flops is up here
55
153984
2636
02:36
talking about potential World War III.
56
156654
1835
02:38
My name is Palmer Luckey, I'm an inventor and an entrepreneur.
57
158489
2903
02:41
When I was 19 years old,
58
161425
1168
02:42
I founded Oculus VR while I was living in a camper trailer
59
162626
2770
02:45
and then brought virtual reality to the masses.
60
165396
2302
02:47
Years later, I was fired from Facebook after donating 9,000 dollars
61
167731
3170
02:50
to the wrong political candidate.
62
170935
1601
02:52
And that left me with a choice.
63
172570
1635
02:54
Either fade into irrelevance and islands
64
174238
3437
02:57
or build something that actually mattered.
65
177708
2669
03:00
I wanted to solve a problem that was being ignored,
66
180377
2436
03:02
one that would shape the future of this country and the world.
67
182813
3237
03:06
Despite the incredible technological progress
68
186617
2135
03:08
happening all around us,
69
188752
1168
03:09
our defense sector was stuck in the past.
70
189954
2736
03:13
The biggest defense contractors had stopped innovating
71
193057
3403
03:16
as fast as they had before,
72
196493
1635
03:18
prioritizing shareholder dividends over advanced capability.
73
198162
3770
03:22
Prioritizing bureaucracy over breakthroughs.
74
202566
2836
03:26
Silicon Valley, which was home to many of our top engineers and scientists,
75
206337
3536
03:29
had turned its back on defense
76
209873
1869
03:31
and the military writ large,
77
211775
1368
03:33
betting on China as the only economy or government worth pandering to.
78
213143
3938
03:37
Tech companies that once partnered with the military
79
217648
2502
03:40
had decided that national security was someone else's problem.
80
220184
3036
03:43
The result?
81
223721
1401
03:45
Your Tesla has better AI than any US aircraft.
82
225155
3971
03:49
Your Roomba has better autonomy
83
229159
1635
03:50
than most of the Pentagon’s weapons systems.
84
230828
2068
03:52
And your Snapchat filters,
85
232896
1836
03:54
they rely on better computer vision
86
234765
1702
03:56
than our most advanced military sensors.
87
236467
2569
03:59
Now I knew that if both the smartest minds in technology
88
239570
3003
04:02
and the biggest players in defense
89
242606
1669
04:04
both deprioritized innovation,
90
244275
2802
04:07
the United States would forever lose its ability to protect our way of life.
91
247111
3603
04:11
And with so few willing to solve that problem,
92
251115
2169
04:13
I decided that I would try my best.
93
253317
2202
04:16
So I founded a company called Anduril.
94
256153
1869
04:18
Not a defense contractor but a defense product company.
95
258055
2836
04:21
We spend our own money building defense products that work,
96
261258
3437
04:24
rather than asking taxpayers to foot the bill.
97
264695
3236
04:27
The result is that we move much faster and at lower cost
98
267965
3337
04:31
than most traditional primes.
99
271335
2069
04:33
Our first pitch deck to our investors, who are very aligned with us,
100
273437
3203
04:36
said it plainly:
101
276674
1167
04:37
we will save taxpayers hundreds of billions of dollars a year
102
277841
3370
04:41
by making tens of billions of dollars a year.
103
281245
2869
04:44
Now while we make dozens of different hardware products,
104
284882
3069
04:47
our core system is a piece of software,
105
287985
3170
04:51
an AI platform called Lattice,
106
291188
1869
04:53
that lets us deploy millions of weapons
107
293090
2269
04:55
without risking millions of lives.
108
295392
2503
04:57
It also allows us to make updates to those weapons at the speed of code,
109
297928
3404
05:01
ensuring we always stay one step ahead
110
301332
2068
05:03
of emerging and reactive threats.
111
303434
2035
05:06
Another big difference is that we design hardware for mass production
112
306503
3270
05:09
using existing infrastructure and industrial base.
113
309773
3003
05:13
Unlike traditional contractors, we build, test and deploy our products
114
313143
3470
05:16
in months, not years.
115
316647
2269
05:18
That approach has allowed us, in less than eight years,
116
318949
2603
05:21
to build autonomous fighter jets for the United States Air Force,
117
321552
3069
05:24
school bus-sized autonomous submarines for the Australian Navy,
118
324655
2969
05:27
and augmented reality headsets
119
327624
1469
05:29
that give every one of our superheroes superpowers,
120
329093
2402
05:31
to name just a few.
121
331528
1168
05:32
We also build counter-drone technology like Roadrunner here,
122
332696
2870
05:35
which is a twin turbojet-powered, reusable counter-drone interceptor
123
335566
3236
05:38
that we took from napkin sketch
124
338802
1535
05:40
to real-world combat-validated capability
125
340371
2135
05:42
in less than 24 months.
126
342539
1802
05:44
And we did it using our own money.
127
344341
2036
Now, coming from a guy who builds weapons for a living, what I'm about to say next might sound counterintuitive to you. At our core, we're about fostering peace. We deter conflict by making sure our adversaries know they can't compete. Putin invaded Ukraine because he believed that he could win. Countries only go to war when they disagree as to who the victor will be. That's what deterrence is all about. Not saber-rattling. Making aggression so costly that adversaries don't try in the first place.

So how do we do that? For centuries, military power was derived from size: more troops, more tanks, more firepower. But over the last few decades, the defense industry has spent far too long handcrafting exquisite, almost impossible-to-build weapons. Meanwhile, China has studied how we fight, and they've invested in the technologies and the mass that counter our specific strategies. Today, China has the world's largest navy, with 232 times the shipbuilding capacity of the United States; the world's largest coast guard; the world's largest standing ground force; and the world's largest missile arsenal, with production capacity growing every single day.

We'll never meet China's numerical advantage through traditional means, nor should we try. What we need isn't more of these same systems. We need fundamentally different capabilities. We need autonomous systems that can augment our existing manned fleets. We need intelligent platforms that can operate in contested environments where human-piloted systems simply cannot. We need weapons that can be produced at scale, deployed rapidly and updated continuously.

Mass production matters. In a conflict where our capacity is our greatest vulnerability, what we really need is a production model that mirrors the best of our commercial sector: fast, scalable and resilient. We know how to win like this. We rallied our industrial base during World War II to mass-produce weapons at an unprecedented scale. It's how we won. The Ford Motor Company, for example, produced one B-24 bomber every 63 minutes.

But to actually achieve the benefits of these mass-produced systems, we need them to be smarter. This is where AI comes in. AI is the only possible way we can keep up with China's numerical advantage. We don't want to throw millions of people into the fight like they do. We can't do it, and we shouldn't do it. AI software allows us to build a different kind of force, one that isn't limited by cost or complexity or population or manpower, but instead by adaptability, scale and speed of manufacturing.

Now, the ethical implications of AI in warfare are serious. But here's the truth: if the United States doesn't lead in this space, authoritarian regimes will. And they won't be concerned with our ethical norms. AI enhances decision-making. It increases precision. It reduces collateral damage. Hopefully, it can eliminate some conflicts altogether. The good news is that the US and our allies have the technology, human capital and expertise to mass-produce these new kinds of autonomous systems and launch a new golden age of defense production.
With all that information in mind, let's go back to Taiwan, but imagine a different scenario. The attack might begin the same way: Chinese missiles streak toward Taiwan. But this time, the response is instant. A fleet of AI-driven, autonomous drones, already stationed in the region by allies, launches within seconds. Swarming together in coordinated attacks, they intercept incoming Chinese bombers and cruise missiles before they ever reach Taiwan. In the Pacific, a distributed force of unmanned submarines, stealthy drone warships and autonomous aircraft working alongside manned systems strikes from unpredictable locations. Our AI-piloted fighter swarms engage Chinese aircraft in dogfights, responding faster than any human possibly could. On the ground, robotic sentries and AI-assisted long-range fires halt China's amphibious assault before a single Chinese boot reaches Taiwan's shores.

By deploying autonomous systems like these at scale, we prove to our adversaries that we have the capacity to win. That is how we reclaim our deterrence. To do so, we just have to stand with our allies across the world, united by the common values and resolve that we've shared for the better part of a century. Our defenders, the men and women who volunteer to risk their lives, deserve technology that makes them stronger, faster and safer. Anything less is a betrayal, because that technology is available today. This is how we prevent a repeat of Pearl Harbor. We could be the second greatest generation by rethinking warfare altogether. Thank you.

(Applause)
Bilawal Sidhu: Thank you, Palmer. You painted a very vivid picture of the future of warfare and deterrence. I want to ask you a couple of questions. I think one that's on a lot of people's minds is autonomy in the military kill chain. With the rise of AI, are we contending with a fundamentally new set of questions here? Because some advocate that we shouldn't build autonomous systems or killer robots at all. What's your take on that?

Palmer Luckey: I love killer robots.

(Laughter)

The thing that people have to remember is that this idea of humans building tools that divorce the design of the tool from the moment the decision is made to enact violence is not something new. We've been doing it for thousands of years: pit traps, spike traps, a huge variety of weapons even into the modern era. Think about anti-ship mines, purely defensive tools that are fundamentally autonomous. Whether or not to use AI is a very modern problem, and it's one where people who haven't examined it usually fall into a trap. There are people who say things that sound pretty good, like: you should never allow a robot to pull the trigger; you should never allow AI to decide who lives and who dies.

I look at it in a different way. I think that the ethics of warfare are so fraught, and the decisions so difficult, that to artificially box yourself in and refuse to use sets of technology that could lead to better results is an abdication of responsibility. There's no moral high ground in saying, "I refuse to use AI, because I don't want mines to be able to tell the difference between a school bus full of children and Russian armor." There's a thousand problems like this. The right way to look at this is problem by problem. Is this ethical? Are people taking responsibility for this use of force? The right way is not to write off an entire category of technology, and in doing so, tie our hands behind our backs and hope we can still win. I can't abide by that.

(Applause)

BS: You're right. If the information is available to you, why not create systems that actually take advantage of it? If you blind yourself to it, the result could be far more catastrophic.

PL: Precisely. And usually non-technical people will say things like, why not just make it all remote control? They don't recognize the scale of the conflicts we're talking about; they don't lend themselves to a one-to-one ratio of people to systems. To say nothing of the fact that with a remotely piloted system, all an adversary has to do is break the remote part and everything falls apart. There's no moral high ground either in saying, all you have to do is figure out how to jam us and you win.

BS: And it sounds like a lot of defense systems that exist today already have this type of autonomous mode?

PL: This is another point. It's usually not one that I make on a stage, but I'll get confronted by journalists who say, oh, well, we shouldn't open Pandora's box. And my point to them is that Pandora's box was opened a long time ago, with anti-radiation missiles that seek out surface-to-air missile launchers. We've been using those since the pre-Vietnam era. Our destroyers' Aegis systems are capable of locking on and firing on targets totally autonomously. Almost all of our ships are protected by close-in weapon systems that shoot down incoming mortars, missiles and drones. I mean, we've been in this world of systems that act out our will autonomously for decades. And so the point I would make to people is: you're not asking to not open Pandora's box; you're asking to shove it back in and close it again. And the whole point of the allegory is that such cannot be done. That's the way I look at it.

BS: I've got to ask you one more question, going back to your roots. Many folks were obviously introduced to VR because of Oculus. And in a twist of fate, Anduril recently took over the IVAS program, essentially building AR-VR headsets for the US Army. What's your vision for the program, and what does that feel like?

PL: We need all of our robots and all of our people to be getting the right information at the right time. That means they need a common view of the battlefield. The way you can present that view to a human is very different from the way you present it to a robot. Robots are great: they have very high I/O and very low error rates in connectivity. With people, you have to figure out how to strap things onto our appendages, like our hands and our eyes and our ears, and present information in a way that allows us to work collaboratively with these types of tools. So superhuman vision augmentation systems, like better night vision and thermal, ultraviolet and hyperspectral vision, those are the things people focus on when they look at IVAS. But there's a whole other layer, which is that we need to be able to see the world the same way that robots do if we're going to work closely alongside them on such high-stakes problems.

BS: I love it. Human plus machine intelligence. Palmer Luckey, everyone.

(Applause)