War, AI and the New Global Arms Race | Alexandr Wang | TED

267,292 views ・ 2023-07-10

TED



Translator: Yip Yan Yeung  Reviewer: suya f.
00:04
Artificial intelligence and warfare. Let's talk about what this really could look like.

00:11
Swarms of lethal drones with facial recognition that know your every move. Or unmanned armed robots that are near impossible to defeat. Autonomous fighter jets that can travel at supersonic speeds and can withstand greater gravitational force than a human pilot could survive. Cyberattacks that incapacitate critical port infrastructure, or disinformation campaigns and deepfakes that throw presidential elections. Or even foreign adversaries taking out satellites, our eyes and ears in space, rendering us blind to global events. All superintelligent weapons of terror.

00:54
We are at the dawn of a new age of warfare.
00:58
I grew up in the birthplace of a technology that defined the last era of warfare, the atomic bomb. I was keenly aware of how this technology had fundamentally shaped geopolitics and the nature of war. My parents were both scientists at Los Alamos National Laboratory. My dad's a physicist, and my mom's an astrophysicist. Their scientific work in plasma fluid dynamics will have deep implications for how we understand our universe. So naturally, I knew I wanted to work on something just as impactful. I decided to become a programmer and study artificial intelligence.
01:36
AI is one of the most critical technologies of our time, with deep implications for national security and democracy globally. As we saw in World War II with the atomic bomb, the country that is able to most rapidly and effectively integrate new technology into warfighting wins. There's no reason to believe this will be any different with AI.

02:00
But in the AI arms race, we're already behind.
02:04
From a technological perspective, China is already ahead of the United States in computer vision AI. And in large language models, like ChatGPT, they are fast followers. In terms of military implementations, they're outspending us: adjusted for total military budget, China's spending ten times more than the United States.

02:27
Why are we so far behind? The answer is twofold.

02:32
First, data supremacy. Despite having the largest fleet of military hardware in the world, most of the data from this fleet is thrown away or inaccessible, hidden away on hard drives that never see the light of day. This is our Achilles heel. In an AI war, everything boils down to data.
02:57
For defense AI, data from the internet is not enough. Most of the data needs to come from our military assets, sensors and collaborations with tech companies. Military commanders need to know how to use data as a military asset. I've heard this first-hand many times, from my conversations with military personnel, including most recently from Lieutenant General Richard R. Coffman, deputy commanding general for United States Army Futures Command.
03:26
Second, despite being home to the leading technology companies at the forefront of artificial intelligence, the US tech industry has largely shied away from taking on government contracts. Somewhere along the line, tech leaders decided that working with the government was taboo.

03:43
As a technologist, I'm often asked how I'm bettering this world. This is how I'm improving the future of our world: by helping my country succeed and providing the best tools and technology to ensure that the United States government can defend its citizens, allies and partners.

04:00
(Applause)
04:05
The Ukraine war has demonstrated that the nature of war has changed. Through AI overmatch, Ukraine is challenging an adversary with far superior numbers of troops and weapons.

04:19
Before the Ukraine war, Russia had spent an estimated 65 billion US dollars on its military expenditures, whereas Ukraine only spent about six billion dollars. It's estimated that Russia had over 900,000 military troops and 1,300 aircraft, whereas Ukraine only had 200,000 military troops and 130 aircraft.

04:44
Technologies such as drones, AI-based targeting and image intelligence, and Javelin missiles have enabled a shocking defense of Ukraine. AI is proving invaluable for defending Ukrainian cities and infrastructure against missile and drone bombardment.
05:02
At Scale, we're using our novel machine learning algorithm for battle damage assessment in key areas affected by the war. We've rapidly analyzed over 2,000 square kilometers and have identified over 370,000 structures, including thousands not previously identified by other open source data sets. We focused on Kyiv, Kharkiv and Dnipro and provided our data directly in a publicly accessible data set to the broader AI community.
05:32
One of the key problems we're solving is using AI to analyze massive amounts of imagery and detect objects, because humans just can't keep up.
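To make that kind of imagery triage concrete, here is a minimal sketch using a generic, pretrained object detector to scan an image tile and keep only confident detections for human review. This is not Scale's model or pipeline; the detector choice, file name and score threshold are illustrative assumptions.

```python
# Rough sketch of imagery triage with a generic pretrained detector
# (Faster R-CNN from torchvision, trained on COCO). NOT Scale's model;
# the file name and 0.7 threshold are assumptions for illustration.
import torch
from PIL import Image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

SCORE_THRESHOLD = 0.7  # keep only reasonably confident detections (assumed)

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Hypothetical tile cut from a larger satellite or aerial scene.
image = Image.open("aerial_tile_0001.png").convert("RGB")

with torch.no_grad():
    prediction = model([to_tensor(image)])[0]

for box, score, label in zip(prediction["boxes"], prediction["scores"], prediction["labels"]):
    if float(score) >= SCORE_THRESHOLD:
        # In a real workflow each flagged box would go to an analyst or be
        # merged into a damage-assessment layer, not just printed.
        coords = [round(float(v), 1) for v in box]
        print(f"class={int(label)}  score={float(score):.2f}  box={coords}")
```

Tiling a large scene and batching tiles through a detector like this is what lets machines cover thousands of square kilometers that human analysts could never review frame by frame.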
05:42
We've received an overwhelming response from our free AI-ready data set and have provided it directly to the United States and NATO allies. And it's been downloaded over 2,000 times by AI companies, researchers, developers and GIS practitioners.
05:58
AI can also be used for change detection. Simply put, algorithms can constantly monitor imagery and notify a human to investigate further if there's a change or a movement.
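As a concrete illustration of that idea, the sketch below compares two co-registered captures of the same area and flags tiles whose pixel difference exceeds a threshold, so a human can take a closer look. The file names, tile size and threshold are assumptions for illustration, not values from the talk.

```python
# Minimal change-detection sketch: flag tiles whose mean absolute pixel
# difference between two co-registered images exceeds a threshold, so an
# analyst can investigate. File names, tile size and threshold are assumed.
import numpy as np
from PIL import Image

TILE = 64          # side length of each analysis tile, in pixels (assumed)
THRESHOLD = 12.0   # mean absolute difference that triggers a review (assumed)

def load_gray(path: str) -> np.ndarray:
    """Load an image as a grayscale float array."""
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32)

def changed_tiles(before: np.ndarray, after: np.ndarray):
    """Yield (row, col) offsets of tiles that changed between the two images."""
    diff = np.abs(after - before)
    rows, cols = diff.shape
    for r in range(0, rows - TILE + 1, TILE):
        for c in range(0, cols - TILE + 1, TILE):
            if diff[r:r + TILE, c:c + TILE].mean() > THRESHOLD:
                yield r, c

if __name__ == "__main__":
    before = load_gray("area_before.png")  # hypothetical earlier capture
    after = load_gray("area_after.png")    # hypothetical later capture
    for r, c in changed_tiles(before, after):
        # A real system would queue an alert for an analyst rather than print.
        print(f"Change detected in tile at pixel offset ({r}, {c}); flag for review.")
```

Production systems use far more robust comparisons (registration, lighting normalization, learned models), but the control flow is the same: machines watch continuously, humans investigate only what gets flagged.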
06:10
It's clear that AI is increasingly powering warfare. And based on the rate of progress in the AI field, I predict that in ten years, it will be the dominant force.

06:24
Disinformation and misinformation are already huge problems. And this technology is only going to make it worse. Tools like ChatGPT have enabled AI to generate imagery, text, audio, video, code and even reason. These tools can generate realistic-looking and realistic-sounding content, which, on top of bot-run social media accounts, will make it nearly impossible to identify disinformation and misinformation online.
06:52
Bad actors can use these tools to supercharge misinformation and propagate falsehoods. China already uses disinformation campaigns and social media manipulations heavily in Taiwan, particularly during elections.
07:06
Or take Russia's propaganda machine, which in the wake of Russia's invasion of Ukraine created a deepfake of Ukrainian President Zelensky calling for Ukrainian troops to surrender. This deepfake was easy to spot, but the next one may not be.

07:21
This also takes place within our borders, from social media algorithm manipulation to advertising microtargeting and geofencing, to deepfakes of politicians and bot-run social media accounts. The United States is not excused from exacerbating disinformation and misinformation.
07:38
These tools are universally accessible at low or no cost, meaning they can be employed by anyone anywhere to undermine the sanctity of democracy globally.

07:48
However, all hope is not lost. If we properly invest into data infrastructure and data preparation, all this can be avoided.

07:57
Deterrence is nothing new to military thinking.
07:59
As we saw in World War II with the atomic bomb, it was a primary factor in deterring foreign adversaries from going to nuclear war for more than six decades, because the stakes of going to war with such a technology were simply too high.

08:13
We're likely to see a new calculus emerge with AI. It's uncharted territory; nobody knows what it will look like or the toll it will take. How do we know if our AI is better than our adversaries'? We won't. But one thing is clear: AI can only be as powerful as the underlying data that is used to fuel its algorithms. Data will be a new kind of ammunition in the era of AI warfare.
08:40
In the tech industry, we often talk about missions. They're often frivolous. Do they really change the world or save lives? This mission, on the other hand, really matters.

08:54
The AI war will define the future of our world and has the potential to shift the balance of diplomatic power. It's clear that digital warfare is not some dystopian reality, tucked away in a faraway future. It is taking place in the here and now.
09:12
We cannot sit by the sidelines and watch the rise of an authoritarian regime. It is in moments like this that technologists can either rise to the challenge or stand idle.

09:24
I encourage my fellow technologists to understand the austerity and severity of our times and commit themselves to supporting national security. While I find it shocking that most American AI companies have chosen not to support national security, I do hope others join us. We must fight for the world we want to live in. It's never mattered more.

09:45
Thank you.

09:47
(Applause)