War, AI and the New Global Arms Race | Alexandr Wang | TED

175,811 views ・ 2023-07-10

TED


Translator: Josephine Chen  Reviewer: Shelley Tsang 曾雯海
00:04 Artificial intelligence and warfare. Let's talk about what this really could look like.

00:11 Swarms of lethal drones with facial recognition that know your every move. Or unmanned armed robots that are near impossible to defeat. Autonomous fighter jets that can travel at supersonic speeds and can withstand greater gravitational force than a human pilot could survive. Cyberattacks that incapacitate critical port infrastructure, or disinformation campaigns and deepfakes that throw presidential elections. Or even foreign adversaries taking out satellites, our eyes and ears in space, rendering us blind to global events. All superintelligent weapons of terror.

00:54 We are at the dawn of a new age of warfare.
00:58
I grew up in the birthplace of a technology
15
58763
2211
我長大的科技發源地
01:01
that defined the last era of warfare,
16
61016
2335
界定了上一個戰爭時代
01:03
the atomic bomb.
17
63393
1919
那是原子彈時代
01:05
I was keenly aware of how this technology had fundamentally shaped geopolitics
18
65353
5047
我敏銳地察覺到原子科技 如何從根本上塑造了地緣政治
01:10
and the nature of war.
19
70400
1877
還有戰爭的本質
01:12
My parents were both scientists at Los Alamos National Laboratory.
20
72277
3545
我的雙親都是科學家 在洛斯阿拉莫斯國家實驗室工作
01:16
My dad’s a physicist, and my mom’s an astrophysicist.
21
76948
3379
爸爸是物理學家 媽媽是天體物理學家
01:20
Their scientific work in plasma fluid dynamics
22
80327
3420
我父母研究電漿流體動力學
01:23
will have deep implications for how we understand our universe.
23
83747
3628
將有的深遠影響 有助於我們了解宇宙萬象
01:27
So naturally, I knew I wanted to work on something just as impactful.
24
87417
4546
自然而然的驅策了我想要做有影響力的事
01:32
I decided to become a programmer and study artificial intelligence.
25
92005
3295
我決定成為程式設計師,攻讀人工智慧
01:36
AI is one of the most critical technologies of our time
26
96635
3962
AI是我們這個時代最具爭議的科技之一
01:40
and with deep implications for national security
27
100639
3211
對國家安全和全球民主
01:43
and democracy globally.
28
103892
1626
影響廣泛
01:46
As we saw in World War II with the atomic bomb,
29
106102
3128
二次世界大戰擁有原子彈
01:49
the country that is able to most rapidly and effectively
30
109230
3295
又能最快速有效率地整合新科技的國家
01:52
integrate new technology into warfighting wins.
31
112525
3462
在二戰中拔得頭籌
01:55
There's no reason to believe this will be any different with AI.
32
115987
3504
沒有理由不相信AI會有什麼不同
02:00 But in the AI arms race, we're already behind.

02:04 From a technological perspective, China is already ahead of the United States in computer vision AI. And in large language models, like ChatGPT, they are fast followers. In terms of military implementations, they're outspending us: adjusted for total military budget, China's spending ten times more than the United States.

02:27 Why are we so far behind? The answer is twofold.

02:32 First, data supremacy. Despite having the largest fleet of military hardware in the world, most of the data from this fleet is thrown away or inaccessible, hidden away on hard drives that never see the light of day. This is our Achilles' heel.

02:51 In an AI war, everything boils down to data.
02:57 For defense AI, data from the internet is not enough. Most of the data needs to come from our military assets, sensors and collaborations with tech companies. Military commanders need to know how to use data as a military asset. I've heard this first-hand many times, from my conversations with military personnel, including most recently from Lieutenant General Richard R. Coffman, deputy commanding general for United States Army Futures Command.

03:26 Second, despite being home to the leading technology companies at the forefront of artificial intelligence, the US tech industry has largely shied away from taking on government contracts. Somewhere along the line, tech leaders decided that working with the government was taboo.

03:43 As a technologist, I'm often asked how I'm bettering this world. This is how I'm improving the future of our world: by helping my country succeed and providing the best tools and technology to ensure that the United States government can defend its citizens, allies and partners.

04:00 (Applause)
04:05 The Ukraine war has demonstrated that the nature of war has changed. Through AI overmatch, Ukraine is challenging an adversary with far superior numbers of troops and weapons.

04:19 Before the Ukraine war, Russia had spent an estimated 65 billion US dollars on its military expenditures, whereas Ukraine only spent about six billion dollars. It's estimated that Russia had over 900,000 military troops and 1,300 aircraft, whereas Ukraine only had 200,000 military troops and 130 aircraft.

04:44 Technologies such as drones, AI-based targeting and image intelligence, and Javelin missiles have enabled a shocking defense of Ukraine. AI is proving invaluable for defending Ukrainian cities and infrastructure against missile and drone bombardment.

05:02 At Scale, we're using our novel machine learning algorithm for battle damage assessment in key areas affected by the war. We've rapidly analyzed over 2,000 square kilometers and have identified over 370,000 structures, including thousands not previously identified by other open source data sets. We focused on Kyiv, Kharkiv and Dnipro and provided our data directly in a publicly accessible data set to the broader AI community.

05:32 One of the key problems we're solving is using AI to analyze massive amounts of imagery and detect objects, because humans just can't keep up. We've received an overwhelming response from our free AI-ready data set and have provided it directly to the United States and NATO allies. And it's been downloaded over 2,000 times by AI companies, researchers, developers and GIS practitioners.
05:58 AI can also be used for change detection. Simply put, algorithms can constantly monitor imagery and notify a human to investigate further if there's a change or a movement.
06:10 It's clear that AI is increasingly powering warfare. And based on the rate of progress in the AI field, I predict that in ten years, it will be the dominant force.

06:24 Disinformation and misinformation are already huge problems. And this technology is only going to make it worse. Tools like ChatGPT have enabled AI to generate imagery, text, audio, video, code and even reason. These tools can generate realistic-looking and realistic-sounding content, which on top of bot-run social media accounts will make it nearly impossible to identify disinformation and misinformation online. Bad actors can use these tools to supercharge misinformation and propagate falsehoods.

06:58 China already uses disinformation campaigns and social media manipulations heavily in Taiwan, particularly during elections. Or take Russia's propaganda machine, which in the wake of Russia's invasion of Ukraine created a deepfake of Ukrainian President Zelensky calling for Ukrainian troops to surrender. This deepfake was easy to spot, but the next one may not be.

07:21 This also takes place within our borders, from social media algorithm manipulation to advertising microtargeting and geofencing, to deepfakes of politicians and bot-run social media accounts. The United States is not excused from exacerbating disinformation and misinformation. These tools are universally accessible at low or no cost, meaning they can be employed by anyone anywhere to undermine the sanctity of democracy globally.
07:48
However, all hope is not lost.
141
468464
2753
然而希望尚存
07:51
If we properly invest
142
471259
1251
如果我們適當的投入
07:52
into data infrastructure and data preparation,
143
472552
2752
資料建設和資料準備
07:55
all this can be avoided.
144
475346
1794
這些破壞都可以避免
07:57
Deterrence is nothing new to military thinking.
145
477181
2753
威懾對軍事思考而言是舊聞了
07:59
As we saw in World War II with the atomic bomb,
146
479976
2919
二次大戰中所見的原子彈
08:02
it was a primary factor in deterring foreign adversaries
147
482937
3337
是威嚇住對手國的主要關鍵
08:06
from going to nuclear war for more than six decades.
148
486274
2753
這六十多個年頭來避免演變成核戰的關鍵
08:09
Because the stakes of going to war with such a technology
149
489027
2711
使用原子彈這項科技開戰的風險
08:11
were simply too high.
150
491738
1251
簡直太高
08:13
We're likely to see a new calculus emerge with AI.
151
493865
3712
我們很可能看到AI編寫新微積分
08:17
It's uncharted territory, nobody knows what it will look like
152
497577
3170
人工智慧是未知的領域 沒有人知道他會如何發展
08:20
or the toll it will take.
153
500747
1918
或是帶來多少破壞
08:22
How do we know if our AI is better than our adversaries'?
154
502707
3003
我們要從何得知我們的AI比對手更好
08:25
We won't.
155
505752
1459
無從得知
08:27
But one thing is clear:
156
507253
1794
但有件事是很明確的
08:29
AI can only be as powerful as the underlying data
157
509088
3045
人工智能的強大程度取決於
08:32
that is used to fuel its algorithms.
158
512175
2002
用於支持其演算法的基礎數據
08:35
Data will be a new kind of ammunition in the era of AI warfare.
159
515678
4213
數據將是這個時代AI戰的新型彈藥
08:40 In the tech industry, we often talk about missions. They're often frivolous. Do they really change the world or save lives? This mission, on the other hand, really matters. The AI war will define the future of our world and has the potential to shift the balance of diplomatic power.

09:02 It's clear that digital warfare is not some dystopian reality, tucked away in a faraway future. It is taking place in the here and now.

09:12 We cannot sit by the sidelines and watch the rise of an authoritarian regime. It is in moments like this that technologists can either rise to the challenge or stand idle. I encourage my fellow technologists to understand the austerity and severity of our times and commit themselves to supporting national security. While I find it shocking that most American AI companies have chosen not to support national security, I do hope others join us.

09:40 We must fight for the world we want to live in. It's never mattered more.

09:45 Thank you.

09:47 (Applause)