Military robots and the future of war | P.W. Singer

2009-04-03

TED


00:13
I thought I'd begin with a scene of war. There was little to warn of the danger ahead. The Iraqi insurgent had placed the IED, an Improvised Explosive Device, along the side of the road with great care. By 2006, there were more than 2,500 of these attacks every single month, and they were the leading cause of casualties among American soldiers and Iraqi civilians.

00:37
The team that was hunting for this IED is called an EOD team (Explosive Ordnance Disposal), and they're the pointy end of the spear in the American effort to suppress these roadside bombs. Each EOD team goes out on about 600 of these bomb calls every year, defusing about two bombs a day. Perhaps the best sign of how valuable they are to the war effort is that the Iraqi insurgents put a $50,000 bounty on the head of a single EOD soldier.

01:04
Unfortunately, this particular call would not end well. By the time the soldier advanced close enough to see the telltale wires of the bomb, it exploded in a wave of flame. Now, depending on how close you are and how much explosive has been packed into that bomb, it can cause death or injury. You have to be as far as 50 yards away to escape that. The blast is so strong it can even break your limbs, even if you're not hit. That soldier had been on top of the bomb.

01:31
And so when the rest of the team advanced they found little left. And that night the unit's commander did a sad duty, and he wrote a condolence letter back to the United States, and he talked about how hard the loss had been on his unit, about the fact that they had lost their bravest soldier, a soldier who had saved their lives many a time. And he apologized for not being able to bring them home. But then he talked up the silver lining that he took away from the loss. "At least," as he wrote, "when a robot dies, you don't have to write a letter to its mother."

02:04
That scene sounds like science fiction, but is battlefield reality already. The soldier in that case was a 42-pound robot called a PackBot. The chief's letter went, not to some farmhouse in Iowa like you see in the old war movies, but went to the iRobot Company, which is named after the Asimov novel and the not-so-great Will Smith movie, and... um... (Laughter)... if you remember that in that fictional world, robots started out carrying out mundane chores, and then they started taking on life-and-death decisions. That's a reality we face today.

02:41
What we're going to do is actually just flash a series of photos behind me that show you the reality of robots used in war right now or already at the prototype stage. It's just to give you a taste. Another way of putting it is you're not going to see anything that's powered by Vulcan technology, or teenage wizard hormones, or anything like that. This is all real. So why don't we go ahead and start those pictures.

03:05
Something big is going on in war today, and maybe even the history of humanity itself. The U.S. military went into Iraq with a handful of drones in the air. We now have 5,300. We went in with zero unmanned ground systems. We now have 12,000. And the tech term "killer application" takes on new meaning in this space. And we need to remember that we're talking about the Model T Fords, the Wright Flyers, compared to what's coming soon. That's where we're at right now.

03:38
One of the people that I recently met with was an Air Force three-star general, and he said basically, where we're headed very soon is tens of thousands of robots operating in our conflicts, and these numbers matter, because we're not just talking about tens of thousands of today's robots, but tens of thousands of these prototypes and tomorrow's robots, because of course, one of the things that's operating in technology is Moore's Law, that you can pack in more and more computing power into those robots, and so flash forward around 25 years, if Moore's Law holds true, those robots will be close to a billion times more powerful in their computing than today.

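(An editorial aside, not part of the talk: the arithmetic behind that projection is simple compound doubling. The sketch below, in Python, makes the assumption explicit; the classic Moore's Law figure is a doubling roughly every 18 months, while the talk's "close to a billion times" corresponds to roughly one doubling every ten months.)

```python
# Minimal sketch (not from the talk) of the compound-doubling arithmetic
# behind a "25 years of Moore's Law" projection. The doubling period is
# an assumption, passed in as a parameter.

def moores_law_multiplier(years: float, doubling_months: float) -> float:
    """Projected growth factor in computing power after `years`."""
    doublings = years * 12.0 / doubling_months
    return 2.0 ** doublings

if __name__ == "__main__":
    for months in (10, 12, 18):
        factor = moores_law_multiplier(25, months)
        print(f"doubling every {months} months for 25 years -> ~{factor:,.0f}x")
    # Doubling every 10 months gives 2**30, about 1.07 billion, which is
    # the assumption that matches the talk's "close to a billion" figure.
```
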
04:15
And so what that means is the kind of things that we used to only talk about at science fiction conventions like Comic-Con have to be talked about in the halls of power and places like the Pentagon. A robots revolution is upon us.

04:28
Now, I need to be clear here. I'm not talking about a revolution where you have to worry about the Governor of California showing up at your door, a la the Terminator. (Laughter) When historians look at this period, they're going to conclude that we're in a different type of revolution: a revolution in war, like the invention of the atomic bomb. But it may be even bigger than that, because our unmanned systems don't just affect the "how" of war-fighting, they affect the "who" of fighting at its most fundamental level.

04:56
That is, every previous revolution in war, be it the machine gun, be it the atomic bomb, was about a system that either shot faster, went further, had a bigger boom. That's certainly the case with robotics, but they also change the experience of the warrior and even the very identity of the warrior. Another way of putting this is that mankind's 5,000-year-old monopoly on the fighting of war is breaking down in our very lifetime.

05:25
I've spent the last several years going around meeting with all the players in this field, from the robot scientists, to the science fiction authors who inspired them, to the 19-year-old drone pilots who are fighting from Nevada, to the four-star generals who command them, to even the Iraqi insurgents who they are targeting and what they think about our systems. And what I found interesting is not just their stories, but how their experiences point to these ripple effects that are going outwards in our society, in our law and our ethics, etc. And so what I'd like to do with my remaining time is basically flesh out a couple of these.

05:57
So the first is that the future of war, even a robotics one, is not going to be purely an American one. The U.S. is currently ahead in military robotics right now, but we know that in technology there's no such thing as a permanent first-mover advantage. In a quick show of hands, how many people in this room still use Wang Computers? (Laughter) It's the same thing in war. The British and the French invented the tank. The Germans figured out how to use it right, and so what we have to think about for the U.S. is that we are ahead right now, but you have 43 other countries out there working on military robotics, and they include all the interesting countries like Russia, China, Pakistan, Iran.

06:40
And this raises a bigger worry for me. How do we move forward in this revolution given the state of our manufacturing and the state of our science and mathematics training in our schools? Or another way of thinking about this is, what does it mean to go to war increasingly with soldiers whose hardware is made in China and software is written in India?

07:03
But just as software has gone open-source, so has warfare. Unlike an aircraft carrier or an atomic bomb, you don't need a massive manufacturing system to build robotics. A lot of it is off the shelf. A lot of it's even do-it-yourself. One of those things you just saw flashed before you was a Raven drone, the handheld tossed one. For about a thousand dollars, you can build one yourself, equivalent to what the soldiers use in Iraq.

07:27
That raises another wrinkle when it comes to war and conflict. Good guys might play around and work on these as hobby kits, but so might bad guys. This cross between robotics and things like terrorism is going to be fascinating and even disturbing, and we've already seen it start. During the war between Israel, a state, and Hezbollah, a non-state actor, the non-state actor flew four different drones against Israel. There's already a jihadi website that you can go on and remotely detonate an IED in Iraq while sitting at your home computer.

08:00
And so I think what we're going to see is two trends take place with this. First is, you're going to reinforce the power of individuals against governments, but then the second is that we are going to see an expansion in the realm of terrorism. The future of it may be a cross between al Qaeda 2.0 and the next generation of the Unabomber. And another way of thinking about this is the fact that, remember, you don't have to convince a robot that they're gonna receive 72 virgins after they die to convince them to blow themselves up.

08:34
But the ripple effects of this are going to go out into our politics. One of the people that I met with was a former Assistant Secretary of Defense for Ronald Reagan, and he put it this way: "I like these systems because they save American lives, but I worry about more marketization of wars, more shock-and-awe talk, to defray discussion of the costs. People are more likely to support the use of force if they view it as costless."

08:58
Robots for me take certain trends that are already in play in our body politic, and maybe take them to their logical ending point. We don't have a draft. We don't have declarations of war anymore. We don't buy war bonds anymore. And now we have the fact that we're converting more and more of our American soldiers that we would send into harm's way into machines, and so we may take those already lowering bars to war and drop them to the ground.

09:29
But the future of war is also going to be a YouTube war. That is, our new technologies don't merely remove humans from risk. They also record everything that they see. So they don't just delink the public: they reshape its relationship with war. There's already several thousand video clips of combat footage from Iraq on YouTube right now, most of it gathered by drones.

09:54
Now, this could be a good thing. It could be building connections between the home front and the war front as never before. But remember, this is taking place in our strange, weird world, and so inevitably the ability to download these video clips to, you know, your iPod or your Zune gives you the ability to turn it into entertainment. Soldiers have a name for these clips. They call it war porn. The typical one that I was sent was an email that had an attachment of video of a Predator strike taking out an enemy site. Missile hits, bodies burst into the air with the explosion. It was set to music. It was set to the pop song "I Just Want To Fly" by Sugar Ray.

10:40
This ability to watch more but experience less creates a wrinkle in the public's relationship with war. I think about this with a sports parallel. It's like the difference between watching an NBA game, a professional basketball game on TV, where the athletes are tiny figures on the screen, and being at that basketball game in person and realizing what someone seven feet tall really does look like. But we have to remember, these are just the clips. These are just the ESPN SportsCenter version of the game. They lose the context. They lose the strategy. They lose the humanity. War just becomes slam dunks and smart bombs.

11:23
Now the irony of all this is that while the future of war may involve more and more machines, it's our human psychology that's driving all of this, it's our human failings that are leading to these wars. So one example of this that has big resonance in the policy realm is how this plays out on our very real war of ideas that we're fighting against radical groups. What is the message that we think we are sending with these machines, versus what is being received in terms of the message?

11:53
So one of the people that I met was a senior Bush Administration official, who had this to say about our unmanning of war: "It plays to our strength. The thing that scares people is our technology." But when you go out and meet with people, for example in Lebanon, it's a very different story. One of the people I met with there was a news editor, and we're talking as a drone is flying above him, and this is what he had to say: "This is just another sign of the coldhearted cruel Israelis and Americans, who are cowards because they send out machines to fight us. They don't want to fight us like real men, but they're afraid to fight, so we just have to kill a few of their soldiers to defeat them."

12:35
The future of war also is featuring a new type of warrior, and it's actually redefining the experience of going to war. You can call this a cubicle warrior. This is what one Predator drone pilot described of his experience fighting in the Iraq War while never leaving Nevada: "You're going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants, and then you get in the car and you drive home and within 20 minutes, you're sitting at the dinner table talking to your kids about their homework."

13:08
Now, the psychological balancing of those experiences is incredibly tough, and in fact those drone pilots have higher rates of PTSD than many of the units physically in Iraq. But some have worries that this disconnection will lead to something else, that it might make the contemplation of war crimes a lot easier when you have this distance. "It's like a video game," is what one young pilot described to me of taking out enemy troops from afar. As anyone who's played Grand Theft Auto knows, we do things in the video world that we wouldn't do face to face.

13:43
So much of what you're hearing from me is that there's another side to technological revolutions, and that it's shaping our present and maybe will shape our future of war. Moore's Law is operative, but so's Murphy's Law. The fog of war isn't being lifted. The enemy has a vote. We're gaining incredible new capabilities, but we're also seeing and experiencing new human dilemmas.

14:08
Now, sometimes these are just "oops" moments, which is how the head of one robotics company described it: you just have "oops" moments. Well, what are "oops" moments with robots in war? Well, sometimes they're funny. Sometimes they're like that scene from the Eddie Murphy movie "Best Defense," playing out in reality, where they tested out a machine-gun-armed robot, and during the demonstration it started spinning in a circle and pointed its machine gun at the reviewing stand of VIPs. Fortunately the weapon wasn't loaded and no one was hurt, but other times "oops" moments are tragic, such as last year in South Africa, where an anti-aircraft cannon had a "software glitch" and actually did turn on and fired, and nine soldiers were killed.

14:53
We have new wrinkles in the laws of war and accountability. What do we do with things like unmanned slaughter? What is unmanned slaughter? We've already had three instances of Predator drone strikes where we thought we got bin Laden, and it turned out not to be the case.

15:10
And this is where we're at right now. This is not even talking about armed, autonomous systems with full authority to use force. And do not believe that that isn't coming. During my research I came across four different Pentagon projects on different aspects of that. And so you have this question: what does this lead to in issues like war crimes? Robots are emotionless, so they don't get upset if their buddy is killed. They don't commit crimes of rage and revenge. But robots are emotionless. They see an 80-year-old grandmother in a wheelchair the same way they see a T-80 tank: they're both just a series of zeroes and ones.

15:52
And so we have this question to figure out: How do we catch up our 20th century laws of war, that are so old right now that they could qualify for Medicare, to these 21st century technologies?

16:05
And so, in conclusion, I've talked about what seems the future of war, but notice that I've only used real-world examples and you've only seen real-world pictures and videos. And so this sets a great challenge for all of us that we have to worry about well before you have to worry about your Roomba sucking the life away from you. Are we going to let the fact that what's unveiling itself right now in war sounds like science fiction keep us in denial? Are we going to face the reality of 21st century war? Is our generation going to make the same mistake that a past generation did with atomic weaponry, and not deal with the issues that surround it until Pandora's box is already opened up?

16:49
Now, I could be wrong on this, and one Pentagon robot scientist told me that I was. He said, "There's no real social, ethical, moral issues when it comes to robots. That is," he added, "unless the machine kills the wrong people repeatedly. Then it's just a product recall issue."

17:07
And so the ending point for this is that actually, we can turn to Hollywood. A few years ago, Hollywood gathered all the top characters and created a list of the top 100 heroes and top 100 villains of all of Hollywood history, the characters that represented the best and worst of humanity. Only one character made it onto both lists: The Terminator, a robot killing machine. And so that points to the fact that our machines can be used for both good and evil, but for me it points to the fact that there's a duality of humans as well.

17:47
This week is a celebration of our creativity. Our creativity has taken our species to the stars. Our creativity has created works of art and literature to express our love. And now, we're using our creativity in a certain direction, to build fantastic machines with incredible capabilities, maybe even one day an entirely new species. But one of the main reasons that we're doing that is because of our drive to destroy each other, and so the question we all should ask: is it our machines, or is it us that's wired for war? Thank you. (Applause)