Military robots and the future of war | P.W. Singer

233,642 views ・ 2009-04-03

TED



Translator: Dennis Guo ・ Reviewer: Karen SONG
00:13
I thought I'd begin with a scene of war. There was little to warn of the danger ahead. The Iraqi insurgent had placed the IED, an Improvised Explosive Device, along the side of the road with great care. By 2006, there were more than 2,500 of these attacks every single month, and they were the leading cause of casualties among American soldiers and Iraqi civilians.

00:37
The team that was hunting for this IED is called an EOD team — Explosive Ordnance Disposal — and they're the pointy end of the spear in the American effort to suppress these roadside bombs. Each EOD team goes out on about 600 of these bomb calls every year, defusing about two bombs a day. Perhaps the best sign of how valuable they are to the war effort is that the Iraqi insurgents put a $50,000 bounty on the head of a single EOD soldier.

01:04
Unfortunately, this particular call would not end well. By the time the soldier advanced close enough to see the telltale wires of the bomb, it exploded in a wave of flame. Now, depending on how close you are and how much explosive has been packed into that bomb, it can cause death or injury. You have to be as far as 50 yards away to escape that. The blast is so strong it can even break your limbs, even if you're not hit. That soldier had been on top of the bomb. And so when the rest of the team advanced they found little left. And that night the unit's commander did a sad duty, and he wrote a condolence letter back to the United States, and he talked about how hard the loss had been on his unit, about the fact that they had lost their bravest soldier, a soldier who had saved their lives many a time. And he apologized for not being able to bring them home. But then he talked up the silver lining that he took away from the loss. "At least," as he wrote, "when a robot dies, you don't have to write a letter to its mother."

02:04
That scene sounds like science fiction, but it is battlefield reality already. The soldier in that case was a 42-pound robot called a PackBot. The chief's letter went, not to some farmhouse in Iowa like you see in the old war movies, but went to the iRobot Company, which is named after the Asimov novel and the not-so-great Will Smith movie, and... um... (Laughter)... if you remember that in that fictional world, robots started out carrying out mundane chores, and then they started taking on life-and-death decisions. That's a reality we face today.

02:41
What we're going to do is actually just flash a series of photos behind me that show you the reality of robots used in war right now or already at the prototype stage. It's just to give you a taste. Another way of putting it is you're not going to see anything that's powered by Vulcan technology, or teenage wizard hormones or anything like that. This is all real. So why don't we go ahead and start those pictures.

03:05
Something big is going on in war today, and maybe even the history of humanity itself. The U.S. military went into Iraq with a handful of drones in the air. We now have 5,300. We went in with zero unmanned ground systems. We now have 12,000. And the tech term "killer application" takes on new meaning in this space. And we need to remember that we're talking about the Model T Fords, the Wright Flyers, compared to what's coming soon. That's where we're at right now.

03:38
One of the people that I recently met with was an Air Force three-star general, and he said basically, where we're headed very soon is tens of thousands of robots operating in our conflicts, and these numbers matter, because we're not just talking about tens of thousands of today's robots, but tens of thousands of these prototypes and tomorrow's robots, because of course, one of the things that's operating in technology is Moore's Law, that you can pack in more and more computing power into those robots, and so flash forward around 25 years, if Moore's Law holds true, those robots will be close to a billion times more powerful in their computing than today.
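(A back-of-the-envelope check of that arithmetic, as a minimal sketch: with a doubling period T, computing power grows by a factor of 2^(years/T). The doubling periods tried below are illustrative assumptions, not figures from the talk.)

    # Minimal sketch: project Moore's Law growth over 25 years.
    # The doubling periods tried here are assumptions for illustration.
    def moores_law_factor(years: float, doubling_period: float) -> float:
        """Multiplicative growth in computing power after `years`."""
        return 2 ** (years / doubling_period)

    for t in (1.0, 1.5, 2.0):  # doubling period in years
        print(f"doubling every {t} yr -> x{moores_law_factor(25, t):,.0f}")
    # "Close to a billion times" (2**30 ~ 1.07e9) corresponds to about
    # 30 doublings in 25 years, i.e. one doubling roughly every 10 months.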
04:15
And so what that means is the kind of things that we used to only talk about at science fiction conventions like Comic-Con have to be talked about in the halls of power and places like the Pentagon. A robots revolution is upon us.

04:28
Now, I need to be clear here. I'm not talking about a revolution where you have to worry about the Governor of California showing up at your door, a la the Terminator. (Laughter) When historians look at this period, they're going to conclude that we're in a different type of revolution: a revolution in war, like the invention of the atomic bomb. But it may be even bigger than that, because our unmanned systems don't just affect the "how" of war-fighting, they affect the "who" of fighting at its most fundamental level. That is, every previous revolution in war, be it the machine gun, be it the atomic bomb, was about a system that either shot faster, went further, had a bigger boom. That's certainly the case with robotics, but they also change the experience of the warrior and even the very identity of the warrior. Another way of putting this is that mankind's 5,000-year-old monopoly on the fighting of war is breaking down in our very lifetime.

05:23
I've spent the last several years going around meeting with all the players in this field, from the robot scientists, to the science fiction authors who inspired them, to the 19-year-old drone pilots who are fighting from Nevada, to the four-star generals who command them, to even the Iraqi insurgents who they are targeting and what they think about our systems. And what I found interesting is not just their stories, but how their experiences point to these ripple effects that are going outwards in our society, in our law and our ethics, etc. And so what I'd like to do with my remaining time is basically flesh out a couple of these.

05:57
So the first is that the future of war, even a robotics one, is not going to be purely an American one. The U.S. is currently ahead in military robotics right now, but we know that in technology there's no such thing as a permanent first-mover advantage. In a quick show of hands, how many people in this room still use Wang Computers? (Laughter) It's the same thing in war. The British and the French invented the tank. The Germans figured out how to use it right, and so what we have to think about for the U.S. is that we are ahead right now, but you have 43 other countries out there working on military robotics, and they include all the interesting countries like Russia, China, Pakistan, Iran.
06:40
And this raises a bigger worry for me. How do we move forward in this revolution given the state of our manufacturing and the state of our science and mathematics training in our schools? Or another way of thinking about this is, what does it mean to go to war increasingly with soldiers whose hardware is made in China and software is written in India?

07:03
But just as software has gone open-source, so has warfare. Unlike an aircraft carrier or an atomic bomb, you don't need a massive manufacturing system to build robotics. A lot of it is off the shelf. A lot of it's even do-it-yourself. One of those things you just saw flashed before you was a Raven drone, the handheld tossed one. For about a thousand dollars, you can build one yourself, equivalent to what the soldiers use in Iraq.

07:27
That raises another wrinkle when it comes to war and conflict. Good guys might play around and work on these as hobby kits, but so might bad guys. This cross between robotics and things like terrorism is going to be fascinating and even disturbing, and we've already seen it start. During the war between Israel, a state, and Hezbollah, a non-state actor, the non-state actor flew four different drones against Israel. There's already a jihadi website that you can go on and remotely detonate an IED in Iraq while sitting at your home computer. And so I think what we're going to see is two trends take place with this. First is, you're going to reinforce the power of individuals against governments, but then the second is that we are going to see an expansion in the realm of terrorism. The future of it may be a cross between al Qaeda 2.0 and the next generation of the Unabomber. And another way of thinking about this is the fact that, remember, you don't have to convince a robot that they're gonna receive 72 virgins after they die to convince them to blow themselves up.

08:34
But the ripple effects of this are going to go out into our politics. One of the people that I met with was a former Assistant Secretary of Defense for Ronald Reagan, and he put it this way: "I like these systems because they save American lives, but I worry about more marketization of wars, more shock-and-awe talk, to defray discussion of the costs. People are more likely to support the use of force if they view it as costless."

08:58
Robots for me take certain trends that are already in play in our body politic, and maybe take them to their logical ending point. We don't have a draft. We don't have declarations of war anymore. We don't buy war bonds anymore. And now we have the fact that we're converting more and more of our American soldiers that we would send into harm's way into machines, and so we may take those already lowering bars to war and drop them to the ground.
09:29
But the future of war is also going to be a YouTube war. That is, our new technologies don't merely remove humans from risk. They also record everything that they see. So they don't just delink the public: they reshape its relationship with war. There's already several thousand video clips of combat footage from Iraq on YouTube right now, most of it gathered by drones. Now, this could be a good thing. It could be building connections between the home front and the war front as never before. But remember, this is taking place in our strange, weird world, and so inevitably the ability to download these video clips to, you know, your iPod or your Zune gives you the ability to turn it into entertainment. Soldiers have a name for these clips. They call it war porn. The typical one that I was sent was an email that had an attachment of video of a Predator strike taking out an enemy site. Missile hits, bodies burst into the air with the explosion. It was set to music. It was set to the pop song "I Just Want To Fly" by Sugar Ray.

10:40
This ability to watch more but experience less creates a wrinkle in the public's relationship with war. I think about this with a sports parallel. It's like the difference between watching an NBA game, a professional basketball game on TV, where the athletes are tiny figures on the screen, and being at that basketball game in person and realizing what someone seven feet tall really does look like. But we have to remember, these are just the clips. These are just the ESPN SportsCenter version of the game. They lose the context. They lose the strategy. They lose the humanity. War just becomes slam dunks and smart bombs.

11:23
Now the irony of all this is that while the future of war may involve more and more machines, it's our human psychology that's driving all of this, it's our human failings that are leading to these wars. So one example of this that has big resonance in the policy realm is how this plays out on our very real war of ideas that we're fighting against radical groups. What is the message that we think we are sending with these machines, versus what is being received in terms of the message?

11:53
So one of the people that I met was a senior Bush Administration official, who had this to say about our unmanning of war: "It plays to our strength. The thing that scares people is our technology." But when you go out and meet with people, for example in Lebanon, it's a very different story. One of the people I met with there was a news editor, and we're talking as a drone is flying above him, and this is what he had to say. "This is just another sign of the coldhearted cruel Israelis and Americans, who are cowards because they send out machines to fight us. They don't want to fight us like real men, but they're afraid to fight, so we just have to kill a few of their soldiers to defeat them."
12:35
The future of war also is featuring a new type of warrior, and it's actually redefining the experience of going to war. You can call this a cubicle warrior. This is what one Predator drone pilot described of his experience fighting in the Iraq War while never leaving Nevada. "You're going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants, and then you get in the car and you drive home and within 20 minutes, you're sitting at the dinner table talking to your kids about their homework."

13:08
Now, the psychological balancing of those experiences is incredibly tough, and in fact those drone pilots have higher rates of PTSD than many of the units physically in Iraq. But some have worries that this disconnection will lead to something else, that it might make the contemplation of war crimes a lot easier when you have this distance. "It's like a video game," is what one young pilot described to me of taking out enemy troops from afar. As anyone who's played Grand Theft Auto knows, we do things in the video world that we wouldn't do face to face.

13:43
So much of what you're hearing from me is that there's another side to technologic revolutions, and that it's shaping our present and maybe will shape our future of war. Moore's Law is operative, but so's Murphy's Law. The fog of war isn't being lifted. The enemy has a vote. We're gaining incredible new capabilities, but we're also seeing and experiencing new human dilemmas.

14:08
Now, sometimes these are just "oops" moments, which is what the head of a robotics company described it: you just have "oops" moments. Well, what are "oops" moments with robots in war? Well, sometimes they're funny. Sometimes, they're like that scene from the Eddie Murphy movie "Best Defense," playing out in reality, where they tested out a machine gun-armed robot, and during the demonstration it started spinning in a circle and pointed its machine gun at the reviewing stand of VIPs. Fortunately the weapon wasn't loaded and no one was hurt, but other times "oops" moments are tragic, such as last year in South Africa, where an anti-aircraft cannon had a "software glitch," and actually did turn on and fired, and nine soldiers were killed.

14:53
We have new wrinkles in the laws of war and accountability. What do we do with things like unmanned slaughter? What is unmanned slaughter? We've already had three instances of Predator drone strikes where we thought we got bin Laden, and it turned out not to be the case. And this is where we're at right now. This is not even talking about armed, autonomous systems with full authority to use force. And do not believe that that isn't coming. During my research I came across four different Pentagon projects on different aspects of that.
15:26
And so you have this question: where does this lead with issues like war crimes? Robots are emotionless, so they don't get upset if their buddy is killed. They don't commit crimes of rage and revenge. But robots are emotionless. They see an 80-year-old grandmother in a wheelchair the same way they see a T-80 tank: they're both just a series of zeroes and ones. And so we have this question to figure out: how do we catch up our 20th century laws of war, which are so old right now that they could qualify for Medicare, to these 21st century technologies?

16:05
And so, in conclusion, I've talked about what seems the future of war, but notice that I've only used real world examples and you've only seen real world pictures and videos. And so this sets a great challenge for all of us that we have to worry about well before you have to worry about your Roomba sucking the life away from you. Are we going to let the fact that what's unveiling itself right now in war sounds like science fiction keep us in denial? Are we going to face the reality of 21st century war? Is our generation going to make the same mistake that a past generation did with atomic weaponry, and not deal with the issues that surround it until Pandora's box is already opened up?

16:49
Now, I could be wrong on this, and one Pentagon robot scientist told me that I was. He said, "There's no real social, ethical, moral issues when it comes to robots. That is," he added, "unless the machine kills the wrong people repeatedly. Then it's just a product recall issue."

17:07
And so the ending point for this is that actually, we can turn to Hollywood. A few years ago, Hollywood gathered all the top characters and created a list of the top 100 heroes and top 100 villains of all of Hollywood history, the characters that represented the best and worst of humanity. Only one character made it onto both lists: the Terminator, a robot killing machine. And so that points to the fact that our machines can be used for both good and evil, but for me it points to the fact that there's a duality of humans as well.

17:47
This week is a celebration of our creativity. Our creativity has taken our species to the stars. Our creativity has created works of art and literature to express our love. And now, we're using our creativity in a certain direction, to build fantastic machines with incredible capabilities, maybe even one day an entirely new species. But one of the main reasons that we're doing that is because of our drive to destroy each other. And so the question we all should ask: is it our machines, or is it us that's wired for war?

18:23
Thank you. (Applause)