Translator: Xiao Xiao
Reviewer: Louise LIANG
00:15
I study the future of crime and terrorism, and frankly, I'm afraid. I'm afraid by what I see. I sincerely want to believe that technology can bring us the techno-utopia that we've been promised, but, you see, I've spent a career in law enforcement, and that's informed my perspective on things. I've been a street police officer, an undercover investigator, a counter-terrorism strategist, and I've worked in more than 70 countries around the world. I've had to see more than my fair share of violence and the darker underbelly of society, and that's informed my opinions.
01:01
My work with criminals and terrorists has actually been highly educational. They have taught me a lot, and I'd like to be able to share some of these observations with you.
01:11
Today I'm going to show you the flip side of all those technologies that we marvel at, the ones that we love. In the hands of the TED community, these are awesome tools which will bring about great change for our world, but in the hands of suicide bombers, the future can look quite different.
01:34
I started observing technology and how criminals were using it as a young patrol officer. In those days, this was the height of technology. Laugh though you will, all the drug dealers and gang members with whom I dealt had one of these long before any police officer I knew did.
01:54
Twenty years later, criminals are still using
38
114114
2860
20年过去了,罪犯们仍然在使用
01:56
mobile phones, but they're also building
39
116974
3883
移动电话。但与此同时,他们也在建造
02:00
their own mobile phone networks,
40
120857
2349
属于他们自己的移动电话网络,
02:03
like this one, which has been deployed
41
123206
2232
比如这一个,就已经被贩毒团伙部署在了
02:05
in all 31 states of Mexico by the narcos.
42
125438
3800
墨西哥全部的31个州之中。
02:09
They have a national encrypted
43
129238
2136
他们还拥有一套全国性的加密的
02:11
radio communications system.
44
131374
3209
无线电通信系统。
02:14
Think about that.
45
134583
2110
大家可以认真想想。
02:16
Think about the innovation that went into that.
46
136693
3382
想想投注于这个系统中的创造革新。
02:20
Think about the infrastructure to build it.
47
140075
2897
想想构建这个系统所需的基础设施。
02:22
And then think about this:
48
142972
1405
然后再想想这个:
02:24
Why can't I get a cell phone signal in San Francisco? (Laughter)
49
144377
4050
为什么我在旧金山收不到手机信号呢?(笑声)
02:28
How is this possible? (Laughter) It makes no sense. (Applause)
50
148427
5087
这怎么可能呢?(笑声)这完全没有道理啊。(掌声)
02:33
We consistently underestimate what criminals and terrorists can do. Technology has made our world increasingly open, and for the most part, that's great, but all of this openness may have unintended consequences.
02:46
Consider the 2008 terrorist attack on Mumbai. The men that carried that attack out were armed with AK-47s, explosives and hand grenades. They threw these hand grenades at innocent people as they sat eating in cafes and waited to catch trains on their way home from work. But heavy artillery is nothing new in terrorist operations. Guns and bombs are nothing new. What was different this time is the way that the terrorists used modern information communications technologies to locate additional victims and slaughter them.
03:25
They were armed with mobile phones. They had BlackBerries. They had access to satellite imagery. They had satellite phones, and they even had night vision goggles.
03:36
But perhaps their greatest innovation was this. We've all seen pictures like this on television and in the news. This is an operations center. And the terrorists built their very own op center across the border in Pakistan, where they monitored the BBC, al Jazeera, CNN and Indian local stations. They also monitored the Internet and social media to monitor the progress of their attacks and how many people they had killed. They did all of this in real time.
04:09
The innovation of the terrorist operations center gave terrorists unparalleled situational awareness and tactical advantage over the police and over the government.
04:21
What did they do with this? They used it to great effect. At one point during the 60-hour siege, the terrorists were going room to room trying to find additional victims. They came upon a suite on the top floor of the hotel, and they kicked down the door and they found a man hiding by his bed. And they said to him, "Who are you, and what are you doing here?" And the man replied, "I'm just an innocent schoolteacher."
04:46
Of course, the terrorists knew that no Indian schoolteacher stays at a suite in the Taj. They picked up his identification, and they phoned his name in to the terrorist war room, where the terrorist war room Googled him, and found a picture and called their operatives on the ground and said, "Your hostage, is he heavyset? Is he bald in front? Does he wear glasses?" "Yes, yes, yes," came the answers. The op center had found him and they had a match. He was not a schoolteacher. He was the second-wealthiest businessman in India, and after discovering this information, the terrorist war room gave the order to the terrorists on the ground in Mumbai. ("Kill him.")
05:35
We all worry about our privacy settings on Facebook, but the fact of the matter is, our openness can be used against us. Terrorists are doing this. A search engine can determine who shall live and who shall die. This is the world that we live in.
05:59
During the Mumbai siege, terrorists were so dependent on technology that several witnesses reported that as the terrorists were shooting hostages with one hand, they were checking their mobile phone messages in the very other hand. In the end, 300 people were gravely wounded and over 172 men, women and children lost their lives that day.
06:24
Think about what happened. During this 60-hour siege on Mumbai, 10 men armed not just with weapons, but with technology, were able to bring a city of 20 million people to a standstill. Ten people brought 20 million people to a standstill, and this traveled around the world. This is what radicals can do with openness.
06:53
This was done nearly four years ago. What could terrorists do today with the technologies available that we have? What will they do tomorrow?
07:04
The ability of one to affect many is scaling exponentially, and it's scaling for good and it's scaling for evil.
07:13
It's not just about terrorism, though. There's also been a big paradigm shift in crime. You see, you can now commit more crime as well. In the old days, it was a knife and a gun. Then criminals moved to robbing trains. You could rob 200 people on a train, a great innovation. Moving forward, the Internet allowed things to scale even more. In fact, many of you will remember the recent Sony PlayStation hack. In that incident, over 100 million people were robbed. Think about that. When in the history of humanity has it ever been possible for one person to rob 100 million?
07:54
Of course, it's not just about stealing things. There are other avenues of technology that criminals can exploit. Many of you will remember this super cute video from the last TED, but not all quadcopter swarms are so nice and cute. They don't all have drumsticks. Some can be armed with HD cameras and do countersurveillance on protesters, or, as in this little bit of movie magic, quadcopters can be loaded with firearms and automatic weapons. Little robots are cute when they play music to you. When they swarm and chase you down the block to shoot you, a little bit less so.
08:37
Of course, criminals and terrorists weren't the first to give guns to robots. We know where that started. But they're adapting quickly. Recently, the FBI arrested an al Qaeda affiliate in the United States, who was planning on using these remote-controlled drone aircraft to fly C4 explosives into government buildings in the United States. By the way, these travel at over 600 miles an hour.
08:59
Every time a new technology is being introduced, criminals are there to exploit it. We've all seen 3D printers. We know with them that you can print in many materials ranging from plastic to chocolate to metal and even concrete. With great precision I actually was able to make this just the other day, a very cute little ducky. But I wonder to myself, for those people that strap bombs to their chests and blow themselves up, how might they use 3D printers? Perhaps like this. You see, if you can print in metal, you can print one of these, and in fact you can also print one of these too.
10:00
The UK I know has some very strict firearms laws. You needn't bring the gun into the UK anymore. You just bring the 3D printer and print the gun while you're here, and, of course, the magazines for your bullets. But as these get bigger in the future, what other items will you be able to print? The technologies are allowing bigger printers.
10:22
As we move forward, we'll see new technologies also, like the Internet of Things. Every day we're connecting more and more of our lives to the Internet, which means that the Internet of Things will soon be the Internet of Things To Be Hacked. All of the physical objects in our space are being transformed into information technologies, and that has a radical implication for our security, because more connections to more devices means more vulnerabilities. Criminals understand this. Terrorists understand this. Hackers understand this. If you control the code, you control the world. This is the future that awaits us.
11:02
There has not yet been an operating system or a technology that hasn't been hacked. That's troubling, since the human body itself is now becoming an information technology. As we've seen here, we're transforming ourselves into cyborgs. Every year, thousands of cochlear implants, diabetic pumps, pacemakers and defibrillators are being implanted in people. In the United States, there are 60,000 people who have a pacemaker that connects to the Internet. The defibrillators allow a physician at a distance to give a shock to a heart in case a patient needs it. But if you don't need it, and somebody else gives you the shock, it's not a good thing.
11:44
Of course, we're going to go even deeper than the human body. We're going down to the cellular level these days. Up until this point, all the technologies I've been talking about have been silicon-based, ones and zeroes, but there's another operating system out there: the original operating system, DNA. And to hackers, DNA is just another operating system waiting to be hacked. It's a great challenge for them. There are people already working on hacking the software of life, and while most of them are doing this to great good and to help us all, some won't be.
12:22
So how will criminals abuse this? Well, with synthetic biology you can do some pretty neat things. For example, I predict that we will move away from a plant-based narcotics world to a synthetic one. Why do you need the plants anymore? You can just take the DNA code from marijuana or poppies or coca leaves and cut and paste that gene and put it into yeast, and you can take those yeast and make them make the cocaine for you, or the marijuana, or any other drug. So how we use yeast in the future is going to be really interesting. In fact, we may have some really interesting bread and beer as we go into this next century.
13:07
The cost of sequencing the human genome is dropping precipitously. It was proceeding at Moore's Law pace, but then in 2008, something changed. The technologies got better, and now DNA sequencing is proceeding at a pace five times that of Moore's Law. That has significant implications for us. It took us 30 years to get from the introduction of the personal computer to the level of cybercrime we have today, but looking at how biology is proceeding so rapidly, and knowing criminals and terrorists as I do, we may get there a lot faster with biocrime in the future.
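The pace comparison above can be made concrete with a small back-of-the-envelope calculation. This is only an illustrative sketch, not figures from the talk: it assumes Moore's Law halves cost roughly every 24 months, reads "five times that pace" as five halvings in the same interval, and uses a hypothetical $10M starting cost.

```python
def cost_after(initial_cost, years, halvings_per_24_months):
    """Cost remaining after repeated halvings at a given pace."""
    halvings = (years * 12 / 24) * halvings_per_24_months
    return initial_cost / (2 ** halvings)

# Moore's Law pace: one cost halving per 24 months.
moore = cost_after(10_000_000, 6, 1)
# Sequencing pace: five halvings per 24 months (assumed reading).
seq = cost_after(10_000_000, 6, 5)

print(f"Moore's Law pace after 6 years: ${moore:,.0f}")   # $1,250,000
print(f"5x pace after 6 years:          ${seq:,.0f}")     # $305
```

Even under these rough assumptions, the gap after just six years is four orders of magnitude, which is why "five times Moore's Law" matters so much.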
13:45
It will be easy for anybody to go ahead and print their own bio-virus, enhanced versions of ebola or anthrax, weaponized flu. We recently saw a case where some researchers made the H5N1 avian influenza virus more potent. It already has a 70 percent mortality rate if you get it, but it's hard to get. Engineers, by moving around a small number of genetic changes, were able to weaponize it and make it much more easy for human beings to catch, so that not thousands of people would die, but tens of millions. You see, you can go ahead and create new pandemics, and the researchers who did this were so proud of their accomplishments, they wanted to publish it openly so that everybody could see this and get access to this information.
14:33
But it goes deeper than that. DNA researcher Andrew Hessel has pointed out quite rightly that if you can use cancer treatments, modern cancer treatments, to go after one cell while leaving all the other cells around it intact, then you can also go after any one person's cell. Personalized cancer treatments are the flip side of personalized bioweapons, which means you can attack any one individual, including all the people in this picture. How will we protect them in the future?
15:07
What to do? What to do about all this? That's what I get asked all the time. For those of you who follow me on Twitter, I will be tweeting out the answer later on today. (Laughter) Actually, it's a bit more complex than that, and there are no magic bullets. I don't have all the answers, but I know a few things. In the wake of 9/11, the best security minds put together all their innovation and this is what they created for security. If you're expecting the people who built this to protect you from the coming robopocalypse — (Laughter) — uh, you may want to have a backup plan. (Laughter) Just saying. Just think about that. (Applause)
15:55
Law enforcement is currently a closed system. It's nation-based, while the threat is international. Policing doesn't scale globally. At least, it hasn't, and our current system of guns, border guards, big gates and fences are outdated in the new world into which we're moving.
16:12
So how might we prepare for some of these specific threats, like attacking a president or a prime minister? This would be the natural government response, to hide away all our government leaders in hermetically sealed bubbles. But this is not going to work. The cost of doing a DNA sequence is going to be trivial. Anybody will have it and we will all have them in the future. So maybe there's a more radical way that we can look at this. What happens if we were to take the President's DNA, or a king or queen's, and put it out to a group of a few hundred trusted researchers so they could study that DNA and do penetration testing against it as a means of helping our leaders? Or what if we sent it out to a few thousand? Or, controversially, and not without its risks, what happens if we just gave it to the whole public? Then we could all be engaged in helping.
17:01
We've already seen examples of this working well. The Organized Crime and Corruption Reporting Project is staffed by journalists and citizens where they are crowd-sourcing what dictators and terrorists are doing with public funds around the world, and, in a more dramatic case, we've seen in Mexico, a country that has been racked by 50,000 narcotics-related murders in the past six years. They're killing so many people they can't even afford to bury them all in anything but these unmarked graves like this one outside of Ciudad Juarez. What can we do about this? The government has proven ineffective. So in Mexico, citizens, at great risk to themselves, are fighting back to build an effective solution. They're crowd-mapping the activities of the drug dealers.
17:48
Whether or not you realize it, we are at the dawn of a technological arms race, an arms race between people who are using technology for good and those who are using it for ill. The threat is serious, and the time to prepare for it is now. I can assure you that the terrorists and criminals are.
18:08
My personal belief is that, rather than having a small, elite force of highly trained government agents here to protect us all, we're much better off having average and ordinary citizens approaching this problem as a group and seeing what we can do. If we all do our part, I think we'll be in a much better space. The tools to change the world are in everybody's hands. How we use them is not just up to me, it's up to all of us.
18:35
This was a technology I would frequently deploy as a police officer. This technology has become outdated in our current world. It doesn't scale, it doesn't work globally, and it surely doesn't work virtually. We've seen paradigm shifts in crime and terrorism. They call for a shift to a more open form and a more participatory form of law enforcement. So I invite you to join me. After all, public safety is too important to leave to the professionals.
19:08
Thank you. (Applause)