Bruce Schneier: The security mirage

00:16
So, security is two different things:
00:18
it's a feeling, and it's a reality.
00:20
And they're different.
00:22
You could feel secure even if you're not.
00:25
And you can be secure
00:27
even if you don't feel it.
00:29
Really, we have two separate concepts
00:31
mapped onto the same word.
00:33
And what I want to do in this talk is to split them apart --
00:37
figuring out when they diverge and how they converge.
00:41
And language is actually a problem here.
00:44
There aren't a lot of good words
00:46
for the concepts we're going to talk about.
00:49
So if you look at security from economic terms,
00:53
it's a trade-off.
00:55
Every time you get some security, you're always trading off something.
00:59
Whether this is a personal decision --
01:01
whether you're going to install a burglar alarm in your home --
01:04
or a national decision,
01:05
where you're going to invade a foreign country --
01:07
you're going to trade off something: money or time, convenience, capabilities,
01:11
maybe fundamental liberties.
01:13
And the question to ask when you look at a security anything
01:16
is not whether this makes us safer,
01:20
but whether it's worth the trade-off.
01:22
You've heard in the past several years, the world is safer
01:25
because Saddam Hussein is not in power.
01:27
That might be true, but it's not terribly relevant.
01:30
The question is: Was it worth it?
01:33
And you can make your own decision,
01:35
and then you'll decide whether the invasion was worth it.
01:38
That's how you think about security: in terms of the trade-off.
01:41
Now, there's often no right or wrong here.
01:45
Some of us have a burglar alarm system at home and some of us don't.
01:48
And it'll depend on where we live,
01:51
whether we live alone or have a family,
01:53
how much cool stuff we have,
01:54
how much we're willing to accept the risk of theft.
01:58
In politics also, there are different opinions.
02:02
A lot of times, these trade-offs are about more than just security,
02:06
and I think that's really important.
02:08
Now, people have a natural intuition about these trade-offs.
02:12
We make them every day.
02:14
Last night in my hotel room, when I decided to double-lock the door,
02:18
or you in your car when you drove here;
02:21
when we go eat lunch
02:22
and decide the food's not poison and we'll eat it.
02:25
We make these trade-offs again and again,
02:28
multiple times a day.
02:30
We often won't even notice them.
02:31
They're just part of being alive; we all do it.
02:34
Every species does it.
02:36
Imagine a rabbit in a field, eating grass.
02:39
And the rabbit sees a fox.
02:41
That rabbit will make a security trade-off:
02:43
"Should I stay, or should I flee?"
02:46
And if you think about it,
02:48
the rabbits that are good at making that trade-off
02:50
will tend to live and reproduce,
02:52
and the rabbits that are bad at it
02:54
will get eaten or starve.
02:56
So you'd think
02:59
that us, as a successful species on the planet -- you, me, everybody --
03:03
would be really good at making these trade-offs.
03:07
Yet it seems, again and again, that we're hopelessly bad at it.
03:11
And I think that's a fundamentally interesting question.
03:14
I'll give you the short answer.
03:16
The answer is, we respond to the feeling of security
03:19
and not the reality.
03:21
Now, most of the time, that works.
03:25
Most of the time,
03:27
feeling and reality are the same.
03:30
Certainly that's true for most of human prehistory.
03:35
We've developed this ability
03:38
because it makes evolutionary sense.
03:41
One way to think of it is that we're highly optimized
03:45
for risk decisions
03:47
that are endemic to living in small family groups
03:49
in the East African Highlands in 100,000 BC.
03:52
2010 New York, not so much.
03:56
Now, there are several biases in risk perception.
04:00
A lot of good experiments in this.
04:01
And you can see certain biases that come up again and again.
04:05
I'll give you four.
04:06
We tend to exaggerate spectacular and rare risks
04:10
and downplay common risks --
04:12
so, flying versus driving.
04:14
The unknown is perceived to be riskier than the familiar.
04:21
One example would be:
04:22
people fear kidnapping by strangers,
04:25
when the data supports that kidnapping by relatives is much more common.
04:29
This is for children.
04:30
Third, personified risks are perceived to be greater
04:34
than anonymous risks.
04:36
So, Bin Laden is scarier because he has a name.
04:40
And the fourth is:
04:41
people underestimate risks in situations they do control
04:46
and overestimate them in situations they don't control.
04:49
So once you take up skydiving or smoking,
04:52
you downplay the risks.
04:55
If a risk is thrust upon you -- terrorism is a good example --
04:58
you'll overplay it,
04:59
because you don't feel like it's in your control.
05:02
There are a bunch of other of these cognitive biases,
05:05
that affect our risk decisions.
05:08
There's the availability heuristic,
05:11
which basically means we estimate the probability of something
05:15
by how easy it is to bring instances of it to mind.
05:19
So you can imagine how that works.
05:21
If you hear a lot about tiger attacks, there must be a lot of tigers around.
05:25
You don't hear about lion attacks, there aren't a lot of lions around.
05:28
This works, until you invent newspapers,
05:30
because what newspapers do is repeat again and again
05:35
rare risks.
05:36
I tell people: if it's in the news, don't worry about it,
05:39
because by definition, news is something that almost never happens.
05:44
(Laughter)
05:45
When something is so common, it's no longer news.
05:48
Car crashes, domestic violence --
05:50
those are the risks you worry about.
05:53
We're also a species of storytellers.
05:55
We respond to stories more than data.
05:58
And there's some basic innumeracy going on.
06:00
I mean, the joke "One, two, three, many" is kind of right.
06:04
We're really good at small numbers.
06:06
One mango, two mangoes, three mangoes,
06:08
10,000 mangoes, 100,000 mangoes --
06:10
it's still more mangoes you can eat before they rot.
06:13
So one half, one quarter, one fifth -- we're good at that.
06:17
One in a million, one in a billion --
06:19
they're both almost never.
06:21
So we have trouble with the risks that aren't very common.
06:25
And what these cognitive biases do
06:27
is they act as filters between us and reality.
06:31
And the result is that feeling and reality get out of whack,
06:35
they get different.
06:37
Now, you either have a feeling -- you feel more secure than you are,
06:41
there's a false sense of security.
06:43
Or the other way, and that's a false sense of insecurity.
06:47
I write a lot about "security theater,"
06:49
which are products that make people feel secure,
06:52
but don't actually do anything.
06:54
There's no real word for stuff that makes us secure,
06:57
but doesn't make us feel secure.
06:59
Maybe it's what the CIA is supposed to do for us.
07:03
So back to economics.
07:05
If economics, if the market, drives security,
07:09
and if people make trade-offs based on the feeling of security,
07:14
then the smart thing for companies to do for the economic incentives
07:18
is to make people feel secure.
07:21
And there are two ways to do this.
07:24
One, you can make people actually secure
07:27
and hope they notice.
07:28
Or two, you can make people just feel secure
07:31
and hope they don't notice.
07:34
Right?
07:35
So what makes people notice?
07:39
Well, a couple of things:
07:40
understanding of the security,
07:43
of the risks, the threats,
07:45
the countermeasures, how they work.
07:47
But if you know stuff, you're more likely
07:50
to have your feelings match reality.
07:53
Enough real-world examples helps.
07:56
We all know the crime rate in our neighborhood,
07:58
because we live there, and we get a feeling about it
08:01
that basically matches reality.
08:05
Security theater is exposed
08:07
when it's obvious that it's not working properly.
08:11
OK. So what makes people not notice?
08:14
Well, a poor understanding.
08:16
If you don't understand the risks, you don't understand the costs,
08:19
you're likely to get the trade-off wrong,
08:21
and your feeling doesn't match reality.
08:24
Not enough examples.
08:26
There's an inherent problem with low-probability events.
08:30
If, for example, terrorism almost never happens,
08:34
it's really hard to judge the efficacy of counter-terrorist measures.
08:40
This is why you keep sacrificing virgins,
08:44
and why your unicorn defenses are working just great.
08:46
There aren't enough examples of failures.
08:51
Also, feelings that cloud the issues --
08:53
the cognitive biases I talked about earlier: fears, folk beliefs --
08:58
basically, an inadequate model of reality.
09:03
So let me complicate things.
09:05
I have feeling and reality.
09:07
I want to add a third element. I want to add "model."
09:10
Feeling and model are in our head,
09:13
reality is the outside world; it doesn't change, it's real.
09:17
Feeling is based on our intuition,
09:20
model is based on reason.
09:22
That's basically the difference.
09:24
In a primitive and simple world,
09:26
there's really no reason for a model,
09:30
because feeling is close to reality.
09:32
You don't need a model.
09:34
But in a modern and complex world,
09:37
you need models to understand a lot of the risks we face.
09:42
There's no feeling about germs.
09:45
You need a model to understand them.
09:48
This model is an intelligent representation of reality.
09:52
It's, of course, limited by science, by technology.
09:58
We couldn't have a germ theory of disease
10:00
before we invented the microscope to see them.
10:04
It's limited by our cognitive biases.
10:08
But it has the ability to override our feelings.
10:11
Where do we get these models? We get them from others.
10:14
We get them from religion, from culture, teachers, elders.
10:20
A couple years ago, I was in South Africa on safari.
10:23
The tracker I was with grew up in Kruger National Park.
10:26
He had some very complex models of how to survive.
10:29
And it depended on if you were attacked by a lion, leopard, rhino, or elephant --
10:33
and when you had to run away, when you couldn't run away,
10:36
when you had to climb a tree, when you could never climb a tree.
10:39
I would have died in a day.
10:42
But he was born there, and he understood how to survive.
10:46
I was born in New York City.
10:48
I could have taken him to New York, and he would have died in a day.
10:51
(Laughter)
10:52
Because we had different models based on our different experiences.
10:58
Models can come from the media,
11:00
from our elected officials ...
11:03
Think of models of terrorism,
11:06
child kidnapping,
11:08
airline safety, car safety.
11:11
Models can come from industry.
11:14
The two I'm following are surveillance cameras,
11:17
ID cards,
11:19
quite a lot of our computer security models come from there.
11:22
A lot of models come from science.
11:24
Health models are a great example.
11:26
Think of cancer, bird flu, swine flu, SARS.
11:29
All of our feelings of security about those diseases
11:34
come from models given to us, really, by science filtered through the media.
11:41
So models can change.
11:43
Models are not static.
11:45
As we become more comfortable in our environments,
11:48
our model can move closer to our feelings.
11:53
So an example might be,
11:56
if you go back 100 years ago,
11:57
when electricity was first becoming common,
12:01
there were a lot of fears about it.
12:03
There were people who were afraid to push doorbells,
12:05
because there was electricity in there, and that was dangerous.
12:08
For us, we're very facile around electricity.
12:11
We change light bulbs without even thinking about it.
12:14
Our model of security around electricity is something we were born into.
12:21
It hasn't changed as we were growing up.
12:24
And we're good at it.
12:27
Or think of the risks on the Internet across generations --
12:31
how your parents approach Internet security,
12:34
versus how you do,
12:35
versus how our kids will.
12:38
Models eventually fade into the background.
12:42
"Intuitive" is just another word for familiar.
12:45
So as your model is close to reality and it converges with feelings,
12:49
you often don't even know it's there.
12:53
A nice example of this came from last year and swine flu.
12:58
When swine flu first appeared,
13:00
the initial news caused a lot of overreaction.
13:03
Now, it had a name,
13:05
which made it scarier than the regular flu,
13:07
even though it was less deadly.
13:09
And people thought doctors should be able to deal with it.
13:13
So there was that feeling of lack of control.
13:16
And those two things made the risk more than it was.
13:19
As the novelty wore off and the months went by,
13:22
there was some amount of tolerance; people got used to it.
13:26
There was no new data, but there was less fear.
13:29
By autumn,
13:31
people thought the doctors should have solved this already.
13:35
And there's kind of a bifurcation:
13:37
people had to choose between fear and acceptance --
13:44
actually, fear and indifference --
13:46
and they kind of chose suspicion.
13:49
And when the vaccine appeared last winter,
13:52
there were a lot of people -- a surprising number --
13:54
who refused to get it.
13:58
And it's a nice example of how people's feelings of security change,
14:02
how their model changes,
14:04
sort of wildly,
14:05
with no new information, with no new input.
14:10
This kind of thing happens a lot.
14:13
I'm going to give one more complication.
14:15
We have feeling, model, reality.
14:18
I have a very relativistic view of security.
14:21
I think it depends on the observer.
14:23
And most security decisions have a variety of people involved.
14:29
And stakeholders with specific trade-offs will try to influence the decision.
14:36
And I call that their agenda.
14:39
And you see agenda -- this is marketing, this is politics --
14:43
trying to convince you to have one model versus another,
14:46
trying to convince you to ignore a model
14:48
and trust your feelings,
14:51
marginalizing people with models you don't like.
14:54
This is not uncommon.
14:57
An example, a great example, is the risk of smoking.
15:02
In the history of the past 50 years,
15:04
the smoking risk shows how a model changes,
15:06
and it also shows how an industry fights against a model it doesn't like.
15:11
Compare that to the secondhand smoke debate --
15:15
probably about 20 years behind.
15:17
Think about seat belts.
15:19
When I was a kid, no one wore a seat belt.
15:21
Nowadays, no kid will let you drive if you're not wearing a seat belt.
15:26
Compare that to the airbag debate,
15:29
probably about 30 years behind.
15:32
All examples of models changing.
15:36
What we learn is that changing models is hard.
15:40
Models are hard to dislodge.
15:42
If they equal your feelings,
15:44
you don't even know you have a model.
15:47
And there's another cognitive bias
15:49
I'll call confirmation bias,
15:51
where we tend to accept data that confirms our beliefs
15:55
and reject data that contradicts our beliefs.
15:59
So evidence against our model, we're likely to ignore,
16:03
even if it's compelling.
16:04
It has to get very compelling before we'll pay attention.
16:08
New models that extend long periods of time are hard.
16:11
Global warming is a great example.
16:13
We're terrible at models that span 80 years.
16:16
We can do "to the next harvest."
16:18
We can often do "until our kids grow up."
16:21
But "80 years," we're just not good at.
16:24
So it's a very hard model to accept.
16:27
We can have both models in our head simultaneously --
16:31
that kind of problem where we're holding both beliefs together,
16:38
the cognitive dissonance.
16:40
Eventually, the new model will replace the old model.
16:44
Strong feelings can create a model.
16:47
September 11 created a security model in a lot of people's heads.
16:52
Also, personal experiences with crime can do it,
16:56
personal health scare,
16:57
a health scare in the news.
17:00
You'll see these called "flashbulb events" by psychiatrists.
17:04
They can create a model instantaneously,
17:06
because they're very emotive.
17:09
So in the technological world,
17:11
we don't have experience to judge models.
17:15
And we rely on others. We rely on proxies.
17:18
And this works, as long as it's the correct others.
17:21
We rely on government agencies
17:23
to tell us what pharmaceuticals are safe.
17:28
I flew here yesterday.
17:30
I didn't check the airplane.
17:32
I relied on some other group
17:35
to determine whether my plane was safe to fly.
17:37
We're here, none of us fear the roof is going to collapse on us,
17:41
not because we checked,
17:43
but because we're pretty sure the building codes here are good.
17:48
It's a model we just accept
17:51
pretty much by faith.
17:53
And that's OK.
17:57
Now, what we want is people to get familiar enough with better models,
18:03
have it reflected in their feelings,
18:06
to allow them to make security trade-offs.
18:10
When these go out of whack, you have two options.
18:13
One, you can fix people's feelings, directly appeal to feelings.
18:18
It's manipulation, but it can work.
18:21
The second, more honest way
18:23
is to actually fix the model.
18:26
Change happens slowly.
18:28
The smoking debate took 40 years -- and that was an easy one.
18:35
Some of this stuff is hard.
18:37
Really, though, information seems like our best hope.
18:41
And I lied.
18:42
Remember I said feeling, model, reality; reality doesn't change?
18:46
It actually does.
18:48
We live in a technological world;
18:49
reality changes all the time.
18:52
So we might have, for the first time in our species:
18:55
feeling chases model, model chases reality, reality's moving --
18:59
they might never catch up.
19:02
We don't know.
19:05
But in the long term,
19:07
both feeling and reality are important.
19:09
And I want to close with two quick stories to illustrate this.
19:12
1982 -- I don't know if people will remember this --
19:15
there was a short epidemic of Tylenol poisonings
19:18
in the United States.
19:19
It's a horrific story.
19:21
Someone took a bottle of Tylenol,
19:23
put poison in it, closed it up, put it back on the shelf,
19:26
someone else bought it and died.
19:27
This terrified people.
19:29
There were a couple of copycat attacks.
19:31
There wasn't any real risk, but people were scared.
19:34
And this is how the tamper-proof drug industry was invented.
19:38
Those tamper-proof caps? That came from this.
19:40
It's complete security theater.
19:42
As a homework assignment, think of 10 ways to get around it.
19:45
I'll give you one: a syringe.
19:47
But it made people feel better.
19:50
It made their feeling of security more match the reality.
19:55
Last story: a few years ago, a friend of mine gave birth.
19:58
I visit her in the hospital.
19:59
It turns out, when a baby's born now,
20:01
they put an RFID bracelet on the baby, a corresponding one on the mother,
20:05
so if anyone other than the mother takes the baby out of the maternity ward,
20:08
an alarm goes off.
20:10
I said, "Well, that's kind of neat.
20:11
I wonder how rampant baby snatching is out of hospitals."
20:15
I go home, I look it up.
20:17
It basically never happens.
20:18
(Laughter)
20:20
But if you think about it, if you are a hospital,
20:23
and you need to take a baby away from its mother,
20:25
out of the room to run some tests,
20:27
you better have some good security theater,
20:29
or she's going to rip your arm off.
20:31
(Laughter)
20:34
So it's important for us,
20:35
those of us who design security,
20:38
who look at security policy --
20:40
or even look at public policy in ways that affect security.
20:45
It's not just reality; it's feeling and reality.
20:48
What's important
20:50
is that they be about the same.
20:51
It's important that, if our feelings match reality,
20:54
we make better security trade-offs.
20:56
Thank you.
20:57
(Applause)