Bruce Schneier: The security mirage

78,953 views ・ 2011-04-27

TED


Translator: wentzu chen ・ Reviewer: Diwen Mueller
00:16
So, security is two different things: it's a feeling, and it's a reality. And they're different. You could feel secure even if you're not. And you can be secure even if you don't feel it. Really, we have two separate concepts mapped onto the same word. And what I want to do in this talk is to split them apart -- figuring out when they diverge and how they converge.

00:41
And language is actually a problem here. There aren't a lot of good words for the concepts we're going to talk about.

00:49
So if you look at security from economic terms, it's a trade-off. Every time you get some security, you're always trading off something. Whether this is a personal decision -- whether you're going to install a burglar alarm in your home -- or a national decision, where you're going to invade a foreign country -- you're going to trade off something: money or time, convenience, capabilities, maybe fundamental liberties. And the question to ask when you look at a security anything is not whether this makes us safer, but whether it's worth the trade-off.

01:22
You've heard in the past several years, the world is safer because Saddam Hussein is not in power. That might be true, but it's not terribly relevant. The question is: Was it worth it? And you can make your own decision, and then you'll decide whether the invasion was worth it. That's how you think about security: in terms of the trade-off.
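The trade-off framing here is, at bottom, expected-value arithmetic. A minimal sketch of the burglar-alarm decision, with invented numbers purely for illustration (none of these figures come from the talk):

```python
# Expected-value view of a security trade-off.
# All numbers below are assumptions for illustration, not from the talk.

alarm_cost_per_year = 300.0   # assumed: amortized hardware + monitoring
burglary_probability = 0.01   # assumed: annual chance of a break-in
loss_per_burglary = 5000.0    # assumed: average loss if burgled
risk_reduction = 0.5          # assumed: fraction of break-ins the alarm prevents

# Expected annual benefit of the alarm: the expected loss it avoids.
expected_benefit = burglary_probability * loss_per_burglary * risk_reduction

print(f"expected annual benefit: ${expected_benefit:.2f}")   # $25.00
print(f"annual cost:             ${alarm_cost_per_year:.2f}")
print("worth the trade-off?", expected_benefit > alarm_cost_per_year)
```

Under these assumed numbers the alarm genuinely makes you safer, yet the honest answer to "is it worth it?" is no -- which is exactly the point that "does it make us safer?" is the wrong question.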
01:41
Now, there's often no right or wrong here. Some of us have a burglar alarm system at home and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.

02:08
Now, people have a natural intuition about these trade-offs. We make them every day. Last night in my hotel room, when I decided to double-lock the door, or you in your car when you drove here; when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often won't even notice them. They're just part of being alive; we all do it. Every species does it.

02:36
Imagine a rabbit in a field, eating grass. And the rabbit sees a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve.

02:56
So you'd think that us, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it.
03:11
And I think that's a fundamentally interesting question. I'll give you the short answer. The answer is, we respond to the feeling of security and not the reality.

03:21
Now, most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African Highlands in 100,000 BC. 2010 New York, not so much.

03:56
Now, there are several biases in risk perception. A lot of good experiments in this. And you can see certain biases that come up again and again. I'll give you four. We tend to exaggerate spectacular and rare risks and downplay common risks -- so, flying versus driving. The unknown is perceived to be riskier than the familiar. One example would be: people fear kidnapping by strangers, when the data supports that kidnapping by relatives is much more common. This is for children. Third, personified risks are perceived to be greater than anonymous risks. So, Bin Laden is scarier because he has a name. And the fourth is: people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control.
05:02
There are a bunch of other of these cognitive biases that affect our risk decisions. There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works. If you hear a lot about tiger attacks, there must be a lot of tigers around. You don't hear about lion attacks, there aren't a lot of lions around. This works, until you invent newspapers, because what newspapers do is repeat again and again rare risks. I tell people: if it's in the news, don't worry about it, because by definition, news is something that almost never happens. (Laughter) When something is so common, it's no longer news. Car crashes, domestic violence -- those are the risks you worry about.
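One way to see what the availability heuristic does to risk estimates: compare probabilities computed from what gets reported (and is therefore easy to recall) against probabilities computed from what actually happens. A toy sketch with invented counts, not real data:

```python
# Toy model of the availability heuristic: we judge probability by how
# easily instances come to mind, and the news over-supplies rare instances.
# All counts below are invented for illustration.

news_mentions = {"tiger attack": 120, "car crash": 15}       # what we hear about
actual_events = {"tiger attack": 3, "car crash": 6_000_000}  # what actually happens

def relative_risk(counts):
    """Normalize raw counts into each event's share of the total."""
    total = sum(counts.values())
    return {event: count / total for event, count in counts.items()}

print("felt risk:", relative_risk(news_mentions))   # tiger attacks dominate
print("real risk:", relative_risk(actual_events))   # car crashes dominate
```

The recalled distribution and the true distribution invert, which is the gap between feeling and reality the talk keeps returning to.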
05:53
We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "One, two, three, many" is kind of right. We're really good at small numbers. One mango, two mangoes, three mangoes, 10,000 mangoes, 100,000 mangoes -- it's still more mangoes you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with the risks that aren't very common.

06:25
And what these cognitive biases do is they act as filters between us and reality. And the result is that feeling and reality get out of whack, they get different. Now, you either have a feeling -- you feel more secure than you are, there's a false sense of security. Or the other way, and that's a false sense of insecurity.
06:47
I write a lot about "security theater," which are products that make people feel secure, but don't actually do anything. There's no real word for stuff that makes us secure, but doesn't make us feel secure. Maybe it's what the CIA is supposed to do for us.

07:03
So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do for the economic incentives is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice.

07:34
Right? So what makes people notice? Well, a couple of things: understanding of the security, of the risks, the threats, the countermeasures, how they work. But if you know stuff, you're more likely to have your feelings match reality.

07:53
Enough real-world examples helps. We all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater is exposed when it's obvious that it's not working properly.

08:11
OK. So what makes people not notice? Well, a poor understanding. If you don't understand the risks, you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures.

08:51
Also, feelings that cloud the issues -- the cognitive biases I talked about earlier: fears, folk beliefs -- basically, an inadequate model of reality.
09:03
So let me complicate things. I have feeling and reality. I want to add a third element. I want to add "model." Feeling and model are in our head, reality is the outside world; it doesn't change, it's real. Feeling is based on our intuition, model is based on reason. That's basically the difference.

09:24
In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face. There's no feeling about germs. You need a model to understand them. This model is an intelligent representation of reality. It's, of course, limited by science, by technology. We couldn't have a germ theory of disease before we invented the microscope to see them. It's limited by our cognitive biases. But it has the ability to override our feelings.

10:11
Where do we get these models? We get them from others. We get them from religion, from culture, teachers, elders.

10:20
A couple years ago, I was in South Africa on safari. The tracker I was with grew up in Kruger National Park. He had some very complex models of how to survive. And it depended on if you were attacked by a lion, leopard, rhino, or elephant -- and when you had to run away, when you couldn't run away, when you had to climb a tree, when you could never climb a tree. I would have died in a day. But he was born there, and he understood how to survive. I was born in New York City. I could have taken him to New York, and he would have died in a day. (Laughter) Because we had different models based on our different experiences.

10:58
Models can come from the media, from our elected officials ... Think of models of terrorism, child kidnapping, airline safety, car safety. Models can come from industry. The two I'm following are surveillance cameras and ID cards; quite a lot of our computer security models come from there. A lot of models come from science. Health models are a great example. Think of cancer, bird flu, swine flu, SARS. All of our feelings of security about those diseases come from models given to us, really, by science filtered through the media.
11:41
So models can change. Models are not static. As we become more comfortable in our environments, our model can move closer to our feelings.

11:53
So an example might be, if you go back 100 years ago, when electricity was first becoming common, there were a lot of fears about it. There were people who were afraid to push doorbells, because there was electricity in there, and that was dangerous. For us, we're very facile around electricity. We change light bulbs without even thinking about it. Our model of security around electricity is something we were born into. It hasn't changed as we were growing up. And we're good at it.

12:27
Or think of the risks on the Internet across generations -- how your parents approach Internet security, versus how you do, versus how our kids will.

12:38
Models eventually fade into the background. "Intuitive" is just another word for familiar. So as your model is close to reality and it converges with feelings, you often don't even know it's there.
12:53
A nice example of this came from last year and swine flu. When swine flu first appeared, the initial news caused a lot of overreaction. Now, it had a name, which made it scarier than the regular flu, even though the regular flu was more deadly. And people thought doctors should be able to deal with it. So there was that feeling of lack of control. And those two things made the risk more than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear. By autumn, people thought the doctors should have solved this already. And there's kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it.

13:58
And it's a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.
14:13
I'm going to give one more complication. We have feeling, model, reality. I have a very relativistic view of security. I think it depends on the observer. And most security decisions have a variety of people involved. And stakeholders with specific trade-offs will try to influence the decision. And I call that their agenda. And you see agenda -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like.

14:54
This is not uncommon. An example, a great example, is the risk of smoking. In the history of the past 50 years, the smoking risk shows how a model changes, and it also shows how an industry fights against a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate, probably about 30 years behind. All examples of models changing.
15:36
What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model. And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So evidence against our model, we're likely to ignore, even if it's compelling. It has to get very compelling before we'll pay attention.

16:08
New models that extend long periods of time are hard. Global warming is a great example. We're terrible at models that span 80 years. We can do "to the next harvest." We can often do "until our kids grow up." But "80 years," we're just not good at. So it's a very hard model to accept. We can have both models in our head simultaneously -- that kind of problem where we're holding both beliefs together, the cognitive dissonance. Eventually, the new model will replace the old model.
16:44
Strong feelings can create a model. September 11 created a security model in a lot of people's heads. Also, personal experiences with crime can do it, personal health scare, a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive.

17:09
So in the technological world, we don't have experience to judge models. And we rely on others. We rely on proxies. And this works, as long as it's the correct others. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, none of us fear the roof is going to collapse on us, not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept pretty much by faith. And that's OK.
17:57
Now, what we want is people to get familiar enough with better models, have it reflected in their feelings, to allow them to make security trade-offs. When these go out of whack, you have two options. One, you can fix people's feelings, directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model.

18:26
Change happens slowly. The smoking debate took 40 years -- and that was an easy one. Some of this stuff is hard. Really, though, information seems like our best hope.

18:41
And I lied. Remember I said feeling, model, reality; reality doesn't change? It actually does. We live in a technological world; reality changes all the time. So we might have, for the first time in our species: feeling chases model, model chases reality, reality's moving -- they might never catch up. We don't know.

19:05
But in the long term, both feeling and reality are important. And I want to close with two quick stories to illustrate this.
19:12
1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story. Someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf, someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps? That came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security more match the reality.
19:55
Last story: a few years ago, a friend of mine gave birth. I visit her in the hospital. It turns out, when a baby's born now, they put an RFID bracelet on the baby, a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off. I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I go home, I look it up. It basically never happens. (Laughter) But if you think about it, if you are a hospital, and you need to take a baby away from its mother, out of the room to run some tests, you better have some good security theater, or she's going to rip your arm off. (Laughter)

20:34
So it's important for us, those of us who design security, who look at security policy -- or even look at public policy in ways that affect security. It's not just reality; it's feeling and reality. What's important is that they be about the same. It's important that, if our feelings match reality, we make better security trade-offs.

20:56
Thank you. (Applause)