Bruce Schneier: The security mirage

78,030 views ・ 2011-04-27

TED



00:16
So, security is two different things: it's a feeling, and it's a reality. And they're different. You could feel secure even if you're not. And you can be secure even if you don't feel it. Really, we have two separate concepts mapped onto the same word. And what I want to do in this talk is to split them apart -- figuring out when they diverge and how they converge. And language is actually a problem here. There aren't a lot of good words for the concepts we're going to talk about.

00:49
So if you look at security from economic terms, it's a trade-off. Every time you get some security, you're always trading off something. Whether this is a personal decision -- whether you're going to install a burglar alarm in your home -- or a national decision, where you're going to invade a foreign country -- you're going to trade off something: money or time, convenience, capabilities, maybe fundamental liberties. And the question to ask when you look at a security anything is not whether this makes us safer, but whether it's worth the trade-off.

01:22
You've heard in the past several years, the world is safer because Saddam Hussein is not in power. That might be true, but it's not terribly relevant. The question is: Was it worth it? And you can make your own decision, and then you'll decide whether the invasion was worth it. That's how you think about security: in terms of the trade-off.

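To make the trade-off concrete, here is a minimal sketch in Python of the "is it worth it" question, using the burglar-alarm example from above; every number is a hypothetical placeholder, not a figure from the talk.

```python
# Illustrative sketch of a security trade-off as an expected-cost comparison.
# All numbers are hypothetical placeholders, not data from the talk.

def expected_annual_loss(p_incident, loss):
    """Expected yearly loss from an incident with probability p_incident."""
    return p_incident * loss

def worth_it(countermeasure_cost, p_before, p_after, loss):
    """A countermeasure is 'worth it' if it saves more than it costs."""
    saved = expected_annual_loss(p_before, loss) - expected_annual_loss(p_after, loss)
    return saved > countermeasure_cost

# Burglar-alarm example with made-up figures:
# 2% yearly chance of a burglary costing $5,000, alarm cuts it to 0.5%,
# alarm costs $200 per year.
print(worth_it(countermeasure_cost=200, p_before=0.02, p_after=0.005, loss=5000))
# -> False: the alarm saves an expected $75/year but costs $200/year.
```

The sketch only captures the dollar side of the question; as the talk notes, real decisions also trade off time, convenience, capabilities, and liberties that don't reduce to a single number.
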
01:41
Now, there's often no right or wrong here. Some of us have a burglar alarm system at home and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.

02:08
Now, people have a natural intuition about these trade-offs. We make them every day. Last night in my hotel room, when I decided to double-lock the door, or you in your car when you drove here; when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often won't even notice them. They're just part of being alive; we all do it. Every species does it.

02:36
Imagine a rabbit in a field, eating grass. And the rabbit sees a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve. So you'd think that us, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it. And I think that's a fundamentally interesting question.

03:14
I'll give you the short answer. The answer is, we respond to the feeling of security and not the reality. Now, most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African Highlands in 100,000 BC. 2010 New York, not so much.

03:56
Now, there are several biases in risk perception. A lot of good experiments in this. And you can see certain biases that come up again and again. I'll give you four. We tend to exaggerate spectacular and rare risks and downplay common risks -- so, flying versus driving. The unknown is perceived to be riskier than the familiar. One example would be: people fear kidnapping by strangers, when the data supports that kidnapping by relatives is much more common. This is for children. Third, personified risks are perceived to be greater than anonymous risks. So, Bin Laden is scarier because he has a name. And the fourth is: people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control.

05:02
There are a bunch of other of these cognitive biases that affect our risk decisions. There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works. If you hear a lot about tiger attacks, there must be a lot of tigers around. You don't hear about lion attacks, there aren't a lot of lions around. This works, until you invent newspapers, because what newspapers do is repeat again and again rare risks. I tell people: if it's in the news, don't worry about it, because by definition, news is something that almost never happens. (Laughter) When something is so common, it's no longer news. Car crashes, domestic violence -- those are the risks you worry about.

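As a rough sketch of how the availability heuristic can skew estimates, the Python snippet below ranks a handful of risks two ways: by how often they are mentioned in the news (how easily they come to mind) and by how often they actually occur. All counts are invented placeholders chosen only to mirror the tiger/newspaper point, not real statistics.

```python
# Illustrative sketch of the availability heuristic: ranking risks by how
# often we recall hearing about them vs. how often they actually occur.
# All counts are invented placeholders, not real statistics.

news_mentions = {           # how easy each risk is to "bring to mind"
    "terrorism": 900,
    "plane crash": 300,
    "car crash": 40,
    "domestic violence": 10,
}

actual_incidents = {        # how often each risk actually happens (made up)
    "terrorism": 5,
    "plane crash": 20,
    "car crash": 30000,
    "domestic violence": 50000,
}

def ranking(counts):
    """Risks ordered from most to least salient/frequent."""
    return sorted(counts, key=counts.get, reverse=True)

print("felt (availability):", ranking(news_mentions))
print("actual frequency:   ", ranking(actual_incidents))
# The two orderings are nearly reversed: the rare, heavily reported risks
# feel large, while the common, under-reported ones feel small.
```
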
05:53
We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "One, two, three, many" is kind of right. We're really good at small numbers. One mango, two mangoes, three mangoes, 10,000 mangoes, 100,000 mangoes -- it's still more mangoes than you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with the risks that aren't very common.

06:25
And what these cognitive biases do is they act as filters between us and reality. And the result is that feeling and reality get out of whack, they get different. Now, you either have a feeling -- you feel more secure than you are, there's a false sense of security. Or the other way, and that's a false sense of insecurity. I write a lot about "security theater," which are products that make people feel secure, but don't actually do anything. There's no real word for stuff that makes us secure, but doesn't make us feel secure. Maybe it's what the CIA is supposed to do for us.

07:03
So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do for the economic incentives is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice. Right?

07:35
So what makes people notice? Well, a couple of things: understanding of the security, of the risks, the threats, the countermeasures, how they work. But if you know stuff, you're more likely to have your feelings match reality. Enough real-world examples helps. We all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater is exposed when it's obvious that it's not working properly.

08:11
OK. So what makes people not notice? Well, a poor understanding. If you don't understand the risks, you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures. Also, feelings that cloud the issues -- the cognitive biases I talked about earlier: fears, folk beliefs -- basically, an inadequate model of reality.

09:03
So let me complicate things. I have feeling and reality. I want to add a third element. I want to add "model." Feeling and model are in our head, reality is the outside world; it doesn't change, it's real. Feeling is based on our intuition, model is based on reason. That's basically the difference. In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face. There's no feeling about germs. You need a model to understand them. This model is an intelligent representation of reality. It's, of course, limited by science, by technology. We couldn't have a germ theory of disease before we invented the microscope to see them. It's limited by our cognitive biases. But it has the ability to override our feelings.

10:11
Where do we get these models? We get them from others. We get them from religion, from culture, teachers, elders. A couple years ago, I was in South Africa on safari. The tracker I was with grew up in Kruger National Park. He had some very complex models of how to survive. And it depended on if you were attacked by a lion, leopard, rhino, or elephant -- and when you had to run away, when you couldn't run away, when you had to climb a tree, when you could never climb a tree. I would have died in a day. But he was born there, and he understood how to survive. I was born in New York City. I could have taken him to New York, and he would have died in a day. (Laughter) Because we had different models based on our different experiences.

10:58
Models can come from the media, from our elected officials ... Think of models of terrorism, child kidnapping, airline safety, car safety. Models can come from industry. The two I'm following are surveillance cameras, ID cards, quite a lot of our computer security models come from there. A lot of models come from science. Health models are a great example. Think of cancer, bird flu, swine flu, SARS. All of our feelings of security about those diseases come from models given to us, really, by science filtered through the media.

11:41
So models can change. Models are not static. As we become more comfortable in our environments, our model can move closer to our feelings. So an example might be, if you go back 100 years ago, when electricity was first becoming common, there were a lot of fears about it. There were people who were afraid to push doorbells, because there was electricity in there, and that was dangerous. For us, we're very facile around electricity. We change light bulbs without even thinking about it. Our model of security around electricity is something we were born into. It hasn't changed as we were growing up. And we're good at it. Or think of the risks on the Internet across generations -- how your parents approach Internet security, versus how you do, versus how our kids will. Models eventually fade into the background. "Intuitive" is just another word for familiar. So as your model is close to reality and it converges with feelings, you often don't even know it's there.

12:53
A nice example of this came from last year and swine flu. When swine flu first appeared, the initial news caused a lot of overreaction. Now, it had a name, which made it scarier than the regular flu, even though it was more deadly. And people thought doctors should be able to deal with it. So there was that feeling of lack of control. And those two things made the risk more than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear. By autumn, people thought the doctors should have solved this already. And there's kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it. And it's a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.

14:13
I'm going to give one more complication. We have feeling, model, reality. I have a very relativistic view of security. I think it depends on the observer. And most security decisions have a variety of people involved. And stakeholders with specific trade-offs will try to influence the decision. And I call that their agenda. And you see agenda -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like. This is not uncommon.

14:57
An example, a great example, is the risk of smoking. In the history of the past 50 years, the smoking risk shows how a model changes, and it also shows how an industry fights against a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate, probably about 30 years behind. All examples of models changing.

15:36
What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model. And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So evidence against our model, we're likely to ignore, even if it's compelling. It has to get very compelling before we'll pay attention. New models that extend long periods of time are hard. Global warming is a great example. We're terrible at models that span 80 years. We can do "to the next harvest." We can often do "until our kids grow up." But "80 years," we're just not good at. So it's a very hard model to accept. We can have both models in our head simultaneously -- that kind of problem where we're holding both beliefs together, the cognitive dissonance. Eventually, the new model will replace the old model.

16:44
Strong feelings can create a model. September 11 created a security model in a lot of people's heads. Also, personal experiences with crime can do it, personal health scare, a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive. So in the technological world, we don't have experience to judge models. And we rely on others. We rely on proxies. And this works, as long as it's the correct others. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, none of us fear the roof is going to collapse on us, not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept pretty much by faith. And that's OK.

17:57
Now, what we want is people to get familiar enough with better models, have it reflected in their feelings, to allow them to make security trade-offs. When these go out of whack, you have two options. One, you can fix people's feelings, directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model. Change happens slowly. The smoking debate took 40 years -- and that was an easy one. Some of this stuff is hard. Really, though, information seems like our best hope.

18:41
And I lied. Remember I said feeling, model, reality; reality doesn't change? It actually does. We live in a technological world; reality changes all the time. So we might have, for the first time in our species: feeling chases model, model chases reality, reality's moving -- they might never catch up. We don't know. But in the long term, both feeling and reality are important.

19:09
And I want to close with two quick stories to illustrate this. 1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story. Someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf, someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps? That came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security more match the reality.

19:55
Last story: a few years ago, a friend of mine gave birth. I visit her in the hospital. It turns out, when a baby's born now, they put an RFID bracelet on the baby, a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off. I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I go home, I look it up. It basically never happens. (Laughter) But if you think about it, if you are a hospital, and you need to take a baby away from its mother, out of the room to run some tests, you better have some good security theater, or she's going to rip your arm off. (Laughter)

20:34
So it's important for us, those of us who design security, who look at security policy -- or even look at public policy in ways that affect security. It's not just reality; it's feeling and reality. What's important is that they be about the same. It's important that, if our feelings match reality, we make better security trade-offs. Thank you. (Applause)