Bruce Schneier: The security mirage

78,320 views ใƒป 2011-04-27

TED



ืžืชืจื’ื: Moshe Sayag ืžื‘ืงืจ: Sigal Tifferet

00:16 So, security is two different things: it's a feeling, and it's a reality. And they're different. You could feel secure even if you're not. And you can be secure even if you don't feel it. Really, we have two separate concepts mapped onto the same word. And what I want to do in this talk is to split them apart -- figuring out when they diverge and how they converge. And language is actually a problem here. There aren't a lot of good words for the concepts we're going to talk about.

00:49 So if you look at security from economic terms, it's a trade-off. Every time you get some security, you're always trading off something. Whether this is a personal decision -- whether you're going to install a burglar alarm in your home -- or a national decision, where you're going to invade a foreign country -- you're going to trade off something: money or time, convenience, capabilities, maybe fundamental liberties. And the question to ask when you look at a security anything is not whether this makes us safer, but whether it's worth the trade-off.

01:22 You've heard in the past several years, the world is safer because Saddam Hussein is not in power. That might be true, but it's not terribly relevant. The question is: Was it worth it? And you can make your own decision, and then you'll decide whether the invasion was worth it. That's how you think about security: in terms of the trade-off.
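
To make the trade-off framing concrete, here is a minimal sketch in Python of the comparison being described, an expected-loss calculation. Every number in it (burglary probability, average loss, alarm cost and effectiveness) is a hypothetical illustration, not a figure from the talk.

    # A minimal sketch of a security trade-off as an expected-loss comparison.
    # All numbers are hypothetical, chosen only to illustrate the reasoning.

    def expected_annual_loss(p_incident: float, loss_if_incident: float) -> float:
        """Expected yearly cost of an incident: probability times impact."""
        return p_incident * loss_if_incident

    # Assumed baseline: a 2% annual burglary risk with an average $5,000 loss.
    baseline = expected_annual_loss(0.02, 5_000)           # $100 per year

    # Assumed countermeasure: an alarm that halves the risk but costs $300 a year.
    with_alarm = expected_annual_loss(0.01, 5_000) + 300   # $350 per year

    print(f"expected cost without alarm: ${baseline:.0f}/year")
    print(f"expected cost with alarm:    ${with_alarm:.0f}/year")
    print("worth the trade-off under these assumptions:", with_alarm < baseline)

Under these made-up numbers the alarm genuinely makes you safer, yet it still fails the trade-off; raise the assumed burglary risk or lower the alarm's cost and the same arithmetic flips. That is the point: the question is never just "does it make us safer?", it is "is it worth it?".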

01:41 Now, there's often no right or wrong here. Some of us have a burglar alarm system at home and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.

02:08 Now, people have a natural intuition about these trade-offs. We make them every day. Last night in my hotel room, when I decided to double-lock the door, or you in your car when you drove here; when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often won't even notice them. They're just part of being alive; we all do it. Every species does it.

02:36 Imagine a rabbit in a field, eating grass. And the rabbit sees a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve.

02:56 So you'd think that us, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it. And I think that's a fundamentally interesting question. I'll give you the short answer. The answer is, we respond to the feeling of security and not the reality.

03:21 Now, most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African Highlands in 100,000 BC. 2010 New York, not so much.

03:56 Now, there are several biases in risk perception. A lot of good experiments in this. And you can see certain biases that come up again and again. I'll give you four. We tend to exaggerate spectacular and rare risks and downplay common risks -- so, flying versus driving. The unknown is perceived to be riskier than the familiar. One example would be: people fear kidnapping by strangers, when the data supports that kidnapping by relatives is much more common. This is for children. Third, personified risks are perceived to be greater than anonymous risks. So, Bin Laden is scarier because he has a name. And the fourth is: people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control.

05:02 There are a bunch of other of these cognitive biases that affect our risk decisions. There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works. If you hear a lot about tiger attacks, there must be a lot of tigers around. You don't hear about lion attacks, there aren't a lot of lions around. This works, until you invent newspapers, because what newspapers do is repeat again and again rare risks. I tell people: if it's in the news, don't worry about it, because by definition, news is something that almost never happens. (Laughter) When something is so common, it's no longer news. Car crashes, domestic violence -- those are the risks you worry about.
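
As an aside on the availability heuristic described above, here is a minimal sketch of the mechanism. Both count tables are invented for illustration; they are not data from the talk.

    # A minimal sketch of the availability heuristic: we judge how risky something
    # is by how easily examples come to mind, and what comes to mind is largely
    # what the news repeats. All counts below are invented for illustration.

    actual_incidents = {"car crash": 10_000, "shark attack": 10}  # hypothetical reality
    news_stories     = {"car crash": 5,      "shark attack": 50}  # hypothetical coverage

    def share(counts: dict, event: str) -> float:
        """An event's share of the total in a table of counts."""
        return counts[event] / sum(counts.values())

    for event in actual_incidents:
        felt = share(news_stories, event)      # what easily recalled coverage suggests
        real = share(actual_incidents, event)  # what the underlying frequencies say
        print(f"{event:12s} felt share: {felt:5.0%}  actual share: {real:6.1%}")

The rare, heavily covered risk ends up feeling dominant even though the common one accounts for nearly all of the incidents, which is exactly the gap the newspapers create.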

05:53 We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "One, two, three, many" is kind of right. We're really good at small numbers. One mango, two mangoes, three mangoes, 10,000 mangoes, 100,000 mangoes -- it's still more mangoes than you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with the risks that aren't very common.

06:25 And what these cognitive biases do is they act as filters between us and reality. And the result is that feeling and reality get out of whack, they get different. Now, you either have a feeling -- you feel more secure than you are, there's a false sense of security. Or the other way, and that's a false sense of insecurity. I write a lot about "security theater," which are products that make people feel secure, but don't actually do anything. There's no real word for stuff that makes us secure, but doesn't make us feel secure. Maybe it's what the CIA is supposed to do for us.

07:03 So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do for the economic incentives is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice.

07:34 Right? So what makes people notice? Well, a couple of things: understanding of the security, of the risks, the threats, the countermeasures, how they work. But if you know stuff, you're more likely to have your feelings match reality. Enough real-world examples helps. We all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater is exposed when it's obvious that it's not working properly.

08:11 OK. So what makes people not notice? Well, a poor understanding. If you don't understand the risks, you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures. Also, feelings that cloud the issues -- the cognitive biases I talked about earlier: fears, folk beliefs -- basically, an inadequate model of reality.

09:03 So let me complicate things. I have feeling and reality. I want to add a third element. I want to add "model." Feeling and model are in our head, reality is the outside world; it doesn't change, it's real. Feeling is based on our intuition, model is based on reason. That's basically the difference. In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face. There's no feeling about germs. You need a model to understand them. This model is an intelligent representation of reality. It's, of course, limited by science, by technology. We couldn't have a germ theory of disease before we invented the microscope to see them. It's limited by our cognitive biases. But it has the ability to override our feelings.

10:11 Where do we get these models? We get them from others. We get them from religion, from culture, teachers, elders. A couple years ago, I was in South Africa on safari. The tracker I was with grew up in Kruger National Park. He had some very complex models of how to survive. And it depended on if you were attacked by a lion, leopard, rhino, or elephant -- and when you had to run away, when you couldn't run away, when you had to climb a tree, when you could never climb a tree. I would have died in a day. But he was born there, and he understood how to survive. I was born in New York City. I could have taken him to New York, and he would have died in a day. (Laughter) Because we had different models based on our different experiences.

10:58 Models can come from the media, from our elected officials ... Think of models of terrorism, child kidnapping, airline safety, car safety. Models can come from industry. The two I'm following are surveillance cameras and ID cards; quite a lot of our computer security models come from there. A lot of models come from science. Health models are a great example. Think of cancer, bird flu, swine flu, SARS. All of our feelings of security about those diseases come from models given to us, really, by science filtered through the media.

11:41 So models can change. Models are not static. As we become more comfortable in our environments, our model can move closer to our feelings. So an example might be, if you go back 100 years ago, when electricity was first becoming common, there were a lot of fears about it. There were people who were afraid to push doorbells, because there was electricity in there, and that was dangerous. For us, we're very facile around electricity. We change light bulbs without even thinking about it. Our model of security around electricity is something we were born into. It hasn't changed as we were growing up. And we're good at it.

12:27 Or think of the risks on the Internet across generations -- how your parents approach Internet security, versus how you do, versus how our kids will. Models eventually fade into the background. "Intuitive" is just another word for familiar. So as your model is close to reality and it converges with feelings, you often don't even know it's there.

12:53 A nice example of this came from last year and swine flu. When swine flu first appeared, the initial news caused a lot of overreaction. Now, it had a name, which made it scarier than the regular flu, even though it was more deadly. And people thought doctors should be able to deal with it. So there was that feeling of lack of control. And those two things made the risk more than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear. By autumn, people thought the doctors should have solved this already. And there's kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it.

13:58 And it's a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.

14:13 I'm going to give one more complication. We have feeling, model, reality. I have a very relativistic view of security. I think it depends on the observer. And most security decisions have a variety of people involved. And stakeholders with specific trade-offs will try to influence the decision. And I call that their agenda. And you see agenda -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like. This is not uncommon.

14:57 An example, a great example, is the risk of smoking. In the history of the past 50 years, the smoking risk shows how a model changes, and it also shows how an industry fights against a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate, probably about 30 years behind. All examples of models changing.

15:36 What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model. And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So evidence against our model, we're likely to ignore, even if it's compelling. It has to get very compelling before we'll pay attention.

16:08 New models that extend long periods of time are hard. Global warming is a great example. We're terrible at models that span 80 years. We can do "to the next harvest." We can often do "until our kids grow up." But "80 years," we're just not good at. So it's a very hard model to accept. We can have both models in our head simultaneously -- that kind of problem where we're holding both beliefs together, the cognitive dissonance. Eventually, the new model will replace the old model.

16:44 Strong feelings can create a model. September 11 created a security model in a lot of people's heads. Also, personal experiences with crime can do it, personal health scare, a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive.

17:09 So in the technological world, we don't have experience to judge models. And we rely on others. We rely on proxies. And this works, as long as it's the correct others. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, none of us fear the roof is going to collapse on us, not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept pretty much by faith. And that's OK.

17:57 Now, what we want is people to get familiar enough with better models, have it reflected in their feelings, to allow them to make security trade-offs. When these go out of whack, you have two options. One, you can fix people's feelings, directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model.

18:26 Change happens slowly. The smoking debate took 40 years -- and that was an easy one. Some of this stuff is hard. Really, though, information seems like our best hope. And I lied. Remember I said feeling, model, reality; reality doesn't change? It actually does. We live in a technological world; reality changes all the time. So we might have, for the first time in our species: feeling chases model, model chases reality, reality's moving -- they might never catch up. We don't know.

19:05 But in the long term, both feeling and reality are important. And I want to close with two quick stories to illustrate this. 1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story. Someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf, someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps? That came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security more match the reality.

19:55 Last story: a few years ago, a friend of mine gave birth. I visit her in the hospital. It turns out, when a baby's born now, they put an RFID bracelet on the baby, a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off. I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I go home, I look it up. It basically never happens. (Laughter) But if you think about it, if you are a hospital, and you need to take a baby away from its mother, out of the room to run some tests, you better have some good security theater, or she's going to rip your arm off. (Laughter)

20:34 So it's important for us, those of us who design security, who look at security policy -- or even look at public policy in ways that affect security. It's not just reality; it's feeling and reality. What's important is that they be about the same. It's important that, if our feelings match reality, we make better security trade-offs. Thank you. (Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7