Bruce Schneier: The security mirage

78,320 views ・ 2011-04-27

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translation: Woo Hwang · Review: Young-ho Park
λ³΄μ•ˆμ—λŠ” 두가지가 μžˆμŠ΅λ‹ˆλ‹€:
00:16
So, security is two different things:
0
16010
2276
즉, λŠλ‚Œκ³Ό ν˜„μ‹€μ΄μ£ .
00:18
it's a feeling, and it's a reality.
1
18310
2526
이 λ‘κ°€μ§€λŠ” 맀우 λ‹€λ¦…λ‹ˆλ‹€.
00:20
And they're different.
2
20860
1425
μš°λ¦¬λŠ” 사싀은 μ•ˆμ „ν•˜μ§€ μ•Šμ„ λ•Œλ„
00:22
You could feel secure even if you're not.
3
22309
3427
μ•ˆμ „ν•˜λ‹€κ³  λŠλ‚„ 수 있죠.
00:25
And you can be secure
4
25760
1976
κ·Έ 반면, μ•ˆμ „κ°μ„ λͺ»λŠλΌλ”라도
00:27
even if you don't feel it.
5
27760
1850
μ‹€μ œλŠ” μ•ˆμ „ν•  수 μžˆμŠ΅λ‹ˆλ‹€.
00:29
Really, we have two separate concepts
6
29634
2117
사싀 λ³΄μ•ˆμ΄λΌλŠ” λ‹¨μ–΄μ—λŠ”
00:31
mapped onto the same word.
7
31775
1652
λ‘κ°œμ˜ λ‹€λ₯Έ κ°œλ…μ΄ 담겨 μžˆμ§€μš”.
00:33
And what I want to do in this talk is to split them apart --
8
33960
3626
μ €λŠ” 이 ν† ν¬μ—μ„œ
이 단어λ₯Ό λ‘˜λ‘œ λΆ„λ¦¬ν•΄μ„œ
00:37
figuring out when they diverge and how they converge.
9
37610
3610
κ·Έλ“€μ˜ μ˜λ―Έκ°€ μ–΄λ–»κ²Œ λ‹€λ₯΄λ©°,
μ–΄λ–€λ•Œ κ·Έλ“€μ˜ μ˜λ―Έκ°€ κ°™μ•„ μ§€λŠ”μ§€ λ§μ”€λ“œλ¦¬κ² μŠ΅λ‹ˆλ‹€.
00:41
And language is actually a problem here.
10
41711
2275
사싀 이건 μ–Έμ–΄ λ¬Έμ œμž…λ‹ˆλ‹€.
μš°λ¦¬κ°€ 이야기 ν•˜λ €λŠ” κ°œλ…μ— λ”± λ§žλŠ”
00:44
There aren't a lot of good words
11
44010
2076
쒋은 단어듀은 그리 λ§Žμ§€ μ•ŠμŠ΅λ‹ˆλ‹€.
00:46
for the concepts we're going to talk about.
12
46110
2061
경제적 μš©μ–΄λ‘œ λ§ν•˜μžλ©΄
00:49
So if you look at security from economic terms,
13
49295
4120
λ³΄μ•ˆμ€ μ›ν•˜λŠ” λ³΄μ•ˆ μˆ˜μ€€κ³Ό
그것을 κ°€μ§€λŠ”λ° ν•„μš”ν•œ κ²½λΉ„λ₯Ό λ§ν•©λ‹ˆλ‹€.
00:53
it's a trade-off.
14
53439
1647
μΌμ •ν•œ μˆ˜μ€€μ˜ λ³΄μ•ˆμ„ 가지렀면
00:55
Every time you get some security, you're always trading off something.
15
55110
4132
그것에 λŒ€ν•œ μ–΄λ–€ λŒ€κ°€λ₯Ό μΉ˜λ€„μ•Ό ν•˜μ§€μš”.
λ„λ‚œκ²½λ³΄κΈ°λ₯Ό μ„€μΉ˜ν•˜λŠλƒ μ•ˆν•˜λŠλƒμ™€ 같은
00:59
Whether this is a personal decision --
16
59266
1845
개인적인 결정이건,
01:01
whether you're going to install a burglar alarm in your home --
17
61135
3012
외ꡭ을 μ³λ“€μ–΄κ°ˆ 것인가 같은 ꡭ가적 결정이건,
01:04
or a national decision,
18
64171
1157
01:05
where you're going to invade a foreign country --
19
65352
2310
μš°λ¦¬λŠ” λˆμ΄λ‚˜ μ‹œκ°„, νŽΈλ¦¬μ„±,
01:07
you're going to trade off something: money or time, convenience, capabilities,
20
67686
3782
ꡰ사λ ₯, 그리고 μ‹¬μ§€μ–΄λŠ” 근본적인 자유 같은 것을 μœ„ν•΄
μ–΄λ–€ λŒ€κ°€λ₯Ό μΉ˜λ£Ήλ‹ˆλ‹€.
01:11
maybe fundamental liberties.
21
71492
2002
01:13
And the question to ask when you look at a security anything
22
73518
3274
κ·Έλ ‡κΈ° λ•Œλ¬Έμ— μš°λ¦¬κ°€ μ–΄λ–€ λ³΄μ•ˆμ±…μ„ μ›ν• λ•ŒλŠ”
01:16
is not whether this makes us safer,
23
76816
3382
그것이 우리λ₯Ό 더 μ•ˆμ „ν•˜κ²Œ λ§Œλ“œλŠλƒ λ³΄λ‹€λŠ”
κ·Έ λŒ€κ°€λ₯Ό μ§€λΆˆν•  κ°€μΉ˜κ°€ μžˆλŠ”κ°€λ₯Ό 생각해야 ν•©λ‹ˆλ‹€.
01:20
but whether it's worth the trade-off.
24
80222
2215
01:22
You've heard in the past several years, the world is safer
25
82461
3229
μš°λ¦¬λŠ” 사담 후세인이 더 이상 μ§‘κΆŒν•˜μ§€ μ•ŠκΈ° λ•Œλ¬Έμ—
세계가 더 μ•ˆμ „ν•΄ μ‘Œλ‹€λŠ” 말을 μ§€λ‚œ λͺ‡ν•΄λ™μ•ˆ λ“£κ³  μžˆμ§€μš”.
01:25
because Saddam Hussein is not in power.
26
85714
1890
그건 κ·ΈλŸ΄μ§€λ„ λͺ¨λ₯΄μ§€λ§Œ 사싀 그것은 λ³„λ‘œ λ¬΄κ΄€ν•œ λ§μž…λ‹ˆλ‹€.
01:27
That might be true, but it's not terribly relevant.
27
87628
2603
μš°λ¦¬κ°€ λ¬Όμ–΄μ•Όν•  μ§ˆλ¬Έμ€ μš°λ¦¬κ°€ μ§€λΆˆν•œ κ°€μΉ˜κ°€ μžˆμ—ˆλƒλŠ” 것이죠.
01:30
The question is: Was it worth it?
28
90255
2831
그리고 λ‚œ ν›„ 이락을 μΉ¨λž΅ν•  κ°€μΉ˜κ°€ μžˆμ—ˆλŠ”μ§€λ₯Ό
01:33
And you can make your own decision,
29
93110
2459
01:35
and then you'll decide whether the invasion was worth it.
30
95593
2733
우리 μŠ€μŠ€λ‘œκ°€ νŒλ‹¨μ„ λ‚΄λ €μ•Ό ν•©λ‹ˆλ‹€.
λ³΄μ•ˆλ„ κ·Έλ ‡κ²Œ
01:38
That's how you think about security: in terms of the trade-off.
31
98350
3561
λΉ„μš©νŽΈμ΅ μΈ‘λ©΄μ—μ„œ 생각해야 ν•©λ‹ˆλ‹€.
01:41 Now, there's often no right or wrong here. Some of us have a burglar alarm system at home and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.

02:08 Now, people have a natural intuition about these trade-offs. We make them every day. Last night in my hotel room, when I decided to double-lock the door, or you in your car when you drove here; when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often won't even notice them. They're just part of being alive; we all do it. Every species does it.

02:36 Imagine a rabbit in a field, eating grass. And the rabbit sees a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve.

02:56 So you'd think that us, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it. And I think that's a fundamentally interesting question. I'll give you the short answer. The answer is, we respond to the feeling of security and not the reality.

03:21 Now, most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African Highlands in 100,000 BC. 2010 New York, not so much.

03:56 Now, there are several biases in risk perception. A lot of good experiments in this. And you can see certain biases that come up again and again. I'll give you four. We tend to exaggerate spectacular and rare risks and downplay common risks -- so, flying versus driving. The unknown is perceived to be riskier than the familiar. One example would be: people fear kidnapping by strangers, when the data supports that kidnapping by relatives is much more common. This is for children. Third, personified risks are perceived to be greater than anonymous risks. So, Bin Laden is scarier because he has a name. And the fourth is: people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control.

05:02 There are a bunch of other of these cognitive biases, that affect our risk decisions. There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works. If you hear a lot about tiger attacks, there must be a lot of tigers around. You don't hear about lion attacks, there aren't a lot of lions around. This works, until you invent newspapers, because what newspapers do is repeat again and again rare risks. I tell people: if it's in the news, don't worry about it, because by definition, news is something that almost never happens.

05:44 (Laughter)

05:45 When something is so common, it's no longer news. Car crashes, domestic violence -- those are the risks you worry about.
05:53 We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "One, two, three, many" is kind of right. We're really good at small numbers. One mango, two mangoes, three mangoes, 10,000 mangoes, 100,000 mangoes -- it's still more mangoes you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with the risks that aren't very common.
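As a quick worked example of the "one in a million, one in a billion" point (my arithmetic, assuming an 80-year lifetime of one exposure per day):

```python
# How different two "almost never" odds really are over a lifetime of daily trials.
lifetime_days = 80 * 365  # 29,200 days

for p in (1e-6, 1e-9):
    expected = p * lifetime_days
    print(f"daily risk {p:g}: ~{expected:.5f} expected events in a lifetime")
# 1-in-a-million: ~0.03 events; 1-in-a-billion: ~0.00003 events.
# A factor of 1000 apart, yet both register as "never" to intuition.
```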
06:25 And what these cognitive biases do is they act as filters between us and reality. And the result is that feeling and reality get out of whack, they get different. Now, you either have a feeling -- you feel more secure than you are, there's a false sense of security. Or the other way, and that's a false sense of insecurity.

06:47 I write a lot about "security theater," which are products that make people feel secure, but don't actually do anything. There's no real word for stuff that makes us secure, but doesn't make us feel secure. Maybe it's what the CIA is supposed to do for us.

07:03 So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do for the economic incentives is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice. Right?

07:35 So what makes people notice? Well, a couple of things: understanding of the security, of the risks, the threats, the countermeasures, how they work. But if you know stuff, you're more likely to have your feelings match reality. Enough real-world examples helps. We all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater is exposed when it's obvious that it's not working properly.

08:11 OK. So what makes people not notice? Well, a poor understanding. If you don't understand the risks, you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures.

08:51 Also, feelings that cloud the issues -- the cognitive biases I talked about earlier: fears, folk beliefs -- basically, an inadequate model of reality.

09:03 So let me complicate things. I have feeling and reality. I want to add a third element. I want to add "model." Feeling and model are in our head, reality is the outside world; it doesn't change, it's real. Feeling is based on our intuition, model is based on reason. That's basically the difference.

09:24 In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face. There's no feeling about germs. You need a model to understand them. This model is an intelligent representation of reality. It's, of course, limited by science, by technology. We couldn't have a germ theory of disease before we invented the microscope to see them. It's limited by our cognitive biases. But it has the ability to override our feelings.

10:11 Where do we get these models? We get them from others. We get them from religion, from culture, teachers, elders. A couple years ago, I was in South Africa on safari. The tracker I was with grew up in Kruger National Park. He had some very complex models of how to survive. And it depended on if you were attacked by a lion, leopard, rhino, or elephant -- and when you had to run away, when you couldn't run away, when you had to climb a tree, when you could never climb a tree. I would have died in a day. But he was born there, and he understood how to survive. I was born in New York City. I could have taken him to New York, and he would have died in a day.

10:51 (Laughter)

10:52 Because we had different models based on our different experiences.

10:58 Models can come from the media, from our elected officials ... Think of models of terrorism, child kidnapping, airline safety, car safety. Models can come from industry. The two I'm following are surveillance cameras, ID cards, quite a lot of our computer security models come from there. A lot of models come from science. Health models are a great example. Think of cancer, bird flu, swine flu, SARS. All of our feelings of security about those diseases come from models given to us, really, by science filtered through the media.

11:41 So models can change. Models are not static. As we become more comfortable in our environments, our model can move closer to our feelings. So an example might be, if you go back 100 years ago, when electricity was first becoming common, there were a lot of fears about it. There were people who were afraid to push doorbells, because there was electricity in there, and that was dangerous. For us, we're very facile around electricity. We change light bulbs without even thinking about it. Our model of security around electricity is something we were born into. It hasn't changed as we were growing up. And we're good at it.

12:27 Or think of the risks on the Internet across generations -- how your parents approach Internet security, versus how you do, versus how our kids will. Models eventually fade into the background. "Intuitive" is just another word for familiar. So as your model is close to reality and it converges with feelings, you often don't even know it's there.

12:53 A nice example of this came from last year and swine flu. When swine flu first appeared, the initial news caused a lot of overreaction. Now, it had a name, which made it scarier than the regular flu, even though it was more deadly. And people thought doctors should be able to deal with it. So there was that feeling of lack of control. And those two things made the risk more than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear. By autumn, people thought the doctors should have solved this already. And there's kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it.

13:58 And it's a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.

14:13 I'm going to give one more complication. We have feeling, model, reality. I have a very relativistic view of security. I think it depends on the observer. And most security decisions have a variety of people involved. And stakeholders with specific trade-offs will try to influence the decision. And I call that their agenda. And you see agenda -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like. This is not uncommon.

14:57 An example, a great example, is the risk of smoking. In the history of the past 50 years, the smoking risk shows how a model changes, and it also shows how an industry fights against a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate, probably about 30 years behind. All examples of models changing.

15:36 What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model. And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So evidence against our model, we're likely to ignore, even if it's compelling. It has to get very compelling before we'll pay attention.

16:08 New models that extend long periods of time are hard. Global warming is a great example. We're terrible at models that span 80 years. We can do "to the next harvest." We can often do "until our kids grow up." But "80 years," we're just not good at. So it's a very hard model to accept. We can have both models in our head simultaneously -- that kind of problem where we're holding both beliefs together, the cognitive dissonance. Eventually, the new model will replace the old model.

16:44 Strong feelings can create a model. September 11 created a security model in a lot of people's heads. Also, personal experiences with crime can do it, personal health scare, a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive.

17:09 So in the technological world, we don't have experience to judge models. And we rely on others. We rely on proxies. And this works, as long as it's the correct others. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, none of us fear the roof is going to collapse on us, not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept pretty much by faith. And that's OK.

17:57 Now, what we want is people to get familiar enough with better models, have it reflected in their feelings, to allow them to make security trade-offs. When these go out of whack, you have two options. One, you can fix people's feelings, directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model.

18:26 Change happens slowly. The smoking debate took 40 years -- and that was an easy one. Some of this stuff is hard. Really, though, information seems like our best hope.

18:41 And I lied. Remember I said feeling, model, reality; reality doesn't change? It actually does. We live in a technological world; reality changes all the time. So we might have, for the first time in our species: feeling chases model, model chases reality, reality's moving -- they might never catch up. We don't know.

19:05 But in the long term, both feeling and reality are important. And I want to close with two quick stories to illustrate this.

19:12 1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story. Someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf, someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps? That came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security more match the reality.

19:55 Last story: a few years ago, a friend of mine gave birth. I visit her in the hospital. It turns out, when a baby's born now, they put an RFID bracelet on the baby, a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off.
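A minimal sketch of the matching logic such a system might use (a hypothetical design assumed for illustration; the talk does not describe the hospital's actual implementation):

```python
# Alarm if a baby's tag crosses the ward exit without its paired mother's tag.
PAIRED_TAGS = {"baby-001": "mother-001"}  # hypothetical RFID pairings

def should_alarm(tags_at_exit: set) -> bool:
    """True if any baby tag is leaving without its matched mother tag."""
    for baby_tag, mother_tag in PAIRED_TAGS.items():
        if baby_tag in tags_at_exit and mother_tag not in tags_at_exit:
            return True
    return False

print(should_alarm({"baby-001"}))                 # True: alarm sounds
print(should_alarm({"baby-001", "mother-001"}))   # False: mother is present
```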
20:10 I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I go home, I look it up. It basically never happens.

20:18 (Laughter)

20:20 But if you think about it, if you are a hospital, and you need to take a baby away from its mother, out of the room to run some tests, you better have some good security theater, or she's going to rip your arm off.

20:31 (Laughter)

20:34 So it's important for us, those of us who design security, who look at security policy -- or even look at public policy in ways that affect security. It's not just reality; it's feeling and reality. What's important is that they be about the same. It's important that, if our feelings match reality, we make better security trade-offs.

20:56 Thank you.

20:57 (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7