Bill Joy: What I'm worried about, what I'm excited about

91,745 views ใƒป 2008-11-25

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translated by: Seungwoo PAEK · Reviewed by: Gyoung-tae Kim
00:18
What technology can we really apply to reducing global poverty? And what I found was quite surprising. We started looking at things like death rates in the 20th century, and how they'd been improved, and very simple things turned out. You'd think maybe antibiotics made more difference than clean water, but it's actually the opposite. And so very simple things -- off-the-shelf technologies that we could easily find on the then-early Web -- would clearly make a huge difference to that problem.

00:53
But I also, in looking at more powerful technologies and nanotechnology and genetic engineering and other new emerging kind of digital technologies, became very concerned about the potential for abuse.
01:10
If you think about it, in history, a long, long time ago we dealt with the problem of an individual abusing another individual. We came up with something -- the Ten Commandments: Thou shalt not kill. That's a, kind of a one-on-one thing. We organized into cities. We had many people. And to keep the many from tyrannizing the one, we came up with concepts like individual liberty. And then, to have to deal with large groups, say, at the nation-state level, and we had to have mutual non-aggression, or through a series of conflicts, we eventually came to a rough international bargain to largely keep the peace.

01:51
But now we have a new situation, really what people call an asymmetric situation, where technology is so powerful that it extends beyond a nation-state. It's not the nation-states that have potential access to mass destruction, but individuals.
02:11
And this is a consequence of the fact that these new technologies tend to be digital. We saw genome sequences. You can download the gene sequences of pathogens off the Internet if you want to, and clearly someone recently -- I saw in a science magazine -- they said, well, the 1918 flu is too dangerous to FedEx around. If people want to use it in their labs for working on research, just reconstruct it yourself, because, you know, it might break in FedEx. So that this is possible to do this is not deniable.

02:50
So individuals in small groups super-empowered by access to these kinds of self-replicating technologies, whether it be biological or other, are clearly a danger in our world. And the danger is that they can cause roughly what's a pandemic. And we really don't have experience with pandemics, and we're also not very good as a society at acting to things we don't have direct and sort of gut-level experience with. So it's not in our nature to pre-act. And in this case, piling on more technology doesn't solve the problem, because it only super-empowers people more.
03:29
So the solution has to be, as people like Russell and Einstein and others imagine in a conversation that existed in a much stronger form, I think, early in the 20th century, that the solution had to be not just the head but the heart. You know, public policy and moral progress.

03:47
The bargain that gives us civilization is a bargain to not use power. We get our individual rights by society protecting us from others not doing everything they can do but largely doing only what is legal. And so to limit the danger of these new things, we have to limit, ultimately, the ability of individuals to have access, essentially, to pandemic power. We also have to have sensible defense, because no limitation is going to prevent a crazy person from doing something. And you know, and the troubling thing is that it's much easier to do something bad than to defend against all possible bad things, so the offensive uses really have an asymmetric advantage.
04:28
So these are the kind of thoughts I was thinking in 1999 and 2000, and my friends told me I was getting really depressed, and they were really worried about me. And then I signed a book contract to write more gloomy thoughts about this and moved into a hotel room in New York with one room full of books on the Plague, and you know, nuclear bombs exploding in New York where I would be within the circle, and so on.

04:51
And then I was there on September 11th, and I stood in the streets with everyone. And it was quite an experience to be there. I got up the next morning and walked out of the city, and all the sanitation trucks were parked on Houston Street and ready to go down and start taking the rubble away. And I walked down the middle, up to the train station, and everything below 14th Street was closed. It was quite a compelling experience, but not really, I suppose, a surprise to someone who'd had his room full of the books. It was always a surprise that it happened then and there, but it wasn't a surprise that it happened at all.

05:26
And everyone then started writing about this. Thousands of people started writing about this. And I eventually abandoned the book, and then Chris called me to talk at the conference. I really don't talk about this anymore because, you know, there's enough frustrating and depressing things going on. But I agreed to come and say a few things about this.
05:42
And I would say that we can't give up the rule of law to fight an asymmetric threat, which is what we seem to be doing because of the present, the people that are in power, because that's to give up the thing that makes civilization.

05:59
And we can't fight the threat in the kind of stupid way we're doing, because a million-dollar act causes a billion dollars of damage, causes a trillion dollar response which is largely ineffective and arguably, probably almost certainly, has made the problem worse. So we can't fight the thing with a million-to-one cost, one-to-a-million cost-benefit ratio.
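[Editor's note: a rough sketch of the arithmetic behind that ratio, assuming the round figures in the talk are taken at face value. An attack costing about $10^{6}$ dollars provokes roughly $10^{9}$ dollars of damage and a $10^{12}$-dollar response, so the offense-to-defense spending ratio is on the order of

$$\frac{10^{6}}{10^{12}} = 10^{-6},$$

that is, each dollar spent on offense draws about a million dollars of damage and response from the defense, the "one-to-a-million cost-benefit ratio" Joy is describing.]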
06:24
So after giving up on the book -- and I had the great honor to be able to join Kleiner Perkins about a year ago, and to work through venture capital on the innovative side, and to try to find some innovations that could address what I saw as some of these big problems. Things where, you know, a factor of 10 difference can make a factor of 1,000 difference in the outcome.

06:53
I've been amazed in the last year at the incredible quality and excitement of the innovations that have come across my desk. It's overwhelming at times. I'm very thankful for Google and Wikipedia so I can understand at least a little of what people are talking about who come through the doors. But I wanted to share with you three areas that I'm particularly excited about and that relate to the problems that I was talking about in the Wired article.

07:21
The first is this whole area of education, and it really relates to what Nicholas was talking about with a $100 computer. And that is to say that there's a lot of legs left in Moore's Law.
07:31
The most advanced transistors today are at 65 nanometers, and we've seen, and I've had the pleasure to invest in, companies that give me great confidence that we'll extend Moore's Law all the way down to roughly the 10 nanometer scale. Another factor of, say, six in dimensional reduction, which should give us about another factor of 100 in raw improvement in what the chips can do. And so, to put that in practical terms, if something costs about 1,000 dollars today, say, the best personal computer you can buy, that might be its cost, I think we can have that in 2020 for 10 dollars. Okay?
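[Editor's note: a rough sketch of the scaling arithmetic, assuming the speaker's round numbers and the usual Moore's Law reading that shrinking feature size raises transistor density and lowers cost per unit of performance. Going from 65 nm to roughly 10 nm is a linear shrink of about

$$\frac{65\ \text{nm}}{10\ \text{nm}} \approx 6.5,$$

the "factor of, say, six." In two dimensions that is roughly $6.5^{2} \approx 40\times$ more transistors in the same area, and with the accompanying speed and cost gains it rounds to the "factor of 100 in raw improvement" he cites, which is what would take a machine costing about \$1,000 today to roughly \$1,000 / 100 = \$10 by 2020.]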
08:18
Now, just imagine what that $100 computer will be in 2020 as a tool for education. I think the challenge for us is -- I'm very certain that that will happen, the challenge is, will we develop the kind of educational tools and things with the net to let us take advantage of that device?

08:37
I'd argue today that we have incredibly powerful computers, but we don't have very good software for them. And it's only in retrospect, after the better software comes along, and you take it and you run it on a ten-year-old machine, you say, God, the machine was that fast? I remember when they took the Apple Mac interface and they put it back on the Apple II. The Apple II was perfectly capable of running that kind of interface, we just didn't know how to do it at the time.

09:01
So given that we know and should believe -- because Moore's Law's been, like, a constant, I mean, it's just been very predictable progress over the last 40 years or whatever. We can know what the computers are going to be like in 2020. It's great that we have initiatives to say, let's go create the education and educate people in the world, because that's a great force for peace. And we can give everyone in the world a $100 computer or a $10 computer in the next 15 years.
09:31
The second area that I'm focusing on is the environmental problem, because that's clearly going to put a lot of pressure on this world. We'll hear a lot more about that from Al Gore very shortly. The thing that we see as the kind of Moore's Law trend that's driving improvement in our ability to address the environmental problem is new materials.

09:54
We have a challenge, because the urban population is growing in this century from two billion to six billion in a very short amount of time. People are moving to the cities. They all need clean water, they need energy, they need transportation, and we want them to develop in a green way. We're reasonably efficient in the industrial sectors. We've made improvements in energy and resource efficiency, but the consumer sector, especially in America, is very inefficient. But these new materials bring such incredible innovations that there's a strong basis for hope that these things will be so profitable that they can be brought to the market. And I want to give you a specific example of a new material that was discovered 15 years ago.
10:35
If we take carbon nanotubes, you know, Iijima discovered them in 1991, they just have incredible properties. And these are the kinds of things we're going to discover as we start to engineer at the nano scale. Their strength: they're almost the strongest material, tensile strength material known. They're very, very stiff. They stretch very, very little. In two dimensions, if you make, like, a fabric out of them, they're 30 times stronger than Kevlar. And if you make a three-dimensional structure, like a buckyball, they have all sorts of incredible properties.

11:08
If you shoot a particle at them and knock a hole in them, they repair themselves; they go zip and they repair the hole in femtoseconds, which is not -- is really quick.

(Laughter)

11:20
If you shine a light on them, they produce electricity. In fact, if you flash them with a camera they catch on fire. If you put electricity on them, they emit light. If you run current through them, you can run 1,000 times more current through one of these than through a piece of metal. You can make both p- and n-type semiconductors, which means you can make transistors out of them. They conduct heat along their length but not across -- well, there is no width, but not in the other direction if you stack them up; that's a property of carbon fiber also. If you put particles in them, and they go shooting out the tip -- they're like miniature linear accelerators or electron guns.

12:00
The inside of the nanotubes is so small -- the smallest ones are 0.7 nanometers -- that it's basically a quantum world. It's a strange place inside a nanotube. And so we begin to see, and we've seen business plans already, where the kind of things Lisa Randall's talking about are in there. I had one business plan where I was trying to learn more about Witten's cosmic dimension strings to try to understand what the phenomenon was going on in this proposed nanomaterial. So inside of a nanotube, we're really at the limit here.
12:30
So what we see is with these and other new materials that we can do things with different properties -- lighter, stronger -- and apply these new materials to the environmental problems. New materials that can make water, new materials that can make fuel cells work better, new materials that catalyze chemical reactions, that cut pollution and so on. Ethanol -- new ways of making ethanol. New ways of making electric transportation. The whole green dream -- because it can be profitable.

13:04
And we've dedicated -- we've just raised a new fund, we dedicated 100 million dollars to these kinds of investments. We believe that Genentech, the Compaq, the Lotus, the Sun, the Netscape, the Amazon, the Google in these fields are yet to be found, because this materials revolution will drive these things forward.
13:24
The third area that we're working on, and we just announced last week -- we were all in New York. We raised 200 million dollars in a specialty fund to work on a pandemic in biodefense. And to give you an idea of the last fund that Kleiner raised was a $400 million fund, so this for us is a very substantial fund.

13:48
And what we did, over the last few months -- well, a few months ago, Ray Kurzweil and I wrote an op-ed in the New York Times about how publishing the 1918 genome was very dangerous. And John Doerr and Brook and others got concerned, [unclear], and we started looking around at what the world was doing about being prepared for a pandemic. And we saw a lot of gaps.

14:11
And so we asked ourselves, you know, can we find innovative things that will go fill these gaps? And Brooks told me in a break here, he said he's found so much stuff he can't sleep, because there's so many great technologies out there, we're essentially buried. And we need them, you know.

14:27
We have one antiviral that people are talking about stockpiling that still works, roughly. That's Tamiflu. But Tamiflu -- the virus is resistant. It is resistant to Tamiflu. We've discovered with AIDS we need cocktails to work well so that the viral resistance -- we need several anti-virals.
14:45
We need better surveillance. We need networks that can find out what's going on. We need rapid diagnostics so that we can tell if somebody has a strain of flu which we have only identified very recently. We've got to be able to make the rapid diagnostics quickly. We need new anti-virals and cocktails. We need new kinds of vaccines. Vaccines that are broad spectrum. Vaccines that we can manufacture quickly. Cocktails, more polyvalent vaccines. You normally get a trivalent vaccine against three possible strains. We need -- we don't know where this thing is going.

15:17
We believe that if we could fill these 10 gaps, we have a chance to help really reduce the risk of a pandemic. And the difference between a normal flu season and a pandemic is about a factor of 1,000 in deaths and certainly enormous economic impact. So we're very excited because we think we can fund 10, or speed up 10 projects and see them come to market in the next couple years that will address this.
15:46
So if we can address, use technology, help address education, help address the environment, help address the pandemic, does that solve the larger problem that I was talking about in the Wired article? And I'm afraid the answer is really no, because you can't solve a problem with the management of technology with more technology. If we let an unlimited amount of power loose, then we will -- a very small number of people will be able to abuse it. We can't fight at a million-to-one disadvantage.

16:19
So what we need to do is, we need better policy. And for example, some things we could do that would be policy solutions which are not really in the political air right now but perhaps with the change of administration would be -- use markets. Markets are a very strong force. For example, rather than trying to regulate away problems, which probably won't work, if we could price into the cost of doing business, the cost of catastrophe, so that people who are doing things that had a higher cost of catastrophe would have to take insurance against that risk. So if you wanted to put a drug on the market you could put it on. But it wouldn't have to be approved by regulators; you'd have to convince an actuary that it would be safe. And if you apply the notion of insurance more broadly, you can use a more powerful force, a market force, to provide feedback.
17:07
How could you keep the law? I think the law would be a really good thing to keep. Well, you have to hold people accountable. The law requires accountability. Today scientists, technologists, businessmen, engineers don't have any personal responsibility for the consequences of their actions. So if you tie that -- you have to tie that back with the law.

17:25
And finally, I think we have to do something that's not really -- it's almost unacceptable to say this -- which, we have to begin to design the future. We can't pick the future, but we can steer the future. Our investment in trying to prevent pandemic flu is affecting the distribution of possible outcomes. We may not be able to stop it, but the likelihood that it will get past us is lower if we focus on that problem. So we can design the future if we choose what kind of things we want to have happen and not have happen, and steer us to a lower-risk place.
17:59
Vice President Gore will talk about how we could steer the climate trajectory into a lower probability of catastrophic risk. But above all, what we have to do is we have to help the good guys, the people on the defensive side, have an advantage over the people who want to abuse things. And what we have to do to do that is we have to limit access to certain information.

18:22
And growing up as we have, and holding very high the value of free speech, this is a hard thing for us to accept -- for all of us to accept. It's especially hard for the scientists to accept who still remember, you know, Galileo essentially locked up, and who are still fighting this battle against the church. But that's the price of having a civilization. The price of retaining the rule of law is to limit the access to the great and kind of unbridled power.

18:53
Thank you.

(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7