Bill Joy: What I'm worried about, what I'm excited about

91,745 views · 2008-11-25

TED

Translator: Shlomo Adam · Reviewer: Gad Amit
00:18
What technology can we really apply to reducing global poverty? And what I found was quite surprising. We started looking at things like death rates in the 20th century, and how they'd been improved, and very simple things turned out. You'd think maybe antibiotics made more difference than clean water, but it's actually the opposite. And so very simple things -- off-the-shelf technologies that we could easily find on the then-early Web -- would clearly make a huge difference to that problem.
00:53
But I also, in looking at more powerful technologies and nanotechnology and genetic engineering and other new emerging kind of digital technologies, became very concerned about the potential for abuse.
01:10
If you think about it, in history, a long, long time ago we dealt with the problem of an individual abusing another individual. We came up with something -- the Ten Commandments: Thou shalt not kill. That's a kind of one-on-one thing. We organized into cities. We had many people. And to keep the many from tyrannizing the one, we came up with concepts like individual liberty. And then, to have to deal with large groups, say, at the nation-state level, we had to have mutual non-aggression, or through a series of conflicts, we eventually came to a rough international bargain to largely keep the peace.
01:51
But now we have a new situation, really what people call an asymmetric situation, where technology is so powerful that it extends beyond a nation-state. It's not the nation-states that have potential access to mass destruction, but individuals. And this is a consequence of the fact that these new technologies tend to be digital.
02:16
We saw genome sequences. You can download the gene sequences of pathogens off the Internet if you want to, and clearly someone recently -- I saw in a science magazine -- they said, well, the 1918 flu is too dangerous to FedEx around. If people want to use it in their labs for working on research, just reconstruct it yourself, because, you know, it might break in FedEx. So that it is possible to do this is not deniable.
02:50
So individuals in small groups super-empowered by access to these kinds of self-replicating technologies, whether it be biological or other, are clearly a danger in our world. And the danger is that they can cause roughly what's a pandemic. And we really don't have experience with pandemics, and we're also not very good as a society at acting on things we don't have direct and sort of gut-level experience with. So it's not in our nature to pre-act.
03:21
And in this case, piling on more technology doesn't solve the problem, because it only super-empowers people more. So the solution has to be, as people like Russell and Einstein and others imagined in a conversation that existed in a much stronger form, I think, early in the 20th century, that the solution had to be not just the head but the heart. You know, public policy and moral progress.
03:47
The bargain that gives us civilization is a bargain to not use power. We get our individual rights by society protecting us from others not doing everything they can do but largely doing only what is legal. And so to limit the danger of these new things, we have to limit, ultimately, the ability of individuals to have access, essentially, to pandemic power.
04:11
We also have to have sensible defense, because no limitation is going to prevent a crazy person from doing something. And you know, the troubling thing is that it's much easier to do something bad than to defend against all possible bad things, so the offensive uses really have an asymmetric advantage.
04:28
So these are the kind of thoughts I was thinking in 1999 and 2000, and my friends told me I was getting really depressed, and they were really worried about me. And then I signed a book contract to write more gloomy thoughts about this and moved into a hotel room in New York with one room full of books on the Plague, and you know, nuclear bombs exploding in New York where I would be within the circle, and so on.
04:51
And then I was there on September 11th, and I stood in the streets with everyone. And it was quite an experience to be there. I got up the next morning and walked out of the city, and all the sanitation trucks were parked on Houston Street and ready to go down and start taking the rubble away. And I walked down the middle, up to the train station, and everything below 14th Street was closed. It was quite a compelling experience, but not really, I suppose, a surprise to someone who'd had his room full of the books. It was always a surprise that it happened then and there, but it wasn't a surprise that it happened at all.
05:26
And everyone then started writing about this. Thousands of people started writing about this. And I eventually abandoned the book, and then Chris called me to talk at the conference. I really don't talk about this anymore because, you know, there's enough frustrating and depressing things going on. But I agreed to come and say a few things about this.
05:42
And I would say that we can't give up the rule of law to fight an asymmetric threat, which is what we seem to be doing because of the present, the people that are in power, because that's to give up the thing that makes civilization. And we can't fight the threat in the kind of stupid way we're doing, because a million-dollar act causes a billion dollars of damage, causes a trillion dollar response which is largely ineffective and arguably, probably almost certainly, has made the problem worse. So we can't fight the thing with a million-to-one cost, one-to-a-million cost-benefit ratio.
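The million-to-one figure follows directly from the dollar amounts Joy quotes; a trivial sketch, using the talk's illustrative numbers (they are rhetorical figures, not data):

```python
# The asymmetry, in the talk's own illustrative numbers.
attack_cost = 1_000_000            # "a million-dollar act"
damage = 1_000_000_000             # "a billion dollars of damage"
response = 1_000_000_000_000       # "a trillion dollar response"

# Defender spends a million dollars for every attacker dollar.
print(f"response / attack cost: {response // attack_cost:,}")   # 1,000,000
# And each attacker dollar causes a thousand dollars of damage.
print(f"damage / attack cost:   {damage // attack_cost:,}")     # 1,000
```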
06:24
So after giving up on the book -- and I had the great honor to be able to join Kleiner Perkins about a year ago, and to work through venture capital on the innovative side, and to try to find some innovations that could address what I saw as some of these big problems. Things where, you know, a factor of 10 difference can make a factor of 1,000 difference in the outcome.
06:53
I've been amazed in the last year at the incredible quality and excitement of the innovations that have come across my desk. It's overwhelming at times. I'm very thankful for Google and Wikipedia, so I can understand at least a little of what people are talking about who come through the doors.
07:10
But I wanted to share with you three areas that I'm particularly excited about and that relate to the problems that I was talking about in the Wired article.
07:21
The first is this whole area of education, and it really relates to what Nicholas was talking about with a $100 computer. And that is to say that there's a lot of legs left in Moore's Law. The most advanced transistors today are at 65 nanometers, and we've seen, and I've had the pleasure to invest in, companies that give me great confidence that we'll extend Moore's Law all the way down to roughly the 10 nanometer scale. Another factor of, say, six in dimensional reduction, which should give us about another factor of 100 in raw improvement in what the chips can do. And so, to put that in practical terms, if something costs about 1,000 dollars today, say, the best personal computer you can buy, that might be its cost, I think we can have that in 2020 for 10 dollars. Okay?
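The scaling arithmetic here can be checked back-of-the-envelope. A minimal sketch under my own assumptions (transistor density scaling with the inverse square of feature size, and a start year of roughly 2007 for the talk; the 65 nm, 10 nm, and $1,000-to-$10 figures are Joy's):

```python
import math

# Figures from the talk; the scaling assumptions are mine.
feature_today_nm = 65        # "most advanced transistors today"
feature_future_nm = 10       # "roughly the 10 nanometer scale"

linear = feature_today_nm / feature_future_nm   # ~6.5: the "factor of, say, six"
density = linear ** 2                            # ~42x more transistors per unit area

# $1,000 -> $10 is a factor of 100 in cost for the same capability.
doublings = math.log2(1000 / 10)                 # ~6.6 halvings of cost
years_per_doubling = (2020 - 2007) / doublings   # ~2 years per halving

print(round(linear, 1), round(density), round(years_per_doubling, 1))
```

So the quoted factor of 100 in price works out to cost halving about every two years through 2020, i.e. the ordinary Moore's Law pace simply continuing.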
08:18
Now, just imagine what that $100 computer will be in 2020 as a tool for education. I think the challenge for us is -- I'm very certain that that will happen, the challenge is, will we develop the kind of educational tools and things with the net to let us take advantage of that device?
08:37
I'd argue today that we have incredibly powerful computers, but we don't have very good software for them. And it's only in retrospect, after the better software comes along, and you take it and you run it on a ten-year-old machine, you say, God, the machine was that fast? I remember when they took the Apple Mac interface and they put it back on the Apple II. The Apple II was perfectly capable of running that kind of interface, we just didn't know how to do it at the time.
09:01
So given that we know and should believe -- because Moore's Law's been, like, a constant, I mean, it's just been very predictable progress over the last 40 years or whatever -- we can know what the computers are going to be like in 2020. It's great that we have initiatives to say, let's go create the education and educate people in the world, because that's a great force for peace. And we can give everyone in the world a $100 computer or a $10 computer in the next 15 years.
09:31
The second area that I'm focusing on is the environmental problem, because that's clearly going to put a lot of pressure on this world. We'll hear a lot more about that from Al Gore very shortly. The thing that we see as the kind of Moore's Law trend that's driving improvement in our ability to address the environmental problem is new materials.
09:54
We have a challenge, because the urban population is growing in this century from two billion to six billion in a very short amount of time. People are moving to the cities. They all need clean water, they need energy, they need transportation, and we want them to develop in a green way. We're reasonably efficient in the industrial sectors. We've made improvements in energy and resource efficiency, but the consumer sector, especially in America, is very inefficient. But these new materials bring such incredible innovations that there's a strong basis for hope that these things will be so profitable that they can be brought to the market.
10:29
And I want to give you a specific example of a new material that was discovered 15 years ago. If we take carbon nanotubes -- you know, Iijima discovered them in 1991 -- they just have incredible properties. And these are the kinds of things we're going to discover as we start to engineer at the nano scale. Their strength: they're almost the strongest tensile-strength material known. They're very, very stiff. They stretch very, very little. In two dimensions, if you make, like, a fabric out of them, they're 30 times stronger than Kevlar.
11:03
And if you make a three-dimensional structure, like a buckyball, they have all sorts of incredible properties. If you shoot a particle at them and knock a hole in them, they repair themselves; they go zip and they repair the hole in femtoseconds, which is not -- is really quick.

(Laughter)

11:20
If you shine a light on them, they produce electricity. In fact, if you flash them with a camera they catch on fire. If you put electricity on them, they emit light. If you run current through them, you can run 1,000 times more current through one of these than through a piece of metal. You can make both p- and n-type semiconductors, which means you can make transistors out of them. They conduct heat along their length but not across -- well, there is no width, but not in the other direction if you stack them up; that's a property of carbon fiber also.
11:54
If you put particles in them, and they go shooting out the tip -- they're like miniature linear accelerators or electron guns. The inside of the nanotubes is so small -- the smallest ones are 0.7 nanometers -- that it's basically a quantum world. It's a strange place inside a nanotube.
12:10
And so we begin to see, and we've seen business plans already, where the kind of things Lisa Randall's talking about are in there. I had one business plan where I was trying to learn more about Witten's cosmic dimension strings to try to understand what phenomenon was going on in this proposed nanomaterial. So inside of a nanotube, we're really at the limit here.
12:30
So what we see is that with these and other new materials we can do things with different properties -- lighter, stronger -- and apply these new materials to the environmental problems. New materials that can make water, new materials that can make fuel cells work better, new materials that catalyze chemical reactions, that cut pollution and so on. Ethanol -- new ways of making ethanol. New ways of making electric transportation. The whole green dream -- because it can be profitable.
13:04
And we've dedicated -- we've just raised a new fund, we dedicated 100 million dollars to these kinds of investments. We believe that Genentech, the Compaq, the Lotus, the Sun, the Netscape, the Amazon, the Google in these fields are yet to be found, because this materials revolution will drive these things forward.
13:24
The third area that we're working on, which we just announced last week -- we were all in New York. We raised 200 million dollars in a specialty fund to work on a pandemic in biodefense. And to give you an idea, the last fund that Kleiner raised was a $400 million fund, so this for us is a very substantial fund.

And what we did, over the last few months -- well, a few months ago, Ray Kurzweil and I wrote an op-ed in the New York Times about how publishing the 1918 genome was very dangerous. And John Doerr and Brook and others got concerned, [unclear], and we started looking around at what the world was doing about being prepared for a pandemic. And we saw a lot of gaps. And so we asked ourselves, you know, can we find innovative things that will go fill these gaps? And Brooks told me in a break here, he said he's found so much stuff he can't sleep, because there's so many great technologies out there, we're essentially buried. And we need them, you know.
14:27
We have one antiviral that people are talking about stockpiling that still works, roughly. That's Tamiflu. But Tamiflu -- the virus is resistant. It is resistant to Tamiflu. We've discovered with AIDS that we need cocktails to work well against viral resistance -- we need several anti-virals. We need better surveillance. We need networks that can find out what's going on. We need rapid diagnostics so that we can tell if somebody has a strain of flu which we have only identified very recently. We've got to be able to make the rapid diagnostics quickly. We need new anti-virals and cocktails. We need new kinds of vaccines. Vaccines that are broad spectrum. Vaccines that we can manufacture quickly. Cocktails, more polyvalent vaccines. You normally get a trivalent vaccine against three possible strains. We need -- we don't know where this thing is going.

We believe that if we could fill these 10 gaps, we have a chance to help really reduce the risk of a pandemic. And the difference between a normal flu season and a pandemic is about a factor of 1,000 in deaths, and certainly enormous economic impact. So we're very excited, because we think we can fund 10, or speed up, 10 projects and see them come to market in the next couple years that will address this.
15:46
So if we can use technology to help address education, help address the environment, help address the pandemic, does that solve the larger problem that I was talking about in the Wired article? And I'm afraid the answer is really no, because you can't solve a problem with the management of technology with more technology. If we let an unlimited amount of power loose, then a very small number of people will be able to abuse it. We can't fight at a million-to-one disadvantage.
16:19
So what we need to do is, we need better policy. For example, some things we could do that would be policy solutions, which are not really in the political air right now but perhaps would be with the change of administration: use markets. Markets are a very strong force. For example, rather than trying to regulate away problems, which probably won't work, we could price the cost of catastrophe into the cost of doing business, so that people who are doing things that had a higher cost of catastrophe would have to take insurance against that risk. So if you wanted to put a drug on the market you could put it on, but it wouldn't have to be approved by regulators; you'd have to convince an actuary that it would be safe. And if you apply the notion of insurance more broadly, you can use a more powerful force, a market force, to provide feedback.
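The insurance mechanism described above can be sketched as a small calculation: an actuarially fair premium is the expected loss (probability of catastrophe times its cost), plus a loading for the insurer's overhead. This is only an illustration of the idea, not anything from the talk; the function name, the loading factor, and all probabilities and costs are invented.

```python
def fair_premium(p_catastrophe: float, cost_of_catastrophe: float,
                 loading: float = 0.2) -> float:
    """Premium = expected loss, scaled up by a proportional loading factor.

    All parameters are hypothetical illustration values, not real actuarial data.
    """
    expected_loss = p_catastrophe * cost_of_catastrophe
    return expected_loss * (1.0 + loading)

# Two hypothetical ventures with the same potential damage but different
# catastrophe probabilities: the riskier one pays a far higher premium,
# so the market itself discourages it without a regulator's ban.
low_risk = fair_premium(p_catastrophe=1e-6, cost_of_catastrophe=1e9)
high_risk = fair_premium(p_catastrophe=1e-3, cost_of_catastrophe=1e9)
```

The point of the sketch is that the price signal scales with the risk, which is the feedback loop the talk attributes to market forces.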
How could you keep the law? I think the law would be a really good thing to keep. Well, you have to hold people accountable. The law requires accountability. Today scientists, technologists, businessmen, engineers don't have any personal responsibility for the consequences of their actions. So if you tie that -- you have to tie that back with the law.
17:25
And finally, I think we have to do something that's not really -- it's almost unacceptable to say this -- which is, we have to begin to design the future. We can't pick the future, but we can steer the future. Our investment in trying to prevent pandemic flu is affecting the distribution of possible outcomes. We may not be able to stop it, but the likelihood that it will get past us is lower if we focus on that problem. So we can design the future if we choose what kind of things we want to have happen and not have happen, and steer us to a lower-risk place.

Vice President Gore will talk about how we could steer the climate trajectory into a lower probability of catastrophic risk. But above all, what we have to do is we have to help the good guys, the people on the defensive side, have an advantage over the people who want to abuse things. And what we have to do to do that is we have to limit access to certain information. And growing up as we have, and holding very high the value of free speech, this is a hard thing for us to accept -- for all of us to accept. It's especially hard for the scientists to accept, who still remember, you know, Galileo essentially locked up, and who are still fighting this battle against the church. But that's the price of having a civilization. The price of retaining the rule of law is to limit the access to the great and kind of unbridled power.

Thank you.
18:54
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7