Gregory Stock: To upgrade is human

44,335 views · 2009-04-14

TED


ืžืชืจื’ื: Ido Dekkers ืžื‘ืงืจ: Dan Liebeschutz

00:12
The future of life, where the unraveling of our biology -- and bring up the lights a little bit. I don't have any slides. I'm just going to talk about where that's likely to carry us. And you know, I saw all the visions of the first couple of sessions. It almost made me feel a little bit guilty about having an uplifting talk about the future. It felt wrong to do that in some way. And yet, I don't really think it is, because when it comes down to it, it's this larger trajectory that is really what is going to remain -- what people in the future are going to remember about this period.

00:48
I want to talk to you a little bit about why the visions of Jeremy Rifkin, who would like to ban these sorts of technologies, or of the Bill Joys who would like to relinquish them, are actually -- to follow those paths would be such a tragedy for us.

01:07
I'm focusing on biology, the biological sciences. The reason I'm doing that is because those are going to be the areas that are the most significant to us. The reason for that is really very simple. It's because we're flesh and blood. We're biological creatures. And what we can do with our biology is going to shape our future and that of our children and that of their children -- whether we gain control over aging, whether we learn to protect ourselves from Alzheimer's, and heart disease, and cancer.

01:42
I think that Shakespeare really put it very nicely. And I'm actually going to use his words in the same order that he did.

(Laughter)

He said, "And so from hour to hour we ripe and ripe. And then from hour to hour we rot and rot. And thereby hangs a tale." Life is short, you know. And we need to think about planning a little bit. We're all, eventually, even in the developed world, going to have to lose everything that we love.

02:13
When you're beginning to rot a little bit, all of the videos crammed into your head, all of the extensions that extend your various powers, are going to begin to seem a little secondary. And you know, I'm getting a little bit gray -- so is Ray Kurzweil, so is Eric Drexler. This is where it's really central to our lives.

02:35
Now I know there's been a whole lot of hype about our power to control biology. You just have to look at the Human Genome Project. It wasn't two years ago that everybody was talking about -- we've found the Holy Grail of biology. We're deciphering the code of codes. We're reading the book of life. It's a little bit reminiscent of 1969, when Neil Armstrong walked on the moon, and everybody was about to race out toward the stars. And we've all seen "2001: A Space Odyssey." You know it's 2003, and there is no HAL. And there is no odyssey to our own moon, much less the moons of Jupiter. And we're still picking up pieces of the Challenger.

03:15
So it's not surprising that some people would wonder whether maybe 30 or 40 years from now, we'll look back at this instant in time, and all of the sort of talk about the Human Genome Project, and what all this is going to mean to us -- well, it will really mean precious little. And I just want to say that that is absolutely not going to be the case. Because when we talk about our genetics and our biology, and modifying and altering and adjusting these things, we're talking about changing ourselves. And this is very critical stuff.

03:51
If you have any doubts about how technology affects our lives, you just have to go to any major city. This is not the stomping ground of our Pleistocene ancestors. What's happening is we're taking this technology -- it's becoming more precise, more potent -- and we're turning it back upon ourselves. Before it's all done we are going to alter ourselves every bit as much as we have changed the world around us. It's going to happen a lot sooner than people imagine.

04:20
On the way there it's going to completely revolutionize medicine and health care; that's obvious. It's going to change the way we have children. It's going to change the way we manage and alter our emotions. It's going to probably change the human lifespan. It will really make us question what it is to be a human being.

04:40
The larger context of this is that there are two unprecedented revolutions going on today. The first of them is the obvious one, the silicon revolution, which you all are very, very familiar with. It's changing our lives in so many ways, and it will continue to do that. What the essence of that is, is that we're taking the sand at our feet, the inert silicon at our feet, and we're breathing a level of complexity into it that rivals that of life itself, and may even surpass it.

05:12
As an outgrowth of that, as a child of that revolution, is the revolution in biology: the genomics revolution, proteomics, metabolomics, all of these "omics" that sound so terrific on grants and on business plans. What we're doing is we are seizing control of our evolutionary future. I mean we're essentially using technology to just jam evolution into fast-forward. It's not at all clear where it's going to take us. But in five to ten years we're going to start to see some very profound changes.

05:46
The most immediate changes that we'll see are things like in medicine. There is going to be a big shift towards preventative medicine as we start to be able to identify all of the risk factors that we have as individuals. But who is going to pay for all this? And how are we going to understand all this complex information? That is going to be the IT challenge of the next generation: communicating all this information.

06:11
There's pharmacogenomics, the combination of pharmacology and genetics: tailoring drugs to our individual constitutions that Juan talked about a little bit earlier. That's going to have amazing impacts. And it's going to be used for diet as well, and nutritional supplements and such. But it's going to have a big impact because we're going to have niche drugs. And we aren't going to be able to support the kinds of expenses that we have to create blockbuster drugs today. The approval process is going to fall apart, actually. It's too slow. It's too risk-averse. And it is really not suited for the future that we're moving into.

06:53
Another thing is that we're just going to have to deal with this knowledge. It's really wonderful when we hear, "Oh, 99.9 percent of the letters in the code are the same. We're all identical to each other. Isn't it wonderful?" And look around you and know that what we really care about is that little bit of difference. We look the same to a visitor from another planet, maybe, but not to each other, because we compete with each other all the time. And we're going to have to come to grips with the fact that there are differences between us as individuals that we will know about, and between subpopulations of humans as well. To deny that that's the case is not a very good start on that.

07:33
A generation or so away there are going to be even more profound things that are going to happen. That's when we're going to begin to use this knowledge to modify ourselves. Now I don't mean extra gills or something -- something we care about, like aging. What if we could unravel aging and understand it -- begin to retard the process or even reverse it? It would change absolutely everything. And it's obvious to anyone that if we can do this, we absolutely will do this, whatever the consequences are.

08:04
The second is modifying our emotions. I mean Ritalin, Viagra, things of that sort, Prozac. You know, these are just clumsy little baby steps. What if you could take a little concoction of pharmaceuticals that would make you feel really contented, just happy to be you? Are you going to be able to resist that if it doesn't have any overt side effects? Probably not. And if you don't, who are you going to be? Why do you do what you do? We're sort of circumventing evolutionary programs that guide our behavior. It's going to be very challenging to deal with.

08:41
The third area is reproduction. The idea that we're going to choose our children's genes, as we begin to understand what genes say about who we are. That's the focus of my book "Redesigning Humans," where I talk about the kinds of choices we'll make, and the challenges it's going to present to society.

08:59
There are three obvious ways of doing this. The first is cloning. It didn't happen. It's a total media circus. It will happen in five to 10 years. And when it does it's not going to be that big a deal. The birth of a delayed identical twin is not going to shake western civilization.

09:19
But there are more important things that are already occurring: embryo screening. You take a six-to-eight-cell embryo, you tease out one of the cells, you run a genetic test on that cell, and depending on the results of that test you either implant that embryo or you discard it. It's already done to avoid rare diseases today. And pretty soon it's going to be possible to avoid virtually all genetic diseases in that way.

09:45
As that becomes possible, this is going to move from something that is used by those who have infertility problems and are already doing in vitro fertilization, to the wealthy who want to protect their children, to just about everybody else. And in that process that's going to morph from being just for diseases, to being for lesser vulnerabilities, like risk of manic depression or something, to picking personalities, temperaments, traits, these sorts of things.

10:14
Of course there is going to be genetic engineering. Directly going in -- it's a little bit further away, but not that far away -- going in and altering the genes in the first cell in an embryo. The way I suspect it will happen is using artificial chromosomes and extra chromosomes, so we go from 46 to 47 or 48. And one that is not heritable, because who would want to pass on to their children the archaic enhancement modules that they got 25 years earlier from their parents? It's a joke; of course they wouldn't want to do that. They'll want the new release.

10:51
Those kinds of loose analogies with (Laughter) computers, and with programming, are actually much deeper than that. They are really going to come to operate in this realm.

11:03
Now not everything that can be done should be done. And it won't be done. But when something is feasible in thousands of laboratories all over the world, which is going to be the case with these technologies, when there are large numbers of people who see them as beneficial, which is already the case, and when they're almost impossible to police, it's not a question of if this is going to happen, it's when and where and how it's going to happen.

11:25
Humanity is going to go down this path. And it's going to do so for two reasons. The first is that all these technologies are just a spin-off of mainstream medical research that everybody wants to see happen. It is being funded very, very -- in a big way. The second is, we're human. That's what we do. We try and use our technology to improve our lives in one way or another. To imagine that we're not going to use these technologies when they become available is as much a denial of who we are as to imagine that we'll use these technologies and not fret and worry about it a great deal.

12:05
The lines are going to blur -- and they already are -- between therapy and enhancement, between treatment and prevention, between need and desire. That's really the central one, I believe.

12:18
People can try and ban these things. They undoubtedly will. They have. But ultimately all this is going to do is just shift development elsewhere. It's going to drive these things from view. It's going to reserve the technology for the wealthy, because they are in the best position to circumvent any of these sorts of laws. And it's going to deny us the information that we need to make wise decisions about how to use these technologies.

12:43
So, sure, we need to debate these things. And I think it's wonderful that we do. But we shouldn't kid ourselves and think that we're going to reach a consensus about these things. That is simply not going to happen. They touch us too deeply. And they depend too much upon history, upon philosophy, upon religion, upon culture, upon politics. Some people are going to see this as an abomination, as the worst thing, as just awful. Other people are going to say, "This is great. This is the flowering of human endeavor."

13:18
The one thing, though, that is really dangerous about these sorts of technologies is that it's easy to become seduced by them, and to focus too much on all the high-technology possibilities that exist, and to lose touch with the basic rhythms of our biology and our health.

13:38
There are too many people that think that high-technology medicine is going to keep them, save them, from overeating, from eating a lot of fast foods, from not getting any exercise. It's not going to happen.

13:51
In the midst of all this amazing technology, and all these things that are occurring, it's really interesting because there is sort of a counter-revolution that is going on: a resurgence of interest in remedies from the past, in nutraceuticals, in all of these sorts of things that some people, in the pharmaceutical industry particularly, like to brand as non-science. But this whole effort is generated, is driven, by IT as well, because that is how we're gathering all this information, and linking it, and integrating it together.

14:23
There is a lot in this rich biota that is going to serve us well. And that's where about half of our drugs come from. So we shouldn't dismiss this, because it's an enormous opportunity to use these sorts of results, or these random loose trials from the last thousand years, about what has impacts on our health -- and to use our advanced technologies to pull out what is beneficial from this sea of noise, basically.

14:49
In fact this isn't just abstract. I just formed a biotechnology company that is using this sort of an approach to develop therapeutics for Alzheimer's and other diseases of aging, and we're making some real progress.

15:03
So here we are. It's the beginning of a new millennium. If you look forward, I mean future humans, far before the end of this millennium, in a few hundred years, they are going to look back at this moment. And from the beginning of today's sessions you'd think that they're going to see this as this horrible, difficult, painful period that we struggled through. And I don't think that's what's going to happen.

15:30
They're going to do like everybody does. They are going to forget about all that stuff. And they are actually going to romanticize this moment in time. They are going to think about it as this glorious instant when we laid down the very foundations of their lives, of their society, of their future. You know, it's a little bit like a birth, where this bloody, awful mess happens. And then what comes out of it? New life. Actually, as was pointed out earlier, we forget about all the struggle there was in getting there.

16:05
So to me, it's clear that one of the foundations of that future is going to be the reworking of our biology. It's going to come gradually at first. It's going to pick up speed. We're going to make lots of errors. That's the way these things work. To me it's an incredible privilege to be alive now and to be able to witness this thing. It is something that is a unique instant in the history of all of life. It will always be remembered. And what's extraordinary is that we're not just observing this, we are the architects of this. I think that we should be proud of it.

16:44
What is so difficult and challenging is that we are also the objects of these changes. It's our health, it's our lives, it's our future, it's our children. And that is why they are so very troubling to so many people who would pull back in fear.

17:00
I think that our choice in the choice of life is not whether we're going to go down this path. We are, definitely. It's how we hold it in our hearts. It's how we look at it. I think Thucydides really spoke to us very clearly in 430 B.C. He put it nicely. Again, I'll use the words in the same order he did.

17:25
"The bravest are surely those who have the clearest vision of what is before them, both glory and danger alike. And yet notwithstanding, they go out and they meet it."

17:39
Thank you.

(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7