Nick Bostrom: Humanity's biggest problems aren't what you think they are

112,017 views · 2007-05-16

TED



ืžืชืจื’ื: Adam Golub ืžื‘ืงืจ: Ido Dekkers

00:25
I want to talk today about -- I've been asked to take the long view, and I'm going to tell you what I think are the three biggest problems for humanity from this long point of view. Some of these have already been touched upon by other speakers, which is encouraging. It seems that there's not just one person who thinks that these problems are important.

00:50
The first is -- death is a big problem. If you look at the statistics, the odds are not very favorable to us. So far, most people who have lived have also died. Roughly 90 percent of everybody who has been alive has died by now. So the annual death rate adds up to 150,000 -- sorry, the daily death rate -- 150,000 people per day, which is a huge number by any standard. The annual death rate, then, becomes 56 million.
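
The daily and annual figures are consistent up to rounding; a quick arithmetic check, using the talk's own round numbers (the Python below is an illustrative sketch, not data from the talk):

```python
# Sanity check: scale the quoted daily death rate up to a year.
daily_deaths = 150_000              # people per day, as quoted in the talk
annual_deaths = daily_deaths * 365  # = 54,750,000
print(f"{annual_deaths:,}")         # close to the quoted 56 million;
                                    # both figures are rounded
```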

01:24
If we just look at the single biggest cause of death -- aging -- it accounts for roughly two-thirds of all people who die. That adds up to an annual death toll of greater than the population of Canada. Sometimes, we don't see a problem because either it's too familiar or it's too big. Can't see it because it's too big. I think death might be both too familiar and too big for most people to see it as a problem.

01:54
Once you think about it, you see that these are not just statistical points; these are -- let's see, how far have I talked? I've talked for three minutes. So that would be, roughly, 324 people have died since I've begun speaking. People like -- it's roughly the population in this room that has just died.

02:13
Now, the human cost of that is obvious, once you start to think about it -- the suffering, the loss -- and it's also, economically, enormously wasteful. I just look at the information and knowledge and experience that is lost due to natural causes of death in general, and aging in particular.

02:29
Suppose we approximated one person with one book? Now, of course, this is an underestimation: a person's lifetime of learning and experience is a lot more than you could put into a single book. But let's suppose we did this. 52 million people dying of natural causes each year corresponds, then, to 52 million volumes destroyed. The Library of Congress holds 18 million volumes. We are upset about the burning of the Library of Alexandria -- it's one of the great cultural tragedies that we remember, even today. But this is the equivalent of three Libraries of Congress -- burnt down, forever lost -- each year.
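
The "three Libraries of Congress" equivalence is just the ratio of the two volume counts, under the one-person-one-book assumption made above; a one-line check:

```python
# Ratio behind the "three Libraries of Congress per year" comparison.
volumes_lost_per_year = 52_000_000        # one book per natural-cause death
library_of_congress_volumes = 18_000_000  # as quoted in the talk
print(volumes_lost_per_year / library_of_congress_volumes)  # ~2.9, i.e. about 3
```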

03:12
So that's the first big problem. And I wish Godspeed to Aubrey de Grey, and other people like him, to try to do something about this as soon as possible.

03:23
Existential risk -- the second big problem. Existential risk is a threat to human survival, or to the long-term potential of our species. Now, why do I say that this is a big problem? Well, let's first look at the probability -- and this is very, very difficult to estimate -- but there have been only four studies on this in recent years, which is surprising. You would think that it would be of some interest to try to find out more about this, given that the stakes are so big, but it's a very neglected area.

03:56
But there have been four studies -- one by John Leslie, who wrote a book on this. He estimated the probability that we will fail to survive the current century at 50 percent. Similarly, the Astronomer Royal, whom we heard speak yesterday, also has a 50 percent probability estimate. Another author doesn't give any numerical estimate, but says the probability is significant that we will fail. I wrote a long paper on this. I said that assigning a less than 20 percent probability would be a mistake in light of the current evidence we have.

04:29
Now, the exact figures here we should take with a big grain of salt, but there seems to be a consensus that the risk is substantial. Everybody who has looked at this and studied it agrees.

04:39
Now, think about reducing the probability of human extinction by just one percentage point -- not very much. That's equivalent to 60 million lives saved, if we just count the currently living people, the current generation: one percent of six billion people is equivalent to 60 million. So that's a large number.

05:01
If we were to take into account future generations that will never come into existence if we blow ourselves up, then the figure becomes astronomical. If we could eventually colonize a chunk of the universe -- the Virgo Supercluster -- maybe it will take us 100 million years to get there, but if we go extinct we never will. Then even a one percentage point reduction in the extinction risk could be equivalent to this astronomical number -- 10 to the power of 32.

05:32
So if you take into account future generations as much as our own, every other moral imperative of philanthropic cost just becomes irrelevant. The only thing you should focus on would be to reduce existential risk, because even the tiniest decrease in existential risk would just overwhelm any other benefit you could hope to achieve. And even if you just look at the current people, and ignore the potential that would be lost if we went extinct, it should still have a high priority.

06:02
Now, let me spend the rest of my time on the third big problem, because it's more subtle and perhaps difficult to grasp.

06:13
Think about some time in your life -- some people might never have experienced it -- but some people, there are just those moments that you have experienced where life was fantastic. It might have been at a moment of some great creative inspiration you had when you just entered this flow state. Or when you understood something you had never understood before. Or perhaps in the ecstasy of romantic love. Or an aesthetic experience -- a sunset or a great piece of art. Every once in a while we have these moments, and we realize just how good life can be when it's at its best. And you wonder, why can't it be like that all the time? You just want to cling onto this. And then, of course, it drifts back into ordinary life and the memory fades. And it's really difficult to recall, in a normal frame of mind, just how good life can be at its best. Or how bad it can be at its worst.

07:12
The third big problem is that life isn't usually as wonderful as it could be. I think that's a big, big problem.

07:21
It's easy to say what we don't want. Here are a number of things that we don't want -- illness, involuntary death, unnecessary suffering, cruelty, stunted growth, memory loss, ignorance, absence of creativity.

07:36
Suppose we fixed these things -- we did something about all of these, we were very successful, we got rid of all of these things. We might end up with something like this, which is -- I mean, it's a heck of a lot better than that. But is this really the best we can dream of? Is this the best we can do? Or is it possible to find something a little bit more inspiring to work towards?

08:03
And if we think about this, I think it's very clear that there are ways in which we could change things, not just by eliminating negatives, but by adding positives. On my wish list, at least, would be: much longer, healthier lives; greater subjective well-being; enhanced cognitive capacities; more knowledge and understanding; unlimited opportunity for personal growth beyond our current biological limits; better relationships; and an unbounded potential for spiritual, moral and intellectual development.

08:36
If we want to achieve this, what, in the world, would have to change? And this is the answer: we would have to change. Not just the world around us, but we, ourselves. Not just the way we think about the world, but the way we are -- our very biology. Human nature would have to change.

08:58
Now, when we think about changing human nature, the first thing that comes to mind are these human modification technologies -- growth hormone therapy, cosmetic surgery, stimulants like Ritalin and Adderall, anti-depressants, anabolic steroids, artificial hearts. It's a pretty pathetic list. They do great things for a few people who suffer from some specific condition, but for most people, they don't really transform what it is to be human. And they also all seem a little bit -- most people have this instinct that, well, sure, there need to be anti-depressants for the really depressed people, but there's a kind of queasiness that these are unnatural in some way.

09:39
It's worth recalling that there are a lot of other modification and enhancement technologies that we use. We have skin enhancements: clothing. As far as I can see, all of you in this room are users of this enhancement technology, so that's a great thing. Mood modifiers have been used from time immemorial -- caffeine, alcohol, nicotine. Immune system enhancement, vision enhancement, anesthetics -- we take those very much for granted, but just think about what great progress that is -- like, having an operation before anesthetics was not fun.

10:17
Contraceptives, cosmetics and brain reprogramming techniques -- that sounds ominous, but the distinction between what is a technology -- a gadget would be the archetype -- and other ways of changing and rewriting human nature is quite subtle. So if you think about what it means to learn arithmetic or to learn to read, you're actually, literally, rewriting your own brain. You're changing the microstructure of your brain as you go along. So in a broad sense, we don't need to think about technology as only little gadgets, like these things here, but even institutions and techniques, psychological methods and so forth. Forms of organization can have a profound impact on human nature.

11:02
Looking ahead, there is a range of technologies that are almost certain to be developed sooner or later. We are very ignorant about what the time scale for these things is, but they are all consistent with everything we know about physical laws, laws of chemistry, etc. It's possible to assume, setting aside the possibility of catastrophe, that sooner or later we will develop all of these. And even just a couple of these would be enough to transform the human condition.

11:30
So let's look at some of the dimensions of human nature that seem to leave room for improvement.

11:38
Health span is a big and urgent thing, because if you're not alive, then all the other things will be to little avail. Intellectual capacity -- let's take that box, which falls into a lot of different sub-categories: memory, concentration, mental energy, intelligence, empathy. These are really great things. Part of the reason why we value these traits is that they make us better at competing with other people -- they're positional goods. But part of the reason -- and that's the reason why we have ethical ground for pursuing these -- is that they're also intrinsically valuable. It's just better to be able to understand more of the world around you and the people that you are communicating with, and to remember what you have learned.

12:24
Modalities and special faculties. Now, the human mind is not a single unitary information processor; it has a lot of different, special, evolved modules that do specific things for us. If you think about what we normally take as giving life a lot of its meaning -- music, humor, eroticism, spirituality, aesthetics, nurturing and caring, gossip, chatting with people -- all of these, very likely, are enabled by special circuitry that we humans have, and you could have another intelligent life form that lacks these. We're just lucky that we have the requisite neural machinery to process music, and to appreciate it and enjoy it. All of these would, in principle, be amenable to enhancement. Some people have a better musical ability, and a better ability to appreciate music, than others have.

13:13
It's also interesting to think about what other things there are -- so if these all enable great values, why should we think that evolution has happened to provide us with all the modalities we would need to engage with whatever other values there might be? Imagine a species that just didn't have this neural machinery for processing music. They would just stare at us in bafflement when we spent time listening to a beautiful performance, like the one we just heard -- to them, just people making stupid movements -- and they would be really irritated and wouldn't see what we were up to. But maybe they have another faculty, something else, that would seem equally irrational to us, but that actually taps into some great possible value there. And we are just literally deaf to that kind of value. So we could think of adding on different, new sensory capacities and mental faculties.

14:05
Bodily functionality and morphology, and affective self-control. Greater subjective well-being. Being able to switch between relaxation and activity -- being able to go slow when you need to do that, and to speed up. Being able to switch back and forth more easily would be a neat thing -- it would make it easier to achieve the flow state, when you're totally immersed in something you are doing.

14:29
Conscientiousness and sympathy. The ability to -- this is another interesting application that would have large social ramifications, perhaps. If you could actually choose to preserve your romantic attachment to one person, undiminished through time, so that it wouldn't have to -- love would never have to fade if you didn't want it to. That's probably not all that difficult. It might just be a simple hormone or something that could do this. It's been done in voles: you can engineer a prairie vole to become monogamous when it's naturally polygamous. It's just a single gene. It might be more complicated in humans, but perhaps not that much.

15:11
This is the last picture that I want to -- now we've got to use the laser pointer. A possible mode of being, here, would be a way of life -- a way of being, experiencing, thinking, seeing, interacting with the world. Down here in this little corner, we have the little sub-space of this larger space that is accessible to human beings -- beings with our biological capacities. It's a part of the space that's accessible to animals; since we are animals, we are a subset of that. And then you can imagine some enhancements of human capacities. There would be different modes of being you could experience if you were able to stay alive for, say, 200 years. Then you could live sorts of lives and accumulate wisdoms that are just not possible for humans as we currently are. So then, you move off to this larger sphere of "human +," and you could continue that process and eventually explore a lot of this larger space of possible modes of being.

16:12
Now, why is that a good thing to do? Well, we know already that in this little human circle there, there are these enormously wonderful and worthwhile modes of being -- human life at its best is wonderful. We have no reason to believe that within this much, much larger space there would not also be extremely worthwhile modes of being -- perhaps ones that would be way beyond our wildest ability even to imagine or dream about. And so, to fix this third problem, I think we need -- slowly, carefully, with ethical wisdom and constraint -- to develop the means that enable us to go out into this larger space and explore it, and find the great values that might hide there.

16:57
Thanks.
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7