How civilization could destroy itself -- and 4 ways we could prevent it | Nick Bostrom

152,004 views ・ 2020-01-17

TED



00:13
Chris Anderson: Nick Bostrom. So, you have already given us so many crazy ideas out there. I think a couple of decades ago, you made the case that we might all be living in a simulation, or perhaps probably were. More recently, you've painted the most vivid examples of how artificial general intelligence could go horribly wrong. And now this year, you're about to publish a paper that presents something called the vulnerable world hypothesis. And our job this evening is to give the illustrated guide to that. So let's do that. What is that hypothesis?

00:52
Nick Bostrom: It's trying to think about a sort of structural feature of the current human condition. You like the urn metaphor, so I'm going to use that to explain it. So picture a big urn filled with balls representing ideas, methods, possible technologies. You can think of the history of human creativity as the process of reaching into this urn and pulling out one ball after another, and the net effect so far has been hugely beneficial, right? We've extracted a great many white balls, some various shades of gray, mixed blessings. We haven't so far pulled out the black ball -- a technology that invariably destroys the civilization that discovers it. So the paper tries to think about what could such a black ball be.
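
To make the urn metaphor concrete, here is a minimal simulation sketch in Python. It is not from Bostrom's paper; the number of draws and the chance that any single ball is black are purely illustrative assumptions.

```python
import random

def draw_history(num_draws=1000, p_black=0.001, seed=None):
    """Toy version of the urn: each draw is a new technology.
    White and gray balls are beneficial or mixed; a single black ball
    ends the run. p_black is an illustrative assumption, not a figure
    from the paper."""
    rng = random.Random(seed)
    for draw in range(1, num_draws + 1):
        if rng.random() < p_black:
            return draw  # the draw on which the black ball came out
    return None  # survived every draw

# Estimate how often a civilization gets through 1,000 inventions unscathed.
runs = 10_000
survived = sum(draw_history(seed=i) is None for i in range(runs))
print(f"Survival rate over {runs} simulated histories: {survived / runs:.1%}")
```

With these illustrative numbers a little over a third of simulated histories get through 1,000 draws, and the survival probability falls off exponentially as the drawing continues, which is the structural worry the urn picture is meant to convey.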

01:43
CA: So you define that ball as one that would inevitably bring about civilizational destruction.

01:48
NB: Unless we exit what I call the semi-anarchic default condition. But sort of, by default.

01:56
CA: So, you make the case compelling by showing some sort of counterexamples where you believe that so far we've actually got lucky, that we might have pulled out that death ball without even knowing it. So there's this quote, what's this quote?

02:12
NB: Well, I guess it's just meant to illustrate the difficulty of foreseeing what basic discoveries will lead to. We just don't have that capability. Because we have become quite good at pulling out balls, but we don't really have the ability to put the ball back into the urn, right. We can invent, but we can't un-invent. So our strategy, such as it is, is to hope that there is no black ball in the urn.

02:38
CA: So once it's out, it's out, and you can't put it back in, and you think we've been lucky. So talk through a couple of these examples. You talk about different types of vulnerability.

02:49
NB: So the easiest type to understand is a technology that just makes it very easy to cause massive amounts of destruction. Synthetic biology might be a fecund source of that kind of black ball, but many other possible things we could -- think of geoengineering, really great, right? We could combat global warming, but you don't want it to get too easy either, you don't want any random person and his grandmother to have the ability to radically alter the earth's climate. Or maybe lethal autonomous drones, mass-produced, mosquito-sized killer bot swarms. Nanotechnology, artificial general intelligence.

03:29
CA: You argue in the paper that it's a matter of luck that when we discovered that nuclear power could create a bomb, it might have been the case that you could have created a bomb with much easier resources, accessible to anyone.

03:43
NB: Yeah, so think back to the 1930s where for the first time we make some breakthroughs in nuclear physics, some genius figures out that it's possible to create a nuclear chain reaction and then realizes that this could lead to the bomb. And we do some more work, it turns out that what you require to make a nuclear bomb is highly enriched uranium or plutonium, which are very difficult materials to get. You need ultracentrifuges, you need reactors, like, massive amounts of energy. But suppose it had turned out instead there had been an easy way to unlock the energy of the atom. That maybe by baking sand in the microwave oven or something like that you could have created a nuclear detonation. So we know that that's physically impossible. But before you did the relevant physics, how could you have known how it would turn out?

04:32
CA: Although, couldn't you argue that for life to evolve on Earth, that implied a sort of stable environment, that if it was possible to create massive nuclear reactions relatively easily, the Earth would never have been stable, that we wouldn't be here at all.

04:47
NB: Yeah, unless there were something that is easy to do on purpose but that wouldn't happen by random chance. So, like things we can easily do, we can stack 10 blocks on top of one another, but in nature, you're not going to find, like, a stack of 10 blocks.

05:00
CA: OK, so this is probably the one that many of us worry about most, and yes, synthetic biology is perhaps the quickest route that we can foresee in our near future to get us here.

05:10
NB: Yeah, and so think about what that would have meant if, say, anybody by working in their kitchen for an afternoon could destroy a city. It's hard to see how modern civilization as we know it could have survived that. Because in any population of a million people, there will always be some who would, for whatever reason, choose to use that destructive power. So if that apocalyptic residual would choose to destroy a city, or worse, then cities would get destroyed.

05:38
CA: So here's another type of vulnerability. Talk about this.

05:42
NB: Yeah, so in addition to these kind of obvious types of black balls that would just make it possible to blow up a lot of things, other types would act by creating bad incentives for humans to do things that are harmful. So, the Type-2a, we might call it that, is to think about some technology that incentivizes great powers to use their massive amounts of force to create destruction. So, nuclear weapons were actually very close to this, right? What we did, we spent over 10 trillion dollars to build 70,000 nuclear warheads and put them on hair-trigger alert. And there were several times during the Cold War we almost blew each other up. It's not because a lot of people felt this would be a great idea, let's all spend 10 trillion dollars to blow ourselves up, but the incentives were such that we were finding ourselves -- this could have been worse. Imagine if there had been a safe first strike. Then it might have been very tricky, in a crisis situation, to refrain from launching all their nuclear missiles. If nothing else, because you would fear that the other side might do it.

06:48
CA: Right, mutual assured destruction kept the Cold War relatively stable, without that, we might not be here now.

06:54
NB: It could have been more unstable than it was. And there could be other properties of technology. It could have been harder to have arms treaties, if instead of nuclear weapons there had been some smaller thing or something less distinctive.

07:06
CA: And as well as bad incentives for powerful actors, you also worry about bad incentives for all of us, in Type-2b here.

07:12
NB: Yeah, so, here we might take the case of global warming. There are a lot of little conveniences that cause each one of us to do things that individually have no significant effect, right? But if billions of people do it, cumulatively, it has a damaging effect. Now, global warming could have been a lot worse than it is. So we have the climate sensitivity parameter, right. It's a parameter that says how much warmer does it get if you emit a certain amount of greenhouse gases. But, suppose that it had been the case that with the amount of greenhouse gases we emitted, instead of the temperature rising by, say, between three and 4.5 degrees by 2100, suppose it had been 15 degrees or 20 degrees. Like, then we might have been in a very bad situation. Or suppose that renewable energy had just been a lot harder to do. Or that there had been more fossil fuels in the ground.
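
For reference, the climate sensitivity parameter mentioned here is conventionally quoted as warming per doubling of atmospheric CO2, with a roughly logarithmic dependence on concentration. The sketch below uses that standard textbook relation; the specific concentrations and sensitivity values are illustrative assumptions, not numbers from the talk.

```python
import math

def equilibrium_warming(co2_ppm, sensitivity_per_doubling, baseline_ppm=280.0):
    """Approximate warming from the standard logarithmic relation:
    delta_T = S * log2(C / C0), where S is the climate sensitivity in
    degrees C per doubling of CO2. All inputs are illustrative
    textbook-style values, not figures from the talk."""
    return sensitivity_per_doubling * math.log2(co2_ppm / baseline_ppm)

# Same concentration (a doubling of CO2), different sensitivities:
# values near the 3-4.5 range the talk mentions versus the far harsher
# hypothetical of 15 or 20 degrees.
for s in (3.0, 4.5, 15.0, 20.0):
    print(f"S = {s:>4} C per doubling -> ~{equilibrium_warming(560, s):.1f} C of warming")
```

The point of the hypothetical is that the same emissions path lands us in a radically worse world if this parameter had happened to be several times larger.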

08:04
CA: Couldn't you argue that if in that case of -- if what we are doing today had resulted in 10 degrees difference in the time period that we could see, actually humanity would have got off its ass and done something about it. We're stupid, but we're not maybe that stupid. Or maybe we are.

08:21
NB: I wouldn't bet on it.
(Laughter)
You could imagine other features. So, right now, it's a little bit difficult to switch to renewables and stuff, right, but it can be done. But it might just have been, with slightly different physics, it could have been much more expensive to do these things.

08:40
CA: And what's your view, Nick? Do you think, putting these possibilities together, that this earth, humanity that we are, we count as a vulnerable world? That there is a death ball in our future?

08:55
NB: It's hard to say. I mean, I think there might well be various black balls in the urn, that's what it looks like. There might also be some golden balls that would help us protect against black balls. And I don't know which order they will come out.

09:12
CA: I mean, one possible philosophical critique of this idea is that it implies a view that the future is essentially settled. That there either is that ball there or it's not. And in a way, that's not a view of the future that I want to believe. I want to believe that the future is undetermined, that our decisions today will determine what kind of balls we pull out of that urn.

09:37
NB: I mean, if we just keep inventing, like, eventually we will pull out all the balls. I mean, I think there's a kind of weak form of technological determinism that is quite plausible, like, you're unlikely to encounter a society that uses flint axes and jet planes. But you can almost think of a technology as a set of affordances. So technology is the thing that enables us to do various things and achieve various effects in the world. How we'd then use that, of course depends on human choice. But if we think about these three types of vulnerability, they make quite weak assumptions about how we would choose to use them. So a Type-1 vulnerability, again, this massive, destructive power, it's a fairly weak assumption to think that in a population of millions of people there would be some that would choose to use it destructively.

10:24
CA: For me, the single most disturbing argument is that we actually might have some kind of view into the urn that makes it actually very likely that we're doomed. Namely, if you believe in accelerating power, that technology inherently accelerates, that we build the tools that make us more powerful, then at some point you get to a stage where a single individual can take us all down, and then it looks like we're screwed. Isn't that argument quite alarming?

10:56
NB: Ah, yeah.
(Laughter)
I think -- Yeah, we get more and more power, and [it's] easier and easier to use those powers, but we can also invent technologies that kind of help us control how people use those powers.

11:14
CA: So let's talk about that, let's talk about the response. Suppose that thinking about all the possibilities that are out there now -- it's not just synbio, it's things like cyberwarfare, artificial intelligence, etc., etc. -- that there might be serious doom in our future. What are the possible responses? And you've talked about four possible responses as well.

11:39
NB: Restricting technological development doesn't seem promising, if we are talking about a general halt to technological progress. I think it's neither feasible, nor would it be desirable even if we could do it. I think there might be very limited areas where maybe you would want slower technological progress. You don't, I think, want faster progress in bioweapons, or in, say, isotope separation, that would make it easier to create nukes.

12:04
CA: I mean, I used to be fully on board with that. But I would like to actually push back on that for a minute. Just because, first of all, if you look at the history of the last couple of decades, you know, it's always been push forward at full speed, it's OK, that's our only choice. But if you look at globalization and the rapid acceleration of that, if you look at the strategy of "move fast and break things" and what happened with that, and then you look at the potential for synthetic biology, I don't know that we should move forward rapidly or without any kind of restriction to a world where you could have a DNA printer in every home and high school lab. There are some restrictions, right?

12:46
NB: Possibly, there is the first part, the not feasible. If you think it would be desirable to stop it, there's the problem of feasibility. So it doesn't really help if one nation kind of --

12:56
CA: No, it doesn't help if one nation does, but we've had treaties before. That's really how we survived the nuclear threat, was by going out there and going through the painful process of negotiating. I just wonder whether the logic isn't that we, as a matter of global priority, we shouldn't go out there and try, like, now start negotiating really strict rules on where synthetic bioresearch is done, that it's not something that you want to democratize, no?

13:24
NB: I totally agree with that -- that it would be desirable, for example, maybe to have DNA synthesis machines, not as a product where each lab has their own device, but maybe as a service. Maybe there could be four or five places in the world where you send in your digital blueprint and the DNA comes back, right? And then, you would have the ability, if one day it really looked like it was necessary, we would have, like, a finite set of choke points. So I think you want to look for kind of special opportunities, where you could have tighter control.

13:57
CA: Your belief is, fundamentally, we are not going to be successful in just holding back. Someone, somewhere -- North Korea, you know -- someone is going to go there and discover this knowledge, if it's there to be found.

14:09
NB: That looks plausible under current conditions. It's not just synthetic biology, either. I mean, any kind of profound, new change in the world could turn out to be a black ball.

14:17
CA: Let's look at another possible response.

14:19
NB: This also, I think, has only limited potential. So, with the Type-1 vulnerability again, I mean, if you could reduce the number of people who are incentivized to destroy the world, if only they could get access and the means, that would be good.

14:34
CA: In this image that you asked us to do you're imagining these drones flying around the world with facial recognition. When they spot someone showing signs of sociopathic behavior, they shower them with love, they fix them.

14:45
NB: I think it's like a hybrid picture. Eliminate can either mean, like, incarcerate or kill, or it can mean persuade them to a better view of the world. But the point is that, suppose you were extremely successful in this, and you reduced the number of such individuals by half. And if you want to do it by persuasion, you are competing against all other powerful forces that are trying to persuade people, parties, religion, education system. But suppose you could reduce it by half, I don't think the risk would be reduced by half. Maybe by five or 10 percent.
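
A toy calculation, offered as an illustration rather than a model from the paper, of why halving the pool of would-be destroyers need not halve the risk: if each of N such individuals independently has some small chance of acting, the chance that at least one acts saturates near certainty, and cutting N in half barely moves it.

```python
def p_at_least_one(n_people, p_each):
    """Chance that at least one of n_people acts, assuming independence.
    Both inputs are illustrative assumptions, not estimates from
    Bostrom's paper."""
    return 1 - (1 - p_each) ** n_people

n, p = 10_000, 0.001
print(f"Full pool of {n}:   risk = {p_at_least_one(n, p):.4f}")
print(f"Halved pool of {n // 2}: risk = {p_at_least_one(n // 2, p):.4f}")
# With these numbers both probabilities are close to 1, so halving the
# pool reduces the risk by far less than half.
```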

15:15
CA: You're not recommending that we gamble humanity's future on response two.

15:20
NB: I think it's all good to try to deter and persuade people, but we shouldn't rely on that as our only safeguard.

15:26
CA: How about three?

15:27
NB: I think there are two general methods that we could use to achieve the ability to stabilize the world against the whole spectrum of possible vulnerabilities. And we probably would need both. So, one is an extremely effective ability to do preventive policing. Such that you could intercept. If anybody started to do this dangerous thing, you could intercept them in real time, and stop them. So this would require ubiquitous surveillance, everybody would be monitored all the time.

15:58
CA: This is "Minority Report," essentially, a form of.

16:00
NB: You would have maybe AI algorithms, big freedom centers that were reviewing this, etc., etc.

16:08
CA: You know that mass surveillance is not a very popular term right now?
(Laughter)

16:15
NB: Yeah, so this little device there, imagine that kind of necklace that you would have to wear at all times with multidirectional cameras. But, to make it go down better, just call it the "freedom tag" or something like that.
(Laughter)

16:30
CA: OK. I mean, this is the conversation, friends, this is why this is such a mind-blowing conversation.

16:37
NB: Actually, there's a whole big conversation on this on its own, obviously. There are huge problems and risks with that, right? We may come back to that. So the other, the final, the other general stabilization capability is kind of plugging another governance gap. So the surveillance would be kind of governance gap at the microlevel, like, preventing anybody from ever doing something highly illegal. Then, there's a corresponding governance gap at the macro level, at the global level. You would need the ability, reliably, to prevent the worst kinds of global coordination failures, to avoid wars between great powers, arms races, cataclysmic commons problems, in order to deal with the Type-2a vulnerabilities.

17:23
CA: Global governance is a term that's definitely way out of fashion right now, but could you make the case that throughout history, the history of humanity is that at every stage of technological power increase, people have reorganized and sort of centralized the power. So, for example, when a roving band of criminals could take over a society, the response was, well, you have a nation-state and you centralize force, a police force or an army, so, "No, you can't do that." The logic, perhaps, of having a single person or a single group able to take out humanity means at some point we're going to have to go this route, at least in some form, no?

18:02
NB: It's certainly true that the scale of political organization has increased over the course of human history. It used to be hunter-gatherer band, right, and then chiefdom, city-states, nations, now there are international organizations and so on and so forth. Again, I just want to make sure I get the chance to stress that obviously there are huge downsides and indeed, massive risks, both to mass surveillance and to global governance. I'm just pointing out that if we are lucky, the world could be such that these would be the only ways you could survive a black ball.

18:34
CA: The logic of this theory, it seems to me, is that we've got to recognize we can't have it all. That the sort of, I would say, naive dream that many of us had that technology is always going to be a force for good, keep going, don't stop, go as fast as you can and not pay attention to some of the consequences, that's actually just not an option. We can have that. If we have that, we're going to have to accept some of these other very uncomfortable things with it, and kind of be in this arms race with ourselves of, you want the power, you better limit it, you better figure out how to limit it.

19:12
NB: I think it is an option, a very tempting option, it's in a sense the easiest option and it might work, but it means we are fundamentally vulnerable to extracting a black ball. Now, I think with a bit of coordination, like, if you did solve this macrogovernance problem, and the microgovernance problem, then we could extract all the balls from the urn and we'd benefit greatly.

19:36
CA: I mean, if we're living in a simulation, does it matter? We just reboot.
(Laughter)

19:42
NB: Then ... I ...
(Laughter)
I didn't see that one coming.

19:50
CA: So what's your view? Putting all the pieces together, how likely is it that we're doomed?
(Laughter)
I love how people laugh when you ask that question.

20:01
NB: On an individual level, we seem to kind of be doomed anyway, just with the time line, we're rotting and aging and all kinds of things, right?
(Laughter)
It's actually a little bit tricky. If you want to set up so that you can attach a probability, first, who are we? If you're very old, probably you'll die of natural causes, if you're very young, you might have a 100-year -- the probability might depend on who you ask. Then the threshold, like, what counts as civilizational devastation? In the paper I don't require an existential catastrophe in order for it to count. This is just a definitional matter, I say a billion dead, or a reduction of world GDP by 50 percent, but depending on what you say the threshold is, you get a different probability estimate. But I guess you could put me down as a frightened optimist.
(Laughter)

20:50
CA: You're a frightened optimist, and I think you've just created a large number of other frightened ... people.
(Laughter)

20:58
NB: In the simulation.

21:00
CA: In a simulation. Nick Bostrom, your mind amazes me, thank you so much for scaring the living daylights out of us.
(Applause)