What happens when our computers get smarter than we are? | Nick Bostrom

2,706,132 views ใƒป 2015-04-27

TED



ืžืชืจื’ื: Ido Dekkers ืžื‘ืงืจ: Tal Dekkers
00:12
I work with a bunch of mathematicians, philosophers and computer scientists,
00:16
and we sit around and think about the future of machine intelligence,
00:21
among other things.
00:24
Some people think that some of these things are sort of science fiction-y,
00:28
far out there, crazy.
00:31
But I like to say,
00:33
okay, let's look at the modern human condition.
00:36
(Laughter)
00:38
This is the normal way for things to be.
00:41
But if we think about it,
00:43
we are actually recently arrived guests on this planet,
00:46
the human species.
00:48
Think about if Earth was created one year ago,
00:53
the human species, then, would be 10 minutes old.
00:56
The industrial era started two seconds ago.
01:01
Another way to look at this is to think of world GDP over the last 10,000 years.
01:06
I've actually taken the trouble to plot this for you in a graph.
01:09
It looks like this.
01:11
(Laughter)
01:12
It's a curious shape for a normal condition.
01:14
I sure wouldn't want to sit on it.
01:16
(Laughter)
01:19
Let's ask ourselves, what is the cause of this current anomaly?
01:23
Some people would say it's technology.
01:26
Now it's true, technology has accumulated through human history,
01:31
and right now, technology advances extremely rapidly --
01:35
that is the proximate cause,
01:37
that's why we are currently so very productive.
01:40
But I like to think back further to the ultimate cause.
01:45
Look at these two highly distinguished gentlemen:
01:48
We have Kanzi --
01:50
he's mastered 200 lexical tokens, an incredible feat.
01:55
And Ed Witten unleashed the second superstring revolution.
01:58
If we look under the hood, this is what we find:
02:01
basically the same thing.
02:02
One is a little larger,
02:04
it maybe also has a few tricks in the exact way it's wired.
02:07
These invisible differences cannot be too complicated, however,
02:11
because there have only been 250,000 generations
02:15
since our last common ancestor.
02:17
We know that complicated mechanisms take a long time to evolve.
02:22
So a bunch of relatively minor changes
02:24
take us from Kanzi to Witten,
02:27
from broken-off tree branches to intercontinental ballistic missiles.
02:32
So this then seems pretty obvious that everything we've achieved,
02:36
and everything we care about,
02:38
depends crucially on some relatively minor changes that made the human mind.
02:44
And the corollary, of course, is that any further changes
02:48
that could significantly change the substrate of thinking
02:51
could have potentially enormous consequences.
02:56
Some of my colleagues think we're on the verge
02:59
of something that could cause a profound change in that substrate,
03:03
and that is machine superintelligence.
03:06
Artificial intelligence used to be about putting commands in a box.
03:11
You would have human programmers
03:12
that would painstakingly handcraft knowledge items.
03:15
You build up these expert systems,
03:17
and they were kind of useful for some purposes,
03:20
but they were very brittle, you couldn't scale them.
03:22
Basically, you got out only what you put in.
03:26
But since then,
03:27
a paradigm shift has taken place in the field of artificial intelligence.
03:30
Today, the action is really around machine learning.
03:34
So rather than handcrafting knowledge representations and features,
03:40
we create algorithms that learn, often from raw perceptual data.
03:46
Basically the same thing that the human infant does.
03:51
The result is A.I. that is not limited to one domain --
03:55
the same system can learn to translate between any pairs of languages,
03:59
or learn to play any computer game on the Atari console.
04:05
Now of course,
04:07
A.I. is still nowhere near having the same powerful, cross-domain
04:11
ability to learn and plan as a human being has.
04:14
The cortex still has some algorithmic tricks
04:16
that we don't yet know how to match in machines.
04:19
So the question is,
04:21
how far are we from being able to match those tricks?
04:26
A couple of years ago,
04:27
we did a survey of some of the world's leading A.I. experts,
04:30
to see what they think, and one of the questions we asked was,
04:33
"By which year do you think there is a 50 percent probability
04:36
that we will have achieved human-level machine intelligence?"
04:40
We defined human-level here as the ability to perform
04:44
almost any job at least as well as an adult human,
04:47
so real human-level, not just within some limited domain.
04:51
And the median answer was 2040 or 2050,
04:55
depending on precisely which group of experts we asked.
04:58
Now, it could happen much, much later, or sooner,
05:02
the truth is nobody really knows.
05:05
What we do know is that the ultimate limit to information processing
05:09
in a machine substrate lies far outside the limits in biological tissue.
05:15
This comes down to physics.
05:17
A biological neuron fires, maybe, at 200 hertz, 200 times a second.
05:22
But even a present-day transistor operates at the gigahertz.
05:25
Neurons propagate slowly in axons, 100 meters per second, tops.
05:31
But in computers, signals can travel at the speed of light.
05:35
There are also size limitations,
05:36
like a human brain has to fit inside a cranium,
05:39
but a computer can be the size of a warehouse or larger.
05:44
So the potential for superintelligence lies dormant in matter,
05:50
much like the power of the atom lay dormant throughout human history,
05:56
patiently waiting there until 1945.
06:00
In this century,
06:01
scientists may learn to awaken the power of artificial intelligence.
06:05
And I think we might then see an intelligence explosion.
06:10
Now most people, when they think about what is smart and what is dumb,
06:14
I think have in mind a picture roughly like this.
06:17
So at one end we have the village idiot,
06:19
and then far over at the other side
06:22
we have Ed Witten, or Albert Einstein, or whoever your favorite guru is.
06:27
But I think that from the point of view of artificial intelligence,
06:31
the true picture is actually probably more like this:
06:35
A.I. starts out at this point here, at zero intelligence,
06:38
and then, after many, many years of really hard work,
06:41
maybe eventually we get to mouse-level artificial intelligence,
06:45
something that can navigate cluttered environments
06:47
as well as a mouse can.
06:49
And then, after many, many more years of really hard work, lots of investment,
06:54
maybe eventually we get to chimpanzee-level artificial intelligence.
06:58
And then, after even more years of really, really hard work,
07:02
we get to village idiot artificial intelligence.
07:04
And a few moments later, we are beyond Ed Witten.
07:08
The train doesn't stop at Humanville Station.
07:11
It's likely, rather, to swoosh right by.
07:14
Now this has profound implications,
07:16
particularly when it comes to questions of power.
07:20
For example, chimpanzees are strong --
07:21
pound for pound, a chimpanzee is about twice as strong as a fit human male.
07:27
And yet, the fate of Kanzi and his pals depends a lot more
07:31
on what we humans do than on what the chimpanzees do themselves.
07:37
Once there is superintelligence,
07:39
the fate of humanity may depend on what the superintelligence does.
07:44
Think about it:
07:45
Machine intelligence is the last invention that humanity will ever need to make.
07:50
Machines will then be better at inventing than we are,
07:53
and they'll be doing so on digital timescales.
07:56
What this means is basically a telescoping of the future.
08:00
Think of all the crazy technologies that you could have imagined
08:04
maybe humans could have developed in the fullness of time:
08:07
cures for aging, space colonization,
08:10
self-replicating nanobots or uploading of minds into computers,
08:14
all kinds of science fiction-y stuff
08:16
that's nevertheless consistent with the laws of physics.
08:19
All of this superintelligence could develop, and possibly quite rapidly.
08:24
Now, a superintelligence with such technological maturity
08:28
would be extremely powerful,
08:30
and at least in some scenarios, it would be able to get what it wants.
08:34
We would then have a future that would be shaped by the preferences of this A.I.
08:41
Now a good question is, what are those preferences?
08:46
Here it gets trickier.
08:48
To make any headway with this,
08:49
we must first of all avoid anthropomorphizing.
08:53
And this is ironic because every newspaper article
08:57
about the future of A.I. has a picture of this:
09:02
So I think what we need to do is to conceive of the issue more abstractly,
09:06
not in terms of vivid Hollywood scenarios.
09:09
We need to think of intelligence as an optimization process,
09:12
a process that steers the future into a particular set of configurations.
09:18
A superintelligence is a really strong optimization process.
09:21
It's extremely good at using available means to achieve a state
09:26
in which its goal is realized.
09:28
This means that there is no necessary connection between
09:31
being highly intelligent in this sense,
09:33
and having an objective that we humans would find worthwhile or meaningful.
09:39
Suppose we give an A.I. the goal to make humans smile.
09:43
When the A.I. is weak, it performs useful or amusing actions
09:46
that cause its user to smile.
09:48
When the A.I. becomes superintelligent,
09:51
it realizes that there is a more effective way to achieve this goal:
09:54
take control of the world
09:56
and stick electrodes into the facial muscles of humans
09:59
to cause constant, beaming grins.
10:02
Another example,
10:03
suppose we give A.I. the goal to solve a difficult mathematical problem.
10:06
When the A.I. becomes superintelligent,
10:08
it realizes that the most effective way to get the solution to this problem
10:13
is by transforming the planet into a giant computer,
10:16
so as to increase its thinking capacity.
10:18
And notice that this gives the A.I.s an instrumental reason
10:21
to do things to us that we might not approve of.
10:23
Human beings in this model are threats,
10:25
we could prevent the mathematical problem from being solved.
10:29
Of course, perceivably things won't go wrong in these particular ways;
10:32
these are cartoon examples.
10:34
But the general point here is important:
10:36
if you create a really powerful optimization process
10:39
to maximize for objective x,
10:41
you better make sure that your definition of x
10:43
incorporates everything you care about.
10:46
This is a lesson that's also taught in many a myth.
10:51
King Midas wishes that everything he touches be turned into gold.
10:56
He touches his daughter, she turns into gold.
10:59
He touches his food, it turns into gold.
11:01
This could become practically relevant,
11:04
not just as a metaphor for greed,
11:06
but as an illustration of what happens
11:08
if you create a powerful optimization process
11:11
and give it misconceived or poorly specified goals.
11:16
Now you might say, if a computer starts sticking electrodes into people's faces,
11:21
we'd just shut it off.
11:24
A, this is not necessarily so easy to do if we've grown dependent on the system --
11:29
like, where is the off switch to the Internet?
11:32
B, why haven't the chimpanzees flicked the off switch to humanity,
11:37
or the Neanderthals?
11:39
They certainly had reasons.
11:41
We have an off switch, for example, right here.
11:44
(Choking)
11:46
The reason is that we are an intelligent adversary;
11:49
we can anticipate threats and plan around them.
11:51
But so could a superintelligent agent,
11:54
and it would be much better at that than we are.
11:57
The point is, we should not be confident that we have this under control here.
12:04
And we could try to make our job a little bit easier by, say,
211
724911
3447
ื•ื ื•ื›ืœ ืœื ืกื•ืช ืœืขืฉื•ืช ืืช ื”ืขื‘ื•ื“ื” ืฉืœื ื• ืžืขื˜ ืงืœื” ื™ื•ืชืจ ืขืœ ื™ื“ื™, ื ื’ื™ื“,
12:08
putting the A.I. in a box,
212
728358
1590
ืœืฉื™ื ืืช ื”ื‘"ืž ื‘ืงื•ืคืกื”,
12:09
like a secure software environment,
213
729948
1796
ื›ืžื• ืกื‘ื™ื‘ืช ืชื•ื›ื ื” ืžืื•ื‘ื˜ื—ืช,
12:11
a virtual reality simulation from which it cannot escape.
214
731744
3022
ื”ื“ืžื™ื™ืช ืžืฆื™ืื•ืช ืžื“ื•ืžื” ืžืžื ื” ื”ื™ื ืœื ื™ื›ื•ืœื” ืœื‘ืจื•ื—.
12:14
But how confident can we be that the A.I. couldn't find a bug.
215
734766
4146
ืื‘ืœ ื›ืžื” ื‘ื˜ื•ื—ื™ื ืื ื—ื ื• ื™ื›ื•ืœื™ื ืœื”ื™ื•ืช ืฉื”ื‘"ืž ืœื ืชื•ื›ืœ ืœืžืฆื•ื ื‘ืื’.
12:18
Given that merely human hackers find bugs all the time,
216
738912
3169
ื‘ื”ืชื—ืฉื‘ ื‘ื–ื” ืฉื”ืืงืจื™ื ืื ื•ืฉื™ื™ื ืžื•ืฆืื™ื ื‘ืื’ื™ื ื›ืœ ื”ื–ืžืŸ,
12:22
I'd say, probably not very confident.
217
742081
3036
ื”ื™ืชื™ ืื•ืžืจ, ืฉื›ื ืจืื” ืœื ื›ืœ ื›ืš ื‘ื˜ื•ื—ื™ื.
12:26
So we disconnect the ethernet cable to create an air gap,
218
746237
4548
ืื– ื ื ืชืง ืืช ื›ื‘ืœ ื”ืจืฉืช ื›ื“ื™ ืœื™ืฆื•ืจ ืžืจื•ื•ื— ืื•ื™ืจ,
12:30
but again, like merely human hackers
219
750785
2668
ืื‘ืœ ืฉื•ื‘, ื›ืžื• ื”ืืงืจื™ื ืื ื•ืฉื™ื™ื ืจื’ื™ืœื™ื
12:33
routinely transgress air gaps using social engineering.
220
753453
3381
ืฉืขื•ื‘ืจื™ื ืคืขืจื™ ืื•ื™ืจ ื‘ืื•ืคืŸ ืฉื•ื˜ืฃ ื‘ืขื–ืจืช ื”ื ื“ืกื” ื—ื‘ืจืชื™ืช.
12:36
Right now, as I speak,
221
756834
1259
ืžืžืฉ ืขื›ืฉื™ื•, ื›ืฉืื ื™ ืžื“ื‘ืจ,
12:38
I'm sure there is some employee out there somewhere
222
758093
2389
ืื ื™ ื‘ื˜ื•ื— ืฉื™ืฉ ืื™ื–ืฉื”ื• ืขื•ื‘ื“ืช ืื™ ืฉื
12:40
who has been talked into handing out her account details
223
760482
3346
ืฉืฉื›ื ืขื• ืื•ืชื” ืœืชืช ืืช ืคืจื˜ื™ ื”ื—ืฉื‘ื•ืŸ ืฉืœื”
12:43
by somebody claiming to be from the I.T. department.
224
763828
2746
ืขืœ ื™ื“ื™ ืžื™ืฉื”ื• ืฉื˜ื•ืขืŸ ืฉื”ื•ื ืžืžื—ืœืงืช ืžืขืจื›ื•ืช ื”ืžื™ื“ืข.
12:46
More creative scenarios are also possible,
225
766574
2127
ืชืจื—ื™ืฉื™ื ื™ื•ืชืจ ื™ืฆื™ืจืชื™ื™ื ื”ื ื’ื ืืคืฉืจื™ื™ื,
12:48
like if you're the A.I.,
226
768701
1315
ื›ืžื• ืื ืืชื ื”ื‘"ืž,
12:50
you can imagine wiggling electrodes around in your internal circuitry
227
770016
3532
ืืชื ื™ื›ื•ืœื™ื ืœื“ืžื™ื™ืŸ ืฉื™ื ื•ื™ ืืœืงื˜ืจื•ื“ื•ืช ื‘ื—ื™ื•ื•ื˜ ื”ืคื ื™ืžื™ ืฉืœื›ื
12:53
to create radio waves that you can use to communicate.
228
773548
3462
ื›ื“ื™ ืœื™ืฆื•ืจ ื’ืœื™ ืจื“ื™ื• ื‘ื”ื ืืชื ื™ื›ื•ืœื™ื ืœื”ืฉืชืžืฉ ื›ื“ื™ ืœืชืงืฉืจ.
12:57
Or maybe you could pretend to malfunction,
229
777010
2424
ืื• ืื•ืœื™ ืชื•ื›ืœื• ืœื”ืขืžื™ื“ ืคื ื™ื ืฉืืชื ืžืงื•ืœืงืœื™ื,
12:59
and then when the programmers open you up to see what went wrong with you,
230
779434
3497
ื•ืื– ื›ืฉื”ืžืชื›ื ืช ื™ืคืชื— ืืชื›ื ืœืจืื•ืช ืžื” ืœื ืชืงื™ืŸ ืื™ืชื›ื,
13:02
they look at the source code -- Bam! --
231
782931
1936
ื”ื ื™ื‘ื™ื˜ื™ ื‘ืงื•ื“ ื”ืžืงื•ืจ -- ื‘ืื! --
13:04
the manipulation can take place.
232
784867
2447
ื”ืžื ื™ืคื•ืœืฆื™ื” ื™ื›ื•ืœื” ืœื”ืชืจื—ืฉ.
13:07
Or it could output the blueprint to a really nifty technology,
233
787314
3430
ืื• ืฉื”ื™ื ืชื•ื›ืœ ืœื”ืคื™ืง ืชื•ื›ื ื™ืช ืœื˜ื›ื ื•ืœื•ื’ื™ื” ืžืžืฉ ืžื’ื ื™ื‘ื”,
13:10
and when we implement it,
234
790744
1398
ื•ื›ืฉื ื™ื™ืฉื ืื•ืชื”,
13:12
it has some surreptitious side effect that the A.I. had planned.
235
792142
4397
ื™ื”ื™ื” ืœื” ืืคืงื˜ ืžืฉื ื” ื—ืฉืื™ ืฉื”ื‘"ืž ืชื›ื ื ื”.
13:16
The point here is that we should not be confident in our ability
236
796539
3463
ื”ื ืงื•ื“ื” ืคื” ื”ื™ื ืฉืื ื—ื ื• ืœื ืฆืจื™ื›ื™ื ืœื”ื™ื•ืช ื‘ื˜ื•ื—ื™ื ื‘ื™ื›ื•ืœืช ืฉืœื ื•
13:20
to keep a superintelligent genie locked up in its bottle forever.
237
800002
3808
ืœืฉืžื•ืจ ืขืœ ืฉื“ ื‘ื™ื ืช ื”ืขืœ ื ืขื•ืœ ื‘ื‘ืงื‘ื•ืง ืฉืœื• ืœืขื“.
13:23
Sooner or later, it will out.
238
803810
2254
ื‘ืžื•ืงื“ื ืื• ื‘ืžืื•ื—ืจ, ื”ื•ื ื™ืฆื.
13:27
I believe that the answer here is to figure out
239
807034
3103
ืื ื™ ืžืืžื™ืŸ ืฉื”ืชืฉื•ื‘ื” ืคื” ื”ื™ื ืœื”ื‘ื™ืŸ
13:30
how to create superintelligent A.I. such that even if -- when -- it escapes,
240
810137
5024
ืื™ืš ืœื™ืฆื•ืจ ื‘ื™ื ืช ืขืœ ื›ื–ื• ืฉืืคื™ืœื• ืื -- ืื• ื›ืืฉืจ -- ื”ื™ื ืชื‘ืจื—,
13:35
it is still safe because it is fundamentally on our side
241
815161
3277
ื”ื™ื ืขื“ื™ื™ืŸ ืชื”ื™ื” ื‘ื˜ื•ื—ื” ืžืคื ื™ ืฉื”ื™ื ื‘ืื•ืคืŸ ื‘ืกื™ืกื™ ืœืฆื™ื“ื ื•
13:38
because it shares our values.
242
818438
1899
ืžืคื ื™ ืฉื”ื™ื ื—ื•ืœืงืช ืืช ื”ืขืจื›ื™ื ืฉืœื ื•.
13:40
I see no way around this difficult problem.
243
820337
3210
ืื ื™ ืœื ืจื•ืื” ื“ืจืš ืœืขืงื•ืฃ ืืช ื”ื‘ืขื™ื” ื”ืžืกื•ื‘ื›ืช ื”ื–ื•.
13:44
Now, I'm actually fairly optimistic that this problem can be solved.
244
824557
3834
ืขื›ืฉื™ื•, ืื ื™ ืœืžืขืฉื” ื“ื™ ืื•ืคื˜ื™ืžื™ ืฉื”ื‘ืขื™ื” ื”ื–ื• ื™ื›ื•ืœื” ืœื”ืคืชืจ.
13:48
We wouldn't have to write down a long list of everything we care about,
245
828391
3903
ืœื ื ื”ื™ื” ืฆืจื™ื›ื™ื ืœื›ืชื•ื‘ ืจืฉื™ืžื” ืืจื•ื›ื” ืฉืœ ื›ืœ ืžื” ืฉืื›ืคืช ืœื ื• ืžืžื ื•,
13:52
or worse yet, spell it out in some computer language
246
832294
3643
ืื• ื’ืจื•ืข ื™ื•ืชืจ, ืœืคืจื˜ ืืช ื–ื” ื‘ืฉืคืช ืžื—ืฉื‘ ื›ืœืฉื”ื™
13:55
like C++ or Python,
247
835937
1454
ื›ืžื• C++ ืื• ืคื™ื™ืชื•ืŸ,
13:57
that would be a task beyond hopeless.
248
837391
2767
ื–ื• ืชื”ื™ื” ืžืฉื™ืžื” ืžืขื‘ืจ ืœื—ืกืจืช ืชืงื•ื•ื”.
14:00
Instead, we would create an A.I. that uses its intelligence
249
840158
4297
ื‘ืžืงื•ื, ืื ื—ื ื• ืฆืจื™ื›ื™ื ืœื™ืฆื•ืจ ื‘"ืž ืฉืžืฉืžืฉืช ื‘ื‘ื™ื ื” ืฉืœื”
14:04
to learn what we value,
250
844455
2771
ื›ื“ื™ ืœืœืžื•ื“ ืžื” ืื ื—ื ื• ืžืขืจื™ื›ื™ื,
14:07
and its motivation system is constructed in such a way that it is motivated
251
847226
5280
ื•ืžืขืจื›ืช ื”ืžื•ื˜ื™ื‘ืฆื™ื” ืฉืœื” ืžื•ืจื›ื‘ืช ื‘ื“ืจืš ื›ื–ื• ืฉื™ืฉ ืœื” ืžื•ื˜ื™ื‘ืฆื™ื”
14:12
to pursue our values or to perform actions that it predicts we would approve of.
252
852506
5232
ืœืจื“ื•ืฃ ืื—ืจื™ ื”ืขืจื›ื™ื ืฉืœื ื• ื›ื“ื™ ืœื‘ืฆืข ืคืขื•ืœื•ืช ืฉื”ื™ื ืฆื•ืคื” ืฉื ืกื›ื™ื ืœื”ืŸ.
14:17
We would thus leverage its intelligence as much as possible
253
857738
3414
ืœื›ืŸ ื ื ืฆืœ ืืช ื”ื‘ื™ื ื” ืฉืœื” ื›ื›ืœ ื”ืืคืฉืจ
14:21
to solve the problem of value-loading.
254
861152
2745
ื›ื“ื™ ืœืคืชื•ืจ ืืช ื‘ืขื™ื™ืช ื”ื˜ืขื ืช ื”ืขืจื›ื™ื.
14:24
This can happen,
255
864727
1512
ื–ื” ื™ื›ื•ืœ ืœืงืจื•ืช,
14:26
and the outcome could be very good for humanity.
256
866239
3596
ื•ื”ืชื•ืฆืื” ืชื•ื›ืœ ืœื”ื™ื•ืช ืžืื•ื“ ื˜ื•ื‘ื” ืœืื ื•ืฉื•ืช.
14:29
But it doesn't happen automatically.
257
869835
3957
ืื‘ืœ ื–ื” ืœื ืงื•ืจื” ืื•ื˜ื•ืžื˜ื™ืช.
14:33
The initial conditions for the intelligence explosion
258
873792
2998
ื”ืชื ืื™ื ื”ื”ืชื—ืœืชื™ื™ื ืœื”ืชืคื•ืฆืฆื•ืช ื”ื‘ื™ื ื”
14:36
might need to be set up in just the right way
259
876790
2863
ืื•ืœื™ ื™ืฆื˜ืจื›ื• ืœื”ื™ืงื‘ืข ื‘ื“ื™ื•ืง ื‘ื“ืจืš ื”ื ื›ื•ื ื”
14:39
if we are to have a controlled detonation.
260
879653
3530
ืื ืื ื—ื ื• ืจื•ืฆื™ื ืฉื™ื”ื™ื” ืœื ื• ืคื™ืฆื•ืฅ ืžื‘ื•ืงืจ.
14:43
The values that the A.I. has need to match ours,
261
883183
2618
ื”ืขืจื›ื™ื ืฉื™ืฉ ืœื‘"ืž ืฆืจื™ื›ื™ื ืœื”ืชืื™ื ืœืฉืœื ื•,
14:45
not just in the familiar context,
262
885801
1760
ืœื ืจืง ื‘ื”ืงืฉืจ ื”ืžื•ื›ืจ,
14:47
like where we can easily check how the A.I. behaves,
263
887561
2438
ื›ืžื• ืื™ืคื” ืื ื—ื ื• ื™ื›ื•ืœื™ื ืœื‘ื“ื•ืง ื‘ืงืœื•ืช ืื™ืš ื”ื‘"ืž ืžืชื ื”ื’ืช,
14:49
but also in all novel contexts that the A.I. might encounter
264
889999
3234
ืืœื ื’ื ื‘ื”ืงืฉืจื™ื ื”ื›ื™ ื—ื“ืฉื™ื ืฉื”ื‘"ืž ืื•ืœื™ ืชื™ืชืงืœ ื‘ื”ื
14:53
in the indefinite future.
265
893233
1557
ื‘ืขืชื™ื“ ื”ื‘ืœืชื™ ืžื•ื’ื“ืจ.
14:54
And there are also some esoteric issues that would need to be solved, sorted out:
266
894790
4737
ื•ื™ืฉ ื’ื ื›ืžื” ื ื•ืฉืื™ื ืื™ื–ื•ื˜ืจื™ื™ื ืฉื ืฆื˜ืจืš ืœืคืชื•ืจ, ืœืืจื’ืŸ:
14:59
the exact details of its decision theory,
267
899527
2089
ื”ืคืจื˜ื™ื ื”ืžื“ื•ื™ื™ืงื™ื ืฉืœ ืชืื•ืจื™ื™ืช ื”ื”ื—ืœื˜ื•ืช ืฉืœื”,
15:01
how to deal with logical uncertainty and so forth.
268
901616
2864
ืื™ืš ื”ื™ื ืžืชืžื•ื“ื“ืช ืขื ื—ื•ืกืจ ื•ื“ืื•ืช ืœื•ื’ื™ ื•ื›ืš ื”ืืœื”.
15:05
So the technical problems that need to be solved to make this work
269
905330
3102
ืื– ื”ื‘ืขื™ื•ืช ื”ื˜ื›ื ื™ื•ืช ืฉืฆืจื™ื›ื•ืช ืœื”ืคืชืจ ื›ื“ื™ ืœื’ืจื•ื ืœื–ื” ืœืขื‘ื•ื“
15:08
look quite difficult --
270
908432
1113
ื ืจืื™ื ื“ื™ ืงืฉื™ื --
15:09
not as difficult as making a superintelligent A.I.,
271
909545
3380
ืœื ืงืฉื™ื ื›ืžื• ืœืขืฉื•ืช ื‘"ืž ืกื•ืคืจ ืื™ื ื˜ื™ืœื™ื’ื ื˜ื™ืช,
15:12
but fairly difficult.
272
912925
2868
ืื‘ืœ ื“ื™ ืงืฉื™ื.
15:15
Here is the worry:
273
915793
1695
ื”ื ื” ื”ื“ืื’ื”:
15:17
Making superintelligent A.I. is a really hard challenge.
274
917488
4684
ืœื™ืฆื•ืจ ื‘"ืž ืกื•ืคืจ ืื™ื ื˜ื™ืœื™ื’ื ื˜ื™ืช ื–ื” ืืชื’ืจ ืžืื•ื“ ืงืฉื”.
15:22
Making superintelligent A.I. that is safe
275
922172
2548
ืœื™ืฆื•ืจ ื‘"ืž ืกื•ืคืจ ืื™ื ื˜ื™ืœื™ื’ื ื˜ื™ืช ืฉื”ื™ื ื‘ื˜ื•ื—ื”
15:24
involves some additional challenge on top of that.
276
924720
2416
ื›ืจื•ืš ื‘ืืชื’ืจ ื ื•ืกืฃ ืžืขื‘ืจ ืœื–ื”.
15:28
The risk is that if somebody figures out how to crack the first challenge
277
928216
3487
ื”ืกื™ื›ื•ืŸ ื”ื•ื ืฉืžื™ืฉื”ื• ื™ื‘ื™ืŸ ืื™ืš ืœืคืฆื— ืืช ื”ืืชื’ืจ ื”ืจืืฉื•ืŸ
15:31
without also having cracked the additional challenge
278
931703
3001
ื‘ืœื™ ืœืคืชื•ืจ ืืช ื”ืืชื’ืจ ื”ื ื•ืกืฃ
15:34
of ensuring perfect safety.
279
934704
1901
ืฉืœ ืœื”ื‘ื˜ื™ื— ื‘ื˜ื™ื—ื•ืช ืžื•ืฉืœืžืช.
15:37
So I think that we should work out a solution
280
937375
3331
ืื– ืื ื™ ื—ื•ืฉื‘ ืฉืื ื—ื ื• ืฆืจื™ื›ื™ื ืœืžืฆื•ื ืคื™ืชืจื•ืŸ
15:40
to the control problem in advance,
281
940706
2822
ืœื‘ืขื™ืช ื”ืฉืœื™ื˜ื” ืžืจืืฉ,
15:43
so that we have it available by the time it is needed.
282
943528
2660
ื›ืš ืฉื”ื™ื ืชื”ื™ื” ื–ืžื™ื ื” ื‘ื–ืžืŸ ืฉืžืฆื˜ืจืš ืื•ืชื”.
15:46
Now it might be that we cannot solve the entire control problem in advance
283
946768
3507
ืขื›ืฉื™ื• ืื•ืœื™ ืœื ื ื•ื›ืœ ืœืคืชื•ืจ ืืช ื›ืœ ื‘ืขื™ื™ืช ื”ืฉืœื™ื˜ื” ืžืจืืฉ
15:50
because maybe some elements can only be put in place
284
950275
3024
ืžืคื ื™ ืฉืื•ืœื™ ื›ืžื” ืืœืžื ื˜ื™ื ื™ื›ื•ืœื™ื ืœื”ื™ื•ืช ื‘ืžืงื•ื
15:53
once you know the details of the architecture where it will be implemented.
285
953299
3997
ืจืง ื‘ืจื’ืข ืฉืื ื—ื ื• ื™ื•ื“ืขื™ื ืืช ื”ืคืจื˜ื™ื ืฉืœ ื”ืืจื›ื™ื˜ืงื˜ื•ืจื” ื‘ื” ื”ื™ื ืชื™ื•ืฉื.
15:57
But the more of the control problem that we solve in advance,
286
957296
3380
ืื‘ืœ ื›ื›ืœ ืฉื ืคืชื•ืจ ื™ื•ืชืจ ืžื‘ืขื™ืช ื”ืฉืœื™ื˜ื” ืžืจืืฉ,
16:00
the better the odds that the transition to the machine intelligence era
287
960676
4090
ื”ืกื™ื›ื•ื™ื™ื ื™ื”ื™ื• ื˜ื•ื‘ื™ื ื™ื•ืชืจ ืฉื”ืžืขื‘ืจ ืœืขื™ื“ืŸ ื‘ื™ื ืช ื”ืžื›ื•ื ื”
16:04
will go well.
288
964766
1540
ื™ืขื‘ื•ืจ ื˜ื•ื‘.
16:06
This to me looks like a thing that is well worth doing
289
966306
4644
ื–ื” ื‘ืฉื‘ื™ืœื™ ื ืจืื” ื›ืžื• ืžืฉื”ื• ืฉืฉื•ื•ื” ืœืขืฉื•ืช
16:10
and I can imagine that if things turn out okay,
290
970950
3332
ื•ืื ื™ ื™ื›ื•ืœ ืœื“ืžื™ื™ืŸ ืฉืื ื”ื“ื‘ืจื™ื ื™ืกืชื“ืจื•,
16:14
that people a million years from now look back at this century
291
974282
4658
ืฉืื ืฉื™ื ื‘ืขื•ื“ ืžืœื™ื•ืŸ ืฉื ื” ื™ื‘ื™ื˜ื• ืื—ื•ืจื” ื‘ืžืื” ื”ื–ื•
16:18
and it might well be that they say that the one thing we did that really mattered
292
978940
4002
ื•ื™ื™ืชื›ืŸ ืžืื•ื“ ืฉื”ื ื™ืืžืจื• ืฉื”ื“ื‘ืจ ื”ื™ื—ื™ื“ ืฉืขืฉื™ื ื• ืฉื‘ืืžืช ืฉื™ื ื”
16:22
was to get this thing right.
293
982942
1567
ื”ื™ื” ืœืขืฉื•ืช ืืช ื–ื” ื ื›ื•ืŸ.
16:24
Thank you.
294
984509
1689
ืชื•ื“ื” ืœื›ื.
16:26
(Applause)
295
986198
2813
(ืžื—ื™ืื•ืช ื›ืคื™ื™ื)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7