Can we build AI without losing control over it? | Sam Harris

3,802,856 views · 2016-10-19

TED


ืžืชืจื’ื: Yuri Dulkin ืžื‘ืงืจ: Sigal Tifferet
00:13
I'm going to talk about a failure of intuition that many of us suffer from. It's really a failure to detect a certain kind of danger. I'm going to describe a scenario that I think is both terrifying and likely to occur, and that's not a good combination, as it turns out. And yet rather than be scared, most of you will feel that what I'm talking about is kind of cool. I'm going to describe how the gains we make in artificial intelligence could ultimately destroy us. And in fact, I think it's very difficult to see how they won't destroy us or inspire us to destroy ourselves. And yet if you're anything like me, you'll find that it's fun to think about these things. And that response is part of the problem. OK? That response should worry you.
00:59
And if I were to convince you in this talk that we were likely to suffer a global famine, either because of climate change or some other catastrophe, and that your grandchildren, or their grandchildren, are very likely to live like this, you wouldn't think, "Interesting. I like this TED Talk."

01:21
Famine isn't fun. Death by science fiction, on the other hand, is fun, and one of the things that worries me most about the development of AI at this point is that we seem unable to marshal an appropriate emotional response to the dangers that lie ahead. I am unable to marshal this response, and I'm giving this talk.
01:42
It's as though we stand before two doors. Behind door number one, we stop making progress in building intelligent machines. Our computer hardware and software just stops getting better for some reason. Now take a moment to consider why this might happen. I mean, given how valuable intelligence and automation are, we will continue to improve our technology if we are at all able to. What could stop us from doing this? A full-scale nuclear war? A global pandemic? An asteroid impact? Justin Bieber becoming president of the United States?

02:20
(Laughter)

02:24
The point is, something would have to destroy civilization as we know it. You have to imagine how bad it would have to be to prevent us from making improvements in our technology permanently, generation after generation. Almost by definition, this is the worst thing that's ever happened in human history.
02:44
So the only alternative, and this is what lies behind door number two, is that we continue to improve our intelligent machines year after year after year. At a certain point, we will build machines that are smarter than we are, and once we have machines that are smarter than we are, they will begin to improve themselves. And then we risk what the mathematician IJ Good called an "intelligence explosion," that the process could get away from us.

03:10
Now, this is often caricatured, as I have here, as a fear that armies of malicious robots will attack us. But that isn't the most likely scenario. It's not that our machines will become spontaneously malevolent. The concern is really that we will build machines that are so much more competent than we are that the slightest divergence between their goals and our own could destroy us.

03:35
Just think about how we relate to ants. We don't hate them. We don't go out of our way to harm them. In fact, sometimes we take pains not to harm them. We step over them on the sidewalk. But whenever their presence seriously conflicts with one of our goals, let's say when constructing a building like this one, we annihilate them without a qualm. The concern is that we will one day build machines that, whether they're conscious or not, could treat us with similar disregard.
04:05
Now, I suspect this seems far-fetched to many of you. I bet there are those of you who doubt that superintelligent AI is possible, much less inevitable. But then you must find something wrong with one of the following assumptions. And there are only three of them.

04:23
Intelligence is a matter of information processing in physical systems. Actually, this is a little bit more than an assumption. We have already built narrow intelligence into our machines, and many of these machines perform at a level of superhuman intelligence already. And we know that mere matter can give rise to what is called "general intelligence," an ability to think flexibly across multiple domains, because our brains have managed it. Right? I mean, there's just atoms in here, and as long as we continue to build systems of atoms that display more and more intelligent behavior, we will eventually, unless we are interrupted, we will eventually build general intelligence into our machines.

05:11
It's crucial to realize that the rate of progress doesn't matter, because any progress is enough to get us into the end zone. We don't need Moore's law to continue. We don't need exponential progress. We just need to keep going.
05:25
The second assumption is that we will keep going. We will continue to improve our intelligent machines. And given the value of intelligence -- I mean, intelligence is either the source of everything we value or we need it to safeguard everything we value. It is our most valuable resource. So we want to do this. We have problems that we desperately need to solve. We want to cure diseases like Alzheimer's and cancer. We want to understand economic systems. We want to improve our climate science. So we will do this, if we can. The train is already out of the station, and there's no brake to pull.

06:05
Finally, we don't stand on a peak of intelligence, or anywhere near it, likely. And this really is the crucial insight. This is what makes our situation so precarious, and this is what makes our intuitions about risk so unreliable.
06:23
Now, just consider the smartest person who has ever lived. On almost everyone's shortlist here is John von Neumann. I mean, the impression that von Neumann made on the people around him, and this included the greatest mathematicians and physicists of his time, is fairly well-documented. If only half the stories about him are half true, there's no question he's one of the smartest people who has ever lived. So consider the spectrum of intelligence. Here we have John von Neumann. And then we have you and me. And then we have a chicken.

06:57
(Laughter)

06:59
Sorry, a chicken.

07:00
(Laughter)

07:01
There's no reason for me to make this talk more depressing than it needs to be.

07:05
(Laughter)

07:08
It seems overwhelmingly likely, however, that the spectrum of intelligence extends much further than we currently conceive, and if we build machines that are more intelligent than we are, they will very likely explore this spectrum in ways that we can't imagine, and exceed us in ways that we can't imagine.
07:27
And it's important to recognize that this is true by virtue of speed alone. Right? So imagine if we just built a superintelligent AI that was no smarter than your average team of researchers at Stanford or MIT. Well, electronic circuits function about a million times faster than biochemical ones, so this machine should think about a million times faster than the minds that built it. So you set it running for a week, and it will perform 20,000 years of human-level intellectual work, week after week after week.
08:01
How could we even understand, much less constrain, a mind making this sort of progress?

08:08
The other thing that's worrying, frankly, is that, imagine the best case scenario. So imagine we hit upon a design of superintelligent AI that has no safety concerns. We have the perfect design the first time around. It's as though we've been handed an oracle that behaves exactly as intended. Well, this machine would be the perfect labor-saving device. It can design the machine that can build the machine that can do any physical work, powered by sunlight, more or less for the cost of raw materials. So we're talking about the end of human drudgery. We're also talking about the end of most intellectual work.

08:49
So what would apes like ourselves do in this circumstance? Well, we'd be free to play Frisbee and give each other massages. Add some LSD and some questionable wardrobe choices, and the whole world could be like Burning Man.

09:02
(Laughter)
09:06
Now, that might sound pretty good, but ask yourself what would happen under our current economic and political order? It seems likely that we would witness a level of wealth inequality and unemployment that we have never seen before. Absent a willingness to immediately put this new wealth to the service of all humanity, a few trillionaires could grace the covers of our business magazines while the rest of the world would be free to starve.

09:34
And what would the Russians or the Chinese do if they heard that some company in Silicon Valley was about to deploy a superintelligent AI? This machine would be capable of waging war, whether terrestrial or cyber, with unprecedented power. This is a winner-take-all scenario. To be six months ahead of the competition here is to be 500,000 years ahead, at a minimum.
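The same assumed 10^6 speed-up is where the 500,000-year figure comes from (a sketch, not the speaker's own arithmetic on stage):

$$
6\ \text{months} \times 10^{6} = 0.5\ \text{years} \times 10^{6} = 500{,}000\ \text{years}.
$$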
09:59
So it seems that even mere rumors of this kind of breakthrough could cause our species to go berserk.

10:06
Now, one of the most frightening things, in my view, at this moment, are the kinds of things that AI researchers say when they want to be reassuring. And the most common reason we're told not to worry is time. This is all a long way off, don't you know. This is probably 50 or 100 years away. One researcher has said, "Worrying about AI safety is like worrying about overpopulation on Mars." This is the Silicon Valley version of "don't worry your pretty little head about it."

10:38
(Laughter)

10:39
No one seems to notice that referencing the time horizon is a total non sequitur. If intelligence is just a matter of information processing, and we continue to improve our machines, we will produce some form of superintelligence. And we have no idea how long it will take us to create the conditions to do that safely.
11:04
Let me say that again. We have no idea how long it will take us to create the conditions to do that safely.

11:12
And if you haven't noticed, 50 years is not what it used to be. This is 50 years in months. This is how long we've had the iPhone. This is how long "The Simpsons" has been on television. Fifty years is not that much time to meet one of the greatest challenges our species will ever face.

11:31
Once again, we seem to be failing to have an appropriate emotional response to what we have every reason to believe is coming. The computer scientist Stuart Russell has a nice analogy here. He said, imagine that we received a message from an alien civilization, which read: "People of Earth, we will arrive on your planet in 50 years. Get ready." And now we're just counting down the months until the mothership lands? We would feel a little more urgency than we do.
12:04
Another reason we're told not to worry is that these machines can't help but share our values because they will be literally extensions of ourselves. They'll be grafted onto our brains, and we'll essentially become their limbic systems. Now take a moment to consider that the safest and only prudent path forward, recommended, is to implant this technology directly into our brains. Now, this may in fact be the safest and only prudent path forward, but usually one's safety concerns about a technology have to be pretty much worked out before you stick it inside your head.

12:36
(Laughter)

12:38
The deeper problem is that building superintelligent AI on its own seems likely to be easier than building superintelligent AI and having the completed neuroscience that allows us to seamlessly integrate our minds with it. And given that the companies and governments doing this work are likely to perceive themselves as being in a race against all others, given that to win this race is to win the world, provided you don't destroy it in the next moment, then it seems likely that whatever is easier to do will get done first.
13:10
Now, unfortunately, I don't have a solution to this problem, apart from recommending that more of us think about it. I think we need something like a Manhattan Project on the topic of artificial intelligence. Not to build it, because I think we'll inevitably do that, but to understand how to avoid an arms race and to build it in a way that is aligned with our interests. When you're talking about superintelligent AI that can make changes to itself, it seems that we only have one chance to get the initial conditions right, and even then we will need to absorb the economic and political consequences of getting them right.

13:45
But the moment we admit that information processing is the source of intelligence, that some appropriate computational system is what the basis of intelligence is, and we admit that we will improve these systems continuously, and we admit that the horizon of cognition very likely far exceeds what we currently know, then we have to admit that we are in the process of building some sort of god.

14:15
Now would be a good time to make sure it's a god we can live with.

14:20
Thank you very much.

14:21
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7