Can we build AI without losing control over it? | Sam Harris

3,792,802 views ใƒป 2016-10-19

TED


ไธ‹ใฎ่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจๅ‹•็”ปใ‚’ๅ†็”Ÿใงใใพใ™ใ€‚

็ฟป่จณ: Yasushi Aoki ๆ กๆญฃ: Kaori Nozaki
ๆˆ‘ใ€…ใฎๅคšใใŒๆŠฑใˆใฆใ„ใ‚‹โ€•
00:13
I'm going to talk about a failure of intuition
0
13000
2216
็›ดๆ„Ÿใฎ่ชคใ‚Šใซใคใ„ใฆ ใŠ่ฉฑใ—ใ—ใพใ™
00:15
that many of us suffer from.
1
15240
1600
00:17
It's really a failure to detect a certain kind of danger.
2
17480
3040
ๆญฃ็ขบใซใฏ ใ‚ใ‚‹็จฎใฎๅฑ้™บใ‚’ ๅฏŸ็Ÿฅใ—ๆใชใ†ใจใ„ใ†ใ“ใจใงใ™
00:21
I'm going to describe a scenario
3
21360
1736
ใ“ใ‚Œใ‹ใ‚‰่ชฌๆ˜Žใ™ใ‚‹ใ‚ทใƒŠใƒชใ‚ชใฏ
็งใฎ่€ƒใˆใงใฏ ๆใ‚ใ—ใ„ใจๅŒๆ™‚ใซ
00:23
that I think is both terrifying
4
23120
3256
00:26
and likely to occur,
5
26400
1760
่ตทใ“ใ‚Šใใ†ใชใ“ใจ ใงใ‚‚ใ‚ใ‚Šใพใ™
00:28
and that's not a good combination,
6
28840
1656
ใ‚ใ‚ŠใŒใŸใใชใ„ ็ต„ใฟๅˆใ‚ใ›ใงใ™ใญ
00:30
as it turns out.
7
30520
1536
่€ƒใˆใฆใฟใŸใ‚‰
ใ—ใ‹ใ‚‚ ๅคšใใฎไบบใŒ ใใ‚Œใ‚’ๆใ‚Œใ‚‹ใ‚ˆใ‚Šใฏ
00:32
And yet rather than be scared, most of you will feel
8
32080
2456
00:34
that what I'm talking about is kind of cool.
9
34560
2080
็ด ๆ•ตใชใ“ใจใฎใ‚ˆใ†ใซ ๆ„Ÿใ˜ใฆใ„ใพใ™
ไบบๅทฅ็Ÿฅ่ƒฝใซใ‚ˆใฃใฆ ๆˆ‘ใ€…ใŒใฉใฎใ‚ˆใ†ใซๅˆฉ็›Šใ‚’ๅพ—
00:37
I'm going to describe how the gains we make
10
37200
2976
ใใ—ใฆๆœ€็ต‚็š„ใซใฏ ็ ดๆป…ใ‚’ๆ‹›ใใ‹ใญใชใ„ใ‹ ใŠ่ฉฑใ—ใ—ใพใ™
00:40
in artificial intelligence
11
40200
1776
00:42
could ultimately destroy us.
12
42000
1776
00:43
And in fact, I think it's very difficult to see how they won't destroy us
13
43800
3456
ไบบๅทฅ็Ÿฅ่ƒฝใŒ ไบบ้กžใ‚’็ ดๆป…ใ•ใ›ใŸใ‚Š
่‡ชๆป…ใซ่ฟฝใ„่พผใ‚“ใ ใ‚Šใ—ใชใ„ ใ‚ทใƒŠใƒชใ‚ชใฏ
00:47
or inspire us to destroy ourselves.
14
47280
1680
ๅฎŸ้š› ่€ƒใˆใซใใ„ๆฐ—ใŒใ—ใพใ™
00:49
And yet if you're anything like me,
15
49400
1856
ใใ‚Œใงใ‚‚็š†ใ•ใ‚“ใŒ ็งใจๅŒใ˜ใชใ‚‰
00:51
you'll find that it's fun to think about these things.
16
51280
2656
ใใ†ใ„ใฃใŸใ“ใจใซใคใ„ใฆ่€ƒใˆใ‚‹ใฎใฏ ๆฅฝใ—ใ„ใ“ใจใงใ—ใ‚‡ใ†
00:53
And that response is part of the problem.
17
53960
3376
ใใฎๅๅฟœ่‡ชไฝ“ใŒๅ•้กŒใชใฎใงใ™
00:57
OK? That response should worry you.
18
57360
1720
ใใ†ใ„ใ†ๅๅฟœใฏ ๆ‡ธๅฟตใ™ในใใงใ™
00:59
And if I were to convince you in this talk
19
59920
2656
ไปฎใซใ“ใฎ่ฌ›ๆผ”ใง ็งใฎ่จดใˆใฆใ„ใ‚‹ใ“ใจใŒ
ๅœฐ็ƒๆธฉๆš–ๅŒ–ใ‚„ ไฝ•ใ‹ใฎๅคง็•ฐๅค‰ใฎใŸใ‚
01:02
that we were likely to suffer a global famine,
20
62600
3416
ไธ–็•Œ็š„ใช้ฃข้ฅ‰ใŒใ‚„ใฃใฆใใ‚‹ ใจใ„ใ†ใ“ใจใ ใฃใŸใจใ—ใพใ—ใ‚‡ใ†
01:06
either because of climate change or some other catastrophe,
21
66040
3056
็งใŸใกใฎๅญซใ‚„ ใใฎๅญซใฎไธ–ไปฃใฏ
01:09
and that your grandchildren, or their grandchildren,
22
69120
3416
01:12
are very likely to live like this,
23
72560
1800
ใ“ใฎๅ†™็œŸใฎใ‚ˆใ†ใช ๆœ‰ๆง˜ใซใชใ‚‹ใ‚“ใ ใจ
ใใฎๅ ดๅˆ ใ“ใ†ใฏๆ€ใ‚ใชใ„ใงใ—ใ‚‡ใ†
01:15
you wouldn't think,
24
75200
1200
01:17
"Interesting.
25
77440
1336
ใ€Œใ‚„ใ‚ ้ข็™ฝใ„ใช
01:18
I like this TED Talk."
26
78800
1200
ใ“ใฎTEDใƒˆใƒผใ‚ฏๆฐ—ใซๅ…ฅใฃใŸใ‚ˆใ€
01:21
Famine isn't fun.
27
81200
1520
้ฃข้ฅ‰ใฏๆฅฝใ—ใ„ใ‚‚ใฎ ใงใฏใ‚ใ‚Šใพใ›ใ‚“
01:23
Death by science fiction, on the other hand, is fun,
28
83800
3376
ไธ€ๆ–นใง SFใฎไธญใฎ็ ดๆป…ใฏ ๆฅฝใ—ใ„ใ‚‚ใฎใชใฎใงใ™
01:27
and one of the things that worries me most about the development of AI at this point
29
87200
3976
ไปŠใฎๆ™‚็‚นใง AIใฎ็™บๅฑ•ใซใคใ„ใฆ ็งใŒๆœ€ใ‚‚ๆ‡ธๅฟตใ™ใ‚‹ใฎใฏ
01:31
is that we seem unable to marshal an appropriate emotional response
30
91200
4096
ๅฐ†ๆฅใซๅพ…ใกๅ—ใ‘ใฆใ„ใ‚‹ ๅฑ้™บใซๅฏพใ—ใฆ
ๆˆ‘ใ€…ใŒๆ„Ÿๆƒ…็š„ใซ้ฉๅˆ‡ใชๅๅฟœใ‚’ ใงใใšใซใ„ใ‚‹ใ“ใจใงใ™
01:35
to the dangers that lie ahead.
31
95320
1816
ใ“ใ†่ฉฑใ—ใฆใ„ใ‚‹็ง่‡ช่บซ ใใ†ใ„ใ†ๅๅฟœใ‚’ใงใใšใซใ„ใพใ™
01:37
I am unable to marshal this response, and I'm giving this talk.
32
97160
3200
ๆˆ‘ใ€…ใฏ๏ผ’ใคใฎๆ‰‰ใ‚’ๅ‰ใซ ็ซ‹ใฃใฆใ„ใ‚‹ใ‚ˆใ†ใชใ‚‚ใฎใงใ™
01:42
It's as though we stand before two doors. Behind door number one, we stop making progress in building intelligent machines. Our computer hardware and software just stops getting better for some reason. Now take a moment to consider why this might happen. I mean, given how valuable intelligence and automation are, we will continue to improve our technology if we are at all able to. What could stop us from doing this? A full-scale nuclear war? A global pandemic? An asteroid impact? Justin Bieber becoming president of the United States?

02:20
(Laughter)

02:24
The point is, something would have to destroy civilization as we know it. You have to imagine how bad it would have to be to prevent us from making improvements in our technology permanently, generation after generation. Almost by definition, this is the worst thing that's ever happened in human history. So the only alternative, and this is what lies behind door number two, is that we continue to improve our intelligent machines year after year after year. At a certain point, we will build machines that are smarter than we are, and once we have machines that are smarter than we are, they will begin to improve themselves. And then we risk what the mathematician I. J. Good called an "intelligence explosion," that the process could get away from us.
03:10
Now, this is often caricatured, as I have here, as a fear that armies of malicious robots will attack us. But that isn't the most likely scenario. It's not that our machines will become spontaneously malevolent. The concern is really that we will build machines that are so much more competent than we are that the slightest divergence between their goals and our own could destroy us.

03:35
Just think about how we relate to ants. We don't hate them. We don't go out of our way to harm them. In fact, sometimes we take pains not to harm them. We step over them on the sidewalk. But whenever their presence seriously conflicts with one of our goals, let's say when constructing a building like this one, we annihilate them without a qualm. The concern is that we will one day build machines that, whether they're conscious or not, could treat us with similar disregard.

04:05
Now, I suspect this seems far-fetched to many of you. I bet there are those of you who doubt that superintelligent AI is possible, much less inevitable. But then you must find something wrong with one of the following assumptions. And there are only three of them.
04:23
Intelligence is a matter of information processing in physical systems. Actually, this is a little bit more than an assumption. We have already built narrow intelligence into our machines, and many of these machines perform at a level of superhuman intelligence already. And we know that mere matter can give rise to what is called "general intelligence," an ability to think flexibly across multiple domains, because our brains have managed it. Right? I mean, there's just atoms in here, and as long as we continue to build systems of atoms that display more and more intelligent behavior, we will eventually, unless we are interrupted, build general intelligence into our machines.

05:11
It's crucial to realize that the rate of progress doesn't matter, because any progress is enough to get us into the end zone. We don't need Moore's law to continue. We don't need exponential progress. We just need to keep going.
05:25
The second assumption is that we will keep going. We will continue to improve our intelligent machines. And given the value of intelligence -- I mean, intelligence is either the source of everything we value or we need it to safeguard everything we value. It is our most valuable resource. So we want to do this. We have problems that we desperately need to solve. We want to cure diseases like Alzheimer's and cancer. We want to understand economic systems. We want to improve our climate science. So we will do this, if we can. The train is already out of the station, and there's no brake to pull.
06:05
Finally, we don't stand on a peak of intelligence, or anywhere near it, likely. And this really is the crucial insight. This is what makes our situation so precarious, and this is what makes our intuitions about risk so unreliable. Now, just consider the smartest person who has ever lived. On almost everyone's shortlist here is John von Neumann. I mean, the impression that von Neumann made on the people around him, and this included the greatest mathematicians and physicists of his time, is fairly well-documented. If only half the stories about him are half true, there's no question he's one of the smartest people who has ever lived.
06:47
So consider the spectrum of intelligence. Here we have John von Neumann. And then we have you and me. And then we have a chicken.

06:57
(Laughter)

06:59
Sorry, a chicken.

07:00
(Laughter)

07:01
There's no reason for me to make this talk more depressing than it needs to be.

07:05
(Laughter)

07:08
It seems overwhelmingly likely, however, that the spectrum of intelligence extends much further than we currently conceive, and if we build machines that are more intelligent than we are, they will very likely explore this spectrum in ways that we can't imagine, and exceed us in ways that we can't imagine.
07:27
And it's important to recognize that this is true by virtue of speed alone. Right? So imagine if we just built a superintelligent AI that was no smarter than your average team of researchers at Stanford or MIT. Well, electronic circuits function about a million times faster than biochemical ones, so this machine should think about a million times faster than the minds that built it. So you set it running for a week, and it will perform 20,000 years of human-level intellectual work, week after week after week. How could we even understand, much less constrain, a mind making this sort of progress?
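The "20,000 years" figure follows directly from that quoted speedup. Here is a minimal back-of-the-envelope sketch, assuming only the illustrative million-fold speed advantage and one week of wall-clock time; the factor is a rough figure from the talk, not a measured constant:

```python
# Back-of-the-envelope check of the "20,000 years per week" figure,
# using the roughly 1,000,000x speed advantage of electronic over
# biochemical circuits that the talk cites as an illustration.
SPEEDUP = 1_000_000        # illustrative, not a measured constant
WALL_CLOCK_WEEKS = 1       # how long we let the machine run
WEEKS_PER_YEAR = 52.18     # 365.25 days / 7

subjective_years = WALL_CLOCK_WEEKS * SPEEDUP / WEEKS_PER_YEAR
print(f"{subjective_years:,.0f} subjective years")  # roughly 19,000, i.e. the ~20,000 quoted
```

Even if the true advantage were an order of magnitude smaller, a week of machine time would still correspond to millennia of human-level work, which is the point the comparison is making.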
08:08
The other thing that's worrying, frankly, is that, imagine the best case scenario. So imagine we hit upon a design of superintelligent AI that has no safety concerns. We have the perfect design the first time around. It's as though we've been handed an oracle that behaves exactly as intended. Well, this machine would be the perfect labor-saving device. It can design the machine that can build the machine that can do any physical work, powered by sunlight, more or less for the cost of raw materials. So we're talking about the end of human drudgery. We're also talking about the end of most intellectual work.

08:49
So what would apes like ourselves do in this circumstance? Well, we'd be free to play Frisbee and give each other massages. Add some LSD and some questionable wardrobe choices, and the whole world could be like Burning Man.

09:02
(Laughter)
09:06
Now, that might sound pretty good, but ask yourself what would happen under our current economic and political order? It seems likely that we would witness a level of wealth inequality and unemployment that we have never seen before. Absent a willingness to immediately put this new wealth to the service of all humanity, a few trillionaires could grace the covers of our business magazines while the rest of the world would be free to starve.

09:34
And what would the Russians or the Chinese do if they heard that some company in Silicon Valley was about to deploy a superintelligent AI? This machine would be capable of waging war, whether terrestrial or cyber, with unprecedented power. This is a winner-take-all scenario. To be six months ahead of the competition here is to be 500,000 years ahead, at a minimum.
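The "500,000 years" is the same speed arithmetic applied to a head start in the race; a minimal sketch under the same assumed million-fold speedup:

```python
# A six-month lead, scaled by the same illustrative ~1,000,000x
# speed advantage, expressed in subjective (machine-experienced) time.
SPEEDUP = 1_000_000       # same illustrative figure as above
LEAD_MONTHS = 6

subjective_years = LEAD_MONTHS * SPEEDUP / 12
print(f"{subjective_years:,.0f} subjective years of lead")  # 500,000
```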
09:59
So it seems that even mere rumors of this kind of breakthrough could cause our species to go berserk.

10:06
Now, one of the most frightening things, in my view, at this moment, is the kinds of things that AI researchers say when they want to be reassuring. And the most common reason we're told not to worry is time. This is all a long way off, don't you know. This is probably 50 or 100 years away. One researcher has said, "Worrying about AI safety is like worrying about overpopulation on Mars." This is the Silicon Valley version of "don't worry your pretty little head about it."

10:38
(Laughter)

10:39
No one seems to notice that referencing the time horizon is a total non sequitur. If intelligence is just a matter of information processing, and we continue to improve our machines, we will produce some form of superintelligence. And we have no idea how long it will take us to create the conditions to do that safely.

11:04
Let me say that again. We have no idea how long it will take us to create the conditions to do that safely.
11:12
And if you haven't noticed, 50 years is not what it used to be. This is 50 years in months. This is how long we've had the iPhone. This is how long "The Simpsons" has been on television. Fifty years is not that much time to meet one of the greatest challenges our species will ever face.
11:31
Once again, we seem to be failing to have an appropriate emotional response to what we have every reason to believe is coming. The computer scientist Stuart Russell has a nice analogy here. He said, imagine that we received a message from an alien civilization, which read: "People of Earth, we will arrive on your planet in 50 years. Get ready." And now we're just counting down the months until the mothership lands? We would feel a little more urgency than we do.
12:04
Another reason we're told not to worry is that these machines can't help but share our values because they will be literally extensions of ourselves. They'll be grafted onto our brains, and we'll essentially become their limbic systems. Now take a moment to consider that the safest and only prudent path forward, recommended, is to implant this technology directly into our brains. Now, this may in fact be the safest and only prudent path forward, but usually one's safety concerns about a technology have to be pretty much worked out before you stick it inside your head.

12:36
(Laughter)

12:38
The deeper problem is that building superintelligent AI on its own seems likely to be easier than building superintelligent AI and having the completed neuroscience that allows us to seamlessly integrate our minds with it. And given that the companies and governments doing this work are likely to perceive themselves as being in a race against all others, given that to win this race is to win the world, provided you don't destroy it in the next moment, then it seems likely that whatever is easier to do will get done first.
13:10
Now, unfortunately, I don't have a solution to this problem, apart from recommending that more of us think about it. I think we need something like a Manhattan Project on the topic of artificial intelligence. Not to build it, because I think we'll inevitably do that, but to understand how to avoid an arms race and to build it in a way that is aligned with our interests. When you're talking about superintelligent AI that can make changes to itself, it seems that we only have one chance to get the initial conditions right, and even then we will need to absorb the economic and political consequences of getting them right.

13:45
But the moment we admit that information processing is the source of intelligence, that some appropriate computational system is what the basis of intelligence is, and we admit that we will improve these systems continuously, and we admit that the horizon of cognition very likely far exceeds what we currently know, then we have to admit that we are in the process of building some sort of god. Now would be a good time to make sure it's a god we can live with.

14:20
Thank you very much.

14:21
(Applause)
ใ“ใฎใ‚ฆใ‚งใƒ–ใ‚ตใ‚คใƒˆใซใคใ„ใฆ

ใ“ใฎใ‚ตใ‚คใƒˆใงใฏ่‹ฑ่ชžๅญฆ็ฟ’ใซๅฝน็ซ‹ใคYouTubeๅ‹•็”ปใ‚’็ดนไป‹ใ—ใพใ™ใ€‚ไธ–็•Œไธญใฎไธ€ๆต่ฌ›ๅธซใซใ‚ˆใ‚‹่‹ฑ่ชžใƒฌใƒƒใ‚นใƒณใ‚’่ฆ‹ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅ„ใƒ“ใƒ‡ใ‚ชใฎใƒšใƒผใ‚ธใซ่กจ็คบใ•ใ‚Œใ‚‹่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจใ€ใใ“ใ‹ใ‚‰ใƒ“ใƒ‡ใ‚ชใ‚’ๅ†็”Ÿใ™ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅญ—ๅน•ใฏใƒ“ใƒ‡ใ‚ชใฎๅ†็”ŸใจๅŒๆœŸใ—ใฆใ‚นใ‚ฏใƒญใƒผใƒซใ—ใพใ™ใ€‚ใ”ๆ„่ฆ‹ใƒปใ”่ฆๆœ›ใŒใ”ใ–ใ„ใพใ—ใŸใ‚‰ใ€ใ“ใกใ‚‰ใฎใŠๅ•ใ„ๅˆใ‚ใ›ใƒ•ใ‚ฉใƒผใƒ ใ‚ˆใ‚Šใ”้€ฃ็ตกใใ ใ•ใ„ใ€‚

https://forms.gle/WvT1wiN1qDtmnspy7