3 principles for creating safer AI | Stuart Russell

139,486 views · 2017-06-06

TED


ไธ‹ใฎ่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจๅ‹•็”ปใ‚’ๅ†็”Ÿใงใใพใ™ใ€‚

็ฟป่จณ: Yasushi Aoki ๆ กๆญฃ: Yuko Yoshida
00:12
This is Lee Sedol. Lee Sedol is one of the world's greatest Go players, and he's having what my friends in Silicon Valley call a "Holy Cow" moment --

(Laughter)

00:23
a moment where we realize that AI is actually progressing a lot faster than we expected.

00:29
So humans have lost on the Go board. What about the real world? Well, the real world is much bigger, much more complicated than the Go board. It's a lot less visible, but it's still a decision problem.

00:42
And if we think about some of the technologies that are coming down the pike ... Noriko [Arai] mentioned that reading is not yet happening in machines, at least with understanding. But that will happen, and when that happens, very soon afterwards, machines will have read everything that the human race has ever written.

01:03
And that will enable machines, along with the ability to look further ahead than humans can, as we've already seen in Go, if they also have access to more information, they'll be able to make better decisions in the real world than we can.

01:18
So is that a good thing? Well, I hope so. Our entire civilization, everything that we value, is based on our intelligence. And if we had access to a lot more intelligence, then there's really no limit to what the human race can do. And I think this could be, as some people have described it, the biggest event in human history.

01:48
So why are people saying things like this, that AI might spell the end of the human race? Is this a new thing? Is it just Elon Musk and Bill Gates and Stephen Hawking? Actually, no. This idea has been around for a while. Here's a quotation: "Even if we could keep the machines in a subservient position, for instance, by turning off the power at strategic moments" -- and I'll come back to that "turning off the power" idea later on -- "we should, as a species, feel greatly humbled."

02:21
So who said this? This is Alan Turing in 1951. Alan Turing, as you know, is the father of computer science and in many ways, the father of AI as well.

02:33
So if we think about this problem, the problem of creating something more intelligent than your own species, we might call this "the gorilla problem," because gorillas' ancestors did this a few million years ago, and now we can ask the gorillas: Was this a good idea? So here they are having a meeting to discuss whether it was a good idea, and after a little while, they conclude, no, this was a terrible idea. Our species is in dire straits. In fact, you can see the existential sadness in their eyes.

(Laughter)

03:06
So this queasy feeling that making something smarter than your own species is maybe not a good idea -- what can we do about that? Well, really nothing, except stop doing AI, and because of all the benefits that I mentioned and because I'm an AI researcher, I'm not having that. I actually want to be able to keep doing AI.

03:30
So we actually need to nail down the problem a bit more. What exactly is the problem? Why is better AI possibly a catastrophe?

03:39
So here's another quotation: "We had better be quite sure that the purpose put into the machine is the purpose which we really desire." This was said by Norbert Wiener in 1960, shortly after he watched one of the very early learning systems learn to play checkers better than its creator.

04:00
But this could equally have been said by King Midas. King Midas said, "I want everything I touch to turn to gold," and he got exactly what he asked for. That was the purpose that he put into the machine, so to speak, and then his food and his drink and his relatives turned to gold and he died in misery and starvation. So we'll call this "the King Midas problem" of stating an objective which is not, in fact, truly aligned with what we want. In modern terms, we call this "the value alignment problem."

04:36
Putting in the wrong objective is not the only part of the problem. There's another part. If you put an objective into a machine, even something as simple as, "Fetch the coffee," the machine says to itself, "Well, how might I fail to fetch the coffee? Someone might switch me off. OK, I have to take steps to prevent that. I will disable my 'off' switch. I will do anything to defend myself against interference with this objective that I have been given." So this single-minded pursuit, in a very defensive mode, of an objective that is, in fact, not aligned with the true objectives of the human race -- that's the problem that we face.

05:18
And in fact, that's the high-value takeaway from this talk. If you want to remember one thing, it's that you can't fetch the coffee if you're dead.

(Laughter)

05:29
It's very simple. Just remember that. Repeat it to yourself three times a day.

(Laughter)

05:35
And in fact, this is exactly the plot of "2001: [A Space Odyssey]": HAL has an objective, a mission, which is not aligned with the objectives of the humans, and that leads to this conflict. Now fortunately, HAL is not superintelligent. He's pretty smart, but eventually Dave outwits him and manages to switch him off.

06:01
But we might not be so lucky. So what are we going to do?

06:12
I'm trying to redefine AI to get away from this classical notion of machines that intelligently pursue objectives. There are three principles involved.

06:24
The first one is a principle of altruism, if you like: that the robot's only objective is to maximize the realization of human objectives, of human values. And by values here I don't mean touchy-feely, goody-goody values. I just mean whatever it is that the human would prefer their life to be like. And so this actually violates Asimov's law that the robot has to protect its own existence. It has no interest in preserving its existence whatsoever.

06:57
The second law is a law of humility, if you like. And this turns out to be really important to make robots safe. It says that the robot does not know what those human values are, so it has to maximize them, but it doesn't know what they are. And that avoids this problem of single-minded pursuit of an objective. This uncertainty turns out to be crucial.

07:21
Now, in order to be useful to us, it has to have some idea of what we want. It obtains that information primarily by observation of human choices, so our own choices reveal information about what it is that we prefer our lives to be like.

07:40
So those are the three principles.
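
(A sketch, not from the talk itself: one standard way to make this third principle concrete is to treat each observed human choice as noisy evidence about a hidden preference parameter \theta and let the robot update a belief over \theta by Bayes' rule. The specific choice model below, a Boltzmann-rational one, is an assumption borrowed from the inverse-reinforcement-learning literature rather than something Russell states here:)

\[
P\big(\theta \mid \text{human picks } a \text{ from } A\big)
\;\propto\;
P(\theta)\,
\frac{\exp\!\big(\beta\, U_\theta(a)\big)}{\sum_{a' \in A} \exp\!\big(\beta\, U_\theta(a')\big)}
\]

(Here U_\theta(a) is the utility a human with preferences \theta would get from option a, and \beta models how reliably people choose what they actually prefer; the more choices the robot observes, the more its belief about \theta concentrates on what we really want.)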
ใ“ใ‚ŒใŒใƒใƒฅใƒผใƒชใƒณใ‚ฐใฎๆ่ตทใ—ใŸ ใ€ŒๆฉŸๆขฐใฎใ‚นใ‚คใƒƒใƒใ‚’ๅˆ‡ใ‚Œใ‚‹ใ‹ใ€ใจใ„ใ†ๅ•้กŒใซ
07:42
Let's see how that applies to this question of:
146
462159
2318
07:44
"Can you switch the machine off?" as Turing suggested.
147
464501
2789
ใฉใ†้ฉ็”จใงใใ‚‹ใ‹ ่ฆ‹ใฆใฟใพใ—ใ‚‡ใ†
07:48
So here's a PR2 robot.
148
468893
2120
ใ“ใ‚Œใฏ PR2 ใƒญใƒœใƒƒใƒˆใงใ™
็งใŸใกใฎ็ ”็ฉถๅฎคใซใ‚ใ‚‹ใ‚‚ใฎใง
07:51
This is one that we have in our lab,
149
471037
1821
07:52
and it has a big red "off" switch right on the back.
150
472882
2903
่ƒŒไธญใซๅคงใใช่ตคใ„ ใ€Œใ‚ชใƒ•ใ€ใ‚นใ‚คใƒƒใƒใŒใ‚ใ‚Šใพใ™
07:56
The question is: Is it going to let you switch it off?
151
476361
2615
ๅ•้กŒใฏ ใƒญใƒœใƒƒใƒˆใŒใ‚นใ‚คใƒƒใƒใ‚’ ๅˆ‡ใ‚‰ใ›ใฆใใ‚Œใ‚‹ใ‹ใจใ„ใ†ใ“ใจใงใ™
ๅคๅ…ธ็š„ใชใ‚„ใ‚Šๆ–นใ‚’ใ™ใ‚‹ใชใ‚‰
07:59
If we do it the classical way,
152
479000
1465
08:00
we give it the objective of, "Fetch the coffee, I must fetch the coffee,
153
480489
3482
ใ€Œใ‚ณใƒผใƒ’ใƒผใ‚’ๅ–ใฃใฆใใ‚‹ใ€ ใจใ„ใ†็›ฎ็š„ใซๅฏพใ—
ใ€Œใ‚ณใƒผใƒ’ใƒผใ‚’ๅ–ใฃใฆใ“ใชใ‘ใ‚Œใฐใชใ‚‰ใชใ„ใ€ ใ€Œๆญปใ‚“ใ ใ‚‰ใ‚ณใƒผใƒ’ใƒผใ‚’ๅ–ใฃใฆใ“ใ‚Œใชใ„ใ€ใจ่€ƒใˆ
08:03
I can't fetch the coffee if I'm dead,"
154
483995
2580
08:06
so obviously the PR2 has been listening to my talk,
155
486599
3341
็งใฎ่ฌ›ๆผ”ใ‚’่ดใ„ใฆใ„ใŸPR2ใฏ
08:09
and so it says, therefore, "I must disable my 'off' switch,
156
489964
3753
ใ€Œใ‚ชใƒ•ใƒปใ‚นใ‚คใƒƒใƒใฏ็„กๅŠนใซใ—ใชใ‘ใ‚Œใฐใ€ ใจๅˆคๆ–ญใ—
08:14
and probably taser all the other people in Starbucks
157
494796
2694
ใ€Œใ‚นใ‚ฟใƒผใƒใƒƒใ‚ฏใ‚นใง้‚ช้ญ”ใซใชใ‚‹ ไป–ใฎๅฎขใฏใฟใ‚“ใช
ใƒ†ใƒผใ‚ถใƒผ้Šƒใง็œ ใ‚‰ใ›ใ‚ˆใ†ใ€ ใจใชใ‚Šใพใ™
08:17
who might interfere with me."
158
497514
1560
08:19
(Laughter)
159
499098
2062
(็ฌ‘)
08:21
So this seems to be inevitable, right?
160
501184
2153
ใ“ใ‚Œใฏ้ฟใ‘ใŒใŸใ„ ใ‚ˆใ†ใซ่ฆ‹ใˆใพใ™
08:23
This kind of failure mode seems to be inevitable,
161
503361
2398
ใ“ใฎใ‚ˆใ†ใชๆ•…้šœใƒขใƒผใƒ‰ใฏ ไธๅฏ้ฟใซ่ฆ‹ใˆ
08:25
and it follows from having a concrete, definite objective.
162
505783
3543
ใใ—ใฆใใ‚Œใฏๅ…ทไฝ“็š„ใง็ตถๅฏพ็š„ใช ็›ฎ็š„ใŒใ‚ใ‚‹ใ“ใจใ‹ใ‚‰ๆฅใฆใ„ใพใ™
08:30
So what happens if the machine is uncertain about the objective?
163
510632
3144
็›ฎ็š„ใŒไฝ•ใชใฎใ‹ๆฉŸๆขฐใซ ็ขบไฟกใŒใชใ„ใจใ—ใŸใ‚‰ ใฉใ†ใชใ‚‹ใงใ—ใ‚‡ใ†๏ผŸ
08:33
Well, it reasons in a different way.
164
513800
2127
้•ใฃใŸใ‚ˆใ†ใซๆŽจ่ซ–ใ™ใ‚‹ใฏใšใงใ™
08:35
It says, "OK, the human might switch me off,
165
515951
2424
ใ€Œไบบ้–“ใฏ่‡ชๅˆ†ใฎใ‚นใ‚คใƒƒใƒใ‚’ ๅˆ‡ใ‚‹ใ‹ใ‚‚ใ—ใ‚Œใชใ„ใŒ
08:38
but only if I'm doing something wrong.
166
518964
1866
ใใ‚Œใฏ่‡ชๅˆ†ใŒไฝ•ใ‹ ๆ‚ชใ„ใ“ใจใ‚’ใ—ใŸใจใใ ใ‘ใ 
08:41
Well, I don't really know what wrong is,
167
521567
2475
ๆ‚ชใ„ใ“ใจใŒไฝ•ใ‹ ใ‚ˆใๅˆ†ใ‹ใ‚‰ใชใ„ใ‘ใฉ
08:44
but I know that I don't want to do it."
168
524066
2044
ๆ‚ชใ„ใ“ใจใฏใ—ใŸใใชใ„ใ€
ใ“ใ“ใง ็ฌฌ๏ผ‘ ใŠใ‚ˆใณ็ฌฌ๏ผ’ใฎๅŽŸๅ‰‡ใŒ ๅŠนใ„ใฆใ„ใพใ™
08:46
So that's the first and second principles right there.
169
526134
3010
08:49
"So I should let the human switch me off."
170
529168
3359
ใ€Œใ ใ‹ใ‚‰ใ‚นใ‚คใƒƒใƒใ‚’ๅˆ‡ใ‚‹ใฎใ‚’ ไบบ้–“ใซ่จฑใ™ในใใ ใ€
08:53
And in fact you can calculate the incentive that the robot has
171
533541
3956
ๅฎŸ้š›ใƒญใƒœใƒƒใƒˆใŒไบบ้–“ใซ ใ‚นใ‚คใƒƒใƒใ‚’ๅˆ‡ใ‚‹ใ“ใจใ‚’่จฑใ™
ใ‚คใƒณใ‚ปใƒณใƒ†ใ‚ฃใƒ–ใ‚’ ่จˆ็ฎ—ใ™ใ‚‹ใ“ใจใŒใงใ
08:57
to allow the human to switch it off,
172
537521
2493
ใใ‚Œใฏ็›ฎ็š„ใฎไธ็ขบใ‹ใ•ใฎๅบฆๅˆใ„ใจ
09:00
and it's directly tied to the degree
173
540038
1914
09:01
of uncertainty about the underlying objective.
174
541976
2746
็›ดๆŽฅ็š„ใซ็ตใณใคใ„ใฆใ„ใพใ™
09:05
And then when the machine is switched off,
175
545797
2949
ๆฉŸๆขฐใฎใ‚นใ‚คใƒƒใƒใŒๅˆ‡ใ‚‰ใ‚Œใ‚‹ใจ
09:08
that third principle comes into play.
176
548770
1805
็ฌฌ๏ผ“ใฎๅŽŸๅ‰‡ใŒๅƒใ„ใฆ
09:10
It learns something about the objectives it should be pursuing,
177
550599
3062
่ฟฝๆฑ‚ใ™ในใ็›ฎ็š„ใซใคใ„ใฆ ไฝ•ใ‹ใ‚’ๅญฆใณใพใ™
09:13
because it learns that what it did wasn't right.
178
553685
2533
่‡ชๅˆ†ใฎ้–“้•ใฃใŸ่กŒใ„ใ‹ใ‚‰ ๅญฆใถใฎใงใ™
09:16
In fact, we can, with suitable use of Greek symbols,
179
556242
3570
ๆ•ฐๅญฆ่€…ใŒใ‚ˆใใ‚„ใ‚‹ใ‚ˆใ†ใซ
ใ‚ฎใƒชใ‚ทใƒฃๆ–‡ๅญ—ใ‚’ใ†ใพใไฝฟใฃใฆ
09:19
as mathematicians usually do,
180
559836
2131
09:21
we can actually prove a theorem
181
561991
1984
ใใฎใ‚ˆใ†ใชใƒญใƒœใƒƒใƒˆใŒ ไบบ้–“ใซใจใฃใฆๆœ‰็›Šใงใ‚ใ‚‹ใจใ„ใ†ๅฎš็†ใ‚’
09:23
that says that such a robot is provably beneficial to the human.
182
563999
3553
่จผๆ˜Žใ™ใ‚‹ใ“ใจใŒใงใใพใ™
09:27
You are provably better off with a machine that's designed in this way
183
567576
3803
ใใฎใ‚ˆใ†ใซใƒ‡ใ‚ถใ‚คใƒณใ•ใ‚ŒใŸๆฉŸๆขฐใฎๆ–นใŒ ใใ†ใงใชใ„ใ‚‚ใฎใ‚ˆใ‚Š่‰ฏใ„็ตๆžœใซใชใ‚‹ใจ
09:31
than without it.
184
571403
1246
่จผๆ˜Žๅฏ่ƒฝใชใฎใงใ™
09:33
So this is a very simple example, but this is the first step in what we're trying to do with human-compatible AI.

09:42
Now, this third principle, I think, is the one that you're probably scratching your head over. You're probably thinking, "Well, you know, I behave badly. I don't want my robot to behave like me. I sneak down in the middle of the night and take stuff from the fridge. I do this and that." There's all kinds of things you don't want the robot doing. But in fact, it doesn't quite work that way. Just because you behave badly doesn't mean the robot is going to copy your behavior. It's going to understand your motivations and maybe help you resist them, if appropriate.

10:16
But it's still difficult. What we're trying to do, in fact, is to allow machines to predict for any person and for any possible life that they could live, and the lives of everybody else: Which would they prefer? And there are many, many difficulties involved in doing this; I don't expect that this is going to get solved very quickly.

10:39
The real difficulties, in fact, are us. As I have already mentioned, we behave badly. In fact, some of us are downright nasty. Now the robot, as I said, doesn't have to copy the behavior. The robot does not have any objective of its own. It's purely altruistic. And it's not designed just to satisfy the desires of one person, the user, but in fact it has to respect the preferences of everybody. So it can deal with a certain amount of nastiness, and it can even understand that your nastiness, for example -- you may take bribes as a passport official because you need to feed your family and send your kids to school. It can understand that; it doesn't mean it's going to steal. In fact, it'll just help you send your kids to school.

11:28
We are also computationally limited. Lee Sedol is a brilliant Go player, but he still lost. So if we look at his actions, he took an action that lost the game. That doesn't mean he wanted to lose. So to understand his behavior, we actually have to invert through a model of human cognition that includes our computational limitations -- a very complicated model. But it's still something that we can work on understanding.

11:57
Probably the most difficult part, from my point of view as an AI researcher, is the fact that there are lots of us, and so the machine has to somehow trade off, weigh up the preferences of many different people, and there are different ways to do that. Economists, sociologists, moral philosophers have understood that, and we are actively looking for collaboration.

12:20
Let's have a look and see what happens when you get that wrong. So you can have a conversation, for example, with your intelligent personal assistant that might be available in a few years' time. Think of a Siri on steroids.

12:33
So Siri says, "Your wife called to remind you about dinner tonight." And of course, you've forgotten. "What? What dinner? What are you talking about?" "Uh, your 20th anniversary at 7pm." "I can't do that. I'm meeting with the secretary-general at 7:30. How could this have happened?" "Well, I did warn you, but you overrode my recommendation." "Well, what am I going to do? I can't just tell him I'm too busy." "Don't worry. I arranged for his plane to be delayed."

(Laughter)

13:10
"Some kind of computer malfunction."

(Laughter)

13:13
"Really? You can do that?" "He sends his profound apologies and looks forward to meeting you for lunch tomorrow."

(Laughter)

13:22
So the values here -- there's a slight mistake going on. This is clearly following my wife's values, which is "Happy wife, happy life."

(Laughter)

13:33
It could go the other way. You could come home after a hard day's work, and the computer says, "Long day?" "Yes, I didn't even have time for lunch." "You must be very hungry." "Starving, yeah. Could you make some dinner?" "There's something I need to tell you."

(Laughter)

13:52
"There are humans in South Sudan who are in more urgent need than you."

(Laughter)

13:58
"So I'm leaving. Make your own dinner."

(Laughter)

14:02
So we have to solve these problems, and I'm looking forward to working on them. There are reasons for optimism. One reason is, there is a massive amount of data. Because remember -- I said they're going to read everything the human race has ever written. Most of what we write about is human beings doing things and other people getting upset about it. So there's a massive amount of data to learn from.

14:23
There's also a very strong economic incentive to get this right. So imagine your domestic robot's at home. You're late from work again and the robot has to feed the kids, and the kids are hungry and there's nothing in the fridge. And the robot sees the cat.

(Laughter)

14:40
And the robot hasn't quite learned the human value function properly, so it doesn't understand the sentimental value of the cat outweighs the nutritional value of the cat.

(Laughter)

14:52
So then what happens? Well, it happens like this: "Deranged robot cooks kitty for family dinner." That one incident would be the end of the domestic robot industry. So there's a huge incentive to get this right long before we reach superintelligent machines.

15:11
So to summarize: I'm actually trying to change the definition of AI so that we have provably beneficial machines. And the principles are: machines that are altruistic, that want to achieve only our objectives, but that are uncertain about what those objectives are, and will watch all of us to learn more about what it is that we really want. And hopefully in the process, we will learn to be better people.

15:37
Thank you very much.

(Applause)

15:42
Chris Anderson: So interesting, Stuart. We're going to stand here a bit because I think they're setting up for our next speaker. A couple of questions. So the idea of programming in ignorance seems intuitively really powerful. As you get to superintelligence, what's going to stop a robot reading literature and discovering this idea that knowledge is actually better than ignorance and still just shifting its own goals and rewriting that programming?

16:09
Stuart Russell: Yes, so we want it to learn more, as I said, about our objectives. It'll only become more certain as it becomes more correct, so the evidence is there and it's going to be designed to interpret it correctly. It will understand, for example, that books are very biased in the evidence they contain. They only talk about kings and princes and elite white male people doing stuff. So it's a complicated problem, but as it learns more about our objectives it will become more and more useful to us.

16:46
CA: And you couldn't just boil it down to one law, you know, hardwired in: "if any human ever tries to switch me off, I comply. I comply."

16:55
SR: Absolutely not. That would be a terrible idea. So imagine that you have a self-driving car and you want to send your five-year-old off to preschool. Do you want your five-year-old to be able to switch off the car while it's driving along? Probably not. So it needs to understand how rational and sensible the person is. The more rational the person, the more willing you are to be switched off. If the person is completely random or even malicious, then you're less willing to be switched off.

17:24
CA: All right. Stuart, can I just say, I really, really hope you figure this out for us. Thank you so much for that talk. That was amazing.

17:30
SR: Thank you.

17:31
(Applause)
ใ“ใฎใ‚ฆใ‚งใƒ–ใ‚ตใ‚คใƒˆใซใคใ„ใฆ

ใ“ใฎใ‚ตใ‚คใƒˆใงใฏ่‹ฑ่ชžๅญฆ็ฟ’ใซๅฝน็ซ‹ใคYouTubeๅ‹•็”ปใ‚’็ดนไป‹ใ—ใพใ™ใ€‚ไธ–็•Œไธญใฎไธ€ๆต่ฌ›ๅธซใซใ‚ˆใ‚‹่‹ฑ่ชžใƒฌใƒƒใ‚นใƒณใ‚’่ฆ‹ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅ„ใƒ“ใƒ‡ใ‚ชใฎใƒšใƒผใ‚ธใซ่กจ็คบใ•ใ‚Œใ‚‹่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจใ€ใใ“ใ‹ใ‚‰ใƒ“ใƒ‡ใ‚ชใ‚’ๅ†็”Ÿใ™ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅญ—ๅน•ใฏใƒ“ใƒ‡ใ‚ชใฎๅ†็”ŸใจๅŒๆœŸใ—ใฆใ‚นใ‚ฏใƒญใƒผใƒซใ—ใพใ™ใ€‚ใ”ๆ„่ฆ‹ใƒปใ”่ฆๆœ›ใŒใ”ใ–ใ„ใพใ—ใŸใ‚‰ใ€ใ“ใกใ‚‰ใฎใŠๅ•ใ„ๅˆใ‚ใ›ใƒ•ใ‚ฉใƒผใƒ ใ‚ˆใ‚Šใ”้€ฃ็ตกใใ ใ•ใ„ใ€‚

https://forms.gle/WvT1wiN1qDtmnspy7