How we can build AI to help humans, not hurt us | Margaret Mitchell

81,177 views · 2018-03-12

TED


ไธ‹ใฎ่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจๅ‹•็”ปใ‚’ๅ†็”Ÿใงใใพใ™ใ€‚

Reviewed by: Masaki Yanagishita
00:13
I work on helping computers communicate about the world around us. There are a lot of ways to do this, and I like to focus on helping computers to talk about what they see and understand. Given a scene like this, a modern computer-vision algorithm can tell you that there's a woman and there's a dog. It can tell you that the woman is smiling. It might even be able to tell you that the dog is incredibly cute.

00:38
I work on this problem thinking about how humans understand and process the world. The thoughts, memories and stories that a scene like this might evoke for humans. All the interconnections of related situations. Maybe you've seen a dog like this one before, or you've spent time running on a beach like this one, and that further evokes thoughts and memories of a past vacation, past times to the beach, times spent running around with other dogs.

01:11
One of my guiding principles is that by helping computers to understand what it's like to have these experiences, to understand what we share and believe and feel, then we're in a great position to start evolving computer technology in a way that's complementary with our own experiences.

01:35
So, digging more deeply into this, a few years ago I began working on helping computers to generate human-like stories from sequences of images.

01:47
So, one day, I was working with my computer to ask it what it thought about a trip to Australia. It took a look at the pictures, and it saw a koala. It didn't know what the koala was, but it said it thought it was an interesting-looking creature.

02:04
Then I shared with it a sequence of images about a house burning down. It took a look at the images and it said, "This is an amazing view! This is spectacular!" It sent chills down my spine. It saw a horrible, life-changing and life-destroying event and thought it was something positive. I realized that it recognized the contrast, the reds, the yellows, and thought it was something worth remarking on positively.

02:37
And part of why it was doing this was because most of the images I had given it were positive images. That's because people tend to share positive images when they talk about their experiences. When was the last time you saw a selfie at a funeral?

02:55
I realized that, as I worked on improving AI task by task, dataset by dataset, I was creating massive gaps, holes and blind spots in what it could understand. And while doing so, I was encoding all kinds of biases. Biases that reflect a limited viewpoint, limited to a single dataset -- biases that can reflect human biases found in the data, such as prejudice and stereotyping.

03:29
I thought back to the evolution of the technology that brought me to where I was that day -- how the first color images were calibrated against a white woman's skin, meaning that color photography was biased against black faces. And that same bias, that same blind spot continued well into the '90s. And the same blind spot continues even today in how well we can recognize different people's faces in facial recognition technology.

04:01
I thought about the state of the art in research today, where we tend to limit our thinking to one dataset and one problem. And that in doing so, we were creating more blind spots and biases that the AI could further amplify.

04:17
I realized then that we had to think deeply about how the technology we work on today looks in five years, in 10 years.

04:25
Humans evolve slowly, with time to correct for issues in the interaction of humans and their environment. In contrast, artificial intelligence is evolving at an incredibly fast rate. And that means that it really matters that we think about this carefully right now -- that we reflect on our own blind spots, our own biases, and think about how that's informing the technology we're creating and discuss what the technology of today will mean for tomorrow.

04:58
CEOs and scientists have weighed in on what they think the artificial intelligence technology of the future will be. Stephen Hawking warns that "Artificial intelligence could end mankind." Elon Musk warns that it's an existential risk and one of the greatest risks that we face as a civilization. Bill Gates has made the point, "I don't understand why people aren't more concerned."

05:23
But these views -- they're part of the story. The math, the models, the basic building blocks of artificial intelligence are something that we can all access and work with. We have open-source tools for machine learning and intelligence that we can contribute to.

05:42
And beyond that, we can share our experience. We can share our experiences with technology and how it concerns us and how it excites us. We can discuss what we love. We can communicate with foresight about the aspects of technology that could be more beneficial or could be more problematic over time.

06:05
If we all focus on opening up the discussion on AI with foresight towards the future, this will help create a general conversation and awareness about what AI is now, what it can become and all the things that we need to do in order to enable that outcome that best suits us.

06:29
We already see and know this in the technology that we use today. We use smartphones and digital assistants and Roombas. Are they evil? Maybe sometimes. Are they beneficial? Yes, they're that, too. And they're not all the same. And there you already see a light shining on what the future holds.

06:54
The future continues on from what we build and create right now. We set into motion that domino effect that carves out AI's evolutionary path.

07:05
In our time right now, we shape the AI of tomorrow. Technology that immerses us in augmented realities, bringing to life past worlds. Technology that helps people to share their experiences when they have difficulty communicating. Technology built on understanding the streaming visual worlds, used as technology for self-driving cars. Technology built on understanding images and generating language, evolving into technology that helps people who are visually impaired be better able to access the visual world.

07:42
And we also see how technology can lead to problems. We have technology today that analyzes physical characteristics we're born with -- such as the color of our skin or the look of our face -- in order to determine whether or not we might be criminals or terrorists. We have technology that crunches through our data, even data relating to our gender or our race, in order to determine whether or not we might get a loan.

08:09
All that we see now is a snapshot in the evolution of artificial intelligence. Because where we are right now is within a moment of that evolution. That means that what we do now will affect what happens down the line and in the future.

08:26
If we want AI to evolve in a way that helps humans, then we need to define the goals and strategies that enable that path now.

08:35
What I'd like to see is something that fits well with humans, with our culture and with the environment. Technology that aids and assists those of us with neurological conditions or other disabilities in order to make life equally challenging for everyone. Technology that works regardless of your demographics or the color of your skin.

09:00
And so today, what I focus on is the technology for tomorrow and for 10 years from now.

09:08
AI can turn out in many different ways. But in this case, it isn't a self-driving car without any destination. This is the car that we are driving. We choose when to speed up and when to slow down. We choose if we need to make a turn. We choose what the AI of the future will be.

09:31
There's a vast playing field of all the things that artificial intelligence can become. It will become many things.

09:39
And it's up to us now to figure out what we need to put in place to make sure the outcomes of artificial intelligence are the ones that will be better for all of us.

09:51
Thank you.

09:52
(Applause)
ใ“ใฎใ‚ฆใ‚งใƒ–ใ‚ตใ‚คใƒˆใซใคใ„ใฆ

ใ“ใฎใ‚ตใ‚คใƒˆใงใฏ่‹ฑ่ชžๅญฆ็ฟ’ใซๅฝน็ซ‹ใคYouTubeๅ‹•็”ปใ‚’็ดนไป‹ใ—ใพใ™ใ€‚ไธ–็•Œไธญใฎไธ€ๆต่ฌ›ๅธซใซใ‚ˆใ‚‹่‹ฑ่ชžใƒฌใƒƒใ‚นใƒณใ‚’่ฆ‹ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅ„ใƒ“ใƒ‡ใ‚ชใฎใƒšใƒผใ‚ธใซ่กจ็คบใ•ใ‚Œใ‚‹่‹ฑ่ชžๅญ—ๅน•ใ‚’ใƒ€ใƒ–ใƒซใ‚ฏใƒชใƒƒใ‚ฏใ™ใ‚‹ใจใ€ใใ“ใ‹ใ‚‰ใƒ“ใƒ‡ใ‚ชใ‚’ๅ†็”Ÿใ™ใ‚‹ใ“ใจใŒใงใใพใ™ใ€‚ๅญ—ๅน•ใฏใƒ“ใƒ‡ใ‚ชใฎๅ†็”ŸใจๅŒๆœŸใ—ใฆใ‚นใ‚ฏใƒญใƒผใƒซใ—ใพใ™ใ€‚ใ”ๆ„่ฆ‹ใƒปใ”่ฆๆœ›ใŒใ”ใ–ใ„ใพใ—ใŸใ‚‰ใ€ใ“ใกใ‚‰ใฎใŠๅ•ใ„ๅˆใ‚ใ›ใƒ•ใ‚ฉใƒผใƒ ใ‚ˆใ‚Šใ”้€ฃ็ตกใใ ใ•ใ„ใ€‚

https://forms.gle/WvT1wiN1qDtmnspy7