Don't fear superintelligent AI | Grady Booch

270,644 views · 2017-03-13

TED



ืžืชืจื’ื: Ilan Caner ืžื‘ืงืจ: Ido Dekkers
00:12
When I was a kid, I was the quintessential nerd. I think some of you were, too.

00:19
(Laughter)

00:20
And you, sir, who laughed the loudest, you probably still are.

00:23
(Laughter)

00:26
I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun.

00:39
(Laughter)

00:40
You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea.

00:56
(Laughter)
00:57
Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life. Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

01:37
I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.
01:59
After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars.

02:11
Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars. If there's trouble, there's not enough time. And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft.
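The 13-minute figure is a straight light-speed calculation. A quick back-of-the-envelope check, assuming an average Earth-Mars distance of about 225 million km (a commonly cited mean; the actual distance swings between roughly 55 and 400 million km over the synodic cycle):

```python
C_KM_S = 299_792.458        # speed of light in vacuum, km/s
AVG_EARTH_MARS_KM = 225e6   # assumed average distance; real value varies ~55e6..400e6 km

one_way_s = AVG_EARTH_MARS_KM / C_KM_S
print(f"one-way signal delay: {one_way_s / 60:.1f} minutes")  # ≈ 12.5 minutes
```

At closest approach the same arithmetic gives a delay of roughly 3 minutes, and near conjunction over 20, which is why the talk quotes an average.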
02:40
Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team.
02:55
Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies.

03:12
(Laughter)

03:14
Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.
03:44
The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes.
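The fleet-monitoring idea in that question, reading device telemetry and acting before a failure, can be sketched as a simple statistical anomaly check. The window size, threshold, and sensor values below are illustrative assumptions, not anything from the talk:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, k=3.0):
    """Flag readings that deviate more than k standard deviations
    from the trailing window -- a crude stand-in for reading a
    device's data stream and predicting failures in advance."""
    flags = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        flags.append(abs(readings[i] - mu) > k * sigma if sigma else False)
    return flags

# A hypothetical temperature stream from one device: steady, then a spike.
stream = [20.1, 20.3, 19.9, 20.2, 20.0, 20.1, 35.0]
print(flag_anomalies(stream))  # → [False, True]
```

Production systems use far richer models, but the shape is the same: ingest streams, score deviations, act early.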
04:06
Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.
04:37
So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried it would destroy all civil conversation. At a point in time we saw the written word become pervasive, people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.
05:21
So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game --

05:50
Well, I would. You would, too.

05:54
I like flowers. Come on.

05:57
To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game. If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law but at the same time I am fusing with it the sense of mercy and justice that is part of that law. In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values.
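The "teach, don't program" point can be made concrete with a toy nearest-neighbor classifier: its behavior comes entirely from the labeled examples it is shown (the ground truth), not from hand-coded rules. The flower measurements and labels here are invented for illustration:

```python
from math import dist

# Ground truth: the labeled examples the system is "taught" with.
# Each flower is (petal length, petal width) in cm -- invented values.
training = [
    ((1.4, 0.2), "daisy"),
    ((1.3, 0.3), "daisy"),
    ((4.7, 1.4), "rose"),
    ((4.5, 1.5), "rose"),
]

def classify(flower):
    """Label a new flower by its closest training example.
    Change the examples and the behavior changes -- no rule was ever written."""
    _, label = min(training, key=lambda ex: dist(ex[0], flower))
    return label

print(classify((1.5, 0.25)))  # → daisy
print(classify((4.6, 1.3)))   # → rose
```

Whatever biases or values are embedded in the examples are embedded in the system, which is exactly the talk's point about teaching machines a sense of our values.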
06:28
To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained. But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial training and subtle training far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.
07:14
Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

08:06
With all due respect to these brilliant minds, I believe that they are fundamentally wrong.
08:14
Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator" in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us.

09:09
And in the end -- don't tell Siri this -- we can always unplug them.

09:13
(Laughter)
09:17
We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction, because the rise of computing itself brings to us a number of human and societal issues to which we must now attend. How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

10:01
And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning.

10:14
Thank you very much.

10:15
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7