What AI is -- and isn't | Sebastian Thrun and Chris Anderson

260,683 views · 2017-12-21

TED


00:12
Chris Anderson: Help us understand what machine learning is, because that seems to be the key driver of so much of the excitement and also of the concern around artificial intelligence. How does machine learning work?

00:23
Sebastian Thrun: So, artificial intelligence and machine learning is about 60 years old and has not had a great day in its past until recently. And the reason is that today, we have reached a scale of computing and datasets that was necessary to make machines smart.

00:43
So here's how it works. If you program a computer today, say, your phone, then you hire software engineers that write a very, very long kitchen recipe, like, "If the water is too hot, turn down the temperature. If it's too cold, turn up the temperature." The recipes are not just 10 lines long. They are millions of lines long. A modern cell phone has 12 million lines of code. A browser has five million lines of code. And each bug in this recipe can cause your computer to crash. That's why a software engineer makes so much money.

01:21
The new thing now is that computers can find their own rules. So instead of an expert deciphering, step by step, a rule for every contingency, what you do now is you give the computer examples and have it infer its own rules. A really good example is AlphaGo, which Google recently acquired. Normally, in game playing, you would really write down all the rules, but in AlphaGo's case, the system looked over a million games and was able to infer its own rules and then beat the world's reigning Go champion.

01:53
That is exciting, because it relieves the software engineer of the need of being super smart, and pushes the burden towards the data. As I said, the inflection point where this has become really possible -- very embarrassing, my thesis was about machine learning. It was completely insignificant, don't read it, because it was 20 years ago and back then, the computers were as big as a cockroach brain. Now they are powerful enough to really emulate kind of specialized human thinking. And then the computers take advantage of the fact that they can look at much more data than people can. So I'd say AlphaGo looked at more than a million games. No human expert can ever study a million games. Google has looked at over a hundred billion web pages. No person can ever study a hundred billion web pages. So as a result, the computer can find rules that even people can't find.
02:41
CA: So instead of looking ahead to, "If he does that, I will do that," it's more saying, "Here is what looks like a winning pattern, here is what looks like a winning pattern."

02:50
ST: Yeah. I mean, think about how you raise children. You don't spend the first 18 years giving kids a rule for every contingency and set them free and they have this big program. They stumble, fall, get up, they get slapped or spanked, and they have a positive experience, a good grade in school, and they figure it out on their own. That's happening with computers now, which makes computer programming so much easier all of a sudden. Now we don't have to think anymore. We just give them lots of data.

03:14
CA: And so, this has been key to the spectacular improvement in power of self-driving cars. I think you gave me an example. Can you explain what's happening here?

03:25
ST: This is a drive of a self-driving car that we happened to have at Udacity and recently made into a spin-off called Voyage. We have used this thing called deep learning to train a car to drive itself, and this is driving from Mountain View, California, to San Francisco on El Camino Real on a rainy day, with bicyclists and pedestrians and 133 traffic lights.

03:47
And the novel thing here is, many, many moons ago, I started the Google self-driving car team. And back in the day, I hired the world's best software engineers to find the world's best rules. This is just trained. We drive this road 20 times, we put all this data into the computer brain, and after a few hours of processing, it comes up with behavior that often surpasses human agility. So it's become really easy to program it. This is 100 percent autonomous, about 33 miles, an hour and a half.

04:17
CA: So, explain it -- on the big part of this program on the left, you're seeing basically what the computer sees as trucks and cars and those dots overtaking it and so forth.

04:27
ST: On the right side, you see the camera image, which is the main input here, and it's used to find lanes, other cars, traffic lights. The vehicle has a radar to do distance estimation. This is very commonly used in these kinds of systems. On the left side you see a laser diagram, where you see obstacles like trees and so on depicted by the laser. But almost all the interesting work is centering on the camera image now. We're really shifting over from precision sensors like radars and lasers into very cheap, commoditized sensors. A camera costs less than eight dollars.

04:55
CA: And that green dot on the left thing, what is that? Is that anything meaningful?

04:59
ST: This is a look-ahead point for your adaptive cruise control, so it helps us understand how to regulate velocity based on how far the cars in front of you are.
05:08
CA: And so, you've also got an example, I think, of how the actual learning part takes place. Maybe we can see that. Talk about this.

05:15
ST: This is an example where we posed a challenge to Udacity students to take what we call a self-driving car Nanodegree. We gave them this dataset and said, "Hey, can you guys figure out how to steer this car?" And if you look at the images, it's, even for humans, quite impossible to get the steering right. And we ran a competition and said, "It's a deep learning competition, AI competition," and we gave the students 48 hours. So if you are a software house like Google or Facebook, something like this costs you at least six months of work. So we figured 48 hours is great.

05:48
And within 48 hours, we got about 100 submissions from students, and the top four got it perfectly right. It drives better than I could drive on this imagery, using deep learning. And again, it's the same methodology. It's this magical thing. When you give enough data to a computer now, and give enough time to comprehend the data, it finds its own rules.
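Learning to steer directly from camera frames, as in this student challenge, is commonly called behavioral cloning. Below is a minimal Keras sketch under that assumption; the arrays of frames and steering angles are hypothetical placeholders, not the students' winning entries.

```python
# Hedged sketch of behavioral cloning: map camera images to a steering angle.
import numpy as np
import tensorflow as tf

images = np.zeros((1000, 66, 200, 3), dtype=np.float32)   # placeholder camera frames
steering_angles = np.zeros((1000,), dtype=np.float32)     # placeholder recorded angles

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(24, 5, strides=2, activation="relu", input_shape=(66, 200, 3)),
    tf.keras.layers.Conv2D(36, 5, strides=2, activation="relu"),
    tf.keras.layers.Conv2D(48, 5, strides=2, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(1),                              # predicted steering angle
])
model.compile(optimizer="adam", loss="mse")                # regression on the angle
model.fit(images, steering_angles, epochs=5, batch_size=64)
```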
06:09
CA: And so that has led to the development of powerful applications in all sorts of areas. You were talking to me the other day about cancer. Can I show this video?

06:19
ST: Yeah, absolutely, please. CA: This is cool.

06:22
ST: This is kind of an insight into what's happening in a completely different domain. This is augmenting, or competing -- it's in the eye of the beholder -- with people who are being paid 400,000 dollars a year, dermatologists, highly trained specialists. It takes more than a decade of training to be a good dermatologist.

06:43
What you see here is the machine learning version of it. It's called a neural network. "Neural networks" is the technical term for these machine learning algorithms. They've been around since the 1980s. This one was invented in 1988 by a Facebook Fellow called Yann LeCun, and it propagates data stages through what you could think of as the human brain. It's not quite the same thing, but it emulates the same thing. It goes stage after stage. In the very first stage, it takes the visual input and extracts edges and rods and dots. And the next one becomes more complicated edges and shapes like little half-moons. And eventually, it's able to build really complicated concepts. Andrew Ng has been able to show that it's able to find cat faces and dog faces in vast amounts of images.

07:34
What my student team at Stanford has shown is that if you train it on 129,000 images of skin conditions, including melanoma and carcinomas, you can do as good a job as the best human dermatologists. And to convince ourselves that this is the case, we captured an independent dataset that we presented to our network and to 25 board-certified Stanford-level dermatologists, and compared those. And in most cases, they were either on par with or above the classification accuracy of the human dermatologists.
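The published Stanford result fine-tuned a pretrained Inception v3 network on the lesion photographs. A minimal transfer-learning sketch in that spirit follows; the "skin_images/" folder layout (one subfolder per diagnosis) is a hypothetical stand-in for the real dataset.

```python
# Hedged sketch of fine-tuning a pretrained image classifier on skin-lesion photos.
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "skin_images/", image_size=(299, 299), batch_size=32)  # hypothetical labeled folders

base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet", pooling="avg", input_shape=(299, 299, 3))
base.trainable = False                                      # first train only the new head

num_classes = len(train_ds.class_names)
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),      # Inception expects inputs in [-1, 1]
    base,
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=3)                               # later, unfreeze `base` to fine-tune
```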
08:10
CA: You were telling me an anecdote. I think about this image right here. What happened here?

08:15
ST: This was last Thursday. That's a moving piece. What we've shown before and we published in "Nature" earlier this year was this idea that we show dermatologists images and our computer program images, and count how often they're right. But all these images are past images. They've all been biopsied to make sure we had the correct classification. This one wasn't. This one was actually done at Stanford by one of our collaborators.

08:38
The story goes that our collaborator, who is a world-famous dermatologist, one of the three best, apparently, looked at this mole and said, "This is not skin cancer." And then he had a second moment, where he said, "Well, let me just check with the app." So he took out his iPhone and ran our piece of software, our "pocket dermatologist," so to speak, and the iPhone said: cancer. It said melanoma. And then he was confused. And he decided, "OK, maybe I trust the iPhone a little bit more than myself," and he sent it out to the lab to get it biopsied. And it came up as an aggressive melanoma.

09:13
So I think this might be the first time that we actually found, in the practice of using deep learning, an actual person whose melanoma would have gone unclassified, had it not been for deep learning.

09:24
CA: I mean, that's incredible.

09:26
(Applause)

09:28
It feels like there'd be an instant demand for an app like this right now, that you might freak out a lot of people. Are you thinking of doing this, making an app that allows self-checking?

09:37
ST: So my in-box is flooded with requests about cancer apps, with heartbreaking stories of people. I mean, some people have had 10, 15, 20 melanomas removed, and are scared that one might be overlooked, like this one, and also, about, I don't know, flying cars and speaker inquiries these days, I guess.

09:56
My take is, we need more testing. I want to be very careful. It's very easy to give a flashy result and impress a TED audience. It's much harder to put something out that's ethical. And if people were to use the app and choose not to consult the assistance of a doctor because we get it wrong, I would feel really bad about it. So we're currently doing clinical tests, and if these clinical tests commence and our data holds up, we might be able at some point to take this kind of technology and take it out of the Stanford clinic and bring it to the entire world, places where Stanford doctors never, ever set foot.
10:30
CA: And do I hear this right, that it seemed like what you were saying, because you are working with this army of Udacity students, that in a way, you're applying a different form of machine learning than might take place in a company, which is you're combining machine learning with a form of crowd wisdom. Are you saying that sometimes you think that could actually outperform what a company can do, even a vast company?

10:53
ST: I believe there's now instances that blow my mind, and I'm still trying to understand. What Chris is referring to is these competitions that we run. We turn them around in 48 hours, and we've been able to build a self-driving car that can drive from Mountain View to San Francisco on surface streets. It's not quite on par with Google after seven years of Google work, but it's getting there. And it took us only two engineers and three months to do this.

11:19
And the reason is, we have an army of students who participate in competitions. We're not the only ones who use crowdsourcing. Uber and Didi use crowdsourcing for driving. Airbnb uses crowdsourcing for hotels. There's now many examples where people do bug-finding crowdsourcing or protein folding, of all things, in crowdsourcing. But we've been able to build this car in three months, so I am actually rethinking how we organize corporations.

11:47
We have a staff of 9,000 people who are never hired, that I never fire. They show up to work and I don't even know. Then they submit to me maybe 9,000 answers. I'm not obliged to use any of those. I end up -- I pay only the winners, so I'm actually very cheapskate here, which is maybe not the best thing to do. But they consider it part of their education, too, which is nice. But these students have been able to produce amazing deep learning results. So yeah, the synthesis of great people and great machine learning is amazing.

12:18
CA: I mean, Garry Kasparov said on the first day [of TED2017] that the winners of chess, surprisingly, turned out to be two amateur chess players with three mediocre-ish, mediocre-to-good, computer programs, that could outperform one grand master with one great chess player, like it was all part of the process. And it almost seems like you're talking about a much richer version of that same idea.

12:41
ST: Yeah, I mean, as you followed the fantastic panels yesterday morning, two sessions about AI, robotic overlords and the human response, many, many great things were said. But one of the concerns is that we sometimes confuse what's actually been done with AI with this kind of overlord threat, where your AI develops consciousness, right? The last thing I want is for my AI to have consciousness. I don't want to come into my kitchen and have the refrigerator fall in love with the dishwasher and tell me, because I wasn't nice enough, my food is now warm. I wouldn't buy these products, and I don't want them.

13:17
But the truth is, for me, AI has always been an augmentation of people. It's been an augmentation of us, to make us stronger. And I think Kasparov was exactly correct. It's been the combination of human smarts and machine smarts that make us stronger. The theme of machines making us stronger is as old as machines are. The agricultural revolution took place because we made steam engines and farming equipment that couldn't farm by itself, that never replaced us; it made us stronger. And I believe this new wave of AI will make us much, much stronger as a human race.
13:53
CA: We'll come on to that a bit more, but just to continue with the scary part of this for some people, like, what feels like it gets scary for people is when you have a computer that can, one, rewrite its own code, so, it can create multiple copies of itself, try a bunch of different code versions, possibly even at random, and then check them out and see if a goal is achieved and improved.

14:18
So, say the goal is to do better on an intelligence test. You know, a computer that's moderately good at that, you could try a million versions of that. You might find one that was better, and then, you know, repeat. And so the concern is that you get some sort of runaway effect where everything is fine on Thursday evening, and you come back into the lab on Friday morning, and because of the speed of computers and so forth, things have gone crazy, and suddenly --

14:45
ST: I would say this is a possibility, but it's a very remote possibility. So let me just translate what I heard you say. In the AlphaGo case, we had exactly this thing: the computer would play the game against itself and then learn new rules. And what machine learning is, is a rewriting of the rules. It's the rewriting of code. But I think there was absolutely no concern that AlphaGo would take over the world. It can't even play chess.
15:11
CA: No, no, no, but now, these are all very single-domain things. But it's possible to imagine. I mean, we just saw a computer that seemed nearly capable of passing a university entrance test, that can kind of -- it can't read and understand in the sense that we can, but it can certainly absorb all the text and maybe see increased patterns of meaning. Isn't there a chance that, as this broadens out, there could be a different kind of runaway effect?

15:39
ST: That's where I draw the line, honestly. And the chance exists -- I don't want to downplay it -- but I think it's remote, and it's not the thing that's on my mind these days, because I think the big revolution is something else. Everything successful in AI to the present date has been extremely specialized, and it's been thriving on a single idea, which is massive amounts of data.

16:01
The reason AlphaGo works so well is because of massive numbers of Go plays, and AlphaGo can't drive a car or fly a plane. The Google self-driving car or the Udacity self-driving car thrives on massive amounts of data, and it can't do anything else. It can't even control a motorcycle. It's a very specific, domain-specific function, and the same is true for our cancer app. There has been almost no progress on this thing called "general AI," where you go to an AI and say, "Hey, invent for me special relativity or string theory." It's totally in its infancy.

16:32
The reason I want to emphasize this, I see the concerns, and I want to acknowledge them. But if I were to think about one thing, I would ask myself the question, "What if we can take anything repetitive and make ourselves 100 times as efficient?" It so turns out, 300 years ago, we all worked in agriculture and did farming and did repetitive things. Today, 75 percent of us work in offices and do repetitive things. We've become spreadsheet monkeys. And not just low-end labor. We've become dermatologists doing repetitive things, lawyers doing repetitive things. I think we are at the brink of being able to take an AI, have it look over our shoulders, and it makes us maybe 10 or 50 times as effective in these repetitive things. That's what is on my mind.
ื›"ื: ื–ื” ื ืฉืžืข ืžืจืชืง ืžืื“.
17:22
CA: That sounds super exciting.
370
1042052
2450
17:24
The process of getting there seems a little terrifying to some people,
371
1044526
3530
ื”ื“ืจืš ืœืฉื ื ืจืื™ืช ืงืฆืช ืžืคื—ื™ื“ื” ืœืื ืฉื™ื ืžืกื•ื™ืžื™ื,
ื›ื™ ื‘ืจื’ืข ืฉืžื—ืฉื‘ ื™ื•ื›ืœ ืœืขืฉื•ืช ืžื˜ืœื•ืช ืฉื—ื•ื–ืจื•ืช ืขืœ ืขืฆืžืŸ
17:28
because once a computer can do this repetitive thing
372
1048080
3180
ื˜ื•ื‘ ื‘ื”ืจื‘ื” ืžืจื•ืคื ื”ืขื•ืจ
17:31
much better than the dermatologist
373
1051284
3434
17:34
or than the driver, especially, is the thing that's talked about
374
1054742
3230
ืื• ืžื”ื ื”ื’, ืฉืขืœ ื–ื” ื‘ืžื™ื•ื—ื“ ืžื“ื‘ืจื™ื ื”ืจื‘ื” ื”ื™ื•ื,
17:37
so much now,
375
1057996
1290
17:39
suddenly millions of jobs go,
376
1059310
1958
ืœืคืชืข ื™ื™ืขืœืžื• ืžื™ืœื™ื•ื ื™ ืžืงื•ืžื•ืช ืขื‘ื•ื“ื”,
17:41
and, you know, the country's in revolution
377
1061292
2695
ื•ื”ืืจืฅ ืชื”ื™ื” ื‘ืžื”ืคื›ื”
ืขื•ื“ ืœืคื ื™ ืฉื‘ื›ืœืœ ื ื’ื™ืข ืœื”ื™ื‘ื˜ื™ื ื”ื ื”ื“ืจื™ื ืฉื™ืชืืคืฉืจื•.
17:44
before we ever get to the more glorious aspects of what's possible.
378
1064011
4329
17:48
ST: Yeah, and that's an issue, and it's a big issue,
379
1068364
2517
ืก"ืช: ื›ืŸ, ื•ื–ืืช ื‘ืขื™ื”, ื‘ืขื™ื” ืจืฆื™ื ื™ืช.
17:50
and it was pointed out yesterday morning by several guest speakers.
380
1070905
4196
ืฆื™ื™ื ื• ืื•ืชื” ืืชืžื•ืœ ื›ืžื” ืžืจืฆื™ื-ืื•ืจื—ื™ื.
17:55
Now, prior to me showing up onstage,
381
1075125
2754
ืœืคื ื™ ืฉืขืœื™ืชื™ ืœื‘ืžื”,
17:57
I confessed I'm a positive, optimistic person,
382
1077903
3739
ื”ื•ื“ื™ืชื™ ืฉืื ื™ ืื“ื ื—ื™ื•ื‘ื™ ื•ืื•ืคื˜ื™ืžื™,
18:01
so let me give you an optimistic pitch,
383
1081666
2389
ืื– ื‘ื•ืื• ื•ืืชืŸ ืœื›ื ื ืื•ื ืžื›ื™ืจื•ืช ืื•ืคื˜ื™ืžื™,
18:04
which is, think of yourself back 300 years ago.
384
1084079
4795
ื“ืžื™ื™ื ื• ืืช ืขืฆืžื›ื ืœืคื ื™ 300 ืฉื ื”.
18:08
Europe just survived 140 years of continuous war,
385
1088898
3996
ืื™ืจื•ืคื” ืฉืจื“ื” ื–ื” ืขืชื” 140 ืฉื ื•ืช ืžืœื—ืžื” ืžืชืžืฉื›ืช,
18:12
none of you could read or write,
386
1092918
1711
ืื™ืฉ ืžื›ื ืื™ื ื• ื™ื•ื“ืข ืงืจื•ื ื•ื›ืชื•ื‘,
18:14
there were no jobs that you hold today,
387
1094653
2945
ื”ืžืฉืจื•ืช ืฉื™ืฉ ืœื›ื ื”ื™ื•ื ื‘ื›ืœืœ ืœื ื”ื™ื•,
18:17
like investment banker or software engineer or TV anchor.
388
1097622
4096
ื›ืžื• ื‘ื ืงืื™ ื”ืฉืงืขื•ืช, ืžื”ื ื“ืก ืชื•ื›ื ื” ืื• ืžื’ื™ืฉ ื—ื“ืฉื•ืช.
18:21
We would all be in the fields and farming.
389
1101742
2414
ื›ื•ืœื ื• ื”ื™ื™ื ื• ื‘ืฉื“ื•ืช, ืขื•ื‘ื“ื™ื ื‘ื—ืงืœืื•ืช.
18:24
Now here comes little Sebastian with a little steam engine in his pocket,
390
1104180
3573
ืžื•ืคื™ืข ืกื‘ืกื˜ื™ืืŸ ืงื˜ืŸ ื•ื‘ื›ื™ืกื• ืžื ื•ืข ืงื™ื˜ื•ืจ ืงื˜ืŸ,
ื•ืื•ืžืจ, "ื—ื‘ืจ'ื”, ืชืจืื• ืืช ื–ื”.
18:27
saying, "Hey guys, look at this.
391
1107777
1548
"ื–ื” ื™ื›ืคื™ืœ ืืช ื›ื•ื—ื›ื ืคื™ 100 ื•ืื– ืชื•ื›ืœื• ืœืขืกื•ืง ื‘ืžืฉื”ื• ืื—ืจ."
18:29
It's going to make you 100 times as strong, so you can do something else."
392
1109349
3595
18:32
And then back in the day, there was no real stage,
393
1112968
2470
ื•ื‘ืื•ืชื ื™ืžื™ื ืœื ื”ื™ืชื” ื‘ืžื” ืืžื™ืชื™ืช,
18:35
but Chris and I hang out with the cows in the stable,
394
1115462
2526
ื›ืจื™ืก ื•ืื ื™ ื”ืกืชื•ื‘ื‘ื ื• ืขื ื”ืคืจื•ืช ื‘ืจืคืช,
18:38
and he says, "I'm really concerned about it,
395
1118012
2100
ื•ื”ื•ื ืื•ืžืจ ืœื™, "ื–ื” ืžื“ืื™ื’ ืื•ืชื™ ืžืื“,
"ื›ื™ ืื ื™ ื—ื•ืœื‘ ืืช ืคืจืชื™ ื‘ื›ืœ ื™ื•ื. ืžื” ืื ื”ืžื›ื•ื ื” ืชืขืฉื” ืืช ื–ื” ื‘ืžืงื•ืžื™?"
18:40
because I milk my cow every day, and what if the machine does this for me?"
396
1120136
3652
18:43
The reason why I mention this is,
397
1123812
1702
ืื ื™ ืžืขืœื” ืืช ื–ื”,
18:46
we're always good in acknowledging past progress and the benefit of it,
398
1126360
3603
ื›ื™ ื›ื•ืœื ื• ืžื›ื™ืจื™ื ื‘ืงื™ื“ืžื” ืฉื—ืœื” ื‘ืขื‘ืจ ื•ื‘ืชื•ืขืœืช ืฉืœื”,
18:49
like our iPhones or our planes or electricity or medical supply.
399
1129987
3354
ื›ืžื• ื”"ืื™ื™ืคื•ืŸ", ื”ืžื˜ื•ืก, ื”ื—ืฉืžืœ ืื• ื”ืชืจื•ืคื•ืช.
ืื ื• ื ื”ื ื™ื ืœื—ื™ื•ืช ืขื“ ื’ื™ืœ 80, ืžื” ืฉืœื ื”ื™ื” ืืคืฉืจื™ ืœืคื ื™ 300 ืฉื ื”.
18:53
We all love to live to 80, which was impossible 300 years ago.
400
1133365
4245
18:57
But we kind of don't apply the same rules to the future.
401
1137634
4156
ืื‘ืœ ืžืฉื•ื-ืžื” ืื™ื ื ื• ืžืชื™ื™ื—ืกื™ื ื›ืš ืืœ ื”ืขืชื™ื“.
19:02
So if I look at my own job as a CEO,
402
1142621
3207
ืื ืื ื™ ืžืกืชื›ืœ ืขืœ ืชืคืงื™ื“ื™ ื›ืžื ื›"ืœ,
19:05
I would say 90 percent of my work is repetitive,
403
1145852
3140
ื”ื™ื™ืชื™ ืื•ืžืจ ืฉ-90% ืžืขื‘ื•ื“ืชื™ ื”ื ื“ื‘ืจื™ื ื—ื•ื–ืจื™ื ื•ื ืฉื ื™ื,
ืื ื™ ืœื ื ื”ื ื” ืžื–ื”,
19:09
I don't enjoy it,
404
1149016
1351
19:10
I spend about four hours per day on stupid, repetitive email.
405
1150391
3978
ืื ื™ ืžืงื“ื™ืฉ ื›ืืจื‘ืข ืฉืขื•ืช ื‘ื™ื•ื ืœื”ืชืขืกืงื•ืช ื—ื•ื–ืจืช ื•ื ืฉื ื™ืช ื‘ื“ื•ื"ืœ ืžื˜ื•ืคืฉ.
19:14
And I'm burning to have something that helps me get rid of this.
406
1154393
3641
ื•ืื ื™ ืžืช ืœืžืฉื”ื• ืฉื™ืขื–ื•ืจ ืœื™ ืœื”ื™ืคื˜ืจ ืžื–ื”.
ืžื“ื•ืข?
19:18
Why?
407
1158058
1158
ื›ื™ ืื ื™ ืžืืžื™ืŸ ืฉื›ื•ืœื ื• ื™ืฆื™ืจืชื™ื™ื ื‘ื˜ื™ืจื•ืฃ,
19:19
Because I believe all of us are insanely creative;
408
1159240
3003
19:22
I think the TED community more than anybody else.
409
1162731
3194
ื•ืงื”ื™ืœืช TED - ื™ื•ืชืจ ืžื›ืœ.
19:25
But even blue-collar workers; I think you can go to your hotel maid
410
1165949
3559
ืื‘ืœ ืืคื™ืœื• ืขื•ื‘ื“ื™ ืฆื•ื•ืืจื•ืŸ ื›ื—ื•ืœ - ืœื“ืขืชื™, ืื ืชื™ื’ืฉ ืœืžืฉืจืชืช ืื• ืœืžืฉืจืช ื‘ืžืœื•ืŸ
19:29
and have a drink with him or her,
411
1169532
2402
ืชื–ืžื™ืŸ ืื•ืชื ืœืžืฉืงื”,
19:31
and an hour later, you find a creative idea.
412
1171958
2717
ื‘ืชื•ืš ืฉืขื” ืชืงื‘ืœ ืžื”ื ืจืขื™ื•ืŸ ื™ืฆื™ืจืชื™.
19:34
What this will empower is to turn this creativity into action.
413
1174699
4140
ื”ื“ื‘ืจ ื”ื–ื” ื™ื’ืจื•ื ืœื™ืฆื™ืจืชื™ื•ืช ื”ื–ืืช ืœื”ืชื’ืฉื.
19:39
Like, what if you could build Google in a day?
414
1179265
3442
ืœืžืฉืœ, ืžื” ืื ื ื•ื›ืœ ืœื‘ื ื•ืช "ื’ื•ื’ืœ" ืชื•ืš ื™ื•ื?
19:43
What if you could sit over beer and invent the next Snapchat,
415
1183221
3316
ืžื” ืื ื™ื›ื•ืœื ื• ืœื”ืžืฆื™ื ืขืœ ื›ื•ืก ื‘ื™ืจื” ืืช ื”"ืกื ืืคืฆ'ื˜" ื”ื‘ื,
19:46
whatever it is,
416
1186561
1165
ืžื” ืฉื–ื” ืœื ื™ื”ื™ื”,
19:47
and tomorrow morning it's up and running?
417
1187750
2187
ื•ืœืžื—ืจืช ื‘ื‘ื•ืงืจ ื–ื” ื›ื‘ืจ ื™ืคืขืœ?
19:49
And that is not science fiction.
418
1189961
1773
ื•ื–ื” ืื™ื ื ื• ืžื“ืข ื‘ื“ื™ื•ื ื™.
19:51
What's going to happen is,
419
1191758
1254
ืžื” ืฉื”ื•ืœืš ืœืงืจื•ืช ื”ื•ื,
ืฉืื ื• ื›ื‘ืจ ื—ืœืง ืžื”ื”ื™ืกื˜ื•ืจื™ื”,
19:53
we are already in history.
420
1193036
1867
19:54
We've unleashed this amazing creativity
421
1194927
3228
ืฉื”ื•ืฆืื ื• ืœืื•ืจ ืืช ื”ื™ืฆื™ืจืชื™ื•ืช ื”ืžื“ื”ื™ืžื” ื”ื–ืืช
ื‘ื›ืš ืฉื”ืฉืชื—ืจืจื ื• ืžื”ืฉืขื‘ื•ื“ ืœื—ืงืœืื•ืช
19:58
by de-slaving us from farming
422
1198179
1611
19:59
and later, of course, from factory work
423
1199814
3363
ื•ืื—"ื›, ื›ืžื•ื‘ืŸ, ืžื”ืขื‘ื•ื“ื” ื‘ืžืคืขืœ
20:03
and have invented so many things.
424
1203201
3162
ื•ื”ืžืฆืื ื• ื›ื” ื”ืจื‘ื” ื“ื‘ืจื™ื.
20:06
It's going to be even better, in my opinion.
425
1206387
2178
ืฉื–ื” ืขืชื™ื“ ืœืœื›ืช ื•ืœื”ืฉืชืคืจ, ืœื“ืขืชื™.
20:08
And there's going to be great side effects.
426
1208589
2072
ื•ืชื”ื™ื™ื ื” ืœื–ื” ืชื•ืฆืื•ืช-ืœื•ื•ืื™ ื ื”ื“ืจื•ืช.
20:10
One of the side effects will be
427
1210685
1489
ืื—ืช ืžืชื•ืฆืื•ืช ื”ืœื•ื•ืื™ ืชื”ื™ื”
20:12
that things like food and medical supply and education and shelter
428
1212198
4795
ืฉื“ื‘ืจื™ื ื›ืžื• ืžื–ื•ืŸ, ืืกืคืงื” ืจืคื•ืื™ืช, ื”ืฉื›ืœื”, ืžื—ืกื” ื•ืชื—ื‘ื•ืจื”
20:17
and transportation
429
1217017
1177
ื™ื™ืขืฉื• ื–ื•ืœื™ื ื‘ื”ืจื‘ื” ืœื›ื•ืœื ื•,
20:18
will all become much more affordable to all of us,
430
1218218
2441
20:20
not just the rich people.
431
1220683
1322
ื•ืœื ืจืง ืœืขืฉื™ืจื™ื.
20:22
CA: Hmm.
432
1222029
1182
ื›"ื: ื”ืžืžืž...
20:23
So when Martin Ford argued, you know, that this time it's different
433
1223235
4341
ืื– ื›ืฉืžืจื˜ื™ืŸ ืคื•ืจื“ ื˜ืขืŸ ืฉื”ืคืขื ื–ื” ืื—ืจืช
20:27
because the intelligence that we've used in the past
434
1227600
3453
ืžืฉื•ื ืฉื”ื‘ื™ื ื” ื‘ื” ื ืขื–ืจื ื• ื‘ืขื‘ืจ
ื›ื“ื™ ืœืžืฆื•ื ื“ืจื›ื™ื ื—ื“ืฉื•ืช
20:31
to find new ways to be
435
1231077
2483
20:33
will be matched at the same pace
436
1233584
2279
ืชื™ื“ื—ืง ื”ื—ื•ืฆื” ื‘ืื•ืชื• ื”ืงืฆื‘
20:35
by computers taking over those things,
437
1235887
2291
ืข"ื™ ืžื—ืฉื‘ื™ื ืฉื™ืฉืชืœื˜ื• ืขืœ ื”ื“ื‘ืจื™ื ื”ืืœื”,
ืžื” ืฉืื ื™ ืฉื•ืžืข ืžืžืš ื–ื” ืฉืœื ืœื’ืžืจื™
20:38
what I hear you saying is that, not completely,
438
1238202
3078
20:41
because of human creativity.
439
1241304
2951
ื”ื•ื“ื•ืช ืœื™ืฆื™ืจืชื™ื•ืช ื”ืื ื•ืฉื™ืช...
20:44
Do you think that that's fundamentally different from the kind of creativity
440
1244279
3785
ื”ืื ืœื“ืขืชืš ื–ื” ืฉื•ื ื” ืžืŸ ื”ื™ืกื•ื“ ืžืกื•ื’ ื”ื™ืฆื™ืจืชื™ื•ืช
20:48
that computers can do?
441
1248088
2696
ืฉื”ืžื—ืฉื‘ื™ื ืžืกื•ื’ืœื™ื ืœื”?
20:50
ST: So, that's my firm belief as an AI person --
442
1250808
4434
ืก"ืช: ืื ื™ ืžืืžื™ืŸ ื‘ืชื•ืงืฃ ื›ืื™ืฉ ื‘ื™ื ื” ืžืœืื›ื•ืชื™ืช --
ืฉืœื ืจืื™ืชื™ ืฉื•ื ื”ืชืงื“ืžื•ืช ืืžื™ืชื™ืช ืžื‘ื—ื™ื ืช ื™ืฆื™ืจืชื™ื•ืช
20:55
that I haven't seen any real progress on creativity
443
1255266
3803
20:59
and out-of-the-box thinking.
444
1259949
1407
ื•ื—ืฉื™ื‘ื” ืœื-ืฉื’ืจืชื™ืช.
21:01
What I see right now -- and this is really important for people to realize,
445
1261380
3623
ืžื” ืฉืื ื™ ืจื•ืื” ื›ืจื’ืข -- ื•ื—ืฉื•ื‘ ืžืื“ ืฉืื ืฉื™ื ื™ื‘ื™ื ื• ื–ืืช,
21:05
because the word "artificial intelligence" is so threatening,
446
1265027
2903
ื›ื™ ื”ืžื•ืฉื’ "ื‘ื™ื ื” ืžืœืื›ื•ืชื™ืช" ื›ื” ืžืื™ื™ื,
21:07
and then we have Steve Spielberg tossing a movie in,
447
1267954
2523
ื•ื’ื ืกื˜ื™ื‘ืŸ ืกืคื™ืœื‘ืจื’ ื”ื•ืกื™ืฃ ื›ืืŸ ืกืจื˜ ืœืžื“ื•ืจื”,
21:10
where all of a sudden the computer is our overlord,
448
1270501
2413
ื•ื”ืžื—ืฉื‘ ื ื”ื™ื” ืœืคืชืข ืื“ื•ืŸ ื”ืขื•ืœื,
21:12
but it's really a technology.
449
1272938
1452
ื‘ืขื•ื“ ืฉื–ื• ืจืง ื˜ื›ื ื•ืœื•ื’ื™ื”.
21:14
It's a technology that helps us do repetitive things.
450
1274414
2982
ื˜ื›ื ื•ืœื•ื’ื™ื” ืฉืขื•ื–ืจืช ืœื ื• ืœื‘ืฆืข ืžื˜ืœื•ืช ื—ื•ื–ืจื•ืช ื•ื ืฉื ื•ืช.
21:17
And the progress has been entirely on the repetitive end.
451
1277420
2913
ื•ื”ื”ืชืงื“ืžื•ืช ื”ื™ืชื” ื›ื•ืœื” ื‘ื”ื™ื‘ื˜ ื”ื—ื–ืจืชื™:
21:20
It's been in legal document discovery.
452
1280357
2228
ื‘ื’ื™ืœื•ื™ ืžืกืžื›ื™ื ืžืฉืคื˜ื™ื™ื,
21:22
It's been contract drafting.
453
1282609
1680
ื‘ื ื™ืกื•ื— ื˜ื™ื•ื˜ื•ืช ืฉืœ ื—ื•ื–ื™ื,
21:24
It's been screening X-rays of your chest.
454
1284313
4223
ื‘ืฆื™ืœื•ืžื™ ืจื ื˜ื’ืŸ ืฉืœ ื”ื—ื–ื”,
21:28
And these things are so specialized,
455
1288560
1773
ื•ืืœื” ื“ื‘ืจื™ื ื›ืœ-ื›ืš ืžื•ื’ื‘ืœื™ื ื•ืกืคืฆื™ืคื™ื™ื,
21:30
I don't see the big threat of humanity.
456
1290357
2391
ืฉืื ื™ ืœื ืจื•ืื” ืื™ืคื” ื”ืื™ื•ื ื”ื’ื“ื•ืœ ืขืœ ื”ืื ื•ืฉื•ืช.
21:32
In fact, we as people --
457
1292772
1794
ืœืžืขืฉื”, ืื ื• ื›ื‘ื ื™-ืื“ื --
21:34
I mean, let's face it: we've become superhuman.
458
1294590
2385
ื›ื™ ื‘ื•ืื• ื•ื ื•ื“ื” ื‘ื›ืš: ื ืขืฉื™ื ื• ืื“ื-ืขืœ
21:36
We've made us superhuman.
459
1296999
1764
ื”ืคื›ื ื• ืืช ืขืฆืžื ื• ืœืื“ื-ืขืœ.
21:38
We can swim across the Atlantic in 11 hours.
460
1298787
2632
ืื ื• ื™ื›ื•ืœื™ื ืœื—ืฆื•ืช ืืช ื”ืื•ืงื™ื™ื ื•ืก ื”ืื˜ืœื ื˜ื™ ื‘ืชื•ืš 11 ืฉืขื•ืช.
21:41
We can take a device out of our pocket
461
1301443
2074
ืื ื• ื™ื›ื•ืœื™ื ืœืฉืœื•ืฃ ืžื”ื›ื™ืก ืžื›ืฉื™ืจ
21:43
and shout all the way to Australia,
462
1303541
2147
ื•ืœืฆืขื•ืง ืœืชื•ื›ื• ื›ืš ืฉื™ืฉืžืขื• ื‘ืื•ืกื˜ืจืœื™ื”,
21:45
and in real time, have that person shouting back to us.
463
1305712
2600
ื•ืฉื”ืฉื•ืžืข ื™ืฆืขืง ืืœื™ื ื• ื‘ื—ื–ืจื” ื‘ื–ืžืŸ ืืžืช.
21:48
That's physically not possible. We're breaking the rules of physics.
464
1308336
3624
ืคื™ื–ื™ืช, ื–ื” ื‘ืœืชื™-ืืคืฉืจื™. ืื ื• ืฉื•ื‘ืจื™ื ืืช ื›ืœืœื™ ื”ืคื™ื–ื™ืงื”.
21:51
When this is said and done, we're going to remember everything
465
1311984
2943
ื‘ืกื•ืคื• ืฉืœ ื“ื‘ืจ ืื ื• ื ื–ื›ื•ืจ ื”ื›ืœ,
21:54
we've ever said and seen,
466
1314951
1213
ืืช ื›ืœ ืžื” ืฉืืžืจื ื• ื•ืจืื™ื ื•,
21:56
you'll remember every person,
467
1316188
1496
ืืชื ืชื–ื›ืจื• ื›ืœ ืื“ื,
21:57
which is good for me in my early stages of Alzheimer's.
468
1317708
2626
ื•ื–ื” ืžืฆื•ื™ืŸ ื‘ืฉื‘ื™ืœื™, ื‘ืฉืœื‘ื™ ื”ืฉื˜ื™ื•ืŸ ื”ืžื•ืงื“ืžื™ื ืฉืœื™.
22:00
Sorry, what was I saying? I forgot.
469
1320358
1677
ืกืœื™ื—ื”, ืžื” ืืžืจืชื™? ืฉื›ื—ืชื™.
22:02
CA: (Laughs)
470
1322059
1578
ื›"ื: (ืฆื•ื—ืง)
22:03
ST: We will probably have an IQ of 1,000 or more.
471
1323661
3077
ืก"ืช: ืžื ืช ื”ืžืฉื›ืœ ืฉืœื ื• ืชื”ื™ื” ื›ื ืจืื” 1000 ืื• ื™ื•ืชืจ.
22:06
There will be no more spelling classes for our kids,
472
1326762
3425
ืœื™ืœื“ื™ื ื›ื‘ืจ ืœื ื™ื”ื™ื• ืฉื™ืขื•ืจื™ ืื™ื•ืช,
22:10
because there's no spelling issue anymore.
473
1330211
2086
ื›ื™ ื›ื‘ืจ ืœื ืชื”ื™ื” ื‘ืขื™ื™ืช ืื™ื•ืช.
22:12
There's no math issue anymore.
474
1332321
1832
ื›ื‘ืจ ืœื ืชื”ื™ื™ื ื” ื‘ืขื™ื•ืช ื‘ื—ืฉื‘ื•ืŸ.
22:14
And I think what really will happen is that we can be super creative.
475
1334177
3510
ื•ืœื“ืขืชื™ ืžื” ืฉื‘ืืžืช ื™ืงืจื” ื”ื•ื ืฉื ื•ื›ืœ ืœื”ื™ื•ืช ืกื•ืคืจ-ื™ืฆื™ืจืชื™ื™ื.
22:17
And we are. We are creative.
476
1337711
1857
ื•ืื ื• ื›ื‘ืจ ื›ืืœื”. ืื ื• ื™ืฆื™ืจืชื™ื™ื.
22:19
That's our secret weapon.
477
1339592
1552
ื–ื”ื• ื”ื ืฉืง ื”ืกื•ื“ื™ ืฉืœื ื•.
22:21
CA: So the jobs that are getting lost,
478
1341168
2153
ื›"ื: ืื– ืžืงื•ืžื•ืช ื”ืขื‘ื•ื“ื” ืฉื ืขืœืžื™ื,
22:23
in a way, even though it's going to be painful,
479
1343345
2494
ื‘ืžื•ื‘ืŸ ืžืกื•ื™ื, ื›ื›ืœ ืฉื–ื” ื›ื•ืื‘,
22:25
humans are capable of more than those jobs.
480
1345863
2047
ื‘ื ื™-ืื“ื ืžืกื•ื’ืœื™ื ืœื™ื•ืชืจ ืžืืฉืจ ืœืขืกื•ืง ื‘ืžืฉืจื•ืช ื”ืืœื”.
22:27
This is the dream.
481
1347934
1218
ื–ื”ื• ื”ื—ืœื•ื.
22:29
The dream is that humans can rise to just a new level of empowerment
482
1349176
4247
ื”ื—ืœื•ื ื”ื•ื ืฉื‘ื ื™ ื”ืื“ื ื™ื•ื›ืœื• ืœื”ืชืขืœื•ืช ืœืžื™ืฉื•ืจ ื—ื“ืฉ ืฉืœ ื”ืขืฆืžื”
22:33
and discovery.
483
1353447
1657
ื•ื’ื™ืœื•ื™ื™ื.
22:35
That's the dream.
484
1355128
1452
ื–ื”ื• ื”ื—ืœื•ื.
22:36
ST: And think about this:
485
1356604
1643
ืก"ืช: ื•ื’ื ืชื—ืฉื•ื‘ ืขืœ ื–ื”:
22:38
if you look at the history of humanity,
486
1358271
2021
ืื ืชืกืชื›ืœ ืขืœ ื”ื”ื™ืกื˜ื•ืจื™ื” ื”ืื ื•ืฉื™ืช,
22:40
that might be whatever -- 60-100,000 years old, give or take --
487
1360316
3328
ืœื ืžืฉื ื” ืžืชื™ -- ืœืคื ื™ 60,000 ืื• 100,000 ืฉื ื”, ืคืœื•ืก-ืžื™ื ื•ืก --
22:43
almost everything that you cherish in terms of invention,
488
1363668
3726
ื›ืžืขื˜ ื›ืœ ืžื” ืฉื™ืงืจ ืœื ื• ืžื‘ื—ื™ื ืช ื”ื”ืžืฆืื•ืช,
22:47
of technology, of things we've built,
489
1367418
2151
ื”ื˜ื›ื ื•ืœื•ื’ื™ื”, ื”ื“ื‘ืจื™ื ืฉื‘ื ื™ื ื•,
22:49
has been invented in the last 150 years.
490
1369593
3099
ื›ืœ ืืœื” ื”ื•ืžืฆืื• ื‘-150 ื”ืฉื ื™ื ื”ืื—ืจื•ื ื•ืช.
22:53
If you toss in the book and the wheel, it's a little bit older.
491
1373756
3048
ืื ืžื•ืกื™ืคื™ื ืืช ื”ืกืคืจ ื•ืืช ื”ื’ืœื’ืœ, ื–ื” ืงืฆืช ื™ื•ืชืจ ืขืชื™ืง.
22:56
Or the axe.
492
1376828
1169
ืื• ืืช ื”ื’ืจื–ืŸ.
22:58
But your phone, your sneakers,
493
1378021
2790
ืื‘ืœ ื”ื˜ืœืคื•ืŸ, ื ืขืœื™ ื”ืกืคื•ืจื˜,
23:00
these chairs, modern manufacturing, penicillin --
494
1380835
3551
ื”ื›ืกืื•ืช ื”ืืœื”, ื”ื™ื™ืฆื•ืจ ื”ืžื•ื“ืจื ื™, ื”ืคื ื™ืฆื™ืœื™ืŸ --
23:04
the things we cherish.
495
1384410
1714
ื”ื“ื‘ืจื™ื ืฉื™ืงืจื™ื ืœื ื•.
23:06
Now, that to me means
496
1386148
3658
ื‘ืขื™ื ื™ ื–ื” ืื•ืžืจ
23:09
the next 150 years will find more things.
497
1389830
3041
ืฉื‘-150 ื”ืฉื ื™ื ื”ื‘ืื•ืช ื ื’ืœื” ื“ื‘ืจื™ื ื ื•ืกืคื™ื.
23:12
In fact, the pace of invention has gone up, not gone down, in my opinion.
498
1392895
4154
ืœืžืขืฉื”, ืงืฆื‘ ื”ื”ืžืฆืื•ืช ื”ืชื’ื‘ืจ ื•ืœื ื ื—ืœืฉ, ืœื“ืขืชื™.
ืื ื™ ืžืืžื™ืŸ ืฉืจืง ืื—ื•ื– ืื—ื“ ืžื”ื“ื‘ืจื™ื ื”ืžืขื ื™ื™ื ื™ื ื”ืชื’ืœื” ืขื“ ื›ื”.
23:17
I believe only one percent of interesting things have been invented yet. Right?
499
1397073
4905
ืขื•ื“ ืœื ืจื™ืคืื ื• ืืช ื”ืกืจื˜ืŸ.
23:22
We haven't cured cancer.
500
1402002
1988
ืื™ืŸ ืœื ื• ืžื›ื•ื ื™ื•ืช ืžืขื•ืคืคื•ืช -- ืขื“ื™ื™ืŸ. ืื ื™ ืžืงื•ื•ื” ืœืฉื ื•ืช ืืช ื–ื”.
23:24
We don't have flying cars -- yet. Hopefully, I'll change this.
501
1404014
3718
23:27
That used to be an example people laughed about. (Laughs)
502
1407756
3257
ื–ื• ื“ื•ื’ืžื” ืœืžืฉื”ื• ืฉืื ืฉื™ื ืœืขื’ื• ืœื•.
ืžืฆื—ื™ืง, ืœื? "ืขื•ื‘ื“ื™ื ื‘ื—ืฉืื™ ืขืœ ืžื›ื•ื ื™ื•ืช ืžืขื•ืคืคื•ืช."
23:31
It's funny, isn't it? Working secretly on flying cars.
503
1411037
2992
ืขื“ื™ื™ืŸ ืื™ื ื ื• ื—ื™ื™ื ืคื™ 2 ื–ืžืŸ, ื ื›ื•ืŸ?
23:34
We don't live twice as long yet. OK?
504
1414053
2683
23:36
We don't have this magic implant in our brain
505
1416760
2785
ืื™ืŸ ืœื ื• ืฉืชืœ ืงืกื•ื ื‘ืžื•ื—
23:39
that gives us the information we want.
506
1419569
1832
ืฉืžืกืคืง ืœื ื• ืžื™ื“ืข ื ื—ื•ืฅ.
ืื•ืœื™ ืืชื ืžื–ื•ืขื–ืขื™ื ืžื›ืš,
23:41
And you might be appalled by it,
507
1421425
1526
23:42
but I promise you, once you have it, you'll love it.
508
1422975
2444
ืื‘ืœ ืื ื™ ืžื‘ื˜ื™ื— ืœื›ื ืฉื›ืฉื–ื” ื™ื”ื™ื”, ืชืžื•ืชื• ืขืœ ื–ื”.
23:45
I hope you will.
509
1425443
1166
ืื ื™ ืžืงื•ื•ื” ื›ืš.
23:46
It's a bit scary, I know.
510
1426633
1909
ื–ื” ืงืฆืช ืžืคื—ื™ื“, ืื ื™ ื™ื•ื“ืข.
23:48
There are so many things we haven't invented yet
511
1428566
2254
ื™ืฉ ืขื•ื“ ื”ืžื•ืŸ ื“ื‘ืจื™ื ืฉื˜ืจื ื”ืžืฆืื ื• ื•ืฉืขื•ื“ ื ืžืฆื™ื.
23:50
that I think we'll invent.
512
1430844
1268
23:52
We have no gravity shields.
513
1432136
1306
ืื™ืŸ ืœื ื• ืžื’ื™ื ื™ ื›ื‘ื™ื“ื”.
23:53
We can't beam ourselves from one location to another.
514
1433466
2553
ืื™ื ื ื• ื™ื›ื•ืœื™ื ืœืฉื’ืจ ืืช ืขืฆืžื ื• ืžืžืงื•ื ืœืžืงื•ื.
23:56
That sounds ridiculous,
515
1436043
1151
ื–ื” ื ืฉืžืข ืžื’ื•ื—ืš,
23:57
but about 200 years ago,
516
1437218
1288
ืื‘ืœ ืœืคื ื™ ื›-200 ืฉื ื”
23:58
experts were of the opinion that flight wouldn't exist,
517
1438530
2667
ื”ืžื•ืžื—ื™ื ื”ื™ื• ื‘ื“ืขื” ืื—ืช ืฉื”ืชืขื•ืคื” ืœื ืชื™ืชื›ืŸ,
24:01
even 120 years ago,
518
1441221
1324
ืืคื™ืœื• ืœืคื ื™ 120 ืฉื ื”,
24:02
and if you moved faster than you could run,
519
1442569
2582
ื•ืฉืื ื ื ื•ืข ืžื”ืจ ื™ื•ืชืจ ืžื›ืคื™ ื™ื›ื•ืœืชื ื• ืœืจื•ืฅ,
24:05
you would instantly die.
520
1445175
1520
ื ืžื•ืช ืžื™ื“.
24:06
So who says we are correct today that you can't beam a person
521
1446719
3569
ืื– ืžื™ ื™ืืžืจ ืฉื”ื™ื•ื ืœื ื ื›ื•ืŸ ืœืงื‘ื•ืข
ืฉืื™-ืืคืฉืจ ืœืฉื’ืจ ืื“ื ืžื›ืืŸ ืœืžืื“ื™ื?
24:10
from here to Mars?
522
1450312
2249
24:12
CA: Sebastian, thank you so much
523
1452585
1569
ื›"ื: ืกื‘ืกื˜ื™ืืŸ, ืชื•ื“ื” ืจื‘ื” ืœืš
24:14
for your incredibly inspiring vision and your brilliance.
524
1454178
2682
ืขืœ ื—ื•ื›ืžืชืš ื•ื”ื—ื–ื•ืŸ ืจื‘ ื”ื”ืฉืจืื” ืฉืœืš.
24:16
Thank you, Sebastian Thrun.
525
1456884
1323
ืชื•ื“ื” ืจื‘ื” ืœืกื‘ืกื˜ื™ืืŸ ืช'ืจืŸ. ื–ื” ื”ื™ื” ื ื”ื“ืจ.
24:18
That was fantastic. (Applause)
526
1458231
1895
(ืžื—ื™ืื•ืช ื›ืคื™ื™ื)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7