Elon Musk: A future worth getting excited about | Tesla Texas Gigafactory interview | TED

9,805,272 views

2022-04-18 ・ TED


Chris Anderson: Elon Musk, great to see you. How are you?

Elon Musk: Good. How are you?

CA: We're here at the Texas Gigafactory the day before this thing opens. It's been pretty crazy out there. Thank you so much for making time on a busy day. I would love you to help us, kind of, cast our minds, I don't know, 10, 20, 30 years into the future. And help us try to picture what it would take to build a future that's worth getting excited about. The last time you spoke at TED, you said that that was really just a big driver. You know, you can talk about lots of other reasons to do the work you're doing, but fundamentally, you want to think about the future and not think that it sucks.

EM: Yeah, absolutely. I think in general, you know, there's a lot of discussion of like, this problem or that problem. And a lot of people are sad about the future and they're ... pessimistic. And I think ... this is ... This is not great. I mean, we really want to wake up in the morning and look forward to the future. We want to be excited about what's going to happen. And life cannot simply be about sort of, solving one miserable problem after another.
CA: So if you look forward 30 years, you know, the year 2050 has been labeled by scientists as this, kind of, almost like this doomsday deadline on climate. There's a consensus of scientists, a large consensus of scientists, who believe that if we haven't completely eliminated greenhouse gases or offset them completely by 2050, effectively we're inviting climate catastrophe. Do you believe there is a pathway to avoid that catastrophe? And what would it look like?

EM: Yeah, so I am not one of the doomsday people, which may surprise you. I actually think we're on a good path. But at the same time, I want to caution against complacency. So, so long as we are not complacent, as long as we have a high sense of urgency about moving towards a sustainable energy economy, then I think things will be fine. So I can't emphasize that enough: as long as we push hard and are not complacent, the future is going to be great. Don't worry about it. I mean, worry about it, but if you worry about it, ironically, it will be a self-unfulfilling prophecy.

So, like, there are three elements to a sustainable energy future. One is sustainable energy generation, which is primarily wind and solar. There's also hydro, geothermal -- I'm actually pro-nuclear. I think nuclear is fine. But it's going to be primarily solar and wind, as the primary generators of energy. The second part is you need batteries to store the solar and wind energy, because the sun doesn't shine all the time, the wind doesn't blow all the time. So it's a lot of stationary battery packs. And then you need electric transport. So electric cars, electric planes, boats. And then ultimately, it's not really possible to make electric rockets, but you can make the propellant used in rockets using sustainable energy. So ultimately, we can have a fully sustainable energy economy. And it's those three things: solar/wind, stationary battery pack, electric vehicles.

So then what are the limiting factors on progress? The limiting factor really will be battery cell production. So that's going to really be the fundamental rate driver. And then whatever the slowest element of the whole lithium-ion battery cell supply chain, from mining and the many steps of refining to ultimately creating a battery cell and putting it into a pack, that will be the limiting factor on progress towards sustainability.
CA: All right, so we need to talk more about batteries, because the key thing that I want to understand, like, there seems to be a scaling issue here that is kind of amazing and alarming. You have said that you have calculated that the amount of battery production that the world needs for sustainability is 300 terawatt hours of batteries. That's the end goal?

EM: Very rough numbers, and I certainly would invite others to check our calculations, because they may arrive at different conclusions. But in order to transition, not just current electricity production, but also heating and transport, which roughly triples the amount of electricity that you need, it amounts to approximately 300 terawatt hours of installed capacity.
CA: So we need to give people a sense of how big a task that is. I mean, here we are at the Gigafactory. You know, this is one of the biggest buildings in the world. What I've read, and tell me if this is still right, is that the goal here is to eventually produce 100 gigawatt hours of batteries here a year.

EM: We will probably do more than that, but yes, hopefully we get there within a couple of years.

CA: Right. But I mean, that is one --

EM: 0.1 terawatt hours.

CA: But that's still 1/100 of what's needed. How much of the rest of that 100 is Tesla planning to take on, let's say, between now and 2030, 2040, when we really need to see the scale up happen?

EM: I mean, these are just guesses. So please, people shouldn't hold me to these things. It's not like this is like some -- What tends to happen is I'll make some, like, you know, best guess, and then people, in five years, there'll be some jerk that writes an article: "Elon said this would happen, and it didn't happen. He's a liar and a fool." It's very annoying when that happens. So these are just guesses, this is a conversation.

CA: Right.

EM: I think Tesla probably ends up doing 10 percent of that. Roughly.
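(One way to read those figures, as a rough sketch: the 300 terawatt-hour target and the roughly 100 gigawatt-hours per year of factory output are stated in the conversation, while the ~30-year build-out used below is an added assumption, included only to show how "1/100 of what's needed" works out as an annual production rate.)

```python
# Back-of-the-envelope sketch of the battery-scale arithmetic above.
# 300 TWh of installed storage and ~100 GWh/year of factory output come from the
# conversation; spreading the build-out over ~30 years (to ~2050) is an assumption.

target_installed_twh = 300          # total installed battery capacity estimated above
factory_output_twh_per_year = 0.1   # 100 GWh/year = 0.1 TWh/year
build_out_years = 30                # assumption: transition roughly complete by 2050

required_rate = target_installed_twh / build_out_years        # ~10 TWh per year needed
factory_share = factory_output_twh_per_year / required_rate   # ~0.01
print(f"One such factory supplies about {factory_share:.0%} of the required annual rate")
```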
CA: Let's say 2050 we have this amazing, you know, 100 percent sustainable electric grid made up of, you know, some mixture of the sustainable energy sources you talked about. That same grid probably is offering the world really low-cost energy, isn't it, compared with now. And I'm curious about, like, are people entitled to get a little bit excited about the possibilities of that world?

EM: People should be optimistic about the future. Humanity will solve sustainable energy. It will happen if we, you know, continue to push hard; the future is bright and good from an energy standpoint. And then it will be possible to also use that energy to do carbon sequestration. It takes a lot of energy to pull carbon out of the atmosphere, because putting it into the atmosphere releases energy. So now, you know, obviously in order to pull it out, you need to use a lot of energy. But if you've got a lot of sustainable energy from wind and solar, you can actually sequester carbon. So you can reverse the CO2 parts per million of the atmosphere and oceans. And also you can really have as much fresh water as you want. Earth is mostly water. We should call Earth "Water." It's 70 percent water by surface area. Now most of that's seawater, but it's like we just happen to be on the bit that's land.

CA: And with energy, you can turn seawater into --

EM: Yes.

CA: Irrigating water or whatever water you need.

EM: At very low cost. Things will be good.

CA: Things will be good. And also, there's other benefits to this non-fossil fuel world where the air is cleaner --

EM: Yes, exactly. Because, like, when you burn fossil fuels, there's all these side reactions and toxic gases of various kinds. And sort of little particulates that are bad for your lungs. Like, there's all sorts of bad things that are happening that will go away. And the sky will be cleaner and quieter. The future's going to be good.
CA: I want us to switch now to think a bit about artificial intelligence. But the segue there, you mentioned how annoying it is when people call you up for bad predictions in the past. So I'm possibly going to be annoying now, but I'm curious about your timelines and how you predict, and how come some things are so amazingly on the money and some aren't. So when it comes to predicting sales of Tesla vehicles, for example, you've kind of been amazing. I think in 2014, when Tesla had sold that year 60,000 cars, you said, "2020, I think we will do half a million a year."

EM: Yeah, we did almost exactly a half million.

CA: You did almost exactly half a million. You were scoffed at in 2014, because no one since Henry Ford, with the Model T, had come close to that kind of growth rate for cars. You were scoffed at, and you actually hit 500,000 cars and then 510,000 or whatever produced. But five years ago, last time you came to TED, I asked you about full self-driving, and you said, "Yeah, this very year, I'm confident that we will have a car going from LA to New York without any intervention."

EM: Yeah, I don't want to blow your mind, but I'm not always right.

CA: (Laughs) What's the difference between those two? Why has full self-driving in particular been so hard to predict?
EM: I mean, the thing that really got me, and I think it's going to get a lot of other people, is that there are just so many false dawns with self-driving, where you think you've got the problem, have a handle on the problem, and then it, no, turns out you just hit a ceiling. Because if you were to plot the progress, the progress looks like a log curve. So it's like a series of log curves. So most people don't know what a log curve is, I suppose.

CA: Show the shape with your hands.

EM: It goes up, you know, sort of a fairly straight way, and then it starts tailing off and you start getting diminishing returns. And you're like, uh oh, it was trending up and now it's sort of, curving over, and you start getting to these, what I call local maxima, where you don't realize basically how dumb you were. And then it happens again. And ultimately ... These things, you know, in retrospect, they seem obvious, but in order to solve full self-driving properly, you actually have to solve real-world AI. Because what are the road networks designed to work with? They're designed to work with a biological neural net, our brains, and with vision, our eyes. And so in order to make it work with computers, you basically need to solve real-world AI and vision. Because we need cameras and silicon neural nets in order to have self-driving work for a system that was designed for eyes and biological neural nets. You know, I guess when you put it that way, it's sort of, like, quite obvious that the only way to solve full self-driving is to solve real-world AI and sophisticated vision.
CA: What do you feel about the current architecture? Do you think you have an architecture now where there is a chance for the logarithmic curve not to tail off anytime soon?

EM: Well I mean, admittedly these may be infamous last words, but I actually am confident that we will solve it this year. That we will exceed -- The probability of an accident, at what point do you exceed that of the average person? I think we will exceed that this year.
CA: What are you seeing behind the scenes that gives you that confidence?

EM: We're almost at the point where we have a high-quality unified vector space. In the beginning, we were trying to do this with image recognition on individual images. But if you get one image out of a video, it's actually quite hard to see what's going on without ambiguity. But if you look at a video segment of a few seconds of video, that ambiguity resolves.

So the first thing we had to do is tie all eight cameras together so they're synchronized, so that all the frames are looked at simultaneously and labeled simultaneously by one person, because we still need human labeling. So at least they're not labeled at different times by different people in different ways. So it's sort of a surround picture. Then a very important part is to add the time dimension, so that you're looking at surround video and you're labeling surround video. And this is actually quite difficult to do from a software standpoint. We had to write our own labeling tools and then create auto-labeling software to amplify the efficiency of human labelers, because it's quite hard to label. In the beginning, it was taking several hours to label a 10-second video clip. This is not scalable.

So basically what you have to have is you have to have surround video, and that surround video has to be primarily automatically labeled, with humans just being editors and making slight corrections to the labeling of the video, and then feeding back those corrections into the future auto labeler, so you get this flywheel eventually where the auto labeler is able to take in vast amounts of video and, with high accuracy, automatically label the video for cars, lane lines, drive space.
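(A minimal sketch of that human-in-the-loop labeling flywheel, for readers who want the loop spelled out. This is not Tesla's pipeline; the class and function names below are hypothetical stand-ins, and the "model" is just a placeholder.)

```python
# Sketch of the auto-labeling flywheel described above: the auto labeler proposes labels
# for surround-video clips, human editors make slight corrections, and the corrections
# are fed back so the next round of automatic labels is more accurate.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Clip:
    frames: list                                   # synchronized multi-camera frames
    labels: list = field(default_factory=list)     # cars, lane lines, drivable space, ...

class AutoLabeler:
    def __init__(self) -> None:
        self.corrections: List[Clip] = []

    def label(self, clip: Clip) -> Clip:
        # Placeholder: a real system would run a learned model over the surround video.
        clip.labels = ["proposed"] * len(clip.frames)
        return clip

    def retrain(self, reviewed: List[Clip]) -> None:
        # Fold human corrections back in so future automatic labels improve.
        self.corrections.extend(reviewed)

def human_review(clip: Clip) -> Clip:
    # Placeholder for a human editor making slight corrections to proposed labels.
    return clip

def flywheel(clips: List[Clip], labeler: AutoLabeler, rounds: int = 3) -> None:
    for _ in range(rounds):
        reviewed = [human_review(labeler.label(c)) for c in clips]
        labeler.retrain(reviewed)   # each round, humans edit less and the model does more
```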
CA: What you're saying is ... the result of this is that you're effectively giving the car a 3D model of the actual objects that are all around it. It knows what they are, and it knows how fast they are moving. And the remaining task is to predict what the quirky behaviors are that, you know, that when a pedestrian is walking down the road with a smaller pedestrian, that maybe that smaller pedestrian might do something unpredictable, or things like that. You have to build that into it before you can really call it safe.

EM: You basically need to have memory across time and space. So what I mean by that is ... Memory can't be infinite, because it's using up a lot of the computer's RAM, basically. So you have to say how much are you going to try to remember? It's very common for things to be occluded. So if you talk about, say, a pedestrian walking past a truck, where you saw the pedestrian start on one side of the truck, then they're occluded by the truck. You would know intuitively, OK, that pedestrian is going to pop out the other side, most likely.

CA: A computer doesn't know it.

EM: You need to slow down.
CA: A skeptic is going to say that every year for the last five years, you've kind of said, well, no, this is the year, we're confident that it will be there in a year or two or, you know, like it's always been about that far away. But we've got a new architecture now, you're seeing enough improvement behind the scenes to make you not certain, but pretty confident, that, by the end of this year, not in every city and every circumstance, but in many cities and circumstances, basically the car will be able to drive without interventions, safer than a human.

EM: Yes. I mean, the car currently drives me around Austin most of the time with no interventions. So it's not like ... And we have over 100,000 people in our full self-driving beta program. So you can look at the videos that they post online.

CA: I do. And some of them are great, and some of them are a little terrifying. I mean, occasionally the car seems to veer off and scare the hell out of people.

EM: It's still a beta.

CA: But you're behind the scenes, looking at the data, you're seeing enough improvement to believe that a this-year timeline is real.

EM: Yes, that's what it seems like. I mean, we could be here talking again in a year, like, well, another year went by, and it didn't happen. But I think this is the year.
CA: And so in general, when people talk about Elon time, I mean it sounds like you can't just have a general rule that if you predict that something will be done in six months, actually what we should imagine is it's going to be a year, or it's like two-x or three-x; it depends on the type of prediction. Some things, I guess, things involving software, AI, whatever, are fundamentally harder to predict than others. Is there an element that you actually deliberately make aggressive prediction timelines to drive people to be ambitious? Without that, nothing gets done?

EM: Well, I generally believe, in terms of internal timelines, that we want to set the most aggressive timeline that we can. Because there's sort of like a law of gaseous expansion for schedules, where whatever time you set, it's not going to be less than that. It's very rare that it'll be less than that.

But as far as our predictions are concerned, what tends to happen in the media is that they will report all the wrong ones and ignore all the right ones. Or, you know, when writing an article about me -- I've had a long career in multiple industries. If you list my sins, I sound like the worst person on Earth. But if you put those against the things I've done right, it makes much more sense, you know? So essentially, like, the longer you do anything, the more mistakes that you will make cumulatively. Which, if you sum up those mistakes, will sound like I'm the worst predictor ever. But for example, for Tesla vehicle growth, I said I think we'd do 50 percent, and we've done 80 percent.

CA: Yes.

EM: But they don't mention that one. So, I mean, I'm not sure what my exact track record is on predictions. They're more optimistic than pessimistic, but they're not all optimistic. Some of them are exceeded, some arrive later than predicted, but they do come true. It's very rare that they do not come true. It's sort of like, you know, if there's some radical technology prediction, the point is not that it was a few years late, but that it happened at all. That's the more important part.
CA: So it feels like at some point in the last year, seeing the progress on understanding, the Tesla AI understanding the world around it, led to a kind of, an aha moment at Tesla. Because you really surprised people recently when you said probably the most important product development going on at Tesla this year is this robot, Optimus.

EM: Yes.

CA: Many companies out there have tried to put out these robots, they've been working on them for years. And so far no one has really cracked it. There's no mass-adoption robot in people's homes. There are some in manufacturing, but I would say, no one's kind of, really cracked it. Is it something that happened in the development of full self-driving that gave you the confidence to say, "You know what, we could do something special here"?

EM: Yeah, exactly. So, you know, it took me a while to sort of realize that in order to solve self-driving, you really needed to solve real-world AI. And at the point at which you solve real-world AI for a car, which is really a robot on four wheels, you can then generalize that to a robot on legs as well. The two hard parts I think -- like obviously companies like Boston Dynamics have shown that it's possible to make quite compelling, sometimes alarming robots.

CA: Right.

EM: You know, so from a sensors and actuators standpoint, it's certainly been demonstrated by many that it's possible to make a humanoid robot. The things that are currently missing are enough intelligence for the robot to navigate the real world and do useful things without being explicitly instructed. So the missing things are basically real-world intelligence and scaling up manufacturing. Those are two things that Tesla is very good at. And so then we basically just need to design the specialized actuators and sensors that are needed for a humanoid robot. People have no idea, this is going to be bigger than the car.
CA: So let's dig into exactly that. I mean, in one way, it's actually an easier problem than full self-driving, because instead of an object going along at 60 miles an hour, which if it gets it wrong, someone will die. This is an object that's engineered to only go at what, three or four or five miles an hour. And so a mistake, there aren't lives at stake. There might be embarrassment at stake.

EM: So long as the AI doesn't take it over and murder us in our sleep or something.

CA: Right. (Laughter) So talk about -- I think the first applications you've mentioned are probably going to be manufacturing, but eventually the vision is to have these available for people at home. If you had a robot that really understood the 3D architecture of your house and knew where every object in that house was or was supposed to be, and could recognize all those objects, I mean, that's kind of amazing, isn't it? Like the kind of thing that you could ask a robot to do would be what? Like, tidy up?

EM: Yeah, absolutely. Make dinner, I guess, mow the lawn.

CA: Take a cup of tea to grandma and show her family pictures.

EM: Exactly. Take care of my grandmother and make sure --

CA: It could obviously recognize everyone in the home. It could play catch with your kids.

EM: Yes. I mean, obviously, we need to be careful this doesn't become a dystopian situation. I think one of the things that's going to be important is to have a localized ROM chip on the robot that cannot be updated over the air. Where if you, for example, were to say, "Stop, stop, stop," if anyone said that, then the robot would stop, you know, type of thing. And that's not updatable remotely. I think it's going to be important to have safety features like that.
CA: Yeah, that sounds wise.

EM: And I do think there should be a regulatory agency for AI. I've said that for many years. I don't love being regulated, but I think this is an important thing for public safety.

CA: Let's come back to that. But I don't think many people have really sort of taken seriously the notion of, you know, a robot at home. I mean, at the start of the computing revolution, Bill Gates said there's going to be a computer in every home. And people at the time said, yeah, whatever, who would even want that. Do you think there will be basically like in, say, 2050 or whatever, like a robot in most homes, is what there will be, and people will love them and count on them? You'll have your own butler basically.

EM: Yeah, you'll have your sort of buddy robot probably, yeah.

CA: I mean, how much of a buddy? How many applications have you thought, you know, can you have a romantic partner, a sex partner?

EM: It's probably inevitable. I mean, I did promise the internet that I'd make catgirls. We could make a robot catgirl.

CA: Be careful what you promise the internet. (Laughter)

EM: So, yeah, I guess it'll be whatever people want really, you know.
CA: What sort of timeline should we be thinking about of the first models that are actually made and sold?

EM: Well, you know, the first units that we intend to make are for jobs that are dangerous, boring, repetitive, and things that people don't want to do. And, you know, I think we'll have like an interesting prototype sometime this year. We might have something useful next year, but I think quite likely within at least two years. And then we'll see rapid growth year over year of the usefulness of the humanoid robots and decrease in cost and scaling up production.

CA: Initially just selling to businesses, or when do you picture you'll start selling them where you can buy your parents one for Christmas or something?

EM: I'd say in less than ten years.

CA: Help me on the economics of this. So what do you picture the cost of one of these being?

EM: Well, I think the cost is actually not going to be crazy high. Like less than a car. Initially, things will be expensive because it'll be a new technology at low production volume. The complexity and cost of a car is greater than that of a humanoid robot. So I would expect that it's going to be less than a car, or at least equivalent to a cheap car.
CA: So even if it starts at 50k, within a few years, it's down to 20k or lower or whatever. And maybe for home they'll get much cheaper still. But think about the economics of this. If you can replace a $30,000, $40,000-a-year worker, which you have to pay every year, with a one-time payment of $25,000 for a robot that can work longer hours, that's a pretty rapid replacement of certain types of jobs. How worried should the world be about that?

EM: I wouldn't worry about the sort of, putting people out of a job thing. I think we're actually going to have, and already do have, a massive shortage of labor. So I think we will have ... not people out of work, but actually still a shortage of labor, even in the future. But this really will be a world of abundance. Any goods and services will be available to anyone who wants them. It'll be so cheap to have goods and services, it will be ridiculous.
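(A rough sketch of the payback arithmetic Chris lays out above. The worker cost and robot price are the figures from the conversation; the "longer hours" multiplier is an invented assumption, used only for illustration.)

```python
# Back-of-the-envelope payback calculation for the robot-vs-worker comparison above.

annual_worker_cost = 35_000   # midpoint of the $30,000-$40,000-a-year figure quoted above
robot_price = 25_000          # the one-time payment mentioned in the conversation
hours_multiplier = 1.5        # assumption: the robot works ~1.5x the hours of a person

effective_annual_value = annual_worker_cost * hours_multiplier
payback_years = robot_price / effective_annual_value
print(f"Payback in roughly {payback_years:.1f} years under these assumptions")  # ~0.5 years
```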
CA: I'm presuming it should be possible to imagine a bunch of goods and services that can't profitably be made now but could be made in that world, courtesy of legions of robots.

EM: Yeah. It will be a world of abundance. The only scarcity that will exist in the future is that which we decide to create ourselves as humans.

CA: OK. So AI is allowing us to imagine a differently powered economy that will create this abundance. What are you most worried about going wrong?

EM: Well, like I said, AI and robotics will bring out what might be termed the age of abundance. Other people have used this term, and this is my prediction: it will be an age of abundance for everyone. But I guess there's ... The dangers would be the artificial general intelligence or digital superintelligence decouples from a collective human will and goes in the direction that, for some reason, we don't like. Whatever direction it might go. You know, that's sort of the idea behind Neuralink, is to try to more tightly couple the collective human will to digital superintelligence. And also, along the way, solve a lot of brain injuries and spinal injuries and that kind of thing. So even if it doesn't succeed in the greater goal, I think it will succeed in the goal of alleviating brain and spine damage.
CA: So the spirit there is that if we're going to make these AIs that are so vastly intelligent, we ought to be wired directly to them so that we ourselves can have those superpowers more directly. But that doesn't seem to avoid the risk that those superpowers might ... turn ugly in unintended ways.

EM: I think it's a risk, I agree. I'm not saying that I have some certain answer to that risk. I'm just saying, like, maybe one of the things that would be good for ensuring that the future is one that we want is to more tightly couple the collective human will to digital intelligence.

The issue that we face here is that we are already a cyborg, if you think about it. The computers are an extension of ourselves. And when we die, we have, like, a digital ghost. You know, all of our text messages and social media, emails. And it's quite eerie actually, when someone dies but everything online is still there. But you say, like, what's the limitation? What is it that inhibits a human-machine symbiosis? It's the data rate. When you communicate, especially with a phone, you're moving your thumbs very slowly. So you're like moving your two little meat sticks at a rate that's maybe 10 bits per second, optimistically, 100 bits per second. And computers are communicating at the gigabyte level and beyond.
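(Just to make that gap concrete, a tiny calculation under stated assumptions: the 10-100 bits-per-second figure is from the conversation, while "gigabyte level" is taken here as about one gigabyte per second purely for illustration.)

```python
# Rough comparison of the human-output and computer-to-computer data rates discussed above.

thumb_rate_bps = 100              # optimistic thumb-typing rate quoted above, bits/second
computer_rate_bps = 8 * 10**9     # assumption: ~1 gigabyte/second, expressed in bits/second

print(f"Gap of roughly {computer_rate_bps / thumb_rate_bps:,.0f}x")   # ~80,000,000x
```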
CA: Have you seen evidence that the technology is actually working, that you've got a richer, sort of, higher-bandwidth connection, if you like, between, like, external electronics and a brain than has been possible before?

EM: Yeah. I mean, the fundamental principles of reading neurons, sort of doing read-write on neurons with tiny electrodes, have been demonstrated for decades. So it's not like the concept is new. The problem is that there is no product that works well that you can go and buy. So it's all sort of, in research labs. And it's like some cords sticking out of your head. And it's quite gruesome, and it's really ... There's no good product that actually does a good job and is high-bandwidth and safe and something actually that you could buy and would want to buy.

But the way to think of the Neuralink device is kind of like a Fitbit or an Apple Watch. That's where we take out sort of a small section of skull about the size of a quarter, replace that with what, in many ways, really is very much like a Fitbit, Apple Watch or some kind of smart watch thing. But with tiny, tiny wires, very, very tiny wires. Wires so tiny, it's hard to even see them. And it's very important to have very tiny wires so that when they're implanted, they don't damage the brain.
CA: How far are you from putting these into humans?

EM: Well, we have put in our FDA application to aspirationally do the first human implant this year.

CA: The first uses will be for neurological injuries of different kinds. But rolling the clock forward and imagining when people are actually using these for their own enhancement, let's say, and for the enhancement of the world, how clear are you in your mind as to what it will feel like to have one of these inside your head?

EM: Well, I do want to emphasize we're at an early stage. And so it really will be many years before we have anything approximating a high-bandwidth neural interface that allows for AI-human symbiosis. For many years, we will just be solving brain injuries and spinal injuries. For probably a decade. This is not something where suddenly, one day, it will have this incredible sort of whole-brain interface. It's going to be, like I said, at least a decade of really just solving brain injuries and spinal injuries. And really, I think you can solve a very wide range of brain injuries, including severe depression, morbid obesity, sleep, potentially schizophrenia -- like, a lot of things that cause great stress to people. Restoring memory in older people.

CA: If you can pull that off, that's the app I will sign up for.

EM: Absolutely.

CA: Please hurry. (Laughs)

EM: I mean, the emails that we get at Neuralink are heartbreaking. I mean, they'll send us just tragic, you know, where someone was sort of, in the prime of life and they had an accident on a motorcycle, and someone who's 25, you know, can't even feed themselves. And this is something we could fix.
CA: But you have said that AI is one of the things you're most worried about and that Neuralink may be one of the ways where we can keep abreast of it.

EM: Yeah, there's the short-term thing, which I think is helpful on an individual human level with injuries. And then the long-term thing is an attempt to address the civilizational risk of AI by bringing digital intelligence and biological intelligence closer together. I mean, if you think of how the brain works today, there are really two layers to the brain. There's the limbic system and the cortex. You've got the kind of, animal brain where -- it's kind of like the fun part, really.

CA: It's where most of Twitter operates, by the way.

EM: I think Tim Urban said, we're like somebody, you know, stuck a computer on a monkey. You know, so we're like, if you gave a monkey a computer, that's our cortex. But we still have a lot of monkey instincts. Which we then try to rationalize as, no, it's not a monkey instinct. It's something more important than that. But it's often just really a monkey instinct. We're just monkeys with a computer stuck in our brain.

But even though the cortex is sort of the smart, or the intelligent part of the brain, the thinking part of the brain, I've not yet met anyone who wants to delete their limbic system or their cortex. They're quite happy having both. Everyone wants both parts of their brain. And people really want their phones and their computers, which are really the tertiary, the third part of your intelligence. It's just that it's ... like the bandwidth, the rate of communication with that tertiary layer is slow. And it's just a very tiny straw to this tertiary layer. And we want to make that tiny straw a big highway.

And I'm definitely not saying that this is going to solve everything. Or this is, you know, it's the only thing -- it's something that might be helpful. And worst-case scenario, I think we solve some important brain injury, spinal injury issues, and that's still a great outcome.
CA: Best-case scenario, we may discover new human possibility, telepathy, you've spoken of, in a way, a connection with a loved one, you know, full memory and much faster thought processing maybe. All these things. It's very cool. If AI were to take down Earth, we need a plan B. Let's shift our attention to space. We spoke last time at TED about reusability, and you had just demonstrated that spectacularly for the first time. Since then, you've gone on to build this monster rocket, Starship, which kind of changes the rules of the game in spectacular ways. Tell us about Starship.

EM: Starship is extremely fundamental. So the holy grail of rocketry or space transport is full and rapid reusability. This has never been achieved. The closest that anything has come is our Falcon 9 rocket, where we are able to recover the first stage, the boost stage, which is probably about 60 percent of the cost of the vehicle, of the whole launch, maybe 70 percent. And we've now done that over a hundred times. So with Starship, we will be recovering the entire thing. Or at least that's the goal.

CA: Right.

EM: And moreover, recovering it in such a way that it can be immediately re-flown. Whereas with Falcon 9, we still need to do some amount of refurbishment to the booster and to the fairing nose cone. But with Starship, the design goal is immediate re-flight. So you just refill propellants and go again. And this is gigantic. Just as it would be in any other mode of transport.
CA: And the main design is to basically take 100-plus people at a time, plus a bunch of things that they need, to Mars. So, first of all, talk about that piece. What is your latest timeline? One, for the first time, a Starship goes to Mars, presumably without people, but just equipment. Two, with people. Three, there's sort of, OK, 100 people at a time, let's go.

EM: Sure. And just to put the cost thing into perspective, the expected cost of Starship, putting 100 tons into orbit, is significantly less than what it would have cost, or what it did cost, to put our tiny Falcon 1 rocket into orbit. Just as the cost of flying a 747 around the world is less than the cost of a small airplane. You know, a small airplane that was thrown away. So it's really pretty mind-boggling that the giant thing costs less, way less, than the small thing. So it doesn't use exotic propellants or things that are difficult to obtain on Mars. It uses methane as fuel, and it's primarily oxygen, roughly 77-78 percent oxygen by weight. And Mars has a CO2 atmosphere and has water ice, which is CO2 plus H2O, so you can make CH4, methane, and O2, oxygen, on Mars.
CA: Presumably, one of the first tasks on Mars will be to create a fuel plant that can create the fuel for the return trips of many Starships.

EM: Yes. And actually, it's mostly going to be oxygen plants, because it's 78 percent oxygen, 22 percent fuel. But the fuel is a simple fuel that is easy to create on Mars. And in many other parts of the solar system. So basically ... And it's all propulsive landing, no parachutes, nothing thrown away. It has a heat shield that's capable of entering on Earth or Mars. We can even potentially go to Venus, but you don't want to go there. (Laughs) Venus is hell, almost literally. But you could ... It's a generalized method of transport to anywhere in the solar system, because at the point at which you have a propellant depot on Mars, you can then travel to the asteroid belt and to the moons of Jupiter and Saturn and ultimately anywhere in the solar system.
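(A quick sanity check of those propellant numbers, as a sketch. The net in-situ chemistry, Sabatier methane synthesis plus water electrolysis, can be summarized as CO2 + 2 H2O -> CH4 + 2 O2; the engine mixture ratio used below is an assumption added for illustration, not a figure from the conversation.)

```python
# Mass-fraction check for the methane/oxygen propellant discussed above.
# Net in-situ production on Mars: CO2 + 2 H2O -> CH4 + 2 O2 (driven by solar power).

M_CH4 = 16.04   # g/mol
M_O2  = 32.00   # g/mol

# At the stoichiometric output ratio (2 mol O2 per mol CH4):
stoich_oxygen_fraction = (2 * M_O2) / (2 * M_O2 + M_CH4)
print(f"Stoichiometric output: {stoich_oxygen_fraction:.0%} oxygen by mass")   # ~80%

# Engines run a bit fuel-rich of that; an oxidizer-to-fuel mass ratio of roughly 3.5-3.6
# (an assumed value) reproduces the "roughly 77-78 percent oxygen by weight" figure above.
for of_ratio in (3.5, 3.6):
    print(f"O/F = {of_ratio}: {of_ratio / (1 + of_ratio):.0%} oxygen by mass")
```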
CA: But your main focus, and SpaceX's main focus, is still Mars. That is the mission. That is where most of the effort will go? Or are you actually imagining a much broader array of uses even in the coming, you know, the first decade or so of uses of this. Where we could go, for example, to other places in the solar system to explore, perhaps NASA wants to use the rocket for that reason.

EM: Yeah, NASA is planning to use a Starship to return to the moon, to return people to the moon. And so we're very honored that NASA has chosen us to do this. But I'm saying it is a generalized -- it's a general solution to getting anywhere in the greater solar system. It's not suitable for going to another star system, but it is a general solution for transport anywhere in the solar system.

CA: Before it can do any of that, it's got to demonstrate it can get into orbit, you know, around Earth. What's your latest advice on the timeline for that?

EM: It's looking promising for us to have an orbital launch attempt in a few months. So we're actually integrating -- will be integrating the engines into the booster for the first orbital flight starting in about a week or two. And the launch complex itself is ready to go. So assuming we get regulatory approval, I think we could have an orbital launch attempt within a few months.
CA: And a radical new technology like this, presumably there is real risk on those early attempts.

EM: Oh, 100 percent, yeah. The joke I make all the time is that excitement is guaranteed. Success is not guaranteed, but excitement certainly is.

CA: But the last I saw on your timeline, you've slightly put back the expected date to put the first human on Mars till 2029, I want to say?

EM: Yeah, I mean, so let's see. I mean, we have built a production system for Starship, so we're making a lot of ships and boosters.

CA: How many are you planning to make actually?

EM: Well, we're currently expecting to make a booster and a ship roughly every, well, initially, roughly every couple of months, and then hopefully by the end of this year, one every month. So it's giant rockets, and a lot of them. Just talking in terms of rough orders of magnitude, in order to create a self-sustaining city on Mars, I think you will need something on the order of a thousand ships. And we just need a Helen of Sparta, I guess, on Mars.

CA: This is not in most people's heads, Elon.

EM: The planet that launched 1,000 ships.
44:24
CA: That's nice.
827
2664574
1168
44:25
But this is not in most people's heads,
828
2665784
1877
44:27
this picture that you have in your mind.
829
2667661
1918
44:29
There's basically a two-year window,
830
2669621
1752
44:31
you can only really fly to Mars conveniently every two years.
831
2671373
2919
44:34
You were picturing that during the 2030s,
832
2674292
4797
44:39
every couple of years,
833
2679089
1376
44:40
something like 1,000 Starships take off,
834
2680507
3003
44:43
each containing 100 or more people.
835
2683552
1960
44:45
That picture is just completely mind-blowing to me.
836
2685512
5464
44:51
That sense of this armada of humans going to --
837
2691393
3378
44:54
EM: It'll be like "Battlestar Galactica," the fleet departs.
838
2694813
2878
44:57
CA: And you think that it can basically be funded by people
839
2697691
2794
45:00
spending maybe a couple hundred grand on a ticket to Mars?
840
2700485
3254
45:03
Is that price about where it has been?
841
2703739
2752
45:07
EM: Well, I think if you say like,
842
2707367
1627
45:08
what's required in order to get enough people and enough cargo to Mars
843
2708994
4755
45:13
to build a self-sustaining city.
844
2713790
2586
45:17
And it's where you have an intersection
845
2717377
1919
45:19
of sets of people who want to go,
846
2719296
2669
45:21
because I think only a small percentage of humanity will want to go,
847
2721965
5047
45:27
and can afford to go or get sponsorship in some manner.
848
2727012
3754
45:31
That intersection of sets, I think,
849
2731391
1710
45:33
needs to be a million people or something like that.
850
2733101
2461
45:36
And so it’s what can a million people afford, or get sponsorship for,
851
2736646
3754
45:40
because I think governments will also pay for it,
852
2740442
2377
45:42
and people can take out loans.
853
2742819
3003
45:45
But I think at the point at which you say, OK, like,
854
2745864
3712
45:49
if the cost of moving to Mars is, for argument’s sake, $100,000,
855
2749618
6381
45:56
then I think you know, almost anyone can work and save up
856
2756041
5172
46:01
and eventually have $100,000 and be able to go to Mars if they want.
857
2761213
4045
46:05
We want to make it available to anyone who wants to go.
858
2765300
2669
46:10
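(As a rough check of the funding scale implied here, the following is a minimal Python sketch using only the figures mentioned in the conversation: roughly a million people, and an illustrative ticket price of $100,000. Both are Musk's order-of-magnitude examples, not confirmed prices or plans.)

```python
# Rough check of the funding scale implied by the figures above.
# Both numbers are the illustrative values from the conversation,
# not confirmed prices or plans.

settlers = 1_000_000        # roughly the population a self-sustaining city needs
ticket_price_usd = 100_000  # the "for argument's sake" cost of moving to Mars

total_usd = settlers * ticket_price_usd
print(f"Implied ticket revenue: ${total_usd / 1e9:.0f} billion")
# -> Implied ticket revenue: $100 billion
```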
It's very important to emphasize that Mars, especially in the beginning,
859
2770263
4171
46:14
will not be luxurious.
860
2774434
1293
46:15
It will be dangerous, cramped, difficult, hard work.
861
2775727
5672
46:22
It's kind of like that Shackleton ad for going to the Antarctic,
862
2782025
3170
46:25
which I think is actually not real, but it sounds real and it's cool.
863
2785237
3336
46:28
It's sort of like, the sales pitch for going to Mars is,
864
2788865
2878
46:31
"It's dangerous, it's cramped.
865
2791785
2878
46:35
You might not make it back.
866
2795956
1501
46:38
It's difficult, it's hard work."
867
2798750
1543
46:40
That's the sales pitch.
868
2800293
1168
46:41
CA: Right.
869
2801503
1168
46:42
But you will make history.
870
2802712
1252
46:44
EM: But it'll be glorious.
871
2804756
1585
46:47
CA: So on that kind of launch rate you're talking about over two decades,
872
2807050
3462
46:50
you could get your million people to Mars, essentially.
873
2810554
3628
46:54
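(The launch-cadence arithmetic behind that picture can be checked directly. A minimal sketch, assuming only the numbers stated in the conversation: about 1,000 ships per Mars transfer window, 100 or more passengers per ship, and a window roughly every two years over two decades.)

```python
# Back-of-envelope check of the "million people over two decades" picture,
# using only the order-of-magnitude numbers stated in the conversation.

ships_per_window = 1_000   # Starships departing in each Mars transfer window
people_per_ship = 100      # "each containing 100 or more people"
window_interval_years = 2  # Earth-Mars windows open roughly every two years
campaign_years = 20        # "over two decades"

windows = campaign_years // window_interval_years      # ~10 departure opportunities
total_people = windows * ships_per_window * people_per_ship
print(f"{windows} windows x {ships_per_window} ships x {people_per_ship} people "
      f"= {total_people:,} people")
# -> 10 windows x 1000 ships x 100 people = 1,000,000 people
```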
Whose city is it?
874
2814224
1126
46:55
Is it NASA's city, is it SpaceX's city?
875
2815392
1918
46:57
EM: It’s the people of Mars’ city.
876
2817352
1627
47:01
The reason for this, I mean, I feel like why do this thing?
877
2821106
4629
47:05
I think this is important for maximizing
878
2825735
4338
47:10
the probable lifespan of humanity or consciousness.
879
2830115
3044
47:13
Human civilization could come to an end for external reasons,
880
2833201
4004
47:17
like a giant meteor or super volcanoes or extreme climate change.
881
2837247
5172
47:24
Or World War III, or you know, any one of a number of reasons.
882
2844045
5756
47:32
But the probable life span of civilizational consciousness
883
2852929
2878
47:35
as we know it,
884
2855849
1585
47:37
which we should really view as this very delicate thing,
885
2857475
3129
47:40
like a small candle in a vast darkness.
886
2860645
2711
47:43
That is what appears to be the case.
887
2863356
2962
47:47
We're in this vast darkness of space,
888
2867777
3379
47:51
and there's this little candle of consciousness
889
2871197
3421
47:54
that’s only really come about after 4.5 billion years,
890
2874659
4463
47:59
and it could just go out.
891
2879164
2002
48:01
CA: I think that's powerful,
892
2881166
1376
48:02
and I think a lot of people will be inspired by that vision.
893
2882584
2836
48:05
And the reason you need the million people
894
2885420
2127
48:07
is because there has to be enough people there
895
2887547
2169
48:09
to do everything that you need to survive.
896
2889758
2419
48:13
EM: Really, like the critical threshold is if the ships from Earth stop coming
897
2893136
6840
48:20
for any reason,
898
2900018
2544
48:22
does the Mars City die out or not?
899
2902604
4379
48:27
And so we have to --
900
2907400
2086
48:29
You know, people talk about like, the sort of, the great filters,
901
2909527
3129
48:32
the things that perhaps, you know,
902
2912697
3087
48:35
we talk about the Fermi paradox, and where are the aliens?
903
2915825
2711
48:38
Well maybe there are these various great filters
904
2918536
2294
48:40
that the aliens didn’t pass,
905
2920872
1418
48:42
and so they eventually just ceased to exist.
906
2922290
4588
48:46
And one of the great filters is becoming a multi-planet species.
907
2926920
3128
48:50
So we want to pass that filter.
908
2930674
2210
48:54
And I'll be long-dead before this is, you know, a real thing,
909
2934302
6006
49:00
before it happens.
910
2940350
1251
49:01
But I’d like to at least see us make great progress in this direction.
911
2941601
5297
49:07
CA: Given how tortured the Earth is right now,
912
2947315
2503
49:09
how much we're beating each other up,
913
2949859
2878
49:12
shouldn't there be discussions going on
914
2952737
2795
49:15
with everyone who is dreaming about Mars to try to say,
915
2955573
4171
49:19
we've got a once-in-a-civilization chance
916
2959786
5172
49:24
to make some new rules here?
917
2964958
2002
49:27
Should someone be trying to lead those discussions
918
2967002
3753
49:30
to figure out what it means for this to be the people of Mars' City?
919
2970755
3879
49:35
EM: Well, I think ultimately
920
2975093
1376
49:36
this will be up to the people of Mars to decide
921
2976469
2211
49:38
how they want to rethink society.
922
2978722
4045
49:43
Yeah there’s certainly risk there.
923
2983101
1627
49:45
And hopefully the people of Mars will be more enlightened
924
2985395
4630
49:50
and will not fight amongst each other too much.
925
2990066
2711
49:54
I mean, I have some recommendations,
926
2994279
1752
49:56
which people of Mars may choose to listen to or not.
927
2996031
3962
50:00
I would advocate for more of a direct democracy,
928
3000035
2794
50:02
not a representative democracy,
929
3002829
2211
50:05
and laws that are short enough for people to understand,
930
3005040
2711
50:08
where it is harder to create laws than to get rid of them.
931
3008168
5380
50:14
CA: Coming back a bit nearer term,
932
3014424
1668
50:16
I'd love you to just talk a bit about some of the other possibility space
933
3016134
3462
50:19
that Starship seems to have created.
934
3019596
3712
50:23
So given --
935
3023349
1377
50:24
Suddenly we've got this ability to move 100 tons-plus into orbit.
936
3024768
3503
50:29
So we've just launched the James Webb telescope,
937
3029230
3170
50:32
which is an incredible thing.
938
3032400
2002
50:34
It's unbelievable.
939
3034444
1126
50:35
EM: Exquisite piece of technology.
940
3035612
1793
50:37
CA: Exquisite piece of technology.
941
3037447
1627
50:39
But people spent two years trying to figure out how to fold up this thing.
942
3039115
3504
50:42
It's a three-ton telescope.
943
3042660
1335
50:44
EM: We can make it a lot easier if you’ve got more volume and mass.
944
3044037
3170
50:47
CA: But let's ask a different question.
945
3047207
1877
50:49
Which is, how much more powerful a telescope could someone design
946
3049084
6756
50:55
based on using Starship, for example?
947
3055882
2878
50:59
EM: I mean, roughly, I'd say it's probably an order of magnitude more resolution.
948
3059469
4546
51:04
If you've got 100 tons and a thousand cubic meters volume,
949
3064057
3211
51:07
which is roughly what we have.
950
3067268
1585
51:08
CA: And what about other exploration through the solar system?
951
3068895
3545
51:12
I mean, I'm you know --
952
3072482
2169
51:14
EM: Europa is a big question mark.
953
3074692
2670
51:17
CA: Right, so there's an ocean there.
954
3077403
1794
51:19
And what you really want to do is to drop a submarine into that ocean.
955
3079239
3295
51:22
EM: Maybe there's like, some squid civilization,
956
3082534
2294
51:24
cephalopod civilization under the ice of Europa.
957
3084869
3212
51:28
That would be pretty interesting.
958
3088123
1626
51:29
CA: I mean, Elon, if you could take a submarine to Europa
959
3089749
2711
51:32
and we see pictures of this thing being devoured by a squid,
960
3092460
3629
51:36
that would honestly be the happiest moment of my life.
961
3096131
2544
51:38
EM: Pretty wild, yeah.
962
3098716
1377
51:40
CA: What other possibilities are out there?
963
3100426
2795
51:43
Like, it feels like if you're going to create a thousand of these things,
964
3103263
4379
51:47
they can only fly to Mars every two years.
965
3107642
3212
51:50
What are they doing the rest of the time?
966
3110895
2169
51:53
It feels like there's this explosion of possibility
967
3113106
4671
51:57
that I don't think people are really thinking about.
968
3117777
2461
52:00
EM: I don't know, we've certainly got a long way to go.
969
3120238
2628
52:02
As you alluded to earlier, we still have to get to orbit.
970
3122866
2752
52:05
And then after we get to orbit,
971
3125618
1752
52:07
we have to really prove out and refine full and rapid reusability.
972
3127412
6006
52:14
That'll take a moment.
973
3134085
1210
52:19
But I do think we will solve this.
974
3139090
1752
52:22
I'm highly confident we will solve this at this point.
975
3142969
2711
52:26
CA: Do you ever wake up with the fear
976
3146014
1793
52:27
that there's going to be this Hindenburg moment for SpaceX where ...
977
3147849
3462
52:31
EM: We've had many Hindenburg moments.
978
3151811
1460
52:33
Well, we've never had Hindenburg moments with people, which is very important.
979
3153313
3670
52:37
Big difference.
980
3157025
1251
52:38
We've blown up quite a few rockets.
981
3158776
1710
52:40
So there's a whole compilation online that we put together
982
3160486
3504
52:44
and others put together,
983
3164032
1710
52:45
showing that rockets are hard.
984
3165742
1960
52:47
I mean, the sheer amount of energy going through a rocket boggles the mind.
985
3167744
3795
52:51
So, you know, getting out of Earth's gravity well is difficult.
986
3171581
3503
52:55
We have a strong gravity and a thick atmosphere.
987
3175126
2377
52:59
And Mars, which is less than 40 percent,
988
3179422
3921
53:03
it's like, 37 percent of Earth's gravity
989
3183426
2711
53:06
and has a thin atmosphere.
990
3186179
1626
53:08
The ship alone can go all the way
991
3188097
2002
53:10
from the surface of Mars to the surface of Earth.
992
3190141
2419
53:12
Whereas getting to Mars requires a giant booster and orbital refilling.
993
3192602
4755
53:17
CA: So, Elon, as I think more about this incredible array of things
994
3197774
4796
53:22
that you're involved with,
995
3202612
1835
53:24
I keep seeing these synergies,
996
3204489
4296
53:28
to use a horrible word,
997
3208826
1877
53:30
between them.
998
3210745
1168
53:31
You know, for example,
999
3211955
1167
53:33
the robots you're building at Tesla could possibly be pretty handy on Mars,
1000
3213122
5756
53:38
doing some of the dangerous work and so forth.
1001
3218920
2169
53:41
I mean, maybe there's a scenario where your city on Mars
1002
3221089
2669
53:43
doesn't need a million people,
1003
3223758
1460
53:45
it needs half a million people and half a million robots.
1004
3225218
2711
53:47
And that's a possibility.
1005
3227971
1835
53:49
Maybe The Boring Company could play a role
1006
3229847
2211
53:52
helping create some of the subterranean dwelling spaces that you might need.
1007
3232100
5380
53:57
EM: Yeah.
1008
3237480
1210
53:58
CA: Back on planet Earth,
1009
3238982
1501
54:00
it seems like a partnership between Boring Company and Tesla
1010
3240525
3211
54:03
could offer an unbelievable deal to a city
1011
3243778
3879
54:07
to say, we will create for you a 3D network of tunnels
1012
3247699
4504
54:12
populated by robo-taxis
1013
3252245
2252
54:14
that will offer fast, low-cost transport to anyone.
1014
3254539
4254
54:18
You know, full self-driving may or may not be done this year.
1015
3258835
2878
54:21
And in some cities, like, somewhere like Mumbai,
1016
3261713
2794
54:24
I suspect it won't be done for a decade.
1017
3264549
2043
54:26
EM: Some places are more challenging than others.
1018
3266634
2294
54:28
CA: But today, today, with what you've got,
1019
3268928
2378
54:31
you could put a 3D network of tunnels under there.
1020
3271306
4254
54:35
EM: Oh, if it’s just in a tunnel, that’s a solved problem.
1021
3275601
2795
54:38
CA: Exactly, full self-driving is a solved problem.
1022
3278438
2502
54:40
To me, there’s amazing synergy there.
1023
3280982
3045
54:44
With Starship,
1024
3284068
1752
54:45
you know, Gwynne Shotwell talked about having, by 2028, city-to-city
1025
3285820
5339
54:51
transport on planet Earth.
1026
3291200
1752
54:52
EM: This is a real possibility.
1027
3292952
1627
54:57
The fastest way to get from one place to another,
1028
3297290
2961
55:00
if it's a long distance, is a rocket.
1029
3300293
1793
55:03
It's basically an ICBM.
1030
3303087
1377
55:05
CA: But it has to land --
1031
3305673
1335
55:07
Because it's an ICBM, it has to land probably offshore,
1032
3307675
3879
55:11
because it's loud.
1033
3311596
1168
55:12
So why not have a tunnel that then connects to the city with Tesla?
1034
3312805
6632
55:20
And Neuralink.
1035
3320897
1126
55:22
I mean, if you're going to go to Mars,
1036
3322065
1626
55:23
having a telepathic connection with loved ones back home,
1037
3323733
2878
55:26
even if there's a time delay...
1038
3326611
1501
55:29
EM: These are not intended to be connected, by the way.
1039
3329238
4088
55:33
But there certainly could be some synergies, yeah.
1040
3333326
2419
55:35
CA: Surely there is a growing argument
1041
3335787
2294
55:38
that you should actually put all these things together
1042
3338081
2711
55:40
into one company
1043
3340792
2127
55:42
and just have a company devoted to creating a future
1044
3342960
4296
55:47
that’s exciting,
1045
3347298
1460
55:48
and let a thousand flowers bloom.
1046
3348758
1627
55:50
Have you been thinking about that?
1047
3350426
1627
55:53
EM: I mean, it is tricky because Tesla is a publicly-traded company,
1048
3353429
3253
55:56
and the investor base of Tesla and SpaceX
1049
3356724
5255
56:02
and certainly Boring Company and Neuralink are quite different.
1050
3362021
3379
56:05
Boring Company and Neuralink are tiny companies.
1051
3365441
2503
56:08
CA: By comparison.
1052
3368569
1418
56:10
EM: Yeah, Tesla's got 110,000 people.
1053
3370321
3629
56:14
SpaceX I think is around 12,000 people.
1054
3374283
2503
56:17
Boring Company and Neuralink are both under 200 people.
1055
3377161
4546
56:21
So they're little, tiny companies,
1056
3381749
3170
56:24
but they will probably get bigger in the future.
1057
3384961
2252
56:27
They will get bigger in the future.
1058
3387213
1710
56:29
It's not that easy to sort of combine these things.
1059
3389632
2544
56:33
CA: Traditionally, you have said that for SpaceX especially,
1060
3393761
2878
56:36
you wouldn't want it public,
1061
3396639
1418
56:38
because public investors wouldn't support the craziness of the idea
1062
3398057
4296
56:42
of going to Mars or whatever.
1063
3402395
1418
56:43
EM: Yeah, making life multi-planetary
1064
3403813
2044
56:45
is outside of the normal time horizon of Wall Street analysts.
1065
3405898
5881
56:51
(Laughs)
1066
3411779
1001
56:52
To say the least.
1067
3412864
1209
56:54
CA: I think something's changed, though.
1068
3414073
2586
56:56
What's changed is that Tesla is now so powerful and so big
1069
3416701
2753
56:59
and throws off so much cash
1070
3419495
2461
57:01
that you actually could connect the dots here.
1071
3421998
3503
57:05
Just tell the public that x-billion dollars a year, whatever your number is,
1072
3425501
4546
57:10
will be diverted to the Mars mission.
1073
3430089
3420
57:13
I suspect you'd have massive interest in that company.
1074
3433551
3462
57:17
And it might unlock a lot more possibility for you, no?
1075
3437054
4922
57:22
EM: I would like to give the public access to ownership of SpaceX,
1076
3442018
5797
57:27
but I mean the thing that like,
1077
3447815
2711
57:30
the overhead associated with a public company is high.
1078
3450568
5130
57:38
I mean, as a public company, you're just constantly sued.
1079
3458284
2711
57:41
It does occupy like, a fair bit of ...
1080
3461037
3003
57:45
You know, time and effort to deal with these things.
1081
3465958
3546
57:49
CA: But you would still only have one public company, it would be bigger,
1082
3469504
3628
57:53
and have more things going on.
1083
3473174
1752
57:54
But instead of being on four boards, you'd be on one.
1084
3474967
2711
57:57
EM: I'm actually not even on the Neuralink or Boring Company boards.
1085
3477720
3337
58:02
And I don't really attend the SpaceX board meetings.
1086
3482099
3671
58:06
We only have two a year,
1087
3486103
1210
58:07
and I just stop by and chat for an hour.
1088
3487313
2211
58:13
The board overhead for a public company is much higher.
1089
3493110
2837
58:15
CA: I think some investors probably worry about how your time is being split,
1090
3495988
3712
58:19
and they might be excited by, you know, that.
1091
3499742
2669
58:22
Anyway, I just woke up the other day
1092
3502495
3253
58:25
thinking, just, there are so many ways in which these things connect.
1093
3505790
3420
58:29
And you know, just the simplicity of that mission,
1094
3509252
3837
58:33
of building a future that is worth getting excited about,
1095
3513130
3546
58:36
might appeal to an awful lot of people.
1096
3516676
3461
58:41
Elon, you are reported by Forbes and everyone else as now, you know,
1097
3521013
5381
58:46
the world's richest person.
1098
3526435
1752
58:48
EM: That’s not a sovereign.
1099
3528187
1293
58:49
CA: (Laughs)
1100
3529564
1001
58:50
EM: You know, I think it’s fair to say
1101
3530606
1835
58:52
that if somebody is like, the king or de facto king of a country,
1102
3532483
4671
58:57
they're wealthier than I am.
1103
3537154
1961
58:59
CA: But it’s just harder to measure --
1104
3539323
2711
59:02
So $300 billion.
1105
3542285
1418
59:03
I mean, your net worth on any given day
1106
3543744
3838
59:07
is rising or falling by several billion dollars.
1107
3547623
3045
59:10
How insane is that?
1108
3550710
2127
59:12
EM: It's bonkers, yeah.
1109
3552878
1168
59:14
CA: I mean, how do you handle that psychologically?
1110
3554088
2586
59:16
There aren't many people in the world who have to even think about that.
1111
3556674
3503
59:20
EM: I actually don't think about that too much.
1112
3560177
2211
59:22
But the thing that is actually more difficult
1113
3562430
3587
59:26
and that does make sleeping difficult
1114
3566058
1877
59:27
is that, you know,
1115
3567977
3378
59:31
every good hour or even minute
1116
3571397
3587
59:35
of thinking about Tesla and SpaceX
1117
3575026
4421
59:39
has such a big effect on the company
1118
3579447
2502
59:41
that I really try to work as much as possible,
1119
3581991
3920
59:45
you know, to the edge of sanity, basically.
1120
3585911
3129
59:49
Because, you know, Tesla’s getting to the point --
1121
3589081
3337
59:54
probably will get to that point later this year --
1122
3594920
2211
59:57
where every high-quality minute of thinking
1123
3597131
5047
60:02
has a million-dollar impact on Tesla.
1124
3602219
3671
60:08
Which is insane.
1125
3608517
1544
60:13
I mean, basically, you know, if Tesla is doing, you know,
1126
3613272
4046
60:17
sort of $2 billion a week, let’s say, in revenue,
1127
3617360
3920
60:21
it’s sort of $300 million a day, seven days a week.
1128
3621280
4713
60:26
You know, it's ...
1129
3626535
1335
60:28
CA: If you can change that by five percent in an hour’s brainstorm,
1130
3628829
5548
60:34
that's a pretty valuable hour.
1131
3634418
3128
60:37
EM: I mean, there are many instances where, in a half-hour meeting,
1132
3637546
4797
60:42
I was able to improve the financial outcome of the company
1133
3642385
3378
60:45
by $100 million.
1134
3645763
3629
60:50
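(A quick sanity check of the revenue arithmetic above, taking the roughly $2 billion per week figure at face value; it is Musk's round number in conversation, not a reported financial result.)

```python
# Sanity check: "$2 billion a week" expressed per day.
# The weekly figure is the rough number from the conversation,
# not a reported financial result.

weekly_revenue_usd = 2e9
daily_revenue_usd = weekly_revenue_usd / 7

print(f"~${daily_revenue_usd / 1e6:.0f} million per day")
# -> ~$286 million per day, which he rounds to "sort of $300 million a day"
```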
CA: There are many other people out there
1135
3650476
2044
60:52
who can't stand this world of billionaires.
1136
3652520
2752
60:55
Like, they are hugely offended by the notion
1137
3655314
3921
60:59
that an individual can have the same wealth as, say,
1138
3659276
4588
61:03
a billion or more of the world's poorest people.
1139
3663906
3212
61:07
EM: If they examine sort of --
1140
3667159
2419
61:09
I think there are some axiomatic flaws that are leading them to that conclusion.
1141
3669578
5047
61:15
For sure, it would be very problematic if I was consuming,
1142
3675167
4922
61:20
you know, billions of dollars a year in personal consumption.
1143
3680131
3086
61:23
But that is not the case.
1144
3683259
1209
61:24
In fact, I don't even own a home right now.
1145
3684802
2252
61:27
I'm literally staying at friends' places.
1146
3687096
2336
61:30
If I travel to the Bay Area,
1147
3690141
1543
61:31
which is where most of Tesla engineering is,
1148
3691726
2085
61:33
I basically rotate through friends' spare bedrooms.
1149
3693811
3795
61:38
I don't have a yacht, I really don't take vacations.
1150
3698691
2753
61:44
It’s not as though my personal consumption is high.
1151
3704071
4338
61:49
I mean, the one exception is a plane.
1152
3709243
1793
61:51
But if I don't use the plane, then I have fewer hours to work.
1153
3711078
2878
61:55
CA: I mean, I personally think you have shown that you are mostly driven
1154
3715291
4129
61:59
by really quite a deep sense of moral purpose.
1155
3719420
2502
62:01
Like, your attempts to solve the climate problem
1156
3721964
5589
62:07
have been as powerful as those of anyone else on the planet that I'm aware of.
1157
3727595
4838
62:12
And I actually can't understand,
1158
3732433
2085
62:14
personally, I can't understand the fact
1159
3734518
1877
62:16
that you get all this criticism from the Left about,
1160
3736437
2461
62:18
"Oh, my God, he's so rich, that's disgusting."
1161
3738898
2377
62:21
When climate is their issue.
1162
3741734
2377
62:25
Philanthropy is a topic that some people go to.
1163
3745446
2210
62:27
Philanthropy is a hard topic.
1164
3747698
1668
62:29
How do you think about that?
1165
3749408
1794
62:31
EM: I think if you care about the reality of goodness
1166
3751535
2711
62:34
instead of the perception of it, philanthropy is extremely difficult.
1167
3754246
3796
62:39
SpaceX, Tesla, Neuralink and The Boring Company are philanthropy.
1168
3759126
3921
62:43
If you say philanthropy is love of humanity,
1169
3763464
3086
62:46
they are philanthropy.
1170
3766592
1668
62:49
Tesla is accelerating sustainable energy.
1171
3769553
2878
62:52
This is a love -- philanthropy.
1172
3772473
3545
62:56
SpaceX is trying to ensure the long-term survival of humanity
1173
3776894
3712
63:00
as a multi-planet species.
1174
3780648
1501
63:02
That is love of humanity.
1175
3782191
1543
63:05
You know, Neuralink is trying to help solve brain injuries
1176
3785319
4546
63:09
and existential risk with AI.
1177
3789907
2294
63:12
Love of humanity.
1178
3792243
1167
63:13
Boring Company is trying to solve traffic, which is hell for most people,
1179
3793452
3504
63:16
and that also is love of humanity.
1180
3796956
2627
63:20
CA: How upsetting is it to you
1181
3800084
4296
63:24
to hear this constant drumbeat of,
1182
3804421
3546
63:28
"Billionaires, my God, Elon Musk, oh, my God?"
1183
3808008
2169
63:30
Like, do you just shrug that off
1184
3810219
2961
63:33
or does it actually hurt?
1185
3813222
1627
63:36
EM: I mean, at this point, it's water off a duck's back.
1186
3816559
2794
63:39
CA: Elon, I’d like to, as we wrap up now,
1187
3819353
2544
63:41
just pull the camera back and just think ...
1188
3821939
3378
63:45
You’re a father now of seven surviving kids.
1189
3825359
3504
63:49
EM: Well, I mean, I'm trying to set a good example
1190
3829530
2336
63:51
because the birth rate on Earth is so low
1191
3831907
1919
63:53
that we're facing civilizational collapse
1192
3833868
2043
63:55
unless the birth rate returns to a sustainable level.
1193
3835911
4838
64:01
CA: Yeah, you've talked about this a lot,
1194
3841667
1960
64:03
that depopulation is a big problem,
1195
3843669
2294
64:06
and people don't understand how big a problem it is.
1196
3846005
2460
64:08
EM: Population collapse is one of the biggest threats
1197
3848465
2503
64:10
to the future of human civilization.
1198
3850968
1752
64:12
And that is what is going on right now.
1199
3852761
1877
64:14
CA: What drives you on a day-to-day basis to do what you do?
1200
3854638
2836
64:17
EM: I guess, like, I really want to make sure
1201
3857516
3087
64:20
that there is a good future for humanity
1202
3860644
3462
64:24
and that we're on a path to understanding the nature of the universe,
1203
3864148
5297
64:29
the meaning of life.
1204
3869486
1168
64:30
Why are we here, how did we get here?
1205
3870696
1960
64:33
And in order to understand the nature of the universe
1206
3873490
4422
64:37
and all these fundamental questions,
1207
3877953
3921
64:41
we must expand the scope and scale of consciousness.
1208
3881916
3086
64:47
Certainly it must not diminish or go out.
1209
3887129
1960
64:49
Or we certainly won’t understand this.
1210
3889131
2211
64:51
I would say I’ve been motivated by curiosity more than anything,
1211
3891342
3587
64:54
and just a desire to think about the future
1212
3894929
4337
64:59
and not be sad, you know?
1213
3899308
2544
65:03
CA: And are you?
1214
3903687
1168
65:04
Are you not sad?
1215
3904897
1251
65:06
EM: I'm sometimes sad,
1216
3906607
1209
65:07
but mostly I'm feeling I guess
1217
3907816
4505
65:12
relatively optimistic about the future these days.
1218
3912363
2544
65:15
There are certainly some big risks that humanity faces.
1219
3915699
3921
65:20
I think the population collapse is a really big deal,
1220
3920287
2795
65:23
one that I wish more people would think about
1221
3923123
5130
65:28
because the birth rate is far below what's needed to sustain civilization
1222
3928253
4964
65:33
at its current level.
1223
3933258
1669
65:35
And there's obviously ...
1224
3935594
3212
65:39
We need to take action on climate sustainability,
1225
3939682
2877
65:42
which is being done.
1226
3942601
1919
65:45
And we need to secure the future of consciousness
1227
3945562
2294
65:47
by being a multi-planet species.
1228
3947898
2252
65:51
We need to address --
1229
3951151
1293
65:52
Essentially, it's important to take whatever actions we can think of
1230
3952486
3212
65:55
to address the existential risks that affect the future of consciousness.
1231
3955698
4796
66:00
CA: There's a whole generation coming through
1232
3960536
2127
66:02
who seem really sad about the future.
1233
3962663
1793
66:04
What would you say to them?
1234
3964456
1794
66:07
EM: Well, I think if you want the future to be good, you must make it so.
1235
3967376
3587
66:12
Take action to make it good.
1236
3972256
2419
66:14
And it will be.
1237
3974717
1209
66:17
CA: Elon, thank you for all this time.
1238
3977177
2211
66:19
That is a beautiful place to end.
1239
3979722
1668
66:21
Thanks for all you're doing.
1240
3981390
1376
66:22
EM: You're welcome.
1241
3982766
1210