Elon Musk: A future worth getting excited about | Tesla Texas Gigafactory interview | TED

9,199,010 views · 2022-04-18 · TED

00:00
Chris Anderson: Elon Musk, great to see you.
00:02
How are you?
00:03
Elon Musk: Good. How are you?
00:05
CA: We're here at the Texas Gigafactory the day before this thing opens.
00:09
It's been pretty crazy out there.
00:11
Thank you so much for making time on a busy day.
00:14
I would love you to help us, kind of, cast our minds,
00:18
I don't know, 10, 20, 30 years into the future.
00:22
And help us try to picture what it would take
00:26
to build a future that's worth getting excited about.
00:29
The last time you spoke at TED,
00:31
you said that that was really just a big driver.
00:33
You know, you can talk about lots of other reasons to do the work you're doing,
00:37
but fundamentally, you want to think about the future
00:41
and not think that it sucks.
00:43
EM: Yeah, absolutely.
00:45
I think in general, you know,
00:46
there's a lot of discussion of like, this problem or that problem.
00:51
And a lot of people are sad about the future
00:54
and they're ...
00:58
Pessimistic.
00:59
And I think ...
01:02
this is ...
01:05
This is not great.
01:06
I mean, we really want to wake up in the morning
01:08
and look forward to the future.
01:10
We want to be excited about what's going to happen.
01:15
And life cannot simply be about sort of,
01:20
solving one miserable problem after another.
01:22
CA: So if you look forward 30 years, you know, the year 2050
01:26
has been labeled by scientists
01:29
as this, kind of, almost like this doomsday deadline on climate.
01:33
There's a consensus of scientists, a large consensus of scientists,
01:37
who believe that if we haven't completely eliminated greenhouse gases
01:42
or offset them completely by 2050,
01:46
effectively we're inviting climate catastrophe.
01:49
Do you believe there is a pathway to avoid that catastrophe?
01:53
And what would it look like?
01:56
EM: Yeah, so I am not one of the doomsday people,
02:00
which may surprise you.
02:02
I actually think we're on a good path.
02:07
But at the same time,
02:08
I want to caution against complacency.
02:12
So, so long as we are not complacent,
02:14
as long as we have a high sense of urgency
02:17
about moving towards a sustainable energy economy,
02:21
then I think things will be fine.
02:25
So I can't emphasize that enough,
02:27
as long as we push hard and are not complacent,
02:34
the future is going to be great.
02:35
Don't worry about it.
02:36
I mean, worry about it,
02:37
but if you worry about it, ironically, it will be a self-unfulfilling prophecy.
02:42
So, like, there are three elements to a sustainable energy future.
02:47
One is of sustainable energy generation, which is primarily wind and solar.
02:51
There's also hydro, geothermal,
02:55
I'm actually pro-nuclear.
02:58
I think nuclear is fine.
03:04
But it's going to be primarily solar and wind,
03:07
as the primary generators of energy.
03:11
The second part is you need batteries to store the solar and wind energy
03:15
because the sun doesn't shine all the time,
03:17
the wind doesn't blow all the time.
03:19
So it's a lot of stationary battery packs.
03:23
And then you need electric transport.
03:25
So electric cars, electric planes, boats.
03:27
And then ultimately,
03:30
it's not really possible to make electric rockets,
03:32
but you can make the propellant used in rockets
03:36
using sustainable energy.
03:37
So ultimately, we can have a fully sustainable energy economy.
03:42
And it's those three things:
03:45
solar/wind, stationary battery pack, electric vehicles.
03:48
So then what are the limiting factors on progress?
03:51
The limiting factor really will be battery cell production.
03:55
So that's going to really be the fundamental rate driver.
03:59
And then whatever the slowest element
04:01
of the whole lithium-ion battery cells supply chain,
04:05
from mining and the many steps of refining
04:08
to ultimately creating a battery cell
04:11
and putting it into a pack,
04:12
that will be the limiting factor on progress towards sustainability.
04:16
CA: All right, so we need to talk more about batteries,
04:18
because the key thing that I want to understand,
04:20
like, there seems to be a scaling issue here
04:23
that is kind of amazing and alarming.
04:25
You have said that you have calculated
04:28
that the amount of battery production that the world needs for sustainability
04:34
is 300 terawatt hours of batteries.
04:39
That's the end goal?
04:40
EM: Very rough numbers,
04:41
and I certainly would invite others to check our calculations
04:45
because they may arrive at different conclusions.
04:47
But in order to transition, not just current electricity production,
04:53
but also heating and transport,
04:57
which roughly triples the amount of electricity that you need,
05:01
it amounts to approximately 300 terawatt hours of installed capacity.
05:06
CA: So we need to give people a sense of how big a task that is.
05:11
I mean, here we are at the Gigafactory.
05:13
You know, this is one of the biggest buildings in the world.
05:19
What I've read, and tell me if this is still right,
05:22
is that the goal here is to eventually produce 100 gigawatt hours
05:28
of batteries here a year eventually.
05:31
EM: We will probably do more than that,
05:33
but yes, hopefully we get there within a couple of years.
05:37
CA: Right.
05:38
But I mean, that is one --
05:41
EM: 0.1 terawatt hours.
05:43
CA: But that's still 1/100 of what's needed.
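As a rough illustration of the scale being discussed, the sketch below just re-runs the round numbers quoted in this exchange (roughly 300 terawatt-hours of storage, about 100 gigawatt-hours of cells per year from this factory), plus an assumed ~30-year ramp to 2050 taken from the earlier framing. It is back-of-envelope arithmetic only, not Tesla's actual model.

# Back-of-envelope check of the round numbers quoted in the conversation.
# The ~30-year horizon to 2050 is an assumption from the earlier framing,
# not a quoted figure.

target_storage_twh = 300           # ~300 TWh of batteries needed (Musk's rough estimate)
factory_output_gwh_per_year = 100  # Texas Gigafactory goal: ~100 GWh of cells per year
years_to_2050 = 30                 # assumed ramp horizon implied by the 2050 discussion

factory_output_twh_per_year = factory_output_gwh_per_year / 1_000   # 0.1 TWh/year
required_twh_per_year = target_storage_twh / years_to_2050          # ~10 TWh/year

share = factory_output_twh_per_year / required_twh_per_year
print(f"One factory: {factory_output_twh_per_year} TWh/yr of ~{required_twh_per_year:.0f} TWh/yr "
      f"needed -> roughly {share:.0%} (the '1/100' in the exchange above)")
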
05:46
How much of the rest of that 100 is Tesla planning to take on
05:52
let's say, between now and 2030, 2040,
05:58
when we really need to see the scale up happen?
06:01
EM: I mean, these are just guesses.
06:03
So please, people shouldn't hold me to these things.
06:06
It's not like this is like some --
06:08
What tends to happen is I'll make some like,
06:12
you know, best guess
06:13
and then people, in five years,
06:15
there'll be some jerk that writes an article:
06:17
"Elon said this would happen, and it didn't happen.
06:20
He's a liar and a fool."
06:22
It's very annoying when that happens.
06:25
So these are just guesses, this is a conversation.
06:27
CA: Right.
06:29
EM: I think Tesla probably ends up doing 10 percent of that.
06:35
Roughly.
06:36
CA: Let's say 2050
06:37
we have this amazing, you know, 100 percent sustainable electric grid
06:43
made up of, you know, some mixture of the sustainable energy sources
06:46
you talked about.
06:49
That same grid probably is offering the world
06:52
really low-cost energy, isn't it,
06:54
compared with now.
06:56
And I'm curious about like,
06:59
are people entitled to get a little bit excited
07:04
about the possibilities of that world?
07:06
EM: People should be optimistic about the future.
07:12
Humanity will solve sustainable energy.
07:15
It will happen if we, you know, continue to push hard,
07:20
the future is bright and good from an energy standpoint.
07:25
And then it will be possible to also use that energy to do carbon sequestration.
07:31
It takes a lot of energy to pull carbon out of the atmosphere
07:34
because in putting it in the atmosphere it releases energy.
07:37
So now, you know, obviously in order to pull it out,
07:39
you need to use a lot of energy.
07:41
But if you've got a lot of sustainable energy from wind and solar,
07:45
you can actually sequester carbon.
07:46
So you can reverse the CO2 parts per million of the atmosphere and oceans.
07:52
And also you can really have as much fresh water as you want.
07:57
Earth is mostly water.
07:58
We should call Earth โ€œWater.โ€
07:59
It's 70 percent water by surface area.
08:01
Now most of thatโ€™s seawater,
08:03
but it's like we just happen to be on the bit that's land.
08:06
CA: And with energy, you can turn seawater into --
08:09
EM: Yes.
08:10
CA: Irrigating water or whatever water you need.
08:13
EM: At very low cost.
08:15
Things will be good.
08:16
CA: Things will be good.
08:17
And also, there's other benefits to this non-fossil fuel world
08:20
where the air is cleaner --
08:21
EM: Yes, exactly.
08:23
Because, like, when you burn fossil fuels,
08:26
there's all these side reactions
08:29
and toxic gases of various kinds.
08:33
And sort of little particulates that are bad for your lungs.
08:38
Like, there's all sorts of bad things that are happening
08:41
that will go away.
08:42
And the sky will be cleaner and quieter.
08:45
The future's going to be good.
08:46
CA: I want us to switch now to think a bit about artificial intelligence.
08:50
But the segue there,
08:52
you mentioned how annoying it is when people call you up
08:55
for bad predictions in the past.
08:58
So I'm possibly going to be annoying now,
09:02
but I'm curious about your timelines and how you predict
09:07
and how come some things are so amazingly on the money and some aren't.
09:10
So when it comes to predicting sales of Tesla vehicles, for example,
09:15
you've kind of been amazing,
09:17
I think in 2014 when Tesla had sold that year 60,000 cars,
09:22
you said, "2020, I think we will do half a million a year."
09:26
EM: Yeah, we did almost exactly a half million.
09:28
CA: You did almost exactly half a million.
09:30
You were scoffed in 2014 because no one since Henry Ford,
09:33
with the Model T, had come close to that kind of growth rate for cars.
09:38
You were scoffed, and you actually hit 500,000 cars
09:41
and then 510,000 or whatever produced.
09:44
But five years ago, last time you came to TED,
09:47
I asked you about full self-driving,
09:50
and you said, "Yeah, this very year,
09:53
I'm confident that we will have a car going from LA to New York
09:58
without any intervention."
10:00
EM: Yeah, I don't want to blow your mind, but I'm not always right.
10:04
CA: (Laughs)
10:05
What's the difference between those two?
10:08
Why has full self-driving in particular been so hard to predict?
10:13
EM: I mean, the thing that really got me,
10:15
and I think it's going to get a lot of other people,
10:17
is that there are just so many false dawns with self-driving,
10:22
where you think you've got the problem,
10:25
have a handle on the problem,
10:26
and then it, no, turns out you just hit a ceiling.
10:33
Because if you were to plot the progress,
10:37
the progress looks like a log curve.
10:39
So it's like a series of log curves.
10:41
So most people don't know what a log curve is, I suppose.
10:45
CA: Show the shape with your hands.
10:47
EM: It goes up you know, sort of a fairly straight way,
10:50
and then it starts tailing off
10:52
and you start getting diminishing returns.
10:55
And you're like, uh oh,
10:57
it was trending up and now it's sort of, curving over
11:01
and you start getting to these, what I call local maxima,
11:05
where you don't realize basically how dumb you were.
11:10
And then it happens again.
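The shape Musk traces here, a series of log curves, each of which climbs quickly and then flattens into a local maximum until a new approach starts a fresh curve, can be sketched numerically. The parameters below are arbitrary and chosen only to show that shape; they are not a model of any real self-driving program.

import math

# Illustrative only: progress modeled as a sequence of saturating (log-like) curves,
# each new approach starting after the previous one has plateaued. Numbers are arbitrary.

def progress(t, starts=(0, 30, 60), gains=(40.0, 25.0, 20.0)):
    """Total progress at time t as a sum of log curves begun at different times."""
    total = 0.0
    for t0, gain in zip(starts, gains):
        if t > t0:
            # each term rises steeply at first, then tails off: diminishing returns
            total += gain * math.log1p(t - t0) / math.log1p(30)
    return total

for t in range(0, 91, 10):
    print(f"t={t:3d}  progress={progress(t):6.1f}")
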
11:14
And ultimately...
11:16
These things, you know, in retrospect, they seem obvious,
11:19
but in order to solve full self-driving properly,
11:23
you actually have to solve real-world AI.
11:28
Because what are the road networks designed to work with?
11:32
They're designed to work with a biological neural net, our brains,
11:36
and with vision, our eyes.
11:40
And so in order to make it work with computers,
11:45
you basically need to solve real-world AI and vision.
11:51
Because we need cameras
11:56
and silicon neural nets
11:58
in order to have self-driving work
12:01
for a system that was designed for eyes and biological neural nets.
12:07
You know, I guess when you put it that way,
12:09
it's sort of, like, quite obvious
12:10
that the only way to solve full self-driving
12:12
is to solve real world AI and sophisticated vision.
12:16
CA: What do you feel about the current architecture?
12:19
Do you think you have an architecture now
12:21
where there is a chance
12:23
for the logarithmic curve not to tail off anytime soon?
12:27
EM: Well I mean, admittedly these may be infamous last words,
12:32
but I actually am confident that we will solve it this year.
12:36
That we will exceed --
12:39
The probability of an accident,
12:42
at what point do you exceed that of the average person?
12:45
I think we will exceed that this year.
12:47
CA: What are you seeing behind the scenes that gives you that confidence?
12:51
EM: We're almost at the point where we have a high-quality
12:53
unified vector space.
12:55
In the beginning, we were trying to do this with image recognition
13:00
on individual images.
13:03
But if you get one image out of a video,
13:05
it's actually quite hard to see what's going on without ambiguity.
13:09
But if you look at a video segment of a few seconds of video,
13:13
that ambiguity resolves.
13:15
So the first thing we had to do is tie all eight cameras together
13:19
so they're synchronized,
13:20
so that all the frames are looked at simultaneously
13:23
and labeled simultaneously by one person,
13:26
because we still need human labeling.
13:30
So at least they're not labeled at different times by different people
13:33
in different ways.
13:35
So it's sort of a surround picture.
13:37
Then a very important part is to add the time dimension.
13:41
So that you're looking at surround video,
13:45
and you're labeling surround video.
13:47
And this is actually quite difficult to do from a software standpoint.
13:52
We had to write our own labeling tools
13:57
and then create auto labeling,
14:03
create auto labeling software to amplify the efficiency of human labelers
14:07
because it's quite hard to label.
14:09
In the beginning, it was taking several hours
14:11
to label a 10-second video clip.
14:13
This is not scalable.
14:16
So basically what you have to have is you have to have surround video,
14:19
and that surround video has to be primarily automatically labeled
14:23
with humans just being editors
14:25
and making slight corrections to the labeling of the video
14:30
and then feeding back those corrections into the future auto labeler,
14:34
so you get this flywheel eventually
14:36
where the auto labeler is able to take in vast amounts of video
14:39
and with high accuracy,
14:41
automatically label the video for cars, lane lines, drive space.
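A minimal sketch of the labeling flywheel described here: an auto-labeler proposes labels for whole surround-video clips, humans act only as editors who correct them, and the corrections are fed back to improve the next round of auto-labels. Every class name and function body below is a hypothetical placeholder, not Tesla's actual tooling.

from dataclasses import dataclass, field

# Hypothetical sketch of the human-in-the-loop auto-labeling loop described above.

@dataclass
class Clip:
    frames: list                                  # synchronized surround-camera frames over a few seconds
    labels: dict = field(default_factory=dict)    # e.g. {"cars": [...], "lane_lines": [...]}

class AutoLabeler:
    def __init__(self):
        self.corrections = []    # accumulated human fixes used for retraining

    def propose(self, clip: Clip) -> dict:
        """Produce candidate labels (cars, lane lines, drivable space) for a whole clip."""
        return {"cars": [], "lane_lines": [], "drive_space": []}   # placeholder model output

    def retrain(self):
        """Fold accumulated human corrections back into the model (placeholder)."""
        self.corrections.clear()

def label_dataset(clips, auto_labeler, human_review):
    for clip in clips:
        proposal = auto_labeler.propose(clip)              # machine does the bulk of the work
        clip.labels, fixes = human_review(clip, proposal)  # human only edits/corrects
        auto_labeler.corrections.extend(fixes)             # corrections feed the flywheel
    auto_labeler.retrain()                                 # better auto-labels on the next pass
    return clips
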
14:46
CA: What you're saying is ...
14:51
the result of this is that you're effectively giving the car a 3D model
14:56
of the actual objects that are all around it.
14:58
It knows what they are,
15:01
and it knows how fast they are moving.
15:03
And the remaining task is to predict
15:09
what the quirky behaviors are that, you know,
15:12
that when a pedestrian is walking down the road with a smaller pedestrian,
15:16
that maybe that smaller pedestrian might do something unpredictable
15:19
or things like that.
15:21
You have to build into it before you can really call it safe.
15:24
EM: You basically need to have memory across time and space.
15:30
So what I mean by that is ...
15:34
Memory can't be infinite,
15:37
because it's using up a lot of the computer's RAM basically.
15:42
So you have to say how much are you going to try to remember?
15:46
It's very common for things to be occluded.
15:49
So if you talk about say, a pedestrian walking past a truck
15:53
where you saw the pedestrian start on one side of the truck,
15:57
then they're occluded by the truck.
16:01
You would know intuitively,
16:03
OK, that pedestrian is going to pop out the other side, most likely.
16:07
CA: A computer doesn't know it.
16:09
EM: You need to slow down.
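The "memory across time and space" point can be illustrated with a toy object-permanence tracker: when a detection disappears behind an occluder such as the truck in the example, its track is kept alive with bounded memory and the planner stays cautious until it reappears or times out. This is a hypothetical sketch, not the production perception stack.

from collections import deque

# Toy illustration of keeping a pedestrian "in memory" while occluded by a truck.
# Entirely hypothetical; real perception systems are far more involved.

class Track:
    def __init__(self, track_id, position, velocity):
        self.id = track_id
        self.history = deque(maxlen=50)   # bounded memory: RAM is finite
        self.position, self.velocity = position, velocity
        self.frames_unseen = 0

    def predict(self, dt=0.1):
        """Advance the last known state even while the object is not visible."""
        self.position = (self.position[0] + self.velocity[0] * dt,
                         self.position[1] + self.velocity[1] * dt)
        self.history.append(self.position)

def update_tracks(tracks, detections, max_unseen=30):
    seen_ids = {d["id"] for d in detections}   # detections: hypothetical per-frame outputs
    alive = []
    for tr in tracks:
        if tr.id in seen_ids:
            tr.frames_unseen = 0
        else:
            tr.frames_unseen += 1   # occluded: remember it and coast the prediction
            tr.predict()
        if tr.frames_unseen <= max_unseen:
            alive.append(tr)        # expect it to "pop out the other side"
    return alive

def should_slow_down(tracks):
    # Stay cautious while anything nearby is occluded but still remembered.
    return any(tr.frames_unseen > 0 for tr in tracks)
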
16:10
CA: A skeptic is going to say that every year for the last five years,
16:14
you've kind of said, well,
16:15
no this is the year,
16:17
we're confident that it will be there in a year or two or, you know,
16:20
like it's always been about that far away.
16:23
But we've got a new architecture now,
16:25
you're seeing enough improvement behind the scenes
16:28
to make you not certain, but pretty confident,
16:31
that, by the end of this year,
16:33
what in most, not in every city, and every circumstance
16:37
but in many cities and circumstances,
16:39
basically the car will be able to drive without interventions
16:42
safer than a human.
16:43
EM: Yes.
16:44
I mean, the car currently drives me around Austin
16:47
most of the time with no interventions.
16:49
So it's not like ...
16:52
And we have over 100,000 people
16:55
in our full self-driving beta program.
16:59
So you can look at the videos that they post online.
17:02
CA: I do.
17:03
And some of them are great, and some of them are a little terrifying.
17:07
I mean, occasionally the car seems to veer off
17:09
and scare the hell out of people.
17:12
EM: It's still a beta.
17:15
CA: But you're behind the scenes, looking at the data,
17:17
you're seeing enough improvement
17:19
to believe that a this-year timeline is real.
17:23
EM: Yes, that's what it seems like.
17:25
I mean, we could be here talking again in a year,
17:28
like, well, another year went by, and it didn't happen.
17:31
But I think this is the year.
17:33
CA: And so in general, when people talk about Elon time,
17:36
I mean it sounds like you can't just have a general rule
17:39
that if you predict that something will be done in six months,
17:42
actually what we should imagine is it's going to be a year
17:45
or it's like two-x or three-x, it depends on the type of prediction.
17:49
Some things, I guess, things involving software, AI, whatever,
17:52
are fundamentally harder to predict than others.
17:56
Is there an element
17:58
that you actually deliberately make aggressive prediction timelines
18:01
to drive people to be ambitious?
18:04
Without that, nothing gets done?
18:06
EM: Well, I generally believe, in terms of internal timelines,
18:09
that we want to set the most aggressive timeline that we can.
18:14
Because there's sort of like a law of gaseous expansion where,
18:17
for schedules, where whatever time you set,
18:20
it's not going to be less than that.
18:22
It's very rare that it'll be less than that.
18:26
But as far as our predictions are concerned,
18:28
what tends to happen in the media
18:29
is that they will report all the wrong ones
18:31
and ignore all the right ones.
18:33
Or, you know, when writing an article about me --
18:38
I've had a long career in multiple industries.
18:40
If you list my sins, I sound like the worst person on Earth.
18:43
But if you put those against the things I've done right,
18:46
it makes much more sense, you know?
18:48
So essentially like, the longer you do anything,
18:51
the more mistakes that you will make cumulatively.
18:55
Which, if you sum up those mistakes,
18:56
will sound like I'm the worst predictor ever.
19:00
But for example, for Tesla vehicle growth,
19:04
I said I think we'd do 50 percent, and we've done 80 percent.
19:08
CA: Yes.
19:10
EM: But they don't mention that one.
19:12
So, I mean, I'm not sure what my exact track record is on predictions.
19:16
They're more optimistic than pessimistic, but they're not all optimistic.
19:19
Some of them are exceeded probably more or later,
19:24
but they do come true.
19:28
It's very rare that they do not come true.
19:31
It's sort of like, you know,
19:34
if there's some radical technology prediction,
19:38
the point is not that it was a few years late,
19:40
but that it happened at all.
19:43
That's the more important part.
19:45
CA: So it feels like at some point in the last year,
19:49
seeing the progress on understanding,
19:54
the Tesla AI understanding the world around it,
19:57
led to a kind of, an aha moment at Tesla.
20:00
Because you really surprised people recently when you said
20:03
probably the most important product development
20:06
going on at Tesla this year is this robot, Optimus.
20:10
EM: Yes.
20:11
CA: Many companies out there have tried to put out these robots,
20:14
they've been working on them for years.
20:16
And so far no one has really cracked it.
20:18
There's no mass adoption robot in people's homes.
20:22
There are some in manufacturing, but I would say,
20:25
no one's kind of, really cracked it.
20:29
Is it something that happened
20:31
in the development of full self-driving that gave you the confidence to say,
20:35
"You know what, we could do something special here."
20:37
EM: Yeah, exactly.
20:38
So, you know, it took me a while to sort of realize
20:41
that in order to solve self-driving,
20:44
you really needed to solve real-world AI.
20:47
And at the point of which you solve real-world AI for a car,
20:50
which is really a robot on four wheels,
20:53
you can then generalize that to a robot on legs as well.
20:58
The two hard parts I think --
20:59
like obviously companies like Boston Dynamics
21:02
have shown that it's possible to make quite compelling,
21:05
sometimes alarming robots.
21:06
CA: Right.
21:08
EM: You know, so from a sensors and actuators standpoint,
21:11
it's certainly been demonstrated by many
21:14
that it's possible to make a humanoid robot.
21:16
The things that are currently missing are enough intelligence
21:22
for the robot to navigate the real world and do useful things
21:25
without being explicitly instructed.
21:27
So the missing things are basically real-world intelligence
21:31
and scaling up manufacturing.
21:34
Those are two things that Tesla is very good at.
21:37
And so then we basically just need to design the specialized actuators
21:43
and sensors that are needed for humanoid robot.
21:46
People have no idea, this is going to be bigger than the car.
21:50
CA: So let's dig into exactly that.
21:52
I mean, in one way, it's actually an easier problem than full self-driving
21:56
because instead of an object going along at 60 miles an hour,
22:00
which if it gets it wrong, someone will die.
22:02
This is an object that's engineered to only go at what,
22:05
three or four or five miles an hour.
22:07
And so a mistake, there aren't lives at stake.
22:10
There might be embarrassment at stake.
22:12
EM: So long as the AI doesn't take it over and murder us in our sleep or something.
22:17
CA: Right.
22:18
(Laughter)
22:20
So talk about --
22:22
I think the first applications you've mentioned
22:25
are probably going to be manufacturing,
22:27
but eventually the vision is to have these available for people at home.
22:30
If you had a robot that really understood the 3D architecture of your house
22:36
and knew where every object in that house was
22:41
or was supposed to be,
22:43
and could recognize all those objects,
22:46
I mean, that's kind of amazing, isn't it?
22:48
Like the kind of thing that you could ask a robot to do
22:51
would be what?
22:52
Like, tidy up?
22:54
EM: Yeah, absolutely.
22:57
Make dinner, I guess, mow the lawn.
22:59
CA: Take a cup of tea to grandma and show her family pictures.
23:04
EM: Exactly. Take care of my grandmother and make sure --
23:08
CA: It could obviously recognize everyone in the home.
23:12
It could play catch with your kids.
23:13
EM: Yes. I mean, obviously, we need to be careful
23:16
this doesn't become a dystopian situation.
23:20
I think one of the things that's going to be important
23:23
is to have a localized ROM chip on the robot
23:26
that cannot be updated over the air.
23:29
Where if you, for example, were to say, "Stop, stop, stop,"
23:32
if anyone said that,
456
1412490
1460
๋ˆ„๊ฐ€ ๊ทธ ์–˜๊ธฐ๋ฅผ ํ•˜๋”๋ผ๋„,
23:33
then the robot would stop, you know, type of thing.
457
1413992
2419
๋กœ๋ด‡์€ ๋ฉˆ์ถฐ์•ผ ํ•ฉ๋‹ˆ๋‹ค. ๊ทธ๋Ÿฐ ์ผ๋“ค์ด ๋ฌธ์ œ์ง€์š”.
23:36
And that's not updatable remotely.
458
1416411
1960
์›๊ฒฉ์œผ๋กœ ์—…๋ฐ์ดํŠธ ํ•  ์ˆ˜๋Š” ์—†์Šต๋‹ˆ๋‹ค.
23:38
I think it's going to be important to have safety features like that.
459
1418997
3253
์•ˆ์ „์„ฑ์„ ํ™•๋ณดํ•˜๊ธฐ ์œ„ํ•ด์„œ๋Š” ๊ทธ๋Ÿฐ ๋ฌธ์ œ๋“ค์ด ์ค‘์š”ํ•ด์งˆ ๊ฑฐ์˜ˆ์š”.
23:42
CA: Yeah, that sounds wise.
460
1422292
1501
CA: ๋„ค, ๊ทธ๋ ‡๊ฒŒ ๋ณด์ด๋„ค์š”.
23:43
EM: And I do think there should be a regulatory agency for AI.
461
1423793
2961
EM: ๊ทธ๋ฆฌ๊ณ  AI๋ฅผ ์œ„ํ•œ ๊ทœ์ œ ๊ธฐ๊ด€๋„ ํ•„์š”ํ•  ๊ฒ๋‹ˆ๋‹ค.
23:46
I've said that for many years.
462
1426754
1460
์ œ๊ฐ€ ์ˆ˜ ๋…„๊ฐ„ ์–˜๊ธฐํ•ด ์™”์ฃ .
23:48
I don't love being regulated,
463
1428214
1418
์ €๋Š” ๊ทœ์ œ๋ฐ›๋Š” ๊ฑธ ์‹ซ์–ดํ•˜์ง€๋งŒ
23:49
but I think this is an important thing for public safety.
464
1429632
2711
๊ณต์ค‘ ์•ˆ์ „์„ ์œ„ํ•ด์„œ๋Š” ์ค‘์š”ํ•œ ์ผ์ด๋ผ๊ณ  ์ƒ๊ฐํ•ด์š”.
23:52
CA: Let's come back to that.
465
1432343
1377
CA: ๊ทธ ๋ฌธ์ œ๋กœ ๋‹ค์‹œ ๋Œ์•„๊ฐ€๋ฉด
23:53
But I don't think many people have really sort of taken seriously
466
1433720
4463
์‚ฌ๋žŒ๋“ค์€ ๋Œ€๊ฐœ ๊ทธ๋Ÿฌํ•œ ๋ฌธ์ œ๋ฅผ ์‹ฌ๊ฐํ•˜๊ฒŒ ๋ฐ›์•„๋“ค์ด์ง€ ์•Š๋Š” ๊ฒƒ ๊ฐ™์•„์š”.
23:58
the notion of, you know, a robot at home.
467
1438183
2752
๊ฐ€์ •์šฉ ๋กœ๋ด‡ ๊ฐ™์€ ๊ฐœ๋…์„์š”.
24:00
I mean, at the start of the computing revolution,
468
1440935
2294
์ œ ๋ง์€, ์ปดํ“จํ„ฐ ํ˜๋ช…์ด ์‹œ์ž‘๋์„ ๋•Œ
24:03
Bill Gates said there's going to be a computer in every home.
469
1443271
2878
๋นŒ ๊ฒŒ์ด์ธ ๊ฐ€ ๊ทธ๋žฌ์ฃ . ๋ชจ๋“  ์ง‘์— ์ปดํ“จํ„ฐ๊ฐ€ ์žˆ๊ฒŒ ๋  ๊ฑฐ๋ผ๊ณ 
24:06
And people at the time said, yeah, whatever, who would even want that.
470
1446149
3295
๊ทธ๋ฆฌ๊ณ  ์‚ฌ๋žŒ๋“ค์€ ๋ญ ์–ด์จŒ๋“ , ๋ˆ„๊ฐ€ ๊ทธ๊ฑธ ์›ํ•˜๊ฒ ๋ƒ๊ณ  ํ–ˆ๊ณ ์š”.
24:09
Do you think there will be basically like in, say, 2050 or whatever,
471
1449903
3670
๋จธ์Šคํฌ ์”จ๋Š” ๊ทธ๋Ÿฐ ์ผ์ด, 2050๋…„์ด๋‚˜ ๋˜ ์–ธ์ œ ์ผ์–ด๋‚  ๊ฑฐ๋ผ๊ณ  ๋ณด์‹œ๋‚˜์š”?
24:13
like a robot in most homes, is what there will be,
472
1453615
4170
๋Œ€๋ถ€๋ถ„์˜ ๊ฐ€์ •์ด ๋กœ๋ด‡์„ ๋“ค์ด๊ฒŒ ๋  ๊ฑฐ๋ผ๊ณ ์š”.
24:17
and people will love them and count on them?
473
1457827
2419
๊ทธ๋ฆฌ๊ณ  ์‚ฌ๋žŒ๋“ค์ด ๊ทธ๊ฒƒ๋“ค์„ ์‚ฌ๋ž‘ํ•˜๊ณ  ์˜์ง€ํ•˜๊ฒŒ ๋  ๊ฑฐ๋ผ๊ณ  ๋ง์ด์ฃ ?
24:20
Youโ€™ll have your own butler basically.
474
1460663
1836
๋ฌด์—‡๋ณด๋‹ค ๋จธ์Šคํฌ ์”จ๋„ ๋ณธ์ธ์˜ ์ง‘์‚ฌ๋ฅผ ๋“ค์ด๊ฒŒ ๋  ํ…Œ๊ณ ์š”.
24:22
EM: Yeah, you'll have your sort of buddy robot probably, yeah.
475
1462749
3920
EM: ๋„ค. ์•ค๋”์Šจ์”จ๋„ ๋กœ๋ด‡ ์นœ๊ตฌ๋ฅผ ๊ฐ€์ง€๊ฒŒ ๋  ๊ฒ๋‹ˆ๋‹ค.
24:27
CA: I mean, how much of a buddy?
476
1467003
1585
CA: ๋ญ, ์–ด๋Š ์ •๋„์˜ ์นœ๊ตฌ ๋ง์”€์ธ๊ฐ€์š”?
24:28
How many applications have you thought,
477
1468630
2627
์–ผ๋งˆ๋‚˜ ๋งŽ์€ ์‘์šฉ ํ”„๋กœ๊ทธ๋žจ๋“ค์„ ์ƒ๊ฐํ•˜์…จ์„๊นŒ์š”?
24:31
you know, can you have a romantic partner, a sex partner?
478
1471257
2836
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ์—ฐ์•  ํŒŒํŠธ๋„ˆ๋‚˜, ์„น์Šค ํŒŒํŠธ๋„ˆ ๊ฐ™์€ ๊ฒƒ๋„ ๊ฐ€๋Šฅํ•œ๊ฐ€์š”?
24:34
EM: It's probably inevitable.
479
1474135
2127
EM: ์•„๋งˆ ํ”ผํ•  ์ˆ˜ ์—†๊ฒ ์ฃ .
24:36
I mean, I did promise the internet that Iโ€™d make catgirls.
480
1476304
2794
์ œ ๋ง์€, ์ €๋Š” ์บฃ๊ฑธ์„ ๋งŒ๋“ค๊ฒ ๋‹ค๊ณ  ์ธํ„ฐ๋„ท์œผ๋กœ ์•ฝ์†ํ–ˆ์–ด์š”.
24:39
We could make a robot catgirl.
481
1479098
2044
์šฐ๋ฆฌ๋Š” ๋กœ๋ด‡ ์บฃ๊ฑธ์„ ๋งŒ๋“ค ์ˆ˜๋„ ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
24:42
CA: Be careful what you promise the internet.
482
1482644
2168
CA: ์ธํ„ฐ๋„ท์œผ๋กœ ์•ฝ์†ํ•  ๋•Œ๋Š” ์กฐ์‹ฌํ•˜์„ธ์š”.
24:44
(Laughter)
483
1484812
2253
(์›ƒ์Œ)
24:47
EM: So, yeah, I guess it'll be whatever people want really, you know.
484
1487065
4963
EM: ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ์ œ ์ถ”์ธก์œผ๋กœ๋Š” ์‚ฌ๋žŒ๋“ค์ด ์›ํ•˜๋ฉด ๋ฌด์—‡์ด๋“ ์ง€ ๋  ๊ฒ๋‹ˆ๋‹ค.
24:52
CA: What sort of timeline should we be thinking about
485
1492487
3670
CA: ์ผ์ • ๊ณ„ํš์ด ์–ด๋–ป๊ฒŒ ๋  ๊ฑฐ๋ผ๊ณ  ์ƒ๊ฐํ•  ์ˆ˜ ์žˆ์„๊นŒ์š”?
24:56
of the first models that are actually made and sold?
486
1496157
3837
์ฒซ ๋ฒˆ์งธ ๋ชจ๋ธ์ด ์‹ค์ œ๋กœ ๋งŒ๋“ค์–ด์ง€๊ณ  ํŒ”๋ฆฌ๋Š” ์‹œ๊ธฐ ๋ง์ž…๋‹ˆ๋‹ค.
25:01
EM: Well, you know, the first units that we intend to make
487
1501621
3712
EM: ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์ €ํฌ๊ฐ€ ์ฒ˜์Œ์œผ๋กœ ๋งŒ๋“ค๋ ค ํ•˜๋Š” ๊ฑด
25:05
are for jobs that are dangerous, boring, repetitive,
488
1505375
4629
์œ„ํ—˜ํ•˜๊ณ , ์ง€๋ฃจํ•˜๊ณ , ๋ฐ˜๋ณต์ ์ด๊ณ 
25:10
and things that people don't want to do.
489
1510004
1919
๊ทธ๋ฆฌ๊ณ  ์‚ฌ๋žŒ๋“ค์ด ํ•˜๊ณ  ์‹ถ์–ดํ•˜์ง€ ์•Š๋Š” ์ผ๋“ค์„ ์œ„ํ•œ ์ œํ’ˆ์ผ ๊ฒ๋‹ˆ๋‹ค.
25:11
And, you know, I think weโ€™ll have like an interesting prototype
490
1511965
3044
๊ทธ๋ฆฌ๊ณ  ์•„๋งˆ, ์ €ํฌ๋“ค์ด ํฅ๋ฏธ๋กœ์šด ๊ฒฌ๋ณธ์„ ๋งŒ๋“ค ๊ฑฐ๋ผ๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
25:15
sometime this year.
491
1515009
1210
์˜ฌํ•ด ์ค‘์œผ๋กœ์š”.
25:16
We might have something useful next year,
492
1516219
2627
๋‚ด๋…„์ด๋ฉด ์•„๋งˆ ์œ ์šฉํ•œ ๊ฑธ ๋งŒ๋“ค์–ด ๋‚ผ ๊ฒ๋‹ˆ๋‹ค.
25:18
but I think quite likely within at least two years.
493
1518888
2836
์ ์–ด๋„ 2๋…„ ์•ˆ์—๋Š” ๊ฐ€๋Šฅํ•  ๊ฑฐ์˜ˆ์š”.
25:22
And then we'll see rapid growth year over year
494
1522308
2169
๊ทธ๋ฆฌ๊ณ  ๊ทธ๋•Œ๋ถ€ํ„ด ๋งค๋…„ ๊ฐ€ํŒŒ๋ฅธ ์„ฑ์žฅ์„ ๋ณด๊ฒŒ ๋  ๊ฒ๋‹ˆ๋‹ค.
25:24
of the usefulness of the humanoid robots
495
1524519
2544
ํœด๋จธ๋…ธ์ด๋“œ ๋กœ๋ด‡์˜ ์œ ์šฉ์„ฑ์— ์žˆ์–ด์„œ
25:27
and decrease in cost and scaling up production.
496
1527105
2711
๊ทธ๋ฆฌ๊ณ  ๋น„์šฉ ์ ˆ๊ฐ๊ณผ, ๋Œ€๊ทœ๋ชจ ์ƒ์‚ฐ์— ์žˆ์–ด์„œ๋„์š”.
25:29
CA: Initially just selling to businesses,
497
1529857
1961
CA: ์ดˆ๊ธฐ์—๋Š” ๋‹จ์ˆœํžˆ ๋น„์ฆˆ๋‹ˆ์Šค ๋ชฉ์ ์œผ๋กœ ํŒ๋งคํ•œ๋‹ค๋ฉด,
25:31
or when do you picture you'll start selling them
498
1531859
2670
์–ธ์ œ์ฏค ํŒ”๊ธฐ ์‹œ์ž‘ํ•  ๊ฑฐ๋ผ๊ณ  ์ƒ๊ฐํ•˜์‹œ๋‚˜์š”?
25:34
where you can buy your parents one for Christmas or something?
499
1534571
4295
ํฌ๋ฆฌ์Šค๋งˆ์Šค๋‚˜ ๋ญ ๊ทธ๋Ÿฐ ์ข…๋ฅ˜์˜ ์„ ๋ฌผ๋กœ ๋ถ€๋ชจ๋‹˜์„ ์œ„ํ•ด ํ•˜๋‚˜ ๊ตฌ๋งคํ•˜๋ ค๋ฉด์š”?
25:39
EM: I'd say in less than ten years.
500
1539450
1710
EM: ์ €๋Š” ์‹ญ ๋…„ ์•ˆ์ชฝ์ด ๋  ๊ฑฐ๋ผ๊ณ  ๋ด…๋‹ˆ๋‹ค.
25:41
CA: Help me on the economics of this.
501
1541160
2837
CA: ์žฌ์ •์ ์ธ ๋ฉด๋„ ์–˜๊ธฐํ•ด๋ด…์‹œ๋‹ค.
25:43
So what do you picture the cost of one of these being?
502
1543997
3211
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ๊ฐ€๊ฒฉ์€ ์–ผ๋งˆ ์ •๋„ ์˜ˆ์ƒํ•˜์„ธ์š”?
25:47
EM: Well, I think the cost is actually not going to be crazy high.
503
1547250
3295
EM: ์Œ, ์ œ ์ƒ๊ฐ์— ๊ฐ€๊ฒฉ์€ ๋ฏธ์นœ ์ •๋„๋Š” ์•„๋‹ ๊ฒ๋‹ˆ๋‹ค.
25:51
Like less than a car.
504
1551921
1251
์ฐจ๋ณด๋‹ค๋Š” ์Œ€ ๊ฑฐ์˜ˆ์š”.
25:53
Initially, things will be expensive because it'll be a new technology
505
1553172
3254
์ดˆ์ฐฝ๊ธฐ์—, ์ œํ’ˆ์ด ๋น„์‹ธ๊ธด ํ•˜๊ฒ ์ง€์š”. ์™œ๋ƒํ•˜๋ฉด ์ด๊ฑด ์‹ ๊ธฐ์ˆ ์ด๊ณ 
25:56
at low production volume.
506
1556467
1210
์ƒ์‚ฐ๋Ÿ‰๋„ ์ ์œผ๋‹ˆ๊นŒ์š”.
25:57
The complexity and cost of a car is greater than that of a humanoid robot.
507
1557677
3670
์ž๋™์ฐจ๊ฐ€ ํœด๋จธ๋…ธ์ด๋“œ ๋กœ๋ด‡๋ณด๋‹ค๋Š” ํ›จ์”ฌ ๋ณต์žกํ•˜๊ณ  ๋น„์šฉ๋„ ๋งŽ์ด ๋“ญ๋‹ˆ๋‹ค.
26:01
So I would expect that it's going to be less than a car,
508
1561681
4212
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์ €๋Š” ๊ทธ๊ฒŒ ์ฐจ๋ณด๋‹ค๋Š” ์ €๋ ดํ•  ๊ฑฐ๋ผ๊ณ  ์˜ˆ์ƒํ•  ์ˆ˜ ์žˆ๊ณ ์š”.
26:05
or at least equivalent to a cheap car.
509
1565935
1835
์ตœ์†Œํ•œ ์ €๋ ดํ•œ ์ž๋™์ฐจ ์ •๋„ ๊ฐ€๊ฒฉ์ผ ๊ฒ๋‹ˆ๋‹ค.
26:07
CA: So even if it starts at 50k, within a few years,
510
1567770
2461
CA: ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ๋น„๋ก ์‹œ์ž‘์€ 5๋งŒ ๋‹ฌ๋Ÿฌ์ผ์ง€๋ผ๋„, ๋ช‡ ๋…„ ์•ˆ์—
26:10
itโ€™s down to 20k or lower or whatever.
511
1570273
2336
2๋งŒ ๋‹ฌ๋Ÿฌ๋‚˜ ๋” ๋‚ฎ์€ ์ •๋„๋กœ ๋‚ด๋ ค๊ฐˆ ๊ฑฐ๊ณ 
26:13
And maybe for home they'll get much cheaper still.
512
1573568
2335
๊ทธ๋ฆฌ๊ณ  ์•„๋งˆ๋„ ๊ฐ€์ •์šฉ์œผ๋กœ๋Š” ํ›จ์”ฌ ๋” ์‹ธ์ง€๊ฒ ๊ตฐ์š”.
26:15
But think about the economics of this.
513
1575903
1877
๊ทธ๋ ‡์ง€๋งŒ ๊ฒฝ์ œ์ ์ธ ๋ถ€๋ถ„์— ๋Œ€ํ•ด ์ƒ๊ฐํ•ด๋ด…์‹œ๋‹ค.
26:17
If you can replace a $30,000,
514
1577822
4254
๋งŒ์•ฝ ๋‹น์‹ ์ด 3๋งŒ ๋‹ฌ๋Ÿฌ๋ฅผ
26:22
$40,000-a-year worker,
515
1582076
2795
4๋งŒ ๋‹ฌ๋Ÿฌ ์งœ๋ฆฌ ๋…ธ๋™์ž,
26:24
which you have to pay every year,
516
1584912
1669
๋งค๋…„ ๊ทธ๋ ‡๊ฒŒ ์ง€๋ถˆํ•ด์•ผ๋งŒ ํ•˜๋Š” ๋…ธ๋™์ž๋ฅผ ๋Œ€์‹ ํ•ด
26:26
with a one-time payment of $25,000
517
1586623
3044
๋”ฑ, ํ•œ ๋ฒˆ 2๋งŒ 5์ฒœ ๋‹ฌ๋Ÿฌ๋งŒ ์ง€๊ธ‰ํ•˜๋ฉด ๋˜๊ณ 
26:29
for a robot that can work longer hours,
518
1589667
2920
๋” ์˜ค๋ž˜ ์ผํ•  ์ˆ˜ ์žˆ๋Š” ๋กœ๋ด‡์— ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ๋‹ค๋ฉด,
26:32
a pretty rapid replacement of certain types of jobs.
519
1592629
4004
ํŠน์ •ํ•œ ์ง์—… ๋ถ„์•ผ๋Š” ๊ฝค ๋นจ๋ฆฌ ๋Œ€์ฒด๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค.
26:36
How worried should the world be about that?
520
1596674
2211
์„ธ์ƒ์€ ์—ฌ๊ธฐ์— ๋Œ€ํ•ด ์–ด๋–ป๊ฒŒ ๋Œ€๋น„ํ•ด์•ผ ํ•˜๋Š” ๊ฑธ๊นŒ์š”?
26:39
EM: I wouldn't worry about the sort of, putting people out of a job thing.
521
1599344
3503
EM: ์ €๋Š” ์‚ฌ๋žŒ๋“ค์„ ํ•ด๊ณ ํ•˜๋Š” ์ผ ๊ฐ™์€ ๊ฑด ๊ฑฑ์ •ํ•˜๋ ค๊ณ  ํ•˜์ง€ ์•Š์•„์š”.
26:42
I think we're actually going to have, and already do have,
522
1602889
3336
์ œ ์ƒ๊ฐ์— ์šฐ๋ฆฌ๊ฐ€ ์ด๋ฏธ ๊ฒช์—ˆ๊ณ , ์ง€๊ธˆ๋„ ๊ฒช๊ณ  ์žˆ๋Š” ๊ฑด
26:46
a massive shortage of labor.
523
1606267
1377
์—„์ฒญ๋‚œ ๋…ธ๋™๋ ฅ ๋ถ€์กฑ์ž…๋‹ˆ๋‹ค.
26:47
So I think we will have ...
524
1607644
2919
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์ €๋Š” ์šฐ๋ฆฌ๊ฐ€ ์ง๋ฉดํ•  ๋ฌธ์ œ๋Š”
26:54
Not people out of work,
525
1614067
1209
ํ•ด๊ณ ๋œ ์‚ฌ๋žŒ๋“ค์ด ์•„๋‹ˆ๋ผ
26:55
but actually still a shortage of labor even in the future.
526
1615276
2836
๋ฏธ๋ž˜์—๋„ ์—ฌ์ „ํžˆ ๋ถ€์กฑํ•  ๋…ธ๋™๋ ฅ์ด๋ผ๊ณ  ์˜ˆ์ƒํ•ฉ๋‹ˆ๋‹ค.
26:58
But this really will be a world of abundance.
527
1618863
4630
ํ•˜์ง€๋งŒ ์ด๋Ÿฌํ•œ ๋กœ๋ด‡๋“ค๋กœ ์„ธ์ƒ์€ ๋งค์šฐ ํ’์š”๋กœ์šธ ์ˆ˜ ์žˆ๊ณ 
27:03
Any goods and services will be available to anyone who wants them.
528
1623534
4964
์›ํ•˜๋Š” ๋ชจ๋“  ์‚ฌ๋žŒ๋“ค์ด ์žฌํ™”์™€ ์„œ๋น„์Šค๋ฅผ ์ด์šฉํ•  ์ˆ˜ ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
27:08
It'll be so cheap to have goods and services, it will be ridiculous.
529
1628790
3211
์žฌํ™”์™€ ์„œ๋น„์Šค๋ฅผ ์ด์šฉํ•˜๋Š” ๊ฐ€๊ฒฉ์ด ํ„ฐ๋ฌด๋‹ˆ์—†์ด ์ €๋ ดํ•ด์งˆ ํ…Œ๋‹ˆ๊นŒ์š”.
27:12
CA: I'm presuming it should be possible to imagine a bunch of goods and services
530
1632043
4046
CA: ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์˜ˆ์ƒํ•ด๋ณด์ž๋ฉด ๋งŽ์€ ์žฌํ™”์™€ ์„œ๋น„์Šค๊ฐ€
27:16
that can't profitably be made now but could be made in that world,
531
1636089
4421
ํ˜„์žฌ๋Š” ์ˆ˜์ต์ด ์—†์—ˆ์–ด๋„ ๊ทธ ์„ธ๊ณ„์—์„œ๋Š” ์ˆ˜์ต์ด ๋‚œ๋‹ค๋Š” ๊ฑฐ์ฃ .
27:20
courtesy of legions of robots.
532
1640551
2628
๋กœ๋ด‡ ๊ตฐ๋‹จ์˜ ๋„์›€์œผ๋กœ์š”.
27:23
EM: Yeah.
533
1643179
1460
EM: ๋„ค.
27:25
It will be a world of abundance.
534
1645014
1794
ํ’์š”๋กœ์šด ์„ธ์ƒ์ด ๋  ๊ฑฐ์˜ˆ์š”.
27:26
The only scarcity that will exist in the future
535
1646808
2502
๋ฏธ๋ž˜์—๋„ ์—ฌ์ „ํžˆ ํฌ์†Œํ•œ ๊ฑด ์˜ค์ง
27:29
is that which we decide to create ourselves as humans.
536
1649352
3170
์šฐ๋ฆฌ๊ฐ€ ์šฐ๋ฆฌ ์ž์‹ ์„ ์ธ๊ฐ„์œผ๋กœ ์ฐฝ์กฐํ•˜๋Š” ์ผ์ด์ง€์š”.
27:32
CA: OK.
537
1652563
1168
CA: ๊ทธ๋ ‡๊ตฐ์š”.
27:33
So AI is allowing us to imagine a differently powered economy
538
1653731
4338
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ AI๋กœ ์ธํ•ด ์šฐ๋ฆฌ ๊ฒฝ์ œ๋Š” ๋‹ค๋ฅธ ๋ฐฉ์‹์œผ๋กœ ํž˜์„ ์–ป๊ฒŒ ๋˜๊ณ 
27:38
that will create this abundance.
539
1658111
2127
๊ทธ๊ฒŒ ํ’์š”๋กœ์›€์„ ๋งŒ๋“ค์–ด๋‚ผ ๊ฒƒ์ด๋ผ๋Š” ๋œป์ด๊ตฐ์š”.
27:40
What are you most worried about going wrong?
540
1660279
2086
์ž˜๋ชป๋์„ ๊ฒฝ์šฐ์— ๊ฐ€์žฅ ๊ฑฑ์ •๋˜๋Š” ๊ฑด ๋ญ”๊ฐ€์š”?
27:42
EM: Well, like I said, AI and robotics will bring out
541
1662407
6006
EM: ์•ž์„œ ๋ง์”€๋“œ๋ ธ๋“ฏ, AI์™€ ๋กœ๋ด‡์€
27:48
what might be termed the age of abundance.
542
1668454
2544
โ€˜ํ’์š”์˜ ์‹œ๋Œ€โ€™๋กœ ์ผ์ปฌ์–ด์งˆ ๋งŒํ•œ ์„ธ์ƒ์„ ๊ฐ€์ ธ์˜ฌ ๊ฒ๋‹ˆ๋‹ค.
27:51
Other people have used this word,
543
1671332
1960
๋‹ค๋ฅธ ์‚ฌ๋žŒ๋“ค๋„ ์ด๋Ÿฐ ํ‘œํ˜„์„ ํ–ˆ์—ˆ์–ด์š”.
27:54
and that this is my prediction:
544
1674210
1668
์ œ๊ฐ€ ์˜ˆ์ƒํ•˜๋Š” ๋ฐ”๋Š”
27:55
it will be an age of abundance for everyone.
545
1675920
3212
๋ชจ๋‘์—๊ฒŒ ํ’์กฑํ•œ ์‹œ๋Œ€๋ฅผ ๊ฐ€์ ธ์˜ฌ ๊ฒƒ์ด์ง€๋งŒ
27:59
But I guess thereโ€™s ...
546
1679841
2544
๋‹ค๋งŒ ์ œ ์ƒ๊ฐ์—๋Š”...
28:03
The dangers would be the artificial general intelligence
547
1683886
4547
์œ„ํ—˜ ๊ฐ€๋Šฅ์„ฑ์€ ์•„๋งˆ ๋ฒ”์šฉ ์ธ๊ณต์ง€๋Šฅ์ด๋‚˜
28:08
or digital superintelligence decouples from a collective human will
548
1688474
5256
๋””์ง€ํ„ธ ์ดˆ์ง€๋Šฅ์ด ์ธ๊ฐ„๋“ค์˜ ๊ณต๋™ ์˜์ง€๋กœ๋ถ€ํ„ฐ ๋ถ„๋ฆฌ๋˜๊ณ 
28:13
and goes in the direction that for some reason we don't like.
549
1693771
3504
๊ทธ๊ฒŒ ์–ด๋–ค ์ด์œ ๋กœ ์šฐ๋ฆฌ๊ฐ€ ์›์น˜ ์•Š๋Š” ๋ฐฉํ–ฅ์„ ํ–ฅํ•˜๋Š” ๋ฐ ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
28:17
Whatever direction it might go.
550
1697316
2128
์–ด๋–ค ๋ฐฉํ–ฅ์œผ๋กœ ๊ฐ€๋“ ์ง€ ๋ง์ž…๋‹ˆ๋‹ค.
28:20
You know, thatโ€™s sort of the idea behind Neuralink,
551
1700570
3420
์•„์‹œ๋‹ค์‹œํ”ผ ๋‰ด๋Ÿด๋งํฌ ๋’ค์˜ ์•„์ด๋””์–ด๋Š”
28:24
is to try to more tightly couple collective human world
552
1704031
3045
๊ณต๋™์˜ ์ธ๊ฐ„ ์„ธ๊ณ„๋ฅผ ๋” ๋ฐ€์ ‘ํ•˜๊ฒŒ ์—ฐ๊ฒฐํ•˜๋ ค๋Š” ๊ฑฐ์ง€์š”.
28:27
to digital superintelligence.
553
1707076
4463
๋””์ง€ํ„ธ ์ดˆ์ง€๋Šฅ๊ณผ์š”.
28:33
And also along the way solve a lot of brain injuries and spinal injuries
554
1713458
5755
๊ทธ๋ฆฌ๊ณ  ๊ทธ ๋•์— ์ˆ˜ ๋งŽ์€ ๋‡Œ์™€ ์ฒ™์ถ” ๋ถ€์ƒ ๊ฐ™์€ ๋ฌธ์ œ๋ฅผ
28:39
and that kind of thing.
555
1719213
1168
ํ•ด๊ฒฐํ•˜๋ ค๋Š” ๊ฑฐ์ฃ .
28:40
So even if it doesn't succeed in the greater goal,
556
1720423
2336
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์ด๊ฒŒ ๋” ์›๋Œ€ํ•œ ๋ชฉํ‘œ๋ฅผ ์„ฑ์ทจํ•˜์ง„ ๋ชปํ•ด๋„
28:42
I think it will succeed in the goal of alleviating brain and spine damage.
557
1722759
5588
๋‡Œ์™€ ์ฒ™์ถ”์˜ ์†์ƒ์„ ์™„ํ™” ์‹œํ‚ค๋Š” ๋ชฉ์ ์—์„œ๋Š” ์„ฑ๊ณตํ•  ๊ฒ๋‹ˆ๋‹ค.
28:48
CA: So the spirit there is that if we're going to make these AIs
558
1728347
3045
CA: ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ๋‹ค์‹œ ๋งํ•˜๋ฉด ๋งค์šฐ ๋˜‘๋˜‘ํ•œ AI๋ฅผ ๋งŒ๋“ค๊ฒŒ ๋œ๋‹ค๋ฉด
28:51
that are so vastly intelligent, we ought to be wired directly to them
559
1731434
3253
์šฐ๋ฆฌ๋Š” ๊ทธ๋“ค์—๊ฒŒ ์ง์ ‘ ์—ฐ๊ฒฐ๋  ์ˆ˜ ์žˆ์„ ๊ฑฐ๊ณ 
28:54
so that we ourselves can have those superpowers more directly.
560
1734687
4421
๊ทธ๋ž˜์„œ ์šฐ๋ฆฌ ์Šค์Šค๋กœ ๋” ์ง์ ‘์ ์ธ ์ŠˆํผํŒŒ์›Œ๋ฅผ ์–ป์„ ์ˆ˜ ์žˆ๋Š” ๊ฑฐ๋„ค์š”.
28:59
But that doesn't seem to avoid the risk that those superpowers might ...
561
1739609
4421
ํ•˜์ง€๋งŒ ๊ทธ๋Ÿฐ ์‚ฌ์‹ค์ด ๊ทธ ์ŠˆํผํŒŒ์›Œ๊ฐ€ ์˜๋„์น˜์•Š๊ฒŒ ์œ„ํ—˜ํ•œ ๋ฐฉํ–ฅ์œผ๋กœ
29:05
turn ugly in unintended ways.
562
1745740
2586
ํ˜๋Ÿฌ๊ฐˆ ์œ„ํ—˜์„ ๋ง‰์ง€๋Š” ๋ชปํ•  ๊ฒƒ ๊ฐ™๋„ค์š”.
29:08
EM: I think it's a risk, I agree.
563
1748326
1626
EM: ์œ„ํ—˜ํ•˜์ฃ , ๋งž์•„์š”.
29:09
I'm not saying that I have some certain answer to that risk.
564
1749994
6256
์ œ๊ฐ€ ๊ทธ๋Ÿฐ ์œ„ํ—˜์— ์ •ํ™•ํ•œ ํ•ด๋‹ต์„ ๊ฐ–๊ณ  ์žˆ๋‹ค๋Š” ๊ฒŒ ์•„๋‹™๋‹ˆ๋‹ค.
29:16
Iโ€™m just saying like
565
1756292
2294
์ „ ๊ทธ์ € ์ด๊ฒƒ์ด ์•„๋งˆ๋„
29:18
maybe one of the things that would be good
566
1758628
3545
ํ•œ ๊ฐ€์ง€ ๋ฐฉ๋ฒ•์ผ ์ˆ˜ ์žˆ๋‹ค๋Š” ๊ฑฐ์ง€์š”.
29:22
for ensuring that the future is one that we want
567
1762215
5672
์šฐ๋ฆฌ๊ฐ€ ์›ํ•˜๋Š” ๋ฏธ๋ž˜๋ฅผ ๋งŒ๋“ค ์ˆ˜ ์žˆ๋Š” ์ข‹์€ ๋ฐฉ๋ฒ•์ด
29:27
is to more tightly couple
568
1767887
3253
์ข€ ๋” ์ง์ ‘์ ์œผ๋กœ
29:31
the collective human world to digital intelligence.
569
1771140
3754
๊ณต๋™์˜ ์ธ๊ฐ„ ์„ธ๊ณ„๋ฅผ ๋””์ง€ํ„ธ ์ง€๋Šฅ๊ณผ ์—ฐ๊ฒฐํ•˜๋Š” ๊ฒƒ์ผ ์ˆ˜ ์žˆ๋‹ค๋Š” ๊ฒ๋‹ˆ๋‹ค.
29:36
The issue that we face here is that we are already a cyborg,
570
1776437
4129
ํ˜„์žฌ ์ง๋ฉดํ•œ ๋ฌธ์ œ๋Š”, ์šฐ๋ฆฌ๊ฐ€ ์ด๋ฏธ ์‚ฌ์ด๋ณด๊ทธ๋ผ๋Š” ๊ฒ๋‹ˆ๋‹ค.
29:40
if you think about it.
571
1780566
1252
์ƒ๊ฐํ•ด๋ณด๋ฉด,
29:41
The computers are an extension of ourselves.
572
1781859
3963
์ปดํ“จํ„ฐ๋Š” ์šฐ๋ฆฌ์˜ ํ™•์žฅ๋œ ์ผ๋ถ€์ด๊ณ ,
29:46
And when we die, we have, like, a digital ghost.
573
1786697
3003
์šฐ๋ฆฌ๊ฐ€ ์ฃฝ์œผ๋ฉด ๋””์ง€ํ„ธ ๋ง๋ น์ด ๋‚จ์Šต๋‹ˆ๋‹ค.
29:49
You know, all of our text messages and social media, emails.
574
1789742
3545
์šฐ๋ฆฌ๊ฐ€ ์“ด ๋ฌธ์ž ๋ฉ”์‹œ์ง€, ์†Œ์…œ ๋ฏธ๋””์–ด, ์ด๋ฉ”์ผ ๊ฐ™์€ ๊ฑฐ์š”.
29:53
And it's quite eerie actually,
575
1793329
2002
์‚ฌ์‹ค ๊ฝค ์„ฌ๋œฉํ•œ ์ผ์ด์ฃ .
29:55
when someone dies but everything online is still there.
576
1795373
3295
๋ˆ„๊ตฐ๊ฐ€ ์ฃฝ์–ด๋„ ์˜จ๋ผ์ธ์—” ๋ชจ๋“  ๊ฒŒ ์—ฌ์ „ํžˆ ๊ทธ๋Œ€๋กœ๋‹ˆ๊นŒ์š”.
29:59
But you say like, what's the limitation?
577
1799001
1919
ํ•˜์ง€๋งŒ ์‚ฌ๋žŒ๋“ค์€, ์ œํ•œ ๋ฒ”์œ„๋Š” ๋ฌด์—‡์ธ์ง€
30:00
What is it that inhibits a human-machine symbiosis?
578
1800962
5171
์ธ๊ฐ„๊ณผ ๊ธฐ๊ณ„์˜ ๊ณต์ƒ์„ ์–ต์ œํ•˜๋Š” ๊ฑด ๋ฌด์—‡์ธ์ง€ ๊ฐ™์€ ๊ฑธ ๋ฌป์Šต๋‹ˆ๋‹ค.
30:06
It's the data rate.
579
1806175
1210
๊ทธ๊ฑด ์ „์†ก ์†๋„์ง€์š”.
30:07
When you communicate, especially with a phone,
580
1807385
2169
ํŠนํžˆ ํ•ธ๋“œํฐ์œผ๋กœ ๋Œ€ํ™”ํ•  ๋•Œ,
30:09
you're moving your thumbs very slowly.
581
1809554
2919
์‚ฌ๋žŒ๋“ค์€ ์—„์ง€์†๊ฐ€๋ฝ์„ ๋งค์šฐ ๋Š๋ฆฌ๊ฒŒ ์›€์ง์ž…๋‹ˆ๋‹ค.
30:12
So you're like moving your two little meat sticks
582
1812515
2878
๋งˆ์น˜ ๋‘ ๊ฐœ์˜ ์‚ด๋กœ ์ด๋ฃจ์–ด์ง„ ๋ง‰๋Œ€๊ธฐ๋ฅผ
30:15
at a rate thatโ€™s maybe 10 bits per second, optimistically, 100 bits per second.
583
1815393
5922
์ดˆ๋‹น 10๋น„ํŠธ ์†๋„, ๋นจ๋ผ๋„ 100๋น„ํŠธ ์†๋„๋กœ ์›€์ง์ด๋Š” ๊ฒƒ๊ณผ ๊ฐ™์ฃ .
30:21
And computers are communicating at the gigabyte level and beyond.
584
1821315
5130
์ปดํ“จํ„ฐ๋Š” ๊ธฐ๊ฐ€๋ฐ”์ดํŠธ ๊ทธ ์ด์ƒ์˜ ์†๋„๋กœ ํ†ต์‹ ํ•ฉ๋‹ˆ๋‹ค.
30:26
CA: Have you seen evidence that the technology is actually working,
585
1826487
3170
CA: ๊ทธ๋Ÿฐ ๊ธฐ์ˆ ์ด ์‹ค์ œ๋กœ ์ž‘๋™ํ•˜๊ณ  ์žˆ๋‹ค๋Š” ์ฆ๊ฑฐ๊ฐ€ ์žˆ์Šต๋‹ˆ๊นŒ?
30:29
that you've got a richer, sort of, higher bandwidth connection, if you like,
586
1829657
3587
๋” ํ’๋ถ€ํ•˜๊ณ , ๋” ๋†’์€ ๋Œ€์—ญํญ์ด ๋งŒ๋“ค์–ด ์ง€๊ณ  ์žˆ๋‹ค๋Š” ์ฆ๊ฑฐ ๋ง์ž…๋‹ˆ๋‹ค.
30:33
between like external electronics and a brain
587
1833286
2836
์™ธ๋ถ€ ๊ธฐ๊ธฐ์™€ ๋‡Œ ์‚ฌ์ด ๊ฐ™์€ ๊ณณ์—์„œ
30:36
than has been possible before?
588
1836122
1668
์ด์ „๋ณด๋‹ค ๋ง์ด์—์š”.
30:38
EM: Yeah.
589
1838165
1210
EM: ๋„ค.
30:41
I mean, the fundamental principles of reading neurons,
590
1841002
5422
์ฝ๊ธฐ์— ๊ด€์—ฌํ•˜๋Š” ์‹ ๊ฒฝ์„ธํฌ์˜ ๊ธฐ๋ณธ ์›๋ฆฌ,
30:46
sort of doing read-write on neurons with tiny electrodes,
591
1846465
3921
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์•„์ฃผ ์ž‘์€ ์ „๊ทน์œผ๋กœ ๋‰ด๋Ÿฐ์„ ์ฝ๊ณ  ์“ฐ๋Š” ๊ฒƒ์€
30:50
have been demonstrated for decades.
592
1850386
2169
์ˆ˜์‹ญ ๋…„์— ๊ฑธ์ณ ์‹œ์—ฐ๋˜์–ด ์™”์Šต๋‹ˆ๋‹ค.
30:53
So it's not like the concept is new.
593
1853306
4254
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ๊ฐœ๋…์€ ์ƒˆ๋กœ์šด ๊ฒŒ ์•„๋‹ˆ์—์š”.
30:57
The problem is that there is no product that works well
594
1857602
4754
๋ฌธ์ œ๋Š” ์ œ๋Œ€๋กœ ๋™์ž‘ํ•˜๋Š” ์ œํ’ˆ์ด ์—†์–ด์„œ
31:02
that you can go and buy.
595
1862398
2503
๊ตฌ๋งคํ•  ์ˆ˜ ์—†๋‹ค๋Š” ๋ฐ ์žˆ์ฃ .
31:04
So it's all sort of, in research labs.
596
1864942
2586
์—ฐ๊ตฌ์‹ค์—์„œ๋งŒ ๊ฐ€๋Šฅํ•œ ๊ฑฐ์˜ˆ์š”.
31:08
And it's like some cords sticking out of your head.
597
1868738
5672
๋จธ๋ฆฌ ์†์—์„œ ํŠ€์–ด๋‚˜์™€ ์žˆ๋Š” ์ „์„  ๊ฐ™์€ ๊ฒ๋‹ˆ๋‹ค.
31:14
And it's quite gruesome, and it's really ...
598
1874410
3253
์กฐ๊ธˆ ์„ฌ๋œฉํ•˜๊ธฐ๋„ ํ•˜์ฃ . ๊ทธ๋ฆฌ๊ณ  ์ •๋ง๋กœ..
31:18
There's no good product that actually does a good job
599
1878539
4088
์ข‹์€ ์ œํ’ˆ์ด ์—†์–ด์š”. ์‹ค์ œ๋กœ ํšจ์œจ์ ์œผ๋กœ ์ผํ•˜๊ณ 
31:22
and is high-bandwidth and safe
600
1882627
1876
๋Œ€์—ญํญ๋„ ๋„“๊ณ  ๊ทธ๋ฆฌ๊ณ  ์•ˆ์ „ํ•˜๊ณ 
31:24
and something actually that you could buy and would want to buy.
601
1884545
3253
์›ํ•  ๋•Œ ์‚ด ์ˆ˜ ์žˆ๊ณ  ์‚ฌ๊ณ  ์‹ถ์–ด์ง€๋Š” ๊ทธ๋Ÿฐ ๊ฒƒ์ด์š”.
31:29
But the way to think of the Neuralink device
602
1889550
3921
ํ•˜์ง€๋งŒ ๋‰ด๋Ÿด๋งํฌ ์žฅ์น˜๋Š”
31:33
is kind of like a Fitbit or an Apple Watch.
603
1893512
3546
ํ•๋น—์ด๋‚˜ ์• ํ”Œ์›Œ์น˜ ๊ฐ™์€ ๊ฒ๋‹ˆ๋‹ค.
31:37
That's where we take out sort of a small section of skull
604
1897934
4713
์šฐ๋ฆฌ ๋จธ๋ฆฌ์˜ 4๋ถ„์˜ 1์ •๋„์˜ ์ž‘์€ ๋ถ€๋ถ„์„ ๊บผ๋‚ด์„œ
31:42
about the size of a quarter,
605
1902647
1584
31:44
replace that with what,
606
1904273
2252
๋Œ€์ฒด ์‹œํ‚ค๋Š” ๊ฑฐ์˜ˆ์š”.
31:46
in many ways really is very much like a Fitbit, Apple Watch
607
1906525
5673
๋งŽ์€ ๋ถ€๋ถ„์ด ํ•๋น—์ด๋‚˜ ์• ํ”Œ์›Œ์น˜
31:52
or some kind of smart watch thing.
608
1912239
2378
๋˜ ๋‹ค๋ฅธ ์ข…๋ฅ˜์˜ ์Šค๋งˆํŠธ ์›Œ์น˜๋“ค๊ณผ ๋งค์šฐ ํก์‚ฌํ•˜์ง€๋งŒ
31:56
But with tiny, tiny wires,
609
1916035
4171
๋งค์šฐ ๊ฐ€๋Š๋‹ค๋ž€ ์ „์„ ๋“ค,
32:00
very, very tiny wires.
610
1920247
1877
์•„์ฃผ ๋ฏธ์„ธํ•œ ์ „์„ ๋“ค์„,
32:02
Wires so tiny, itโ€™s hard to even see them.
611
1922124
2044
๋„ˆ๋ฌด ์ž‘์•„์„œ ๋ณด๊ธฐ๋„ ์–ด๋ ค์šด ๊ทธ๋Ÿฐ ์ „์„ ๋“ค์ธ ๊ฒ๋‹ˆ๋‹ค.
32:05
And it's very important to have very tiny wires
612
1925044
2210
์ „์„ ์ด ์ •๋ง ์ •๋ง ์–‡์•„์•ผ๋งŒ ํ•ฉ๋‹ˆ๋‹ค.
32:07
so that when theyโ€™re implanted, they donโ€™t damage the brain.
613
1927296
2836
๊ทธ๋ž˜์•ผ ์ด์‹๋  ๋•Œ ๋‡Œ๋ฅผ ์†์ƒ ์‹œํ‚ค์ง€๋Š” ์•Š์ฃ .
32:10
CA: How far are you from putting these into humans?
614
1930132
2878
CA: ์ด๊ฑธ ์‚ฌ๋žŒํ•œํ…Œ ์ ์šฉํ•˜๊ธฐ๊นŒ์ง€ ์‹œ๊ฐ„์ด ์–ผ๋งˆ๋‚˜ ๊ฑธ๋ฆด๊นŒ์š”?
32:14
EM: Well, we have put in our FDA application
615
1934136
4463
EM: ํ˜„์žฌ FDA์— ์‹ ์ฒญ์„œ๋ฅผ ์ œ์ถœํ–ˆ๊ณ ์š”.
32:18
to aspirationally do the first human implant this year.
616
1938641
4296
์˜ฌํ•ด ์ฒซ ๋ฒˆ์งธ ์ธ๊ฐ„์„ ๋Œ€์ƒ์œผ๋กœ ํ•œ ์ด์‹์ด ์ด๋ค„์ง€๊ธธ ๋ฐ”๋ผ๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
32:23
CA: The first uses will be for neurological injuries
617
1943312
3545
CA: ์ฒซ ๋ฒˆ์งธ ์‚ฌ์šฉ์€ ์‹ ๊ฒฝํ•™์  ์†์ƒ์ด๋‚˜
32:26
of different kinds.
618
1946899
1293
๋ญ ๊ทธ๋Ÿฐ ๊ฒƒ๋“ค์— ์ ์šฉ์ด ๋  ํ…Œ์ง€๋งŒ,
32:28
But rolling the clock forward
619
1948192
1543
์‹œ๊ณ„๋ฅผ ์ข€ ๋” ์•ž๋‹น๊ฒจ์„œ
32:29
and imagining when people are actually using these
620
1949777
4046
์‹ค์ œ๋กœ ์ด๋Ÿฌํ•œ ์žฅ์น˜๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ์‚ฌ๋žŒ๋“ค์„ ์ƒ์ƒํ•ด ๋ณธ๋‹ค๋ฉด
32:33
for their own enhancement, let's say,
621
1953864
2628
๋งํ•˜์ž๋ฉด ์ผ์ข…์˜ ์Šค์Šค๋กœ๋ฅผ ๊ฐ•ํ™”ํ•  ๋ชฉ์ ์œผ๋กœ๋‚˜
32:36
and for the enhancement of the world,
622
1956534
1877
๋” ๋‚˜์€ ์„ธ์ƒ์„ ๋งŒ๋“ค ๋ชฉ์ ์œผ๋กœ ์‚ฌ์šฉํ•˜๋Š” ์‚ฌ๋žŒ๋“ค ๋ง์ด์ฃ .
32:38
how clear are you in your mind
623
1958452
1460
๋จธ์Šคํฌ์”จ๊ฐ€ ์–ผ๋งˆ๋‚˜ ํ™•์‹ ํ•  ์ˆ˜ ์žˆ๋Š”์ง€ ๊ถ๊ธˆํ•œ๋ฐ์š”.
32:39
as to what it will feel like to have one of these inside your head?
624
1959912
5339
์ด๋Ÿฐ ์žฅ์น˜๊ฐ€ ๋จธ๋ฆฟ์†์— ์žˆ๋Š” ๊ฒŒ ์–ด๋–ค ๋Š๋‚Œ์ผ ์ง€์— ๋Œ€ํ•ด์„œ ๋ง์ž…๋‹ˆ๋‹ค.
32:45
EM: Well, I do want to emphasize we're at an early stage.
625
1965251
3920
EM: ์šฐ์„  ์ €ํฌ๋Š” ๋งค์šฐ ์ดˆ๊ธฐ ๋‹จ๊ณ„์— ์žˆ๋‹ค๋Š” ์ ์„ ๊ฐ•์กฐํ•˜๊ณ  ์‹ถ๊ณ 
32:49
And so it really will be many years before we have
626
1969171
6048
์‹ค์ œ๋กœ ์˜ค๋ž˜ ๊ฑธ๋ฆด ๊ฒ๋‹ˆ๋‹ค.
32:55
anything approximating a high-bandwidth neural interface
627
1975261
6673
์žฅ์น˜๊ฐ€ ๊ณ ๋Œ€์—ญํญ์˜ ์‹ ๊ฒฝ ์ธํ„ฐํŽ˜์ด์Šค์— ๊ฐ€๊นŒ์›Œ์ ธ์„œ
33:01
that allows for AI-human symbiosis.
628
1981934
3670
AI์™€ ์ธ๊ฐ„์˜ ๊ณต์ƒ์ด ๊ฐ€๋Šฅํ•ด ์งˆ ๋•Œ๊นŒ์ง€๋Š” ๋ง์ด์ฃ .
33:07
For many years, we will just be solving brain injuries and spinal injuries.
629
1987481
4255
์ˆ˜ ๋…„์— ๊ฑธ์ณ, ์ €ํฌ๋Š” ๋‹จ์ง€ ๋‡Œ์™€ ์ฒ™์ถ” ์†์ƒ์„ ํ•ด๊ฒฐํ•˜๋ ค ํ•  ๊ฒƒ์ด๊ณ 
33:11
For probably a decade.
630
1991777
2169
์‹ญ ๋…„์ด ๊ฑธ๋ฆด์ง€๋„ ๋ชจ๋ฅด์ฃ .
33:14
This is not something that will suddenly one day
631
1994905
3504
์–ด๋Š ๋‚  ๊ฐ‘์ž๊ธฐ ๋†€๋ผ์šธ ์ •๋„๋กœ
33:18
it will have this incredible sort of whole brain interface.
632
1998451
4713
์™„์ „ํ•œ ๋‘๋‡Œ ์ธํ„ฐํŽ˜์ด์Šค๊ฐ€ ๋งŒ๋“ค์–ด ์งˆ ์ˆ˜๋Š” ์—†์Šต๋‹ˆ๋‹ค.
33:25
It's going to be, like I said,
633
2005041
1501
๋ง์”€๋“œ๋ ธ๋“ฏ์ด, ์•„๋งˆ๋„
33:26
at least a decade of really just solving brain injuries
634
2006542
3921
์ตœ์†Œ ์‹ญ ๋…„์€ ๊ฑธ๋ฆฌ๊ฒ ์ง€์š”.
๋‡Œ ์†์ƒ๊ณผ ์ฒ™์ถ” ์†์ƒ์— ๊ด€ํ•œ ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜๋Š” ๋ฐ๋งŒ์š”.
33:30
and spinal injuries.
635
2010504
2127
33:32
And really, I think you can solve a very wide range of brain injuries,
636
2012631
3754
๋งค์šฐ ๋„“์€ ๋ฒ”์œ„์˜ ๋‡Œ ์งˆํ™˜๋“ค์„ ํ•ด๊ฒฐํ•  ์ˆ˜ ์žˆ์„ ๊ฒƒ์œผ๋กœ ๋ด…๋‹ˆ๋‹ค.
33:36
including severe depression, morbid obesity, sleep,
637
2016385
6799
์ค‘์ฆ์˜ ์šฐ์šธ์ฆ, ๋ณ‘์ ์ธ ๋น„๋งŒ, ์ˆ˜๋ฉด๋ฌธ์ œ
33:43
potentially schizophrenia,
638
2023225
1252
์ž ์žฌ์ ์ธ ์กฐํ˜„๋ณ‘๊ณผ ๊ฐ™์ด
33:44
like, a lot of things that cause great stress to people.
639
2024518
3712
์‚ฌ๋žŒ๋“ค์—๊ฒŒ ์—„์ฒญ๋‚œ ์ŠคํŠธ๋ ˆ์Šค๋ฅผ ์ผ์œผํ‚ค๋Š” ๊ฒƒ๋“ค ๋ง์ž…๋‹ˆ๋‹ค.
33:48
Restoring memory in older people.
640
2028773
3003
๋…ธ์ธ๋“ค์˜ ๊ธฐ์–ต๋ ฅ์„ ํšŒ๋ณต์‹œํ‚ค๊ธฐ๋„ ํ•˜๊ณ ์š”.
33:51
CA: If you can pull that off, that's the app I will sign up for.
641
2031817
3796
CA: ์ •๋ง๋กœ ๋œ๋‹ค๋ฉด, ์ €๋„ ๋“ฑ๋กํ•ด์•ผ๊ฒ ๊ตฐ์š”.
33:56
EM: Absolutely.
642
2036363
1293
EM: ๊ทธ๋Ÿผ์š”.
33:57
CA: Please hurry. (Laughs)
643
2037698
2169
CA: ์„œ๋‘˜๋Ÿฌ์ค˜์š”.(์›ƒ์Œ)
33:59
EM: I mean, the emails that we get at Neuralink are heartbreaking.
644
2039867
4880
EM: ๋‰ด๋Ÿด๋งํฌ๋ฅผ ์ง„ํ–‰ํ•˜๋ฉด์„œ ๋ฐ›๋Š” ์ด๋ฉ”์ผ์€ ์ •๋ง ๊ฐ€์Šด์ด ์•„ํ”ˆ๋ฐ
34:05
I mean, they'll send us just tragic, you know,
645
2045122
4088
๋งค์šฐ ๋น„๊ทน์ ์ด๊ณ ์š”.
34:09
where someone was sort of, in the prime of life
646
2049251
2336
์ธ์ƒ์˜ ์ค‘์š”ํ•œ ์‹œ๊ธฐ์—
34:11
and they had an accident on a motorcycle
647
2051629
3795
์˜คํ† ๋ฐ”์ด ์‚ฌ๊ณ ๋ฅผ ๋‹นํ•˜๊ฑฐ๋‚˜
34:15
and someone who's 25, you know, can't even feed themselves.
648
2055466
5797
์•„์ง ์Šค๋ฌผ ๋‹ค์„ฏ ์‚ด์ธ๋ฐ ์Šค์Šค๋กœ ๋ฐฅ๋„ ๋จน์„ ์ˆ˜ ์—†๊ฒŒ ๋˜๊ธฐ๋„ ํ•˜์ฃ .
34:21
And this is something we could fix.
649
2061263
2378
์ด๋Ÿฐ ๋ฌธ์ œ๊ฐ€ ์ €ํฌ๊ฐ€ ํ•ด๊ฒฐํ•˜๋ ค๋Š” ๊ฒƒ๋“ค์ž…๋‹ˆ๋‹ค.
34:24
CA: But you have said that AI is one of the things you're most worried about
650
2064391
3587
CA: ํ•˜์ง€๋งŒ ์•ž์„œ ์–˜๊ธฐํ–ˆ๋“ฏ, AI๋Š” ๋จธ์Šคํฌ ์”จ๊ฐ€ ๊ฐ€์žฅ ๊ฑฑ์ •ํ•˜๋Š” ๊ฒƒ์ด๊ณ 
34:28
and that Neuralink may be one of the ways
651
2068020
2878
๋‰ด๋Ÿด๋งํฌ๊ฐ€ ํ•˜๋‚˜์˜ ํ•ด๊ฒฐ์ฑ…์ด ๋  ์ˆ˜ ์žˆ๋‹ค๊ณ  ํ–ˆ์Šต๋‹ˆ๋‹ค.
34:30
where we can keep abreast of it.
652
2070940
3837
์šฐ๋ฆฌ๊ฐ€ AI๋ฅผ ๋”ฐ๋ผ์žก์„ ๋ฐฉ๋ฒ•์ด์š”.
34:35
EM: Yeah, there's the short-term thing,
653
2075528
3545
EM: ๋„ค, ๋‹จ๊ธฐ์ ์œผ๋กœ๋Š”์š”.
34:39
which I think is helpful on an individual human level with injuries.
654
2079115
3920
์ œ ์˜ˆ์ƒ์œผ๋กœ๋Š” ๋ถ€์ƒ์„ ๋‹นํ•œ ์‚ฌ๋žŒ๋“ค ๊ฐ์ž์—๊ฒŒ ๋„์›€์„ ์ฃผ๊ฒ ์ง€๋งŒ
34:43
And then the long-term thing is an attempt
655
2083077
2544
์žฅ๊ธฐ์ ์œผ๋กœ ์‹œ๋„ํ•ด์•ผ ํ•  ๊ฑด
34:45
to address the civilizational risk of AI
656
2085663
6006
๋ฌธ๋ช…์— ๋Œ€ํ•œ AI์˜ ์œ„ํ˜‘์— ๋งž์„œ๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
34:51
by bringing digital intelligence
657
2091710
3921
๋””์ง€ํ„ธ ์ง€๋Šฅ๊ณผ
34:55
and biological intelligence closer together.
658
2095673
2669
์ƒ๋ฌผ ์ง€๋Šฅ์„ ๋” ๊ฐ€๊น๊ฒŒ ๋งŒ๋“œ๋Š” ๋ฐฉ์‹์„ ํ†ตํ•ด์„œ์š”.
34:58
I mean, if you think of how the brain works today,
659
2098384
2377
์˜ค๋Š˜๋‚  ๋‘๋‡Œ๊ฐ€ ์–ด๋–ป๊ฒŒ ์ž‘๋™ํ•˜๊ณ  ์žˆ๋Š”์ง€ ์ƒ๊ฐํ•ด ๋ณด์ž๋ฉด
35:00
there are really two layers to the brain.
660
2100803
2002
๋‡Œ์—๋Š” ๋‘ ์ธต์ด ์žˆ์–ด์š”.
35:02
There's the limbic system and the cortex.
661
2102847
1960
๋ณ€์—ฐ๊ณ„์™€ ํ”ผ์งˆ๋กœ ๋‚˜๋‰ฉ๋‹ˆ๋‹ค.
35:04
You've got the kind of, animal brain where --
662
2104849
2127
๋™๋ฌผ์˜ ๋‡Œ์™€ ๊ฐ™์€ ๋ถ€๋ถ„์ด ์žˆ๊ณ ...
35:06
itโ€™s kind of like the fun part, really.
663
2106976
1877
์ด ๋ถ€๋ถ„์ด ์žฌ๋ฏธ์žˆ๋Š”๋ฐ์š”.
35:08
CA: It's where most of Twitter operates, by the way.
664
2108853
2502
CA: ๋Œ€๋ถ€๋ถ„์˜ ํŠธ์œ„ํ„ฐ๊ฐ€ ์ž‘๋™๋˜๋Š” ๊ณณ์ด์ฃ .
35:11
EM: I think Tim Urban said,
665
2111355
1710
EM: ํŒ€ ์–ด๋ฐ˜์€ ์ด๋ ‡๊ฒŒ ๋งํ–ˆ์ฃ .
35:13
weโ€™re like somebody, you know, stuck a computer on a monkey.
666
2113107
4463
์šฐ๋ฆฌ๋Š” ์›์ˆญ์ด์— ์ปดํ“จํ„ฐ๋ฅผ ๊ฝ‚์•„ ๋„ฃ์€ ๊ฒƒ๊ณผ ๊ฐ™์€ ์กด์žฌ๋ผ๊ณ ์š”.
35:18
You know, so we're like, if you gave a monkey a computer,
667
2118320
3587
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ์›์ˆญ์ด์—๊ฒŒ ์ปดํ“จํ„ฐ๋ฅผ ์ค€๋‹ค๋ฉด
35:21
that's our cortex.
668
2121949
1210
๊ทธ๊ฒŒ ์šฐ๋ฆฌ์˜ ๋Œ€๋‡Œ ํ”ผ์งˆ์ž…๋‹ˆ๋‹ค.
35:23
But we still have a lot of monkey instincts.
669
2123159
2168
๊ทธ๋Ÿผ์—๋„ ์—ฌ์ „ํžˆ ์ˆ˜๋งŽ์€ ์›์ˆญ์ด์˜ ๋ณธ๋Šฅ์„ ์ง€๋‹ˆ๊ณ  ์žˆ์ฃ .
35:25
Which we then try to rationalize as, no, it's not a monkey instinct.
670
2125369
3754
์šฐ๋ฆฌ๋Š” ๊ทธ๊ฒŒ ์›์ˆญ์ด์˜ ๋ณธ๋Šฅ์ด ์•„๋‹ˆ๋ผ๊ณ  ํ•ฉ๋ฆฌํ™”ํ•˜๋ ค๊ณ  ํ•˜์ง€์š”.
35:29
Itโ€™s something more important than that.
671
2129165
1918
โ€œ์ด๊ฑด ํ›จ์”ฌ ๋” ์ค‘์š”ํ•œ ๊ฑฐ์•ผโ€๋ผ๋ฉด์„œ์š”.
35:31
But it's often just really a monkey instinct.
672
2131083
2127
ํ•˜์ง€๋งŒ ๋Œ€์ฒด๋กœ ๊ทธ๋ƒฅ ์›์ˆญ์ด์˜ ๋ณธ๋Šฅ์ž…๋‹ˆ๋‹ค.
35:33
We're just monkeys with a computer stuck in our brain.
673
2133252
3378
์šฐ๋ฆฌ๋Š” ๋‹จ์ง€ ๋‡Œ์— ์ปดํ“จํ„ฐ๋ฅผ ๊ฝ‚์•„ ๋„ฃ์€ ์›์ˆญ์ด์ž…๋‹ˆ๋‹ค.
35:38
But even though the cortex is sort of the smart,
674
2138883
2919
๋Œ€๋‡Œ ํ”ผ์งˆ์ด ๋” ๋˜‘๋˜‘ํ•˜๊ณ 
35:41
or the intelligent part of the brain,
675
2141844
1793
์ง€๋Šฅ์„ ๋‹ด๋‹นํ•˜๋Š” ๋‡Œ์˜ ๋ถ€๋ถ„์ด๋‚˜
35:43
the thinking part of the brain,
676
2143679
2419
์ƒ๊ฐ์„ ๋‹ด๋‹นํ•˜๋Š” ๋ถ€๋ถ„์ด๋ผ ํ•˜๋”๋ผ๋„
35:46
I've not yet met anyone who wants to delete their limbic system
677
2146098
3712
์ž์‹ ์˜ ๋Œ€๋‡Œ ๋ณ€์—ฐ๊ณ„๋ฅผ ์—†์• ๊ณ  ์‹ถ์–ดํ•˜๋Š” ์‚ฌ๋žŒ์„ ์ €๋Š” ๋ณธ ์ ์ด ์—†์–ด์š”.
35:49
or their cortex.
678
2149852
1168
๋Œ€๋‡Œ ํ”ผ์งˆ๋„ ๋งˆ์ฐฌ๊ฐ€์ง€๊ณ ์š”.
35:51
They're quite happy having both.
679
2151020
1543
๋‘˜ ๋‹ค ๊ฐ€์ ธ์•ผ ํ–‰๋ณตํ•ฉ๋‹ˆ๋‹ค.
35:52
Everyone wants both parts of their brain.
680
2152605
2002
๋ชจ๋‘๋“ค ์ž๊ธฐ ๋‡Œ์— ๋‘ ๋ถ€๋ถ„์ด ๋ชจ๋‘ ์žˆ๊ธฐ๋ฅผ ์›ํ•˜์ฃ .
35:56
And people really want their phones and their computers,
681
2156025
2669
๊ทธ๋ฆฌ๊ณ  ํ•ธ๋“œํฐ๊ณผ ์ปดํ“จํ„ฐ๋„ ์†Œ์œ ํ•˜๊ณ  ์‹ถ์–ดํ•ฉ๋‹ˆ๋‹ค.
35:58
which are really the tertiary, the third part of your intelligence.
682
2158736
3503
์‹ค์ œ๋กœ๋„ ์‚ฌ๋žŒ๋“ค์˜ ์„ธ ๋ฒˆ์งธ ์ง€๋Šฅ์„ ๋‹ด๋‹นํ•˜๋Š” ๋ถ€๋ถ„์ด์ง€์š”.
36:02
It's just that it's ...
683
2162281
1627
๊ทธ๊ฑด ๋งˆ์น˜
36:03
Like the bandwidth,
684
2163908
2294
๋Œ€์—ญํญ๊ณผ ๊ฐ™์€๋ฐ
36:06
the rate of communication with that tertiary layer is slow.
685
2166202
4629
์„ธ ๋ฒˆ์งธ ์ธต๊ณผ์˜ ํ†ต์‹  ์†๋„๋Š” ๋งค์šฐ ๋Š๋ฆฝ๋‹ˆ๋‹ค.
36:11
And it's just a very tiny straw to this tertiary layer.
686
2171665
3796
๋งค์šฐ ์ž‘์€ ๋นจ๋Œ€๊ฐ€ ์„ธ ๋ฒˆ์งธ ์ธต์— ๊ฝ‚ํ˜€์žˆ๋Š” ๊ฒƒ๊ณผ ๊ฐ™์ฃ .
36:15
And we want to make that tiny straw a big highway.
687
2175502
2753
์ €ํฌ๋Š” ์ด ์ž‘์€ ๋นจ๋Œ€๋ฅผ ๊ฑฐ๋Œ€ํ•œ ๊ณ ์†๋„๋กœ๋กœ ๋งŒ๋“ค๊ณ  ์‹ถ์Šต๋‹ˆ๋‹ค.
36:19
And Iโ€™m definitely not saying that this is going to solve everything.
688
2179298
3545
์ œ ๋ง์€, ์ด๊ฒŒ ๋ชจ๋“  ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•  ๊ฑฐ๋ผ๋Š” ์–˜๊ธฐ๋Š” ๊ฒฐ์ฝ” ์•„๋‹ˆ๊ณ ์š”.
36:22
Or this is you know, itโ€™s the only thing --
689
2182885
3503
์ด๊ฒŒ ๋ญ๋ž„๊นŒ, ๋‹จ์ง€ ์ด๊ฒƒ๋งŒ์ด
36:26
itโ€™s something that might be helpful.
690
2186430
3754
๋„์›€์„ ์ค„ ๊ฑฐ๋ผ๊ณ  ์–˜๊ธฐํ•˜๋Š” ๊ฒŒ ์•„๋‹™๋‹ˆ๋‹ค.
36:30
And worst-case scenario,
691
2190517
1711
์ตœ์•…์˜ ์‹œ๋‚˜๋ฆฌ์˜ค๋Š”,
36:32
I think we solve some important brain injury,
692
2192228
3503
๋ช‡ ๊ฐ€์ง€ ์ค‘์š”ํ•œ ๋‡Œ ์†์ƒ์ด๋‚˜
36:35
spinal injury issues, and that's still a great outcome.
693
2195773
2586
์ฒ™์ถ” ์†์ƒ๋งŒ ํ•ด๊ฒฐํ•˜๋Š” ๊ฒƒ์ด๊ฒ ์ง€๋งŒ ์—ฌ์ „ํžˆ ์ข‹์€ ๊ฒฐ๊ณผ๊ฒ ์ง€์š”.
36:38
CA: Best-case scenario,
694
2198359
1167
CA: ์ตœ๊ณ ์˜ ์‹œ๋‚˜๋ฆฌ์˜ค๋Š”,
36:39
we may discover new human possibility, telepathy,
695
2199568
2419
์ธ๊ฐ„๋“ค์˜ ์ƒˆ๋กœ์šด ๊ฐ€๋Šฅ์„ฑ์ด๋‚˜ ํ…”๋ ˆํŒŒ์‹œ ๊ฐ™์€ ๊ฑธ ๋ฐœ๊ฒฌํ•˜๊ฑฐ๋‚˜
36:42
you've spoken of, in a way, a connection with a loved one, you know,
696
2202029
4671
๋จธ์Šคํฌ ์”จ๊ฐ€ ์–ธ๊ธ‰ํ•œ ๊ฒƒ์ฒ˜๋Ÿผ ์‚ฌ๋ž‘ํ•˜๋Š” ์ด๋“ค๊ณผ ์—ฐ๊ฒฐ๋œ๋‹ค๊ฑฐ๋‚˜,
36:46
full memory and much faster thought processing maybe.
697
2206742
5005
์™„์ „ํ•œ ๊ธฐ์–ต๋ ฅ์ด๋‚˜ ํ›จ์”ฌ ๋น ๋ฅธ ์‚ฌ๊ณ  ์ฒ˜๋ฆฌ ๊ฐ™์€ ๊ฒƒ๋„ ์žˆ๊ฒ ๊ณ ์š”.
36:51
All these things.
698
2211747
1335
์ด ๋ชจ๋“  ๊ฒƒ๋“ค์ด
36:53
It's very cool.
699
2213540
1210
๋งค์šฐ ๋ฉ‹์ง€๋„ค์š”.
36:55
If AI were to take down Earth, we need a plan B.
700
2215542
5423
๋งŒ์•ฝ AI๊ฐ€ ์ง€๊ตฌ๋ฅผ ๊ณต๊ฒฉํ•˜๋ ค ํ•œ๋‹ค๋ฉด, ์šฐ๋ฆฌ์—๊ฒŒ ๋Œ€์•ˆ์ด ํ•„์š”ํ•  ๊ฒƒ ๊ฐ™์€๋ฐ
37:01
Let's shift our attention to space.
701
2221006
3545
์šฐ์ฃผ๋กœ ์‹œ์„ ์„ ๋Œ๋ ค๋ณด์ง€์š”.
37:04
We spoke last time at TED about reusability,
702
2224593
2086
์ง€๋‚œ TED์—์„œ ์šฐ๋ฆฌ๋Š” ์žฌ์‚ฌ์šฉ์— ๊ด€ํ•ด ์–˜๊ธฐํ–ˆ์—ˆ๊ณ 
37:06
and you had just demonstrated that spectacularly for the first time.
703
2226720
3212
๋จธ์Šคํฌ ์”จ๋Š” ๊ทธ๊ฑธ ๊ทน์ ์œผ๋กœ ์ฆ๋ช…ํ•ด๋ƒˆ์—ˆ์ฃ .
37:09
Since then, you've gone on to build this monster rocket, Starship,
704
2229974
5922
๊ทธ๋ฆฌ๊ณ  ๊ทธ๋•Œ๋ถ€ํ„ฐ ๊ดด๋ฌผ ๊ฐ™์€ ๋กœ์ผ“, โ€˜์Šคํƒ€์‰ฝโ€™์„ ๊ฐœ๋ฐœํ•ด์˜ค๊ณ  ์žˆ์ง€ ์•Š์Šต๋‹ˆ๊นŒ?
37:15
which kind of changes the rules of the game in spectacular ways.
705
2235938
4046
์ด๊ฒŒ ๊ฒŒ์ž„์˜ ํŒ๋„๋ฅผ ๊ทน์ ์œผ๋กœ ๋ฐ”๊พธ๊ฒŒ ๋  ๊ฒ๋‹ˆ๋‹ค.
37:20
Tell us about Starship.
706
2240025
1543
์Šคํƒ€์‰ฝ์— ๊ด€ํ•ด ์–˜๊ธฐ ์ข€ ํ•ด์ฃผ์‹œ์ฃ .
37:22
EM: Starship is extremely fundamental.
707
2242486
1877
EM: ์Šคํƒ€์‰ฝ์€ ๋งค์šฐ ํ•ต์‹ฌ์ ์ž…๋‹ˆ๋‹ค.
37:24
So the holy grail of rocketry or space transport
708
2244405
5839
๋กœ์ผ“ ๊ณตํ•™์ด๋‚˜ ์šฐ์ฃผ ์šด์†ก์˜ ๊ฟˆ์€
37:30
is full and rapid reusability.
709
2250286
1793
์™„์ „ํ•˜๊ณ  ๋น ๋ฅธ ์žฌ์‚ฌ์šฉ์ด๊ณ ,
37:32
This has never been achieved.
710
2252121
1418
ํ•œ ๋ฒˆ๋„ ์ด๋ค„์ง„ ์ ์ด ์—†์Šต๋‹ˆ๋‹ค.
37:33
The closest that anything has come is our Falcon 9 rocket,
711
2253580
3337
๊ทธ๋Ÿฐ ์ข…๋ฅ˜์— ๊ฐ€์žฅ ๊ทผ์ ‘ํ•œ ๊ฒŒ ์ €ํฌ์˜ ํŒ”์ฝ˜9 ๋กœ์ผ“์ด๊ณ 
37:36
where we are able to recover the first stage, the boost stage,
712
2256959
5213
1๋‹จ ๋กœ์ผ“, ์ฆ‰ ๋ถ€์Šคํ„ฐ ๋‹จ์„ ํšŒ์ˆ˜ํ•  ์ˆ˜ ์žˆ๊ฒŒ ๋๋Š”๋ฐ์š”.
37:42
which is probably about 60 percent of the cost of the vehicle
713
2262214
4630
๋น„์šฉ์€ ์ „๋ฉด์ ์ธ ๋ฐœ์‚ฌ์™€ ๋น„๊ตํ•˜๋ฉด, 60ํผ์„ผํŠธ ์ •๋„์ž…๋‹ˆ๋‹ค.
37:46
of the whole launch, maybe 70 percent.
714
2266885
2920
70ํผ์„ผํŠธ ์ •๋„์ผ ์ˆ˜๋„ ์žˆ๊ณ ์š”.
37:50
And we've now done that over a hundred times.
715
2270347
3170
์ €ํฌ๋Š” ์ด๊ฑธ ๋ฒŒ์จ ์ˆ˜๋ฐฑ ๋ฒˆ๋„ ๋„˜๊ฒŒ ์„ฑ๊ณต ์‹œ์ผฐ๊ณ 
37:53
So with Starship, we will be recovering the entire thing.
716
2273517
6131
์Šคํƒ€์‰ฝ์—์„œ๋Š” ์ „์ฒด ๋ถ€๋ถ„์„ ์žฌ์‚ฌ์šฉํ•˜๊ฒŒ ๋  ์ˆ˜ ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
37:59
Or at least that's the goal.
717
2279690
1543
์ตœ์†Œํ•œ ๊ทธ๊ฒŒ ๋ชฉํ‘œ์ด๊ธด ํ•ฉ๋‹ˆ๋‹ค.
38:01
CA: Right.
718
2281275
1209
CA: ๊ทธ๋ ‡๊ตฐ์š”.
38:02
EM: And moreover, recovering it in such a way
719
2282526
3128
EM: ๋”์šฑ์ด ๊ทธ๋Ÿฐ ์‹์˜ ์žฌ์‚ฌ์šฉ์€
38:05
that it can be immediately re-flown.
720
2285696
2628
์ฆ‰๊ฐ์ ์œผ๋กœ ์žฌ๋น„ํ–‰์„ ๊ฐ€๋Šฅํ•˜๊ฒŒ ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
38:08
Whereas with Falcon 9, we still need to do some amount of refurbishment
721
2288365
3462
ํ•˜์ง€๋งŒ ํŒ”์ฝ˜ 9์˜ ๊ฒฝ์šฐ, ์–ด๋Š ์ •๋„
38:11
to the booster and to the fairing nose cone.
722
2291869
2919
์ถ”์ง„ ๋กœ์ผ“๊ณผ ์œ ์„ฑํ˜•์˜ ๋…ธ์ฆˆ์ฝ˜ ๋ถ€๋ถ„์„ ์žฌ์ •๋น„ํ•  ํ•„์š”๊ฐ€ ์žˆ๋Š” ๋ฐ˜๋ฉด
38:16
But with Starship, the design goal is immediate re-flight.
723
2296790
4880
์Šคํƒ€์‰ฝ์€ ์„ค๊ณ„ ๋ชฉํ‘œ๊ฐ€ ์ฆ‰๊ฐ์  ์žฌ๋น„ํ–‰์ž…๋‹ˆ๋‹ค.
38:22
So you just refill propellants and go again.
724
2302212
3671
์ถ”์ง„ ์—ฐ๋ฃŒ๋งŒ ๋‹ค์‹œ ์ฑ„์›Œ ๋„ฃ์œผ๋ฉด ๋‹ค์‹œ ๋น„ํ–‰ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
38:28
And this is gigantic.
725
2308302
2335
๊ทธ๋ฆฌ๊ณ  ๊ฑฐ๋Œ€ํ•ฉ๋‹ˆ๋‹ค.
38:30
Just as it would be in any other mode of transport.
726
2310679
2878
๋‹ค๋ฅธ ์ข…๋ฅ˜์˜ ์šด์†ก ์ˆ˜๋‹จ์ฒ˜๋Ÿผ ๋ง์ด์ฃ .
38:33
CA: And the main design
727
2313557
1752
CA: ๋ฉ”์ธ ๋””์ž์ธ์€
38:35
is to basically take 100 plus people at a time,
728
2315351
6006
๊ธฐ๋ณธ์ ์œผ๋กœ ๋™์‹œ์— 100๋ช… ์ด์ƒ ํƒ‘์Šนํ•˜๊ณ 
38:41
plus a bunch of things that they need, to Mars.
729
2321357
3837
๊ทธ๋“ค์ด ํ•„์š”ํ•œ ๋ฌผ๊ฑด๊นŒ์ง€ ์‹ค์€ ์ฑ„๋กœ ํ™”์„ฑ์— ๊ฐ€๋Š” ๊ฑด๋ฐ์š”.
38:45
So, first of all, talk about that piece.
730
2325611
1960
๊ทธ๋Ÿฌ๋ฉด ์šฐ์„  ๊ฑฐ๊ธฐ์— ๋Œ€ํ•ด ์งˆ๋ฌธํ•ด๋ณด์ฃ .
38:47
What is your latest timeline?
731
2327613
3462
ํ˜„์žฌ ๊ณ„ํš์€ ์–ด๋–ป๊ฒŒ ๋ฉ๋‹ˆ๊นŒ?
38:51
One, for the first time, a Starship goes to Mars,
732
2331116
3379
์ผ๋‹จ, ์Šคํƒ€์‰ฝ์€ ํ™”์„ฑ์œผ๋กœ ๊ฐ€๊ณ ์š”.
38:54
presumably without people, but just equipment.
733
2334536
2211
์˜ˆ์ƒํ•˜๊ฑด๋ฐ, ์‚ฌ๋žŒ์€ ํƒ‘์Šนํ•˜์ง€ ์•Š๊ณ  ์žฅ๋น„๋งŒ ๊ฐ€๊ฒ ์ฃ .
38:57
Two, with people.
734
2337122
1877
๊ทธ ๋‹ค์Œ์€, ์‚ฌ๋žŒ์„ ์‹ฃ๊ณ  ๊ฐ€๊ณ 
38:59
Three, thereโ€™s sort of,
735
2339041
2252
์„ธ ๋ฒˆ์งธ๋กœ,
39:01
OK, 100 people at a time, let's go.
736
2341335
2711
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ๋™์‹œ์— 100๋ช…๊ณผ ํ•จ๊ป˜ ๊ฐ€๋Š” ๊ฑด๊ฐ€์š”?
39:04
EM: Sure.
737
2344546
1126
EM: ๋งž์Šต๋‹ˆ๋‹ค.
39:05
And just to put the cost thing into perspective,
738
2345714
3796
๋น„์šฉ๋ฉด์—์„œ ๋ณด๋ฉด
39:09
the expected cost of Starship,
739
2349510
4754
์Šคํƒ€์‰ฝ์˜ ๊ฒฝ์šฐ
39:14
putting 100 tons into orbit,
740
2354306
2002
100ํ†ค์„ ๊ถค๋„์— ์˜ฌ๋ฆฌ๋Š”๋ฐ
39:16
is significantly less than what it would have cost
741
2356350
4880
์ง€๊ธˆ๊นŒ์ง€์— ๋น„ํ•ด ์ƒ๋‹นํžˆ ์ ์€ ์–‘์˜ ๋น„์šฉ์„ ์“ฐ๊ฒŒ ๋  ๊ฒ๋‹ˆ๋‹ค.
39:21
or what it did cost to put our tiny Falcon 1 rocket into orbit.
742
2361271
4505
๋งค์šฐ ์ž‘์€ ํŒ”์ฝ˜1 ๋กœ์ผ“์„ ๊ถค๋„์— ์˜ฌ๋ฆฌ๋Š” ๊ฒƒ๋ณด๋‹ค๋„ ์ ์–ด์š”.
39:27
Just as the cost of flying a 747 around the world
743
2367611
4671
๋งˆ์น˜ ์ „ ์„ธ๊ณ„์ ์œผ๋กœ ์ด์šฉํ•˜๋Š” 747 ์—ฌ๊ฐ๊ธฐ ๋น„์šฉ์ด
39:32
is less than the cost of a small airplane.
744
2372282
2419
์ž‘์€ ๋น„ํ–‰๊ธฐ ๊ฐ’๋ณด๋‹ค ์ ์€ ๊ฒƒ์ฒ˜๋Ÿผ์š”.
39:35
You know, a small airplane that was thrown away.
745
2375244
2586
์“ฐ๊ณ  ๋ฒ„๋ ค์ง€๋Š” ์ž‘์€ ๋น„ํ–‰๊ธฐ ๋ง์ž…๋‹ˆ๋‹ค.
39:37
So it's really pretty mind-boggling that the giant thing costs less,
746
2377871
6048
๊ฝค ๋†€๋ผ์šด ์ผ์ด์ฃ . ๊ฑฐ๋Œ€ํ•œ ๋ฌผ์ฒด๊ฐ€ ๋” ์ž‘์€ ๋ฌผ์ฒด๋ณด๋‹ค
39:43
way less than the small thing.
747
2383961
1460
๋น„์šฉ์ด ํ›จ์”ฌ ์ ๋‹ค๋Š” ๊ฒƒ์ด์š”.
39:45
So it doesn't use exotic propellants
748
2385421
4587
๊ทธ๋ž˜์„œ ์Šคํƒ€์‰ฝ์€ ์‹คํ—˜์ ์ธ ์ถ”์ง„์ฒด๋‚˜
39:50
or things that are difficult to obtain on Mars.
749
2390050
2461
ํ™”์„ฑ์—์„œ ๊ตฌํ•˜๊ธฐ ์–ด๋ ค์šด ๋ถ€ํ’ˆ๋“ค์„ ์“ฐ์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
39:52
It uses methane as fuel,
750
2392928
3962
์—ฐ๋ฃŒ๋กœ๋Š” ๋ฉ”ํƒ„์„ ์‚ฌ์šฉํ•˜๊ณ ,
39:56
and it's primarily oxygen, roughly 77-78 percent oxygen by weight.
751
2396890
5798
์ฃผ๋กœ ์‚ฐ์†Œ๋ฅผ ์ด์šฉํ•˜๋Š”๋ฐ, ๋ฌด๊ฒŒ์˜ 77~78ํผ์„ผํŠธ๊ฐ€ ์‚ฐ์†Œ์ž…๋‹ˆ๋‹ค.
40:03
And Mars has a CO2 atmosphere and has water ice,
752
2403313
3587
ํ™”์„ฑ์—๋Š” ์ด์‚ฐํ™” ํƒ„์†Œ ๋Œ€๊ธฐ์™€ ๋ฌผ๋กœ ๋œ ์–ผ์Œ์ด ์žˆ๊ธฐ ๋•Œ๋ฌธ์—
40:06
which is CO2 plus H2O, so you can make CH4, methane,
753
2406942
3378
์ด์‚ฐํ™” ํƒ„์†Œ์™€ ๋ฌผ์„ ์„ž์œผ๋ฉด ํ™”์„ฑ์—์„œ ๋ฉ”ํƒ„๊ณผ ์‚ฐ์†Œ๋ฅผ
40:10
and O2, oxygen, on Mars.
754
2410362
1794
๋งŒ๋“ค ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
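A quick check of the "roughly 77-78 percent oxygen by weight" figure. The chemistry he is gesturing at is the well-known electrolysis-plus-Sabatier route: electrolyze water ice (2 H2O -> 2 H2 + O2) and react atmospheric CO2 with the hydrogen (CO2 + 4 H2 -> CH4 + 2 H2O), for a net CO2 + 2 H2O -> CH4 + 2 O2. The 3.6 oxidizer-to-fuel ratio below is an assumed, slightly fuel-rich operating point used only for illustration; it is not a number stated in the talk:

    # Oxygen mass fraction of a methane/oxygen (methalox) propellant load (Python).
    M_CH4 = 16.04  # g/mol, methane
    M_O2 = 32.00   # g/mol, molecular oxygen

    # Stoichiometric combustion: CH4 + 2 O2 -> CO2 + 2 H2O
    stoich_of_ratio = 2 * M_O2 / M_CH4  # ~4.0 kg of oxygen per kg of methane
    assumed_of_ratio = 3.6              # assumption: somewhat fuel-rich engine mixture

    for label, ratio in [("stoichiometric", stoich_of_ratio), ("assumed fuel-rich", assumed_of_ratio)]:
        oxygen_fraction = ratio / (ratio + 1.0)
        print(f"{label}: oxygen is {oxygen_fraction:.1%} of total propellant mass")
    # prints ~80.0% and ~78.3%, consistent with the 77-78 percent quoted above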
40:12
CA: Presumably, one of the first tasks on Mars will be to create a fuel plant
755
2412197
3963
CA: ์•„๋งˆ๋„ ํ™”์„ฑ์—์„œ ์ฒ˜์Œ ํ•  ์ผ์€ ์—ฐ๋ฃŒ ๊ณต์žฅ์„ ๋งŒ๋“œ๋Š” ์ผ์ด๊ฒ ๋„ค์š”.
40:16
that can create the fuel for the return trips of many Starships.
756
2416201
4255
๋งŽ์€ ์™•๋ณต ์šฐ์ฃผ์„ ๋“ค์„ ์œ„ํ•œ ์—ฐ๋ฃŒ๋ฅผ ์ƒ์‚ฐํ•  ์ˆ˜ ์žˆ๊ฒŒ์š”.
40:20
EM: Yes.
757
2420497
1168
EM: ๋งž์Šต๋‹ˆ๋‹ค.
40:21
And actually, it's mostly going to be oxygen plants,
758
2421665
2920
๊ทธ๋ฆฌ๊ณ  ์‚ฌ์‹ค์ƒ, ๋Œ€๋ถ€๋ถ„์€ ์‚ฐ์†Œ ์‹œ์„ค์ด ๋  ๊ฑฐ์˜ˆ์š”.
40:24
because it's 78 percent oxygen, 22 percent fuel.
759
2424626
5965
78ํผ์„ผํŠธ๋Š” ์‚ฐ์†Œ๊ณ , 22ํผ์„ผํŠธ๊ฐ€ ์—ฐ๋ฃŒ์ธ๋ฐ
40:31
But the fuel is a simple fuel that is easy to create on Mars.
760
2431300
3712
์—ฐ๋ฃŒ๋Š” ํ™”์„ฑ์—์„œ ๋งŒ๋“ค๊ธฐ ์‰ฌ์šด ๊ฐ„๋‹จํ•œ ์—ฐ๋ฃŒ๋‹ˆ๊นŒ์š”.
40:35
And in many other parts of the solar system.
761
2435512
2586
๋‹ค๋ฅธ ํƒœ์–‘๊ณ„์—์„œ๋„ ๋น„์Šทํ•  ๊ฒ๋‹ˆ๋‹ค.
40:38
So basically ...
762
2438098
1293
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ๊ธฐ๋ณธ์ ์œผ๋กœ๋Š”...
40:39
And it's all propulsive landing, no parachutes,
763
2439933
3796
๋ชจ๋‘ ์ถ”์ง„ ์ฐฉ๋ฅ™์ด ๋  ๊ฒ๋‹ˆ๋‹ค. ๋‚™ํ•˜์‚ฐ๋„ ํ•„์š” ์—†๊ณ ,
40:43
nothing thrown away.
764
2443729
1460
๋ฒ„๋ ค์ง€๋Š” ๊ฒƒ๋„ ์—†๊ฒ ์ฃ .
40:46
It has a heat shield thatโ€™s capable of entering on Earth or Mars.
765
2446857
6632
์ง€๊ตฌ์™€ ํ™”์„ฑ์œผ๋กœ์˜ ์ง„์ž…์ด ๊ฐ€๋Šฅํ•œ ์—ด ์ฐจํ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ์–ด์„œ
40:53
We can even potentially go to Venus.
766
2453530
1752
์‹ฌ์ง€์–ด ๊ธˆ์„ฑ์— ๊ฐ€๋Š” ๊ฒƒ๋„ ๊ฐ€๋Šฅํ•ฉ๋‹ˆ๋‹ค
40:55
but you don't want to go there.
767
2455324
1501
๋ˆ„๊ตฌ๋„ ์›์น˜๋Š” ์•Š๊ฒ ์ง€๋งŒ์š”.
40:56
(Laughs)
768
2456867
1543
(์›ƒ์Œ)
40:59
Venus is hell, almost literally.
769
2459161
2210
๊ธˆ์„ฑ์€ ๋ง ๊ทธ๋Œ€๋กœ ์ง€์˜ฅ ๊ฐ™์€ ๊ณณ์ด๋‹ˆ๊นŒ์š”.
41:02
But you could ...
770
2462247
1460
ํ•˜์ง€๋งŒ
41:04
It's a generalized method of transport to anywhere in the solar system,
771
2464041
4838
์ด๊ฑด ํƒœ์–‘๊ณ„ ์–ด๋””๋กœ๋“  ์ˆ˜์†ก์ด ๊ฐ€๋Šฅํ•œ ์ผ๋ฐ˜ํ™”๋œ ์šด์†ก ๋ฐฉ๋ฒ•์ด๊ณ 
41:08
because the point at which you have a propellant depot on Mars,
772
2468921
2836
ํ™”์„ฑ์— ์ถ”์ง„์ฒด๋ฅผ ์ €์žฅํ•˜๊ณ  ์žˆ๊ธฐ ๋•Œ๋ฌธ์—
41:11
you can then travel to the asteroid belt
773
2471798
1919
์†Œํ–‰์„ฑ๋Œ€๋‚˜
41:13
and to the moons of Jupiter and Saturn
774
2473759
3128
๋ชฉ์„ฑ๊ณผ ํ† ์„ฑ์˜ ์œ„์„ฑ์œผ๋กœ๋„ ๊ฐˆ ์ˆ˜ ์žˆ๊ณ ์š”,
41:16
and ultimately anywhere in the solar system.
775
2476887
2919
๊ฒฐ๊ตญ ํƒœ์–‘๊ณ„ ์–ด๋””๋กœ๋“  ๊ฐˆ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
41:19
CA: But your main focus
776
2479848
2753
CA: ๊ทธ๋ ‡์ง€๋งŒ ๋จธ์Šคํฌ ์”จ์™€
41:22
and SpaceX's main focus is still Mars.
777
2482643
3670
์ŠคํŽ˜์ด์ŠคX๊ฐ€ ์ฃผ๋ ฅํ•˜๊ณ  ์žˆ๋Š” ๊ฑด ์—ฌ์ „ํžˆ ํ™”์„ฑ์ด๊ณ 
41:26
That is the mission.
778
2486313
2211
๊ทธ๊ฒŒ ํ˜„์žฌ ์ž„๋ฌด์ด๊ธฐ ๋•Œ๋ฌธ์—
41:28
That is where most of the effort will go?
779
2488524
3920
๊ฑฐ๊ธฐ์— ๊ฐ€์žฅ ํž˜์„ ์Ÿ์„ ๊ฑด๊ฐ€์š”?
41:33
Or are you actually imagining a much broader array of uses
780
2493278
4672
์•„๋‹ˆ๋ฉด ํ›จ์”ฌ ๋ฐฉ๋Œ€ํ•œ ์—ฌ๋Ÿฌ๊ฐ€์ง€ ํ™œ์šฉ์„ ๊ตฌ์ƒํ•˜๊ณ  ๊ณ„์‹ ๊ฐ€์š”?
41:37
even in the coming, you know,
781
2497991
2628
์•„์‹œ๋‹ค์‹œํ”ผ, ์•ž์œผ๋กœ
41:40
the first decade or so of uses of this.
782
2500619
3253
ํ•œ ์‹ญ ๋…„ ์ •๋„ ์ด๊ฑธ ์‚ฌ์šฉํ–ˆ์„ ๋•Œ
41:44
Where we could go, for example, to other places
783
2504498
2252
์˜ˆ์ปจ๋Œ€ ์šฐ๋ฆฌ๋Š” ๋‹ค๋ฅธ ๊ณณ๋“ค๋„ ๊ฐˆ ์ˆ˜ ์žˆ์„ ํ…Œ๋‹ˆ๊นŒ์š”.
41:46
in the solar system to explore,
784
2506750
1919
ํƒœ์–‘๊ณ„ ์–ด๋””๋ผ๋„ ๋ง์ด์ฃ .
41:48
perhaps NASA wants to use the rocket for that reason.
785
2508710
4338
๋‚˜์‚ฌ๋Š” ๊ทธ๋Ÿฐ ๋ชฉ์ ์œผ๋กœ ๋กœ์ผ“์„ ์ด์šฉํ•˜๊ธธ ์›ํ•  ๋“ฏํ•˜๊ณ ์š”.
41:53
EM: Yeah, NASA is planning to use a Starship to return to the moon,
786
2513423
5131
EM: ๊ทธ๋ ‡์ฃ . ๋‚˜์‚ฌ๋Š” ๋‹ฌ๋กœ ๋Œ์•„๊ฐˆ ๋•Œ ์ด์šฉํ•  ์šฐ์ฃผ์„ ์„ ๊ณ„ํš ์ค‘์ž…๋‹ˆ๋‹ค.
41:58
to return people to the moon.
787
2518595
1794
์‚ฌ๋žŒ๋“ค์„ ๋‹ฌ๋กœ ๋Œ๋ ค๋ณด๋‚ด๊ธฐ ์œ„ํ•ด์„œ์š”.
42:01
And so we're very honored that NASA has chosen us to do this.
788
2521139
4422
๋‚˜์‚ฌ๊ฐ€ ์ด ์ผ์„ ์ €ํฌ์—๊ฒŒ ๋งก๊ธด ๊ฑธ ๋งค์šฐ ์˜๊ด‘์œผ๋กœ ์—ฌ๊ธฐ๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
42:07
But I'm saying it is a generalized --
789
2527271
4337
ํ•˜์ง€๋งŒ ์ด๊ฑด ์ผ๋ฐ˜ํ™”๋œ ๊ฒƒ์ด์ฃ .
42:11
itโ€™s a general solution
790
2531650
2377
์ผ๋ฐ˜์  ํ•ด๊ฒฐ ๋ฐฉ๋ฒ•์ž…๋‹ˆ๋‹ค.
42:14
to getting anywhere in the greater solar system.
791
2534027
5172
๋” ํฐ ํƒœ์–‘๊ณ„ ์–ด๋””๋ผ๋„ ๊ฐˆ ์ˆ˜ ์žˆ์ฃ .
42:19
It's not suitable for going to another star system,
792
2539241
2502
๋‹ค๋ฅธ ํ•ญ์„ฑ๊ณ„๋กœ ๊ฐ€๋Š” ๋ฐ๋Š” ์ ํ•ฉํ•˜์ง€ ์•Š์ง€๋งŒ
42:21
but it is a general solution for transport anywhere in the solar system.
793
2541785
3879
ํƒœ์–‘๊ณ„ ๋‚ด ์–ด๋””๋ผ๋„ ์ˆ˜์†กํ•  ์ˆ˜ ์žˆ๋Š” ์ผ๋ฐ˜์ ์ธ ๋ฐฉ๋ฒ•์ด์—์š”.
42:25
CA: Before it can do any of that,
794
2545706
1585
CA: ๊ทธ๋Ÿฐ ๊ฒƒ๋“ค์„ ํ•˜๊ธฐ ์ „์—
42:27
it's got to demonstrate it can get into orbit, you know, around Earth.
795
2547332
3295
์ง€๊ตฌ ๊ถค๋„ ์ง„์ž…์ด ๊ฐ€๋Šฅํ•˜๋‹ค๋Š” ๊ฑธ ๋ณด์—ฌ์•ผ ํ•˜๋Š”๋ฐ์š”.
42:30
Whatโ€™s your latest advice on the timeline for that?
796
2550627
5005
๊ทธ ๊ณ„ํš์— ๋Œ€ํ•ด์„œ ์กฐ์–ธ์„ ํ•ด์ฃผ์‹ ๋‹ค๋ฉด์š”?
42:35
EM: It's looking promising for us to have an orbital launch attempt
797
2555632
3921
EM: ๊ถค๋„ ๋ฐœ์‚ฌ๋ฅผ ์‹œ๋„ํ•  ์ˆ˜ ์žˆ์„ ๊ฒƒ์œผ๋กœ ๋ณด์ด๊ณ ์š”.
42:39
in a few months.
798
2559595
2002
๋ช‡ ๋‹ฌ ๋‚ด๋กœ์š”.
42:43
So we're actually integrating --
799
2563015
3545
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์ €ํฌ๋Š” ์™„์„ฑ ๊ณผ์ •์ด์—์š”.
42:46
will be integrating the engines into the booster
800
2566602
2961
์—”์ง„์„ ์ถ”์ง„ ๋กœ์ผ“์— ํ•ฉ์น˜๊ฒŒ ๋  ๊ฒ๋‹ˆ๋‹ค.
42:49
for the first orbital flight starting in about a week or two.
801
2569605
3586
๋ช‡ ์ฃผ ๋‚ด์— ์‹œ์ž‘๋  ์ฒซ ๋ฒˆ์งธ ๊ถค๋„ ๋น„ํ–‰์„ ์œ„ํ•ด์„œ ๋ง์ด์ฃ .
42:53
And the launch complex itself is ready to go.
802
2573942
6465
๊ทธ๋ฆฌ๊ณ  ๋ฐœ์‚ฌ ์‹œ์„ค ์ž์ฒด๋Š” ์ด๋ฏธ ์ค€๋น„๊ฐ€ ๋ผ ์žˆ์Šต๋‹ˆ๋‹ค.
43:00
So assuming we get regulatory approval,
803
2580741
3670
๊ทœ์ œ ์ธ๊ฐ€๋ฅผ ์–ป์„ ์ˆ˜ ์žˆ์„ ๊ฑฐ๊ณ ,
43:04
I think we could have an orbital launch attempt within a few months.
804
2584453
6464
๊ถค๋„ ๋ฐœ์‚ฌ๋Š” ๋ช‡ ๋‹ฌ ๋‚ด์— ๊ฐ€๋Šฅํ•  ๊ฒ๋‹ˆ๋‹ค.
43:10
CA: And a radical new technology like this
805
2590959
2002
CA: ์ด๋ ‡๊ฒŒ ๊ธ‰์ง„์ ์ธ ๊ธฐ์ˆ ์€
43:13
presumably there is real risk on those early attempts.
806
2593003
2544
์ดˆ๊ธฐ ์‹œ๋„ ๊ณผ์ •์— ๋ฆฌ์Šคํฌ๊ฐ€ ์žˆ๊ธฐ ๋งˆ๋ จ์ด์ž–์•„์š”.
43:15
EM: Oh, 100 percent, yeah.
807
2595589
1251
EM: ๋„ค, 100ํผ์„ผํŠธ ๊ทธ๋ ‡์ฃ .
43:16
The joke I make all the time is that excitement is guaranteed.
808
2596882
3837
์ œ๊ฐ€ ๋งค๋ฒˆ ํ•˜๋Š” ๋†๋‹ด์ด ์žˆ๋Š”๋ฐ, ํฅ๋ถ„์„ ์žฅ๋‹ดํ•œ๋‹ค๋Š” ๊ฑฐ์˜ˆ์š”.
43:20
Success is not guaranteed, but excitement certainly is.
809
2600719
2794
์„ฑ๊ณต์€ ์žฅ๋‹ด ๋ชปํ•˜์ง€๋งŒ, ํฅ๋ถ„ํ•  ๊ฑด ํ™•์‹คํ•˜์ฃ .
43:23
CA: But the last I saw on your timeline,
810
2603513
2378
CA: ํ•˜์ง€๋งŒ ์ œ๊ฐ€ ๋งˆ์ง€๋ง‰์œผ๋กœ ๋ณธ ์ผ์ •์—์„œ๋Š”
43:25
you've slightly put back the expected date
811
2605932
2962
์‚ด์ง ๊ธฐํ•œ์„ ๋’ค๋กœ ๋ฏธ๋ฃจ์…จ๋˜๋ฐ
43:28
to put the first human on Mars till 2029, I want to say?
812
2608935
4380
ํ™”์„ฑ์— ์ฒซ ๋ฒˆ์งธ ์ธ๋ฅ˜๋ฅผ 2029๋…„๊นŒ์ง€ ๋ณด๋‚ด๊ฒ ๋‹ค๊ณ  ๋ง์ด์—์š”.
43:33
EM: Yeah, I mean, so let's see.
813
2613815
3128
EM: ๋„ค, ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ...
43:36
I mean, we have built a production system for Starship,
814
2616985
3504
๋‹ค์‹œ ๋งํ•˜๋ฉด ์ €ํฌ๋Š” ์šฐ์ฃผ์„ ์„ ์œ„ํ•œ ์‹œ์Šคํ…œ์„ ๋งŒ๋“ค์–ด์™”๊ณ 
43:40
so we're making a lot of ships and boosters.
815
2620489
3295
์ˆ˜๋งŽ์€ ์šฐ์ฃผ์„ ๊ณผ ์ถ”์ง„์ฒด๋ฅผ ๋งŒ๋“œ๋Š” ์ค‘์ด๊ณ ์š”.
43:43
CA: How many are you planning to make actually?
816
2623784
2210
CA: ๋ช‡ ๊ฐœ๋‚˜ ๋งŒ๋“ค ๊ณ„ํš์ธ๊ฐ€์š”?
43:46
EM: Well, we're currently expecting to make a booster and a ship
817
2626036
5714
EM: ์ง€๊ธˆ ์˜ˆ์ƒํ•˜๊ธฐ๋กœ๋Š” ํ•œ ๊ฐœ์˜ ์ถ”์ง„์ฒด์™€ ์šฐ์ฃผ์„ ์„ ๋งŒ๋“œ๋Š”๋ฐ
43:51
roughly every, well, initially, roughly every couple of months,
818
2631792
3503
๋Œ€๋žต 2๊ฐœ์›” ์ •๋„๊ฐ€ ์†Œ์š”๋˜๊ณ ,
43:55
and then hopefully by the end of this year, one every month.
819
2635295
3670
์˜ฌํ•ด ๋ง์ฏค์—”, ํ•œ ๋‹ฌ์— ํ•˜๋‚˜ ์ •๋„๊ฐ€ ๋˜๊ธฐ๋ฅผ ํฌ๋งํ•˜๊ณ  ์žˆ์–ด์š”.
43:59
So it's giant rockets, and a lot of them.
820
2639007
2711
๊ฑฐ๋Œ€ํ•œ ๋กœ์ผ“์ด๊ณ , ์ˆซ์ž๋„ ๋งŽ์œผ๋‹ˆ๊นŒ์š”.
44:01
Just talking in terms of rough orders of magnitude,
821
2641760
2419
๋Œ€๋žต์ ์ธ ํฌ๊ธฐ๋ฅผ ์–˜๊ธฐํ•˜์ž๋ฉด,
44:04
in order to create a self-sustaining city on Mars,
822
2644179
3504
ํ™”์„ฑ์—์„œ ์ž๊ฐ€ ๊ณต๊ธ‰์ด ๊ฐ€๋Šฅํ•œ ๋„์‹œ๋ฅผ ๋งŒ๋“ค๊ธฐ ์œ„ํ•ด์„œ๋Š”
44:07
I think you will need something on the order of a thousand ships.
823
2647724
4421
๋Œ€๋žต ์ฒœ ๊ฐœ์˜ ์šฐ์ฃผ์„ ์ด ํ•„์š”ํ•  ๊ฑฐ๋ผ๊ณ  ์˜ˆ์ƒํ•ฉ๋‹ˆ๋‹ค.
44:12
And we just need a Helen of Sparta, I guess, on Mars.
824
2652187
6381
ํ™”์„ฑ์—์„œ๋Š” ์ŠคํŒŒ๋ฅดํƒ€์˜ ํ—ฌ๋ ˆ๋„ค๋งŒ ์žˆ์œผ๋ฉด ๋ฉ๋‹ˆ๋‹ค.
44:19
CA: This is not in most people's heads, Elon.
825
2659319
2211
CA: ์ด๊ฑด ์‚ฌ๋žŒ๋“ค์ด ํ”ํžˆ ํ•  ์ˆ˜ ์žˆ๋Š” ์ƒ๊ฐ์€ ์•„๋‹ˆ์ฃ .
44:21
EM: The planet that launched 1,000 ships.
826
2661571
1961
EM: ์ฒœ ๋Œ€์˜ ์šฐ์ฃผ์„ ์„ ๋ฐœ์‚ฌํ•˜๋Š” ํ–‰์„ฑ์ด์š”.
44:24
CA: That's nice.
827
2664574
1168
CA: ๋ฉ‹์ง€๊ตฐ์š”.
44:25
But this is not in most people's heads,
828
2665784
1877
๋Œ€๋‹ค์ˆ˜์˜ ์‚ฌ๋žŒ๋“ค์ด ํ•  ๋ฒ•ํ•œ ์ƒ๊ฐ์€ ์•„๋‹ˆ์ž–์•„์š”.
44:27
this picture that you have in your mind.
829
2667661
1918
๋‹น์‹ ์ด ์ƒ์ƒํ•˜๋Š” ๋ชจ์Šต์€ ๋ง์ด์—์š”.
44:29
There's basically a two-year window,
830
2669621
1752
๊ธฐ๋ณธ 2๋…„์ด ๊ฑธ๋ฆฌ๊ณ 
44:31
you can only really fly to Mars conveniently every two years.
831
2671373
2919
๋‹น์‹ ์€ ์œ ์ผํ•˜๊ฒŒ 2๋…„๋งˆ๋‹ค ํŽธํ•˜๊ฒŒ ํ™”์„ฑ์œผ๋กœ ๋‚ ์•„๊ฐˆ ์ˆ˜ ์žˆ๋Š” ์‚ฌ๋žŒ์ด๊ณ ์š”.
44:34
You were picturing that during the 2030s,
832
2674292
4797
๋‹น์‹ ์ด ์ƒ์ƒํ–ˆ๋˜ ๊ฑด 2030๋…„๋Œ€์—๋Š”
44:39
every couple of years,
833
2679089
1376
๋งค 2๋…„๋งˆ๋‹ค
44:40
something like 1,000 Starships take off,
834
2680507
3003
1,000๋Œ€์˜ ์šฐ์ฃผ์„ ์ด ์ด๋ฅ™ํ•˜๋ฉด์„œ
44:43
each containing 100 or more people.
835
2683552
1960
๊ฐ๊ฐ์ด 100๋ช…์ด๋‚˜ ๋” ๋งŽ์€ ์‚ฌ๋žŒ๋“ค์„ ํƒœ์šฐ๊ณ  ๊ฐ€๋Š” ๊ฑฐ์ฃ .
44:45
That picture is just completely mind-blowing to me.
836
2685512
5464
๊ทธ๋Ÿฌํ•œ ๋ชจ์Šต์€ ์ €์—๊ฒ ๋งค์šฐ ๋†€๋ผ์šด ์ผ์ด ๋  ํ…๋ฐ์š”.
44:51
That sense of this armada of humans going to --
837
2691393
3378
์ด๋Ÿฌํ•œ ์ธ๊ฐ„ ํ•จ๋Œ€์— ๋Œ€ํ•œ ์ƒ๊ฐ์ด...
44:54
EM: It'll be like "Battlestar Galactica," the fleet departs.
838
2694813
2878
EM: ๊ทธ ํ•จ๋Œ€๋“ค์ด ์ด๋ฅ™ํ•˜๋Š” ๊ฑด ๊ผญ โ€œ๋ฐฐํ‹€์Šคํƒ€ ๊ฐค๋Ÿญํ‹ฐ์นดโ€œ๊ฐ™์„ ๊ฒ๋‹ˆ๋‹ค.
44:57
CA: And you think that it can basically be funded by people
839
2697691
2794
CA: ๊ธฐ๋ณธ์ ์œผ๋กœ ํˆฌ์ž๋ฅผ ๋ฐ›์„ ์ˆ˜ ์žˆ์„ ๊ฑฐ๋ผ ์ƒ๊ฐํ•˜์‹œ๋Š” ๊ฑฐ์ฃ ?
45:00
spending maybe a couple hundred grand on a ticket to Mars?
840
2700485
3254
ํ™”์„ฑ์œผ๋กœ ๊ฐ€๋Š” ์‚ฌ๋žŒ๋“ค์ด ํ‹ฐ์ผ“์— 20๋งŒ ๋‹ฌ๋Ÿฌ ์ •๋„๋ฅผ ์“ฐ๋Š” ๋ฐฉ์‹์œผ๋กœ์š”.
45:03
Is that price about where it has been?
841
2703739
2752
๊ทธ ๊ฐ€๊ฒฉ์€ ์–ด๋””์„œ ๋‚˜์˜จ ๊ฑฐ์ฃ ?
45:07
EM: Well, I think if you say like,
842
2707367
1627
EM: ์ œ ์ƒ๊ฐ์—, ๊ทธ ๋ง์”€์€
45:08
what's required in order to get enough people and enough cargo to Mars
843
2708994
4755
์ถฉ๋ถ„ํ•œ ์‚ฌ๋žŒ๊ณผ ํ™”๋ฌผ์„ ํ™”์„ฑ์— ๊ฐ€์ ธ์˜ค๋ ค๋ฉด ๋ญ๊ฐ€ ํ•„์š”ํ•˜๋ƒ๋Š” ๊ฒƒ ๊ฐ™์€๋ฐ์š”.
45:13
to build a self-sustaining city.
844
2713790
2586
์ž๊ธ‰์ž์กฑํ•˜๋Š” ๋„์‹œ๋ฅผ ๊ฑด์„คํ•˜๊ธฐ ์œ„ํ•ด์„œ ๋ง์ž…๋‹ˆ๋‹ค.
45:17
And it's where you have an intersection
845
2717377
1919
๊ทธ๋ฆฌ๊ณ  ๊ฑฐ๊ธฐ์„œ ๊ต์ง‘ํ•ฉ์„ ์ฐพ์„ ์ˆ˜ ์žˆ๋Š”๋ฐ์š”.
45:19
of sets of people who want to go,
846
2719296
2669
ํ™”์„ฑ์— ๊ฐ€๊ธฐ๋ฅผ ์›ํ•˜๋Š” ์‚ฌ๋žŒ๋“ค์˜ ๊ต์ง‘ํ•ฉ์ด์š”.
45:21
because I think only a small percentage of humanity will want to go,
847
2721965
5047
์™œ๋ƒํ•˜๋ฉด ์˜ค์ง ์ ์€ ์ˆ˜์˜ ์‚ฌ๋žŒ๋“ค๋งŒ ๊ฐ€๊ธฐ๋ฅผ ์›ํ•  ํ…Œ๊ณ ,
45:27
and can afford to go or get sponsorship in some manner.
848
2727012
3754
๋น„์šฉ์„ ์ง€๋ถˆํ•˜๊ฑฐ๋‚˜ ์–ด๋–ค ๋ฐฉ์‹์œผ๋กœ๋“  ํ›„์›์„ ์–ป์„ ์ˆ˜ ์žˆ์„ ํ…Œ๋‹ˆ๊นŒ ๋ง์ž…๋‹ˆ๋‹ค.
45:31
That intersection of sets, I think,
849
2731391
1710
์ œ ์ƒ๊ฐ์— ์ด๋Ÿฌํ•œ ๊ต์ง‘ํ•ฉ์€
45:33
needs to be a million people or something like that.
850
2733101
2461
๋ฐฑ๋งŒ ๋ช… ์ •๋„์˜ ์‚ฌ๋žŒ๋“ค์ด ๋  ๊ฑฐ๊ณ 
45:36
And so itโ€™s what can a million people afford, or get sponsorship for,
851
2736646
3754
๋ฐฑ๋งŒ ๋ช…์˜ ์‚ฌ๋žŒ๋“ค์€ ๋น„์šฉ์„ ์ง€๋ถˆํ•˜๊ฑฐ๋‚˜ ํ›„์›์„ ๋ฐ›์„ ์ˆ˜ ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
45:40
because I think governments will also pay for it,
852
2740442
2377
์™œ๋ƒํ•˜๋ฉด, ์ •๋ถ€ ๋˜ํ•œ ์ง€๋ถˆํ•  ํ…Œ๊ณ 
45:42
and people can take out loans.
853
2742819
3003
์‚ฌ๋žŒ๋“ค์ด ๋Œ€์ถœ์„ ๋ฐ›์„ ์ˆ˜๋„ ์žˆ์„ ํ…Œ๋‹ˆ๊นŒ์š”.
45:45
But I think at the point at which you say, OK, like,
854
2745864
3712
ํ•˜์ง€๋งŒ ํฌ๋ฆฌ์Šค ์”จ๊ป˜์„œ ๋ง์”€ํ•˜์‹  ์ง€์ ์—์„œ,
45:49
if moving to Mars costs are, for argumentโ€™s sake, $100,000,
855
2749618
6381
๋งŒ์•ฝ์— ํ™”์„ฑ์œผ๋กœ ์ด์ฃผํ•˜๋Š” ๋น„์šฉ์ด ๊ฐ€๋ น 10๋งŒ ๋‹ฌ๋Ÿฌ๋ผ๊ณ  ํ•œ๋‹ค๋ฉด
45:56
then I think you know, almost anyone can work and save up
856
2756041
5172
์ œ ์ƒ๊ฐ์— ๊ฑฐ์˜ ๋ชจ๋“  ์ด๋“ค์ด ์ผํ•˜๊ณ  ์ €์ถ•ํ•˜๋ฉด
46:01
and eventually have $100,000 and be able to go to Mars if they want.
857
2761213
4045
๊ฒฐ๊ตญ 10๋งŒ ๋‹ฌ๋Ÿฌ๋ฅผ ๋ชจ์•„ ์›ํ•  ๋•Œ ํ™”์„ฑ์— ๊ฐˆ ์ˆ˜ ์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
46:05
We want to make it available to anyone who wants to go.
858
2765300
2669
์šฐ๋ฆฌ๋Š” ๋ˆ„๊ตฌ๋‚˜ ์›ํ•˜๋ฉด ๊ทธ๊ฒŒ ๊ฐ€๋Šฅํ•˜๊ธฐ๋ฅผ ๋ฐ”๋ž๋‹ˆ๋‹ค.
46:10
It's very important to emphasize that Mars, especially in the beginning,
859
2770263
4171
ํŠนํžˆ ๊ฐ•์กฐ๋˜์–ด์•ผ ํ•  ์ ์€ ํ™”์„ฑ์— ๊ฐ€๋Š” ๊ฒƒ์ด ์ดˆ๋ฐ˜์—๋Š”
46:14
will not be luxurious.
860
2774434
1293
๊ทธ๋ ‡๊ฒŒ ๊ณ ์ƒํ•˜๊ณ  ์šฐ์•„ํ•˜์ง€๋Š” ์•Š์„ ๊ฑฐ๋ผ๋Š” ์ ์ž…๋‹ˆ๋‹ค.
46:15
It will be dangerous, cramped, difficult, hard work.
861
2775727
5672
์œ„ํ—˜ํ•˜๊ณ , ๋น„์ข๊ณ , ์–ด๋ ต๊ณ , ํž˜๋“  ์ผ์ผ ๊ฒ๋‹ˆ๋‹ค.
46:22
It's kind of like that Shackleton ad for going to the Antarctic,
862
2782025
3170
์ƒˆํดํ„ด์ด ๋‚จ๊ทน์— ๊ฐ€์ž๊ณ  ๊ด‘๊ณ ํ•˜๋Š” ๊ฒƒ๊ณผ ๋น„์Šทํ•ด์š”.
46:25
which I think is actually not real, but it sounds real and it's cool.
863
2785237
3336
์ง„์งœ ๊ฐ™์•„ ๋ณด์ด์ง€๋Š” ์•Š์ง€๋งŒ, ์‹ค์ œ์ฒ˜๋Ÿผ ๋“ค๋ฆฌ๊ณ  ๋ฉ‹์ง€์ž–์•„์š”.
46:28
It's sort of like, the sales pitch for going to Mars is,
864
2788865
2878
ํ™”์„ฑ ์—ฌํ–‰ ํ™๋ณด๊ธ€์€ ์ด๋Ÿฐ ์‹์ผ ๊ฒ๋‹ˆ๋‹ค.
46:31
"It's dangerous, it's cramped.
865
2791785
2878
โ€œ์œ„ํ—˜ํ•˜๊ณ , ๋น„์ข๊ณ 
46:35
You might not make it back.
866
2795956
1501
๋Œ์•„์˜ค์ง€ ๋ชปํ•  ์ˆ˜๋„ ์žˆ๋Š”
46:38
It's difficult, it's hard work."
867
2798750
1543
์–ด๋ ต๊ณ , ํž˜๋“  ์—ฌํ–‰์ž…๋‹ˆ๋‹ค.โ€
46:40
That's the sales pitch.
868
2800293
1168
์ด๊ฒŒ ํŒ๋งค ๋ฉ˜ํŠธ์˜ˆ์š”.
46:41
CA: Right.
869
2801503
1168
CA: ๊ทธ๋ ‡๊ตฐ์š”
46:42
But you will make history.
870
2802712
1252
ํ•˜์ง€๋งŒ ์—ญ์‚ฌ์— ๋‚จ๊ฒ ์ฃ .
46:44
EM: But it'll be glorious.
871
2804756
1585
EM: ๋งค์šฐ ์˜๊ด‘์Šค๋Ÿฌ์šด ์ผ์ด๊ฒ ์ฃ .
46:47
CA: So on that kind of launch rate you're talking about over two decades,
872
2807050
3462
CA: ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ๋ง์”€ํ•˜์‹œ๋Š” ๊ทธ๋Ÿฐ ์•ฝ 20๋…„์— ๊ฑธ์นœ ๋ฐœ์‚ฌ ๋น„์šฉ์œผ๋กœ๋Š”,
46:50
you could get your million people to Mars, essentially.
873
2810554
3628
๊ธฐ๋ณธ์ ์œผ๋กœ ํ™”์„ฑ์— ๊ฐˆ ๋ฐฑ๋งŒ ๋ช…์„ ๋ชจ์„ ์ˆ˜ ์žˆ๋‹ค๋Š” ๊ฑฐ๊ตฐ์š”.
46:54
Whose city is it?
874
2814224
1126
๋ˆ„๊ตฌ์˜ ๋„์‹œ๊ฐ€ ๋˜๋Š” ๊ฒ๋‹ˆ๊นŒ?
46:55
Is it NASA's city, is it SpaceX's city?
875
2815392
1918
๋‚˜์‚ฌ? ์ŠคํŽ˜์ด์ŠคX?
46:57
EM: Itโ€™s the people of Marsโ€™ city.
876
2817352
1627
EM: ํ™”์„ฑ ์‚ฌ๋žŒ๋“ค์˜ ๋„์‹œ๊ฐ€ ๋˜๊ฒ ์ฃ .
47:01
The reason for this, I mean, I feel like why do this thing?
877
2821106
4629
์ œ๊ฐ€ ์ด๋Ÿฐ ์ผ์„ ํ•˜๋Š” ์ด์œ ๋Š”
47:05
I think this is important for maximizing
878
2825735
4338
์ธ๋ฅ˜๋‚˜ ์ž์˜์‹์˜ ์ˆ˜๋ช…์„ ๊ทน๋Œ€ํ™” ํ•˜๋Š” ๊ฒŒ
47:10
the probable lifespan of humanity or consciousness.
879
2830115
3044
์ค‘์š”ํ•œ ๊ฒƒ ๊ฐ™๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค.
47:13
Human civilization could come to an end for external reasons,
880
2833201
4004
์ธ๊ฐ„ ๋ฌธ๋ช…์€ ์™ธ๋ถ€์  ์›์ธ์— ์˜ํ•ด ์ข…๋ง์„ ๋งž์ดํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
47:17
like a giant meteor or super volcanoes or extreme climate change.
881
2837247
5172
๊ฑฐ๋Œ€ํ•œ ์šด์„, ๊ฐ•๋ ฅํ•œ ํ™”์‚ฐ ํญ๋ฐœ, ๊ทน๋‹จ์ ์ธ ๊ธฐํ›„๋ณ€ํ™” ๊ฐ™์€ ์›์ธ์œผ๋กœ์š”.
47:24
Or World War III, or you know, any one of a number of reasons.
882
2844045
5756
์•„๋‹ˆ๋ฉด ์„ธ๊ณ„ 3์ฐจ ๋Œ€์ „์ด๋‚˜, ๋ฌด์ˆ˜ํ•œ ์›์ธ์ด ์žˆ์„ ์ˆ˜ ์žˆ์ฃ .
47:32
But the probable life span of civilizational consciousness
883
2852929
2878
ํ•˜์ง€๋งŒ ๋ฌธ๋ช…ํ™”๋œ ์˜์‹์˜ ๊ธฐ๋Œ€ ์ˆ˜๋ช…์€
47:35
as we know it,
884
2855849
1585
์•„์‹œ๋‹ค์‹œํ”ผ
47:37
which we should really view as this very delicate thing,
885
2857475
3129
๋งค์šฐ ์—ฐ์•ฝํ•œ ๊ฒƒ์œผ๋กœ ์—ฌ๊ฒจ์•ผ ํ•  ํ•„์š”๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค.
47:40
like a small candle in a vast darkness.
886
2860645
2711
๊ฑฐ๋Œ€ํ•œ ์–ด๋‘  ์†์˜ ์ž‘์€ ์ด›๋ถˆ์ฒ˜๋Ÿผ์š”.
47:43
That is what appears to be the case.
887
2863356
2962
์ด๊ฒŒ ์‚ฌ์‹ค์ด๊ณ 
47:47
We're in this vast darkness of space,
888
2867777
3379
์šฐ๋ฆฌ๋Š” ์šฐ์ฃผ์˜ ๊ฑฐ๋Œ€ํ•œ ์–ด๋‘  ์†์— ์žˆ๊ณ 
47:51
and there's this little candle of consciousness
889
2871197
3421
์ด ์ž‘์€ ์˜์‹์˜ ์ด›๋ถˆ์€
47:54
thatโ€™s only really come about after 4.5 billion years,
890
2874659
4463
45์–ต ๋…„์ด ํ๋ฅธ ํ›„์—์•ผ ๋‚˜์˜จ ๊ฒƒ์ด์ง€๋งŒ
47:59
and it could just go out.
891
2879164
2002
๊ทธ๋Œ€๋กœ ๊บผ์ ธ ๋ฒ„๋ฆด ์ˆ˜๋„ ์žˆ๋Š” ๊ฑฐ์˜ˆ์š”.
48:01
CA: I think that's powerful,
892
2881166
1376
CA: ๋งค์šฐ ๊ฐ•๋ ฅํ•˜๊ณ 
48:02
and I think a lot of people will be inspired by that vision.
893
2882584
2836
๋งŽ์€ ์ด๋“ค์—๊ฒŒ ์˜๊ฐ์ด ๋  ๋งŒํ•œ ํ†ต์ฐฐ์ด๊ตฐ์š”.
48:05
And the reason you need the million people
894
2885420
2127
๊ทธ๋ฆฌ๊ณ  ๋ฐฑ๋งŒ ๋ช…์˜ ์‚ฌ๋žŒ๋“ค์ด ํ•„์š”ํ•œ ๊ฑด
48:07
is because there has to be enough people there
895
2887547
2169
๊ทธ๊ฒŒ ์ถฉ๋ถ„ํ•œ ์ธ์›์ด๊ธฐ ๋•Œ๋ฌธ์ด๊ณ ์š”.
48:09
to do everything that you need to survive.
896
2889758
2419
ํ™”์„ฑ์—์„œ ์‚ด์•„๋‚จ๊ธฐ ์œ„ํ•œ ๋ชจ๋“  ๊ฒƒ๋“ค์„ ํ•˜๊ธฐ์— ๋ง์ด์ฃ .
48:13
EM: Really, like the critical threshold is if the ships from Earth stop coming
897
2893136
6840
EM: ์ •๋ง ๊ฒฐ์ •์ ์ธ ๊ธฐ์ค€์€, ์ง€๊ตฌ์—์„œ ์˜ค๋Š” ์šฐ์ฃผ์„ ์ด ๋Š๊ฒผ์„ ๋•Œ
48:20
for any reason,
898
2900018
2544
์–ด๋–ค ์ด์œ ๋กœ๋“  ๋ง์ด์ฃ .
48:22
does the Mars City die out or not?
899
2902604
4379
ํ™”์„ฑ์˜ ๋„์‹œ๋Š” ์†Œ๋ฉธํ• ๊นŒ์š”, ๊ดœ์ฐฎ์„๊นŒ์š”?
48:27
And so we have to --
900
2907400
2086
๊ทธ๋ž˜์„œ ์šฐ๋ฆฌ๋Š” --
48:29
You know, people talk about like, the sort of, the great filters,
901
2909527
3129
์•„์‹œ๋‹ค์‹œํ”ผ, ์‚ฌ๋žŒ๋“ค์€ ๋Œ€์—ฌ๊ณผ๊ธฐ ๊ฐ™์€ ์–˜๊ธฐ๋ฅผ ํ•ฉ๋‹ˆ๋‹ค.
48:32
the things that perhaps, you know,
902
2912697
3087
๊ทธ๊ฑด ์•„๋งˆ,
48:35
we talk about the Fermi paradox, and where are the aliens?
903
2915825
2711
ํŽ˜๋ฅด๋ฏธ ์—ญ์„ค์— ๋Œ€ํ•ด ์–˜๊ธฐ๋ฅผ ํ•ด๋ณด๋ฉด, ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์™ธ๊ณ„์ธ์ด ์–ด๋”” ์žˆ๋‚˜์š”?
48:38
Well maybe there are these various great filters
904
2918536
2294
๊ทธ๋ ‡๋‹ค๋ฉด ์•„๋งˆ ์—„์ฒญ๋‚œ ๋Œ€์—ฌ๊ณผ๊ธฐ๊ฐ€ ์กด์žฌํ•˜๊ณ 
48:40
that the aliens didnโ€™t pass,
905
2920872
1418
์™ธ๊ณ„์ธ๋“ค์€ ๊ทธ๊ฑธ ํ†ต๊ณผํ•˜์ง€ ๋ชปํ•˜๋ฏ€๋กœ
48:42
and so they eventually just ceased to exist.
906
2922290
4588
๊ฒฐ๊ณผ์ ์œผ๋กœ ๋ฉธ์ข…ํ•œ๋‹ค๋Š” ๊ฒ๋‹ˆ๋‹ค.
48:46
And one of the great filters is becoming a multi-planet species.
907
2926920
3128
๊ทธ๋ฆฌ๊ณ  ์ด ๋Œ€๊ณผ๊ธฐ์˜ ํ•˜๋‚˜๊ฐ€ ๋‹คํ–‰์„ฑ๊ฐ„ ์ข…์ด ๋˜๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค.
48:50
So we want to pass that filter.
908
2930674
2210
์šฐ๋ฆฌ๋Š” ๋Œ€์—ฌ๊ณผ๊ธฐ๋ฅผ ํ†ต๊ณผํ•˜๊ธฐ๋ฅผ ์›ํ•˜๊ณ 
48:54
And I'll be long-dead before this is, you know, a real thing,
909
2934302
6006
๊ทธ๋ฆฌ๊ณ  ์•„๋งˆ ๊ทธ ์ „์— ๋‚˜๋Š” ์ฃฝ๊ฒ ์ง€๋งŒ์š”
49:00
before it happens.
910
2940350
1251
์ง„์งœ ์ด๋ค„์ง€๊ธฐ ์ „์—์š”.
49:01
But Iโ€™d like to at least see us make great progress in this direction.
911
2941601
5297
ํ•˜์ง€๋งŒ ์ „ ์ตœ์†Œํ•œ ์ธ๋ฅ˜๊ฐ€ ์ด๋Ÿฐ ์‹์œผ๋กœ ์œ„๋Œ€ํ•˜๊ฒŒ ์ง„ํ™”ํ•˜๋Š” ๊ฑธ ๋ณด๊ณ  ์‹ถ์–ด์š”.
49:07
CA: Given how tortured the Earth is right now,
912
2947315
2503
CA: ์ง€๊ธˆ ์ง€๊ตฌ๊ฐ€ ์–ผ๋งˆ๋‚˜ ๊ณ ๋ฌธ ๋ฐ›๊ณ  ์žˆ๋Š”์ง€,
49:09
how much we're beating each other up,
913
2949859
2878
์šฐ๋ฆฌ๊ฐ€ ์„œ๋กœ ์–ผ๋งˆ๋‚˜ ์‹ธ์šฐ๊ณ  ์žˆ๋Š”์ง€ ์ƒ๊ฐํ•ด๋ณด๋ฉด
49:12
shouldn't there be discussions going on
914
2952737
2795
๋…ผ์˜๋„ ํ•„์š”ํ•˜์ง€ ์•Š๊ฒ ์Šต๋‹ˆ๊นŒ?
49:15
with everyone who is dreaming about Mars to try to say,
915
2955573
4171
ํ™”์„ฑ์— ๋Œ€ํ•œ ๊ฟˆ์„ ๊พธ๊ณ  ์žˆ๋Š” ๋ชจ๋“  ์‚ฌ๋žŒ๋“ค์—๊ฒŒ ๋งํ•ด์•ผ์ง€์š”.
49:19
we've got a once in a civilization's chance
916
2959786
5172
์šฐ๋ฆฌ๋Š” ๋ฌธ๋ช…์ ์œผ๋กœ
49:24
to make some new rules here?
917
2964958
2002
์ƒˆ๋กœ์šด ๊ทœ์น™์„ ๋งŒ๋“ค ์ˆ˜ ์žˆ๋Š” ๋‹จ ํ•œ๋ฒˆ์˜ ๊ธฐํšŒ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ๋‹ค๋Š” ๊ฑธ์š”.
49:27
Should someone be trying to lead those discussions
918
2967002
3753
๋ˆ„๊ฐ€ ๊ทธ ๋…ผ์˜๋ฅผ ์ฃผ๋„ํ•ด์•ผ ํ• ๊นŒ์š”?
49:30
to figure out what it means for this to be the people of Mars' City?
919
2970755
3879
์ด๋Ÿฌํ•œ ๋…ผ์˜๊ฐ€ ํ™”์„ฑ์˜ ์‚ฌ๋žŒ๋“ค์—๊ฒŒ ์–ด๋–ค ์˜๋ฏธ์ง€ ์ดํ•ด ์‹œํ‚ค๋ ค๋ฉด ๋ง์ž…๋‹ˆ๋‹ค.
49:35
EM: Well, I think ultimately
920
2975093
1376
EM: ์ œ ์ƒ๊ฐ์—” ๊ถ๊ทน์ ์œผ๋กœ
49:36
this will be up to the people of Mars to decide
921
2976469
2211
ํ™”์„ฑ ์‚ฌ๋žŒ๋“ค์˜ ์„ ํƒ์— ๋‹ฌ๋ ค์žˆ์„ ๊ฒ๋‹ˆ๋‹ค.
49:38
how they want to rethink society.
922
2978722
4045
๊ทธ๋“ค์ด ์‚ฌํšŒ์— ๋Œ€ํ•ด ์–ด๋–ป๊ฒŒ ๋‹ค์‹œ ์ƒ๊ฐํ•˜๊ธฐ๋ฅผ ์›ํ•˜๋Š”์ง€๋Š” ๋ง์ด์ฃ .
49:43
Yeah there's certainly risk there.
923
2983101
1627
๋ถ„๋ช…ํžˆ ๋ฆฌ์Šคํฌ๋“ค์ด ์žˆ์–ด์š”.
49:45
And hopefully the people of Mars will be more enlightened
924
2985395
4630
๊ทธ๋ฆฌ๊ณ  ๋ฐ”๋ผ๊ฑด๋Œ€ ํ™”์„ฑ์˜ ์‚ฌ๋žŒ๋“ค์€ ์ข€ ๋” ๊ณ„๋ชฝ๋˜์–ด ์žˆ๊ณ ,
49:50
and will not fight amongst each other too much.
925
2990066
2711
์„œ๋กœ๊ฐ€ ์‹ฌํ•˜๊ฒŒ ๋‹คํˆฌ์ง€ ์•Š์•˜์œผ๋ฉด ์ข‹๊ฒ ๊ตฐ์š”.
49:54
I mean, I have some recommendations,
926
2994279
1752
์ œ๊ฐ€ ์ถ”์ฒœ ํ•˜์ž๋ฉด์š”,
49:56
which people of Mars may choose to listen to or not.
927
2996031
3962
ํ™”์„ฑ์˜ ์‚ฌ๋žŒ๋“ค์ด ๋“ฃ๊ณ ์ž ํ•  ์ˆ˜๋„ ์žˆ๊ณ  ์•„๋‹ ์ˆ˜๋„ ์žˆ์ง€๋งŒ
50:00
I would advocate for more of a direct democracy,
928
3000035
2794
์ €๋Š” ์ง์ ‘ ๋ฏผ์ฃผ์ฃผ์˜๋ฅผ ๋” ๋งŽ์ด ๋“ค์ด๋Š” ๊ฑธ ์˜นํ˜ธํ•˜๋Š” ํŽธ์ž…๋‹ˆ๋‹ค.
50:02
not a representative democracy,
929
3002829
2211
๊ฐ„์ ‘ ๋ฏผ์ฃผ์ฃผ์˜๊ฐ€ ์•„๋‹ˆ๋ผ์š”.
50:05
and laws that are short enough for people to understand.
930
3005040
2711
๊ทธ๋ฆฌ๊ณ  ์‚ฌ๋žŒ๋“ค์ด ์ดํ•ด ํ•  ๋งŒํผ ์งง์€ ๋ฒ•์ด ์žˆ์–ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.
50:08
Where it is harder to create laws than to get rid of them.
931
3008168
5380
๋ฒ•์„ ์‚ญ์ œํ•˜๋Š” ๊ฒƒ ๋ณด๋‹ค ๋ฒ•์„ ๋งŒ๋“œ๋Š” ๊ฒŒ ๋” ์–ด๋ ค์›Œ์•ผ ํ•˜๊ณ ์š”.
50:14
CA: Coming back a bit nearer term,
932
3014424
1668
CA: ์ข€ ๋” ๊ฐ€๊นŒ์šด ๋ฏธ๋ž˜์˜ ์–˜๊ธฐ๋กœ ๋Œ์•„๊ฐ€ ๋ณด์ž๋ฉด,
50:16
I'd love you to just talk a bit about some of the other possibility space
933
3016134
3462
๋‹ค๋ฅธ ์ž ์žฌ๋ ฅ์„ ์ง€๋‹Œ ์šฐ์ฃผ์— ๋Œ€ํ•ด ๋ง์”€ํ•ด์ฃผ์‹œ๋ฉด ์ข‹๊ฒ ๊ตฐ์š”.
50:19
that Starship seems to have created.
934
3019596
3712
์Šคํƒ€์‰ฝ์ด ๋งŒ๋“ค์–ด ๋‚ผ ์ผ๋“ค์— ๋Œ€ํ•ด ๋ง์ž…๋‹ˆ๋‹ค.
50:23
So given --
935
3023349
1377
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ
50:24
Suddenly we've got this ability to move 100 tons-plus into orbit.
936
3024768
3503
์šฐ๋ฆฌ๋Š” ์˜ˆ์ƒ์น˜ ๋ชปํ•˜๊ฒŒ 100ํ†ค ์ด์ƒ์„ ๊ถค๋„์— ์˜ฌ๋ ค๋†“์„ ์ˆ˜ ์žˆ๊ฒŒ ๋˜์—ˆ๋Š”๋ฐ,
50:29
So we've just launched the James Webb telescope,
937
3029230
3170
์ œ์ž„์Šค์›น๋„ ๋ฐœ์‚ฌ๋๊ณ ์š”.
50:32
which is an incredible thing.
938
3032400
2002
๋ฉ‹์ง„ ๋ฐœ๋ช…ํ’ˆ์ด์ฃ .
50:34
It's unbelievable.
939
3034444
1126
๋งค์šฐ ๋†€๋ž์–ด์š”.
50:35
EM: Exquisite piece of technology.
940
3035612
1793
EM: ๊ธฐ์ˆ ์ด ๋งŒ๋“  ์—ญ์ž‘์ด์ฃ .
50:37
CA: Exquisite piece of technology.
941
3037447
1627
CA: ๊ธฐ์ˆ ์ด ๋งŒ๋“ค์–ด๋‚ธ ์—ญ์ž‘.
50:39
But people spent two years trying to figure out how to fold up this thing.
942
3039115
3504
๊ทธ๋ ‡์ง€๋งŒ ์ œ์ž„์Šค์›น์„ ์–ด๋–ป๊ฒŒ ์ ‘์„์ง€ ์ƒ๊ฐํ•ด๋‚ด๋Š” ๋ฐ๋งŒ ์ด๋…„์ด ๊ฑธ๋ ธ์–ด์š”.
50:42
It's a three-ton telescope.
943
3042660
1335
์ œ์ž„์Šค์›น์€ 3ํ†ค์— ๋ถˆ๊ณผํ•˜๊ณ ์š”.
50:44
EM: We can make it a lot easier if you've got more volume and mass.
944
3044037
3170
EM: ๋ถ€ํ”ผ์™€ ์งˆ๋Ÿ‰์ด ๋” ํด ๋•Œ ๋” ์‰ฝ์Šต๋‹ˆ๋‹ค.
50:47
CA: But let's ask a different question.
945
3047207
1877
CA: ๋‹ค๋ฅธ ์งˆ๋ฌธ์„ ํ•ด๋ด…์‹œ๋‹ค.
50:49
Which is, how much more powerful a telescope could someone design
946
3049084
6756
์˜ˆ๋ฅผ ๋“ค์–ด ๋ง์›๊ฒฝ์€ ์–ผ๋งˆ๋‚˜ ๋” ๊ฐ•๋ ฅํ•  ์ˆ˜ ์žˆ๋‚˜์š”?
50:55
based on using Starship, for example?
947
3055882
2878
์Šคํƒ€์‰ฝ์„ ๊ธฐ๋ฐ˜์œผ๋กœ ํ•œ๋‹ค๋ฉด ๋ง์ž…๋‹ˆ๋‹ค.
50:59
EM: I mean, roughly, I'd say it's probably an order of magnitude more resolution.
948
3059469
4546
EM: ๋Œ€๋žต์ ์œผ๋กœ ์–˜๊ธฐํ•ด๋ณด์ž๋ฉด, ์•„๋งˆ๋„ ํ•ด์ƒ๋„๊ฐ€ ํ›จ์”ฌ ์ข‹์•„์งˆ ๊ฒ๋‹ˆ๋‹ค.
51:04
If you've got 100 tons and a thousand cubic meters volume,
949
3064057
3211
๋ฌด๊ฒŒ๊ฐ€ 100ํ†ค์ด ๋˜๊ณ  ๋ถ€ํ”ผ๊ฐ€ ์ฒœ ์„ธ์ œ๊ณฑ๋ฏธํ„ฐ์— ์ด๋ฅธ๋‹ค๋ฉด
51:07
which is roughly what we have.
950
3067268
1585
์šฐ๋ฆฌ๊ฐ€ ํ•  ์ˆ˜ ์žˆ๋Š” ๊ฒŒ ๋Œ€๋žต ๊ทธ ์ •๋„์ž…๋‹ˆ๋‹ค.
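A rough aside on why extra mass and volume buy resolving power (an illustrative sketch, not something said in the interview): a telescope's angular resolution is diffraction-limited by its aperture, so at a fixed observing wavelength it improves in direct proportion to the mirror diameter, per the Rayleigh criterion:
$\theta \approx 1.22\,\dfrac{\lambda}{D}$
So a mirror k times wider than JWST's 6.5 m would resolve roughly k times finer detail, under the assumption that the extra ~100 t and ~1,000 mยณ are spent on a larger (possibly segmented) primary mirror.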
51:08
CA: And what about other exploration through the solar system?
951
3068895
3545
CA: ๊ทธ๋ ‡๋‹ค๋ฉด ํƒœ์–‘๊ณ„ ๋‹ค๋ฅธ ๊ณณ์„ ํƒ์‚ฌํ•˜๋Š” ์ผ์€ ์–ด๋–ป๊ฒŒ ๋ ๊นŒ์š”?
51:12
I mean, I'm you know --
952
3072482
2169
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ --
51:14
EM: Europa is a big question mark.
953
3074692
2670
EM: ์œ ๋กœํŒŒ๊ฐ€ ๊ฝค๋‚˜ ๊ถ๊ธˆํ•œ ๊ณณ์ด์ฃ .
51:17
CA: Right, so there's an ocean there.
954
3077403
1794
CA: ๊ทธ๋ ‡์ฃ . ๊ฑฐ๊ธฐ์—” ๋ฐ”๋‹ค๊ฐ€ ์žˆ๊ณ ,
51:19
And what you really want to do is to drop a submarine into that ocean.
955
3079239
3295
๋‹น์‹ ์€ ๊ทธ ๋ฐ”๋‹ค์— ์ž ์ˆ˜ํ•จ์„ ๋„ฃ์–ด๋ณด๊ธธ ์›ํ•˜์ž–์•„์š”.
51:22
EM: Maybe there's like, some squid civilization,
956
3082534
2294
EM: ์•„๋งˆ ๊ฑฐ๊ธฐ์—” ์˜ค์ง•์–ด๋‚˜ ๋‘์กฑ๋ฅ˜๋“ค์˜
51:24
cephalopod civilization under the ice of Europa.
957
3084869
3212
๋ฌธ๋ช… ๊ฐ™์€ ๊ฒŒ ์žˆ์„์ง€ ๋ชจ๋ฅด์ฃ . ์œ ๋กœํŒŒ์˜ ์–ผ์Œ ๋ฐ‘์—์š”.
51:28
That would be pretty interesting.
958
3088123
1626
๊ฝค ํฅ๋ฏธ๋กญ์ฃ .
51:29
CA: I mean, Elon, if you could take a submarine to Europa
959
3089749
2711
CA: ๋จธ์Šคํฌ ์”จ๊ฐ€ ์œ ๋กœํŒŒ์— ์ž ์ˆ˜ํ•จ์„ ๊ฐ€์ ธ๊ฐˆ ์ˆ˜ ์žˆ๊ณ 
51:32
and we see pictures of this thing being devoured by a squid,
960
3092460
3629
์˜ค์ง•์–ด์—๊ฒŒ ์žก์•„๋จนํžˆ๋Š” ๋ชจ์Šต์„ ์šฐ๋ฆฌ๊ฐ€ ๋ณด๊ฒŒ ๋œ๋‹ค๋ฉด,
51:36
that would honestly be the happiest moment of my life.
961
3096131
2544
์ œ ์ƒ์—์„œ ๊ฐ€์žฅ ์ฆ๊ฑฐ์šด ์ผ์ด ๋  ๋“ฏํ•˜๋„ค์š”.
51:38
EM: Pretty wild, yeah.
962
3098716
1377
EM: ๊ฝค ๊ฑฐ์นœ๋ฐ์š”.
51:40
CA: What other possibilities are out there?
963
3100426
2795
CA: ๋‹ค๋ฅธ ๊ฐ€๋Šฅ์„ฑ๋“ค์€ ๋ญ๊ฐ€ ์žˆ์„๊นŒ์š”?
51:43
Like, it feels like if you're going to create a thousand of these things,
964
3103263
4379
์˜ˆ์ปจ๋Œ€, ๋งŒ์•ฝ์— ์ด๋Ÿฐ ์šฐ์ฃผ์„  ์ฒœ ๋Œ€๋ฅผ ๋งŒ๋“ค๊ณ 
51:47
they can only fly to Mars every two years.
965
3107642
3212
๊ฒฉ๋…„์œผ๋กœ๋งŒ ํ™”์„ฑ์— ๊ฐˆ ์ˆ˜ ์žˆ๋‹ค๋ฉด
51:50
What are they doing the rest of the time?
966
3110895
2169
๊ทธ ์šฐ์ฃผ์„ ๋“ค์€ ๋‚จ๋Š” ์‹œ๊ฐ„์—” ๋ญ˜ ํ•˜์ฃ ?
51:53
It feels like there's this explosion of possibility
967
3113106
4671
๊ฐ€๋Šฅํ•œ ๊ฒƒ๋“ค์ด ๋„˜์ณ๋‚  ๊ฒƒ ๊ฐ™์•„์š”.
51:57
that I don't think people are really thinking about.
968
3117777
2461
์‚ฌ๋žŒ๋“ค์ด ์—ฌ๊ธฐ์— ๋Œ€ํ•ด์„œ๋Š” ์ž˜ ์ƒ๊ฐ์ง€ ์•Š๋Š” ๊ฒƒ ๊ฐ™์•„์„œ์š”.
52:00
EM: I don't know, we've certainly got a long way to go.
969
3120238
2628
EM: ์ž˜ ๋ชจ๋ฅด๊ฒ ์–ด์š”. ์ €ํฌ๋Š” ๊ฐˆ ๊ธธ์ด ๋ฉ€๊ณ ,
52:02
As you alluded to earlier, we still have to get to orbit.
970
3122866
2752
์•„๊นŒ ์–ธ๊ธ‰ํ•ด ์ฃผ์…จ๋“ฏ, ์šฐ๋ฆฌ๋Š” ์šฐ์„  ๊ถค๋„์— ์˜ฌ๋ผ๊ฐ€์•ผ ํ•˜๊ณ ์š”.
52:05
And then after we get to orbit,
971
3125618
1752
๊ถค๋„์— ์˜ค๋ฅธ๋‹ค๋ฉด,
52:07
we have to really prove out and refine full and rapid reusability.
972
3127412
6006
์™„์ „ํ•˜๊ณ  ์‹ ์†ํ•œ ์žฌ์‚ฌ์šฉ์„ฑ์„ ์ œ๋Œ€๋กœ ์ž…์ฆํ•˜๊ณ  ๋‹ค๋“ฌ์–ด ๋‚˜๊ฐˆ ํ•„์š”๊ฐ€ ์žˆ๊ณ ์š”.
52:14
That'll take a moment.
973
3134085
1210
๊ฑฐ๊ธฐ๊นŒ์ง€๋„ ์‹œ๊ฐ„์ด ๊ฑธ๋ฆด ๊ฒ๋‹ˆ๋‹ค.
52:19
But I do think we will solve this.
974
3139090
1752
ํ•˜์ง€๋งŒ ์—ฌ์ „ํžˆ ํ•  ์ˆ˜ ์žˆ์œผ๋ฆฌ๋ผ ๋ฏฟ๊ณ ์š”.
52:22
I'm highly confident we will solve this at this point.
975
3142969
2711
์ €๋Š” ์šฐ๋ฆฌ๊ฐ€ ํ•ด๊ฒฐํ•  ์ˆ˜ ์žˆ๋‹ค๊ณ  ํ™•์‹ ํ•ด์š”.
52:26
CA: Do you ever wake up with the fear
976
3146014
1793
CA: ์ด๋Ÿฐ ๋‘๋ ค์›€ ๋•Œ๋ฌธ์— ์ž ์—์„œ ๊นฌ ์ ์ด ์žˆ๋‚˜์š”?
52:27
that there's going to be this Hindenburg moment for SpaceX where ...
977
3147849
3462
์ŠคํŽ˜์ด์Šค X์— ํžŒ๋ด๋ถ€๋ฅดํฌ ๋Œ€์ฐธ์‚ฌ ๊ฐ™์€ ์ผ์ด ์žˆ์œผ๋ฉด ์–ด์ฉŒ์ง€ ํ•˜๋ฉด์„œ์š”.
52:31
EM: We've had many Hindenburg.
978
3151811
1460
EM: ์ด๋ฏธ ๊ทธ๋Ÿฐ ์ผ์ด ๋งŽ์•˜์–ด์š”.
52:33
Well, we've never had Hindenburg moments with people, which is very important.
979
3153313
3670
๋ฌผ๋ก , ์ธ๋ช… ์‚ฌ๊ณ ๋Š” ์ „ํ˜€ ์—†์—ˆ์ง€๋งŒ์š”, ์ด๊ฑด ๋งค์šฐ ์ค‘์š”ํ•˜๊ณ 
52:37
Big difference.
980
3157025
1251
ํฐ ์ฐจ์ด๊ฐ€ ์žˆ๋Š”๊ฑฐ๋‹ˆ๊นŒ์š”.
52:38
We've blown up quite a few rockets.
981
3158776
1710
ํญํŒŒํ•ด๋ฒ„๋ฆฐ ๋กœ์ผ“์ด ๊ฝค ์žˆ์—ˆ์ง€๋งŒ์š”.
52:40
So there's a whole compilation online that we put together
982
3160486
3504
์˜จ๋ผ์ธ์—๋Š” ์šฐ๋ฆฌ์˜ ๊ทธ๋Ÿฐ ์ผ๋“ค๊ณผ ๋‹ค๋ฅธ ๋ฐ์„œ ๋ฒŒ์–ด์ง„ ๊ฒƒ๋“ค์˜
52:44
and others put together,
983
3164032
1710
๋ชจ์Œ์ง‘์ด ์žˆ๋Š”๋ฐ
52:45
it's showing rockets are hard.
984
3165742
1960
๋กœ์ผ“์„ ๋งŒ๋“ ๋‹ค๋Š” ๊ฒŒ ์–ด๋ ต๋‹ค๋Š” ๊ฑธ ๋ณด์—ฌ์ฃผ์ฃ .
52:47
I mean, the sheer amount of energy going through a rocket boggles the mind.
985
3167744
3795
๋กœ์ผ“์— ๋“ค์—ฌ์•ผ ํ•˜๋Š” ์ˆœ์ˆ˜ํ•œ ์—๋„ˆ์ง€์˜ ์–‘์€ ์••๋„์ ์ด์—์š”.
52:51
So, you know, getting out of Earth's gravity well is difficult.
986
3171581
3503
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ์•„์‹œ๋‹ค์‹œํ”ผ ์ง€๊ตฌ ์ค‘๋ ฅ์„ ํƒˆ์ถœํ•˜๋Š” ์ผ์€ ๊ฝค ์–ด๋ ต์Šต๋‹ˆ๋‹ค.
52:55
We have a strong gravity and a thick atmosphere.
987
3175126
2377
์ง€๊ตฌ ์ค‘๋ ฅ์€ ๊ฐ•ํ•˜๊ณ , ๋Œ€๊ธฐ๋Š” ๋‘ํ…์Šต๋‹ˆ๋‹ค.
52:59
And Mars, which is less than 40 percent,
988
3179422
3921
ํ™”์„ฑ์˜ ๊ฒฝ์šฐ๋Š” ์—ฌ๊ธฐ์˜ 40ํผ์„ผํŠธ๋„ ์ฑ„ ์•ˆ๋ฉ๋‹ˆ๋‹ค.
53:03
it's like, 37 percent of Earth's gravity
989
3183426
2711
์ง€๊ตฌ ์ค‘๋ ฅ์˜ 37ํผ์„ผํŠธ ์ •๋„์ง€์š”.
53:06
and has a thin atmosphere.
990
3186179
1626
๋Œ€๊ธฐ์ธต๋„ ํ›จ์”ฌ ์–•๊ณ ์š”.
53:08
The ship alone can go all the way
991
3188097
2002
์šฐ์ฃผ์„ ์ด ํ™€๋กœ
53:10
from the surface of Mars to the surface of Earth.
992
3190141
2419
ํ™”์„ฑ ํ‘œ๋ฉด์—์„œ ์ง€๊ตฌ ํ‘œ๋ฉด๊นŒ์ง€ ๊ฐˆ ์ˆ˜๋Š” ์žˆ์ง€๋งŒ,
53:12
Whereas getting to Mars requires a giant booster and orbital refilling.
993
3192602
4755
๋ฐ˜๋ฉด์— ํ™”์„ฑ์œผ๋กœ ๊ฐ€๊ธฐ ์œ„ํ•ด์„œ๋Š” ํฐ ๋กœ์ผ“๊ณผ ๊ถค๋„ ๊ธ‰์œ ๊ฐ€ ํ•„์š”ํ•˜๊ณ ์š”.
53:17
CA: So, Elon, as I think more about this incredible array of things
994
3197774
4796
CA: ์ €๋Š” ์ด ๋†€๋ผ์šด ๊ฒƒ๋“ค์— ๋Œ€ํ•ด ์ƒ๊ฐํ•  ์ˆ˜๋ก ๋ง์ด์ฃ .
53:22
that you're involved with,
995
3202612
1835
์ผ๋ก ์ด ์ฐธ์—ฌํ•˜๊ณ  ์žˆ๋Š” ์ด๋Ÿฌํ•œ ๊ฒƒ๋“ค์—์„œ
53:24
I keep seeing these synergies,
996
3204489
4296
์‹œ๋„ˆ์ง€๊ฐ€ ๋ณด์ด๋Š”๋ฐ
53:28
to use a horrible word,
997
3208826
1877
๋”์ฐํ•œ ๋‹จ์–ด๋ฅผ ์จ๋ณด์ž๋ฉด ๋ง์ž…๋‹ˆ๋‹ค.
53:30
between them.
998
3210745
1168
๊ทธ๋“ค ์‚ฌ์ด์—
53:31
You know, for example,
999
3211955
1167
์˜ˆ์ปจ๋Œ€,
53:33
the robots you're building from Tesla could possibly be pretty handy on Mars,
1000
3213122
5756
ํ…Œ์Šฌ๋ผ์—์„œ ๊ฐœ๋ฐœ์ค‘์ธ ๋กœ๋ด‡์€ ์•„๋งˆ ํ™”์„ฑ์—์„œ ์œ ์šฉํ•˜๊ฒ ์ฃ .
53:38
doing some of the dangerous work and so forth.
1001
3218920
2169
์œ„ํ—˜ํ•œ ์ผ ๊ฐ™์€ ๊ฑธ ํ•˜๋‹ˆ๊นŒ์š”.
53:41
I mean, maybe there's a scenario where your city on Mars
1002
3221089
2669
์ œ ๋ง์€, ์•„๋งˆ ์ด๋Ÿฐ ์‹œ๋‚˜๋ฆฌ์˜ค๊ฐ€ ์žˆ์ง€ ์•Š์„๊นŒ์š”?
53:43
doesn't need a million people,
1003
3223758
1460
ํ™”์„ฑ์˜ ๋„์‹œ๊ฐ€ ๋ฐฑ๋งŒ์˜ ์‚ฌ๋žŒ๋“ค์€ ํ•„์š”๋กœ ํ•˜์ง€ ์•Š๋Š” ๊ฑฐ์ฃ .
53:45
it needs half a million people and half a million robots.
1004
3225218
2711
๊ทธ ์ ˆ๋ฐ˜๋งŒ ์žˆ์œผ๋ฉด ์ ˆ๋ฐ˜์€ ๋กœ๋ด‡์œผ๋กœ ๋Œ€์ฒดํ•˜๋Š” ๊ฑฐ์˜ˆ์š”.
53:47
And that's a possibility.
1005
3227971
1835
๊ทธ๋Ÿด ์ˆ˜๋„ ์žˆ๋Š”๋ฐ,
53:49
Maybe The Boring Company could play a role
1006
3229847
2211
๋ณด๋ง ์ปดํผ๋‹ˆ๊ฐ€
53:52
helping create some of the subterranean dwelling spaces that you might need.
1007
3232100
5380
ํ•„์š”ํ•œ ์ง€ํ•˜ ์ฃผ๊ฑฐ ๊ณต๊ฐ„์„ ๋งŒ๋“œ๋Š” ์—ญํ• ์„ ํ•˜๋Š” ๊ฒ๋‹ˆ๋‹ค.
53:57
EM: Yeah.
1008
3237480
1210
EM: ๋„ค.
53:58
CA: Back on planet Earth,
1009
3238982
1501
CA: ์ง€๊ตฌ ์–˜๊ธฐ๋กœ ๋Œ์•„์™€ ๋ณด์ž๋ฉด
54:00
it seems like a partnership between Boring Company and Tesla
1010
3240525
3211
๋ณด๋ง ์ปดํผ๋‹ˆ์™€ ํ…Œ์Šฌ๋ผ๋Š” ๋™์—… ๊ด€๊ณ„๋กœ ๋ณด์ด๊ณ 
54:03
could offer an unbelievable deal to a city
1011
3243778
3879
๋„์‹œ์— ์—„์ฒญ๋‚œ ๊ฑฐ๋ž˜๋“ค์„ ์ œ๊ณตํ•  ์ˆ˜ ์žˆ์„ ๋“ฏํ•œ๋ฐ
54:07
to say, we will create for you a 3D network of tunnels
1012
3247699
4504
๋งํ•ด๋ณด์ž๋ฉด 3D ๋„คํŠธ์›Œํฌ์˜ ํ„ฐ๋„์ด ์ง€์–ด์ง„๋‹ค๋ฉด
54:12
populated by robo-taxis
1013
3252245
2252
๊ฑฐ๊ธด ๋กœ๋ด‡ ํƒ์‹œ๋กœ ์ฑ„์›Œ์งˆ ๊ฑฐ๊ณ 
54:14
that will offer fast, low-cost transport to anyone.
1014
3254539
4254
๋น ๋ฅด๊ณ , ๋น„์šฉ์€ ์ ˆ๊ฐ๋˜๊ณ  ๋ˆ„๊ตฌ๋‚˜ ์ˆ˜์†กํ•  ์ˆ˜ ์žˆ๊ฒ ์ง€์š”.
54:18
You know, full self-driving may or may not be done this year.
1015
3258835
2878
์™„์ „ ์ž์œจ ์ฃผํ–‰์€ ์˜ฌํ•ด ์•ˆ์— ์ด๋ค„์งˆ์ง€ ๋ชจ๋ฅด์ง€๋งŒ ๋ง์ž…๋‹ˆ๋‹ค.
54:21
And in some cities, like, somewhere like Mumbai,
1016
3261713
2794
์–ด๋–ค ๋„์‹œ๋“ค, ์˜ˆ์ปจ๋Œ€ ๋ญ„๋ฐ”์ด ๊ฐ™์€ ๊ณณ์—์„ 
54:24
I suspect won't be done for a decade.
1017
3264549
2043
์‹ญ ๋…„ ๋‚ด๋กœ ๊ทธ๋Ÿฐ ์ผ์ด ์ผ์–ด๋‚˜๊ธฐ๋Š” ์–ด๋ ต๊ฒ ์ง€๋งŒ์š”.
54:26
EM: Some places are more challenging than others.
1018
3266634
2294
EM: ์–ด๋–ค ๊ณณ๋“ค์€ ๋‹ค๋ฅธ ๊ณณ๋“ค์— ๋น„ํ•ด์„œ ํ›จ์”ฌ ๊นŒ๋‹ค๋กญ์Šต๋‹ˆ๋‹ค.
54:28
CA: But today, today, with what you've got,
1019
3268928
2378
CA: ํ•˜์ง€๋งŒ ์˜ค๋Š˜, ๋‹น์‹ ์ด ํ•ด๋‚ธ ์ผ๋“ค๋กœ
54:31
you could put a 3D network of tunnels under there.
1020
3271306
4254
์ง€ํ•˜์— 3D ๋„คํŠธ์›Œํฌ์˜ ํ„ฐ๋„์„ ๋งŒ๋“ค ์ˆ˜ ์žˆ์ฃ .
54:35
EM: Oh, if it's just in a tunnel, that's a solved problem.
1021
3275601
2795
EM: ๋ฌผ๋ก  ๋‹จ์ˆœํ•œ ํ„ฐ๋„ ๋‚ด ์ฃผํ–‰์ด๋ผ๋ฉด ๋ฌธ์ œ๊ฐ€ ์—†์ง€์š”.
54:38
CA: Exactly, full self-driving is a solved problem.
1022
3278438
2502
CA: ๊ทธ๋ ‡์ง€์š”. ์™„์ „ ์ž์œจ ์ฃผํ–‰์€ ํ•ด๊ฒฐ๋œ ๋ฌธ์ œ๊ฐ€ ๋ฉ๋‹ˆ๋‹ค.
54:40
To me, there's amazing synergy there.
1023
3280982
3045
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ์ €๋Š” ๋†€๋ผ์šด ์‹œ๋„ˆ์ง€๊ฐ€ ์žˆ๋Š” ๋“ฏํ•ด์š”.
54:44
With Starship,
1024
3284068
1752
์Šคํƒ€์‰ฝ์—๋Š”์š”.
54:45
you know, Gwynne Shotwell talked about by 2028 having from city to city,
1025
3285820
5339
๊ทธ์œˆ ์‡ผํŠธ์›ฐ์ด ์ด๋Ÿฐ ์–˜๊ธฐ๋ฅผ ํ–ˆ๋Š”๋ฐ, 2028๋…„๊นŒ์ง€ ๋„์‹œ์—์„œ ๋„์‹œ๋กœ
54:51
you know, transport on planet Earth.
1026
3291200
1752
์ˆ˜์†กํ•˜๋Š” ๊ฑธ ์ง€๊ตฌ์—์„œ ๊ฐ€๋Šฅ์ผ€ ํ•  ๊ฑฐ๋ผ๊ณ  ๋ง์ด์ฃ .
54:52
EM: This is a real possibility.
1027
3292952
1627
EM: ๊ทธ๊ฑด ์ง„์งœ ๊ฐ€๋Šฅํ•ด์š”.
54:57
The fastest way to get from one place to another,
1028
3297290
2961
ํ•œ ์žฅ์†Œ์—์„œ ๋‹ค๋ฅธ ์žฅ์†Œ๋กœ ๊ฐ€๋Š” ๊ฐ€์žฅ ๋น ๋ฅธ ๋ฐฉ๋ฒ•์€,
55:00
if it's a long distance, is a rocket.
1029
3300293
1793
ํŠนํžˆ ์žฅ๊ฑฐ๋ฆฌ๋ผ๋ฉด, ๋‹ต์€ ๋กœ์ผ“์ด์ฃ .
55:03
It's basically an ICBM.
1030
3303087
1377
์ด๊ฑด ๊ธฐ๋ณธ์ ์œผ๋กœ๋Š” ICBM์ด์—์š”.
55:05
CA: But it has to land --
1031
3305673
1335
CA: ํ•˜์ง€๋งŒ ์ฐฉ๋ฅ™์€...
55:07
Because it's an ICBM, it has to land probably offshore,
1032
3307675
3879
์ด๊ฑด ICBM์ด๋‹ˆ๊นŒ, ์—ฐ์•ˆ์— ์ฐฉ๋ฅ™ํ•ด์•ผ๋งŒ ํ•˜์ฃ .
55:11
because it's loud.
1033
3311596
1168
์†Œ์Œ์ด ๋งค์šฐ ํด ํ…Œ๋‹ˆ๊นŒ์š”.
55:12
So why not have a tunnel that then connects to the city with Tesla?
1034
3312805
6632
๊ทธ๋Ÿฌ๋‹ˆ ํ…Œ์Šฌ๋ผ๋กœ ๋„์‹œ๋ฅผ ์—ฐ๊ฒฐํ•˜๊ธฐ๋ณด๋‹ค๋Š” ํ„ฐ๋„์„ ์ง“๋Š” ๊ฒŒ ๋‚ซ์ง€ ์•Š๋‚˜์š”?
55:20
And Neuralink.
1035
3320897
1126
๊ทธ๋ฆฌ๊ณ , ๋‰ด๋Ÿด๋งํฌ์˜ ๊ฒฝ์šฐ
55:22
I mean, if you going to go to Mars
1036
3322065
1626
ํ™”์„ฑ์— ๊ฐ€๊ฒŒ ๋œ๋‹ค๋ฉด,
55:23
having a telepathic connection with loved ones back home,
1037
3323733
2878
๊ณ ํ–ฅ์˜ ์‚ฌ๋ž‘ํ•˜๋Š” ์ด๋“ค๊ณผ ํ…”๋ ˆํŒŒ์‹œ๋กœ ์—ฐ๊ฒฐ๋  ์ˆ˜ ์žˆ์„ ๊ฑฐ๊ณ ์š”.
55:26
even if there's a time delay...
1038
3326611
1501
์‹œ๊ฐ„์—์„œ๋Š” ์ง€์—ฐ์ด ๋ฐœ์ƒํ•ด๋„ ๋ง์ž…๋‹ˆ๋‹ค.
55:29
EM: These are not intended to be connected, by the way.
1039
3329238
4088
EM: ์„œ๋กœ ์—ฐ๊ฒฐ๋˜๋Š” ๊ฑด ์˜๋„ํ•œ ๋ฐ”๋Š” ์•„๋‹ˆ์ง€๋งŒ
55:33
But there certainly could be some synergies, yeah.
1040
3333326
2419
์‹œ๋„ˆ์ง€๋Š” ํ™•์‹คํžˆ ์žˆ์„ ๊ฑฐ์˜ˆ์š”.
55:35
CA: Surely there is a growing argument
1041
3335787
2294
CA: ๋ถ„๋ช… ๋…ผ์˜๋“ค์ด ์ปค์ง€๊ณ  ์žˆ๊ณ 
55:38
that you should actually put all these things together
1042
3338081
2711
๋‹น์‹ ์€ ์ด ๋ชจ๋“  ๊ฒƒ๋“ค์„ ํ•œ ๋ฐ ๋ชจ์•„
55:40
into one company
1043
3340792
2127
ํ•˜๋‚˜์˜ ํšŒ์‚ฌ๋กœ ํ•ฉ์ณ์•ผ ํ•œ๋‹ค๋Š” ๊ฑฐ์ฃ .
55:42
and just have a company devoted to creating a future
1044
3342960
4296
๊ธฐ๋Œ€ํ•  ๋งŒํ•œ ๋ฏธ๋ž˜๋ฅผ ๋งŒ๋“œ๋Š” ์ผ์— ์ „๋…ํ•˜๋Š” ํšŒ์‚ฌ๋ฅผ ๋‘๊ณ 
55:47
that's exciting,
1045
3347298
1460
๋งค์šฐ ํฅ๋ฏธ๋กญ๊ณ 
55:48
and let a thousand flowers bloom.
1046
3348758
1627
์—ฌ๋Ÿฌ ์ผ๋“ค์ด ๊ฐ์ถ•์ „์„ ๋ฒŒ์ด๋Š” ๊ฑฐ์ฃ .
55:50
Have you been thinking about that?
1047
3350426
1627
์ด๋Ÿฐ ์ƒ๊ฐ ํ•ด๋ณธ ์  ์žˆ์œผ์‹ค๊นŒ์š”?
55:53
EM: I mean, it is tricky because Tesla is a publicly-traded company,
1048
3353429
3253
EM: ๊นŒ๋‹ค๋กœ์šด ์–˜๊ธฐ์ธ๋ฐ, ํ…Œ์Šฌ๋ผ๋Š” ์ƒ์žฅ ๊ธฐ์—…์ด๊ณ 
55:56
and the investor base of Tesla and SpaceX
1049
3356724
5255
ํ…Œ์Šฌ๋ผ์™€ ์ŠคํŽ˜์ด์Šค X์˜ ํˆฌ์ž์ž ๊ธฐ๋ฐ˜์€
56:02
and certainly Boring Company and Neuralink are quite different.
1050
3362021
3379
๋ณด๋ง ์ปดํผ๋‹ˆ์™€ ๋‰ด๋Ÿด๋งํฌ์˜ ๊ฒƒ๊ณผ๋Š” ๋‹ค๋ฆ…๋‹ˆ๋‹ค.
56:05
Boring Company and Neuralink are tiny companies.
1051
3365441
2503
๋ณด๋ง ์ปดํผ๋‹ˆ์™€ ๋‰ด๋Ÿด๋งํฌ๋Š” ์ž‘์€ ํšŒ์‚ฌ์˜ˆ์š”.
56:08
CA: By comparison.
1052
3368569
1418
CA: ์ƒ๋Œ€์ ์œผ๋กœ๋Š”์š”.
56:10
EM: Yeah, Tesla's got 110,000 people.
1053
3370321
3629
EM: ๋„ค, ํ…Œ์Šฌ๋ผ๋Š” ์ง์›์ด 11๋งŒ ๋ช…์ด๊ณ ,
56:14
SpaceX I think is around 12,000 people.
1054
3374283
2503
์ŠคํŽ˜์ด์ŠคX๋Š” ์•„๋งˆ 1๋งŒ 2์ฒœ๋ช… ์ •๋„์ผ ๊ฑฐ์˜ˆ์š”.
56:17
Boring Company and Neuralink are both under 200 people.
1055
3377161
4546
๋ณด๋ง ์ปดํผ๋‹ˆ์™€ ๋‰ด๋Ÿด๋งํฌ๋Š” 200๋ช… ์ดํ•˜์ž…๋‹ˆ๋‹ค.
56:21
So they're little, tiny companies,
1056
3381749
3170
๊ทธ๋Ÿฌ๋‹ˆ ๋งค์šฐ ์ž‘์€ ํšŒ์‚ฌ๋“ค์ด์ฃ .
56:24
but they will probably get bigger in the future.
1057
3384961
2252
๋ฏธ๋ž˜์—๋Š” ๋ฉ์น˜๊ฐ€ ์ปค์ง€๊ฒ ์ง€๋งŒ์š”.
56:27
They will get bigger in the future.
1058
3387213
1710
๋ฏธ๋ž˜์—๋Š” ๋” ์ปค์งˆ ๊ฒ๋‹ˆ๋‹ค.
56:29
It's not that easy to sort of combine these things.
1059
3389632
2544
๊ทธ๋Ÿฐ ๊ฒƒ๋“ค์˜ ๊ฒฐํ•ฉ์€ ์‰ฌ์šด ์ผ์€ ์•„๋‹™๋‹ˆ๋‹ค.
56:33
CA: Traditionally, you have said that for SpaceX especially,
1060
3393761
2878
CA: ์ „ํ†ต์ ์œผ๋กœ, ๋จธ์Šคํฌ ์”จ๊ป˜์„œ๋Š” ํŠนํžˆ ์ŠคํŽ˜์ด์ŠคX ๊ด€๋ จํ•ด์„œ
56:36
you wouldn't want it public,
1061
3396639
1418
๊ธฐ์—… ๊ณต๊ฐœ๋ฅผ ์›ํ•˜์ง€ ์•Š๋Š”๋‹ค๊ณ  ํ•ด์™”์ง€ ์•Š์Šต๋‹ˆ๊นŒ?
56:38
because public investors wouldn't support the craziness of the idea
1062
3398057
4296
์™œ๋ƒํ•˜๋ฉด ๊ณต๊ฐœ ํˆฌ์ž์ž๋“ค์€ ๊ณผ๊ฒฉํ•œ ์ƒ๊ฐ๋“ค์„ ์ง€์ง€ํ•˜์ง€ ์•Š์œผ๋‹ˆ๊นŒ์š”.
56:42
of going to Mars or whatever.
1063
3402395
1418
ํ™”์„ฑ์— ๊ฐ€๋Š” ์ผ์ด๋‚˜ ๋ญ ๊ทธ๋Ÿฐ ๊ฒƒ๋“ค์š”.
56:43
EM: Yeah, making life multi-planetary
1064
3403813
2044
EM: ๋„ค, ์ƒ๋ช…์„ ๋‹คํ–‰์„ฑ ์ข…์œผ๋กœ ๋งŒ๋“œ๋Š” ์ผ์€
56:45
is outside of the normal time horizon of Wall Street analysts.
1065
3405898
5881
์›”์ŠคํŠธ๋ฆฌํŠธ ๋ถ„์„๊ฐ€๋“ค์ด ๋ณดํ†ต ๋‚ด๋‹ค๋ณด๋Š” ์‹œ๊ฐ„ ๋ฒ”์œ„๋ฅผ ๋ฒ—์–ด๋‚˜๋Š” ๊ฑฐ๋‹ˆ๊นŒ์š”.
56:51
(Laughs)
1066
3411779
1001
(์›ƒ์Œ)
56:52
To say the least.
1067
3412864
1209
์•„๋ฌด๋ฆฌ ์ ๊ฒŒ ์žก์•„๋„ ๊ทธ๋ ‡์ฃ .
56:54
CA: I think something's changed, though.
1068
3414073
2586
CA: ๊ทธ๋Ÿผ์—๋„ ์ œ๊ฐ€ ๋ณด๊ธฐ์—” ๋ณ€ํ™”๊ฐ€ ์žˆ์—ˆ๋˜ ๋“ฏํ•œ๋ฐ,
56:56
What's changed is that Tesla is now so powerful and so big
1069
3416701
2753
๋‹ฌ๋ผ์ง„ ๊ฑด, ํ…Œ์Šฌ๋ผ๋Š” ์ด์ œ ๋งค์šฐ ๊ฐ•๋ ฅํ•˜๊ณ  ํฌ๊ณ 
56:59
and throws off so much cash
1070
3419495
2461
๊ทธ๋ฆฌ๊ณ  ๋งŽ์€ ๋ˆ์„ ์Ÿ์•„๋ถ€์–ด์„œ
57:01
that you actually could connect the dots here.
1071
3421998
3503
์ด๋Ÿฌํ•œ ์ ๋“ค์„ ์—ฐ๊ฒฐํ•  ์ˆ˜ ์žˆ๊ฒŒ ๋˜์—ˆ๋‹ค๋Š” ๊ฒ๋‹ˆ๋‹ค.
57:05
Just tell the public that x-billion dollars a year, whatever your number is,
1072
3425501
4546
๋‹จ์ˆœํžˆ ๋Œ€์ค‘์—๊ฒŒ ์ผ ๋…„์— x์–ต ๋‹ฌ๋Ÿฌ๋ผ๊ณ  ๋งํ•ด๋ด์š”.
57:10
will be diverted to the Mars mission.
1073
3430089
3420
์–ผ๋งˆ๊ฐ€ ๋˜๋“ , ๊ทธ๊ฒŒ ํ™”์„ฑ ์ž„๋ฌด์— ์‚ฌ์šฉ๋  ๊ฑฐ๋ผ๊ณ  ๋ง์ž…๋‹ˆ๋‹ค.
57:13
I suspect you'd have massive interest in that company.
1074
3433551
3462
์ €๋Š” ๋‹น์‹ ์ด ๊ทธ๋Ÿฐ ํšŒ์‚ฌ์— ์—„์ฒญ ๊ด€์‹ฌ์ด ์žˆ์–ด ๋ณด์ด๊ณ 
57:17
And it might unlock a lot more possibility for you, no?
1075
3437054
4922
๋งŽ์€ ๊ฐ€๋Šฅ์„ฑ์„ ์—ด์–ด์ค„ ๊ฑฐ๋ผ๊ณ  ์ƒ๊ฐํ•˜๋Š”๋ฐ, ๊ทธ๋ ‡์ง€ ์•Š๋‚˜์š”?
57:22
EM: I would like to give the public access to ownership of SpaceX,
1076
3442018
5797
์ €๋Š” ์ผ๋ฐ˜์ธ๋“ค์—๊ฒŒ ์ŠคํŽ˜์ด์ŠคX์˜ ์†Œ์œ ๊ถŒ์„ ๋‚˜๋ˆ ์ฃผ๊ณ  ์‹ถ์ง€๋งŒ์š”.
57:27
but I mean the thing that like,
1077
3447815
2711
ํ•˜์ง€๋งŒ,
57:30
the overhead associated with a public company is high.
1078
3450568
5130
๊ณต๊ฐœ ๊ธฐ์—…์—” ๋“ค์–ด๊ฐ€๋Š” ๊ฐ„์ ‘๋น„์šฉ์ด ๋งค์šฐ ์ปค์š”.
57:38
I mean, as a public company, you're just constantly sued.
1079
3458284
2711
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์ƒ์žฅ ๊ธฐ์—…์€ ๋Š์ž„์—†์ด ์†Œ์†ก์„ ๋‹นํ•˜๊ณ ,
57:41
It does occupy like, a fair bit of ...
1080
3461037
3003
๊ทธ๊ฒŒ ๊ฝค ๋งŽ์ด... ์ฐจ์ง€ํ•˜์ฃ .
57:45
You know, time and effort to deal with these things.
1081
3465958
3546
๊ฑฐ๊ธฐ ๋“ค์–ด๊ฐ€๋Š” ์‹œ๊ฐ„๊ณผ ๋…ธ๋ ฅ์ด ์žˆ์œผ๋‹ˆ๊นŒ์š”.
57:49
CA: But you would still only have one public company, it would be bigger,
1082
3469504
3628
CA: ์—ฌ์ „ํžˆ ๊ณต๊ฐœ ๊ธฐ์—…์€ ํ•˜๋‚˜๊ณ , ๊ทธ๊ฒŒ ๋” ์ปค์ง€๊ณ 
57:53
and have more things going on.
1083
3473174
1752
๋” ๋งŽ์€ ์ผ์ด ์žˆ์„ ํ…Œ์ง€๋งŒ
57:54
But instead of being on four boards, you'd be on one.
1084
3474967
2711
4๊ฐœ์˜ ์ด์‚ฌํšŒ ๋Œ€์‹ ์— ํ•˜๋‚˜๋งŒ ์œ ์ง€ํ•˜๊ฒ ๋‹ค๋Š” ๊ฑฐ๊ตฐ์š”.
57:57
EM: I'm actually not even on the Neuralink or Boring Company boards.
1085
3477720
3337
EM: ์‚ฌ์‹ค ์ €๋Š” ๋‰ด๋Ÿด๋งํฌ๋‚˜ ๋ณด๋ง ์ปดํผ๋‹ˆ์˜ ์ด์‚ฌ๋„ ์•„๋‹ˆ์—์š”.
58:02
And I don't really attend the SpaceX board meetings.
1086
3482099
3671
๊ทธ๋ฆฌ๊ณ  ์ŠคํŽ˜์ด์ŠคX์˜ ์ด์‚ฌ์ง„ ํšŒ์˜์— ์ฐธ์„ํ•˜์ง€๋„ ์•Š๊ณ ์š”.
58:06
We only have two a year,
1087
3486103
1210
ํšŒ์˜๋Š” ์ผ ๋…„์— ๋‘ ๋ฒˆ๋ฐ–์— ์•ˆํ•˜๋‹ˆ๊นŒ
58:07
and I just stop by and chat for an hour.
1088
3487313
2211
์ž ๊น ๋“ค๋Ÿฌ์„œ ํ•œ ์‹œ๊ฐ„ ์ •๋„ ์ด์•ผ๊ธฐ๋ฅผ ๋‚˜๋ˆ„๊ธฐ๋งŒ ํ•˜์ฃ .
58:13
The board overhead for a public company is much higher.
1089
3493110
2837
๊ณต๊ฐœ ๊ธฐ์—…์˜ ์ด์‚ฌํšŒ์—” ๋” ๋งŽ์€ ๋น„์šฉ์ด ํ•„์š”ํ•˜๊ณ ์š”.
58:15
CA: I think some investors probably worry about how your time is being split,
1090
3495988
3712
CA: ์ œ ์ƒ๊ฐ์— ๋ช‡๋ช‡ ํˆฌ์ž์ž๋“ค์€ ์•„๋งˆ ๋‹น์‹ ์˜ ์‹œ๊ฐ„ ๋ถ„๋ฐฐ๋ฅผ ๊ฑฑ์ •ํ•˜๊ฒ ๋Š”๋ฐ์š”.
58:19
and they might be excited by you know, that.
1091
3499742
2669
๊ทธ๋ฆฌ๊ณ  ์•„๋งˆ ํฅ๋ถ„ํ• ์ง€๋„ ๋ชจ๋ฅด๊ณ ์š”.
58:22
Anyway, I just woke up the other day
1092
3502495
3253
์–ด์จŒ๋“ , ์ €๋Š” ๋ฉฐ์น  ์ „์— ๊นจ์–ด๋‚˜์„œ๋Š” ์ด๋Ÿฐ ์ƒ๊ฐ์„ ํ•œ ๊ฑฐ์˜ˆ์š”.
58:25
thinking, just, there are so many ways in which these things connect.
1093
3505790
3420
์ด๋Ÿฐ ๊ฒƒ๋“ค์„ ์—ฐ๊ฒฐํ•  ์—„์ฒญ ๋งŽ์€ ๋ฐฉ๋ฒ•์ด ์žˆ๊ฒ ๊ตฌ๋‚˜ ํ•˜๊ณ ์š”.
58:29
And you know, just the simplicity of that mission,
1094
3509252
3837
๊ทธ๋ฆฌ๊ณ  ์ด ์ž„๋ฌด์˜ ๋‹จ์ˆœํ•จ,
58:33
of building a future that is worth getting excited about,
1095
3513130
3546
๊ธฐ๋Œ€ ๊ฐ€์น˜๊ฐ€ ์žˆ๋Š” ๋ฏธ๋ž˜๋ฅผ ๊ฑด์„คํ•˜๋Š” ๊ทธ ์ž„๋ฌด๊ฐ€
58:36
might appeal to an awful lot of people.
1096
3516676
3461
์—„์ฒญ๋‚œ ์‚ฌ๋žŒ๋“ค์—๊ฒŒ ํ˜ธ์†Œ๋ ฅ์ด ์žˆ์„ ๊ฑฐ๋ผ๊ณ  ๋ง์ž…๋‹ˆ๋‹ค.
58:41
Elon, you are reported by Forbes and everyone else as now, you know,
1097
3521013
5381
์ผ๋ก ์€ ํฌ๋ธŒ์Šค๊ฐ€ ๊ธฐ์‚ฌ๋กœ ์“ฐ๊ธฐ๋„ ํ–ˆ๊ณ  ๋‹ค๋“ค ์•Œ๋‹ค์‹œํ”ผ
58:46
the world's richest person.
1098
3526435
1752
์„ธ๊ณ„์—์„œ ๊ฐ€์žฅ ๋ถ€์œ ํ•œ ์‚ฌ๋žŒ์ด์ž–์•„์š”.
58:48
EM: That's not a sovereign.
1099
3528187
1293
EM: ๊ตฐ์ฃผ๋“ค๊นŒ์ง€ ์น˜๋ฉด ์•„๋‹ˆ์—์š”.
58:49
CA: (Laughs)
1100
3529564
1001
CA: (์›ƒ์Œ)
58:50
EM: You know, I think it's fair to say
1101
3530606
1835
EM: ์ด๋ ‡๊ฒŒ ์–˜๊ธฐํ•˜๋Š” ๊ฒŒ ์ข‹์„ ๋“ฏํ•œ๋ฐ,
58:52
that if somebody is like, the king or de facto king of a country,
1102
3532483
4671
๋งŒ์•ฝ ๋ˆ„๊ตฐ๊ฐ€๊ฐ€, ๋งํ•˜์ž๋ฉด ์–ด๋–ค ๋‚˜๋ผ์˜ ์™•์ด๋ผ๋ฉด ๋ง์ž…๋‹ˆ๋‹ค.
58:57
they're wealthier than I am.
1103
3537154
1961
๊ทธ๋“ค์ด ์ €๋ณด๋‹ค ๋ถ€์ž์˜ˆ์š”.
58:59
CA: But it's just harder to measure --
1104
3539323
2711
CA: ํ•˜์ง€๋งŒ ์ธก์ •์€ ์–ด๋ ต์ง€๋งŒ
59:02
So $300 billion.
1105
3542285
1418
3000์–ต ๋‹ฌ๋Ÿฌ๋ผ๋ฉด,
59:03
I mean, your net worth on any given day
1106
3543744
3838
๊ทธ๋Ÿฌ๋‹ˆ๊นŒ, ์ผ๋ก ์˜ ์ˆœ ์ž์‚ฐ์€ ์–ด๋–ค ๋‚ ์ด๋“ 
59:07
is rising or falling by several billion dollars.
1107
3547623
3045
๋ช‡์‹ญ์–ต ๋‹ฌ๋Ÿฌ์”ฉ ์˜ค๋ฅด๋‚ด๋ฆฌ์ฃ .
59:10
How insane is that?
1108
3550710
2127
์ •๋ง ์ด์ƒํ•˜๋„ค์š”.
59:12
EM: It's bonkers, yeah.
1109
3552878
1168
EM: ์™„์ „ ๋ฏธ์ณค์ฃ .
59:14
CA: I mean, how do you handle that psychologically?
1110
3554088
2586
CA: ์‹ฌ๋ฆฌ์ ์œผ๋กœ๋Š” ์–ด๋–ป๊ฒŒ ๊ทธ๋Ÿฐ ๊ฑธ ๋‹ค์Šค๋ฆฌ๋Š” ๊ฑด๊ฐ€์š”?
59:16
There aren't many people in the world who have to even think about that.
1111
3556674
3503
์„ธ์ƒ์—” ์‹ฌ์ง€์–ด ๊ทธ๋Ÿฐ ์ƒ๊ฐ์„ ํ•ด ๋ณผ ์ˆ˜ ์žˆ๋Š” ์‚ฌ๋žŒ๋„ ๋งŽ์ง€ ์•Š์ฃ .
59:20
EM: I actually don't think about that too much.
1112
3560177
2211
EM: ์‚ฌ์‹ค ๊ฑฐ๊ธฐ์— ๋Œ€ํ•ด ๋งŽ์ด ์ƒ๊ฐํ•˜์ง€๋Š” ์•Š์•„์š”.
59:22
But the thing that is actually more difficult
1113
3562430
3587
ํ•˜์ง€๋งŒ ๋” ์–ด๋ ต๊ณ 
59:26
and that does make sleeping difficult
1114
3566058
1877
๊ทธ๋ฆฌ๊ณ  ์ž  ๋ชป ์ž๊ฒŒ ๋งŒ๋“œ๋Š” ์ผ์€
59:27
is that, you know,
1115
3567977
3378
์•„์‹œ๋‹ค์‹œํ”ผ
59:31
every good hour or even minute
1116
3571397
3587
๋ชจ๋“  ์‹œ๊ฐ„,
59:35
of thinking about Tesla and SpaceX
1117
3575026
4421
ํ…Œ์Šฌ๋ผ์™€ ์ŠคํŽ˜์ด์ŠคX์— ๋Œ€ํ•ด ์ƒ๊ฐํ•˜๋Š” ๋ชจ๋“  ์‹œ๊ฐ„์ด
59:39
has such a big effect on the company
1118
3579447
2502
๊ทธ๋“ค ํšŒ์‚ฌ์— ํฐ ์˜ํ–ฅ์„ ์ค๋‹ˆ๋‹ค.
59:41
that I really try to work as much as possible,
1119
3581991
3920
๊ทธ๋ž˜์„œ ์ €๋Š” ์ •๋ง๋กœ ๊ฐ€๋Šฅํ•œํ•œ ๋งŽ์ด ์ผํ•˜๋ ค๊ณ  ํ•˜๊ณ ์š”.
59:45
you know, to the edge of sanity, basically.
1120
3585911
3129
์ •์‹ ์ ์œผ๋กœ ๊ดœ์ฐฎ์€ ์ˆœ๊ฐ„์—๋Š” ๋ง์ด์ฃ .
59:49
Because you know, Tesla's getting to the point where
1121
3589081
3337
์™œ๋ƒํ•˜๋ฉด, ํ…Œ์Šฌ๋ผ๊ฐ€ ํ–ฅํ•˜๊ณ  ์žˆ๋Š” ์ง€์ ์€
59:54
probably will get to the point later this year,
1122
3594920
2211
์˜ฌํ•ด ํ›„๋ฐ˜์ด ๋˜๋ฉด ๋‹ค๋‹ค๋ฅด๊ฒŒ ๋  ๊ฑฐ๊ณ ,
59:57
where every high-quality minute of thinking
1123
3597131
5047
๋ชจ๋“  ๋†’์€ ์ˆ˜์ค€์˜ ์ƒ๊ฐ์ด
60:02
is a million dollars impact on Tesla.
1124
3602219
3671
ํ…Œ์Šฌ๋ผ์—๋Š” ๋ฐฑ๋งŒ ๋‹ฌ๋Ÿฌ์˜ ์˜ํ–ฅ๋ ฅ์„ ์ฃผ๋‹ˆ๊นŒ์š”.
60:08
Which is insane.
1125
3608517
1544
๋ฏธ์นœ ์ผ์ด์ฃ .
60:13
I mean, the basic, you know, if Tesla is doing, you know,
1126
3613272
4046
๊ธฐ๋ณธ์ ์œผ๋กœ, ํ…Œ์Šฌ๋ผ๊ฐ€
60:17
sort of $2 billion a week, let's say, in revenue,
1127
3617360
3920
20์–ต ๋‹ฌ๋Ÿฌ๋ฅผ ํ•œ ์ฃผ์— ๋ฒŒ์–ด๋“ค์ธ๋‹ค๋ฉด,
60:21
it's sort of $300 million a day, seven days a week.
1128
3621280
4713
ํ•˜๋ฃจ์—๋Š” 3์–ต ๋‹ฌ๋Ÿฌ์ž…๋‹ˆ๋‹ค. ์ผ์ฃผ์ผ์€ 7์ผ์ด๋‹ˆ๊นŒ.
60:26
You know, it's ...
1129
3626535
1335
๊ทธ๊ฑด..
60:28
CA: If you can change that by five percent in an hour's brainstorm,
1130
3628829
5548
CA: ํ•œ ์‹œ๊ฐ„์˜ ๋ธŒ๋ ˆ์ธ์Šคํ† ๋ฐ์œผ๋กœ ๊ทธ ์ค‘ 5%๋ฅผ ๋ฐ”๊ฟ€ ์ˆ˜ ์žˆ๋‹ค๋ฉด,
60:34
that's a pretty valuable hour.
1131
3634418
3128
๊ฝค ๊ฐ’์ง„ ์‹œ๊ฐ„์ด๋„ค์š”.
60:37
EM: I mean, there are many instances where a half-hour meeting,
1132
3637546
4797
EM: 30๋ถ„ ์ •๋„ ํšŒ์˜๋ฅผ ํ•˜๋Š” ๊ฒฝ์šฐ๊ฐ€ ๋งŽ๊ณ ,
60:42
I was able to improve the financial outcome of the company
1133
3642385
3378
์žฌ๋ฌด ๊ฒฐ๊ณผ๋„ ๊ฐœ์„  ์‹œํ‚ฌ ์ˆ˜ ์žˆ์—ˆ์ฃ .
60:45
by $100 million in a half-hour meeting.
1134
3645763
3629
30๋ถ„ ๋ฏธํŒ…์œผ๋กœ 1์–ต ๋‹ฌ๋Ÿฌ ์ •๋„๋ฅผ์š”.
60:50
CA: There are many other people out there
1135
3650476
2044
CA: ์ด๋Ÿฐ ์‚ฌ๋žŒ๋“ค์ด ๋งŽ์ด ์žˆ๋Š”๋ฐ
60:52
who can't stand this world of billionaires.
1136
3652520
2752
์„ธ์ƒ์˜ ์–ต๋งŒ์žฅ์ž๋“ค์—๊ฒŒ ์ ๋Œ€์ ์ธ ์ด๋“ค์ด ์žˆ์ฃ .
60:55
Like, they are hugely offended by the notion
1137
3655314
3921
๊ทธ ์‚ฌ๋žŒ๋“ค์€ ์ด๋Ÿฐ ๊ฐœ๋…์— ํ™”๊ฐ€ ๋‚œ ๋“ฏ ๋ณด์ด๋Š”๋ฐ์š”,
60:59
that an individual can have the same wealth as, say,
1138
3659276
4588
ํ•œ ๊ฐœ์ธ์ด ์†Œ์œ ํ•œ ์žฌ์‚ฐ์ด ์„ธ์ƒ์—์„œ ๊ฐ€์žฅ ๊ฐ€๋‚œํ•œ ์‚ฌ๋žŒ ์‹ญ์–ต ๋ช…
61:03
a billion or more of the world's poorest people.
1139
3663906
3212
๋˜๋Š” ๊ทธ ์ด์ƒ์ด ์†Œ์œ ํ•œ ์žฌ์‚ฐ๊ณผ ๊ฐ™์„ ์ˆ˜ ์žˆ๋‹ค๋Š” ๊ฐœ๋… ๋ง์ž…๋‹ˆ๋‹ค.
61:07
EM: If they examine sort of --
1140
3667159
2419
EM: ๋งŒ์•ฝ ๊ทธ ๋ถ„๋“ค์ด ์กฐ์‚ฌ๋ฅผ ์ข€ ํ•ด๋ณธ๋‹ค๋ฉด
61:09
I think there's some axiomatic flaws that are leading them to that conclusion.
1141
3669578
5047
์ œ ์ƒ๊ฐ์— ๊ทธ๋Ÿฐ ๊ฒฐ๋ก ์ด ๋‚˜์˜ค๋Š” ๋ฐ์—๋Š” ๋ช…๋ฐฑํ•œ ์˜ค๋ฅ˜๊ฐ€ ์žˆ๋Š” ๋“ฏํ•ฉ๋‹ˆ๋‹ค.
61:15
For sure, it would be very problematic if I was consuming,
1142
3675167
4922
ํ™•์‹คํžˆ ์ด๋Ÿฐ ๊ฒฝ์šฐ๋ผ๋ฉด ๋ฌธ์ œ์ผ ๊ฒ๋‹ˆ๋‹ค.
61:20
you know, billions of dollars a year in personal consumption.
1143
3680131
3086
์ œ๊ฐ€ ๊ฐœ์ธ์ ์ธ ์†Œ๋น„ ๋ชฉ์ ์œผ๋กœ ํ•œ ํ•ด์— ์ˆ˜์‹ญ์–ต ๋‹ฌ๋Ÿฌ๋ฅผ ์“ด๋‹ค๋ฉด ๋ง์ด์ฃ .
61:23
But that is not the case.
1144
3683259
1209
ํ•˜์ง€๋งŒ ์ €๋Š” ๊ทธ๋ ‡์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
61:24
In fact, I don't even own a home right now.
1145
3684802
2252
์‚ฌ์‹ค, ์ €๋Š” ์ง€๊ธˆ ์‹ฌ์ง€์–ด ์ง‘๋„ ์—†์–ด์š”.
61:27
I'm literally staying at friends' places.
1146
3687096
2336
๋ง ๊ทธ๋Œ€๋กœ, ์นœ๊ตฌ ์ง‘์— ๋จธ๋ฌผ๊ณ  ์žˆ๋Š” ์ค‘์ด๊ณ ์š”.
61:30
If I travel to the Bay Area,
1147
3690141
1543
์ œ๊ฐ€ ๋ฒ ์ด ์—์–ด๋ฆฌ์–ด์— ๊ฐˆ ์ผ์ด ์žˆ์œผ๋ฉด,
61:31
which is where most of Tesla engineering is,
1148
3691726
2085
ํ…Œ์Šฌ๋ผ ๊ณตํ•™ ๊ธฐ์ˆ ์˜ ๋Œ€๋ถ€๋ถ„์ด ์žˆ๋Š” ๊ณณ์ธ๋ฐ์š”,
61:33
I basically rotate through friends' spare bedrooms.
1149
3693811
3795
์นœ๊ตฌ๋“ค ์ง‘์˜ ๋‚จ๋Š” ์นจ์‹ค์„ ๋Œ์•„๊ฐ€๋ฉฐ ์“ฐ๊ณ  ์žˆ์–ด์š”.
61:38
I don't have a yacht, I really don't take vacations.
1150
3698691
2753
์ €๋Š” ์š”ํŠธ๋„ ์—†๊ณ , ํœด๊ฐ€๋ฅผ ๊ฐ€์ง€๋„ ์•Š๊ณ ์š”.
61:44
It's not as though my personal consumption is high.
1151
3704071
4338
์ œ ๊ฐœ์ธ์  ์†Œ๋น„๋Š” ๊ทธ๋ ‡๊ฒŒ ๋งŽ์ง€ ์•Š์Šต๋‹ˆ๋‹ค.
61:49
I mean, the one exception is a plane.
1152
3709243
1793
์œ ์ผํ•œ ์˜ˆ์™ธ๋Š” ๋น„ํ–‰๊ธฐ์ธ๋ฐ,
61:51
But if I don't use the plane, then I have less hours to work.
1153
3711078
2878
์ œ๊ฐ€ ๋น„ํ–‰๊ธฐ๋ฅผ ์•ˆํƒ€๋ฉด ์ผ์„ ๋œ ํ•  ์ˆ˜๋ฐ–์— ์—†์œผ๋‹ˆ๊นŒ์š”.
61:55
CA: I mean, I personally think you have shown that you are mostly driven
1154
3715291
4129
CA: ์ œ ๊ฐœ์ธ์ ์ธ ์ƒ๊ฐ์œผ๋กœ ์ผ๋ก ์€ ๋Œ€๊ฐœ
61:59
by really quite a deep sense of moral purpose.
1155
3719420
2502
๋„๋•์ ์ธ ๋ชฉ์ ์— ์˜ํ•ด ์›€์ง์ด๋Š” ๋“ฏ ๋ณด์ด๋Š”๋ฐ์š”.
62:01
Like, your attempts to solve the climate problem
1156
3721964
5589
์˜ˆ๋ฅผ ๋“ค์ž๋ฉด, ๊ธฐํ›„ ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜๊ธฐ ์œ„ํ•œ ์‹œ๋„๋Š”
62:07
have been as powerful as anyone else on the planet that I'm aware of.
1157
3727595
4838
์ œ๊ฐ€ ์•Œ๊ธฐ๋กœ, ์–ด๋Š ๋ˆ„๊ตฌ ๋ชป์ง€์•Š๊ฒŒ ๊ฐ•๋ ฅํ•œ ํž˜์„ ๋ฐœํœ˜ํ•˜๊ณ  ์žˆ์ง€์š”.
62:12
And I actually can't understand,
1158
3732433
2085
๊ทธ๋ฆฌ๊ณ  ์‚ฌ์‹ค ์ €๋Š” ์ž˜ ์ดํ•ดํ•˜์ง€ ๋ชปํ•˜๊ฒ ๋Š”๋ฐ
62:14
personally, I can't understand the fact
1159
3734518
1877
๊ฐœ์ธ์ ์œผ๋กœ ์ดํ•ด๊ฐ€ ์•ˆ๊ฐ€๋Š” ๊ฒƒ์€,
62:16
that you get all this criticism from the Left about,
1160
3736437
2461
๋‹น์‹ ์€ ์ง„๋ณด ์ง„์˜์œผ๋กœ๋ถ€ํ„ฐ ์ด๋Ÿฐ ๋น„ํŒ์„ ๋“ฃ์ž–์•„์š”.
62:18
"Oh, my God, he's so rich, that's disgusting."
1161
3738898
2377
โ€œ์ € ์‚ฌ๋žŒ์€ ๋„ˆ๋ฌด ๋ถ€์ž์—ฌ์„œ ํ˜์˜ค์Šค๋Ÿฝ๋‹ค.โ€ ๊ฐ™์€ ๊ฑฐ์š”.
62:21
When climate is their issue.
1162
3741734
2377
๋ช‡๋ช‡ ์ด๋“ค์€ ๊ธฐํ›„ ๋ฌธ์ œ๋ฅผ ๋…ผํ•  ๋•Œ
62:25
Philanthropy is a topic that some people go to.
1163
3745446
2210
์ž์„  ์‚ฌ์—… ๊ฐ™์€ ๊ฑธ ์–˜๊ธฐํ•˜๋Š”๋ฐ,
62:27
Philanthropy is a hard topic.
1164
3747698
1668
๋ฌผ๋ก  ์ด๊ฒŒ ์–ด๋ ค์šด ์ฃผ์ œ์ด๊ธด ํ•˜์ง€๋งŒ
62:29
How do you think about that?
1165
3749408
1794
์–ด๋–ป๊ฒŒ ์ƒ๊ฐํ•˜์‹œ๋‚˜์š”?
62:31
EM: I think if you care about the reality of goodness
1166
3751535
2711
EM: ์ œ ์ƒ๊ฐ์— ์„ ํ•จ์— ๋Œ€ํ•œ ์‹ค์งˆ์ ์ธ ๋ฉด์„ ๊ณ ๋ คํ•œ๋‹ค๋ฉด ๋ง์ž…๋‹ˆ๋‹ค,
62:34
instead of the perception of it, philanthropy is extremely difficult.
1167
3754246
3796
๋‹จ์ˆœํžˆ ์ธ์‹ํ•˜๋Š” ๊ฒƒ ๋ง๊ณ ์š”. ๊ทธ๋ ‡๋‹ค๋ฉด ์ž์„ ์€ ๋งค์šฐ ์–ด๋ ต์ฃ .
62:39
SpaceX, Tesla, Neuralink and The Boring Company are philanthropy.
1168
3759126
3921
์ŠคํŽ˜์ด์ŠคX, ํ…Œ์Šฌ๋ผ, ๋‰ด๋Ÿด๋งํฌ, ๋ณด๋ง ์ปดํผ๋‹ˆ๋Š” ๋ชจ๋‘ ์ž์„  ์‚ฌ์—…์ด์—์š”.
62:43
If you say philanthropy is love of humanity,
1169
3763464
3086
๋งŒ์•ฝ ์ž์„  ํ™œ๋™์ด ์ธ๋ฅ˜์— ๋Œ€ํ•œ ์‚ฌ๋ž‘๊ณผ ๋™์˜์–ด๋ผ๋ฉด,
62:46
they are philanthropy.
1170
3766592
1668
๊ทธ ๋ชจ๋“  ๊ฒŒ ์ž์„  ์‚ฌ์—…์ด์ฃ .
62:49
Tesla is accelerating sustainable energy.
1171
3769553
2878
ํ…Œ์Šฌ๋ผ๋Š” ์ง€์† ๊ฐ€๋Šฅํ•œ ์—๋„ˆ์ง€๋ฅผ ๊ฐ€์†ํ™”ํ•˜๊ณ  ์žˆ๊ณ 
62:52
This is a love -- philanthropy.
1172
3772473
3545
๊ทธ๊ฒŒ ์‚ฌ๋ž‘์ด๊ณ , ์ž์„ ์ด๊ณ ์š”.
62:56
SpaceX is trying to ensure the long-term survival of humanity
1173
3776894
3712
์ŠคํŽ˜์ด์ŠคX๋Š” ์ธ๋ฅ˜์˜ ์ƒ์กด์„ ๋ณด๋‹ค ์žฅ๊ธฐ์  ๊ด€์ ์—์„œ ๋ณด์žฅํ•˜๊ณ  ์žˆ๊ณ ์š”,
63:00
with a multiple-planet species.
1174
3780648
1501
๋‹คํ–‰์„ฑ์ข…์œผ๋กœ์„œ ๋ง์ด์ฃ .
63:02
That is love of humanity.
1175
3782191
1543
๊ทธ๊ฒƒ๋„ ์ธ๋ฅ˜์• ์ž…๋‹ˆ๋‹ค.
63:05
You know, Neuralink is trying to help solve brain injuries
1176
3785319
4546
์•„์‹œ๋‹ค์‹œํ”ผ, ๋‰ด๋Ÿด๋งํฌ๋Š” ๋‘๋‡Œ ์†์ƒ์„ ๊ฐœ์„ ํ•˜๊ณ 
63:09
and existential risk with AI.
1177
3789907
2294
AI๋กœ ์ธํ•œ ์‹ค์กด์  ์œ„ํ˜‘์— ๋Œ€์ฒ˜ํ•˜๋ ค๊ณ  ๋…ธ๋ ฅ ์ค‘์ด๊ณ ์š”.
63:12
Love of humanity.
1178
3792243
1167
์ธ๋ฅ˜์• ๋กœ์„œ์š”.
63:13
Boring Company is trying to solve traffic, which is hell for most people,
1179
3793452
3504
๋ณด๋ง ์ปดํผ๋‹ˆ๋Š” ๋‹ค์ˆ˜๊ฐ€ ๊ณ ํ†ต ๋ฐ›๋Š” ๊ตํ†ต ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜๋ ค ๋…ธ๋ ฅํ•˜๊ณ  ์žˆ๊ณ 
63:16
and that also is love of humanity.
1180
3796956
2627
์ด ์—ญ์‹œ ์ธ๋ฅ˜์• ์ž…๋‹ˆ๋‹ค.
63:20
CA: How upsetting is it to you
1181
3800084
4296
CA: ์†์ƒํ•˜์‹œ๊ฒ ์Šต๋‹ˆ๋‹ค.
63:24
to hear this constant drumbeat of,
1182
3804421
3546
์ง€์†์ ์œผ๋กœ ์ด๋Ÿฐ ์–˜๊ธฐ๋ฅผ ๋“ฃ๋Š” ์ผ ๋ง์ž…๋‹ˆ๋‹ค.
63:28
"Billionaires, my God, Elon Musk, oh, my God?"
1183
3808008
2169
โ€œ์„ธ์ƒ์—, ์–ต๋งŒ์žฅ์ž๋“ค์„ ์ข€ ๋ด, ์„ธ์ƒ์—, ์ผ๋ก  ๋จธ์Šคํฌ ์ข€ ๋ด.โ€
63:30
Like, do you just shrug that off
1184
3810219
2961
์ด๋Ÿฐ ๋ง์„ ๋“ค์œผ๋ฉด ๊ทธ๋ƒฅ ์–ด๊นจ ํ•œ๋ฒˆ ์œผ์“ฑํ•˜๊ณ  ๋งˆ์‹œ๋‚˜์š”,
63:33
or does it does it actually hurt?
1185
3813222
1627
์•„๋‹ˆ๋ฉด ์‹ค์ œ๋กœ ์ƒ์ฒ˜ ๋ฐ›์œผ์‹œ๋‚˜์š”?
63:36
EM: I mean, at this point, it's water off a duck's back.
1186
3816559
2794
EM: ๊ท“๋“ฑ์œผ๋กœ๋„ ์•ˆ๋“ฃ์Šต๋‹ˆ๋‹ค.
63:39
CA: Elon, I'd like to, as we wrap up now,
1187
3819353
2544
CA: ๋งˆ๋ฌด๋ฆฌ ํ•  ์‹œ์ ์—์„œ,
63:41
just pull the camera back and just think ...
1188
3821939
3378
์นด๋ฉ”๋ผ๋Š” ์น˜์›Œ๋‘๊ณ , ๊ทธ๋ƒฅ ์ƒ๊ฐํ•ด ๋ด…์‹œ๋‹ค..
63:45
You're a father now of seven surviving kids.
1189
3825359
3504
๋‹น์‹ ์ด ์‚ด์•„๋‚จ์€ ์ผ๊ณฑ ์•„์ด์˜ ์•„๋ฒ„์ง€๋ผ๊ณ  ์ƒ๊ฐํ•ด ๋ณด์ž๊ณ ์š”.
63:49
EM: Well, I mean, I'm trying to set a good example
1190
3829530
2336
EM: ์ €๋Š” ์ข‹์€ ์˜ˆ์‹œ๋ฅผ ๋“ค๊ณ  ์‹ถ์€๋ฐ์š”,
63:51
because the birthrate on Earth is so low
1191
3831907
1919
์™œ๋ƒํ•˜๋ฉด ์ง€๊ตฌ์˜ ์ถœ์ƒ๋ฅ ์ด ์ง€๊ธˆ ๋งค์šฐ ๋‚ฎ๊ณ 
63:53
that we're facing civilizational collapse
1192
3833868
2043
์šฐ๋ฆฌ๋Š” ๋ฌธ๋ช…์˜ ๋ถ•๊ดด๋ฅผ ๋งˆ์ฃผํ•˜๊ณ  ์žˆ์œผ๋‹ˆ๊นŒ์š”.
63:55
unless the birth rate returns to a sustainable level.
1193
3835911
4838
๋งŒ์•ฝ ์ถœ์ƒ๋ฅ ์ด ์•ˆ์ •์ ์ธ ์ •๋„๋กœ ํšŒ๋ณต๋˜์ง€ ์•Š๋Š”๋‹ค๋ฉด ๋ง์ž…๋‹ˆ๋‹ค.
64:01
CA: Yeah, you've talked about this a lot,
1194
3841667
1960
CA: ์ด ์ฃผ์ œ์— ๋Œ€ํ•ด ๋งŽ์ด ๋ง์”€ํ•ด ์˜ค์…จ์ฃ .
64:03
that depopulation is a big problem,
1195
3843669
2294
์ธ๊ตฌ ๊ฐ์†Œ๊ฐ€ ํฐ ๋ฌธ์ œ๋ผ๊ณ  ๋ง์ž…๋‹ˆ๋‹ค.
64:06
and people don't understand how big a problem it is.
1196
3846005
2460
๊ทธ๋ฆฌ๊ณ  ์‚ฌ๋žŒ๋“ค์€ ์ด๊ฒŒ ์–ผ๋งˆ๋‚˜ ์‹ฌ๊ฐํ•œ์ง€ ์ž˜ ์ดํ•ดํ•˜์ง€ ๋ชปํ•œ๋‹ค๊ณ ์š”.
64:08
EM: Population collapse is one of the biggest threats
1197
3848465
2503
EM: ์ธ๊ตฌ ๊ฐ์†Œ๋Š” ๊ฐ€์žฅ ํฐ ์œ„ํ˜‘ ์ค‘ ํ•˜๋‚˜์˜ˆ์š”.
64:10
to the future of human civilization.
1198
3850968
1752
๋ฏธ๋ž˜์˜ ์ธ๋ฅ˜ ๋ฌธ๋ช…์— ๋ง์ž…๋‹ˆ๋‹ค.
64:12
And that is what is going on right now.
1199
3852761
1877
๊ทธ๋ฆฌ๊ณ  ํ˜„์žฌ ์ง„ํ–‰์ค‘์ด๊ณ ์š”.
64:14
CA: What drives you on a day-to-day basis to do what you do?
1200
3854638
2836
CA: ๋‹น์‹ ์ด ํ•˜๋Š” ์ผ๋“ค์— ๋Œ€ํ•œ ๋งค์ผ์˜ ์›๋™๋ ฅ์€ ๋ญก๋‹ˆ๊นŒ?
64:17
EM: I guess, like, I really want to make sure
1201
3857516
3087
EM: ์ œ ์ƒ๊ฐ์— ์ €๋Š” ์ •๋ง๋กœ
64:20
that there is a good future for humanity
1202
3860644
3462
์ธ๋ฅ˜๊ฐ€ ์ข‹์€ ๋ฏธ๋ž˜๋ฅผ ๋งž์ดํ•˜๊ธฐ๋ฅผ ์›ํ•ด์š”.
64:24
and that we're on a path to understanding the nature of the universe,
1203
3864148
5297
์šฐ๋ฆฌ๋Š” ์šฐ์ฃผ์˜ ์›๋ฆฌ์™€
64:29
the meaning of life.
1204
3869486
1168
์‚ถ์˜ ์˜๋ฏธ์— ๋Œ€ํ•ด ์ดํ•ดํ•˜๋Š” ๊ณผ์ •์— ์žˆ์œผ๋‹ˆ๊นŒ์š”.
64:30
Why are we here, how did we get here?
1205
3870696
1960
์šฐ๋ฆฌ๊ฐ€ ์ง€๊ธˆ ์—ฌ๊ธฐ์— ์™œ ์žˆ์œผ๋ฉฐ, ์–ด๋–ป๊ฒŒ ์—ฌ๊ธฐ๊นŒ์ง€ ์™”๋Š”์ง€
64:33
And in order to understand the nature of the universe
1206
3873490
4422
๊ทธ๋ฆฌ๊ณ  ์šฐ์ฃผ์˜ ์›๋ฆฌ์™€
64:37
and all these fundamental questions,
1207
3877953
3921
๋ชจ๋“  ๊ทผ๋ณธ์ ์ธ ์งˆ๋ฌธ๋“ค์— ๋‹ตํ•˜๊ธฐ ์œ„ํ•ด
64:41
we must expand the scope and scale of consciousness.
1208
3881916
3086
์šฐ๋ฆฌ์˜ ์˜์‹ ์ˆ˜์ค€์„ ํ™•์žฅํ•ด์•ผ ํ•  ํ•„์š”๊ฐ€ ์žˆ๋‹ค๋Š” ๊ฒ๋‹ˆ๋‹ค.
64:47
Certainly it must not diminish or go out.
1209
3887129
1960
ํ™•์‹คํžˆ ์ด๋Ÿฌํ•œ ์ผ์€ ์ง€์†๋˜์–ด์•ผ๋งŒ ํ•˜๊ณ ,
64:49
Or we certainly won't understand this.
1210
3889131
2211
๊ทธ๋ ‡์ง€ ์•Š๋‹ค๋ฉด ๊ฒฐ์ฝ” ์ดํ•ดํ•  ์ˆ˜ ์—†์„ ๊ฒ๋‹ˆ๋‹ค.
64:51
I would say I've been motivated by curiosity more than anything,
1211
3891342
3587
๋‹ค๋ฅธ ๋ฌด์—‡๋ณด๋‹ค ํ˜ธ๊ธฐ์‹ฌ์ด ์ €์˜ ์›๋™๋ ฅ์ด๋ผ๊ณ  ๋ง์”€๋“œ๋ฆฌ๋Š” ๊ฒŒ ๋งž๊ฒ ๊ตฐ์š”.
64:54
and just desire to think about the future
1212
3894929
4337
๊ทธ๋ฆฌ๊ณ  ๋ฏธ๋ž˜์— ๋Œ€ํ•ด ์ƒ๊ฐํ•˜๊ณ ์ž ํ•˜๋Š” ๊ฐˆ๋ง์ด ์žˆ๊ณ ,
64:59
and not be sad, you know?
1213
3899308
2544
์Šฌํผ์ง€๊ธฐ๋ฅผ ์›ํ•˜์ง€ ์•Š๊ธฐ ๋•Œ๋ฌธ์ด์ฃ .
65:03
CA: And are you?
1214
3903687
1168
CA: ์–ด๋– ์‹ ๊ฐ€์š”?
65:04
Are you not sad?
1215
3904897
1251
์Šฌํ”„์ง€ ์•Š์Šต๋‹ˆ๊นŒ?
65:06
EM: I'm sometimes sad,
1216
3906607
1209
EM: ๋•Œ๋กœ๋Š” ์Šฌํ”•๋‹ˆ๋‹ค.
65:07
but mostly I'm feeling I guess
1217
3907816
4505
ํ•˜์ง€๋งŒ ๋Œ€๊ฐœ๋Š”
65:12
relatively optimistic about the future these days.
1218
3912363
2544
์ƒ๋Œ€์ ์œผ๋กœ ๋ฏธ๋ž˜์— ๋Œ€ํ•ด ๋‚™๊ด€์ ์ด์ฃ .
65:15
There are certainly some big risks that humanity faces.
1219
3915699
3921
์ธ๋ฅ˜๊ฐ€ ์ง๋ฉดํ•˜๊ณ  ์žˆ๋Š” ๊ฝค ํฐ ์œ„ํ˜‘๋“ค์ด ํ™•์‹คํžˆ ์žˆ๊ณ ,
65:20
I think the population collapse is a really big deal,
1220
3920287
2795
์ธ๊ตฌ ๊ฐ์†Œ๋Š” ๊ฝค ํฐ ๋ฌธ์ œ๊ธฐ๋„ ํ•˜๊ณ 
65:23
that I wish more people would think about
1221
3923123
5130
๊ฑฐ๊ธฐ์— ๋Œ€ํ•ด ์‚ฌ๋žŒ๋“ค์ด ์ข€ ๋” ์ƒ๊ฐํ•˜๊ธธ ์›ํ•ฉ๋‹ˆ๋‹ค.
65:28
because the birth rate is far below what's needed to sustain civilization
1222
3928253
4964
์™œ๋ƒํ•˜๋ฉด ๋ฌธ๋ช… ์ง€์†์— ํ•„์š”ํ•œ ์ˆ˜์ค€๋ณด๋‹ค ์ถœ์ƒ๋ฅ ์ด ํ›จ์”ฌ ๋‚ฎ์œผ๋‹ˆ๊นŒ์š”.
65:33
at its current level.
1223
3933258
1669
ํ˜„์žฌ ์ˆ˜์ค€์—์„œ๋Š”์š”.
65:35
And there's obviously ...
1224
3935594
3212
๊ทธ๋ฆฌ๊ณ  ๋ถ„๋ช…ํžˆ
65:39
We need to take action on climate sustainability,
1225
3939682
2877
๊ธฐํ›„์˜ ์ง€์† ๊ฐ€๋Šฅ์„ฑ์„ ์œ„ํ•ด์„œ๋„ ๋ญ”๊ฐ€ ํ•ด์•ผ ํ•  ํ•„์š”๊ฐ€ ์žˆ๊ณ ์š”.
65:42
which is being done.
1226
3942601
1919
์ง€๊ธˆ๋„ ๊ทธ๋ ‡๊ฒŒ ํ•˜๊ณ  ์žˆ์ง€๋งŒ์š”.
65:45
And we need to secure the future of consciousness
1227
3945562
2294
๊ทธ๋ฆฌ๊ณ  ๋ฏธ๋ž˜์˜ ์˜์‹์— ๋Œ€ํ•ด์„œ๋„ ๋ณด์žฅํ•ด์•ผ ํ•˜์ฃ 
65:47
by being a multi-planet species.
1228
3947898
2252
๋‹คํ–‰์„ฑ์ข…์ด ๋˜๋Š” ๋ฐฉ์‹์œผ๋กœ์š”.
65:51
We need to address --
1229
3951151
1293
์šฐ๋ฆฌ๋Š” ๋Œ€์ฒ˜ํ•ด์•ผ ํ•ฉ๋‹ˆ๋‹ค --
65:52
Essentially, it's important to take whatever actions we can think of
1230
3952486
3212
๊ธฐ๋ณธ์ ์œผ๋กœ ์šฐ๋ฆฌ๊ฐ€ ์ƒ๊ฐํ•  ์ˆ˜ ์žˆ๋Š” ์–ด๋–ค ์กฐ์น˜๋“  ์ทจํ•˜๋Š” ๊ฒŒ ์ค‘์š”ํ•ด์š”.
65:55
to address the existential risks that affect the future of consciousness.
1231
3955698
4796
์‹ค์กด์  ์œ„ํ˜‘์ด ๋ฏธ๋ž˜์˜ ์˜์‹์— ์˜ํ–ฅ์„ ๋ฏธ์น  ์ˆ˜ ์žˆ์Œ์„ ์•Œ๋ฆฌ๋ ค๋ฉด์š”.
66:00
CA: There's a whole generation coming through
1232
3960536
2127
CA: ๋ฏธ๋ž˜์— ๋Œ€ํ•ด ์ •๋ง๋กœ ์Šฌํผํ•˜๋Š”
66:02
who seem really sad about the future.
1233
3962663
1793
์„ธ๋Œ€๋“ค์ด ์˜ค๊ณ  ์žˆ๋Š”๋ฐ
66:04
What would you say to them?
1234
3964456
1794
๊ทธ๋“ค์—๊ฒŒ ๋ญ๋ผ๊ณ  ์–˜๊ธฐํ•ด์•ผ ํ• ๊นŒ์š”.
66:07
EM: Well, I think if you want the future to be good, you must make it so.
1235
3967376
3587
EM: ๋ฏธ๋ž˜๊ฐ€ ์ข‹๊ธฐ๋ฅผ ์›ํ•œ๋‹ค๋ฉด ๊ทธ๋ ‡๊ฒŒ ๋งŒ๋“ค์–ด์•ผ๋งŒ ํ•˜๊ณ 
66:12
Take action to make it good.
1236
3972256
2419
์ข‹์€ ๋ฏธ๋ž˜๋ฅผ ๋งŒ๋“ค๊ธฐ ์œ„ํ•ด ํ–‰๋™์„ ํ•ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.
66:14
And it will be.
1237
3974717
1209
๊ทธ๋ ‡๋‹ค๋ฉด ๊ทธ๋ ‡๊ฒŒ ๋  ๊ฑฐ์˜ˆ์š”.
66:17
CA: Elon, thank you for all this time.
1238
3977177
2211
CA: ์ผ๋ก , ์‹œ๊ฐ„ ๋‚ด์ฃผ์…”์„œ ๊ฐ์‚ฌํ•ฉ๋‹ˆ๋‹ค.
66:19
That is a beautiful place to end.
1239
3979722
1668
๋๋‚ด๊ธฐ์— ์ข‹์€ ํƒ€์ด๋ฐ์ด๋„ค์š”.
66:21
Thanks for all you're doing.
1240
3981390
1376
๋ง์”€ ๊ฐ์‚ฌ๋“œ๋ฆฝ๋‹ˆ๋‹ค.
66:22
EM: You're welcome.
1241
3982766
1210
EM: ์ฒœ๋งŒ์—์š”.
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7