How to get empowered, not overpowered, by AI | Max Tegmark

127,885 views ใƒป 2018-07-05

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

๋ฒˆ์—ญ: JY Kang ๊ฒ€ํ† : Young You
00:12
After 13.8 billion years of cosmic history, our universe has woken up and become aware of itself. From a small blue planet, tiny, conscious parts of our universe have begun gazing out into the cosmos with telescopes, discovering something humbling. We've discovered that our universe is vastly grander than our ancestors imagined and that life seems to be an almost imperceptibly small perturbation on an otherwise dead universe. But we've also discovered something inspiring, which is that the technology we're developing has the potential to help life flourish like never before, not just for centuries but for billions of years, and not just on earth but throughout much of this amazing cosmos.

00:59
I think of the earliest life as "Life 1.0" because it was really dumb, like bacteria, unable to learn anything during its lifetime. I think of us humans as "Life 2.0" because we can learn, which we in nerdy, geek speak, might think of as installing new software into our brains, like languages and job skills. "Life 3.0," which can design not only its software but also its hardware, of course doesn't exist yet. But perhaps our technology has already made us "Life 2.1," with our artificial knees, pacemakers and cochlear implants.

01:33
So let's take a closer look at our relationship with technology, OK? As an example, the Apollo 11 moon mission was both successful and inspiring, showing that when we humans use technology wisely, we can accomplish things that our ancestors could only dream of. But there's an even more inspiring journey propelled by something more powerful than rocket engines, where the passengers aren't just three astronauts but all of humanity. Let's talk about our collective journey into the future with artificial intelligence.

02:08
My friend Jaan Tallinn likes to point out that just as with rocketry, it's not enough to make our technology powerful. We also have to figure out, if we're going to be really ambitious, how to steer it and where we want to go with it. So let's talk about all three for artificial intelligence: the power, the steering and the destination.

02:31
Let's start with the power. I define intelligence very inclusively -- simply as our ability to accomplish complex goals, because I want to include both biological and artificial intelligence. And I want to avoid the silly carbon-chauvinism idea that you can only be smart if you're made of meat.

02:52
It's really amazing how the power of AI has grown recently. Just think about it. Not long ago, robots couldn't walk. Now, they can do backflips. Not long ago, we didn't have self-driving cars. Now, we have self-flying rockets. Not long ago, AI couldn't do face recognition. Now, AI can generate fake faces and simulate your face saying stuff that you never said. Not long ago, AI couldn't beat us at the game of Go. Then, Google DeepMind's AlphaZero AI took 3,000 years of human Go games and Go wisdom, ignored it all and became the world's best player by just playing against itself. And the most impressive feat here wasn't that it crushed human gamers, but that it crushed human AI researchers who had spent decades handcrafting game-playing software. And AlphaZero crushed human AI researchers not just in Go but even at chess, which we have been working on since 1950.

04:02
So all this amazing recent progress in AI really begs the question: How far will it go? I like to think about this question in terms of this abstract landscape of tasks, where the elevation represents how hard it is for AI to do each task at human level, and the sea level represents what AI can do today. The sea level is rising as AI improves, so there's a kind of global warming going on here in the task landscape. And the obvious takeaway is to avoid careers at the waterfront -- (Laughter) which will soon be automated and disrupted.

04:37
But there's a much bigger question as well. How high will the water end up rising? Will it eventually rise to flood everything, matching human intelligence at all tasks? This is the definition of artificial general intelligence -- AGI, which has been the holy grail of AI research since its inception. By this definition, people who say, "Ah, there will always be jobs that humans can do better than machines," are simply saying that we'll never get AGI. Sure, we might still choose to have some human jobs or to give humans income and purpose with our jobs, but AGI will in any case transform life as we know it with humans no longer being the most intelligent.

05:20
Now, if the water level does reach AGI, then further AI progress will be driven mainly not by humans but by AI, which means that there's a possibility that further AI progress could be way faster than the typical human research and development timescale of years, raising the controversial possibility of an intelligence explosion where recursively self-improving AI rapidly leaves human intelligence far behind, creating what's known as superintelligence.

05:51
Alright, reality check: Are we going to get AGI any time soon? Some famous AI researchers, like Rodney Brooks, think it won't happen for hundreds of years. But others, like Google DeepMind founder Demis Hassabis, are more optimistic and are working to try to make it happen much sooner. And recent surveys have shown that most AI researchers actually share Demis's optimism, expecting that we will get AGI within decades, so within the lifetime of many of us, which begs the question -- and then what? What do we want the role of humans to be if machines can do everything better and cheaper than us?

06:35
The way I see it, we face a choice. One option is to be complacent. We can say, "Oh, let's just build machines that can do everything we can do and not worry about the consequences. Come on, if we build technology that makes all humans obsolete, what could possibly go wrong?" (Laughter) But I think that would be embarrassingly lame. I think we should be more ambitious -- in the spirit of TED. Let's envision a truly inspiring high-tech future and try to steer towards it.

07:05
This brings us to the second part of our rocket metaphor: the steering. We're making AI more powerful, but how can we steer towards a future where AI helps humanity flourish rather than flounder? To help with this, I cofounded the Future of Life Institute. It's a small nonprofit promoting beneficial technology use, and our goal is simply for the future of life to exist and to be as inspiring as possible.

07:29
You know, I love technology. Technology is why today is better than the Stone Age. And I'm optimistic that we can create a really inspiring high-tech future ... if -- and this is a big if -- if we win the wisdom race -- the race between the growing power of our technology and the growing wisdom with which we manage it. But this is going to require a change of strategy because our old strategy has been learning from mistakes.

07:57
We invented fire, screwed up a bunch of times -- invented the fire extinguisher. (Laughter) We invented the car, screwed up a bunch of times -- invented the traffic light, the seat belt and the airbag, but with more powerful technology like nuclear weapons and AGI, learning from mistakes is a lousy strategy, don't you think? (Laughter) It's much better to be proactive rather than reactive; plan ahead and get things right the first time because that might be the only time we'll get.

08:25
But it is funny because sometimes people tell me, "Max, shhh, don't talk like that. That's Luddite scaremongering." But it's not scaremongering. It's what we at MIT call safety engineering. Think about it: before NASA launched the Apollo 11 mission, they systematically thought through everything that could go wrong when you put people on top of explosive fuel tanks and launch them somewhere where no one could help them. And there was a lot that could go wrong. Was that scaremongering? No. That was precisely the safety engineering that ensured the success of the mission, and that is precisely the strategy I think we should take with AGI. Think through what can go wrong to make sure it goes right.

09:08
So in this spirit, we've organized conferences, bringing together leading AI researchers and other thinkers to discuss how to grow this wisdom we need to keep AI beneficial. Our last conference was in Asilomar, California last year and produced this list of 23 principles which have since been signed by over 1,000 AI researchers and key industry leaders, and I want to tell you about three of these principles.

09:31
One is that we should avoid an arms race and lethal autonomous weapons. The idea here is that any science can be used for new ways of helping people or new ways of harming people. For example, biology and chemistry are much more likely to be used for new medicines or new cures than for new ways of killing people, because biologists and chemists pushed hard -- and successfully -- for bans on biological and chemical weapons. And in the same spirit, most AI researchers want to stigmatize and ban lethal autonomous weapons.

10:03
Another Asilomar AI principle is that we should mitigate AI-fueled income inequality. I think that if we can grow the economic pie dramatically with AI and we still can't figure out how to divide this pie so that everyone is better off, then shame on us. (Applause)

10:23
Alright, now raise your hand if your computer has ever crashed. (Laughter) Wow, that's a lot of hands. Well, then you'll appreciate this principle that we should invest much more in AI safety research, because as we put AI in charge of even more decisions and infrastructure, we need to figure out how to transform today's buggy and hackable computers into robust AI systems that we can really trust, because otherwise, all this awesome new technology can malfunction and harm us, or get hacked and be turned against us.

10:51
And this AI safety work has to include work on AI value alignment, because the real threat from AGI isn't malice, like in silly Hollywood movies, but competence -- AGI accomplishing goals that just aren't aligned with ours. For example, when we humans drove the West African black rhino extinct, we didn't do it because we were a bunch of evil rhinoceros haters, did we? We did it because we were smarter than them and our goals weren't aligned with theirs. But AGI is by definition smarter than us, so to make sure that we don't put ourselves in the position of those rhinos if we create AGI, we need to figure out how to make machines understand our goals, adopt our goals and retain our goals.

11:37
And whose goals should these be, anyway? Which goals should they be? This brings us to the third part of our rocket metaphor: the destination. We're making AI more powerful, trying to figure out how to steer it, but where do we want to go with it? This is the elephant in the room that almost nobody talks about -- not even here at TED -- because we're so fixated on short-term AI challenges.

12:04
Look, our species is trying to build AGI, motivated by curiosity and economics, but what sort of future society are we hoping for if we succeed? We did an opinion poll on this recently, and I was struck to see that most people actually want us to build superintelligence: AI that's vastly smarter than us in all ways. What there was the greatest agreement on was that we should be ambitious and help life spread into the cosmos, but there was much less agreement about who or what should be in charge. And I was actually quite amused to see that there's some people who want it to be just machines. (Laughter) And there was total disagreement about what the role of humans should be, even at the most basic level, so let's take a closer look at possible futures that we might choose to steer toward, alright?

12:55
So don't get me wrong here. I'm not talking about space travel, merely about humanity's metaphorical journey into the future. So one option that some of my AI colleagues like is to build superintelligence and keep it under human control, like an enslaved god, disconnected from the internet and used to create unimaginable technology and wealth for whoever controls it. But Lord Acton warned us that power corrupts, and absolute power corrupts absolutely, so you might worry that maybe we humans just aren't smart enough, or wise enough rather, to handle this much power. Also, aside from any moral qualms you might have about enslaving superior minds, you might worry that maybe the superintelligence could outsmart us, break out and take over.

13:43
But I also have colleagues who are fine with AI taking over and even causing human extinction, as long as we feel the AIs are our worthy descendants, like our children. But how would we know that the AIs have adopted our best values and aren't just unconscious zombies tricking us into anthropomorphizing them? Also, shouldn't those people who don't want human extinction have a say in the matter, too?

14:10
Now, if you didn't like either of those two high-tech options, it's important to remember that low-tech is suicide from a cosmic perspective, because if we don't go far beyond today's technology, the question isn't whether humanity is going to go extinct, merely whether we're going to get taken out by the next killer asteroid, supervolcano or some other problem that better technology could have solved.

14:30
So, how about having our cake and eating it ... with AGI that's not enslaved but treats us well because its values are aligned with ours? This is the gist of what Eliezer Yudkowsky has called "friendly AI," and if we can do this, it could be awesome. It could not only eliminate negative experiences like disease, poverty, crime and other suffering, but it could also give us the freedom to choose from a fantastic new diversity of positive experiences -- basically making us the masters of our own destiny.

15:06
So in summary, our situation with technology is complicated, but the big picture is rather simple. Most AI researchers expect AGI within decades, and if we just bumble into this unprepared, it will probably be the biggest mistake in human history -- let's face it. It could enable brutal, global dictatorship with unprecedented inequality, surveillance and suffering, and maybe even human extinction. But if we steer carefully, we could end up in a fantastic future where everybody's better off: the poor are richer, the rich are richer, everybody is healthy and free to live out their dreams.

15:47
Now, hang on. Do you folks want the future that's politically right or left? Do you want the pious society with strict moral rules, or do you want a hedonistic free-for-all, more like Burning Man 24/7? Do you want beautiful beaches, forests and lakes, or would you prefer to rearrange some of those atoms with the computers, enabling virtual experiences? With friendly AI, we could simply build all of these societies and give people the freedom to choose which one they want to live in because we would no longer be limited by our intelligence, merely by the laws of physics. So the resources and space for this would be astronomical -- literally.

16:25
So here's our choice. We can either be complacent about our future, taking as an article of blind faith that any new technology is guaranteed to be beneficial, and just repeat that to ourselves as a mantra over and over and over again as we drift like a rudderless ship towards our own obsolescence. Or we can be ambitious -- thinking hard about how to steer our technology and where we want to go with it to create the age of amazement. We're all here to celebrate the age of amazement, and I feel that its essence should lie in becoming not overpowered but empowered by our technology. Thank you.

(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7