What Is an AI Anyway? | Mustafa Suleyman | TED

2,284,013 views ใƒป 2024-04-22

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translator: Hyerin Kim    Reviewer: JY Kang
00:04
I want to tell you what I see coming.

00:07
I've been lucky enough to be working on AI for almost 15 years now. Back when I started, to describe it as fringe would be an understatement. Researchers would say, "No, no, we're only working on machine learning." Because working on AI was seen as way too out there. In 2010, just the very mention of the phrase "AGI," artificial general intelligence, would get you some seriously strange looks and even a cold shoulder. "You're actually building AGI?" people would say. "Isn't that something out of science fiction?" People thought it was 50 years away or 100 years away, if it was even possible at all. Talk of AI was, I guess, kind of embarrassing. People generally thought we were weird. And I guess in some ways we kind of were.

00:56
It wasn't long, though, before AI started beating humans at a whole range of tasks that people previously thought were way out of reach. Understanding images, translating languages, transcribing speech, playing Go and chess and even diagnosing diseases.

01:15
People started waking up to the fact that AI was going to have an enormous impact, and they were rightly asking technologists like me some pretty tough questions. Is it true that AI is going to solve the climate crisis? Will it make personalized education available to everyone? Does it mean we'll all get universal basic income and we won't have to work anymore? Should I be afraid? What does it mean for weapons and war? And of course, will China win? Are we in a race? Are we headed for a mass misinformation apocalypse? All good questions.

01:51
But it was actually a simpler and much more kind of fundamental question that left me puzzled. One that actually gets to the very heart of my work every day.

02:03
One morning over breakfast, my six-year-old nephew Caspian was playing with Pi, the AI I created at my last company, Inflection. With a mouthful of scrambled eggs, he looked at me plain in the face and said, "But Mustafa, what is an AI anyway?"

02:21
He's such a sincere and curious and optimistic little guy. He'd been talking to Pi about how cool it would be if one day in the future, he could visit dinosaurs at the zoo. And how he could make infinite amounts of chocolate at home. And why Pi couldn't yet play I Spy.

02:39
"Well," I said, "it's a clever piece of software that's read most of the text on the open internet, and it can talk to you about anything you want."

02:48
"Right. So like a person then?"

02:54
I was stumped. Genuinely left scratching my head.

03:00
All my boring stock answers came rushing through my mind. "No, but AI is just another general-purpose technology, like printing or steam. It will be a tool that will augment us and make us smarter and more productive. And when it gets better over time, it'll be like an all-knowing oracle that will help us solve grand scientific challenges."

03:22
You know, all of these responses started to feel, I guess, a little bit defensive. And actually better suited to a policy seminar than breakfast with a no-nonsense six-year-old. "Why am I hesitating?" I thought to myself.

03:37
You know, let's be honest. My nephew was asking me a simple question that those of us in AI just don't confront often enough. What is it that we are actually creating? What does it mean to make something totally new, fundamentally different to any invention that we have known before?

04:00
It is clear that we are at an inflection point in the history of humanity. On our current trajectory, we're headed towards the emergence of something that we are all struggling to describe, and yet we cannot control what we don't understand. And so the metaphors, the mental models, the names, these all matter if we're to get the most out of AI whilst limiting its potential downsides.

04:30
As someone who embraces the possibilities of this technology, but who's also always cared deeply about its ethics, we should, I think, be able to easily describe what it is we are building. And that includes the six-year-olds. So it's in that spirit that I offer up today the following metaphor for helping us to try to grapple with what this moment really is.

04:52
I think AI should best be understood as something like a new digital species.

05:00
Now, don't take this too literally, but I predict that we'll come to see them as digital companions, new partners in the journeys of all our lives. Whether you think we're on a 10-, 20- or 30-year path here, this is, in my view, the most accurate and most fundamentally honest way of describing what's actually coming. And above all, it enables everybody to prepare for and shape what comes next.

05:29
Now I totally get, this is a strong claim, and I'm going to explain to everyone as best I can why I'm making it. But first, let me just try to set the context.

05:39
From the very first microscopic organisms, life on Earth stretches back billions of years. Over that time, life evolved and diversified. Then a few million years ago, something began to shift. After countless cycles of growth and adaptation, one of life's branches began using tools, and that branch grew into us.

06:04
We went on to produce a mesmerizing variety of tools, at first slowly and then with astonishing speed, we went from stone axes and fire to language, writing and eventually industrial technologies. One invention unleashed a thousand more. And in time, we became homo technologicus.

06:29
Around 80 years ago, another new branch of technology began. With the invention of computers, we quickly jumped from the first mainframes and transistors to today's smartphones and virtual-reality headsets. Information, knowledge, communication, computation. In this revolution, creation has exploded like never before.

06:53
And now a new wave is upon us. Artificial intelligence. These waves of history are clearly speeding up, as each one is amplified and accelerated by the last. And if you look back, it's clear that we are in the fastest and most consequential wave ever. The journeys of humanity and technology are now deeply intertwined.

07:16
In just 18 months, over a billion people have used large language models. We've witnessed one landmark event after another. Just a few years ago, people said that AI would never be creative. And yet AI now feels like an endless river of creativity, making poetry and images and music and video that stretch the imagination. People said it would never be empathetic. And yet today, millions of people enjoy meaningful conversations with AIs, talking about their hopes and dreams and helping them work through difficult emotional challenges. AIs can now drive cars, manage energy grids and even invent new molecules. Just a few years ago, each of these was impossible.

08:03
And all of this is turbocharged by spiraling exponentials of data and computation. Last year, Inflection 2.5, our last model, used five billion times more computation than the DeepMind AI that beat the old-school Atari games just over 10 years ago. That's nine orders of magnitude more computation. 10x per year, every year for almost a decade.
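As a quick sanity check on those round numbers (taking the roughly ten-year gap stated above at face value): a factor of five billion is about 9.7 orders of magnitude, and spread evenly over a decade that is almost exactly a tenfold increase per year.

\[
5 \times 10^{9} = 10^{\log_{10} 5 + 9} \approx 10^{9.7},
\qquad
\bigl(10^{9.7}\bigr)^{1/10} \approx 10^{0.97} \approx 9.3\ \text{per year}
\]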
08:34
Over the same time, the size of these models has grown from first tens of millions of parameters to then billions of parameters, and very soon, tens of trillions of parameters.

08:45
If someone did nothing but read 24 hours a day for their entire life, they'd consume eight billion words. And of course, that's a lot of words. But today, the most advanced AIs consume more than eight trillion words in a single month of training.
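The lifetime-reading figure is consistent with, for example, a pace of about 200 words per minute sustained nonstop for 80 years (both illustrative assumptions, not numbers from the talk). On those assumptions, eight trillion training words per month come to roughly a thousand such reading lifetimes every month, which matches the "thousand lifetimes" comparison made later in the talk.

\[
200\ \tfrac{\text{words}}{\text{min}} \times 60 \times 24 \times 365 \times 80 \approx 8.4 \times 10^{9}\ \text{words},
\qquad
\frac{8 \times 10^{12}\ \text{words/month}}{8 \times 10^{9}\ \text{words/lifetime}} = 10^{3}\ \text{lifetimes/month}
\]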
09:03
And all of this is set to continue. The long arc of technological history is now in an extraordinary new phase.

09:12
So what does this mean in practice? Well, just as the internet gave us the browser and the smartphone gave us apps, the cloud-based supercomputer is ushering in a new era of ubiquitous AIs. Everything will soon be represented by a conversational interface. Or, to put it another way, a personal AI.

09:35
And these AIs will be infinitely knowledgeable, and soon they'll be factually accurate and reliable. They'll have near-perfect IQ. They'll also have exceptional EQ. They'll be kind, supportive, empathetic.

09:53
These elements on their own would be transformational. Just imagine if everybody had a personalized tutor in their pocket and access to low-cost medical advice. A lawyer and a doctor, a business strategist and coach -- all in your pocket 24 hours a day.

10:08
But things really start to change when they develop what I call AQ, their "actions quotient." This is their ability to actually get stuff done in the digital and physical world.

10:20
And before long, it won't just be people that have AIs. Strange as it may sound, every organization, from small business to nonprofit to national government, each will have their own. Every town, building and object will be represented by a unique interactive persona. And these won't just be mechanistic assistants. They'll be companions, confidants, colleagues, friends and partners, as varied and unique as we all are.

10:52
At this point, AIs will convincingly imitate humans at most tasks. And we'll feel this at the most intimate of scales. An AI organizing a community get-together for an elderly neighbor. A sympathetic expert helping you make sense of a difficult diagnosis.

11:09
But we'll also feel it at the largest scales. Accelerating scientific discovery, autonomous cars on the roads, drones in the skies. They'll both order the takeout and run the power station. They'll interact with us and, of course, with each other. They'll speak every language, take in every pattern of sensor data, sights, sounds, streams and streams of information, far surpassing what any one of us could consume in a thousand lifetimes.

11:40
So what is this? What are these AIs?

11:46
If we are to prioritize safety above all else, to ensure that this new wave always serves and amplifies humanity, then we need to find the right metaphors for what this might become.

12:01
For years, we in the AI community, and I specifically, have had a tendency to refer to this as just tools. But that doesn't really capture what's actually happening here. AIs are clearly more dynamic, more ambiguous, more integrated and more emergent than mere tools, which are entirely subject to human control.

12:25
So to contain this wave, to put human agency at its center and to mitigate the inevitable unintended consequences that are likely to arise, we should start to think about them as we might a new kind of digital species.

12:41
Now it's just an analogy, it's not a literal description, and it's not perfect. For a start, they clearly aren't biological in any traditional sense, but just pause for a moment and really think about what they already do. They communicate in our languages. They see what we see. They consume unimaginably large amounts of information. They have memory. They have personality. They have creativity. They can even reason to some extent and formulate rudimentary plans. They can act autonomously if we allow them. And they do all this at levels of sophistication that is far beyond anything that we've ever known from a mere tool.

13:27
And so saying AI is mainly about the math or the code is like saying we humans are mainly about carbon and water. It's true, but it completely misses the point.

13:42
And yes, I get it, this is a super arresting thought but I honestly think this frame helps sharpen our focus on the critical issues. What are the risks? What are the boundaries that we need to impose? What kind of AI do we want to build or allow to be built?

14:04
This is a story that's still unfolding. Nothing should be accepted as a given. We all must choose what we create. What AIs we bring into the world, or not.

14:18
These are the questions for all of us here today, and all of us alive at this moment. For me, the benefits of this technology are stunningly obvious, and they inspire my life's work every single day. But quite frankly, they'll speak for themselves.

14:37
Over the years, I've never shied away from highlighting risks and talking about downsides. Thinking in this way helps us focus on the huge challenges that lie ahead for all of us. But let's be clear. There is no path to progress where we leave technology behind.

14:55
The prize for all of civilization is immense. We need solutions in health care and education, to our climate crisis. And if AI delivers just a fraction of its potential, the next decade is going to be the most productive in human history.

15:13
Here's another way to think about it. In the past, unlocking economic growth often came with huge downsides. The economy expanded as people discovered new continents and opened up new frontiers. But they colonized populations at the same time. We built factories, but they were grim and dangerous places to work. We struck oil, but we polluted the planet.

15:42
Now because we are still designing and building AI, we have the potential and opportunity to do it better, radically better.

15:51
And today, we're not discovering a new continent and plundering its resources. We're building one from scratch. Sometimes people say that data or chips are the 21st century's new oil, but that's totally the wrong image. AI is to the mind what nuclear fusion is to energy. Limitless, abundant, world-changing.

16:17
And AI really is different, and that means we have to think about it creatively and honestly. We have to push our analogies and our metaphors to the very limits to be able to grapple with what's coming. Because this is not just another invention. AI is itself an infinite inventor. And yes, this is exciting and promising and concerning and intriguing all at once.

16:45
To be quite honest, it's pretty surreal. But step back, see it on the long view of glacial time, and these really are the very most appropriate metaphors that we have today.

16:57
Since the beginning of life on Earth, we've been evolving, changing and then creating everything around us in our human world today. And AI isn't something outside of this story. In fact, it's the very opposite. It's the whole of everything that we have created, distilled down into something that we can all interact with and benefit from. It's a reflection of humanity across time, and in this sense, it isn't a new species at all.

17:31
This is where the metaphors end. Here's what I'll tell Caspian next time he asks. AI isn't separate. AI isn't even, in some senses, new. AI is us. It's all of us. And this is perhaps the most promising and vital thing of all that even a six-year-old can get a sense for.

17:54
As we build out AI, we can and must reflect all that is good, all that we love, all that is special about humanity: our empathy, our kindness, our curiosity and our creativity.

18:09
This, I would argue, is the greatest challenge of the 21st century, but also the most wonderful, inspiring and hopeful opportunity for all of us.

18:20
Thank you.

18:21
(Applause)

18:26
Chris Anderson: Thank you Mustafa. It's an amazing vision and a super powerful metaphor. You're in an amazing position right now. I mean, you were connected at the hip to the amazing work happening at OpenAI. You're going to have resources made available, there are reports of these giant new data centers, 100 billion dollars invested and so forth. And a new species can emerge from it. I mean, in your book, you did, as well as painting an incredible optimistic vision, you were super eloquent on the dangers of AI. And I'm just curious, from the view that you have now, what is it that most keeps you up at night?

19:06
Mustafa Suleyman: I think the great risk is that we get stuck in what I call the pessimism aversion trap. You know, we have to have the courage to confront the potential of dark scenarios in order to get the most out of all the benefits that we see. So the good news is that if you look at the last two or three years, there have been very, very few downsides, right? It's very hard to say explicitly what harm an LLM has caused. But that doesn't mean that that's what the trajectory is going to be over the next 10 years. So I think if you pay attention to a few specific capabilities, take for example, autonomy. Autonomy is very obviously a threshold over which we increase risk in our society. And it's something that we should step towards very, very closely. The other would be something like recursive self-improvement. If you allow the model to independently self-improve, update its own code, explore an environment without oversight, and, you know, without a human in control to change how it operates, that would obviously be more dangerous. But I think that we're still some way away from that. I think it's still a good five to 10 years before we have to really confront that. But it's time to start talking about it now.

20:15
CA: A digital species, unlike any biological species, can replicate not in nine months, but in nine nanoseconds, and produce an indefinite number of copies of itself, all of which have more power than we have in many ways. I mean, the possibility for unintended consequences seems pretty immense. And isn't it true that if a problem happens, it could happen in an hour?

20:37
MS: No. That is really not true. I think there's no evidence to suggest that. And I think that, you know, that's often referred to as the "intelligence explosion." And I think it is a theoretical, hypothetical maybe that we're all kind of curious to explore, but there's no evidence that we're anywhere near anything like that. And I think it's very important that we choose our words super carefully. Because you're right, that's one of the weaknesses of the species framing, that we will design the capability for self-replication into it if people choose to do that. And I would actually argue that we should not, that would be one of the dangerous capabilities that we should step back from, right? So there's no chance that this will "emerge" accidentally. I really think that's a very low probability. It will happen if engineers deliberately design those capabilities in. And if they don't take enough efforts to deliberately design them out. And so this is the point of being explicit and transparent about trying to introduce safety by design very early on.

21:39
CA: Thank you, your vision of humanity injecting into this new thing the best parts of ourselves, avoiding all those weird, biological, freaky, horrible tendencies that we can have in certain circumstances, I mean, that is a very inspiring vision. And thank you so much for coming here and sharing it at TED. Thank you, good luck.

21:59
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7