Jeff Dean: AI isn't as smart as you think -- but it could be | TED

253,683 views · 2022-01-12

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

๋ฒˆ์—ญ: ์„ฑ์ค€ ์•ˆ ๊ฒ€ํ† : DK Kim
00:13
Hi, I'm Jeff. I lead AI Research and Health at Google. I joined Google more than 20 years ago, when we were all wedged into a tiny office space, above what's now a T-Mobile store in downtown Palo Alto. I've seen a lot of computing transformations in that time, and in the last decade, we've seen AI be able to do tremendous things. But we're still doing it all wrong in many ways. That's what I want to talk to you about today.

00:39
But first, let's talk about what AI can do. So in the last decade, we've seen tremendous progress in how AI can help computers see, understand language, understand speech better than ever before. Things that we couldn't do before, now we can do. If you think about computer vision alone, just in the last 10 years, computers have effectively developed the ability to see; 10 years ago, they couldn't see, now they can see. You can imagine this has had a transformative effect on what we can do with computers.

01:08
So let's look at a couple of the great applications enabled by these capabilities. We can better predict flooding, keep everyone safe, using machine learning. We can translate over 100 languages so we all can communicate better, and better predict and diagnose disease, where everyone gets the treatment that they need.

01:25
So let's look at two key components that underlie the progress in AI systems today. The first is neural networks, a breakthrough approach to solving some of these difficult problems that has really shone in the last 15 years. But they're not a new idea. And the second is computational power.

01:41
It actually takes a lot of computational power to make neural networks able to really sing, and in the last 15 years, we've been able to halve that, and that's partly what's enabled all this progress. But at the same time, I think we're doing several things wrong, and that's what I want to talk to you about at the end of the talk.

01:58
First, a bit of a history lesson. So for decades, almost since the very beginning of computing, people have wanted to be able to build computers that could see, understand language, understand speech. The earliest approaches to this, generally, people were trying to hand-code all the algorithms that you need to accomplish those difficult tasks, and it just turned out to not work very well. But in the last 15 years, a single approach unexpectedly advanced all these different problem spaces all at once: neural networks.

02:29
So neural networks are not a new idea. They're kind of loosely based on some of the properties that are in real neural systems. And many of the ideas behind neural networks have been around since the 1960s and 70s. A neural network is what it sounds like, a series of interconnected artificial neurons that loosely emulate the properties of your real neurons. An individual neuron in one of these systems has a set of inputs, each with an associated weight, and the output of a neuron is a function of those inputs multiplied by those weights. So pretty simple, and lots and lots of these work together to learn complicated things.

03:04
So how do we actually learn in a neural network? It turns out the learning process consists of repeatedly making tiny little adjustments to the weight values, strengthening the influence of some things, weakening the influence of others. By driving the overall system towards desired behaviors, these systems can be trained to do really complicated things, like translate from one language to another, detect what kind of objects are in a photo, all kinds of complicated things.

์ธ๊ณต ์‹ ๊ฒฝ๋ง์— ์ฒ˜์Œ์œผ๋กœ ๊ด€์‹ฌ์„ ๊ฐ–๊ฒŒ ๋œ ๊ณ„๊ธฐ๋Š”
03:32
I first got interested in neural networks
79
212038
2000
03:34
when I took a class on them as an undergraduate in 1990.
80
214079
3042
1990๋…„, ํ•™๋ถ€ ๋•Œ ๋“ค์—ˆ๋˜ ์ˆ˜์—…์ด์—ˆ์Šต๋‹ˆ๋‹ค.
03:37
At that time,
81
217163
1125
๋‹น์‹œ์—”
03:38
neural networks showed impressive results on tiny problems,
82
218329
3792
์ธ๊ณต ์‹ ๊ฒฝ๋ง์ด ์ž‘์€ ๊ทœ๋ชจ์˜ ๋ฌธ์ œ๋Š” ์ˆ˜์›”ํžˆ ํ•ด๊ฒฐํ•  ์ˆ˜ ์žˆ์—ˆ์ง€๋งŒ
03:42
but they really couldn't scale to do real-world important tasks.
83
222121
4375
ํ˜„์‹ค์˜ ์ค‘์š”ํ•œ ๋ฌธ์ œ๋Š” ํ•ด๊ฒฐํ•  ์ˆ˜ ์—†์—ˆ์Šต๋‹ˆ๋‹ค.
03:46
But I was super excited.
84
226538
1500
๊ทธ๋ž˜๋„ ์ •๋ง ์žฌ๋ฏธ์žˆ์—ˆ์Šต๋‹ˆ๋‹ค.
03:48
(Laughter)
85
228079
2500
(์›ƒ์Œ)
03:50
I felt maybe we just needed more compute power.
86
230579
2417
์ „์—๋Š” ๋‹จ์ˆœํžˆ ์—ฐ์‚ฐ ๋Šฅ๋ ฅ๋งŒ ๋” ์žˆ์œผ๋ฉด ๋˜์ง€ ์•Š์„๊นŒ ์ƒ๊ฐํ–ˆ์—ˆ์Šต๋‹ˆ๋‹ค.
03:52
And the University of Minnesota had a 32-processor machine.
87
232996
3625
๋‹น์‹œ ๋ฏธ๋„ค์†Œํƒ€ ๋Œ€ํ•™์—” ํ”„๋กœ์„ธ์„œ 32๊ฐœ์งœ๋ฆฌ ์ปดํ“จํ„ฐ๊ฐ€ ์žˆ์—ˆ๋Š”๋ฐ,
03:56
I thought, "With more compute power,
88
236621
1792
๊ณ„์‚ฐ์„ ๋” ๋นจ๋ฆฌ ํ•  ์ˆ˜ ์žˆ๋‹ค๋ฉด
03:58
boy, we could really make neural networks really sing."
89
238413
3000
์ธ๊ณต ์‹ ๊ฒฝ๋ง์„ ์ œ๋Œ€๋กœ ํ™œ์šฉํ•  ์ˆ˜ ์žˆ๊ฒ ๋‹ค๊ณ  ์ƒ๊ฐํ–ˆ์ฃ .
04:01
So I decided to do a senior thesis on parallel training of neural networks,
90
241454
3584
๊ทธ๋ž˜์„œ ์ธ๊ณต ์‹ ๊ฒฝ๋ง์˜ ๋ณ‘๋ ฌ ํ•™์Šต์„ ์กธ์—… ๋…ผ๋ฌธ์œผ๋กœ ํ•˜๊ธฐ๋กœ ํ–ˆ์Šต๋‹ˆ๋‹ค.
ํ•œ ์ปดํ“จํ„ฐ๋‚˜ ์—ฌ๋Ÿฌ ์ปดํ“จํ„ฐ์˜ ํ”„๋กœ์„ธ์„œ๋ฅผ ์ด์šฉํ•ด์„œ
04:05
the idea of using processors in a computer or in a computer system
91
245079
4000
04:09
to all work toward the same task,
92
249079
2125
๋ชจ๋‘๊ฐ€ ์ธ๊ณต ์‹ ๊ฒฝ๋ง์˜ ํ•™์Šต์ด๋ผ๋Š” ๊ฐ™์€ ๊ณผ์ œ๋ฅผ ์ˆ˜ํ–‰ํ•˜๋Š” ๊ฑฐ์˜€์Šต๋‹ˆ๋‹ค.
04:11
that of training neural networks.
93
251204
1584
04:12
32 processors, wow,
94
252829
1292
ํ”„๋กœ์„ธ์„œ 32๊ฐœ๋กœ ๋ง์ž…๋‹ˆ๋‹ค!
04:14
weโ€™ve got to be able to do great things with this.
95
254163
2833
์ด๊ฑธ๋กœ ๋Œ€๋‹จํ•œ ์ผ์„ ํ•ด๋‚ผ ์ˆ˜ ์žˆ์„ ๊ฒƒ์ด๋ผ ์ƒ๊ฐํ–ˆ์Šต๋‹ˆ๋‹ค.
04:17
But I was wrong.
96
257496
1167
ํ•˜์ง€๋งŒ ์ œ ์ƒ๊ฐ์ด ํ‹€๋ ธ์Šต๋‹ˆ๋‹ค.
04:20
Turns out we needed about a million times as much computational power
97
260038
3333
1990๋…„ ๋‹น์‹œ์˜ ์ปดํ“จํ„ฐ๋ณด๋‹ค ์—ฐ์‚ฐ ๋Šฅ๋ ฅ์ด ์•ฝ ๋ฐฑ๋งŒ ๋ฐฐ ์ข‹์•„์•ผ
04:23
as we had in 1990
98
263371
1375
04:24
before we could actually get neural networks to do impressive things.
99
264788
3333
๋น„๋กœ์†Œ ์ธ๊ณต ์‹ ๊ฒฝ๋ง์„ ๋ณธ๊ฒฉ์ ์œผ๋กœ ํ™œ์šฉํ•  ์ˆ˜ ์žˆ๋˜ ๊ฒƒ์ž…๋‹ˆ๋‹ค.
04:28
But starting around 2005,
100
268121
2417
ํ•˜์ง€๋งŒ 2005๋…„ ๋ฌด๋ ต๋ถ€ํ„ฐ๋Š”
04:30
thanks to the computing progress of Moore's law,
101
270579
2500
๋ฌด์–ด์˜ ๋ฒ•์น™์— ๋”ฐ๋ฅธ ์—ฐ์‚ฐ ๋Šฅ๋ ฅ์˜ ๋ฐœ์ „์œผ๋กœ
์‹ค์ œ๋กœ ๊ทธ ์ˆ˜์ค€์— ์ด๋ฅด๋Š” ์—ฐ์‚ฐ ๋Šฅ๋ ฅ์„ ๊ฐ–์ถ”๊ฒŒ ๋˜์—ˆ๊ณ 
04:33
we actually started to have that much computing power,
102
273121
2625
04:35
and researchers in a few universities around the world started to see success
103
275746
4250
์ „ ์„ธ๊ณ„ ๋ช‡๋ช‡ ๋Œ€ํ•™ ์—ฐ๊ตฌ์ง„๋“ค์€
์ธ๊ณต ์‹ ๊ฒฝ๋ง์„ ๋‹ค์–‘ํ•œ ๋ฌธ์ œ ํ•ด๊ฒฐ์— ์„ฑ๊ณต์ ์œผ๋กœ ํ™œ์šฉํ•˜๊ธฐ ์‹œ์ž‘ํ–ˆ์Šต๋‹ˆ๋‹ค.
04:40
in using neural networks for a wide variety of different kinds of tasks.
104
280038
4083
04:44
I and a few others at Google heard about some of these successes,
105
284121
3583
๊ตฌ๊ธ€์— ์žˆ๋˜ ์ €์™€ ๋™๋ฃŒ๋“ค์€
์ด ์†Œ์‹์„ ๋“ฃ๊ณ  ์•„์ฃผ ๊ฑฐ๋Œ€ํ•œ ์‹ ๊ฒฝ๋ง์„ ํ•™์Šต์‹œํ‚ค๋Š” ๊ณผ์ œ๋ฅผ ์‹œ์ž‘ํ–ˆ์Šต๋‹ˆ๋‹ค.
04:47
and we decided to start a project to train very large neural networks.
106
287746
3333
04:51
One system that we trained, we trained with 10 million randomly selected frames from YouTube videos. The system developed the capability to recognize all kinds of different objects. And it being YouTube, of course, it developed the ability to recognize cats. YouTube is full of cats. (Laughter) But what made that so remarkable is that the system was never told what a cat was. So using just patterns in data, the system honed in on the concept of a cat all on its own.

05:20
All of this occurred at the beginning of a decade-long string of successes, of using neural networks for a huge variety of tasks, at Google and elsewhere. Many of the things you use every day, things like better speech recognition for your phone, improved understanding of queries and documents for better search quality, better understanding of geographic information to improve maps, and so on.

05:40
Around that time, we also got excited about how we could build hardware that was better tailored to the kinds of computations neural networks wanted to do.

05:48
Neural network computations have two special properties. The first is they're very tolerant of reduced precision. Couple of significant digits, you don't need six or seven. And the second is that all the algorithms are generally composed of different sequences of matrix and vector operations. So if you can build a computer that is really good at low-precision matrix and vector operations but can't do much else, that's going to be great for neural-network computation, even though you can't use it for a lot of other things. And if you build such things, people will find amazing uses for them.

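A small NumPy sketch of those two properties: the core work is matrix-vector products, and they tolerate reduced precision (here float16 in place of float32). This is only an illustration of the idea, not TPU code.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 256)).astype(np.float32)   # a layer's weight matrix
    x = rng.standard_normal(256).astype(np.float32)          # an input vector

    full = W @ x                                              # 32-bit matrix-vector product
    low = (W.astype(np.float16) @ x.astype(np.float16)).astype(np.float32)

    # Only a couple of significant digits differ, which neural nets tolerate well.
    print(np.max(np.abs(full - low) / (np.abs(full) + 1e-6)))
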
์ด๊ฒƒ์€ ์ €ํฌ๊ฐ€ ์ฒ˜์Œ ๋งŒ๋“  TPU v1์ž…๋‹ˆ๋‹ค.
06:19
This is the first one we built, TPU v1.
143
379496
2083
06:21
"TPU" stands for Tensor Processing Unit.
144
381621
2542
TPU๋Š” ํ…์„œ ์ฒ˜๋ฆฌ ์žฅ์น˜์˜ ์•ฝ์ž์ž…๋‹ˆ๋‹ค.
06:24
These have been used for many years behind every Google search,
145
384204
3042
์ง€๋‚œ ๋ช‡ ๋…„ ๊ฐ„ ๊ตฌ๊ธ€์˜ ๊ฒ€์ƒ‰๊ณผ ๋ฒˆ์—ญ์„ ๋น„๋กฏํ•ด
06:27
for translation,
146
387246
1167
๋”ฅ๋งˆ์ธ๋“œ์‚ฌ์˜ ์•ŒํŒŒ๊ณ  ๋Œ€๊ตญ์— ์“ฐ์˜€์œผ๋ฉฐ
06:28
in the DeepMind AlphaGo matches,
147
388454
1917
06:30
so Lee Sedol and Ke Jie maybe didn't realize,
148
390413
2583
์•ŒํŒŒ๊ณ ์™€ ๋Œ€๊ตญํ•œ ์ด์„ธ๋Œ๊ณผ ์ปค์ œ๋Š”
TPU๊ฐ€ ๊ฝ‚ํžŒ ๊ธฐ๊ณ„์™€ ๋ฐ”๋‘‘์„ ๋’€๋Š”์ง€ ๋ชจ๋ฅผ ๊ฒƒ์ž…๋‹ˆ๋‹ค.
06:33
but they were competing against racks of TPU cards.
149
393038
2708
06:35
And we've built a bunch of subsequent versions of TPUs
150
395788
2583
์ดํ›„์—” ํ›จ์”ฌ ๋น ๋ฅธ ์—ฌ๋Ÿฌ ํ›„์†ํŒ์„ ๋งŒ๋“ค์—ˆ์Šต๋‹ˆ๋‹ค.
06:38
that are even better and more exciting.
151
398371
1875
06:40
But despite all these successes, I think we're still doing many things wrong, and I'll tell you about three key things we're doing wrong, and how we'll fix them. The first is that most neural networks today are trained to do one thing, and one thing only. You train it for a particular task that you might care deeply about, but it's a pretty heavyweight activity. You need to curate a data set, you need to decide what network architecture you'll use for this problem, you need to initialize the weights with random values, apply lots of computation to make adjustments to the weights. And at the end, if you're lucky, you end up with a model that is really good at that task you care about. But if you do this over and over, you end up with thousands of separate models, each perhaps very capable, but separate for all the different tasks you care about.

07:23
But think about how people learn. In the last year, many of us have picked up a bunch of new skills. I've been honing my gardening skills, experimenting with vertical hydroponic gardening. To do that, I didn't need to relearn everything I already knew about plants. I was able to know how to put a plant in a hole, how to pour water, that plants need sun, and leverage that in learning this new skill. Computers can work the same way, but they don't today. If you train a neural network from scratch, it's effectively like forgetting your entire education every time you try to do something new. That's crazy, right?

07:58
So instead, I think we can and should be training multitask models that can do thousands or millions of different tasks. Each part of that model would specialize in different kinds of things. And then, if we have a model that can do a thousand things, and the thousand and first thing comes along, we can leverage the expertise we already have in the related kinds of things so that we can more quickly be able to do this new task, just like you, if you're confronted with some new problem, you quickly identify the 17 things you already know that are going to be helpful in solving that problem.

08:29
Second problem is that most of our models today deal with only a single modality of data -- with images, or text or speech, but not all of these all at once. But think about how you go about the world. You're continuously using all your senses to learn from, react to, figure out what actions you want to take in the world. Makes a lot more sense to do that, and we can build models in the same way.

08:52
We can build models that take in these different modalities of input data, text, images, speech, but then fuse them together, so that regardless of whether the model sees the word "leopard," sees a video of a leopard or hears someone say the word "leopard," the same response is triggered inside the model: the concept of a leopard can deal with different kinds of input data, even nonhuman inputs, like genetic sequences, 3D clouds of points, as well as images, text and video.

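A schematic sketch of the fusion idea: separate encoders project text, image, and audio features into one shared embedding space, so that "leopard" in any modality can land near the same point and trigger the same response downstream. The encoders here are random stand-ins with invented dimensions; in a real system they would be trained so the modalities line up.

    import numpy as np

    rng = np.random.default_rng(0)
    EMBED_DIM = 64

    # Stand-in encoders (random projections); real ones would be learned networks.
    text_proj = rng.standard_normal((300, EMBED_DIM))     # from a 300-d text feature
    image_proj = rng.standard_normal((1024, EMBED_DIM))   # from a 1024-d image feature
    audio_proj = rng.standard_normal((128, EMBED_DIM))    # from a 128-d audio feature

    def embed(features, projection):
        v = features @ projection
        return v / np.linalg.norm(v)          # unit vector in the shared space

    def similarity(a, b):
        return float(a @ b)                   # cosine similarity between embeddings

    # Training would pull the word "leopard", a leopard photo, and spoken "leopard"
    # toward the same region of this space, so one concept handles all modalities.
    print(embed(rng.standard_normal(300), text_proj).shape)   # (64,)
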
09:20
The third problem is that today's models are dense. There's a single model, the model is fully activated for every task, for every example that we want to accomplish, whether that's a really simple or a really complicated thing. This, too, is unlike how our own brains work. Different parts of our brains are good at different things, and we're continuously calling upon the pieces of them that are relevant for the task at hand. For example, nervously watching a garbage truck back up towards your car, the part of your brain that thinks about Shakespearean sonnets is probably inactive. (Laughter)

09:54
AI models can work the same way. Instead of a dense model, we can have one that is sparsely activated. So for particular different tasks, we call upon different parts of the model. During training, the model can also learn which parts are good at which things, to continuously identify what parts it wants to call upon in order to accomplish a new task. The advantage of this is we can have a very high-capacity model, but it's very efficient, because we're only calling upon the parts that we need for any given task.

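A toy sketch of sparse activation in the mixture-of-experts style: a router scores a set of "expert" sub-networks and only the top few do any work for a given example, so the model can have high capacity while staying efficient. The shapes, routing rule, and expert count are invented for illustration; this is not the Pathways system itself.

    import numpy as np

    rng = np.random.default_rng(0)
    NUM_EXPERTS, D, TOP_K = 8, 32, 2

    experts = [rng.standard_normal((D, D)) for _ in range(NUM_EXPERTS)]  # sub-networks
    router = rng.standard_normal((D, NUM_EXPERTS))                       # learned routing weights

    def sparse_forward(x):
        scores = x @ router                            # how relevant each expert looks
        top = np.argsort(scores)[-TOP_K:]              # activate only the top-k experts
        gate = np.exp(scores[top]) / np.exp(scores[top]).sum()
        # The other NUM_EXPERTS - TOP_K experts stay idle for this example.
        return sum(g * np.tanh(x @ experts[i]) for g, i in zip(gate, top))

    out = sparse_forward(rng.standard_normal(D))
    print(out.shape)   # (32,)
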
์ด ์„ธ ๊ฐ€์ง€๋งŒ ๊ฐœ์„ ํ•œ๋‹ค๋ฉด
10:23
So fixing these three things, I think,
240
623162
2000
์ธ๊ณต ์ง€๋Šฅ์ด ๋” ์œ ์šฉํ•ด์งˆ ๊ฒƒ์ด๋ผ ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
10:25
will lead to a more powerful AI system:
241
625162
2209
10:27
instead of thousands of separate models,
242
627412
2000
๋”ฐ๋กœ ๋…ธ๋Š” ์ˆ˜์ฒœ ๊ฐœ์˜ ๋ชจ๋ธ ๋Œ€์‹ 
10:29
train a handful of general-purpose models
243
629412
2000
์ˆ˜์ฒœ, ์ˆ˜๋ฐฑ๋งŒ ๊ฐ€์ง€์˜ ์ผ์„ ์ฒ˜๋ฆฌํ•  ์ˆ˜ ์žˆ๋Š”
10:31
that can do thousands or millions of things.
244
631454
2083
์†Œ์ˆ˜์˜ ๋‹ค๋ชฉ์  ๋ชจ๋ธ์„ ํ•™์Šต์‹œ์ผœ์•ผ ํ•ฉ๋‹ˆ๋‹ค.
10:33
Instead of dealing with single modalities,
245
633579
2042
ํ•œ ํ˜•ํƒœ์˜ ์ž๋ฃŒ๋งŒ ์ฒ˜๋ฆฌํ•˜๋Š” ๊ฒƒ์ด ์•„๋‹ˆ๋ผ
10:35
deal with all modalities,
246
635662
1334
๋ชจ๋“  ํ˜•ํƒœ์˜ ์ž๋ฃŒ๋ฅผ ์ฒ˜๋ฆฌ, ๊ฒฐํ•ฉํ•  ์ˆ˜ ์žˆ์–ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.
10:36
and be able to fuse them together.
247
636996
1708
10:38
And instead of dense models, use sparse, high-capacity models,
248
638746
3458
๋‰ด๋Ÿฐ์ด ์กฐ๋ฐ€ํ•œ ๋ชจ๋ธ๋ณด๋‹ค๋Š” ๋„์—„๋„์—„ ์žˆ๋Š” ๊ณ ์„ฑ๋Šฅ ๋ชจ๋ธ์„ ์“ฐ๊ณ 
10:42
where we call upon the relevant bits as we need them.
249
642246
2958
ํ•„์š”ํ•  ๋•Œ ํ•„์š”ํ•œ ๋ถ€๋ถ„๋งŒ ํ™œ์šฉํ•ฉ๋‹ˆ๋‹ค.
10:45
We've been building a system that enables these kinds of approaches,
250
645246
3416
์ €ํฌ๋Š” ์ด๋Ÿฐ ๋ฐฉ์‹์„ ์‹คํ˜„ํ•  ์ˆ˜ ์žˆ๋Š” ์‹œ์Šคํ…œ์„ ์ค€๋น„ํ•ด์™”์œผ๋ฉฐ
10:48
and weโ€™ve been calling the system โ€œPathways.โ€
251
648704
2542
๊ทธ๊ฒƒ์„ โ€˜ํŒจ์Šค์›จ์ดโ€™๋กœ ๋ถ€๋ฆ…๋‹ˆ๋‹ค.
10:51
So the idea is this model will be able to do
252
651287
3084
๊ฐœ๋…์ ์œผ๋กœ ํŒจ์Šค์›จ์ด๋Š”
์ž‘์—… ์ˆ˜์ฒœ, ์ˆ˜๋งŒ ๊ฐœ๋ฅผ ์ฒ˜๋ฆฌํ•  ์ˆ˜ ์žˆ๊ณ 
10:54
thousands or millions of different tasks,
253
654412
2084
10:56
and then, we can incrementally add new tasks,
254
656537
2250
์ƒˆ ์ž‘์—…์„ ์ถ”๊ฐ€ํ•  ์ˆ˜ ์žˆ์œผ๋ฉฐ,
10:58
and it can deal with all modalities at once,
255
658787
2125
๋ชจ๋“  ํ˜•ํƒœ์˜ ์ž…๋ ฅ์„ ๋™์‹œ์— ์ฒ˜๋ฆฌํ•˜๊ณ 
11:00
and then incrementally learn new tasks as needed
256
660954
2958
ํ•„์š”์— ๋”ฐ๋ผ ์ƒˆ ์ž‘์—…์„ ํ•™์Šตํ•  ์ˆ˜ ์žˆ๊ณ 
11:03
and call upon the relevant bits of the model
257
663954
2083
๊ฐ ์˜ˆ์ œ ๋ฐ ์ž‘์—… ์ฒ˜๋ฆฌ์— ํ•„์š”ํ•œ ๋ถ€๋ถ„๋งŒ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค.
11:06
for different examples or tasks.
258
666037
1709
11:07
And we're pretty excited about this,
259
667787
1750
์ •๋ง ์žฌ๋ฏธ์žˆ๋Š” ๊ฐœ๋…์œผ๋กœ AI์— ์ผ๋ณด ์ „์ง„์ด ๋  ๊ฒƒ์ด๋ผ ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค.
11:09
we think this is going to be a step forward
260
669537
2042
11:11
in how we build AI systems.
261
671621
1333
11:12
But I also wanted to touch on responsible AI. We clearly need to make sure that this vision of powerful AI systems benefits everyone. These kinds of models raise important new questions about how do we build them with fairness, interpretability, privacy and security, for all users in mind. For example, if we're going to train these models on thousands or millions of tasks, we'll need to be able to train them on large amounts of data. And we need to make sure that data is thoughtfully collected and is representative of different communities and situations all around the world. And data concerns are only one aspect of responsible AI. We have a lot of work to do here.

11:55
So in 2018, Google published this set of AI principles by which we think about developing these kinds of technology. And these have helped guide us in how we do research in this space, how we use AI in our products. And I think it's a really helpful and important framing for how to think about these deep and complex questions about how we should be using AI in society. We continue to update these as we learn more. Many of these kinds of principles are active areas of research -- super important area.

12:24
Moving from single-purpose systems that kind of recognize patterns in data to these kinds of general-purpose intelligent systems that have a deeper understanding of the world will really enable us to tackle some of the greatest problems humanity faces. For example, we'll be able to diagnose more disease; we'll be able to engineer better medicines by infusing these models with knowledge of chemistry and physics; we'll be able to advance educational systems by providing more individualized tutoring to help people learn in new and better ways; we'll be able to tackle really complicated issues, like climate change, and perhaps engineering of clean energy solutions. So really, all of these kinds of systems are going to be requiring the multidisciplinary expertise of people all over the world. So connecting AI with whatever field you are in, in order to make progress.

13:13
So I've seen a lot of advances in computing, and how computing, over the past decades, has really helped millions of people better understand the world around them. And AI today has the potential to help billions of people. We truly live in exciting times. Thank you.

13:29
(Applause)

13:39
Chris Anderson: Thank you so much. I want to follow up on a couple things. This is what I heard. Most people's traditional picture of AI is that computers recognize a pattern of information, and with a bit of machine learning, they can get really good at that, better than humans. What you're saying is those patterns are no longer the atoms that AI is working with, that it's much richer-layered concepts that can include all manners of types of things that go to make up a leopard, for example. So what could that lead to? Give me an example of when that AI is working, what do you picture happening in the world in the next five or 10 years that excites you?

14:23
Jeff Dean: I think the grand challenge in AI is how do you generalize from a set of tasks you already know how to do to new tasks, as easily and effortlessly as possible. And the current approach of training separate models for everything means you need lots of data about that particular problem, because you're effectively trying to learn everything about the world and that problem, from nothing. But if you can build these systems that already are infused with how to do thousands and millions of tasks, then you can effectively teach them to do a new thing with relatively few examples. So I think that's the real hope, that you could then have a system where you just give it five examples of something you care about, and it learns to do that new task.

15:06
CA: You can do a form of self-supervised learning that is based on remarkably little seeding.

15:11
JD: Yeah, as opposed to needing 10,000 or 100,000 examples to figure everything in the world out.

15:16
CA: Aren't there kind of terrifying unintended consequences possible, from that?

15:20
JD: I think it depends on how you apply these systems. It's very clear that AI can be a powerful system for good, or if you apply it in ways that are not so great, it can be a negative consequence. So I think that's why it's important to have a set of principles by which you look at potential uses of AI and really are careful and thoughtful about how you consider applications.

15:43
CA: One of the things people worry most about is that, if AI is so good at learning from the world as it is, it's going to carry forward into the future aspects of the world as it is that actually aren't right, right now. And there's obviously been a huge controversy about that recently at Google. Some of those principles of AI development, you've been challenged that you're not actually holding to them. Not really interested to hear about comments on a specific case, but ... are you really committed? How do we know that you are committed to these principles? Is that just PR, or is that real, at the heart of your day-to-day?

16:23
JD: No, that is absolutely real. Like, we have literally hundreds of people working on many of these related research issues, because many of those things are research topics in their own right. How do you take data from the real world, that is the world as it is, not as we would like it to be, and how do you then use that to train a machine-learning model and adapt the data bit of the scene or augment the data with additional data so that it can better reflect the values we want the system to have, not the values that it sees in the world?

16:54
CA: But you work for Google, Google is funding the research. How do we know that the main values that this AI will build are for the world, and not, for example, to maximize the profitability of an ad model? When you know everything there is to know about human attention, you're going to know so much about the little wriggly, weird, dark parts of us. In your group, are there rules about how you hold off, church-state wall between a sort of commercial push, "You must do it for this purpose," so that you can inspire your engineers and so forth, to do this for the world, for all of us.

17:33
JD: Yeah, our research group does collaborate with a number of groups across Google, including the Ads group, the Search group, the Maps group, so we do have some collaboration, but also a lot of basic research that we publish openly. We've published more than 1,000 papers last year in different topics, including the ones you discussed, about fairness, interpretability of the machine-learning models, things that are super important, and we need to advance the state of the art in this in order to continue to make progress to make sure these models are developed safely and responsibly.

18:04
CA: It feels like we're at a time when people are concerned about the power of the big tech companies, and it's almost, if there was ever a moment to really show the world that this is being done to make a better future, that is actually key to Google's future, as well as all of ours.

18:21
JD: Indeed.

18:22
CA: It's very good to hear you come and say that, Jeff. Thank you so much for coming here to TED.

18:27
JD: Thank you.

18:28
(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7