How will AI change the world?

1,790,246 views ใƒป 2022-12-06

TED-Ed


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

๋ฒˆ์—ญ: Hyeryung Kim ๊ฒ€ํ† : DK Kim
๋‹ค๊ฐ€์˜ค๋Š” ๋ฏธ๋ž˜์—๋Š” ์ธ๊ณต ์ง€๋Šฅ์ด ์—ฌ๋Ÿฌ๋ถ„์˜ ์‚ถ์„ ๋ฐ”๊พธ๊ณ 
00:07
In the coming years, artificial intelligence
0
7336
2169
00:09
is probably going to change your life, and likely the entire world.
1
9505
3629
์–ด์ฉŒ๋ฉด ์˜จ ์„ธ์ƒ์„ ๋ฐ”๊พผ๋‹ค๊ณ  ํ•˜์ง€๋งŒ
์ •ํ™•ํžˆ ์–ด๋–ป๊ฒŒ ๋ฐ”๋€๋‹ค๋Š” ๊ฑด์ง€๋Š” ์•„์ง ๋ถˆ๋ช…ํ™•ํ•ฉ๋‹ˆ๋‹ค.
00:13
But people have a hard time agreeing on exactly how.
2
13301
3295
๋‹ค์Œ ๋‚ด์šฉ์€ ์œ ๋ช…ํ•œ ์ธ๊ณต ์ง€๋Šฅ ์ „๋ฌธ๊ฐ€์ธ
00:16
The following are excerpts from an interview
3
16804
2127
00:18
where renowned computer science professor and AI expert Stuart Russell
4
18931
3504
์ŠคํŠœ์–ดํŠธ ๋Ÿฌ์…€ ๋ฐ•์‚ฌ์˜ ์ธํ„ฐ๋ทฐ์—์„œ ๋ฐœ์ทŒํ•œ ๋‚ด์šฉ์œผ๋กœ
00:22
helps separate the sense from the nonsense.
5
22435
2586
๊ฑฐ์ง“๊ณผ ์ง„์‹ค์„ ๊ตฌ๋ถ„ํ•˜๋Š” ๋ฐ ๋„์›€์ด ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค.
00:25
There’s a big difference between asking a human to do something and giving that as the objective to an AI system. When you ask a human to get you a cup of coffee, you don’t mean this should be their life’s mission, and nothing else in the universe matters. Even if they have to kill everybody else in Starbucks to get you the coffee before it closes— they should do that. No, that’s not what you mean. All the other things that we mutually care about, they should factor into your behavior as well.
ํ˜„์žฌ ์ธ๊ณต ์ง€๋Šฅ์ด ์ž‘๋™ํ•˜๋Š” ๋ฐฉ์‹์€ ์ •ํ•ด์ง„ ๋ชฉํ‘œ๋งŒ ๋ฐ›๋Š” ๋ฐฉ์‹์ž…๋‹ˆ๋‹ค.
00:51
And the problem with the way we build AI systems now
16
51214
3169
00:54
is we give them a fixed objective.
17
54383
1627
์ธ๊ณต ์ง€๋Šฅ์— ์ฃผ๋Š” ๋ชฉํ‘œ์—๋Š” ๋ชจ๋“  ๊ฒƒ์ด ๋“ค์–ด์žˆ์–ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.
00:56
The algorithms require us to specify everything in the objective.
18
56010
3545
00:59
And if you say, can we fix the acidification of the oceans?
19
59555
3420
์˜ˆ๋ฅผ ๋“ค์–ด ๋กœ๋ด‡์— ํ•ด์–‘ ์‚ฐ์„ฑํ™”๋ฅผ ๋ฉˆ์ถ”๋ผ๋Š” ๋ชฉํ‘œ๋ฅผ ์ค€๋‹ค๊ณ  ํ•ฉ์‹œ๋‹ค.
01:02
Yeah, you could have a catalytic reaction that does that extremely efficiently,
20
62975
4046
์ด‰๋งค ๋ฐ˜์‘์„ ์ด์šฉํ•˜๋ฉด ํšจ๊ณผ์ ์œผ๋กœ ์ˆ˜ํ–‰ํ•  ์ˆ˜ ์žˆ์ง€๋งŒ
01:07
but it consumes a quarter of the oxygen in the atmosphere,
21
67021
3253
๊ทธ๋Ÿฌ๋ ค๋ฉด ๋Œ€๊ธฐ ์ค‘์˜ ์‚ฐ์†Œ๋ฅผ ์‚ฌ๋ถ„์˜ ์ผ ์ •๋„ ์†Œ๋ชจํ•ด์•ผ ํ•˜๊ณ 
01:10
which would apparently cause us to die fairly slowly and unpleasantly
22
70274
3712
์‚ฌ๋žŒ๋“ค์€ ์‚ฐ์†Œ๊ฐ€ ๋ถ€์กฑํ•ด์„œ
๋ช‡ ์‹œ๊ฐ„์— ๊ฑธ์ณ ์ฒœ์ฒœํžˆ ๊ณ ํ†ต์Šค๋Ÿฝ๊ฒŒ ์ฃฝ๊ฒŒ ๋ฉ๋‹ˆ๋‹ค.
01:13
over the course of several hours.
23
73986
1752
01:15
So, how do we avoid this problem? You might say, okay, well, just be more careful about specifying the objective— don’t forget the atmospheric oxygen. And then, of course, some side effect of the reaction in the ocean poisons all the fish. Okay, well I meant don’t kill the fish either. And then, well, what about the seaweed? Don’t do anything that’s going to cause all the seaweed to die. And on and on and on.
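A minimal sketch of this failure mode, in Python. Everything here is the editor's illustration rather than anything from the interview: the plans, scores, and constraints are invented. The point is structural: an optimizer handed a fixed objective picks whatever maximizes it, and each hand-written constraint only excludes the side effects someone has already noticed.

```python
# Hypothetical plans, scored on the one thing the objective mentions
# (fixing ocean pH) plus side effects the objective never mentions.
plans = [
    {"name": "slow_alkalinity_program", "ocean_ph_fixed": 0.6,
     "oxygen_used": 0.01, "fish_killed": 0.0},
    {"name": "catalytic_reaction", "ocean_ph_fixed": 1.0,
     "oxygen_used": 0.25, "fish_killed": 1.0},
]

def fixed_objective(plan):
    """The objective we actually wrote down: fix acidification, nothing else."""
    return plan["ocean_ph_fixed"]

best = max(plans, key=fixed_objective)
print(best["name"])  # catalytic_reaction: optimal by the stated objective, and catastrophic

# The patching response: forbid each side effect as we notice it.
constraints = [
    lambda p: p["oxygen_used"] < 0.05,   # "don't forget the atmospheric oxygen"
    lambda p: p["fish_killed"] == 0.0,   # "don't kill the fish either"
    # ...and the seaweed, and on and on: every patch rules out one noticed
    # side effect, and the optimizer still exploits whatever is missing.
]

allowed = [p for p in plans if all(c(p) for c in constraints)]
best = max(allowed, key=fixed_objective)
print(best["name"])  # slow_alkalinity_program, until the next unlisted side effect
```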
01:39
And the reason that we don’t have to do that with humans is that humans often know that they don’t know all the things that we care about. If you ask a human to get you a cup of coffee, and you happen to be in the Hotel George Sand in Paris, where the coffee is 13 euros a cup, it’s entirely reasonable to come back and say, well, it’s 13 euros, are you sure you want it, or I could go next door and get one? And it’s a perfectly normal thing for a person to do.
02:07
To ask, I’m going to repaint your house— is it okay if I take off the drainpipes and then put them back? We don’t think of this as a terribly sophisticated capability, but AI systems don’t have it because the way we build them now, they have to know the full objective. If we build systems that know that they don’t know what the objective is, then they start to exhibit these behaviors, like asking permission before getting rid of all the oxygen in the atmosphere.
02:32
In all these senses, control over the AI system comes from the machine’s uncertainty about what the true objective is. And it’s when you build machines that believe with certainty that they have the objective, that’s when you get this sort of psychopathic behavior. And I think we see the same thing in humans.
02:50
What happens when general purpose AI hits the real economy? How do things change? Can we adapt?
02:59
This is a very old point. Amazingly, Aristotle actually has a passage where he says, look, if we had fully automated weaving machines and plectrums that could pluck the lyre and produce music without any humans, then we wouldn’t need any workers. That idea, which I think it was Keynes who called it technological unemployment in 1930, is very obvious to people. They think, yeah, of course, if the machine does the work, then I’m going to be unemployed.
03:26
You can think about the warehouses that companies are currently operating for e-commerce, they are half automated. The way it works is that an old warehouse— where you’ve got tons of stuff piled up all over the place and humans go and rummage around and then bring it back and send it off— there’s a robot who goes and gets the shelving unit that contains the thing that you need, but the human has to pick the object out of the bin or off the shelf, because that’s still too difficult. But, at the same time, would you make a robot that is accurate enough to be able to pick pretty much any object within a very wide variety of objects that you can buy? That would, at a stroke, eliminate 3 or 4 million jobs?
04:06
There’s an interesting story that E.M. Forster wrote, where everyone is entirely machine dependent. The story is really about the fact that if you hand over the management of your civilization to machines, you then lose the incentive to understand it yourself or to teach the next generation how to understand it. You can see “WALL-E” actually as a modern version, where everyone is enfeebled and infantilized by the machine, and that hasn’t been possible up to now.
04:34
We put a lot of our civilization into books, but the books can’t run it for us. And so we always have to teach the next generation. If you work it out, it’s about a trillion person years of teaching and learning and an unbroken chain that goes back tens of thousands of generations. What happens if that chain breaks? I think that’s something we have to understand as AI moves forward.
04:55
The actual date of arrival of general purpose AI— you’re not going to be able to pinpoint, it isn’t a single day. It’s also not the case that it’s all or nothing. The impact is going to be increasing. So with every advance in AI, it significantly expands the range of tasks. So in that sense, I think most experts say by the end of the century, we’re very, very likely to have general purpose AI.
05:20
The median is something around 2045. I’m a little more on the conservative side. I think the problem is harder than we think. I like what John McCarthy, he was one of the founders of AI, when he was asked this question, he said, somewhere between five and 500 years. And we’re going to need, I think, several Einsteins to make it happen.

Original video on YouTube.com
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7