Marvin Minsky: Health, population and the human mind

64,070 views ใƒป 2008-09-29

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

๋ฒˆ์—ญ: Jeong-a Seo ๊ฒ€ํ† : Dae-Ki Kang
00:18
If you ask people about what part of psychology do they think is hard,
0
18330
6000
๋งŒ์ผ ์‚ฌ๋žŒ๋“ค์—๊ฒŒ ์‹ฌ๋ฆฌํ•™์˜ ๋ถ„์•ผ์—์„œ ๋ฌด์—‡์ด ์–ด๋ ต๋‹ค๊ณ  ์ƒ๊ฐํ•˜๋Š”์ง€ ๋ฌป๋Š”๋‹ค๋ฉด,
00:24
and you say, "Well, what about thinking and emotions?"
1
24330
3000
์‚ฌ๊ณ (thinking)์™€ ๊ฐ์ •(emotions) ์ค‘์—์„œ ๋ฌด์—‡์ด ๋” ์–ด๋ ต๋ƒ๊ณ  ๋ฌป๋Š”๋‹ค๋ฉด,
00:27
Most people will say, "Emotions are terribly hard.
2
27330
3000
๋Œ€๋ถ€๋ถ„์˜ ์‚ฌ๋žŒ๋“ค์€ ์ด๋ ‡๊ฒŒ ๋งํ•  ๊ฒ๋‹ˆ๋‹ค. "๊ฐ์ • ๋ถ„์•ผ๊ฐ€ ๊ต‰์žฅํžˆ ์–ด๋ ค์›Œ์š”.
00:30
They're incredibly complex. They can't -- I have no idea of how they work.
3
30330
6000
๊ทธ๊ฒŒ ๋ฏฟ์„ ์ˆ˜ ์—†์„ ์ •๋„๋กœ ๋ณต์žกํ•ด์„œ, ๊ฐ์ •์ด ์–ด๋–ค ์‹์œผ๋กœ ์›€์ง์ด๋Š”์ง€ ๋ชจ๋ฅด๊ฒ ์–ด์š”.
00:36
But thinking is really very straightforward:
4
36330
2000
ํ•˜์ง€๋งŒ ์‚ฌ๊ณ (thinking)๋Š” ๊ฝค ์ดํ•ดํ•˜๊ธฐ ์‰ฝ์ฃ .
00:38
it's just sort of some kind of logical reasoning, or something.
5
38330
4000
๊ทธ๋ƒฅ ๋…ผ๋ฆฌ์  ์ถ”๋ก  ๊ฐ™์€ ๊ฑฐ๋‹ˆ๊นŒ์š”.
00:42
But that's not the hard part."
6
42330
3000
๊ทธ๊ฑด ์–ด๋ ค์šด ๋ถ€๋ถ„์ด ์•„๋‹ˆ์ฃ ."
00:45
So here's a list of problems that come up. One nice problem is, what do we do about health? The other day, I was reading something, and the person said probably the largest single cause of disease is handshaking in the West. And there was a little study about people who don't handshake, and comparing them with ones who do handshake. And I haven't the foggiest idea of where you find the ones that don't handshake, because they must be hiding. And the people who avoid that have 30 percent less infectious disease or something. Or maybe it was 31 and a quarter percent.

01:26
So if you really want to solve the problem of epidemics and so forth, let's start with that. And since I got that idea, I've had to shake hundreds of hands. And I think the only way to avoid it is to have some horrible visible disease, and then you don't have to explain.

01:48
Education: how do we improve education? Well, the single best way is to get them to understand that what they're being told is a whole lot of nonsense. And then, of course, you have to do something about how to moderate that, so that anybody can -- so they'll listen to you.

02:06
Pollution, energy shortage, environmental diversity, poverty. How do we make stable societies? Longevity. Okay, there're lots of problems to worry about.

02:17
Anyway, the question I think people should talk about -- and it's absolutely taboo -- is, how many people should there be? And I think it should be about 100 million or maybe 500 million. And then notice that a great many of these problems disappear.

02:36
If you had 100 million people properly spread out, then if there's some garbage, you throw it away, preferably where you can't see it, and it will rot. Or you throw it into the ocean and some fish will benefit from it. The problem is, how many people should there be? And it's a sort of choice we have to make.

03:01
Most people are about 60 inches high or more, and there's these cube laws. So if you make them this big, by using nanotechnology, I suppose --

(Laughter)

-- then you could have a thousand times as many.

03:14
That would solve the problem, but I don't see anybody doing any research on making people smaller. Now, it's nice to reduce the population, but a lot of people want to have children.

03:24
And there's one solution that's probably only a few years off. You know you have 46 chromosomes. If you're lucky, you've got 23 from each parent. Sometimes you get an extra one or drop one out, but -- so you can skip the grandparent and great-grandparent stage and go right to the great-great-grandparent. And you have 46 people and you give them a scanner, or whatever you need, and they look at their chromosomes and each of them says which one he likes best, or she -- no reason to have just two sexes any more, even. So each child has 46 parents, and I suppose you could let each group of 46 parents have 15 children.

04:10
Wouldn't that be enough? And then the children would get plenty of support, and nurturing, and mentoring, and the world population would decline very rapidly and everybody would be totally happy.

04:21
Timesharing is a little further off in the future. And there's this great novel that Arthur Clarke wrote twice, called "Against the Fall of Night" and "The City and the Stars." They're both wonderful and largely the same, except that computers happened in between. And Arthur was looking at this old book, and he said, "Well, that was wrong. The future must have some computers." So in the second version of it, there are 100 billion or 1,000 billion people on Earth, but they're all stored on hard disks or floppies, or whatever they have in the future.

04:58
And you let a few million of them out at a time. A person comes out, they live for a thousand years doing whatever they do, and then, when it's time to go back for a billion years -- or a million, I forget, the numbers don't matter -- but there really aren't very many people on Earth at a time.

05:20
And you get to think about yourself and your memories, and before you go back into suspension, you edit your memories and you change your personality and so forth. The plot of the book is that there's not enough diversity, so that the people who designed the city make sure that every now and then an entirely new person is created. And in the novel, a particular one named Alvin is created. And he says, maybe this isn't the best way, and wrecks the whole system.

05:53
I don't think the solutions that I proposed are good enough or smart enough. I think the big problem is that we're not smart enough to understand which of the problems we're facing are good enough. Therefore, we have to build super intelligent machines like HAL.

06:10
As you remember, at some point in the book for "2001," HAL realizes that the universe is too big, and grand, and profound for those really stupid astronauts. If you contrast HAL's behavior with the triviality of the people on the spaceship, you can see what's written between the lines.

06:31
Well, what are we going to do about that? We could get smarter. I think that we're pretty smart, as compared to chimpanzees, but we're not smart enough to deal with the colossal problems that we face, either in abstract mathematics or in figuring out economies, or balancing the world around.

06:52
So one thing we can do is live longer. And nobody knows how hard that is, but we'll probably find out in a few years.

07:00
You see, there's two forks in the road. We know that people live twice as long as chimpanzees almost, and nobody lives more than 120 years, for reasons that aren't very well understood. But lots of people now live to 90 or 100, unless they shake hands too much or something like that. And so maybe if we lived 200 years, we could accumulate enough skills and knowledge to solve some problems.

07:31
So that's one way of going about it. And as I said, we don't know how hard that is. It might be -- after all, most other mammals live half as long as the chimpanzee, so we have sort of three and a half or four times the longevity of most mammals. And in the case of the primates, we have almost the same genes. We only differ from chimpanzees, in the present state of knowledge, which is absolute hogwash, maybe by just a few hundred genes.

08:03
What I think is that the gene counters don't know what they're doing yet. And whatever you do, don't read anything about genetics that's published within your lifetime, or something.

(Laughter)

08:15
The stuff has a very short half-life, same with brain science. And so it might be that if we just fix four or five genes, we can live 200 years. Or it might be that it's just 30 or 40, and I doubt that it's several hundred.

08:32
So this is something that people will be discussing and lots of ethicists -- you know, an ethicist is somebody who sees something wrong with whatever you have in mind.

(Laughter)

08:45
And it's very hard to find an ethicist who considers any change worth making, because he says, what about the consequences? And, of course, we're not responsible for the consequences of what we're doing now, are we? Like all this complaint about clones. And yet two random people will mate and have this child, and both of them have some pretty rotten genes, and the child is likely to come out to be average. Which, by chimpanzee standards, is very good indeed.

09:19
If we do have longevity, then we'll have to face the population growth problem anyway. Because if people live 200 or 1,000 years, then we can't let them have a child more than about once every 200 or 1,000 years. And so there won't be any workforce. And one of the things Laurie Garrett pointed out, and others have, is that a society that doesn't have people of working age is in real trouble. And things are going to get worse, because there's nobody to educate the children or to feed the old.

09:53
And when I'm talking about a long lifetime, of course, I don't want somebody who's 200 years old to be like our image of what a 200-year-old is -- which is dead, actually.

10:05
You know, there's about 400 different parts of the brain which seem to have different functions. Nobody knows how most of them work in detail, but we do know that there're lots of different things in there. And they don't always work together. I like Freud's theory that most of them are cancelling each other out.

10:22
And so if you think of yourself as a sort of city with a hundred resources, then, when you're afraid, for example, you may discard your long-range goals, but you may think deeply and focus on exactly how to achieve that particular goal. You throw everything else away. You become a monomaniac -- all you care about is not stepping out on that platform. And when you're hungry, food becomes more attractive, and so forth.

10:51
So I see emotions as highly evolved subsets of your capability. Emotion is not something added to thought. An emotional state is what you get when you remove 100 or 200 of your normally available resources. So thinking of emotions as the opposite of -- as something less than thinking is immensely productive. And I hope, in the next few years, to show that this will lead to smart machines.

11:19
And I guess I better skip all the rest of this, which are some details on how we might make those smart machines and --

(Laughter)

11:32
-- and the main idea is in fact that the core of a really smart machine is one that recognizes that a certain kind of problem is facing you. This is a problem of such and such a type, and therefore there's a certain way or ways of thinking that are good for that problem.

11:52
So I think the main problem of psychology in the future is to classify types of predicaments, types of situations, types of obstacles, and also to classify available and possible ways to think and pair them up.

12:06
So you see, it's almost like a Pavlovian -- we lost the first hundred years of psychology by really trivial theories, where you say, how do people learn how to react to a situation? What I'm saying is, after we go through a lot of levels, including designing a huge, messy system with thousands of ports, we'll end up again with the central problem of psychology.

12:32
Saying, not what are the situations, but what are the kinds of problems and what are the kinds of strategies, how do you learn them, how do you connect them up, how does a really creative person invent a new way of thinking out of the available resources and so forth.

12:48
So, I think in the next 20 years, if we can get rid of all of the traditional approaches to artificial intelligence, like neural nets and genetic algorithms and rule-based systems, and just turn our sights a little bit higher to say, can we make a system that can use all those things for the right kind of problem? Some problems are good for neural nets; we know that for others, neural nets are hopeless. Genetic algorithms are great for certain things; I suspect I know what they're bad at, and I won't tell you.

(Laughter)

13:20
Thank you.

(Applause)
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7