Marvin Minsky: Health, population and the human mind

TED · 2008-09-29

Translator: Shlomo Adam · Reviewer: Ido Dekkers
00:18
If you ask people what part of psychology they think is hard, and you say, "Well, what about thinking and emotions?" most people will say, "Emotions are terribly hard. They're incredibly complex. They can't -- I have no idea of how they work. But thinking is really very straightforward: it's just sort of some kind of logical reasoning, or something. But that's not the hard part."

00:45
So here's a list of problems that come up. One nice problem is, what do we do about health? The other day, I was reading something, and the person said probably the largest single cause of disease is handshaking in the West. And there was a little study about people who don't handshake, comparing them with ones who do handshake. And I haven't the foggiest idea of where you find the ones that don't handshake, because they must be hiding. And the people who avoid that have 30 percent less infectious disease or something. Or maybe it was 31 and a quarter percent. So if you really want to solve the problem of epidemics and so forth, let's start with that. And since I got that idea, I've had to shake hundreds of hands. And I think the only way to avoid it is to have some horrible visible disease, and then you don't have to explain.

01:48
Education: how do we improve education? Well, the single best way is to get them to understand that what they're being told is a whole lot of nonsense. And then, of course, you have to do something about how to moderate that, so that anybody can -- so they'll listen to you.

02:06
Pollution, energy shortage, environmental diversity, poverty. How do we make stable societies? Longevity. Okay, there are lots of problems to worry about.

02:17
Anyway, the question I think people should talk about -- and it's absolutely taboo -- is, how many people should there be? And I think it should be about 100 million or maybe 500 million. And then notice that a great many of these problems disappear. If you had 100 million people properly spread out, then if there's some garbage, you throw it away, preferably where you can't see it, and it will rot. Or you throw it into the ocean and some fish will benefit from it. The problem is, how many people should there be? And it's a sort of choice we have to make.

03:01
Most people are about 60 inches high or more, and there's these cube laws. So if you make them this big, by using nanotechnology, I suppose -- (Laughter) -- then you could have a thousand times as many. That would solve the problem, but I don't see anybody doing any research on making people smaller.
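The "cube laws" here are just volume scaling: shrink a person by a factor k in each linear dimension and their volume, and roughly their resource footprint, shrinks by k³. A minimal sketch of that arithmetic (the shrink factor of 10 is an assumption chosen to match "a thousand times as many"):

```python
# Cube-law scaling: shrinking a body by a linear factor k
# shrinks its volume (and rough resource footprint) by k**3.

def how_many_more(linear_shrink_factor: float) -> float:
    """How many times more people fit in the same resource budget
    if each person is shrunk by the given linear factor."""
    return linear_shrink_factor ** 3

# Everyone at 1/10 height -> 10**3 = 1000x as many people.
print(how_many_more(10))  # 1000
```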
03:19
Now, it's nice to reduce the population, but a lot of people want to have children. And there's one solution that's probably only a few years off. You know you have 46 chromosomes. If you're lucky, you've got 23 from each parent. Sometimes you get an extra one or drop one out, but -- so you can skip the grandparent and great-grandparent stage and go right to the great-great-grandparent. And you have 46 people and you give them a scanner, or whatever you need, and they look at their chromosomes and each of them says which one he likes best, or she -- no reason to have just two sexes any more, even. So each child has 46 parents, and I suppose you could let each group of 46 parents have 15 children. Wouldn't that be enough? And then the children would get plenty of support, and nurturing, and mentoring, and the world population would decline very rapidly and everybody would be totally happy.
04:21
Timesharing is a little further off in the future. And there's this great novel that Arthur Clarke wrote twice, called "Against the Fall of Night" and "The City and the Stars." They're both wonderful and largely the same, except that computers happened in between. And Arthur was looking at this old book, and he said, "Well, that was wrong. The future must have some computers." So in the second version of it, there are 100 billion or 1,000 billion people on Earth, but they're all stored on hard disks or floppies, or whatever they have in the future. And you let a few million of them out at a time. A person comes out, they live for a thousand years doing whatever they do, and then, when it's time to go back for a billion years -- or a million, I forget, the numbers don't matter -- but there really aren't very many people on Earth at a time. And you get to think about yourself and your memories, and before you go back into suspension, you edit your memories and you change your personality and so forth.

05:30
The plot of the book is that there's not enough diversity, so that the people who designed the city make sure that every now and then an entirely new person is created. And in the novel, a particular one named Alvin is created. And he says, maybe this isn't the best way, and wrecks the whole system.

05:53
I don't think the solutions that I proposed are good enough or smart enough. I think the big problem is that we're not smart enough to understand which of the problems we're facing are good enough. Therefore, we have to build superintelligent machines like HAL. As you remember, at some point in the book for "2001," HAL realizes that the universe is too big, and grand, and profound for those really stupid astronauts. If you contrast HAL's behavior with the triviality of the people on the spaceship, you can see what's written between the lines. Well, what are we going to do about that? We could get smarter. I think that we're pretty smart, as compared to chimpanzees, but we're not smart enough to deal with the colossal problems that we face, either in abstract mathematics or in figuring out economies, or balancing the world around.

06:52
So one thing we can do is live longer. And nobody knows how hard that is, but we'll probably find out in a few years. You see, there's two forks in the road. We know that people live almost twice as long as chimpanzees, and nobody lives more than 120 years, for reasons that aren't very well understood. But lots of people now live to 90 or 100, unless they shake hands too much or something like that. And so maybe if we lived 200 years, we could accumulate enough skills and knowledge to solve some problems.

07:31
So that's one way of going about it. And as I said, we don't know how hard that is. It might be -- after all, most other mammals live half as long as the chimpanzee, so we have sort of three and a half or four times the longevity of most mammals. And in the case of the primates, we have almost the same genes. We only differ from chimpanzees, in the present state of knowledge -- which is absolute hogwash -- maybe by just a few hundred genes. What I think is that the gene counters don't know what they're doing yet. And whatever you do, don't read anything about genetics that's published within your lifetime, or something. (Laughter) The stuff has a very short half-life, same with brain science. And so it might be that if we just fix four or five genes, we can live 200 years. Or it might be that it's just 30 or 40, and I doubt that it's several hundred. So this is something that people will be discussing,

08:36
and lots of ethicists -- you know, an ethicist is somebody who sees something wrong with whatever you have in mind. (Laughter) And it's very hard to find an ethicist who considers any change worth making, because he says, what about the consequences? And, of course, we're not responsible for the consequences of what we're doing now, are we? Like all this complaint about clones. And yet two random people will mate and have this child, and both of them have some pretty rotten genes, and the child is likely to come out to be average. Which, by chimpanzee standards, is very good indeed.

09:19
If we do have longevity, then we'll have to face the population growth problem anyway. Because if people live 200 or 1,000 years, then we can't let them have a child more than about once every 200 or 1,000 years. And so there won't be any workforce. And one of the things Laurie Garrett pointed out, and others have, is that a society that doesn't have people of working age is in real trouble. And things are going to get worse, because there's nobody to educate the children or to feed the old.

09:53
And when I'm talking about a long lifetime, of course, I don't want somebody who's 200 years old to be like our image of what a 200-year-old is -- which is dead, actually.

10:05
You know, there's about 400 different parts of the brain which seem to have different functions. Nobody knows how most of them work in detail, but we do know that there are lots of different things in there. And they don't always work together. I like Freud's theory that most of them are cancelling each other out. And so if you think of yourself as a sort of city with a hundred resources, then, when you're afraid, for example, you may discard your long-range goals, but you may think deeply and focus on exactly how to achieve that particular goal. You throw everything else away. You become a monomaniac -- all you care about is not stepping out on that platform. And when you're hungry, food becomes more attractive, and so forth. So I see emotions as highly evolved subsets of your capability. Emotion is not something added to thought. An emotional state is what you get when you remove 100 or 200 of your normally available resources.
11:08
So thinking of emotions as the opposite of -- as something less than thinking is immensely productive. And I hope, in the next few years, to show that this will lead to smart machines. And I guess I'd better skip all the rest of this, which are some details on how we might make those smart machines and -- (Laughter) -- and the main idea is in fact that the core of a really smart machine is one that recognizes that a certain kind of problem is facing you. This is a problem of such and such a type, and therefore there's a certain way or ways of thinking that are good for that problem. So I think the future main problem of psychology is to classify types of predicaments, types of situations, types of obstacles and also to classify available and possible ways to think and pair them up. So you see, it's almost like a Pavlovian -- we lost the first hundred years of psychology to really trivial theories, where you say, how do people learn how to react to a situation? What I'm saying is, after we go through a lot of levels, including designing a huge, messy system with thousands of ports, we'll end up again with the central problem of psychology. Saying, not what are the situations, but what are the kinds of problems and what are the kinds of strategies, how do you learn them, how do you connect them up, how does a really creative person invent a new way of thinking out of the available resources and so forth.
12:48
So, I think in the next 20 years, if we can get rid of all of the traditional approaches to artificial intelligence, like neural nets and genetic algorithms and rule-based systems, and just turn our sights a little bit higher to say: can we make a system that can use all those things for the right kind of problem? Some problems are good for neural nets; we know that on others, neural nets are hopeless. Genetic algorithms are great for certain things; I suspect I know what they're bad at, and I won't tell you. (Laughter)

13:20
Thank you.

13:22
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7