How will AI change the world?

TED-Ed ใƒป 2022-12-06


00:07
In the coming years, artificial intelligence is probably going to change your life, and likely the entire world. But people have a hard time agreeing on exactly how. The following are excerpts from a World Economic Forum interview where renowned computer science professor and AI expert Stuart Russell helps separate the sense from the nonsense.
00:25
There's a big difference between asking a human to do something and giving that as the objective to an AI system. When you ask a human to get you a cup of coffee, you don't mean this should be their life's mission, and nothing else in the universe matters. Even if they have to kill everybody else in Starbucks to get you the coffee before it closes—they should do that. No, that's not what you mean. All the other things that we mutually care about, they should factor into your behavior as well.
00:51
And the problem with the way we build AI systems now is we give them a fixed objective. The algorithms require us to specify everything in the objective. And if you say, can we fix the acidification of the oceans? Yeah, you could have a catalytic reaction that does that extremely efficiently, but it consumes a quarter of the oxygen in the atmosphere, which would apparently cause us to die fairly slowly and unpleasantly over the course of several hours.
01:15
So, how do we avoid this problem? You might say, okay, well, just be more careful about specifying the objective—don't forget the atmospheric oxygen. And then, of course, some side effect of the reaction in the ocean poisons all the fish. Okay, well I meant don't kill the fish either. And then, well, what about the seaweed? Don't do anything that's going to cause all the seaweed to die. And on and on and on.
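As a concrete illustration of the fixed-objective trap Russell is describing, here is a minimal sketch in Python. The plan names and scores are invented for illustration, not taken from the interview: the point is only that an optimizer maximizes exactly what was written down and is indifferent to everything left out.

```python
# Illustrative only: invented plans and scores, not from the interview.
plans = {
    "catalytic_reaction": {
        "acidification_fixed": 1.0,   # fully fixes ocean pH...
        "oxygen_consumed": 0.25,      # ...but burns 25% of atmospheric O2
        "fish_killed": 1.0,           # ...and poisons all the fish
    },
    "slow_safe_intervention": {
        "acidification_fixed": 0.6,
        "oxygen_consumed": 0.0,
        "fish_killed": 0.0,
    },
}

def fixed_objective(plan):
    # The objective mentions ONLY acidification. Oxygen and fish are not
    # in it, so the optimizer treats them as having zero value.
    return plan["acidification_fixed"]

best = max(plans, key=lambda name: fixed_objective(plans[name]))
print(best)  # -> catalytic_reaction wins, despite the side effects

# The whack-a-mole "fix": keep bolting on penalties for things we forgot.
def patched_objective(plan):
    return (plan["acidification_fixed"]
            - 10 * plan["oxygen_consumed"]
            - 10 * plan["fish_killed"])
# ...and then the seaweed, and on and on: every omission stays exploitable.
```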
01:39
And the reason that we don't have to do that with humans is that humans often know that they don't know all the things that we care about. If you ask a human to get you a cup of coffee, and you happen to be in the Hotel George Sand in Paris, where the coffee is 13 euros a cup, it's entirely reasonable to come back and say, well, it's 13 euros, are you sure you want it, or I could go next door and get one? And it's a perfectly normal thing for a person to do. To ask, I'm going to repaint your house—is it okay if I take off the drainpipes and then put them back? We don't think of this as a terribly sophisticated capability, but AI systems don't have it because the way we build them now, they have to know the full objective.
02:21
If we build systems that know that they don't know what the objective is, then they start to exhibit these behaviors, like asking permission before getting rid of all the oxygen in the atmosphere. In all these senses, control over the AI system comes from the machine's uncertainty about what the true objective is. And it's when you build machines that believe with certainty that they have the objective, that's when you get this sort of psychopathic behavior. And I think we see the same thing in humans.
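To make the contrast concrete, here is a hedged sketch of that idea: deference falls out of the machine's uncertainty over the objective. The candidate objectives, probabilities, and scores below are all invented for illustration; this is not Russell's implementation.

```python
# Invented beliefs over what the human actually wants.
candidate_objectives = {
    "just_fix_acidification": 0.9,
    "fix_acidification_without_killing_us": 0.1,
}

# How each action scores under each candidate objective (made up).
action_scores = {
    "run_catalytic_reaction": {
        "just_fix_acidification": +10.0,
        "fix_acidification_without_killing_us": -50.0,
    },
    "do_nothing": {
        "just_fix_acidification": 0.0,
        "fix_acidification_without_killing_us": 0.0,
    },
}

CATASTROPHE = -10.0  # a downside below this is worth a question first

def choose(actions):
    # Pick the action with the highest expected score under the beliefs.
    def expected(a):
        return sum(p * action_scores[a][obj]
                   for obj, p in candidate_objectives.items())
    best = max(actions, key=expected)
    # Uncertainty is what produces deference: if any still-plausible
    # objective rates the chosen action as catastrophic, ask first.
    worst = min(action_scores[best][obj]
                for obj, p in candidate_objectives.items() if p > 0)
    if worst < CATASTROPHE:
        return f"ask permission before '{best}'"
    return best

print(choose(list(action_scores)))
# -> ask permission before 'run_catalytic_reaction'
# A machine CERTAIN of "just_fix_acidification" (probability 1.0) would
# see no plausible downside and simply run the reaction.
```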
02:50
What happens when general purpose AI hits the real economy? How do things change? Can we adapt?
02:59
This is a very old point. Amazingly, Aristotle actually has a passage where he says, look, if we had fully automated weaving machines and plectrums that could pluck the lyre and produce music without any humans, then we wouldn't need any workers. That idea, which I think it was Keynes who called it technological unemployment in 1930, is very obvious to people. They think, yeah, of course, if the machine does the work, then I'm going to be unemployed.
03:26
You can think about the warehouses that companies are currently operating for e-commerce; they are half automated. The way it works is that an old warehouse—where you've got tons of stuff piled up all over the place and humans go and rummage around and then bring it back and send it off—there's a robot who goes and gets the shelving unit that contains the thing that you need, but the human has to pick the object out of the bin or off the shelf, because that's still too difficult. But, at the same time, if you could make a robot that is accurate enough to be able to pick pretty much any object within a very wide variety of objects that you can buy, that would, at a stroke, eliminate 3 or 4 million jobs.
04:06
There's an interesting story that E.M. Forster wrote, where everyone is entirely machine dependent. The story is really about the fact that if you hand over the management of your civilization to machines, you then lose the incentive to understand it yourself or to teach the next generation how to understand it. You can see "WALL-E" actually as a modern version, where everyone is enfeebled and infantilized by the machine, and that hasn't been possible up to now. We put a lot of our civilization into books, but the books can't run it for us. And so we always have to teach the next generation. If you work it out, it's about a trillion person years of teaching and learning and an unbroken chain that goes back tens of thousands of generations.
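One plausible back-of-the-envelope reconstruction of that trillion-person-year figure (my assumption; the interview doesn't show the arithmetic): demographers estimate that roughly 100 billion people have ever lived, and if each spends on the order of ten years teaching and learning, then

$10^{11}\ \text{people} \times 10\ \text{years each} \approx 10^{12}\ \text{person-years}.$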
04:50
What happens if that chain breaks? I think that's something we have to understand as AI moves forward.
04:55
The actual date of arrival of general purpose AI—you're not going to be able to pinpoint, it isn't a single day. It's also not the case that it's all or nothing. The impact is going to be increasing. So with every advance in AI, it significantly expands the range of tasks. So in that sense, I think most experts say by the end of the century, we're very, very likely to have general purpose AI. The median is something around 2045. I'm a little more on the conservative side; I think the problem is harder than we think. I like what John McCarthy, who was one of the founders of AI, said when he was asked this question: somewhere between five and 500 years. And we're going to need, I think, several Einsteins to make it happen.
