The AI Revolution Is Underhyped | Eric Schmidt | TED


00:04
Bilawal Sidhu: Eric Schmidt, thank you for joining us. Let's go back. You said the arrival of non-human intelligence is a very big deal. And this photo, taken in 2016, feels like one of those quiet moments where the Earth shifted beneath us, but not everyone noticed. What did you see back then that the rest of us might have missed?

00:25
Eric Schmidt: In 2016, we didn't understand
7
25890
2769
00:28
what was now going to happen,
8
28692
1602
00:30
but we understood that these algorithms were new and powerful.
9
30327
3504
00:34
What happened in this particular set of games
10
34131
2703
00:36
was in roughly the second game,
11
36834
1935
00:38
there was a new move invented by AI
12
38769
2770
00:41
in a game that had been around for 2,500 years
13
41572
3337
00:44
that no one had ever seen.
14
44942
1768
00:47
Technically, the way this occurred
15
47011
1668
00:48
was that the system of AlphaGo was essentially organized
16
48679
3570
00:52
to always maintain a greater than 50 percent chance of winning.
17
52283
3737
00:56
And so it calculated correctly this move,
18
56487
2703
00:59
which was this great mystery among all of the Go players
19
59223
2736
01:01
who are obviously insanely brilliant,
20
61992
2369
01:04
mathematical and intuitive players.
21
64395
2436
01:07
The question that Henry, Craig Mundie and I started to discuss, right,
22
67364
6474
01:13
is what does this mean?
23
73838
3436
01:18
How is it that our computers could come up with something
24
78109
2702
01:20
that humans had never thought about?
25
80811
1735
01:22
I mean, this is a game played by billions of people.
26
82546
2670
01:25
And that began the process that led to two books.
27
85649
3604
01:29
And I think, frankly,
28
89887
1168
01:31
is the point at which the revolution really started.
29
91088
4271
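To make the "greater than 50 percent chance of winning" idea concrete, here is a minimal sketch, assuming a stand-in evaluator: AlphaGo actually combined a learned value network with Monte Carlo tree search, so the toy scoring function below is only a placeholder for the real estimate.

```python
import random

def estimate_win_probability(board: str, move: str) -> float:
    # Stand-in for AlphaGo's value network: returns an estimated
    # probability of winning after playing `move` on `board`.
    # Deterministic pseudo-randomness keeps the sketch self-contained.
    return random.Random(f"{board}:{move}").random()

def choose_move(board: str, legal_moves: list) -> str:
    # Always pick the move that keeps the estimated chance of winning
    # highest -- the policy described in the talk.
    return max(legal_moves, key=lambda m: estimate_win_probability(board, m))

print(choose_move("empty-19x19", ["D4", "Q16", "R5"]))
```

The surprising move arises naturally under such a policy: the system optimizes win probability, not moves that look conventional to humans.
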
01:35
BS: If you fast-forward to today, it seems that all anyone can talk about is AI, especially here at TED. But you've taken a contrarian stance: you actually think AI is underhyped. Why is that?

01:49
ES: And I'll tell you why. Most of you think of AI as, I'll just use the general term, as ChatGPT. For most of you, ChatGPT was the moment where you said, "Oh my God, this thing writes, and it makes mistakes, but it's so brilliantly verbal." That was certainly my reaction. Most people that I knew did that.

02:07
BS: It was visceral, yeah.

02:08
ES: This was two years ago. Since then, the gains in what is called reinforcement learning, which is what AlphaGo helped invent and so forth, allow us to do planning. A good example is to look at OpenAI o3 or DeepSeek R1: you can see how it goes forward and back, forward and back, forward and back. It's extraordinary. In my case, I bought a rocket company because it was, like, interesting.

02:36
BS: (Laughs) As one does.

02:38
ES: As one does.
58
158055
1168
02:39
And it’s an area that I’m not an expert in,
59
159256
2736
02:42
and I want to be an expert.
60
162026
1301
02:43
So I'm using deep research.
61
163360
1602
02:45
And these systems are spending 15 minutes writing these deep papers.
62
165296
4337
02:49
That's true for most of them.
63
169667
1501
02:51
Do you have any idea how much computation
64
171202
2402
02:53
15 minutes of these supercomputers is?
65
173637
2536
02:56
It's extraordinary.
66
176207
1601
02:57
So you’re seeing the arrival,
67
177841
1669
02:59
the shift from language to language.
68
179543
2236
03:01
Tthen you had language to sequence,
69
181812
1702
03:03
which is how biology is done.
70
183514
1468
03:05
Now you're doing essentially planning and strategy.
71
185015
3404
03:09
The eventual state of this
72
189053
2802
03:11
is the computers running all business processes, right?
73
191889
2936
03:14
So you have an agent to do this, an agent to do this,
74
194858
2503
03:17
an agent to do this.
75
197361
1635
03:19
And you concatenate them together,
76
199029
1702
03:20
and they speak language among each other.
77
200764
2570
03:23
They typically speak English language.
78
203367
2069
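A minimal sketch of that concatenation, with hypothetical agent names; in practice each function would wrap a model call, but the handoffs really are plain English messages.

```python
# Each "agent" consumes and produces plain English; chaining them
# yields a business process. In practice each function would call a model.
def research_agent(task: str) -> str:
    return f"Findings for task: {task}"

def drafting_agent(findings: str) -> str:
    return f"Draft report based on: {findings}"

def review_agent(draft: str) -> str:
    return f"Approved after review: {draft}"

def run_pipeline(task: str) -> str:
    message = task
    for agent in (research_agent, drafting_agent, review_agent):
        message = agent(message)  # agents speak English to each other
    return message

print(run_pipeline("summarize this quarter's sales"))
```
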
03:26
BS: I mean, speaking of just the sheer compute requirements of these systems,
79
206303
4939
03:31
let's talk about scale briefly.
80
211275
1635
03:33
You know, I kind of think of these AI systems as Hungry Hungry Hippos.
81
213277
3370
03:36
They seemingly soak up all the data and compute that we throw at them.
82
216647
3470
03:40
They've already digested all the tokens on the public internet,
83
220150
3504
03:43
and it seems we can't build data centers fast enough.
84
223654
2903
03:47
What do you think the real limits are,
85
227224
2002
03:49
and how do we get ahead of them
86
229260
2202
03:51
before they start throttling AI progress?
87
231462
2636
03:54
ES: So there's a real limit in energy.
88
234131
1835
03:56
Give you an example.
89
236000
1167
03:57
There's one calculation,
90
237167
1202
03:58
and I testified on this this week in Congress,
91
238402
2769
04:01
that we need another 90 gigawatts of power in America.
92
241205
5539
04:06
My answer, by the way, is, think Canada, right?
93
246777
3837
04:10
Nice people, full of hydroelectric power.
94
250648
2268
04:12
But that's apparently not the political mood right now.
95
252950
3136
04:16
Sorry.
96
256120
1201
04:17
So 90 gigawatts is 90 nuclear power plants in America.
97
257354
5539
04:22
Not happening.
98
262926
1168
04:24
We're building zero, right?
99
264094
1569
04:25
How are we going to get all that power?
100
265696
1868
04:27
This is a major, major national issue.
101
267598
2402
04:30
You can use the Arab world,
102
270501
1301
04:31
which is busy building five to 10 gigawatts of data centers.
103
271835
3737
04:35
India is considering a 10-gigawatt data center.
104
275572
3037
04:38
To understand how big gigawatts are,
105
278642
2369
04:41
is think cities per data center.
106
281045
2969
04:44
That's how much power these things need.
107
284048
2269
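The arithmetic behind those figures, as a back-of-envelope sketch; the roughly 1 GW per reactor and roughly 1 kW average per household are rules of thumb assumed here, not figures from the talk.

```python
# Rough rules of thumb: a large nuclear reactor produces about 1 GW;
# an average US household draws on the order of 1 kW.
extra_demand_gw = 90
reactors_needed = extra_demand_gw / 1      # ~1 GW per reactor
homes_per_gigawatt = 1e9 / 1e3             # 1 GW / ~1 kW per home

print(f"{reactors_needed:.0f} reactor-sized plants")           # -> 90
print(f"1 GW powers roughly {homes_per_gigawatt:,.0f} homes")  # city scale
```
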
04:46
And people look at it and they say, "Well, there are lots of algorithmic improvements, and you will need less power." There's an old rule, I'm old enough to remember, right? Grove giveth, Gates taketh away. OK, the hardware just gets faster and faster. The physicists are amazing, just incredible what they've been able to do. And us software people, we just use it and use it and use it. When you look at planning, at least in today's algorithms, it's back and forth, try this and that; just watch it yourself. There are estimates, and you know this from Andreessen Horowitz reports, it's been well studied, that there's an increase of at least a factor of 100, maybe a factor of 1,000, in the computation required just to do this kind of planning. The technology goes from essentially deep learning to reinforcement learning to something called test-time compute, where not only are you doing planning, but you're also learning while you're doing planning. That is, if you will, the zenith, or what have you, of computation needs.

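One hedged way to see where a factor of 100 could come from (the token counts below are illustrative assumptions, not measurements): if a planner drafts, critiques, and revises many times before answering, total generation scales with the number of forward-and-back passes.

```python
# Illustrative assumptions only.
tokens_per_response = 1_000   # one direct, single-pass answer
planning_passes = 100         # forward-and-back rounds while planning

planning_tokens = planning_passes * tokens_per_response
print(planning_tokens // tokens_per_response, "x the computation")  # -> 100 x
```
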
05:50
That's problem number one: electricity and hardware. Problem number two is we ran out of data, so we have to start generating it. But we can easily do that, because that's one of the functions. And then the third question, which I don't understand, is: what's the limit of knowledge? I'll give you an example. Let's imagine we are collectively all of the computers in the world, and we're all thinking, and we're all thinking based on knowledge that exists, that was previously invented. How do we invent something completely new? So, Einstein. When you study the way scientific discovery works, in biology, math, so forth and so on, what typically happens is a truly brilliant human being looks at one area and says, "I see a pattern that's in a completely different area, has nothing to do with the first one. It's the same pattern." And they take the tools from one and apply them to another. Today, our systems cannot do that. If we can get through that, and I'm working on this, a general technical term for it is non-stationarity of objectives: the rules keep changing. We will see if we can solve that problem. If we can, we're going to need even more data centers, and we'll also be able to invent completely new schools of scientific and intellectual thought, which will be incredible.

07:11
BS: So as we push towards a zenith,
161
431829
1768
07:13
autonomy has been a big topic of discussion.
162
433630
2770
07:16
Yoshua Bengio gave a compelling talk earlier this week,
163
436433
2903
07:19
advocating that AI labs should halt the development of agentic AI systems
164
439369
4071
07:23
that are capable of taking autonomous action.
165
443474
2435
07:25
Yet that is precisely what the next frontier is for all these AI labs,
166
445943
4137
07:30
and seemingly for yourself, too.
167
450113
2136
07:32
What is the right decision here?
168
452282
1669
07:33
ES: So Yoshua is a brilliant inventor of much of what we're talking about
169
453984
4338
07:38
and a good personal friend.
170
458355
1301
07:39
And we’ve talked about this, and his concerns are very legitimate.
171
459656
3971
07:43
The question is not are his concerns right,
172
463660
2002
07:45
but what are the solutions?
173
465696
1368
07:47
So let's think about agents.
174
467097
2436
07:49
So for purposes of argument, everyone in the audience is an agent.
175
469566
3570
07:53
You have an input that's English or whatever language.
176
473170
3603
07:56
And you have an output that’s English, and you have memory,
177
476807
2769
07:59
which is true of all humans.
178
479576
1468
08:01
Now we're all busy working,
179
481044
1802
08:02
and all of a sudden, one of you decides
180
482880
3870
08:06
it's much more efficient not to use human language,
181
486783
2737
08:09
but we'll invent our own computer language.
182
489553
2235
08:11
Now you and I are sitting here, watching all of this,
183
491822
2536
08:14
and we're saying, like, what do we do now?
184
494391
2269
08:16
The correct answer is unplug you, right?
185
496660
3303
08:19
Because we're not going to know,
186
499997
2602
08:22
we're just not going to know what you're up to.
187
502633
2369
08:25
And you might actually be doing something really bad or really amazing.
188
505035
3470
08:28
We want to be able to watch.
189
508505
1869
08:30
So we need provenance, something you and I have talked about,
190
510407
3303
08:33
but we also need to be able to observe it.
191
513744
2135
08:35
To me, that's a core requirement.
192
515913
2702
08:39
There's a set of criteria that the industry believes are points
193
519049
3003
08:42
where you want to, metaphorically, unplug it.
194
522085
2436
08:44
One is where you get recursive self-improvement,
195
524555
2502
08:47
which you can't control.
196
527057
1201
08:48
Recursive self-improvement is where the computer is off learning,
197
528292
3069
08:51
and you don't know what it's learning.
198
531361
1835
08:53
That can obviously lead to bad outcomes.
199
533230
1935
08:55
Another one would be direct access to weapons.
200
535198
2269
08:57
Another one would be that the computer systems decide to exfiltrate themselves,
201
537467
4038
09:01
to reproduce themselves without our permission.
202
541538
3103
09:04
So there's a set of such things.
203
544675
2035
09:06
The problem with Yoshua's speech, with respect to such a brilliant person,
204
546743
4939
09:11
is stopping things in a globally competitive market
205
551715
3670
09:15
doesn't really work.
206
555419
1601
09:17
Instead of stopping agentic work,
207
557454
3137
09:20
we need to find a way to establish the guardrails,
208
560624
2669
09:23
which I know you agree with because we’ve talked about it.
209
563327
2736
09:26
(Applause)
210
566063
3703
09:30
BS: I think that brings us nicely to the dilemmas.
211
570100
2336
09:32
And let's just say there are a lot of them when it comes to this technology.
212
572469
3704
09:36
The first one I'd love to start with, Eric,
213
576173
2002
09:38
is the exceedingly dual-use nature of this tech, right?
214
578208
2736
09:40
It's applicable to both civilian and military applications.
215
580978
3370
09:44
So how do you broadly think about the dilemmas
216
584748
2236
09:47
and ethical quandaries
217
587017
1535
09:48
that come with this tech and how humans deploy them?
218
588585
3237
09:53
ES: In many cases, we already have doctrines about personal responsibility. A simple example: I did a lot of military work and continue to do so. The US military has a rule called 3000.09, generally known as "human in the loop" or "meaningful human control." You don't want systems that are not under our control; it's a line we can't cross. I think that's correct. I think that the competition between the West, and particularly the United States, and China is going to be defining in this area. And I'll give you some examples. First, the current government has now put in essentially reciprocating 145-percent tariffs. That has huge implications for the supply chain. We in our industry depend on packaging and components from China that are boring, if you will, but incredibly important: the little packaging and the little glue things and so forth that are part of the computers. If China were to deny access to them, that would be a big deal. We are trying to deny them access to the most advanced chips, which they are super annoyed about. Dr. Kissinger asked Craig and me to do Track II dialogues with the Chinese, and we're in conversations with them. What's the number one issue they raise? This issue. Indeed, if you look at DeepSeek, which is really impressive, they managed to find algorithms that got around the problems by making them more efficient. Because China is doing everything open source, open weights, we immediately got the benefit of their invention and have adopted it into US things. So we're in a situation now which I think is quite tenuous, where the US is largely driving, for many, many good reasons, largely closed models, largely under very good control. China is likely to be the leader in open source unless something changes. And open source leads to very rapid proliferation around the world. This proliferation is dangerous at the cyber level and the bio level. But let me give you why it's also dangerous in a more significant way, in a nuclear-threat way.

11:56
Dr. Kissinger, who we all worked with very closely, was one of the architects of mutually assured destruction, deterrence and so forth. And what's happening now is you've got a situation where -- I'll use an example; it's easier if I explain. You're the good guy, and I'm the bad guy, OK? You're six months ahead of me, and we're both on the same path to superintelligence. And you're going to get there, right? And I'm sure you're going to get there, you're that close. And I'm six months behind. Pretty good, right? Sounds pretty good. No. These are network-effect businesses. And in network-effect businesses, it is the slope of your improvement that determines everything. So I'll use OpenAI or Gemini: they have 1,000 programmers. They're in the process of creating a million AI software programmers. What does that do? First, you don't have to feed them, except electricity. So that's good. And they don't quit and things like that. Second, the slope is like this. Well, as we get closer to superintelligence, the slope goes like this. If you get there first, you dastardly person --

13:04
BS: You're never going to be able to catch me.

13:06
ES: I will not be able to catch you. And I've given you the tools to reinvent the world and, in particular, destroy me. That's how my brain, Mr. Evil, is going to think. So what am I going to do? The first thing I'm going to do is try to steal all your code. And you've prevented that because you're good. And you were good. So you're still good, at Google. Second, then I'm going to infiltrate you with humans. Well, you've got good protections against that; you know, we don't have spies. So what do I do? I'm going to go in, and I'm going to change your model. I'm going to modify it. I'm going to actually screw you up so that I'm one day ahead of you. And you're so good, I can't do that. What's my next choice? Bomb your data center. Now, do you think I'm insane? These conversations are occurring around nuclear opponents today in our world. There are legitimate people saying the only solution to this problem is preemption. Now, I just told you that you, Mr. Good, are about to have the keys to control the entire world, both in terms of economic dominance, innovation, surveillance, whatever it is that you care about. I have to prevent that. We don't have any language in our society for this, the foreign policy people have not thought about it, and this is coming. When is it coming? Probably five years. We have time. We have time for this conversation. And this is really important.

14:36
BS: Let me push on this a little bit.
332
876072
1769
14:37
So if this is true
333
877874
1168
14:39
and we can end up in this sort of standoff scenario
334
879042
2402
14:41
and the equivalent of mutually-assured destruction,
335
881478
2402
14:43
you've also said that the US should embrace open-source AI
336
883914
3603
14:47
even after China's DeepSeek showed what's possible
337
887550
2336
14:49
with a fraction of the compute.
338
889920
1501
14:51
But doesn't open-sourcing these models,
339
891421
1902
14:53
just hand capabilities to adversaries that will accelerate their own timelines?
340
893356
4138
14:57
ES: This is one of the wickedest, or, as we call them, wicked hard problems. Our industry, our science, everything about the world that we have built is based on academic research, open source, so forth. Much of Google's technology was based on open source. Some of Google's technology is open source, some of it is proprietary, perfectly legitimate. What happens when there's an open-source model that is really dangerous, and it gets into the hands of the Osama bin Ladens of the world, and we know there is more than one, unfortunately? We don't know. The consensus in the industry right now is that the open-source models are not quite at the point of national or global danger, but you can see a pattern where they might get there. So a lot will now depend upon the key decisions made in the US and China, and in the companies in both places. The reason I focus on the US and China is that they're the only two countries where people are crazy enough to spend the billions and billions of dollars that are required to build this new vision. Europe, which would love to do it, doesn't have the capital structure to do it. Most of the other countries, not even India, have the capital structure to do it, although they wish to. The Arabs don't have the capital structure to do it, although they're working on it. So this fight, this battle, will be the defining battle. I'm worried about this fight. Dr. Kissinger talked about how the likely path to war with China was by accident. And he was a student of World War I, which of course started with a small event, escalated over that summer of, I think, 1914, and then became this horrific conflagration. You can imagine a series of steps along the lines of what I'm talking about that could lead us to a horrific global outcome. That's why we have to be paying attention.

16:49
BS: I want to talk about one of the recurring tensions here,
381
1009706
2836
16:52
before we move on to the dreams,
382
1012542
2002
16:54
is, to sort of moderate these AI systems at scale, right,
383
1014577
3204
16:57
there's this weird tension in AI safety
384
1017814
2135
16:59
that the solution to preventing "1984"
385
1019949
3137
17:03
often sounds a lot like "1984."
386
1023086
2903
17:06
So proof of personhood is a hot topic.
387
1026022
1835
17:07
Moderating these systems at scale is a hot topic.
388
1027857
2503
17:10
How do you view that trade-off?
389
1030393
1502
17:11
In trying to prevent dystopia,
390
1031895
1768
17:13
let's say preventing non-state actors
391
1033663
1902
17:15
from using these models in undesirable ways,
392
1035598
2703
17:18
we might accidentally end up building the ultimate surveillance state.
393
1038301
3804
17:23
ES: It's really important that we stick to the values
394
1043139
3737
17:26
that we have in our society.
395
1046876
2136
17:29
I am very, very committed to individual freedom.
396
1049045
2803
17:31
It's very easy for a well-intentioned engineer to build a system
397
1051881
4138
17:36
which is optimized and restricts your freedom.
398
1056052
3337
17:39
So it's very important that human freedom be preserved in this.
399
1059856
4037
17:44
A lot of these are not technical issues.
400
1064494
1935
17:46
They're really business decisions.
401
1066463
1635
17:48
It's certainly possible to build a surveillance state,
402
1068098
2569
17:50
but it's also possible to build one that's freeing.
403
1070700
2403
17:53
The conundrum that you're describing
404
1073136
1735
17:54
is because it's now so easy to operate based on misinformation,
405
1074904
3470
17:58
everyone knows what I'm talking about,
406
1078408
1868
18:00
that you really do need proof of identity.
407
1080276
2403
18:02
But proof of identity does not have to include details.
408
1082679
2903
18:05
So, for example, you could have a cryptographic proof
409
1085615
2503
18:08
that you are a human being,
410
1088118
1301
18:09
and it could actually be true without anything else,
411
1089452
2469
18:11
and also not be able to link it to others
412
1091921
2770
18:14
using various cryptographic techniques.
413
1094724
2636
18:17
BS: So zero-knowledge proofs and other techniques.
414
1097360
2369
18:19
ES: Zero-knowledge proofs are the most obvious one.
415
1099762
2403
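For the flavor of how that works, here is a toy Schnorr proof of knowledge in Python, made non-interactive with the Fiat-Shamir heuristic. This is a minimal sketch, not a production proof-of-personhood scheme, and the group parameters are deliberately tiny and insecure; but the shape is the point: convince a verifier you hold a credential without revealing it.

```python
import hashlib
import secrets

# Toy Schnorr parameters: p = 2q + 1, and g = 4 generates the order-q
# subgroup. These sizes are insecure on purpose; real systems use
# 256-bit elliptic curves or 2048-bit-plus groups.
p, q, g = 2039, 1019, 4

def challenge(*values) -> int:
    # Fiat-Shamir: derive the challenge by hashing the transcript.
    data = ",".join(str(v) for v in values).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

# Credential: secret x; public identity y = g^x mod p.
x = secrets.randbelow(q)
y = pow(g, x, p)

# Prover shows knowledge of x ("I hold this credential") without revealing it.
r = secrets.randbelow(q)
t = pow(g, r, p)          # commitment
c = challenge(g, y, t)    # non-interactive challenge
s = (r + c * x) % q       # response

# Verifier checks g^s == t * y^c (mod p) and learns nothing about x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof verified")
```
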
18:22
BS: Alright, let's change gears, shall we, to dreams. In your book "Genesis," which you co-authored with Henry Kissinger, you strike a cautiously optimistic tone. When you look ahead to the future, what should we all be excited about?

18:35
ES: Well, I'm of the age
420
1115745
1435
18:37
where some of my friends are getting really dread diseases.
421
1117213
3437
18:41
Can we fix that now?
422
1121317
1869
18:43
Can we just eliminate all of those?
423
1123520
2168
18:45
Why can't we just uptake these
424
1125722
1635
18:47
and right now, eradicate all of these diseases?
425
1127357
3970
18:51
That's a pretty good goal.
426
1131961
1835
18:54
I'm aware of one nonprofit that's trying to identify,
427
1134531
3203
18:57
in the next two years,
428
1137767
1301
18:59
all human druggable targets and release it to the scientists.
429
1139102
3804
19:02
If you know the druggable targets,
430
1142939
1702
19:04
then the drug industry can begin to work on things.
431
1144674
3136
19:07
I have another company I'm associated with
432
1147844
2002
19:09
which has figured out a way, allegedly, it's a startup,
433
1149879
2636
19:12
to reduce the cost of stage-3 trials by an order of magnitude.
434
1152515
3971
19:16
As you know, those are the things
435
1156519
1635
19:18
that ultimately drive the cost structure of drugs.
436
1158154
2369
19:20
That's an example.
437
1160557
1167
19:21
I'd like to know where dark energy is,
438
1161758
3036
19:24
and I'd like to find it.
439
1164827
1435
19:26
I'm sure that there is an enormous amount of physics in dark energy, dark matter.
440
1166996
5406
19:32
Think about the revolution in material science.
441
1172435
3237
19:35
Infinitely more powerful transportation,
442
1175705
2870
19:38
infinitely more powerful science and so forth.
443
1178608
3537
19:42
I'll give you another example.
444
1182178
1435
19:43
Why do we not have every human being on the planet
445
1183613
5939
19:49
have their own tutor in their own language
446
1189586
3570
19:53
to help them learn something new?
447
1193189
1802
19:55
Starting with kindergarten.
448
1195024
1936
19:56
It's obvious.
449
1196960
1401
19:58
Why have we not built it?
450
1198394
1235
19:59
The answer, the only possible answer
451
1199662
1735
20:01
is there must not be a good economic argument.
452
1201397
2236
20:03
The technology works.
453
1203666
1602
20:05
Teach them in their language, gamify the learning,
454
1205301
3070
20:08
bring people to their best natural lengths.
455
1208404
2169
20:10
Another example.
456
1210607
1134
20:11
The vast majority of health care in the world
457
1211774
2102
20:13
is either absent
458
1213876
1135
20:15
or delivered by the equivalent of nurse practitioners
459
1215044
2536
20:17
and very, very sort of stressed local village doctors.
460
1217614
3303
20:20
Why do they not have the doctor assistant that helps them in their language,
461
1220917
4504
20:25
treat whatever with, again, perfect healthcare?
462
1225455
2369
20:27
I can just go on.
463
1227824
1668
20:29
There are lots and lots of issues with the digital world.
464
1229525
5673
20:35
It feels like that we're all in our own ships in the ocean,
465
1235198
3770
20:39
and we're not talking to each other.
466
1239002
1768
20:40
In our hunger for connectivity and connection,
467
1240803
3571
20:44
these tools make us lonelier.
468
1244407
2669
20:47
We've got to fix that, right?
469
1247110
1568
20:48
But these are fixable problems.
470
1248711
1569
20:50
They don't require new physics.
471
1250313
1935
20:52
They don't require new discoveries, we just have to decide.
472
1252248
2936
20:55
So when I look at this future,
473
1255218
1435
20:56
I want to be clear that the arrival of this intelligence,
474
1256686
4504
21:01
both at the AI level, the AGI,
475
1261224
2636
21:03
which is general intelligence,
476
1263860
1435
21:05
and then superintelligence,
477
1265328
1668
21:07
is the most important thing that's going to happen in about 500 years,
478
1267030
4738
21:11
maybe 1,000 years in human society.
479
1271801
2102
21:13
And it's happening in our lifetime.
480
1273936
1902
21:15
So don't screw it up.
481
1275872
1868
21:18
BS: Let's say we don't.

(Applause)

Let's say we don't screw it up. Let's say we get into this world of radical abundance. Let's say we end up in this place, and we hit that point of recursive self-improvement, and AI systems take on the vast majority of economically productive tasks. In your mind, what are humans going to do in this future? Are we all sipping piña coladas on the beach, engaging in hobbies?

21:43
ES: You tech liberal, you. You must be in favor of UBI.

21:48
BS: No, no, no.

21:49
ES: Look, humans are unchanged
494
1309605
3003
21:52
in the midst of this incredible discovery.
495
1312642
2436
21:55
Do you really think that we're going to get rid of lawyers?
496
1315111
2836
21:57
No, they're just going to have more sophisticated lawsuits.
497
1317947
3203
22:01
Do you really think we're going to get rid of politicians?
498
1321184
2736
22:03
No, they'll just have more platforms to mislead you.
499
1323953
2436
22:06
Sorry.
500
1326422
1368
22:07
I mean, I can just go on and on and on.
501
1327824
2369
22:10
The key thing to understand about this new economics
502
1330226
3570
22:13
is that we collectively, as a society, are not having enough humans.
503
1333830
4838
22:18
Look at the reproduction rate in Asia,
504
1338701
2336
22:21
is essentially 1.0 for two parents.
505
1341070
2703
22:23
This is not good, right?
506
1343806
2036
22:25
So for the rest of our lives,
507
1345842
2002
22:27
the key problem is going to get the people who are productive.
508
1347877
2903
22:30
That is, in their productive period of lives,
509
1350813
2169
22:33
more productive to support old people like me, right,
510
1353015
4438
22:37
who will be bitching that we want more stuff from the younger people.
511
1357487
3270
22:40
That's how it's going to work.
512
1360757
1468
22:42
These tools will radically increase that productivity.
513
1362258
3637
22:45
There's a study that says that we will,
514
1365928
2069
22:47
under this set of assumptions around agentic AI and discovery
515
1367997
3170
22:51
and the scale that I'm describing,
516
1371200
1669
22:52
there's a lot of assumptions
517
1372869
1635
22:54
that you'll end up
518
1374504
1568
22:56
with something like 30-percent increase in productivity per year.
519
1376105
4305
23:00
Having now talked to a bunch of economists,
520
1380443
2069
23:02
they have no models
521
1382545
1835
23:04
for what that kind of increase in productivity looks like.
522
1384414
3069
23:07
We just have never seen it.
523
1387517
1835
23:09
It didn't occur in any rise of a democracy or a kingdom in our history.
524
1389385
5039
23:15
It's unbelievable what's going to happen.
525
1395525
3136
23:18
Hopefully we will get it in the right direction.
526
1398694
2903
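For scale, here is the compounding that a steady 30-percent annual gain implies; the growth rate is the study's figure quoted above, and the rest is just exponentiation.

```python
# Compounding a 30% annual productivity gain: (1 + rate) ** years.
rate = 0.30
for years in (1, 5, 10):
    print(f"{years:>2} years -> {(1 + rate) ** years:.1f}x productivity")
# ->  1 years -> 1.3x,  5 years -> 3.7x, 10 years -> 13.8x
```

A decade of such growth would multiply output nearly fourteenfold, which is why existing economic models have nothing to compare it to.
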
23:22
BS: It is truly unbelievable.
527
1402231
1402
23:23
Let's bring this home, Eric.
528
1403666
1368
23:25
You've navigated decades of technological change.
529
1405067
2837
23:27
For everyone that's navigating this AI transition,
530
1407904
2536
23:30
technologists, leaders, citizens
531
1410473
2235
23:32
that are feeling a mix of excitement and anxiety,
532
1412708
3170
23:35
what is that single piece of wisdom
533
1415912
2135
23:38
or advice you'd like to offer
534
1418080
1936
23:40
for navigating this insane moment that we're living through today?
535
1420016
3737
23:43
ES: So one thing to remember
536
1423786
1835
23:45
is that this is a marathon, not a sprint.
537
1425655
2736
23:49
One year I decided to do a 100-mile bike race,
538
1429859
3036
23:52
which was a mistake.
539
1432895
1235
23:54
And the idea was, I learned about spin rate.
540
1434163
2703
23:57
Every day, you get up, and you just keep going.
541
1437500
2336
23:59
You know, from our work together at Google,
542
1439869
2336
24:02
that when you’re growing at the rate that we’re growing,
543
1442238
4538
24:06
you get so much done in a year,
544
1446809
2236
24:09
you forget how far you went.
545
1449078
2803
24:12
Humans can't understand that.
546
1452315
1968
24:14
And we're in this situation
547
1454317
1568
24:15
where the exponential is moving like this.
548
1455918
2469
24:18
As this stuff happens quicker,
549
1458387
1869
24:20
you will forget what was true two years ago or three years ago.
550
1460289
4738
24:25
That's the key thing.
551
1465561
1535
24:27
So my advice to you all is ride the wave, but ride it every day.
552
1467129
4872
24:32
Don't view it as episodic and something you can end,
553
1472001
2469
24:34
but understand it and build on it.
554
1474504
2469
24:36
Each and every one of you has a reason to use this technology.
555
1476973
4004
24:41
If you're an artist, a teacher, a physician,
556
1481010
3504
24:44
a business person, a technical person.
557
1484514
2669
24:47
If you're not using this technology,
558
1487216
2303
24:49
you're not going to be relevant compared to your peer groups
559
1489552
3270
24:52
and your competitors
560
1492855
1168
24:54
and the people who want to be successful.
561
1494056
2603
24:56
Adopt it, and adopt it fast.
562
1496692
1902
24:58
I have been shocked at how fast these systems --
563
1498594
3003
25:01
as an aside, my background is enterprise software,
564
1501597
4505
25:06
and nowadays there's a model Protocol from Anthropic.
565
1506135
4271
25:10
You can actually connect the model directly into the databases
566
1510439
3170
25:13
without any of the connectors.
567
1513609
1468
25:15
I know this sounds nerdy.
568
1515111
1234
25:16
There's a whole industry there that goes away
569
1516379
2102
25:18
because you have all this flexibility now.
570
1518514
2036
25:20
You can just say what you want, and it just produces it.
571
1520583
2970
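A minimal sketch of that point, assuming the official `mcp` Python SDK's FastMCP helper and a hypothetical sales.db file: one small server exposes the database as a tool any MCP-capable model can call, with no bespoke connector layer in between.

```python
import sqlite3

from mcp.server.fastmcp import FastMCP  # official MCP Python SDK

# One small server makes the database reachable by any MCP-capable model.
mcp = FastMCP("sales-db")

@mcp.tool()
def query_sales(sql: str) -> list:
    """Run a SQL query against the (hypothetical) sales database."""
    conn = sqlite3.connect("sales.db")
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()

if __name__ == "__main__":
    mcp.run()  # serves the Model Context Protocol (stdio by default)
```
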
25:23
That's an example of a real change in business. There are so many of these things coming every day.

25:29
BS: Ladies and gentlemen, Eric Schmidt.

25:31
ES: Thank you very much.

(Applause)