How AI Is Unlocking the Secrets of Nature and the Universe | Demis Hassabis | TED

406,393 views ・ 2024-04-29

TED



Translator: Yip Yan Yeung  Reviewer: Lening Xu
Chris Anderson: Demis, so good to have you here.

Demis Hassabis: It's fantastic to be here, thanks, Chris.

CA: Now, you told Time Magazine, "I want to understand the big questions, the really big ones that you normally go into philosophy or physics if you're interested in them. I thought building AI would be the fastest route to answer some of those questions." Why did you think that?

DH: (Laughs) Well, I guess when I was a kid, my favorite subject was physics, and I was interested in all the big questions, fundamental nature of reality, what is consciousness, you know, all the big ones. And usually you go into physics, if you're interested in that. But I read a lot of the great physicists, some of my all-time scientific heroes like Feynman and so on. And I realized, in the last, sort of, 20, 30 years, we haven't made much progress in understanding some of these fundamental laws. So I thought, why not build the ultimate tool to help us, which is artificial intelligence. And at the same time, we could also maybe better understand ourselves and the brain better, by doing that too. So not only was it an incredible tool, it was also useful for some of the big questions itself.

CA: Super interesting. So obviously AI can do so many things, but I think for this conversation, I'd love to focus in on this theme of what it might do to unlock the really big questions, the giant scientific breakthroughs, because it's been such a theme driving you and your company.

DH: So I mean, one of the big things AI can do, and I've always thought about, is we're getting, you know, even back 20, 30 years ago, the beginning of the internet era and computer era, the amount of data that was being produced, and also scientific data, just too much for the human mind to comprehend in many cases. And I think one of the uses of AI is to find patterns and insights in huge amounts of data and then surface that to the human scientists to make sense of and make new hypotheses and conjectures. So it seems to me very compatible with the scientific method.

CA: Right. But game play has played a huge role in your own journey in figuring this thing out. Who is this young lad on the left there? Who is that?

DH: So that was me, I think I must have been about around nine years old. I'm captaining the England Under 11 team, and we're playing in a Four Nations tournament, that's why we're all in red. I think we're playing France, Scotland and Wales, I think it was.

CA: That is so weird, because that happened to me too. In my dreams. (Laughter) And it wasn't just chess, you loved all kinds of games.

DH: I loved all kinds of games, yeah.

CA: And when you launched DeepMind, pretty quickly, you started having it tackle game play. Why?

DH: Well, look, I mean, games actually got me into AI in the first place because while we were doing things like, we used to go on training camps with the England team and so on. And actually back then, I guess it was in the mid '80s, we would use the very early chess computers, if you remember them, to train against, as well as playing against each other. And they were big lumps of plastic, you know, physical boards that you used to, some of you remember, used to actually press the squares down and there were LED lights, came on. And I remember actually, not just thinking about the chess, I was actually just fascinated by the fact that this lump of plastic, someone had programmed it to be smart and actually play chess to a really high standard. And I was just amazed by that. And that got me thinking about thinking. And how does the brain come up with these thought processes, these ideas, and then maybe how we could mimic that with computers. So yeah, it's been a whole theme for my whole life, really.

CA: But you raised all this money to launch DeepMind, and pretty soon you were using it to do, for example, this. I mean, this is an odd use of it. What was going on here?

DH: Well, we started off with games at the beginning of DeepMind. This was back in 2010, so this is from about 10 years ago, it was our first big breakthrough. Because we started off with classic Atari games from the 1970s, the simplest kind of computer games there are out there. And one of the reasons we used games is they're very convenient to test out your ideas and your algorithms. They're really fast to test. And also, as your systems get more powerful, you can choose harder and harder games. And this was actually the first time ever that our machine surprised us, the first of many times, which, it figured out in this game called Breakout, that you could send the ball round the back of the wall, and actually, it would be a much safer way to knock out all the tiles of the wall. It's a classic Atari game there. And that was our first real aha moment.

CA: So this thing was not programmed to have any strategy. It was just told, try and figure out a way of winning. You just move the bat at the bottom and see if you can find a way of winning.

DH: It was a real revolution at the time. So this was in 2012, 2013, where we coined these terms "deep reinforcement learning." And the key thing about them is that those systems were learning directly from the pixels, the raw pixels on the screen, but they weren't being told anything else. So they were being told, maximize the score, here are the pixels on the screen, 30,000 pixels. The system has to make sense on its own from first principles what's going on, what it's controlling, how to get points. And that's the other nice thing about using games to begin with. They have clear objectives, to win, to get scores. So you can kind of measure very easily that your systems are improving.

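The setup Hassabis describes — an agent that sees only raw pixels and a score, and is told nothing else — can be sketched in a few lines. Everything below (the toy environment, the tabular learner, all names) is invented for illustration; the real Atari work used deep neural networks over 30,000-pixel frames, not a lookup table:

```python
import random

class TinyBreakout:
    """A toy stand-in for an Atari game: a 5-slot paddle row and a
    5-slot ball row. The agent sees only raw "pixels" and a score."""
    def __init__(self):
        self.paddle = 2
        self.ball = random.randint(0, 4)

    def observe(self):
        pixels = [0] * 10                 # 10 "pixels": paddle row + ball row
        pixels[self.paddle] = 1
        pixels[5 + self.ball] = 1
        return tuple(pixels)

    def step(self, action):               # action: -1 left, 0 stay, +1 right
        self.paddle = max(0, min(4, self.paddle + action))
        reward = 1 if self.paddle == self.ball else 0   # the "score"
        self.ball = random.randint(0, 4)  # ball reappears somewhere new
        return reward

# Tabular value learning: the agent is told only "maximize the score,
# here are the pixels" -- it must work out what it controls on its own.
Q = {}

def policy(obs, eps=0.1):
    if random.random() < eps:             # occasional random exploration
        return random.choice([-1, 0, 1])
    return max([-1, 0, 1], key=lambda a: Q.get((obs, a), 0.0))

env = TinyBreakout()
for _ in range(20000):
    obs = env.observe()
    action = policy(obs)
    reward = env.step(action)
    old = Q.get((obs, action), 0.0)
    Q[(obs, action)] = old + 0.1 * (reward - old)   # nudge toward outcome
```

After training, the greedy policy reliably moves the paddle toward wherever the ball appears, even though nothing about paddles or balls was ever explained to it.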
CA: But there was a direct line from that to this moment a few years later, where country of South Korea and many other parts of Asia and in fact the world went crazy over -- over what?

DH: Yeah, so this was the pinnacle of -- this is in 2016 -- the pinnacle of our games-playing work, where, so we'd done Atari, we'd done some more complicated games. And then we reached the pinnacle, which was the game of Go, which is what they play in Asia instead of chess, but it's actually more complex than chess. And the actual brute force algorithms that were used to kind of crack chess were not possible with Go because it's a much more pattern-based game, much more intuitive game. So even though Deep Blue beat Garry Kasparov in the '90s, it took another 20 years for our program, AlphaGo, to beat the world champion at Go. And we always thought, myself and the people working on this project for many years, if you could build a system that could beat the world champion at Go, it would have had to have done something very interesting. And in this case, what we did with AlphaGo, is it basically learned for itself, by playing millions and millions of games against itself, ideas about Go, the right strategies. And in fact invented its own new strategies that the Go world had never seen before, even though we've played Go for more than, you know, 2,000 years, it's the oldest board game in existence. So, you know, it was pretty astounding. Not only did it win the match, it also came up with brand new strategies.

CA: And you continued this with a new strategy of not even really teaching it anything about Go, but just setting up systems that just from first principles would play so that they could teach themselves from scratch, Go or chess. Talk about AlphaZero and the amazing thing that happened in chess then.

DH: So following this, we started with AlphaGo by giving it all of the human games that are being played on the internet. So it started that as a basic starting point for its knowledge. And then we wanted to see what would happen if we started from scratch, from literally random play. So this is what AlphaZero was. That's why it's the zero in the name, because it started with zero prior knowledge. And the reason we did that is because then we would build a system that was more general. So AlphaGo could only play Go, but AlphaZero could play any two-player game, and it did it by playing initially randomly and then slowly, incrementally improving. Well, not very slowly, actually, within the course of 24 hours, going from random to better than world-champion level.

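The "start from random play, improve only by playing yourself" loop can be illustrated on a toy game. This sketch uses Nim (take 1 to 3 sticks; whoever takes the last stick wins) with a simple tabular learner shared by both sides — purely illustrative, and nothing like AlphaZero's actual combination of deep networks and Monte Carlo tree search:

```python
import random

N = 21          # starting heap of sticks
Q = {}          # Q[(sticks_left, take)] -> estimated value for the mover

def choose(sticks, eps):
    moves = [m for m in (1, 2, 3) if m <= sticks]
    if random.random() < eps:
        return random.choice(moves)                        # explore
    return max(moves, key=lambda m: Q.get((sticks, m), 0.0))

def self_play_game(eps=0.2, lr=0.2):
    sticks, player, history = N, 0, []
    while sticks > 0:
        m = choose(sticks, eps)
        history.append((player, sticks, m))
        sticks -= m
        player = 1 - player
    winner = 1 - player          # whoever took the last stick won
    for p, s, m in history:      # credit every move with the final outcome
        target = 1.0 if p == winner else -1.0
        old = Q.get((s, m), 0.0)
        Q[(s, m)] = old + lr * (target - old)

for _ in range(30000):           # starts fully random; both sides share Q
    self_play_game()
```

Starting from uniformly random play, the shared table tends toward Nim's known optimal strategy (leave your opponent a multiple of four sticks), with no Nim knowledge supplied up front.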
CA: And so this is so amazing to me. So I'm more familiar with chess than with Go. And for decades, thousands and thousands of AI experts worked on building incredible chess computers. Eventually, they got better than humans. You had a moment a few years ago, where in nine hours, AlphaZero taught itself to play chess better than any of those systems ever did. Talk about that.

DH: It was a pretty incredible moment, actually. So we set it going on chess. And as you said, there's this rich history of chess and AI where there are these expert systems that have been programmed with these chess ideas, chess algorithms. And you have this amazing, you know, I remember this day very clearly, where you sort of sit down with the system starting off random, you know, in the morning, you go for a cup of coffee, you come back. I can still just about beat it by lunchtime, maybe just about. And then you let it go for another four hours. And by dinner, it's the greatest chess-playing entity that's ever existed. And, you know, it's quite amazing, like, looking at that live on something that you know well, you know, like chess, and you're an expert in, and actually just seeing that in front of your eyes. And then you extrapolate to what it could then do in science or something else, which of course, games were only a means to an end. They were never the end in themselves. They were just the training ground for our ideas and to make quick progress in a matter of, you know, less than five years actually went from Atari to Go.

CA: I mean, this is why people are in awe of AI and also kind of terrified by it. I mean, it's not just incremental improvement. The fact that in a few hours you can achieve what millions of humans over centuries have not been able to achieve. That gives you pause for thought.

DH: It does, I mean, it's a hugely powerful technology. It's going to be incredibly transformative. And we have to be very thoughtful about how we use that capability.

CA: So talk about this use of it because this is again, this is another extension of the work you've done, where now you're turning it to something incredibly useful for the world. What are all the letters on the left, and what's on the right?

DH: This was always my aim with AI from a kid, which is to use it to accelerate scientific discovery. And actually, ever since doing my undergrad at Cambridge, I had this problem in mind one day for AI, it's called the protein-folding problem. And it's kind of like a 50-year grand challenge in biology. And it's very simple to explain. Proteins are essential to life. They're the building blocks of life. Everything in your body depends on proteins. A protein is sort of described by its amino acid sequence, which you can think of as roughly the genetic sequence describing the protein, so those are the letters.

CA: And each of those letters represents in itself a complex molecule?

DH: That's right, each of those letters is an amino acid. And you can think of them as a kind of string of beads there at the bottom, left, right? But in nature, in your body or in an animal, this string, a sequence, turns into this beautiful shape on the right. That's the protein. Those letters describe that shape. And that's what it looks like in nature. And the important thing about that 3D structure is the 3D structure of the protein goes a long way to telling you what its function is in the body, what it does. And so the protein-folding problem is: Can you directly predict the 3D structure just from the amino acid sequence? So literally if you give the machine, the AI system, the letters on the left, can it produce the 3D structure on the right? And that's what AlphaFold does, our program does.

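The input/output contract being described — letters in, 3D structure out — looks like this as code. The `fold` below is only a placeholder showing the shape of the problem; the real AlphaFold is a large neural network trained on known structures, and the sequence here is made up:

```python
# The 20 standard amino acids, one letter each -- these are
# "the letters on the left" in the talk.
AMINO_ACIDS = set("ACDEFGHIKLMNPQRSTVWY")

def fold(sequence):
    """Placeholder predictor: one (x, y, z) coordinate per residue.

    A real predictor learns the geometry; here we just lay residues
    along a straight line 3.8 apart (roughly the spacing, in angstroms,
    of consecutive alpha-carbons in a real protein chain).
    """
    assert set(sequence) <= AMINO_ACIDS, "unknown amino-acid letter"
    return [(round(i * 3.8, 1), 0.0, 0.0) for i in range(len(sequence))]

structure = fold("MKTAYIAKQR")     # a short made-up 10-residue sequence
```

The point of the contract is that the output has exactly one 3D position per input letter — the sequence fully determines the predicted shape.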
CA: It's not calculating it from the letters, it's looking at patterns of other folded proteins that are known about and somehow learning from those patterns that this may be the way to do this?

DH: So when we started this project, actually straight after AlphaGo, I thought we were ready. Once we'd cracked Go, I felt we were finally ready after, you know, almost 20 years of working on this stuff, to actually tackle some scientific problems, including protein folding. And what we start with is painstakingly, over the last 40-plus years, experimental biologists have pieced together around 150,000 protein structures using very complicated, you know, X-ray crystallography techniques and other complicated experimental techniques. And the rule of thumb is that it takes one PhD student their whole PhD, so four or five years, to uncover one structure. But there are 200 million proteins known to nature. So you could just, you know, take forever to do that. And so we managed to actually fold, using AlphaFold, in one year, all those 200 million proteins known to science. So that's a billion years of PhD time saved.

(Applause)

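The back-of-the-envelope arithmetic behind "a billion years of PhD time" checks out, taking the five-years-per-structure rule of thumb quoted above:

```python
proteins = 200_000_000        # proteins known to science
years_per_structure = 5       # roughly one PhD per experimental structure
phd_years_saved = proteins * years_per_structure
print(phd_years_saved)        # 1000000000 -- a billion years
```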
CA: So it's amazing to me just how reliably it works. I mean, this shows, you know, here's the model and you do the experiment. And sure enough, the protein turns out the same way. Times 200 million.

DH: And the more deeply you go into proteins, you just start appreciating how exquisite they are. I mean, look at how beautiful these proteins are. And each of these things do a special function in nature. And they're almost like works of art. And it still astounds me today that AlphaFold can predict, the green is the ground truth, and the blue is the prediction, how well it can predict, is to within the width of an atom on average, is how accurate the prediction is, which is what is needed for biologists to use it, and for drug design and for disease understanding, which is what AlphaFold unlocks.

CA: You made a surprising decision, which was to give away the actual results of your 200 million proteins.

DH: We open-sourced AlphaFold and gave everything away on a huge database with our wonderful colleagues, the European Bioinformatics Institute.

(Applause)

CA: I mean, you're part of Google. Was there a phone call saying, "Uh, Demis, what did you just do?"

DH: You know, I'm lucky we have very supportive, Google's really supportive of science and understand the benefits this can bring to the world. And, you know, the argument here was that we could only ever have even scratched the surface of the potential of what we could do with this. This, you know, maybe like a millionth of what the scientific community is doing with it. There's over a million and a half biologists around the world who have used AlphaFold and its predictions. We think that's almost every biologist in the world is making use of this now, every pharma company. So we'll never know probably what the full impact of it all is.

CA: But you're continuing this work in a new company that's spinning out of Google called Isomorph.

DH: Isomorphic.

CA: Isomorphic. Give us just a sense of the vision there. What's the vision?

DH: AlphaFold is a sort of fundamental biology tool. Like, what are these 3D structures, and then what might they do in nature? And then if you, you know, the reason I thought about this and was so excited about this, is that this is the beginnings of understanding disease and also maybe helpful for designing drugs. So if you know the shape of the protein, and then you can kind of figure out which part of the surface of the protein you're going to target with your drug compound. And Isomorphic is extending this work we did in AlphaFold into the chemistry space, where we can design chemical compounds that will bind exactly to the right spot on the protein and also, importantly, to nothing else in the body. So it doesn't have any side effects and it's not toxic and so on. And we're building many other AI models, sort of sister models to AlphaFold to help predict, make predictions in chemistry space.

CA: So we can expect to see some pretty dramatic health medicine breakthroughs in the coming few years.

DH: I think we'll be able to get down drug discovery
355
949278
2711
DH:我认为我们可以 将药物研发时间
15:51
from years to maybe months.
356
951989
2127
从几年缩短到几个月。
CA: OK. Demis, I'd like to change direction a bit. Our mutual friend, Liv Boeree, gave a talk last year at TEDAI that she called the "Moloch Trap." The Moloch Trap is a situation where organizations, companies in a competitive situation can be driven to do things that no individual running those companies would by themselves do. I was really struck by this talk, and it's felt, as a sort of layperson observer, that the Moloch Trap has been shockingly in effect in the last couple of years. So here you are with DeepMind, sort of pursuing these amazing medical breakthroughs and scientific breakthroughs, and then suddenly, kind of out of left field, OpenAI with Microsoft releases ChatGPT. And the world goes crazy and suddenly goes, "Holy crap, AI is ..." you know, everyone can use it. And there's a sort of, it felt like the Moloch Trap in action. I think Microsoft CEO Satya Nadella actually said, "Google is the 800-pound gorilla in the search space. We wanted to make Google dance." How ...? And it did, Google did dance. There was a dramatic response. Your role was changed, you took over the whole Google AI effort. Products were rushed out. You know, Gemini, some part amazing, part embarrassing. I'm not going to ask you about Gemini because you've addressed it elsewhere. But it feels like this was the Moloch Trap happening, that you and others were pushed to do stuff that you wouldn't have done without this sort of catalyzing competitive thing. Meta did something similar as well. They rushed out an open-source version of AI, which is arguably a reckless act in itself. This seems terrifying to me. Is it terrifying?

17:59
DH: Look, it's a complicated topic, of course. And, first of all, I mean, there are many things to say about it. First of all, we were working on many large language models. And in fact, obviously, Google Research actually invented Transformers, as you know, which was the architecture that allowed all this to be possible, five, six years ago. And so we had many large models internally.

The thing was, I think what the ChatGPT moment did that changed was, and fair play to them to do that, was they demonstrated, I think somewhat surprisingly to themselves as well, that the public were ready to, you know, the general public were ready to embrace these systems and actually find value in these systems. Impressive though they are, I guess, when we're working on these systems, mostly you're focusing on the flaws and the things they don't do and hallucinations and things you're all familiar with now. We're thinking, you know, would anyone really find that useful given that it does this and that? And we would want them to improve those things first, before putting them out.

But interestingly, it turned out that even with those flaws, many tens of millions of people still find them very useful. And so that was an interesting update on maybe the convergence of products and the science that actually, all of these amazing things we've been doing in the lab, so to speak, are actually ready for prime time for general use, beyond the rarefied world of science. And I think that's pretty exciting in many ways.
CA: So at the moment, we've got this exciting array of products which we're all enjoying. And, you know, all this generative AI stuff is amazing. But let's roll the clock forward a bit. Microsoft and OpenAI are reported to be building, or investing like 100 billion dollars into, an absolute monster database supercomputer that can offer compute at orders of magnitude more than anything we have today. It takes like five gigawatts of energy to drive this, it's estimated. That's the energy of New York City to drive a data center. So we're pumping all this energy into this giant, vast brain. Google, I presume, is going to match this type of investment, right?
DH: Well, I mean, we don't talk about our specific numbers, but, you know, I think we're investing more than that over time. And that's one of the reasons we teamed up with Google back in 2014, is kind of we knew that in order to get to AGI, we would need a lot of compute. And that's what's transpired. And Google, you know, had and still has the most computers.
CA: So Earth is building these giant computers, these giant brains, that are going to power so much of the future economy. And it's all by companies that are in competition with each other. How will we avoid the situation where someone is getting a lead, someone else has got 100 billion dollars invested in their thing. Isn't someone going to go, "Wait a sec. If we used reinforcement learning here to maybe have the AI tweak its own code and rewrite itself and make it so [powerful], we might be able to catch up in nine hours over the weekend with what they're doing. Roll the dice, dammit, we have no choice. Otherwise we're going to lose a fortune for our shareholders." How are we going to avoid that?
DH: Yeah, well, we must avoid that, of course, clearly. And my view is that as we get closer to AGI, we need to collaborate more. And the good news is that most of the scientists involved in these labs know each other very well. And we talk to each other a lot at conferences and other things. And this technology is still relatively nascent. So probably it's OK what's happening at the moment. But as we get closer to AGI, I think as a society, we need to start thinking about the types of architectures that get built.

So I'm very optimistic, of course, that's why I spent my whole life working on AI and working towards AGI. But I suspect there are many ways to build the architecture safely, robustly, reliably and in an understandable way. And I think there are almost certainly going to be ways of building architectures that are unsafe or risky in some form. So I see a sort of, a kind of bottleneck that we have to get humanity through, which is building safe architectures as the first types of AGI systems. And then after that, we can have a sort of, a flourishing of many different types of systems that are perhaps sharded off those safe architectures that ideally have some mathematical guarantees, or at least some practical guarantees, around what they do.
CA: Do governments have an essential role here to define what a level playing field looks like and what is absolutely taboo?
DH: Yeah, I think it's not just about -- actually I think government and civil society and academia and all parts of society have a critical role to play here to shape, along with industry labs, what that should look like as we get closer to AGI, and the cooperation needed and the collaboration needed, to prevent that kind of runaway race dynamic happening.
CA: OK, well, it sounds like you remain optimistic. What's this image here?
DH: That's one of my favorite images, actually. I call it, like, the tree of all knowledge. So, you know, we've been talking a lot about science, and a lot of science can be boiled down to, if you imagine all the knowledge that exists in the world as a tree of knowledge, then maybe what we know today as a civilization is some, you know, small subset of that. And I see AI as this tool that allows us, as scientists, to explore, potentially, the entire tree one day.

And we have this idea of root node problems, like AlphaFold, the protein-folding problem, where if you could crack them, it unlocks an entire new branch of discovery or new research. And that's what we try and focus on at DeepMind and Google DeepMind, to crack those. And if we get this right, then I think we could be, you know, in this incredible new era of radical abundance, curing all diseases, spreading consciousness to the stars. You know, maximum human flourishing.
CA: We're out of time, but what's the last example of, like, in your dreams, this dream question that you think there is a shot that in your lifetime AI might take us to?
DH: I mean, once AGI is built, what I'd like to use it for is to try and use it to understand the fundamental nature of reality. So do experiments at the Planck scale. You know, the smallest possible scale, theoretical scale, which is almost like the resolution of reality.
CA: You know, I was brought up religious. And in the Bible, there's a story about the tree of knowledge that doesn't work out very well.

(Laughter)

Is there any scenario where we discover knowledge that the universe says, "Humans, you may not know that."
DH: Potentially. I mean, there might be some unknowable things. But I think scientific method is the greatest sort of invention humans have ever come up with. You know, the enlightenment and scientific discovery. That's what's built this incredible modern civilization around us and all the tools that we use. So I think it's the best technique we have for understanding the enormity of the universe around us.
CA: Well, Demis, you've already changed the world. I think probably everyone here will be cheering you on in your efforts to ensure that we continue to accelerate in the right direction.

DH: Thank you.

CA: Demis Hassabis.

(Applause)