The Transformative Potential of AGI — and When It Might Arrive | Shane Legg and Chris Anderson | TED

208,779 views ・ 2023-12-07 ・ TED

00:04
Chris Anderson: Shane, give us a snapshot of you growing up and what on Earth led you to get interested in artificial intelligence?

00:10
Shane Legg: Well, I got my first home computer on my 10th birthday, and I -- this was before the internet and everything. So you couldn't just go and surf the web and so on. You had to actually make stuff yourself and program. And so I started programming, and I discovered that in this computer there was a world: I could create a world, I could create little agents that would run around and chase each other and do things and so on. And I could sort of bring this whole universe to life. And there was that spark of creativity that really captivated me, and I think that was really the seed of my interest that later grew into an interest in artificial intelligence.

00:49
CA: Because in your standard education, you had some challenges there.

00:53
SL: Yeah, I was dyslexic as a child. And so they were actually going to hold me back a year when I was 10 years old, and they sent me off to get my IQ tested to, you know, assess how bad the problem was. And they discovered I had an exceptionally high IQ, and then they were a little bit confused about what was going on. Fortunately, at that time, there was somebody in the town I lived in who knew how to test for dyslexia. And it turns out I wasn't actually of limited intelligence. I was dyslexic, and that was the issue.

01:25
CA: You had reason from an early age to believe that our standard assumptions about intelligence might be off a bit.

01:31
SL: Well, I had reason, from an early age, to sometimes doubt authority.

(Laughter)

You know, if the teacher thinks you're dumb, maybe it's not true. Maybe there are other things going on. But having that experience as a child, I think, also created in me an interest in intelligence.

01:49
CA: So you're credited by many as coining the term "artificial general intelligence," AGI. Tell us about 2001, how that happened.

01:57
SL: Yeah, so I was approached by someone called Ben Goertzel, who I'd actually been working with, and he was going to write a book. He was thinking about a book on AI systems that would be much more general and capable, rather than focusing on very narrow things. And he was thinking about a title for the book. So I suggested to him, "If you're interested in very general systems, call it artificial general intelligence." And so he went with that. And then he and various other people started using the term online, on the internet, and it sort of became popularized from there. We later discovered there was someone called Mike Garrod, who published a paper in a security nanotech journal in '97. So he is actually the first person to have used the term. But it turns out he pretty much meant the same thing as us anyway.

02:41
CA: It was kind of an idea whose time had come, to recognize the potential here. I mean, you made an early prediction that many people thought was bonkers. What was that?

02:51
SL: Well, in about 2001, around the time I suggested this term artificial general intelligence, I read a book by Ray Kurzweil, actually, "The Age of Spiritual Machines," and I concluded that he was fundamentally right: that computation was likely to grow exponentially for at least a few decades, and the amount of data in the world would grow exponentially for a few decades. And so I figured that if that was going to happen, then the value of extremely scalable algorithms that could harness all this data and computation was going to be very high. And then I also figured that in the mid-2020s it would be possible, if we had these highly scalable algorithms, to train artificial intelligence systems on far more data than a human would experience in a lifetime. And so as a result of that -- you can find it on my blog from about 2009, I think that's the first time I publicly talked about it -- I predicted a 50 percent chance of AGI by 2028. I still believe that today.

03:58
CA: That's still your date. How did you define AGI back then, and has your definition changed?

04:04
SL: Yeah, I didn't have a particularly precise definition at the beginning. It was really just an idea of systems that would be far more general. So rather than just playing Go or chess or something, they would actually be able to do many, many different things. The definition I use now is that it's a system that can do all the cognitive kinds of tasks that people can do, possibly more, but at least it can do the sorts of cognitive tasks that people can typically do.

04:31
CA: So talk about just the founding of DeepMind and the interplay between you and your cofounders.

04:38
SL: Right. So I went to London, to a place called the Gatsby Unit, which studies theoretical neuroscience and machine learning. And I was interested in learning the relationships between what we understand about the brain and what we know from machine learning. So that seemed like a really good place. And I met Demis Hassabis there. He had the same postdoc supervisor as me, and we got talking. And he convinced me that it was the time to start a company. That was in 2009, when we started talking. And I was a little bit skeptical. I thought AGI was still a bit too far away, but he thought the time was right, so we decided to go for it. And then a friend of his was Mustafa Suleyman.

05:20
CA: And specifically, one of the goals of the company was to find a pathway to AGI?

05:25
SL: Absolutely. Our first business plan, which we were circulating when we were looking for investors in 2010, had one sentence on the front cover, and it said: "Build the world's first artificial general intelligence." So that was right in from the beginning.

05:42
CA: Even though you knew that building that AGI might actually have apocalyptic consequences in some scenarios?

05:50
SL: Yeah. So it's a deeply transformative technology. I believe it will happen. I think that, you know, these algorithms can be understood, and they will be understood in time. And I think that intelligence is fundamentally an incredibly valuable thing. Everything around us at the moment -- the building we're in, the words I'm using, the concepts we have, the technology around us -- you know, all of these things are being affected by intelligence. So having intelligence in machines is an incredibly valuable thing to develop. And so I believe it is coming. Now, when a very, very powerful technology arises, there can be a range of different outcomes. Things could go very, very well, but there is a possibility things can go badly as well. And that was something I was also aware of from about 20 years ago.

06:46
CA: So talk about, as DeepMind developed, was there a moment where you really felt, "My goodness, we're onto something unbelievably powerful"? Like, was it AlphaGo, that whole story, or what was the moment for you?

07:01
SL: Yeah, there were many moments over the years.
145
421918
2502
SL:是的,这些年来有很多时刻。
07:04
One was when we did the Atari games.
146
424420
2211
一个是我们做 Atari 游戏的时候。
07:07
Have you seen those videos
147
427131
1252
你有没有看过那些视频,
07:08
where we had an algorithm that could learn to play multiple games
148
428424
3420
我们有一个算法, 可以学会玩多款游戏,
07:11
without being programmed for any specific game?
149
431844
2253
而无需为其中 任意一款游戏专门编程?
07:14
There were some exciting moments there.
150
434138
3295
那时有一些激动人心的时刻。
07:17
Go, of course, was a really exciting moment.
151
437433
2586
AlphaGo 当然也是 一个激动人心的时刻。
07:20
But I think the thing that's really captured my imagination,
152
440937
3754
但我认为,真正吸引我的想象力、
07:24
a lot of people's imagination,
153
444732
1460
很多人的想象力的是
07:26
is the phenomenal scaling of language models in recent years.
154
446192
3253
近年来语言模型的惊人扩展。
07:29
I think we can see they're systems
155
449821
1960
我们可以看出,它们这些系统
07:31
that really can start to do some meaningful fraction
156
451781
3670
确实可以开始完成一部分
07:35
of the cognitive tasks that people can do.
157
455493
2377
人类可以完成的有意义的认知任务。
07:37
CA: Now, you were working on those models, but were you, to some extent, blindsided by OpenAI's, sort of, sudden unveiling of ChatGPT?

07:47
SL: Right. We were working on them, and, you know, the transformer model was invented at Google, and we had teams who were building big transformer language models and so on.

07:58
CA: Google acquired DeepMind at some point in this journey.

08:01
SL: Yeah, exactly. And so what I didn't expect was just how good a model could get by training purely on text. I thought you would need more multimodality -- you'd need images, you'd need sound, you'd need video and things like that. But due to the absolutely vast quantities of text, it can sort of compensate for these things to an extent. I still think you see aspects of this: I think language models tend to be weak in areas that are not easily expressed in text. But I don't think this is a fundamental limitation. I think we're going to see these language models expanding into video and images and sound and all these things, and these things will be overcome in time.

08:46
CA: So talk to us, Shane, about the things that you, at this moment, passionately feel that the world needs to be thinking about more cogently.

08:55
SL: Right. So I think that very, very powerful, very intelligent artificial intelligence is coming. I think that this is very, very likely. I don't think it's coming today. I don't think it's coming next year or the year after. It's probably a little bit further out than that.

09:12
CA: 2028?

09:14
SL: 2028, that's a 50 percent chance. So, you know, if it doesn't happen in 2028, I'm not going to be surprised, obviously.

09:20
CA: And when you say powerful, I mean, there's already powerful AI out there. But you're saying basically a version of artificial general intelligence is coming.

09:28
SL: Yeah.

09:29
CA: So give us a picture of what that could look like.

09:31
SL: Well, if you had an artificial general intelligence, you could do all sorts of amazing things, just like human intelligence is able to do many, many amazing things. So it's not really about a specific thing; that's the whole point of the generality. But to give you one example, we developed the system AlphaFold, which will take a protein and compute, basically, the shape of that protein. And that enables you to do all sorts of research into understanding biological processes, developing medicines and all kinds of things like that. Now, if you had an AGI system, instead of requiring what we had at DeepMind -- roughly 30 world-class scientists working for about three years to develop that -- maybe you could develop it with just a team of a handful of scientists in one year. So imagine these sort of AlphaFold-level developments taking place around the world on a regular basis. This is the sort of thing that AGI could enable.

10:25
CA: So within months of AGI being with us, so to speak, it's quite possible that some of the scientific challenges that humans have wrestled with for decades, centuries if you like, will start to tumble in rapid succession.

10:40
SL: Yeah, I think it'll open up all sorts of amazing possibilities. And it could really be a golden age of humanity, where human intelligence, aided and extended with machine intelligence, enables us to do all sorts of fantastic things and solve problems that previously were just intractable.

11:02
CA: So let's come back to that. But I think you also -- you're not, like, an irredeemable optimist only; you see a potential for it to go very badly in a different direction. Talk about what that pathway could look like.

11:14
SL: Well, yeah, I want to explain. I don't believe the people who are sure that it's going to go very well, and I don't believe the people who are sure that it's going to go very, very badly. Because what we're talking about is an incredibly profound transition. It's like the arrival of human intelligence in the world; this is another intelligence arriving in the world. And so it is an incredibly deep transition, and we do not fully understand all the implications and consequences of this. And so we can't be certain that it's going to be this, that or the other thing. So we have to be open-minded about what may happen. I have some optimism, because I think that if you want to make a system safe, you need to understand a lot about that system. You can't make an airplane safe if you don't know how airplanes work. So as we get closer to AGI, we will understand more and more about these systems, and we will see more ways to make these systems safe, to make highly ethical AI systems. But there are many things we don't understand about the future. So I have to accept that there is a possibility that things may go badly, because I don't know what's going to happen. I can't know that about the future in such a big change. And even if the probability of something going badly is quite small, we should take this extremely seriously.

12:36
CA: Paint a scenario of what going bad could look like.

12:39
SL: Well, it's hard to do, because you're talking about systems that potentially have superhuman intelligence, right? So there are many ways in which things could go bad in the world. People sometimes point to, I don't know, engineered pathogens, right? Maybe a superintelligence could design an engineered pathogen. It could be much more mundane things. Maybe with AGI, you know, it gets used to destabilize democracy in the world, with, you know, propaganda or all sorts of other things like that. We don't know --

13:12
CA: That one might already have happened.

13:14
SL: That might be happening a bit already. But, you know, there may be a lot more of this if we have more powerful systems. So there are many ways in which societies can be destabilized, and you can see that in the history books.

13:24
CA: I mean, Shane, if you could have asked all humans, say, 15 years ago: OK, we can open a door here, and opening this door could lead to the best-ever outcomes for humanity. But there's also a meaningful chance, let's say it's more than five percent, that we could actually destroy our civilization. Isn't there a chance that most people would have actually said, "Don't you dare open that damn door. Let's wait"?

13:50
SL: If I had a magic wand and I could slow things down, I would use that magic wand, but I don't. There are dozens of companies -- well, there are probably 10 companies in the world now that can develop the most cutting-edge models, including, I think, some national intelligence services who have secret projects doing this. And then there are, I don't know, dozens of companies that can develop something that's a generation behind. And remember, intelligence is incredibly valuable. It's incredibly useful. We're doing this because we can see all kinds of value that can be created in it, for all sorts of reasons. How do you stop this process? I don't see any realistic plan that I've heard of for stopping this process. Maybe we can -- I think we should think about regulating things. I think we should do things like this as we do with every powerful technology. There's nothing special about AI here. People talk about, oh, you know, how dare you talk about regulating this? No, we regulate powerful technologies all the time, in the interests of society. And I think this is a very important thing that we should be looking at.

14:52
CA: It's kind of the first time we have this superpowerful technology out there that we literally don't understand in full how it works. Is the single most important thing we must do, in your view, to understand -- to understand better what on Earth is going on -- so that we at least have a shot at pointing it in the right direction?

15:11
SL: There is a lot of energy behind capabilities at the moment, because there's a lot of value in developing the capabilities. I think we need to see a lot more energy going into actual science: understanding how these networks work, what they're doing, what they're not doing, where they're likely to fail, and understanding how we put these things together so that we're able to find ways to make these AGI systems profoundly ethical and safe. I believe it's possible. But we need to mobilize more people's minds and brains to finding these solutions, so that we end up in a future where we have incredibly powerful machine intelligence that's also profoundly ethical and safe, and it enhances and extends humanity into, I think, a new golden period for humanity.

16:05
CA: Shane, thank you so much for sharing that vision and coming to TED.

16:09
(Applause)