The Inside Story of ChatGPT’s Astonishing Potential | Greg Brockman | TED
1,805,277 views ・ 2023-04-20
Translator: Yip Yan Yeung
Reviewer: Grace Man
00:03
We started OpenAI seven years ago
00:06
because we felt like something
really interesting was happening in AI
00:10
and we wanted to help steer it
in a positive direction.
00:15
It's honestly just really amazing to see
00:17
how far this whole field
has come since then.
00:20
And it's really gratifying to hear
from people like Raymond
00:24
who are using the technology
we are building, and others,
00:26
for so many wonderful things.
00:29
We hear from people who are excited,
00:31
we hear from people who are concerned,
00:33
we hear from people who feel
both those emotions at once.
00:36
And honestly, that's how we feel.
00:40
Above all, it feels like we're entering
an historic period right now
00:44
where we as a world
are going to define a technology
00:48
that will be so important
for our society going forward.
00:52
And I believe that we can
manage this for good.
00:56
So today, I want to show you
the current state of that technology
01:01
and some of the underlying
design principles that we hold dear.
01:09
So the first thing I'm going to show you
01:11
is what it's like to build
a tool for an AI
01:14
rather than building it for a human.
01:17
So we have a new DALL-E model,
which generates images,
01:21
and we are exposing it as an app
for ChatGPT to use on your behalf.
01:25
And you can do things like ask, you know,
01:27
suggest a nice post-TED meal
and draw a picture of it.
01:35
(Laughter)
01:38
Now you get all of the, sort of,
ideation and creative back-and-forth
01:43
and taking care of the details for you
that you get out of ChatGPT.
01:47
And here we go, it's not just
the idea for the meal,
01:49
but a very, very detailed spread.
01:54
So let's see what we're going to get.
01:56
But ChatGPT doesn't just generate
images in this case --
01:59
sorry, it doesn't just generate text,
it also generates an image.
02:02
And that is something
that really expands the power
02:05
of what it can do on your behalf
in terms of carrying out your intent.
02:08
And I'll point out,
this is all a live demo.
02:10
This is all generated
by the AI as we speak.
02:13
So I actually don't even know
what we're going to see.
02:16
This looks wonderful.
02:18
(Applause)
02:22
I'm getting hungry just looking at it.
02:24
Now we've extended ChatGPT
with other tools too,
02:27
for example, memory.
02:28
You can say "save this for later."
02:33
And the interesting thing
about these tools
02:35
is they're very inspectable.
02:36
So you get this little pop up here
that says "use the DALL-E app."
02:39
And by the way, this is coming to you, all
ChatGPT users, over upcoming months.
02:43
And you can look under the hood
and see that what it actually did
02:46
was write a prompt
just like a human could.
02:48
And so you sort of have
this ability to inspect
02:51
how the machine is using these tools,
02:53
which allows us to provide
feedback to them.
02:55
Now it's saved for later,
02:56
and let me show you
what it's like to use that information
02:59
and to integrate
with other applications too.
03:02
You can say,
03:04
“Now make a shopping list
for the tasty thing
03:10
I was suggesting earlier.”
03:12
And make it a little tricky for the AI.
03:16
"And tweet it out for all
the TED viewers out there."
03:20
(Laughter)
03:22
So if you do make this wonderful,
wonderful meal,
03:25
I definitely want to know how it tastes.
03:28
But you can see that ChatGPT
is selecting all these different tools
03:32
without me having to tell it explicitly
which ones to use in any situation.
03:37
And this, I think, shows a new way
of thinking about the user interface.
03:40
Like, we are so used to thinking of,
well, we have these apps,
03:44
we click between them,
we copy/paste between them,
03:47
and usually it's a great
experience within an app
03:49
as long as you kind of know
the menus and know all the options.
03:52
Yes, I would like you to.
03:53
Yes, please.
03:54
Always good to be polite.
03:56
(Laughter)
04:00
And by having this unified
language interface on top of tools,
04:05
the AI is able to sort of take away
all those details from you.
04:10
So you don't have to be the one
04:12
who spells out every single
sort of little piece
04:14
of what's supposed to happen.
04:16
And as I said, this is a live demo,
04:18
so sometimes the unexpected
will happen to us.
04:21
But let's take a look at the Instacart
shopping list while we're at it.
04:25
And you can see we sent a list
of ingredients to Instacart.
04:29
Here's everything you need.
04:30
And the thing that's really interesting
04:32
is that the traditional UI
is still very valuable, right?
04:35
If you look at this,
04:37
you still can click through it
and sort of modify the actual quantities.
04:41
And that's something that I think shows
04:43
that they're not going away,
traditional UIs.
04:47
It's just we have a new,
augmented way to build them.
04:49
And now we have a tweet
that's been drafted for our review,
04:52
which is also a very important thing.
04:54
We can click “run,” and there we are,
we’re the manager, we’re able to inspect,
04:58
we're able to change the work
of the AI if we want to.
05:02
And so after this talk,
you will be able to access this yourself.
05:17
And there we go.
05:19
Cool.
05:22
Thank you, everyone.
05:23
(Applause)
05:29
So we’ll cut back to the slides.
05:32
Now, the important thing
about how we build this,
05:36
it's not just about building these tools.
05:38
It's about teaching
the AI how to use them.
05:41
Like, what do we even want it to do
05:42
when we ask these very
high-level questions?
05:45
And to do this, we use an old idea.
05:48
If you go back to Alan Turing's 1950 paper
on the Turing test, he says,
05:51
you'll never program an answer to this.
05:53
Instead, you can learn it.
05:55
You could build a machine,
like a human child,
05:57
and then teach it through feedback.
05:59
Have a human teacher who provides
rewards and punishments
06:02
as it tries things out and does things
that are either good or bad.
06:06
And this is exactly how we train ChatGPT.
06:08
It's a two-step process.
06:09
First, we produce what Turing
would have called a child machine
06:12
through an unsupervised learning process.
06:14
We just show it the whole world,
the whole internet
06:16
and say, “Predict what comes next
in text you’ve never seen before.”
06:20
And this process imbues it
with all sorts of wonderful skills.
06:23
For example, if you're shown
a math problem,
06:25
the only way to actually
complete that math problem,
06:27
to say what comes next,
06:29
that green nine up there,
06:30
is to actually solve the math problem.
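That "predict what comes next" objective can be made concrete with a toy sketch. This character-level bigram counter, trained on a made-up two-sentence corpus (both are assumptions for illustration, a drastic simplification of the real training), shows how a predictor falls out of completion statistics alone:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "the whole internet" -- an assumption
# made purely to illustrate the next-token objective.
corpus = "one plus one equals two. two plus two equals four."

# For each character, count which character follows it in the corpus.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(char):
    """Return the character most often seen after `char` in training."""
    return following[char].most_common(1)[0][0]

print(predict_next("q"))  # 'u' -- learned that 'u' follows 'q'
```

Scaled up from characters over two sentences to tokens over the internet, this same objective is what pushes the model to pick up skills like arithmetic along the way.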
06:34
But we actually have to do
a second step, too,
06:36
which is to teach the AI
what to do with those skills.
06:39
And for this, we provide feedback.
06:40
We have the AI try out multiple things,
give us multiple suggestions,
06:44
and then a human rates them, says
“This one’s better than that one.”
06:47
And this reinforces not just the specific
thing that the AI said,
06:50
but very importantly, the whole process
that the AI used to produce that answer.
06:54
And this allows it to generalize.
06:55
It allows it to teach,
to sort of infer your intent
06:58
and apply it in scenarios
that it hasn't seen before,
07:00
that it hasn't received feedback on.
07:02
Now, sometimes the things
we have to teach the AI
07:05
are not what you'd expect.
07:06
For example, when we first showed
GPT-4 to Khan Academy,
07:09
they said, "Wow, this is so great.
07:11
We're going to be able to teach
students wonderful things.
07:14
Only one problem, it doesn't
double-check students' math.
07:17
If there's some bad math in there,
07:19
it will happily pretend that one plus one
equals three and run with it."
07:23
So we had to collect some feedback data.
07:25
Sal Khan himself was very kind
07:27
and offered 20 hours of his own time
to provide feedback to the machine
07:30
alongside our team.
07:32
And over the course of a couple of months
we were able to teach the AI that,
07:35
"Hey, you really should
push back on humans
07:37
in this specific kind of scenario."
07:41
And we've actually made lots and lots
of improvements to the models this way.
07:46
And when you push
that thumbs down in ChatGPT,
07:48
that actually is kind of like sending up
a bat signal to our team to say,
07:52
“Here’s an area of weakness
where you should gather feedback.”
07:55
And so when you do that,
07:56
that's one way that we really
listen to our users
07:58
and make sure we're building something
that's more useful for everyone.
08:02
Now, providing high-quality
feedback is a hard thing.
08:07
If you think about asking a kid
to clean their room,
08:09
if all you're doing
is inspecting the floor,
08:12
you don't know if you're just teaching
them to stuff all the toys in the closet.
08:15
This is a nice DALL-E-generated
image, by the way.
08:19
And the same sort
of reasoning applies to AI.
08:24
As we move to harder tasks,
08:26
we will have to scale our ability
to provide high-quality feedback.
08:30
But for this, the AI itself
is happy to help.
08:34
It's happy to help us provide
even better feedback
08:37
and to scale our ability to supervise
the machine as time goes on.
08:40
And let me show you what I mean.
08:42
For example, you can ask GPT-4
a question like this,
08:47
of how much time passed
between these two foundational blogs
08:50
on unsupervised learning
08:52
and learning from human feedback.
08:54
And the model says two months passed.
08:57
But is it true?
08:58
Like, these models
are not 100-percent reliable,
09:00
although they’re getting better
every time we provide some feedback.
09:04
But we can actually use
the AI to fact-check.
09:07
And it can actually check its own work.
09:09
You can say, fact-check this for me.
09:12
Now, in this case, I've actually
given the AI a new tool.
09:16
This one is a browsing tool
09:18
where the model can issue search queries
and click into web pages.
09:22
And it actually writes out
its whole chain of thought as it does it.
09:25
It says, I’m just going to search for this
and it actually does the search.
09:28
It then finds the publication date
in the search results.
09:32
It then is issuing another search query.
09:33
It's going to click into the blog post.
09:35
And all of this you could do,
but it’s a very tedious task.
09:38
It's not a thing
that humans really want to do.
09:40
It's much more fun
to be in the driver's seat,
09:43
to be in this manager's position
where you can, if you want,
09:45
triple-check the work.
09:47
And out come citations
09:48
so you can actually go
09:49
and very easily verify any piece
of this whole chain of reasoning.
09:53
And it actually turns out
two months was wrong.
09:55
Two months and one week,
09:58
that was correct.
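The arithmetic behind that answer is easy to check by hand. In this sketch the two publication dates are assumptions chosen purely for illustration; the point is the "whole months plus leftover days" reasoning that turns a date gap into a phrase like "two months and one week":

```python
from datetime import date

# Publication dates for the two blog posts are assumed here for
# illustration; only the month-gap arithmetic is the point.
unsupervised_post = date(2017, 4, 6)
feedback_post = date(2017, 6, 13)

# Step forward one calendar month at a time, then count leftover days,
# mirroring how a human would phrase the gap.
months = 0
cursor = unsupervised_post
while True:
    next_month = cursor.month % 12 + 1
    next_year = cursor.year + (1 if cursor.month == 12 else 0)
    candidate = cursor.replace(year=next_year, month=next_month)
    if candidate > feedback_post:
        break
    cursor = candidate
    months += 1
leftover_days = (feedback_post - cursor).days
print(f"{months} months and {leftover_days} days")  # 2 months and 7 days
```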
10:00
(Applause)
10:07
And we'll cut back to the slide.
10:09
And so the thing that's so interesting to me
about this whole process
10:13
is that it’s this many-step collaboration
between a human and an AI.
10:17
Because a human, using
this fact-checking tool
10:19
is doing it in order to produce data
10:21
for another AI to become
more useful to a human.
10:25
And I think this really shows
the shape of something
10:28
that we should expect to be
much more common in the future,
10:31
where we have humans
and machines kind of very carefully
10:33
and delicately designed
in how they fit into a problem
10:37
and how we want
to solve that problem.
10:39
We make sure that the humans are providing
the management, the oversight,
10:42
the feedback,
10:44
and the machines are operating
in a way that's inspectable
10:46
and trustworthy.
10:47
And together we're able to actually create
even more trustworthy machines.
10:51
And I think that over time,
if we get this process right,
10:54
we will be able to solve
impossible problems.
10:56
And to give you a sense
of just how impossible I'm talking,
11:00
I think we're going to be able
to rethink almost every aspect
11:03
of how we interact with computers.
11:05
For example, think about spreadsheets.
11:08
They've been around in some form since,
we'll say, 40 years ago with VisiCalc.
11:12
I don't think they've really
changed that much in that time.
11:16
And here is a specific spreadsheet
of all the AI papers on the arXiv
11:22
for the past 30 years.
11:23
There's about 167,000 of them.
11:25
And you can see the data right here.
11:28
But let me show you the ChatGPT take
on how to analyze a data set like this.
11:37
So we can give ChatGPT
access to yet another tool,
11:41
this one a Python interpreter,
11:42
so it’s able to run code,
just like a data scientist would.
11:46
And so you can just
literally upload a file
11:48
and ask questions about it.
11:50
And very helpfully, you know, it knows
the name of the file and it's like,
11:53
"Oh, this is CSV,"
comma-separated value file,
11:56
"I'll parse it for you."
11:57
The only information here
is the name of the file,
12:00
the column names like you saw
and then the actual data.
12:04
And from that it's able to infer
what these columns actually mean.
12:08
Like, that semantic information
wasn't in there.
12:11
It has to sort of, put together
its world knowledge of knowing that,
12:14
“Oh yeah, arXiv is a site
that people submit papers
12:16
and therefore that's what these things are
and that these are integer values
12:20
and so therefore it's a number
of authors in the paper,"
12:23
like all of that, that’s work
for a human to do,
12:25
and the AI is happy to help with it.
12:27
Now I don't even know what I want to ask.
12:29
So fortunately, you can ask the machine,
12:32
"Can you make some exploratory graphs?"
12:37
And once again, this is a super high-level
instruction with lots of intent behind it.
12:41
But I don't even know what I want.
12:43
And the AI kind of has to infer
what I might be interested in.
12:46
And so it comes up
with some good ideas, I think.
12:48
So a histogram of the number
of authors per paper,
12:50
time series of papers per year,
word cloud of the paper titles.
12:53
All of that, I think,
will be pretty interesting to see.
12:56
And the great thing is,
it can actually do it.
12:58
Here we go, a nice bell curve.
13:00
You see that three
is kind of the most common.
13:02
It's going to then make this nice plot
of the papers per year.
13:08
Something crazy
is happening in 2023, though.
13:10
Looks like we were on an exponential
and it dropped off the cliff.
13:13
What could be going on there?
13:14
By the way, all this
is Python code, you can inspect.
13:17
And then we'll see word cloud.
13:19
So you can see all these wonderful things
that appear in these titles.
13:23
But I'm pretty unhappy
about this 2023 thing.
13:25
It makes this year look really bad.
13:27
Of course, the problem is
that the year is not over.
13:30
So I'm going to push back on the machine.
13:33
[Waitttt that's not fair!!!
13:34
2023 isn't over.
13:38
What percentage of papers in 2022
were even posted by April 13?]
13:44
So April 13 was the cut-off
date I believe.
13:47
Can you use that to make
a fair projection?
13:54
So we'll see, this is
the kind of ambitious one.
13:57
(Laughter)
13:59
So you know,
14:01
again, I feel like there was more I wanted
out of the machine here.
14:05
I really wanted it to notice this thing,
14:07
maybe it's a little bit
of an overreach for it
14:10
to have sort of, inferred magically
that this is what I wanted.
14:14
But I inject my intent,
14:15
I provide this additional piece
of, you know, guidance.
14:20
And under the hood,
14:21
the AI is just writing code again,
so if you want to inspect what it's doing,
14:25
it's very possible.
14:26
And now, it does the correct projection.
14:30
(Applause)
14:35
If you noticed, it even updates the title.
14:37
I didn't ask for that,
but it knew what I wanted.
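The exploratory pass from this demo can be sketched in plain Python. The tiny inline dataset and its column names are assumptions standing in for the real arXiv export; the last step mirrors the "scale the partial year by the fraction of the year observed" fix from the talk:

```python
import csv
import io
from collections import Counter

# A tiny synthetic stand-in for the arXiv export in the demo; the
# column names and rows are assumptions, not the real dataset.
raw = """title,year,num_authors
Paper A,2021,3
Paper B,2022,3
Paper C,2022,8
Paper D,2023,3
Paper E,2023,4
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Histogram of authors per paper.
author_hist = Counter(int(r["num_authors"]) for r in rows)

# Papers per year.
per_year = Counter(int(r["year"]) for r in rows)

# "Fair projection" for the partial year: scale the observed count
# by the fraction of the year elapsed (April 13 is day 103 of 365).
fraction_observed = 103 / 365
projected_2023 = per_year[2023] / fraction_observed

print(author_hist.most_common(1)[0])  # (3, 3): three authors most common
print(round(projected_2023, 1))
```

The point is not the code itself but that ChatGPT wrote and ran the equivalent of it from a one-line, high-level request.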
14:41
Now we'll cut back to the slide again.
14:45
This slide shows a parable
of how I think we ...
14:51
A vision of how we may end up
using this technology in the future.
14:54
A person brought
his very sick dog to the vet,
14:58
and the veterinarian made a bad call
to say, “Let’s just wait and see.”
15:01
And the dog would not
be here today had he listened.
15:05
In the meanwhile,
he provided the blood test,
15:07
like, the full medical records, to GPT-4,
15:10
which said, "I am not a vet,
you need to talk to a professional,
15:13
here are some hypotheses."
15:15
He brought that information
to a second vet
15:17
who used it to save the dog's life.
15:21
Now, these systems, they're not perfect.
15:23
You cannot overly rely on them.
15:25
But this story, I think, shows
15:29
that a human with a medical professional
15:32
and with ChatGPT
as a brainstorming partner
15:35
was able to achieve an outcome
that would not have happened otherwise.
15:38
I think this is something
we should all reflect on,
15:40
think about as we consider
how to integrate these systems
15:43
into our world.
15:44
And one thing I believe really deeply,
15:46
is that getting AI right is going
to require participation from everyone.
15:50
And that's for deciding
how we want it to slot in,
15:53
that's for setting the rules of the road,
15:55
for what an AI will and won't do.
15:57
And if there's one thing
to take away from this talk,
15:59
it's that this technology
just looks different.
16:02
Just different from anything
people had anticipated.
16:04
And so we all have to become literate.
16:06
And that's, honestly, one
of the reasons we released ChatGPT.
16:09
Together, I believe that we can
achieve the OpenAI mission
16:12
of ensuring that artificial
general intelligence
16:14
benefits all of humanity.
16:16
Thank you.
16:18
(Applause)
16:33
(Applause ends)
16:34
Chris Anderson: Greg.
16:36
Wow.
16:37
I mean ...
16:39
I suspect that within every mind out here
16:43
there's a feeling of reeling.
16:46
Like, I suspect that a very large
number of people viewing this,
16:49
you look at that and you think,
“Oh my goodness,
16:51
pretty much every single thing
about the way I work, I need to rethink."
16:55
Like, there's just
new possibilities there.
16:57
Am I right?
16:58
Who thinks that they're having to rethink
the way that we do things?
17:01
Yeah, I mean, it's amazing,
17:03
but it's also really scary.
17:05
So let's talk, Greg, let's talk.
17:08
I mean, I guess
my first question actually is just
17:10
how the hell have you done this?
17:12
(Laughter)
17:13
OpenAI has a few hundred employees.
17:16
Google has thousands of employees
working on artificial intelligence.
17:21
Why is it you who's come up
with this technology
17:25
that shocked the world?
17:26
Greg Brockman: I mean, the truth is,
17:28
we're all building on shoulders
of giants, right, there's no question.
17:31
If you look at the compute progress,
17:33
the algorithmic progress,
the data progress,
17:35
all of those are really industry-wide.
17:37
But I think within OpenAI,
we made a lot of very deliberate
choices from the early days.
17:41
And the first one was just
to confront reality as it lays.
17:44
And that we just thought
really hard about like:
17:46
What is it going to take
to make progress here?
17:48
We tried a lot of things that didn't work,
so you only see the things that did.
17:52
And I think that the most important thing
has been to get teams of people
17:56
who are very different from each other
to work together harmoniously.
17:59
CA: Can we have the water,
by the way, just brought here?
18:02
I think we're going to need it,
it's a dry-mouth topic.
18:06
But isn't there something also
just about the fact
18:09
that you saw something
in these language models
18:14
that meant that if you continue
to invest in them and grow them,
18:18
that something
at some point might emerge?
18:21
GB: Yes.
18:23
And I think that, I mean, honestly,
18:25
I think the story there
is pretty illustrative, right?
18:28
I think that high level, deep learning,
18:30
like we always knew that was
what we wanted to be,
376
1110481
2335
我们一直知道
这就是我们想要的结果,
就得有个深度学习实验室,
那我们该怎么把它做出来呢?
18:32
was a deep learning lab,
and exactly how to do it?
377
1112858
2419
我觉得早期我们并不知道答案。
18:35
I think that in the early days,
we didn't know.
378
1115277
2211
我们尝试了很多,
18:37
We tried a lot of things,
379
1117529
1210
一位同事训练了一个模型,
18:38
and one person was working
on training a model
380
1118739
2336
用于预测
亚马逊评论中的下一个字母,
18:41
to predict the next character
in Amazon reviews,
381
1121075
2877
18:43
and he got a result where --
this is a syntactic process,
382
1123994
4755
他得出了这样一个结果——
这是个处理句法的过程,
18:48
you expect, you know, the model
will predict where the commas go,
383
1128749
3086
你想要的结果是
模型预测出逗号在什么位置,
18:51
where the nouns and verbs are.
384
1131835
1627
名词和动词在什么位置。
18:53
But he actually got a state-of-the-art
sentiment analysis classifier out of it.
385
1133504
4337
但他其实由此做出了一个
最先进的情感分析分类器。
18:57
This model could tell you
if a review was positive or negative.
386
1137883
2961
这个模型可以判定
这条评论是正面还是负面的。
19:00
I mean, today we are just like,
come on, anyone can do that.
387
1140886
3378
放到现在,我们会觉得,
人人都能做到啊。
19:04
But this was the first time
that you saw this emergence,
388
1144306
3087
但这是你第一次见到这样的东西,
19:07
this sort of semantics that emerged
from this underlying syntactic process.
389
1147434
5005
由底层的句法处理进程
产生的语义。
19:12
And there we knew,
you've got to scale this thing,
390
1152481
2336
由此,我们知道,
必须得扩大这项技术的规模,
19:14
you've got to see where it goes.
391
1154858
1544
必须得见证它未来的发展。
CA: 我认为这有助于解开
19:16
CA: So I think this helps explain
392
1156402
1626
让观众困惑的谜题,
19:18
the riddle that baffles
everyone looking at this,
393
1158028
2544
因为这些模型被称为预测机。
19:20
because these things are described
as prediction machines.
394
1160572
2753
但我们看到它们就感觉……
19:23
And yet, what we're seeing
out of them feels ...
395
1163367
2669
19:26
it just feels impossible that that
could come from a prediction machine.
396
1166036
3420
就感觉预测机怎么可能
可以产生这样的结果。
19:29
Just the stuff you showed us just now.
397
1169456
2378
就是你刚才给我们看的东西。
19:31
And the key idea of emergence
is that when you get more of a thing,
398
1171875
3838
新兴事物的关键就在于
一件东西出现得越多,
19:35
suddenly different things emerge.
399
1175754
1585
突然间就会出现很多别的东西。
19:37
It happens all the time, ant colonies,
single ants run around,
400
1177339
3045
这是个普遍现象,蚁群,
一只蚂蚁跑来跑去,
19:40
when you bring enough of them together,
401
1180384
1877
如果你把足够多的蚂蚁放在一起,
19:42
you get these ant colonies that show
completely emergent, different behavior.
402
1182302
3629
就会出现蚁群,呈现出完全
没有见过的、异于寻常的行为。
19:45
Or a city where a few houses together,
it's just houses together.
403
1185973
3086
一座城市,几栋房子建在一起,
就只是几栋建在一起的房子而已。
19:49
But as you grow the number of houses,
404
1189059
1794
但如果你增加房子的数量,
19:50
things emerge, like suburbs
and cultural centers and traffic jams.
405
1190894
4588
就会出现别的东西,
比如郊区、文化中心、堵车。
19:57
Give me one moment for you
when you saw just something pop
406
1197276
3211
你有没有经历过这么一个瞬间,
眼前突然出现了什么东西,
20:00
that just blew your mind
407
1200529
1668
让你震撼,
20:02
that you just did not see coming.
408
1202197
1627
出乎意料?
20:03
GB: Yeah, well,
409
1203824
1209
GB: 嗯,
20:05
so you can try this in ChatGPT,
if you add 40-digit numbers --
410
1205075
3462
你可以在 ChatGPT 里试一试,
输入 40 位数字……
20:08
CA: 40-digit?
411
1208537
1168
CA: 40 位?
20:09
GB: 40-digit numbers,
the model will do it,
412
1209705
2169
GB: 40 位数,
模型是可以运算的,
20:11
which means it's really learned
an internal circuit for how to do it.
413
1211915
3254
也就是说它已经学会了
运算这些数字的内部逻辑。
20:15
And the really interesting
thing is actually,
414
1215210
2127
很有趣的一点是
20:17
if you have it add like a 40-digit number
plus a 35-digit number,
415
1217337
3212
如果你试试将 40 位数
和 35 位数相加,
20:20
it'll often get it wrong.
416
1220591
1710
它通常会算错。
20:22
And so you can see that it's really
learning the process,
417
1222676
2795
你可以看出,
它在学习这个过程,
20:25
but it hasn't fully generalized, right?
418
1225471
1876
但还没有完全泛化,对吧?
20:27
It's like you can't memorize
the 40-digit addition table,
419
1227389
2711
你不可能记住 40 位加法表,
20:30
that's more atoms
than there are in the universe.
420
1230100
2294
这比宇宙中的原子数都多。
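The impossibility of memorization here can be checked with a quick back-of-envelope computation. This is only a sketch: the ~10^80 figure for atoms in the observable universe is the commonly cited order-of-magnitude estimate, not a number from the talk.

```python
# A lookup table for adding two numbers of up to 40 digits each would need
# one entry per ordered pair of operands.
table_entries = 10**40 * 10**40   # = 10**80 entries

# Commonly cited order-of-magnitude estimate for atoms in the observable
# universe (an assumption for illustration, not a figure from the talk).
ATOMS_IN_UNIVERSE = 10**80

print(f"table entries ~ 10^{len(str(table_entries)) - 1}")
print(table_entries >= ATOMS_IN_UNIVERSE)
```

So a model that reliably adds 40-digit numbers cannot be doing pure recall; it has to have compressed the task into some procedure.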
20:32
So it had to have learned something general,
20:34
but that it hasn't really fully yet learned that,
20:36
Oh, I can sort of generalize this to adding arbitrary numbers
20:39
of arbitrary lengths.
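The distinction Brockman draws, learning the process rather than memorizing a table, is essentially the ordinary grade-school carry algorithm, sketched below. This is purely illustrative of what a fully general, length-independent procedure looks like; it says nothing about how the model actually represents addition internally.

```python
def add_digit_lists(a, b):
    """Grade-school addition over digit lists (most significant digit first).

    A length-independent procedure: it works for any mix of operand lengths,
    unlike a lookup table, which would need ~10^80 entries for 40-digit inputs.
    """
    a, b = a[::-1], b[::-1]        # work least-significant digit first
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        out.append(s % 10)          # digit in this position
        carry = s // 10             # carry into the next position
    if carry:
        out.append(carry)
    return out[::-1]

# A 40-digit number plus a 35-digit number, the case the talk mentions:
x = [9] * 40                        # 10**40 - 1
y = [1] + [0] * 34                  # 10**34
digits = add_digit_lists(x, y)
assert int("".join(map(str, digits))) == (10**40 - 1) + 10**34
```

The model's failure on mixed-length sums suggests it has learned something circuit-like but not yet this fully general version of the procedure.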
20:41
CA: So what's happened here
20:42
is that you've allowed it to scale up
20:44
and look at an incredible number of pieces of text.
20:46
And it is learning things
20:47
that you didn't know that it was going to be capable of learning.
20:51
GB: Well, yeah, and it's more nuanced, too.
20:53
So one science that we're starting to really get good at
20:56
is predicting some of these emergent capabilities.
20:58
And to do that actually,
21:00
one of the things I think is very undersung in this field
21:03
is sort of engineering quality.
21:04
Like, we had to rebuild our entire stack.
21:06
When you think about building a rocket,
21:08
every tolerance has to be incredibly tiny.
21:10
Same is true in machine learning.
21:12
You have to get every single piece of the stack engineered properly,
21:15
and then you can start doing these predictions.
21:17
There are all these incredibly smooth scaling curves.
21:20
They tell you something deeply fundamental about intelligence.
21:23
If you look at our GPT-4 blog post,
21:25
you can see all of these curves in there.
21:26
And now we're starting to be able to predict.
21:29
So we were able to predict, for example, the performance on coding problems.
21:32
We basically look at some models
21:34
that are 10,000 times or 1,000 times smaller.
21:36
And so there's something about this that is actually smooth scaling,
21:40
even though it's still early days.
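The idea of predicting a big run from much smaller ones can be sketched as a power-law extrapolation. This is illustrative only, with made-up numbers, not OpenAI's actual data or methodology: if loss follows a smooth power law in compute, L(C) = a · C^(-b), then a straight-line fit in log-log space over small runs extrapolates to the large run before you train it.

```python
import math

# "True" power law used to fake the measurements (assumed constants).
A, B = 5.0, 0.08
compute = [1e18, 1e19, 1e20]        # three small training runs
loss = [A * c ** (-B) for c in compute]

# Least-squares line through (log C, log L): slope = -b, intercept = log a.
xs = [math.log(c) for c in compute]
ys = [math.log(l) for l in loss]
n = len(xs)
slope = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2
)
intercept = (sum(ys) - slope * sum(xs)) / n

# Extrapolate 10,000x beyond the largest small run.
big_run = 1e24
predicted = math.exp(intercept) * big_run ** slope
actual = A * big_run ** (-B)
print(predicted, actual)            # they agree because the curve is smooth
```

With noisy real measurements the fit would carry error bars, but the talk's point survives: smoothness is what makes extrapolation from 1,000x-10,000x smaller models meaningful at all.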
21:42
CA: So here is, one of the big fears then,
21:45
that arises from this.
21:46
If it's fundamental to what's happening here,
21:48
that as you scale up,
21:49
things emerge that
21:52
you can maybe predict in some level of confidence,
21:56
but it's capable of surprising you.
22:00
Why isn't there just a huge risk of something truly terrible emerging?
22:05
GB: Well, I think all of these are questions of degree
22:07
and scale and timing.
22:09
And I think one thing people miss, too,
22:10
is sort of the integration with the world is also this incredibly emergent,
22:14
sort of, very powerful thing too.
22:16
And so that's one of the reasons that we think it's so important
22:19
to deploy incrementally.
22:20
And so I think that what we kind of see right now, if you look at this talk,
22:24
a lot of what I focus on is providing really high-quality feedback.
22:27
Today, the tasks that we do, you can inspect them, right?
22:30
It's very easy to look at that math problem and be like, no, no, no,
22:33
machine, seven was the correct answer.
22:35
But even summarizing a book, like, that's a hard thing to supervise.
22:38
Like, how do you know if this book summary is any good?
22:40
You have to read the whole book.
22:42
No one wants to do that.
22:43
(Laughter)
22:44
And so I think that the important thing will be that we take this step by step.
22:49
And that we say, OK, as we move on to book summaries,
22:51
we have to supervise this task properly.
22:53
We have to build up a track record with these machines
22:56
that they're able to actually carry out our intent.
22:59
And I think we're going to have to produce even better, more efficient,
23:02
more reliable ways of scaling this,
23:04
sort of like making the machine be aligned with you.
23:07
CA: So we're going to hear later in this session,
23:09
there are critics who say that,
23:10
you know, there's no real understanding inside,
23:15
the system is going to always --
23:17
we're never going to know that it's not generating errors,
23:20
that it doesn't have common sense and so forth.
23:22
Is it your belief, Greg, that it is true at any one moment,
23:26
but that the expansion of the scale and the human feedback
23:30
that you talked about is basically going to take it on that journey
23:35
of actually getting to things like truth and wisdom and so forth,
23:39
with a high degree of confidence.
23:40
Can you be sure of that?
23:42
GB: Yeah, well, I think that the OpenAI, I mean, the short answer is yes,
23:45
I believe that is where we're headed.
23:47
And I think that the OpenAI approach here has always been just like,
23:50
let reality hit you in the face, right?
23:52
It's like this field is the field of broken promises,
23:55
of all these experts saying X is going to happen, Y is how it works.
23:58
People have been saying neural nets aren't going to work for 70 years.
24:01
They haven't been right yet.
24:03
They might be right maybe 70 years plus one
24:05
or something like that is what you need.
24:07
But I think that our approach has always been,
24:09
you've got to push to the limits of this technology
24:11
to really see it in action,
24:13
because that tells you then, oh, here's how we can move on to a new paradigm.
24:16
And we just haven't exhausted the fruit here.
24:18
CA: I mean, it's quite a controversial stance you've taken,
24:21
that the right way to do this is to put it out there in public
24:24
and then harness all this, you know,
24:26
instead of just your team giving feedback,
24:28
the world is now giving feedback.
24:30
But ...
24:33
If, you know, bad things are going to emerge,
24:36
it is out there.
24:38
So, you know, the original story that I heard on OpenAI
24:41
when you were founded as a nonprofit,
24:42
well you were there as the great sort of check on the big companies
24:47
doing their unknown, possibly evil thing with AI.
24:51
And you were going to build models that sort of, you know,
24:56
somehow held them accountable
24:57
and was capable of slowing the field down, if need be.
25:01
Or at least that's kind of what I heard.
25:03
And yet, what's happened, arguably, is the opposite.
25:06
That your release of GPT, especially ChatGPT,
25:12
sent such shockwaves through the tech world
25:14
that now Google and Meta and so forth are all scrambling to catch up.
25:17
And some of their criticisms have been,
25:20
you are forcing us to put this out here without proper guardrails or we die.
25:25
You know, how do you, like,
25:27
make the case that what you have done is responsible here and not reckless.
25:31
GB: Yeah, we think about these questions all the time.
25:34
Like, seriously all the time.
25:36
And I don't think we're always going to get it right.
25:38
But one thing I think has been incredibly important,
25:41
from the very beginning, when we were thinking
25:43
about how to build artificial general intelligence,
25:45
actually have it benefit all of humanity,
25:48
like, how are you supposed to do that, right?
25:50
And that default plan of being, well, you build in secret,
25:52
you get this super powerful thing,
25:54
and then you figure out the safety of it and then you push "go,"
25:57
and you hope you got it right.
25:59
I don't know how to execute that plan.
26:00
Maybe someone else does.
26:02
But for me, that was always terrifying, it didn't feel right.
26:04
And so I think that this alternative approach
26:07
is the only other path that I see,
26:09
which is that you do let reality hit you in the face.
26:11
And I think you do give people time to give input.
26:14
You do have, before these machines are perfect,
26:16
before they are super powerful, that you actually have the ability
26:19
to see them in action.
26:20
And we've seen it from GPT-3, right?
26:22
GPT-3, we really were afraid
26:23
that the number one thing people were going to do with it
26:26
was generate misinformation, try to tip elections.
26:28
Instead, the number one thing was generating Viagra spam.
26:31
(Laughter)
26:36
CA: So Viagra spam is bad, but there are things that are much worse.
26:39
Here's a thought experiment for you.
26:40
Suppose you're sitting in a room,
26:42
there's a box on the table.
26:44
You believe that in that box is something that,
26:47
there's a very strong chance it's something absolutely glorious
26:50
that's going to give beautiful gifts to your family and to everyone.
26:54
But there's actually also a one percent thing in the small print there
26:58
that says: "Pandora."
26:59
And there's a chance
27:01
that this actually could unleash unimaginable evils on the world.
27:06
Do you open that box?
27:08
GB: Well, so, absolutely not.
27:09
I think you don't do it that way.
27:12
And honestly, like, I'll tell you a story that I haven't actually told before,
27:16
which is that shortly after we started OpenAI,
27:18
I remember I was in Puerto Rico for an AI conference.
27:21
I'm sitting in the hotel room just looking out over this wonderful water,
27:24
all these people having a good time.
27:26
And you think about it for a moment,
27:28
if you could choose for basically that Pandora's box
27:32
to be five years away
27:35
or 500 years away,
27:37
which would you pick, right?
27:38
On the one hand you're like, well, maybe for you personally,
27:41
it's better to have it be five years away.
27:43
But if it gets to be 500 years away and people get more time to get it right,
27:47
which do you pick?
27:48
And you know, I just really felt it in the moment.
27:50
I was like, of course you do the 500 years.
27:53
My brother was in the military at the time
27:55
and like, he puts his life on the line in a much more real way
27:58
than any of us typing things in computers
28:00
and developing this technology at the time.
28:03
And so, yeah, I'm really sold on the you've got to approach this right.
28:08
But I don't think that's quite playing the field as it truly lies.
28:11
Like, if you look at the whole history of computing,
28:14
I really mean it when I say that this is an industry-wide
28:18
or even just almost like
28:20
a human-development-of-technology-wide shift.
28:23
And the more that you sort of, don't put together the pieces
28:27
that are there, right,
28:29
we're still making faster computers,
28:31
we're still improving the algorithms, all of these things, they are happening.
28:34
And if you don't put them together, you get an overhang,
28:37
which means that if someone does,
28:39
or the moment that someone does manage to connect to the circuit,
28:42
then you suddenly have this very powerful thing,
28:44
no one's had any time to adjust,
28:46
who knows what kind of safety precautions you get.
28:48
And so I think that one thing I take away
28:50
is like, even you think about development of other sort of technologies,
28:54
think about nuclear weapons,
28:55
people talk about being like a zero to one,
28:57
sort of, change in what humans could do.
29:00
But I actually think that if you look at capability,
29:02
it's been quite smooth over time.
29:04
And so the history, I think, of every technology we've developed
29:07
has been, you've got to do it incrementally
29:10
and you've got to figure out how to manage it
29:12
for each moment that you're increasing it.
29:14
CA: So what I'm hearing is that you ...
29:16
the model you want us to have
29:18
is that we have birthed this extraordinary child
29:21
that may have superpowers
29:24
that take humanity to a whole new place.
29:26
It is our collective responsibility to provide the guardrails
29:31
for this child
29:32
to collectively teach it to be wise and not to tear us all down.
29:37
Is that basically the model?
29:39
GB: I think it's true.
29:40
And I think it's also important to say this may shift, right?
29:43
We've got to take each step as we encounter it.
29:46
And I think it's incredibly important today
29:48
that we all do get literate in this technology,
29:51
figure out how to provide the feedback,
29:53
decide what we want from it.
29:54
And my hope is that that will continue to be the best path,
29:58
but it's so good we're honestly having this debate
30:00
because we wouldn't otherwise if it weren't out there.
30:03
CA: Greg Brockman, thank you so much for coming to TED and blowing our minds.
30:07
(Applause)