The Inside Story of ChatGPT’s Astonishing Potential | Greg Brockman | TED
1,798,027 views ・ 2023-04-20
Translator: Lilian Chiu
Reviewer: Helen Chang
00:03
We started OpenAI seven years ago because we felt like something really interesting was happening in AI, and we wanted to help steer it in a positive direction. It's honestly just really amazing to see how far this whole field has come since then. And it's really gratifying to hear from people like Raymond who are using the technology we are building, and others, for so many wonderful things. We hear from people who are excited, we hear from people who are concerned, we hear from people who feel both those emotions at once. And honestly, that's how we feel. Above all, it feels like we're entering an historic period right now where we as a world are going to define a technology that will be so important for our society going forward. And I believe that we can manage this for good.

00:56
So today, I want to show you the current state of that technology and some of the underlying design principles that we hold dear.
01:09
So the first thing I'm going to show you is what it's like to build a tool for an AI rather than building it for a human. So we have a new DALL-E model, which generates images, and we are exposing it as an app for ChatGPT to use on your behalf. And you can do things like ask, you know, suggest a nice post-TED meal and draw a picture of it.

(Laughter)

01:38
Now you get all of the, sort of, ideation and creative back-and-forth and taking care of the details for you that you get out of ChatGPT. And here we go, it's not just the idea for the meal, but a very, very detailed spread. So let's see what we're going to get. But ChatGPT doesn't just generate images in this case -- sorry, it doesn't just generate text, it also generates an image. And that is something that really expands the power of what it can do on your behalf in terms of carrying out your intent. And I'll point out, this is all a live demo. This is all generated by the AI as we speak. So I actually don't even know what we're going to see. This looks wonderful.

(Applause)

02:22
I'm getting hungry just looking at it.
02:24
Now we've extended ChatGPT with other tools too, for example, memory. You can say "save this for later."

02:33
And the interesting thing about these tools is they're very inspectable. So you get this little pop-up here that says "use the DALL-E app." And by the way, this is coming to you, all ChatGPT users, over upcoming months. And you can look under the hood and see that what it actually did was write a prompt just like a human could. And so you sort of have this ability to inspect how the machine is using these tools, which allows us to provide feedback to them.

02:55
Now it's saved for later, and let me show you what it's like to use that information and to integrate with other applications too. You can say, "Now make a shopping list for the tasty thing I was suggesting earlier." And make it a little tricky for the AI. "And tweet it out for all the TED viewers out there."

(Laughter)

03:22
So if you do make this wonderful, wonderful meal, I definitely want to know how it tastes. But you can see that ChatGPT is selecting all these different tools without me having to tell it explicitly which ones to use in any situation.
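To make "inspectable" concrete, here is a minimal Python sketch of the pattern being described: the model emits a named tool plus a written-out prompt, the host application dispatches it, and both are visible so a person can review them and give feedback. The dict format, the tool names, and the run_tool_call helper are illustrative stand-ins, not OpenAI's actual plugin interface.

```python
# A minimal sketch of an inspectable tool call, with a hard-coded stand-in for
# the model's output. In the real product the tool name and prompt come from
# ChatGPT, and "use the DALL-E app" is the pop-up shown to the user.
def run_tool_call(tool_call, tools):
    """Dispatch one tool call proposed by the model, printing what it did."""
    print(f"use the {tool_call['tool']} app")     # the little pop-up
    print(f"prompt: {tool_call['prompt']!r}")     # what you see under the hood
    return tools[tool_call["tool"]](tool_call["prompt"])

tools = {
    "DALL-E": lambda prompt: f"<image generated from: {prompt}>",
    "memory": lambda prompt: f"<saved for later: {prompt}>",
}

# Stand-in for what the model might emit for "draw a picture of it".
model_output = {"tool": "DALL-E",
                "prompt": "a rustic post-TED dinner spread, overhead photo"}
print(run_tool_call(model_output, tools))
```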
03:37
And this, I think, shows a new way of thinking about the user interface. Like, we are so used to thinking of, well, we have these apps, we click between them, we copy/paste between them, and usually it's a great experience within an app as long as you kind of know the menus and know all the options. Yes, I would like you to. Yes, please. Always good to be polite.

(Laughter)

04:00
And by having this unified language interface on top of tools, the AI is able to sort of take away all those details from you. So you don't have to be the one who spells out every single sort of little piece of what's supposed to happen. And as I said, this is a live demo, so sometimes the unexpected will happen to us. But let's take a look at the Instacart shopping list while we're at it. And you can see we sent a list of ingredients to Instacart. Here's everything you need. And the thing that's really interesting is that the traditional UI is still very valuable, right? If you look at this, you still can click through it and sort of modify the actual quantities. And that's something that I think shows that traditional UIs are not going away. It's just we have a new, augmented way to build them.

04:49
And now we have a tweet that's been drafted for our review, which is also a very important thing. We can click "run," and there we are, we're the manager, we're able to inspect, we're able to change the work of the AI if we want to. And so after this talk, you will be able to access this yourself.
05:17
And there we go. Cool. Thank you, everyone.

(Applause)

05:29
So we'll cut back to the slides. Now, the important thing about how we build this, it's not just about building these tools. It's about teaching the AI how to use them. Like, what do we even want it to do when we ask these very high-level questions? And to do this, we use an old idea. If you go back to Alan Turing's 1950 paper on the Turing test, he says, you'll never program an answer to this. Instead, you can learn it. You could build a machine, like a human child, and then teach it through feedback. Have a human teacher who provides rewards and punishments as it tries things out and does things that are either good or bad.
06:06
And this is exactly how we train ChatGPT. It's a two-step process. First, we produce what Turing would have called a child machine through an unsupervised learning process. We just show it the whole world, the whole internet, and say, "Predict what comes next in text you've never seen before." And this process imbues it with all sorts of wonderful skills. For example, if you're shown a math problem, the only way to actually complete that math problem, to say what comes next -- that green nine up there -- is to actually solve the math problem.

06:34
But we actually have to do a second step, too, which is to teach the AI what to do with those skills. And for this, we provide feedback. We have the AI try out multiple things, give us multiple suggestions, and then a human rates them, says "This one's better than that one." And this reinforces not just the specific thing that the AI said, but very importantly, the whole process that the AI used to produce that answer. And this allows it to generalize. It allows it to teach, to sort of infer your intent and apply it in scenarios that it hasn't seen before, where it hasn't received feedback.
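As a rough illustration of the two steps described here (and not OpenAI's actual training code), the sketch below writes out the two objectives with toy tensors: a next-token prediction loss for the unsupervised step, and a pairwise preference loss that pushes the score of the answer a human preferred above the one they did not.

```python
# A minimal sketch, assuming stand-in tensors in place of a real model and data.
import torch
import torch.nn.functional as F

vocab_size, seq_len, batch = 100, 16, 4

# Step 1: unsupervised pre-training -- predict the next token at every position.
logits = torch.randn(batch, seq_len, vocab_size)             # model outputs (stand-in)
tokens = torch.randint(0, vocab_size, (batch, seq_len + 1))   # a text sequence (stand-in)
pretrain_loss = F.cross_entropy(
    logits.reshape(-1, vocab_size),   # prediction at position t
    tokens[:, 1:].reshape(-1),        # target is the token at position t + 1
)

# Step 2: learning from human feedback -- a rater says answer A beats answer B,
# and a reward model is trained so that it scores A above B.
reward_chosen = torch.randn(batch)     # score of the preferred answer (stand-in)
reward_rejected = torch.randn(batch)   # score of the other answer (stand-in)
preference_loss = -F.logsigmoid(reward_chosen - reward_rejected).mean()

print(pretrain_loss.item(), preference_loss.item())
```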
07:02
Now, sometimes the things we have to teach the AI are not what you'd expect. For example, when we first showed GPT-4 to Khan Academy, they said, "Wow, this is so great, we're going to be able to teach students wonderful things. Only one problem, it doesn't double-check students' math. If there's some bad math in there, it will happily pretend that one plus one equals three and run with it."

07:23
So we had to collect some feedback data. Sal Khan himself was very kind and offered 20 hours of his own time to provide feedback to the machine alongside our team. And over the course of a couple of months we were able to teach the AI that, "Hey, you really should push back on humans in this specific kind of scenario." And we've actually made lots and lots of improvements to the models this way. And when you push that thumbs down in ChatGPT, that actually is kind of like sending up a bat signal to our team to say, "Here's an area of weakness where you should gather feedback." And so when you do that, that's one way that we really listen to our users and make sure we're building something that's more useful for everyone.
08:02
Now, providing high-quality feedback is a hard thing. If you think about asking a kid to clean their room, if all you're doing is inspecting the floor, you don't know if you're just teaching them to stuff all the toys in the closet. This is a nice DALL-E-generated image, by the way.

08:19
And the same sort of reasoning applies to AI. As we move to harder tasks, we will have to scale our ability to provide high-quality feedback. But for this, the AI itself is happy to help. It's happy to help us provide even better feedback and to scale our ability to supervise the machine as time goes on. And let me show you what I mean.

08:42
For example, you can ask GPT-4 a question like this, of how much time passed between these two foundational blog posts on unsupervised learning and learning from human feedback. And the model says two months passed. But is it true? Like, these models are not 100-percent reliable, although they're getting better every time we provide some feedback. But we can actually use the AI to fact-check. And it can actually check its own work. You can say, fact-check this for me.

09:12
Now, in this case, I've actually given the AI a new tool. This one is a browsing tool where the model can issue search queries and click into web pages. And it actually writes out its whole chain of thought as it does it. It says, I'm just going to search for this, and it actually does the search. It then finds the publication date and the search results. It then issues another search query. It's going to click into the blog post. And all of this you could do, but it's a very tedious task. It's not a thing that humans really want to do. It's much more fun to be in the driver's seat, to be in this manager's position where you can, if you want, triple-check the work. And out come citations, so you can actually go and very easily verify any piece of this whole chain of reasoning. And it actually turns out two months was wrong. Two months and one week, that was correct.

(Applause)
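The final check the citations enable is ordinary arithmetic. The dates below are assumed stand-ins for the two blog posts (whatever dates the browsing tool actually surfaces are what matter); the point is only that "two months and one week" reduces to a date subtraction anyone can rerun.

```python
# A toy verification of the "two months and one week" answer, with assumed dates.
from datetime import date

unsupervised_post = date(2017, 4, 6)     # assumed publication date (stand-in)
human_feedback_post = date(2017, 6, 13)  # assumed publication date (stand-in)

gap = human_feedback_post - unsupervised_post
print(gap.days, "days")  # 68 days here -- about two months and one week, not two months
```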
10:07
And we'll cut back to the slide. And so the thing that's so interesting to me about this whole process is that it's this many-step collaboration between a human and an AI. Because a human, using this fact-checking tool, is doing it in order to produce data for another AI to become more useful to a human. And I think this really shows the shape of something that we should expect to be much more common in the future, where we have humans and machines kind of very carefully and delicately designed in how they fit into a problem and how we want to solve that problem. We make sure that the humans are providing the management, the oversight, the feedback, and the machines are operating in a way that's inspectable and trustworthy. And together we're able to actually create even more trustworthy machines. And I think that over time, if we get this process right, we will be able to solve impossible problems.

10:56
And to give you a sense of just how impossible I'm talking, I think we're going to be able to rethink almost every aspect of how we interact with computers.
11:05
For example, think about spreadsheets. They've been around in some form since, we'll say, 40 years ago with VisiCalc. I don't think they've really changed that much in that time. And here is a specific spreadsheet of all the AI papers on the arXiv for the past 30 years. There's about 167,000 of them. And you can see the data right here. But let me show you the ChatGPT take on how to analyze a data set like this.

11:37
So we can give ChatGPT access to yet another tool, this one a Python interpreter, so it's able to run code, just like a data scientist would. And so you can just literally upload a file and ask questions about it. And very helpfully, you know, it knows the name of the file and it's like, "Oh, this is a CSV," comma-separated value file, "I'll parse it for you." The only information here is the name of the file, the column names like you saw, and then the actual data. And from that it's able to infer what these columns actually mean. Like, that semantic information wasn't in there. It has to sort of put together its world knowledge of knowing that, "Oh yeah, arXiv is a site that people submit papers to, and therefore that's what these things are, and these are integer values, and so therefore it's the number of authors on the paper." All of that is work for a human to do, and the AI is happy to help with it.

12:27
Now I don't even know what I want to ask. So fortunately, you can ask the machine, "Can you make some exploratory graphs?"

12:37
And once again, this is a super high-level instruction with lots of intent behind it. But I don't even know what I want. And the AI kind of has to infer what I might be interested in. And so it comes up with some good ideas, I think. So a histogram of the number of authors per paper, a time series of papers per year, a word cloud of the paper titles. All of that, I think, will be pretty interesting to see. And the great thing is, it can actually do it. Here we go, a nice bell curve. You see that three is kind of the most common. It's going to then make this nice plot of the papers per year. Something crazy is happening in 2023, though. Looks like we were on an exponential and it dropped off the cliff. What could be going on there? By the way, all this is Python code, you can inspect. And then we'll see the word cloud. So you can see all these wonderful things that appear in these titles.
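For a sense of what that generated analysis might look like, here is a hand-written sketch of the two simplest exploratory plots. It assumes a file named arxiv_ai_papers.csv with columns year and n_authors; the real file in the demo may use different names, and the word cloud is omitted.

```python
# A minimal sketch of the exploratory plots described above, under assumed
# column names, written by hand rather than generated by ChatGPT.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("arxiv_ai_papers.csv")

fig, axes = plt.subplots(1, 2, figsize=(10, 4))

# Histogram of the number of authors per paper (the "nice bell curve").
df["n_authors"].plot.hist(bins=range(1, 21), ax=axes[0], title="Authors per paper")

# Time series of papers per year (where 2023 appears to fall off a cliff).
df.groupby("year").size().plot(ax=axes[1], title="Papers per year")

plt.tight_layout()
plt.show()
```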
13:23
But I'm pretty unhappy about this 2023 thing. It makes this year look really bad. Of course, the problem is that the year is not over. So I'm going to push back on the machine.

13:33
[Waitttt that's not fair!!! 2023 isn't over. What percentage of papers in 2022 were even posted by April 13?]

13:44
So April 13 was the cut-off date, I believe. Can you use that to make a fair projection? So we'll see, this is the kind of ambitious one.

(Laughter)

13:59
So you know, again, I feel like there was more I wanted out of the machine here. I really wanted it to notice this thing. Maybe it's a little bit of an overreach for it to have sort of, inferred magically that this is what I wanted. But I inject my intent, I provide this additional piece of, you know, guidance. And under the hood, the AI is just writing code again, so if you want to inspect what it's doing, it's very possible. And now, it does the correct projection.

(Applause)
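The "fair projection" idea fits in a few lines: scale the partial 2023 count by the share of 2022's papers that had already appeared by the same April 13 cut-off. This is a sketch of that idea, not the code ChatGPT generated, and the file and column names are again illustrative assumptions.

```python
# A sketch of the fair-projection step, assuming columns "year" and "date".
import pandas as pd

df = pd.read_csv("arxiv_ai_papers.csv", parse_dates=["date"])

papers_2022 = df[df["year"] == 2022]
papers_2023_so_far = df[df["year"] == 2023]

# Fraction of 2022's papers that were already posted by April 13, 2022.
frac_by_cutoff = (papers_2022["date"] <= "2022-04-13").mean()

# Projected 2023 total: the observed count so far divided by that fraction.
projected_2023 = len(papers_2023_so_far) / frac_by_cutoff
print(f"Projected 2023 papers: {projected_2023:.0f}")
```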
14:35
If you noticed, it even updates the title. I didn't ask for that, but it knows what I want.

14:41
Now we'll cut back to the slide again. This slide shows a parable of how I think we ... a vision of how we may end up using this technology in the future. A person brought his very sick dog to the vet, and the veterinarian made a bad call to say, "Let's just wait and see." And the dog would not be here today had he listened. In the meanwhile, he provided the blood test, like, the full medical records, to GPT-4, which said, "I am not a vet, you need to talk to a professional, here are some hypotheses." He brought that information to a second vet who used it to save the dog's life.

15:21
Now, these systems, they're not perfect. You cannot overly rely on them. But this story, I think, shows that a human with a medical professional and with ChatGPT as a brainstorming partner was able to achieve an outcome that would not have happened otherwise. I think this is something we should all reflect on, think about as we consider how to integrate these systems into our world.

15:44
And one thing I believe really deeply is that getting AI right is going to require participation from everyone. And that's for deciding how we want it to slot in, that's for setting the rules of the road, for what an AI will and won't do. And if there's one thing to take away from this talk, it's that this technology just looks different. Just different from anything people had anticipated. And so we all have to become literate. And that's, honestly, one of the reasons we released ChatGPT. Together, I believe that we can achieve the OpenAI mission of ensuring that artificial general intelligence benefits all of humanity. Thank you.

(Applause)

(Applause ends)
16:34
Chris Anderson: Greg. Wow. I mean ... I suspect that within every mind out here there's a feeling of reeling. Like, I suspect that a very large number of people viewing this, you look at that and you think, "Oh my goodness, pretty much every single thing about the way I work, I need to rethink." Like, there's just new possibilities there. Am I right? Who thinks that they're having to rethink the way that we do things? Yeah, I mean, it's amazing, but it's also really scary. So let's talk, Greg, let's talk.

17:08
I mean, I guess my first question actually is just how the hell have you done this?

(Laughter)

17:13
OpenAI has a few hundred employees. Google has thousands of employees working on artificial intelligence. Why is it you who's come up with this technology that shocked the world?
17:26
Greg Brockman: I mean, the truth is, we're all building on shoulders of giants, right, there's no question. If you look at the compute progress, the algorithmic progress, the data progress, all of those are really industry-wide. But I think within OpenAI, we made a lot of very deliberate choices from the early days. And the first one was just to confront reality as it lays. And we just thought really hard about: What is it going to take to make progress here? We tried a lot of things that didn't work, so you only see the things that did. And I think that the most important thing has been to get teams of people who are very different from each other to work together harmoniously.

17:59
CA: Can we have the water, by the way, just brought here? I think we're going to need it, it's a dry-mouth topic.
18:06
But isn't there something also just about the fact that you saw something in these language models that meant that if you continue to invest in them and grow them, that something at some point might emerge?

18:21
GB: Yes. And I think that, I mean, honestly, I think the story there is pretty illustrative, right? I think that high level, deep learning, like we always knew that was what we wanted to be, was a deep learning lab, and exactly how to do it? I think that in the early days, we didn't know. We tried a lot of things, and one person was working on training a model to predict the next character in Amazon reviews, and he got a result where -- this is a syntactic process, you expect, you know, the model will predict where the commas go, where the nouns and verbs are. But he actually got a state-of-the-art sentiment analysis classifier out of it. This model could tell you if a review was positive or negative. I mean, today we are just like, come on, anyone can do that. But this was the first time that you saw this emergence, this sort of semantics that emerged from this underlying syntactic process. And there we knew, you've got to scale this thing, you've got to see where it goes.

19:16
CA: So I think this helps explain the riddle that baffles everyone looking at this, because these things are described as prediction machines. And yet, what we're seeing out of them feels ... it just feels impossible that that could come from a prediction machine. Just the stuff you showed us just now. And the key idea of emergence is that when you get more of a thing, suddenly different things emerge. It happens all the time: ant colonies, single ants run around, and when you bring enough of them together, you get these ant colonies that show completely emergent, different behavior. Or a city, where a few houses together is just houses together, but as you grow the number of houses, things emerge, like suburbs and cultural centers and traffic jams.

19:57
Give me one moment for you when you saw just something pop that just blew your mind, that you just did not see coming.

20:03
GB: Yeah, well, so you can try this in ChatGPT: if you add 40-digit numbers --

CA: 40-digit?

GB: 40-digit numbers, the model will do it, which means it's really learned an internal circuit for how to do it. And the really interesting thing is actually, if you have it add, like, a 40-digit number plus a 35-digit number, it'll often get it wrong. And so you can see that it's really learning the process, but it hasn't fully generalized, right? It's like, you can't memorize the 40-digit addition table -- that's more atoms than there are in the universe. So it had to have learned something general, but it hasn't really fully learned, "Oh, I can sort of generalize this to adding arbitrary numbers of arbitrary lengths."

20:41
CA: So what's happened here is that you've allowed it to scale up and look at an incredible number of pieces of text. And it is learning things that you didn't know that it was going to be capable of learning.

20:51
GB: Well, yeah, and it's more nuanced, too. So one science that we're starting to really get good at is predicting some of these emergent capabilities. And to do that actually, one of the things I think is very undersung in this field is sort of engineering quality. Like, we had to rebuild our entire stack. When you think about building a rocket, every tolerance has to be incredibly tiny. Same is true in machine learning. You have to get every single piece of the stack engineered properly, and then you can start doing these predictions. There are all these incredibly smooth scaling curves. They tell you something deeply fundamental about intelligence. If you look at our GPT-4 blog post, you can see all of these curves in there. And now we're starting to be able to predict. So we were able to predict, for example, the performance on coding problems. We basically look at some models that are 10,000 times or 1,000 times smaller. And so there's something about this that is actually smooth scaling, even though it's still early days.

21:42
CA: So here is one of the big fears, then, that arises from this. If it's fundamental to what's happening here, that as you scale up, things emerge that you can maybe predict with some level of confidence, but it's capable of surprising you, why isn't there just a huge risk of something truly terrible emerging?

22:05
GB: Well, I think all of these are questions of degree and scale and timing. And I think one thing people miss, too, is sort of the integration with the world is also this incredibly emergent, sort of, very powerful thing too. And so that's one of the reasons that we think it's so important to deploy incrementally. And so I think that what we kind of see right now, if you look at this talk, a lot of what I focus on is providing really high-quality feedback. Today, the tasks that we do, you can inspect them, right? It's very easy to look at that math problem and be like, no, no, no, machine, seven was the correct answer. But even summarizing a book, like, that's a hard thing to supervise. Like, how do you know if this book summary is any good? You have to read the whole book. No one wants to do that.

(Laughter)

22:44
And so I think that the important thing will be that we take this step by step. And that we say, OK, as we move on to book summaries, we have to supervise this task properly. We have to build up a track record with these machines that they're able to actually carry out our intent. And I think we're going to have to produce even better, more efficient, more reliable ways of scaling this, sort of like making the machine be aligned with you.

23:07
CA: So we're going to hear later in this session, there are critics who say that, you know, there's no real understanding inside, the system is going to always -- we're never going to know that it's not generating errors, that it doesn't have common sense and so forth. Is it your belief, Greg, that it is true at any one moment, but that the expansion of the scale and the human feedback that you talked about is basically going to take it on that journey of actually getting to things like truth and wisdom and so forth, with a high degree of confidence? Can you be sure of that?

23:42
GB: Yeah, well, I think that the OpenAI -- I mean, the short answer is yes, I believe that is where we're headed. And I think that the OpenAI approach here has always been just like, let reality hit you in the face, right? It's like this field is the field of broken promises, of all these experts saying X is going to happen, Y is how it works. People have been saying neural nets aren't going to work for 70 years. They haven't been right yet. They might be right, maybe 70 years plus one or something like that is what you need. But I think that our approach has always been, you've got to push to the limits of this technology to really see it in action, because that tells you then, oh, here's how we can move on to a new paradigm. And we just haven't exhausted the fruit here.

24:18
CA: I mean, it's quite a controversial stance you've taken, that the right way to do this is to put it out there in public and then harness all this, you know, instead of just your team giving feedback, the world is now giving feedback. But ... if, you know, bad things are going to emerge, it is out there. So, you know, the original story that I heard on OpenAI when you were founded as a nonprofit, well, you were there as the great sort of check on the big companies doing their unknown, possibly evil thing with AI. And you were going to build models that sort of, you know, somehow held them accountable and were capable of slowing the field down, if need be. Or at least that's kind of what I heard. And yet, what's happened, arguably, is the opposite. That your release of GPT, especially ChatGPT, sent such shockwaves through the tech world that now Google and Meta and so forth are all scrambling to catch up. And some of their criticisms have been, you are forcing us to put this out here without proper guardrails or we die. You know, how do you, like, make the case that what you have done is responsible here and not reckless?

25:31
GB: Yeah, we think about these questions all the time. Like, seriously all the time. And I don't think we're always going to get it right. But one thing I think has been incredibly important, from the very beginning, when we were thinking about how to build artificial general intelligence, actually have it benefit all of humanity -- like, how are you supposed to do that, right? And that default plan of being, well, you build in secret, you get this super powerful thing, and then you figure out the safety of it and then you push "go," and you hope you got it right. I don't know how to execute that plan. Maybe someone else does. But for me, that was always terrifying, it didn't feel right. And so I think that this alternative approach is the only other path that I see, which is that you do let reality hit you in the face. And I think you do give people time to give input. You do have, before these machines are perfect, before they are super powerful, the ability to actually see them in action. And we've seen it from GPT-3, right? GPT-3, we really were afraid that the number one thing people were going to do with it was generate misinformation, try to tip elections. Instead, the number one thing was generating Viagra spam.

(Laughter)

26:36
CA: So Viagra spam is bad, but there are things that are much worse. Here's a thought experiment for you. Suppose you're sitting in a room, there's a box on the table. You believe that in that box is something that, there's a very strong chance, is something absolutely glorious that's going to give beautiful gifts to your family and to everyone. But there's actually also a one percent thing in the small print there that says: "Pandora." And there's a chance that this actually could unleash unimaginable evils on the world. Do you open that box?

27:08
GB: Well, so, absolutely not. I think you don't do it that way. And honestly, like, I'll tell you a story that I haven't actually told before, which is that shortly after we started OpenAI, I remember I was in Puerto Rico for an AI conference. I'm sitting in the hotel room just looking out over this wonderful water, all these people having a good time. And you think about it for a moment: if you could choose for basically that Pandora's box to be five years away or 500 years away, which would you pick, right? On the one hand you're like, well, maybe for you personally, it's better to have it be five years away. But if it gets to be 500 years away and people get more time to get it right, which do you pick? And you know, I just really felt it in the moment. I was like, of course you do the 500 years. My brother was in the military at the time and, like, he puts his life on the line in a much more real way than any of us typing things in computers and developing this technology at the time.

28:03
And so, yeah, I'm really sold on the "you've got to approach this right." But I don't think that's quite playing the field as it truly lies. Like, if you look at the whole history of computing, I really mean it when I say that this is an industry-wide, or even just almost a human-development-of-technology-wide, shift. And the more that you sort of don't put together the pieces that are there, right -- we're still making faster computers, we're still improving the algorithms, all of these things, they are happening. And if you don't put them together, you get an overhang, which means that if someone does, or the moment that someone does manage to connect the circuit, then you suddenly have this very powerful thing, no one's had any time to adjust, who knows what kind of safety precautions you get. And so I think that one thing I take away is, like, even if you think about the development of other sorts of technologies, think about nuclear weapons, people talk about being like a zero-to-one, sort of, change in what humans could do. But I actually think that if you look at capability, it's been quite smooth over time. And so the history, I think, of every technology we've developed has been, you've got to do it incrementally and you've got to figure out how to manage it for each moment that you're increasing it.

29:14
CA: So what I'm hearing is that you ... the model you want us to have is that we have birthed this extraordinary child that may have superpowers that take humanity to a whole new place. It is our collective responsibility to provide the guardrails for this child, to collectively teach it to be wise and not to tear us all down. Is that basically the model?

29:39
GB: I think it's true. And I think it's also important to say this may shift, right? We've got to take each step as we encounter it. And I think it's incredibly important today that we all do get literate in this technology, figure out how to provide the feedback, decide what we want from it. And my hope is that that will continue to be the best path, but it's so good we're honestly having this debate, because we wouldn't otherwise if it weren't out there.

30:03
CA: Greg Brockman, thank you so much for coming to TED and blowing our minds.

(Applause)