What Is an AI Anyway? | Mustafa Suleyman | TED

2,398,484 views ・ 2024-04-22

TED



Translator: 麗玲 辛
00:04
I want to tell you what I see coming.

00:07
I've been lucky enough to be working on AI for almost 15 years now. Back when I started, to describe it as fringe would be an understatement. Researchers would say, “No, no, we’re only working on machine learning.” Because working on AI was seen as way too out there.

00:25
In 2010, just the very mention of the phrase “AGI,” artificial general intelligence, would get you some seriously strange looks and even a cold shoulder. "You're actually building AGI?" people would say. "Isn't that something out of science fiction?" People thought it was 50 years away or 100 years away, if it was even possible at all. Talk of AI was, I guess, kind of embarrassing. People generally thought we were weird. And I guess in some ways we kind of were.
00:56
It wasn't long, though, before AI started beating humans at a whole range of tasks that people previously thought were way out of reach. Understanding images, translating languages, transcribing speech, playing Go and chess and even diagnosing diseases.
01:15
People started waking up to the fact that AI was going to have an enormous impact, and they were rightly asking technologists like me some pretty tough questions. Is it true that AI is going to solve the climate crisis? Will it make personalized education available to everyone? Does it mean we'll all get universal basic income and we won't have to work anymore? Should I be afraid? What does it mean for weapons and war? And of course, will China win? Are we in a race? Are we headed for a mass misinformation apocalypse? All good questions.
01:51
But it was actually a simpler and much more kind of fundamental question that left me puzzled. One that actually gets to the very heart of my work every day.
02:03
One morning over breakfast, my six-year-old nephew Caspian was playing with Pi, the AI I created at my last company, Inflection. With a mouthful of scrambled eggs, he looked at me plain in the face and said, "But Mustafa, what is an AI anyway?"

02:21
He's such a sincere and curious and optimistic little guy. He'd been talking to Pi about how cool it would be if one day in the future, he could visit dinosaurs at the zoo. And how he could make infinite amounts of chocolate at home. And why Pi couldn’t yet play I Spy.

02:39
"Well," I said, "it's a clever piece of software that's read most of the text on the open internet, and it can talk to you about anything you want."

02:48
"Right. So like a person then?"
02:54
I was stumped. Genuinely left scratching my head.

03:00
All my boring stock answers came rushing through my mind. "No, but AI is just another general-purpose technology, like printing or steam. It will be a tool that will augment us and make us smarter and more productive. And when it gets better over time, it'll be like an all-knowing oracle that will help us solve grand scientific challenges."

03:22
You know, all of these responses started to feel, I guess, a little bit defensive. And actually better suited to a policy seminar than breakfast with a no-nonsense six-year-old.
03:33
"Why am I hesitating?" I thought to myself.

03:37
You know, let's be honest. My nephew was asking me a simple question that those of us in AI just don't confront often enough.

03:48
What is it that we are actually creating? What does it mean to make something totally new, fundamentally different to any invention that we have known before?
04:00
It is clear that we are at an inflection point in the history of humanity. On our current trajectory, we're headed towards the emergence of something that we are all struggling to describe, and yet we cannot control what we don't understand.

04:19
And so the metaphors, the mental models, the names, these all matter if we’re to get the most out of AI whilst limiting its potential downsides.
04:30
As someone who embraces the possibilities of this technology, but who's also always cared deeply about its ethics, we should, I think, be able to easily describe what it is we are building. And that includes the six-year-olds.

04:44
So it's in that spirit that I offer up today the following metaphor for helping us to try to grapple with what this moment really is. I think AI should best be understood as something like a new digital species.
05:00
Now, don't take this too literally, but I predict that we'll come to see them as digital companions, new partners in the journeys of all our lives. Whether you think we’re on a 10-, 20- or 30-year path here, this is, in my view, the most accurate and most fundamentally honest way of describing what's actually coming. And above all, it enables everybody to prepare for and shape what comes next.

05:29
Now I totally get, this is a strong claim, and I'm going to explain to everyone as best I can why I'm making it. But first, let me just try to set the context.
05:39
From the very first microscopic organisms, life on Earth stretches back billions of years. Over that time, life evolved and diversified. Then a few million years ago, something began to shift. After countless cycles of growth and adaptation, one of life’s branches began using tools, and that branch grew into us.

06:04
We went on to produce a mesmerizing variety of tools, at first slowly and then with astonishing speed, we went from stone axes and fire to language, writing and eventually industrial technologies. One invention unleashed a thousand more. And in time, we became homo technologicus.
06:29
Around 80 years ago, another new branch of technology began. With the invention of computers, we quickly jumped from the first mainframes and transistors to today's smartphones and virtual-reality headsets. Information, knowledge, communication, computation. In this revolution, creation has exploded like never before.

06:53
And now a new wave is upon us. Artificial intelligence.
06:57
These waves of history are clearly speeding up, as each one is amplified and accelerated by the last. And if you look back, it's clear that we are in the fastest and most consequential wave ever.

07:11
The journeys of humanity and technology are now deeply intertwined. In just 18 months, over a billion people have used large language models. We've witnessed one landmark event after another.
07:25
Just a few years ago, people said that AI would never be creative. And yet AI now feels like an endless river of creativity, making poetry and images and music and video that stretch the imagination. People said it would never be empathetic. And yet today, millions of people enjoy meaningful conversations with AIs, talking about their hopes and dreams and helping them work through difficult emotional challenges. AIs can now drive cars, manage energy grids and even invent new molecules. Just a few years ago, each of these was impossible.
08:03
And all of this is turbocharged by spiraling exponentials of data and computation. Last year, Inflection 2.5, our last model, used five billion times more computation than the DeepMind AI that beat the old-school Atari games just over 10 years ago. That's nine orders of magnitude more computation. 10x per year, every year for almost a decade.
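The growth figures above can be sanity-checked with a quick back-of-the-envelope calculation. The yearly factor, year count, and "five billion times" ratio are quoted from the talk; the arithmetic below is only an illustration of how they fit together.

```python
import math

# Figures quoted in the talk
growth_per_year = 10           # "10x per year"
years = 9                      # "almost a decade" of 10x steps
compute_ratio = 5_000_000_000  # "five billion times more computation"

# 10x per year for nine years is a factor of 10^9
total_growth = growth_per_year ** years
print(f"{total_growth:.0e}")   # 1e+09

# Five billion sits between 10^9 and 10^10 -- i.e. "nine orders of magnitude"
orders = int(math.log10(compute_ratio))
print(orders)                  # 9
```

So "five billion times" and "nine orders of magnitude, 10x per year for almost a decade" are the same claim stated two ways.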
08:34
Over the same time, the size of these models has grown from first tens of millions of parameters to then billions of parameters, and very soon, tens of trillions of parameters. If someone did nothing but read 24 hours a day for their entire life, they'd consume eight billion words. And of course, that's a lot of words. But today, the most advanced AIs consume more than eight trillion words in a single month of training. And all of this is set to continue. The long arc of technological history is now in an extraordinary new phase.
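The reading comparison above is easy to verify. The eight-billion and eight-trillion word counts are from the talk; the 80-year lifespan and the derived words-per-minute rate are assumptions added here for illustration.

```python
# Back-of-the-envelope check of the reading figures.
words_lifetime = 8_000_000_000  # "eight billion words" read in a lifetime
years = 80                      # assumed lifespan (not from the talk)
minutes = years * 365 * 24 * 60
wpm = words_lifetime / minutes
print(round(wpm))               # ~190 words per minute, nonstop, for 80 years

words_per_training_month = 8_000_000_000_000  # "eight trillion words"
lifetimes_per_month = words_per_training_month / words_lifetime
print(lifetimes_per_month)      # 1000.0 lifetimes of reading, every month
```

Roughly 190 words per minute is a plausible continuous reading speed, which is what makes the thousand-lifetimes-per-month comparison land.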
09:12
So what does this mean in practice? Well, just as the internet gave us the browser and the smartphone gave us apps, the cloud-based supercomputer is ushering in a new era of ubiquitous AIs. Everything will soon be represented by a conversational interface. Or, to put it another way, a personal AI.

09:35
And these AIs will be infinitely knowledgeable, and soon they'll be factually accurate and reliable. They'll have near-perfect IQ. They’ll also have exceptional EQ. They’ll be kind, supportive, empathetic.
09:53
These elements on their own would be transformational. Just imagine if everybody had a personalized tutor in their pocket and access to low-cost medical advice. A lawyer and a doctor, a business strategist and coach -- all in your pocket 24 hours a day.
10:08
But things really start to change when they develop what I call AQ, their “actions quotient.” This is their ability to actually get stuff done in the digital and physical world.

10:20
And before long, it won't just be people that have AIs. Strange as it may sound, every organization, from small business to nonprofit to national government, each will have their own. Every town, building and object will be represented by a unique interactive persona. And these won't just be mechanistic assistants. They'll be companions, confidants, colleagues, friends and partners, as varied and unique as we all are.
10:52
At this point, AIs will convincingly imitate humans at most tasks. And we'll feel this at the most intimate of scales. An AI organizing a community get-together for an elderly neighbor. A sympathetic expert helping you make sense of a difficult diagnosis. But we'll also feel it at the largest scales. Accelerating scientific discovery, autonomous cars on the roads, drones in the skies. They'll both order the takeout and run the power station. They’ll interact with us and, of course, with each other. They'll speak every language, take in every pattern of sensor data, sights, sounds, streams and streams of information, far surpassing what any one of us could consume in a thousand lifetimes.
11:40
So what is this? What are these AIs?
11:46
If we are to prioritize safety above all else, to ensure that this new wave always serves and amplifies humanity, then we need to find the right metaphors for what this might become.

12:01
For years, we in the AI community, and I specifically, have had a tendency to refer to this as just tools. But that doesn't really capture what's actually happening here. AIs are clearly more dynamic, more ambiguous, more integrated and more emergent than mere tools, which are entirely subject to human control.
12:25
So to contain this wave, to put human agency at its center and to mitigate the inevitable unintended consequences that are likely to arise, we should start to think about them as we might a new kind of digital species.

12:41
Now it's just an analogy, it's not a literal description, and it's not perfect.
12:46
For a start, they clearly aren't biological in any traditional sense, but just pause for a moment and really think about what they already do. They communicate in our languages. They see what we see. They consume unimaginably large amounts of information. They have memory. They have personality. They have creativity. They can even reason to some extent and formulate rudimentary plans. They can act autonomously if we allow them. And they do all this at levels of sophistication that are far beyond anything that we've ever known from a mere tool.
13:27
And so saying AI is mainly about the math or the code is like saying we humans are mainly about carbon and water. It's true, but it completely misses the point.
13:42
And yes, I get it, this is a super arresting thought, but I honestly think this frame helps sharpen our focus on the critical issues. What are the risks? What are the boundaries that we need to impose? What kind of AI do we want to build or allow to be built?

14:04
This is a story that's still unfolding. Nothing should be accepted as a given. We all must choose what we create. What AIs we bring into the world, or not.
14:18
These are the questions for all of us here today, and all of us alive at this moment. For me, the benefits of this technology are stunningly obvious, and they inspire my life's work every single day. But quite frankly, they'll speak for themselves.

14:37
Over the years, I've never shied away from highlighting risks and talking about downsides. Thinking in this way helps us focus on the huge challenges that lie ahead for all of us. But let's be clear. There is no path to progress where we leave technology behind. The prize for all of civilization is immense.
15:00
We need solutions in health care and education, to our climate crisis. And if AI delivers just a fraction of its potential, the next decade is going to be the most productive in human history.

15:13
Here's another way to think about it. In the past, unlocking economic growth often came with huge downsides. The economy expanded as people discovered new continents and opened up new frontiers. But they colonized populations at the same time. We built factories, but they were grim and dangerous places to work. We struck oil, but we polluted the planet.
15:42
Now because we are still designing and building AI, we have the potential and opportunity to do it better, radically better. And today, we're not discovering a new continent and plundering its resources. We're building one from scratch.

15:58
Sometimes people say that data or chips are the 21st century’s new oil, but that's totally the wrong image. AI is to the mind what nuclear fusion is to energy. Limitless, abundant, world-changing.
16:17
And AI really is different, and that means we have to think about it creatively and honestly. We have to push our analogies and our metaphors to the very limits to be able to grapple with what's coming. Because this is not just another invention. AI is itself an infinite inventor. And yes, this is exciting and promising and concerning and intriguing all at once. To be quite honest, it's pretty surreal. But step back, see it on the long view of glacial time, and these really are the very most appropriate metaphors that we have today.
16:57
Since the beginning of life on Earth, we've been evolving, changing and then creating everything around us in our human world today. And AI isn't something outside of this story. In fact, it's the very opposite. It's the whole of everything that we have created, distilled down into something that we can all interact with and benefit from. It's a reflection of humanity across time, and in this sense, it isn't a new species at all.

17:31
This is where the metaphors end.
17:33
Here's what I'll tell Caspian next time he asks. AI isn't separate. AI isn't even, in some senses, new. AI is us. It's all of us. And this is perhaps the most promising and vital thing of all that even a six-year-old can get a sense for.
17:54
As we build out AI, we can and must reflect all that is good, all that we love, all that is special about humanity: our empathy, our kindness, our curiosity and our creativity. This, I would argue, is the greatest challenge of the 21st century, but also the most wonderful, inspiring and hopeful opportunity for all of us. Thank you.

18:21
(Applause)
18:26
Chris Anderson: Thank you, Mustafa. It's an amazing vision and a super powerful metaphor. You're in an amazing position right now. I mean, you were connected at the hip to the amazing work happening at OpenAI. You’re going to have resources made available, there are reports of these giant new data centers, 100 billion dollars invested and so forth. And a new species can emerge from it. I mean, in your book, you did, as well as painting an incredibly optimistic vision, speak super eloquently on the dangers of AI. And I'm just curious, from the view that you have now, what is it that most keeps you up at night?
19:06
Mustafa Suleyman: I think the great risk is that we get stuck in what I call the pessimism aversion trap. You know, we have to have the courage to confront the potential of dark scenarios in order to get the most out of all the benefits that we see. So the good news is that if you look at the last two or three years, there have been very, very few downsides, right? It’s very hard to say explicitly what harm an LLM has caused. But that doesn’t mean that that’s what the trajectory is going to be over the next 10 years.

19:35
So I think if you pay attention to a few specific capabilities, take, for example, autonomy. Autonomy is very obviously a threshold over which we increase risk in our society. And it's something that we should step towards very, very closely. The other would be something like recursive self-improvement. If you allow the model to independently self-improve, update its own code, explore an environment without oversight, and, you know, without a human in control to change how it operates, that would obviously be more dangerous. But I think that we're still some way away from that. I think it's still a good five to 10 years before we have to really confront that. But it's time to start talking about it now.
20:15
CA: A digital species, unlike any biological species, can replicate not in nine months, but in nine nanoseconds, and produce an indefinite number of copies of itself, all of which have more power than we have in many ways. I mean, the possibility for unintended consequences seems pretty immense. And isn't it true that if a problem happens, it could happen in an hour?
20:37
MS: No. That is really not true. I think there's no evidence to suggest that. And I think that, you know, that’s often referred to as the “intelligence explosion.” And I think it is a theoretical, hypothetical maybe that we're all kind of curious to explore, but there's no evidence that we're anywhere near anything like that. And I think it's very important that we choose our words super carefully.

21:00
Because you're right, that's one of the weaknesses of the species framing, that we will design the capability for self-replication into it if people choose to do that. And I would actually argue that we should not; that would be one of the dangerous capabilities that we should step back from, right? So there's no chance that this will "emerge" accidentally. I really think that's a very low probability. It will happen if engineers deliberately design those capabilities in, and if they don't take enough effort to deliberately design them out. And so this is the point of being explicit and transparent about trying to introduce safety by design very early on.
21:39
CA: Thank you. Your vision of humanity injecting into this new thing the best parts of ourselves, avoiding all those weird, biological, freaky, horrible tendencies that we can have in certain circumstances, I mean, that is a very inspiring vision. And thank you so much for coming here and sharing it at TED. Thank you, good luck.

21:59
(Applause)