The Dark Side of Competition in AI | Liv Boeree | TED

141,887 views ・ 2023-11-09

TED



Translator: Jack Liu    Reviewer: Shelley Tsang 曾雯海
00:04
Competition.
00:05
It's a fundamental part of human nature.
00:09
I was a professional poker player for 10 years,
00:11
so I've very much seen all the good, bad and ugly ways it can manifest.
00:16
When it's done right,
00:18
it can drive us to incredible feats in sports and innovation,
00:23
like when car companies compete over who can build the safest cars
00:28
or the most efficient solar panels.
00:30
Those are all examples of healthy competition,
00:33
because even though individual companies might come and go,
00:36
in the long run,
00:37
the game between them creates win-win outcomes
00:39
where everyone benefits in the end.
00:42
But sometimes competition is not so great
00:45
and can create lose-lose outcomes where everyone's worse off than before.
00:51
Take these AI beauty filters, for example.
00:54
As you can see, they're a very impressive technology.
00:57
They can salvage almost any picture.
01:00
They can even make Angelina and Margot more beautiful.
01:05
So they're very handy,
01:06
especially for influencers who, now, at the click of a button,
01:09
can transform into the most beautiful Hollywood versions of themselves.
01:13
But handy doesn't always mean healthy.
01:16
And I've personally noticed how quickly these things can train you
01:20
to hate your natural face.
01:22
And there's growing evidence that they're creating issues
01:25
like body dysmorphia, especially in young people.
01:29
Nonetheless, these things are now endemic to social media
01:33
because the nature of the game demands it.
01:37
The platforms are incentivized to provide them
01:39
because hotter pictures means more hijacked limbic systems,
01:42
which means more scrolling and thus more ad revenue.
01:45
And users are incentivized to use them
01:47
because hotter pictures get you more followers.
01:50
But this is a trap,
01:53
because once you start using these things,
01:55
it's really hard to go back.
01:57
Plus, you don't even get a competitive advantage from them anymore
02:00
because everyone else is already using them too.
02:03
So influencers are stuck using these things
02:07
with all the downsides
02:08
and very little upside.
02:10
A lose-lose game.
02:12
A similar kind of trap is playing out in our news media right now,
02:16
but with much worse consequences.
02:19
You'd think since the internet came along
02:21
that the increased competition between news outlets
02:24
would create a sort of positive spiral,
02:26
like a race to the top of nuanced, impartial, accurate journalism.
02:32
Instead, we're seeing a race to the bottom of clickbait and polarization,
02:38
where even respectable papers are increasingly leaning
02:41
into these kind of low-brow partisan tactics.
02:45
Again, this is due to crappy incentives.
02:49
Today, we no longer just read our news.
02:52
We interact with it by sharing and commenting.
02:56
And headlines that trigger emotions like fear or anger
03:00
are far more likely to go viral than neutral or positive ones.
03:05
So in many ways, news editors are in a similar kind of trap
03:09
as the influencers,
03:10
where, the more their competitors lean into clickbaity tactics,
03:14
the more they have to as well.
03:16
Otherwise, their stories just get lost in the noise.
03:20
But this is terrible for everybody,
03:22
because now the media get less trust from the public,
03:25
but also it becomes harder and harder for anyone to discern truth from fiction,
03:30
which is a really big problem for democracy.
03:33
Now, this process of competition gone wrong
03:36
is actually the driving force behind so many of our biggest issues.
03:42
Plastic pollution,
03:44
deforestation,
03:46
antibiotic overuse in farming,
03:49
arms races,
03:50
greenhouse gas emissions.
03:52
These are all a result of crappy incentives,
03:56
of poorly designed games that push their players --
04:00
be them people, companies or governments --
04:02
into taking strategies and tactics that defer costs and harms to the future.
04:09
And what's so ridiculous is that most of the time,
04:12
these guys don't even want to be doing this.
04:15
You know, it's not like packaging companies
04:17
want to fill the oceans with plastic
04:19
or farmers want to worsen antibiotic resistance.
04:22
But they’re all stuck in the same dilemma of:
04:26
"If I don't use this tactic,
04:28
I’ll get outcompeted by all the others who do.
04:31
So I have to do it, too.”
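[This shared dilemma has the structure of a multiplayer prisoner's dilemma. As a rough illustration (the payoff numbers below are invented for this sketch, not from the talk), the trap works like this: using the harmful tactic is each player's best move no matter what the others do, yet when everyone uses it, everyone ends up worse off than if all had held back.]

```python
# A minimal sketch of the dilemma as a two-player prisoner's dilemma.
# All payoff values are illustrative assumptions.

# PAYOFFS[(my_move, their_move)] -> my payoff
# "restrain" = skip the harmful tactic; "exploit" = use it
PAYOFFS = {
    ("restrain", "restrain"): 3,  # healthy competition: win-win
    ("restrain", "exploit"): 0,   # I get outcompeted by those who use it
    ("exploit", "restrain"): 5,   # short-term competitive edge
    ("exploit", "exploit"): 1,    # lose-lose: everyone worse off
}

def best_response(their_move):
    """Pick my move, taking the other player's move as fixed."""
    return max(("restrain", "exploit"),
               key=lambda my_move: PAYOFFS[(my_move, their_move)])

# Exploiting dominates whatever the other player does...
assert best_response("restrain") == "exploit"
assert best_response("exploit") == "exploit"

# ...yet mutual exploitation pays less than mutual restraint:
assert PAYOFFS[("exploit", "exploit")] < PAYOFFS[("restrain", "restrain")]
```

[Because "exploit" is the dominant strategy for each player individually, the game settles into the mutual-exploitation outcome even though both players would prefer mutual restraint, which is exactly the "I have to do it, too" logic described above.]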
04:34
This is the mechanism we need to fix as a civilization.
04:38
And I know what you're probably all thinking, "So it's capitalism."
04:42
No, it's not capitalism.
04:43
Which, yes, can cause problems,
04:45
but it can also solve them and has been fantastic in general.
04:48
It's something much deeper.
04:51
It's a force of misaligned incentives of game theory itself.
04:57
So a few years ago, I retired from poker,
04:59
in part because I wanted to understand this mechanism better.
05:03
Because it takes many different forms, and it goes by many different names.
05:08
These are just some of those names.
[On screen: coordination problems, negative-sum games, multipolar traps, the tragedy of the commons, races to the bottom, multiplayer prisoner's dilemmas, social dilemmas, inadequate equilibria]
05:11
You can see they're a little bit abstract and clunky, right?
05:14
They don't exactly roll off the tongue.
05:16
And given how insidious and connected all of these problems are,
05:21
it helps to have a more visceral way of recognizing them.
05:27
So this is probably the only time
05:28
you're going to hear about the Bible at this conference.
05:31
But I want to tell you a quick story from it,
05:33
because allegedly, back in the Canaanite days,
05:36
there was a cult who wanted money and power so badly,
05:41
they were willing to sacrifice their literal children for it.
05:45
And they did this by burning them alive in an effigy
05:49
of a God that they believed would then reward them
05:52
for this ultimate sacrifice.
05:55
And the name of this god was Moloch.
05:59
Bit of a bummer, as stories go.
06:02
But you can see why it's an apt metaphor,
06:04
because sometimes we get so lost in winning the game right in front of us,
06:11
we lose sight of the bigger picture
06:13
and sacrifice too much in our pursuit of victory.
06:17
So just like these guys were sacrificing their children for power,
06:21
those influencers are sacrificing their happiness for likes.
06:27
Those news editors are sacrificing their integrity for clicks,
06:31
and polluters are sacrificing the biosphere for profit.
06:35
In all these examples,
06:37
the short-term incentives of the games themselves are pushing,
06:41
they're tempting their players
06:43
to sacrifice more and more of their future,
06:46
trapping them in a death spiral where they all lose in the end.
06:52
That's Moloch's trap.
06:54
The mechanism of unhealthy competition.
06:58
And the same is now happening in the AI industry.
07:02
We're all aware of the race that's heating up
07:04
between companies right now
07:05
over who can score the most compute,
07:08
who can get the biggest funding round or get the top talent.
07:11
Well, as more and more companies enter this race,
07:14
the greater the pressure for everyone to go as fast as possible
07:18
and sacrifice other important stuff like safety testing.
07:23
This has all the hallmarks of a Moloch trap.
07:27
Because, like, imagine you're a CEO who, you know, in your heart of hearts,
07:30
believes that your team is the best
07:33
to be able to safely build extremely powerful AI.
07:37
Well, if you go too slowly, then you run the risk of other,
07:41
much less cautious teams getting there first
07:44
and deploying their systems before you can.
07:46
So that in turn pushes you to be more reckless yourself.
07:49
And given how many different experts and researchers,
07:53
both within these companies
07:55
but also completely independent ones,
07:57
have been warning us about the extreme risks of rushed AI,
08:02
this approach is absolutely mad.
08:05
Plus, almost all AI companies
08:08
are beholden to satisfying their investors,
08:11
a short-term incentive which, over time, will inevitably start to conflict
08:15
with any benevolent mission.
08:18
And this wouldn't be a big deal
08:19
if this was really just toasters we're talking about here.
08:23
But AI, and especially AGI,
08:25
is set to be a bigger paradigm shift
08:27
than the agricultural or industrial revolutions.
08:30
A moment in time so pivotal,
08:33
it's deserving of reverence and reflection,
08:37
not something to be reduced to a corporate rat race
08:40
of who can score the most daily active users.
08:43
I'm not saying I know
08:44
what the right trade-off between acceleration and safety is,
08:47
but I do know that we'll never find out what that right trade-off is
08:51
if we let Moloch dictate it for us.
08:56
So what can we do?
08:58
Well, the good news is we have managed to coordinate
09:01
to escape some of Moloch's traps before.
09:04
We managed to save the ozone layer from CFCs
09:07
with the help of the Montreal Protocol.
09:09
We managed to reduce the number of nuclear weapons on Earth
09:12
by 80 percent,
09:13
with the help of the Strategic Arms Reduction Treaty in 1991.
09:17
So smart regulation may certainly help with AI too,
09:21
but ultimately,
09:23
it's the players within the game who have the most influence on it.
09:27
So we need AI leaders to show us
09:30
that they're not only aware of the risks their technologies pose,
09:33
but also the destructive nature of the incentives
09:37
that they're currently beholden to.
09:39
As their technological capabilities reach towards the power of gods,
09:43
they're going to need the godlike wisdom to know how to wield them.
09:47
So it doesn't fill me with encouragement
09:50
when I see a CEO of a very major company saying something like,
09:54
"I want people to know we made our competitor dance."
09:57
That is not the type of mindset we need here.
10:00
We need leaders who are willing to flip Moloch's playbook,
10:04
who are willing to sacrifice their own individual chance of winning
10:07
for the good of the whole.
10:10
Now, fortunately, the three leading labs are showing some signs of doing this.
10:15
Anthropic recently announced their responsible scaling policy,
10:18
which pledges to only increase capabilities
10:21
once certain security criteria have been met.
10:24
OpenAI have recently pledged
10:26
to dedicate 20 percent of their compute purely to alignment research.
10:31
And DeepMind have shown a decade-long focus
10:34
of science ahead of commerce,
10:37
like their development of AlphaFold,
10:38
which they gave away to the science community for free.
10:41
These are all steps in the right direction,
10:43
but they are still nowhere close to being enough.
10:46
I mean, most of these are currently just words,
10:49
they're not even proven actions.
10:52
So we need a clear way to turn the AI race into a definitive race to the top.
10:58
Perhaps companies can start competing over who can be within these metrics,
11:04
over who can develop the best security criteria.
11:07
A race of who can dedicate the most compute to alignment.
11:11
Now that would truly flip the middle finger to Moloch.
11:14
Competition can be an amazing tool,
11:17
provided we wield it wisely.
11:20
And we're going to need to do that
11:22
because the stakes we are playing for are astronomical.
11:25
If we get AI, and especially AGI, wrong,
11:29
it could lead to unimaginable catastrophe.
11:32
But if we get it right,
11:34
it could be our path out of many of these Moloch traps
11:37
that I've mentioned today.
11:40
And as things get crazier over the coming years,
11:43
which they're probably going to,
11:45
it's going to be more important than ever
11:47
that we remember that it is the real enemy here, Moloch.
11:52
Not any individual CEO or company, and certainly not one another.
11:57
So don't hate the players,
11:59
change the game.
12:01
(Applause)