The Dark Side of Competition in AI | Liv Boeree | TED

152,856 views ・ 2023-11-09

TED



Translator: Yip Yan Yeung / Reviewer: suya f.
00:04
Competition. It's a fundamental part of human nature. I was a professional poker player for 10 years, so I've very much seen all the good, bad and ugly ways it can manifest.

00:16
When it's done right, it can drive us to incredible feats in sports and innovation, like when car companies compete over who can build the safest cars or the most efficient solar panels. Those are all examples of healthy competition, because even though individual companies might come and go, in the long run, the game between them creates win-win outcomes where everyone benefits in the end.

00:42
But sometimes competition is not so great and can create lose-lose outcomes where everyone's worse off than before.
00:51
Take these AI beauty filters, for example. As you can see, they're a very impressive technology. They can salvage almost any picture. They can even make Angelina and Margot more beautiful. So they're very handy, especially for influencers who, now, at the click of a button, can transform into the most beautiful Hollywood versions of themselves.

01:13
But handy doesn't always mean healthy. And I've personally noticed how quickly these things can train you to hate your natural face. And there's growing evidence that they're creating issues like body dysmorphia, especially in young people. Nonetheless, these things are now endemic to social media because the nature of the game demands it.

01:37
The platforms are incentivized to provide them because hotter pictures means more hijacked limbic systems, which means more scrolling and thus more ad revenue. And users are incentivized to use them because hotter pictures get you more followers.

01:50
But this is a trap, because once you start using these things, it's really hard to go back. Plus, you don't even get a competitive advantage from them anymore because everyone else is already using them too. So influencers are stuck using these things with all the downsides and very little upside. A lose-lose game.
02:12
A similar kind of trap is playing out in our news media right now, but with much worse consequences. You'd think since the internet came along that the increased competition between news outlets would create a sort of positive spiral, like a race to the top of nuanced, impartial, accurate journalism. Instead, we're seeing a race to the bottom of clickbait and polarization, where even respectable papers are increasingly leaning into these kind of low-brow partisan tactics.

02:45
Again, this is due to crappy incentives. Today, we no longer just read our news. We interact with it by sharing and commenting. And headlines that trigger emotions like fear or anger are far more likely to go viral than neutral or positive ones.

03:05
So in many ways, news editors are in a similar kind of trap as the influencers, where, the more their competitors lean into clickbaity tactics, the more they have to as well. Otherwise, their stories just get lost in the noise.

03:20
But this is terrible for everybody, because now the media get less trust from the public, but also it becomes harder and harder for anyone to discern truth from fiction, which is a really big problem for democracy.
03:33
Now, this process of competition gone wrong is actually the driving force behind so many of our biggest issues. Plastic pollution, deforestation, antibiotic overuse in farming, arms races, greenhouse gas emissions. These are all a result of crappy incentives, of poorly designed games that push their players -- be them people, companies or governments -- into taking strategies and tactics that defer costs and harms to the future.

04:09
And what's so ridiculous is that most of the time, these guys don't even want to be doing this. You know, it's not like packaging companies want to fill the oceans with plastic or farmers want to worsen antibiotic resistance. But they're all stuck in the same dilemma of: "If I don't use this tactic, I'll get outcompeted by all the others who do. So I have to do it, too."
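The dilemma described here has the structure of a classic prisoner's dilemma from game theory. A minimal sketch in Python makes the mechanism concrete; the payoff numbers below are hypothetical illustrations (only their ordering matters), not anything from the talk:

```python
# Two players each choose to "restrain" (skip the harmful tactic)
# or "exploit" (use it). Payoff numbers are hypothetical; only
# their relative ordering encodes the dilemma.
# payoffs[(my_move, their_move)] = my payoff.
payoffs = {
    ("restrain", "restrain"): 3,  # healthy competition: win-win
    ("restrain", "exploit"):  0,  # I get outcompeted
    ("exploit",  "restrain"): 5,  # short-term individual edge
    ("exploit",  "exploit"):  1,  # everyone uses the tactic: lose-lose
}

def best_response(their_move):
    """My payoff-maximizing move given what the other player does."""
    return max(["restrain", "exploit"],
               key=lambda my_move: payoffs[(my_move, their_move)])

# Whatever the competitor does, exploiting pays more for me individually...
assert best_response("restrain") == "exploit"
assert best_response("exploit") == "exploit"

# ...so both players exploit, yet mutual exploitation leaves both
# worse off than mutual restraint would have:
assert payoffs[("exploit", "exploit")] < payoffs[("restrain", "restrain")]
```

Because "exploit" is the best response to either move, it is a dominant strategy for both players, and the game settles into the lose-lose outcome even though neither player wants it; this is exactly the "I have to do it, too" logic of the quote above.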
04:34
This is the mechanism we need to fix as a civilization. And I know what you're probably all thinking, "So it's capitalism." No, it's not capitalism. Which, yes, can cause problems, but it can also solve them and has been fantastic in general. It's something much deeper. It's a force of misaligned incentives of game theory itself.

04:57
So a few years ago, I retired from poker, in part because I wanted to understand this mechanism better. Because it takes many different forms, and it goes by many different names. These are just some of those names. You can see they're a little bit abstract and clunky, right? They don't exactly roll off the tongue. And given how insidious and connected all of these problems are, it helps to have a more visceral way of recognizing them.
05:27
So this is probably the only time you're going to hear about the Bible at this conference. But I want to tell you a quick story from it, because allegedly, back in the Canaanite days, there was a cult who wanted money and power so badly, they were willing to sacrifice their literal children for it. And they did this by burning them alive in an effigy of a god that they believed would then reward them for this ultimate sacrifice. And the name of this god was Moloch.

05:59
Bit of a bummer, as stories go. But you can see why it's an apt metaphor, because sometimes we get so lost in winning the game right in front of us, we lose sight of the bigger picture and sacrifice too much in our pursuit of victory.

06:17
So just like these guys were sacrificing their children for power, those influencers are sacrificing their happiness for likes. Those news editors are sacrificing their integrity for clicks, and polluters are sacrificing the biosphere for profit.

06:35
In all these examples, the short-term incentives of the games themselves are pushing, they're tempting their players to sacrifice more and more of their future, trapping them in a death spiral where they all lose in the end.

06:52
That's Moloch's trap. The mechanism of unhealthy competition.
06:58
And the same is now happening in the AI industry. We're all aware of the race that's heating up between companies right now over who can score the most compute, who can get the biggest funding round or get the top talent. Well, as more and more companies enter this race, the greater the pressure for everyone to go as fast as possible and sacrifice other important stuff like safety testing. This has all the hallmarks of a Moloch trap.

07:27
Because, like, imagine you're a CEO who, you know, in your heart of hearts, believes that your team is the best to be able to safely build extremely powerful AI. Well, if you go too slowly, then you run the risk of other, much less cautious teams getting there first and deploying their systems before you can. So that in turn pushes you to be more reckless yourself. And given how many different experts and researchers, both within these companies but also completely independent ones, have been warning us about the extreme risks of rushed AI, this approach is absolutely mad.

08:05
Plus, almost all AI companies are beholden to satisfying their investors, a short-term incentive which, over time, will inevitably start to conflict with any benevolent mission.

08:18
And this wouldn't be a big deal if this was really just toasters we're talking about here. But AI, and especially AGI, is set to be a bigger paradigm shift than the agricultural or industrial revolutions. A moment in time so pivotal, it's deserving of reverence and reflection, not something to be reduced to a corporate rat race of who can score the most daily active users.

08:43
I'm not saying I know what the right trade-off between acceleration and safety is, but I do know that we'll never find out what that right trade-off is if we let Moloch dictate it for us.
08:56
So what can we do? Well, the good news is we have managed to coordinate to escape some of Moloch's traps before. We managed to save the ozone layer from CFCs with the help of the Montreal Protocol. We managed to reduce the number of nuclear weapons on Earth by 80 percent, with the help of the Strategic Arms Reduction Treaty in 1991.

09:17
So smart regulation may certainly help with AI too, but ultimately, it's the players within the game who have the most influence on it. So we need AI leaders to show us that they're not only aware of the risks their technologies pose, but also the destructive nature of the incentives that they're currently beholden to. As their technological capabilities reach towards the power of gods, they're going to need the godlike wisdom to know how to wield them.

09:47
So it doesn't fill me with encouragement when I see a CEO of a very major company saying something like, "I want people to know we made our competitor dance." That is not the type of mindset we need here.
10:00
We need leaders who are willing to flip Moloch's playbook, who are willing to sacrifice their own individual chance of winning for the good of the whole.

10:10
Now, fortunately, the three leading labs are showing some signs of doing this. Anthropic recently announced their responsible scaling policy, which pledges to only increase capabilities once certain security criteria have been met. OpenAI have recently pledged to dedicate 20 percent of their compute purely to alignment research. And DeepMind have shown a decade-long focus of science ahead of commerce, like their development of AlphaFold, which they gave away to the science community for free. These are all steps in the right direction, but they are still nowhere close to being enough. I mean, most of these are currently just words, they're not even proven actions.

10:52
So we need a clear way to turn the AI race into a definitive race to the top. Perhaps companies can start competing over who can be within these metrics, over who can develop the best security criteria. A race of who can dedicate the most compute to alignment. Now that would truly flip the middle finger to Moloch.
11:14
Competition can be an amazing tool, provided we wield it wisely. And we're going to need to do that because the stakes we are playing for are astronomical. If we get AI, and especially AGI, wrong, it could lead to unimaginable catastrophe. But if we get it right, it could be our path out of many of these Moloch traps that I've mentioned today.

11:40
And as things get crazier over the coming years, which they're probably going to, it's going to be more important than ever that we remember that it is the real enemy here, Moloch. Not any individual CEO or company, and certainly not one another. So don't hate the players, change the game.

12:01
(Applause)