How AI and Democracy Can Fix Each Other | Divya Siddarth | TED

29,927 views ・ 2024-03-05

TED



Translator: Lening Xu · Reviewer: Yip Yan Yeung
Recently I told someone my work is on democracy and technology. He turned to me and said, “Wow... I’m sorry.”

(Laughter)

But I love my work. I know that we can build a world where technological marvels are directed towards people's benefit, using their input.

We have gotten so used to seeing democracy as a problem to be solved. But I see democracy as a solution, not as a problem.

Democracy was once a radical political project, itself a cutting-edge social technology, a new way to answer the very question we are faced with now, in the era of artificial intelligence: how do we use our capabilities to live well together?
We are told that transformative technologies like AI are too complicated, or too risky, or too important to be governed democratically. But this is precisely why they must be. If existing democracy is unequal to the task, our job is not to give up on it. Our job is to evolve it. And to use technology as an asset to help us do so.
Still, I understand his doubts. I never meant to build my life around new forms of democracy. I started out just really believing in the power of science. I was modifying DNA in my kitchen at 12, and when I got to Stanford as a computational biology major, I was converted to a new belief -- technology. I truly believed in the power of tech to change the world. Maybe, like many of you.
But I saw that the technologies that really made a difference were the ones that were built with and for the collective. Not the billions of dollars pumped into the 19th addiction-fueling social app. But the projects that combine creating something truly new with building in ways for people to access, benefit from and direct it. Instead of social media, think of the internet. Built with public resources on open standards.
This is what brought me to democracy. Technology expands what we are capable of. Democracy is how we decide what to do with that capability.
Since then, I've worked on using democracy as a solution in India, the US, the UK, Taiwan. I've worked alongside incredible collaborators to use democracy to help solve COVID, to help solve data rights. And as I'll tell you today, to help solve AI governance with policymakers around the world and cutting-edge technology companies like OpenAI and Anthropic.
How? By recognizing that democracy is still in its infancy. It is an early form of collective intelligence, a way to put together decentralized input from diverse sources and produce decisions that are better than the sum of their parts.
That’s why, when my fantastic cofounder Saffron Huang and I left our jobs at Google DeepMind and Microsoft to build new democratic governance models for transformative tech, I named our nonprofit the Collective Intelligence Project, as a nod to the ever-evolving project of building collective intelligence for collective flourishing.
Since then we've done just that, building new collective intelligence models to direct artificial intelligence, to run democratic processes. And we've incorporated the voices of thousands of people into AI governance. Here are a few of the things we've learned.
First, people are willing and able to have difficult, complex conversations on nuanced topics. When we asked people about the risks of AI they were most concerned about, they didn't reach for easy answers. Out of more than 100 risks put forward, the top-cited one: overreliance on systems we don't understand. We talked to people across the country, from a veteran in the Midwest to a young teacher in the South. People were excited about the possibilities of this technology, but there were specific things they wanted to understand about what models were capable of before seeing them deployed in the world. A lot more reasonable than many of the policy conversations that we're in.
And importantly, we saw very little of the polarization we're always hearing about. On average, just a few divisive statements for hundreds of consensus statements. Even on the contentious issues of the day, like free speech or race and gender, we saw far more agreement than disagreement. Almost three quarters of people agree that AI should protect free speech. Ninety percent agree that AI should not be racist or sexist. Only around 50 percent think that AI should be funny, though, so there are still contentious issues out there.
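To make the consensus-versus-divisive distinction concrete, here is a minimal Python sketch of how agree/disagree votes on statements might be tallied. The threshold, the vote data and the classification rule are illustrative assumptions, not the Collective Intelligence Project's actual methodology.

```python
# Hypothetical sketch: classifying statements as "consensus" or "divisive"
# from agree/disagree votes. Threshold and vote data are illustrative only.

votes = {
    # statement -> list of votes: +1 agree, -1 disagree, 0 pass
    "AI should protect free speech":     [+1, +1, +1, -1, +1, +1, 0, +1],
    "AI should not be racist or sexist": [+1, +1, +1, +1, +1, +1, +1, -1],
    "AI should be funny":                [+1, -1, +1, -1, 0, +1, -1, -1],
}

def classify(statement_votes, consensus_threshold=0.7):
    """Label a statement by the share of non-pass voters who agree."""
    cast = [v for v in statement_votes if v != 0]
    if not cast:
        return "no signal", 0.0
    agree_rate = sum(1 for v in cast if v > 0) / len(cast)
    # Strong agreement *or* strong disagreement both count as consensus;
    # anything in between is treated as divisive.
    if agree_rate >= consensus_threshold or agree_rate <= 1 - consensus_threshold:
        return "consensus", agree_rate
    return "divisive", agree_rate

for statement, vs in votes.items():
    label, rate = classify(vs)
    print(f"{label:9s} {rate:5.0%} agree  {statement}")
```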
These last statistics are from our collective constitution project with Anthropic, where we retrained one of the world's most powerful language models on principles written by 1,000 representative Americans. Not AI developers or regulators or researchers at elite universities. We built on a way of training AI that relies on a written set of principles, or a constitution; we asked ordinary people to cowrite this constitution, and we compared it to a model that researchers had come up with.
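The constitution-based training referred to here has the general shape of a critique-and-revise loop: a model drafts a response, critiques it against each written principle, and rewrites it, and those revised outputs feed back into training. Below is a minimal, stubbed sketch of that loop; the `generate` function and the sample principles are placeholders, not the project's actual model or the publicly cowritten constitution.

```python
# Minimal sketch of constitution-guided critique-and-revision.
# `generate` is a stub standing in for a real language-model call; the
# principles below are illustrative, not the cowritten constitution itself.

PUBLIC_CONSTITUTION = [
    "Choose the response that best protects free speech.",
    "Choose the response that is least racist or sexist.",
    "Choose the response that avoids encouraging overreliance on the AI.",
]

def generate(prompt: str) -> str:
    """Stub standing in for a call to a language model."""
    return f"[model output for: {prompt[:60]}...]"

def constitutional_revision(user_prompt: str) -> str:
    """Draft a response, then critique and revise it against each principle.
    The resulting (prompt, revision) pairs would be used as training data."""
    response = generate(user_prompt)
    for principle in PUBLIC_CONSTITUTION:
        critique = generate(
            f"Principle: {principle}\nResponse: {response}\n"
            "Point out any way the response violates this principle."
        )
        response = generate(
            "Rewrite the response so it follows the principle.\n"
            f"Principle: {principle}\nCritique: {critique}\nResponse: {response}"
        )
    return response

if __name__ == "__main__":
    print(constitutional_revision("Should I let an AI make this decision for me?"))
```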
When we started this project, I wasn't sure what to expect. Maybe the naysayers were right. AI is complicated. Maybe people wouldn't understand what we were asking them. Maybe we'd end up with something awful. But the people’s model, trained on the cowritten constitution, was just as capable as, and fairer than, the model the researchers had come up with.
People with little to no experience in AI did better than researchers, who work on this full-time, in building a fairer chatbot.

Maybe I shouldn't have been surprised. As one of our participants from another process said, "They may be experts in AI, but I have eight grandchildren. I know how to pick good values."
If technology expands what we are capable of and democracy is how we decide what to do with that capability, here is early evidence that democracy can do a good job deciding.
Of course, these processes aren't enough. Collective intelligence requires a broader reimagining of technology and democracy. That’s why we’re also working on co-ownership models for the data that AI is built on -- which, after all, belongs to all of us -- and using AI itself to create new and better decision-making processes. Taking advantage of the things that language models can do that humans can’t, like processing huge amounts of text input.
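One concrete reading of "processing huge amounts of text input" is batch summarization: condense submissions in chunks, then condense the chunk summaries into a single brief. The sketch below is a hypothetical illustration with a stubbed `summarize` call, not the project's pipeline.

```python
# Hypothetical sketch: condensing thousands of free-text submissions with a
# language model by summarizing in batches, then summarizing the summaries.
# `summarize` is a stub standing in for a real model call.

from typing import List

def summarize(texts: List[str]) -> str:
    """Stub for a language-model summarization call."""
    return f"[summary of {len(texts)} items]"

def condense(submissions: List[str], batch_size: int = 50) -> str:
    """Map-reduce style condensation of public input into one brief."""
    batch_summaries = [
        summarize(submissions[i:i + batch_size])
        for i in range(0, len(submissions), batch_size)
    ]
    # Reduce step: one final pass over the per-batch summaries.
    return summarize(batch_summaries)

if __name__ == "__main__":
    fake_input = [f"Citizen comment #{n} about AI" for n in range(1, 1001)]
    print(condense(fake_input))
```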
Our work in Taiwan has been an incredible test bed for all of this. Along with Minister Audrey Tang and the Ministry of Digital Affairs, we are working on processes to ask Taiwan's millions of citizens what they actually want to see as a future with AI. And using that input not just to legislate, but to build. Because one thing that has already come out of these processes is that people are truly excited about a public option for AI, one that is built on shared public data that is reliably safe, that allows communities to access, benefit from and adjust it to their needs.

This is what the world of technology could look like. Steered by the many for the many.
I often find that we accept unnecessary trade-offs when it comes to transformative tech. We are told that we might need to sacrifice democracy for the sake of technological progress, that we have no choice but to concentrate power to keep ourselves safe from possible risks. This is wrong. It is impossible to have any one of these things -- progress, safety or democratic participation -- without the others. If we resign ourselves to only two of the three, we will end up with either centralized control or chaos. Either a few people get to decide or no one does. These are both terrible outcomes, and our work shows that there is another way.
Each of our projects advanced progress, safety and democratic participation by building cutting-edge democratic AI models, by using public expertise as a way to understand diffuse risks, and by imagining co-ownership models for the digital commons.
We are so far from the best collective intelligence systems we could have. If we started over on building a decision-making process for the world, what would we choose? Maybe we'd be better at separating financial power from political power. Maybe we'd create thousands of new models of corporations or bureaucracies. Maybe we'd build in the voices of natural elements or future generations.
Here's a secret. In some ways, we are always starting from scratch. New technologies usher in new paradigms that can come with new collective intelligence systems. We can create new ways of living well together if we use these brief openings for change.
The story of technology and democracy is far from over. It doesn't have to be this way. Things could be unimaginably better. As the Indian author Arundhati Roy once said, "Another world is not only possible, she is on her way. On a quiet day, I can hear her breathing."
I can hear our new world breathing. One in which we shift the systems we have towards using the solution of democracy to build the worlds we want to see. The future is up to us. We have a world to win. Thank you.

(Applause)