5 Ethical Principles for Digitizing Humanitarian Aid | Aarathi Krishnan | TED

35,320 views ・ 2022-07-12

TED



Translator: sylvia feng  Reviewer: Yanyan Hong
00:04
Sociologist Zeynep Tufekci once said
00:08
that history is full of massive examples
00:12
of harm caused by people with great power
00:16
who felt that just because they had good intentions,
00:21
they could not cause harm.
00:24
In 2017,
00:26
Rohingya refugees started to flee Myanmar into Bangladesh
00:31
due to a crackdown by the Myanmar military,
00:33
an act that the UN subsequently said was of genocidal intent.
00:39
As they started to arrive in camps,
00:42
they had to register for a range of services.
00:45
One of these was to register
00:47
for a government-backed digital biometric identification card.
00:51
They weren't actually given the option to opt out.
00:56
In 2021, Human Rights Watch accused international humanitarian agencies
01:02
of sharing improperly collected information about Rohingya refugees
01:07
with the Myanmar government without appropriate consent.
01:11
The information shared didn't just contain biometrics.
01:15
It contained information about family makeup, relatives overseas,
01:20
where they were originally from.
01:23
Sparking fears of retaliation by the Myanmar government,
01:27
some went into hiding.
01:29
Targeted identification of persecuted peoples
01:33
has long been a tactic of genocidal regimes.
01:37
But now that data is digitized, meaning it is faster to access,
01:41
quicker to scale and more readily accessible.
01:45
This was a failure on a multitude of fronts:
01:47
institutional, governance, moral.
01:52
I have spent 15 years of my career working in humanitarian aid,
01:56
from Rwanda to Afghanistan.
01:59
What is humanitarian aid, you might ask?
02:01
In its simplest terms, it's the provision of emergency care
02:05
to those that need it the most at desperate times.
02:08
Post-disaster, during a crisis: food, water, shelter.
02:14
I have worked within very large humanitarian organizations,
02:18
whether that's leading multicountry global programs
02:21
or designing drone innovations for disaster management
02:25
across small island states.
02:29
I have sat with communities in the most fragile of contexts,
02:35
where conversations about the future are the first ones they've ever had.
02:40
And I have designed global strategies to prepare humanitarian organizations
02:44
for these same futures.
02:47
And the one thing I can say
02:48
is that we humanitarians have embraced digitalization
02:52
at an incredible speed over the last decade,
02:56
moving from tents and water cans,
02:59
which we still use, by the way,
03:01
to AI, big data, drones, biometrics.
03:06
These might seem relevant, logical, needed,
03:10
even sexy to technology enthusiasts.
03:13
But what it actually is, is the deployment of untested technologies
03:19
on vulnerable populations without appropriate consent.
03:23
And this gives me pause.
03:26
I pause because the agonies we are facing today
03:29
as a global humanity
03:31
didn't just happen overnight.
03:33
They happened as a result of our shared history of colonialism,
03:38
and humanitarian technology innovations are inherently colonial,
03:43
often designed for and in the good of groups of people
03:49
seen as outside of technology themselves,
03:52
and often not legitimately recognized
03:55
as being able to provide their own solutions.
03:58
And so, as a humanitarian myself, I ask this question:
04:02
in our quest to do good in the world,
04:06
how can we ensure that we do not lock people into future harm,
04:11
future indebtedness and future inequity as a result of these actions?
04:17
It is why I now study the ethics of humanitarian tech innovation.
04:21
And this isn't just an intellectually curious pursuit.
04:26
It's a deeply personal one,
04:29
driven by the belief that it is often people that look like me,
04:33
that come from the communities I come from,
04:35
historically excluded and marginalized,
04:39
that are often spoken on behalf of
04:43
and denied voice in terms of the choices
04:45
available to us for our future.
04:47
I stand here on the shoulders of all those that have come before me,
04:52
and in obligation to all of those that will come after me,
04:56
to say to you that good intentions alone do not prevent harm,
05:02
and good intentions alone can cause harm.
05:06
I'm often asked, what do I see ahead of us in this next 21st century?
05:11
And if I had to sum it up:
05:14
deep uncertainty, a dying planet, distrust, pain.
05:20
And in times of great volatility, we as human beings yearn for a balm.
05:26
And digital futures are exactly that, a balm.
05:30
We look at them in all of their possibility,
05:32
as if they could soothe all that ails us, like a logical inevitability.
05:39
In recent years, reports have started to flag
05:43
the new types of risks emerging from technology innovations.
05:48
One of these is how data collected on vulnerable individuals
05:54
can actually be used against them as retaliation,
05:58
posing greater risk not just against them, but against their families,
06:02
against their community.
06:05
We saw these risks become a reality with the Rohingya.
06:09
And very, very recently, in August 2021, as Afghanistan fell to the Taliban,
06:15
it also came to light that biometric data collected on Afghans
06:20
by the US military and the Afghan government,
06:22
and used by a variety of actors,
06:25
was now in the hands of the Taliban.
06:29
Journalists' houses were searched.
06:32
Afghans desperately raced against time to erase their digital history online.
06:38
Technologies of empowerment then become technologies of disempowerment.
06:45
It is because these technologies
06:46
are designed on a certain set of societal assumptions,
06:51
embedded in markets and then filtered through capitalist considerations.
06:56
But technologies created in one context and then parachuted into another
07:02
will always fail,
07:03
because they are based on assumptions about how people lead their lives.
07:08
And whilst here, you and I may be relatively comfortable
07:12
providing a fingertip scan to perhaps go to the movies,
07:17
we cannot extrapolate that out to the level of safety one would feel
07:22
while standing in line,
07:24
having to give up that little bit of data about themselves
07:27
in order to access food rations.
07:31
Humanitarians assume that technology will liberate humanity,
07:37
but without any due consideration of the issues of power, exploitation and harm
07:44
that can occur in the process.
07:46
Instead, we rush to solutionizing,
07:50
a form of magical thinking
07:51
that assumes that just by deploying shiny solutions,
07:56
we can solve the problem in front of us
07:58
without any real analysis of underlying realities.
08:03
These are tools at the end of the day,
08:06
and tools, like a chef's knife,
08:08
in the hands of some, create a beautiful meal,
08:13
and in the hands of others, devastation.
08:17
So how do we ensure that we do not design
08:20
the inequities of our past into our digital futures?
08:26
And I want to be clear about one thing:
08:28
I'm not anti-tech. I am anti-dumb tech.
08:31
(Laughter)
08:33
(Applause)
08:38
The limited imaginings of the few
08:40
should not colonize the radical re-imaginings of the many.
08:45
So how then do we ensure that we design an ethical baseline,
08:50
so that the liberation that this promises is not just for a privileged few,
08:56
but for all of us?
08:59
There are a few examples that can point to a way forward.
09:03
I love the work of Indigenous AI,
09:07
which, instead of drawing from Western values and philosophies,
09:10
draws from Indigenous protocols and values
09:12
to embed into AI code.
09:15
I also really love the work of Nia Tero,
09:18
an Indigenous co-led organization that works with Indigenous communities
09:22
to map their own well-being and territories,
09:25
as opposed to other people coming in to do it on their behalf.
09:29
I've learned a lot from the Satellite Sentinel Project back in 2010,
09:34
which is a slightly different example.
09:36
The project started essentially to map atrocities
09:42
through remote sensing technologies, satellites,
09:45
in order to be able to predict and potentially prevent them.
09:48
Now, the project wound down after a few years
09:52
for a variety of reasons,
09:54
one of which being that it couldn't actually generate action.
09:57
But the second, and probably the most important,
10:01
was that the team realized they were operating without an ethical net.
10:07
And without ethical guidelines in place,
10:10
there was a very wide open line of questioning
10:14
about whether what they were doing was helpful or harmful.
10:19
And so they decided to wind down before creating harm.
10:24
In the absence of legally binding ethical frameworks
10:29
to guide our work,
10:32
I have been working on a range of ethical principles
10:35
to help inform humanitarian tech innovation,
10:37
and I'd like to put forward a few of these here for you today.
10:41
One: Ask.
10:43
Which groups of humans will be harmed by this and when?
10:48
Assess: Who does this solution actually benefit?
10:53
Interrogate: Was appropriate consent obtained from the end users?
11:00
Consider: What must we gracefully exit out of
11:05
to be fit for these futures?
11:07
And imagine: What future good might we foreclose
11:12
if we implemented this action today?
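The five principles just listed (ask, assess, interrogate, consider, imagine) form a review protocol that could be run against any proposed deployment. As a purely illustrative sketch — the names, data structure, and `review` function are my own, not part of any published framework from the talk — they might be encoded as a pre-deployment checklist like this:

```python
# Hypothetical encoding of the talk's five ethical review questions.
# All identifiers here are illustrative, not an official API.
PRINCIPLES = [
    ("ask", "Which groups of humans will be harmed by this, and when?"),
    ("assess", "Who does this solution actually benefit?"),
    ("interrogate", "Was appropriate consent obtained from the end users?"),
    ("consider", "What must we gracefully exit out of to be fit for these futures?"),
    ("imagine", "What future good might we foreclose if we implemented this today?"),
]

def review(answers):
    """Return the keys of principles left unanswered.

    An empty result means only that the checklist is complete,
    not that the project is harmless.
    """
    return [key for key, _question in PRINCIPLES if not answers.get(key)]

# Usage: a project team that has only considered benefit and consent
# would still owe answers for "ask", "consider" and "imagine".
missing = review({
    "assess": "refugees receiving food rations",
    "interrogate": "opt-in obtained at registration",
})
```

The point of the checklist shape is that the questions are asked before deployment, and an unanswered question blocks the rollout rather than being deferred.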
11:16
We are accountable for the futures that we create.
11:20
We cannot absolve ourselves of the responsibilities
11:24
and accountabilities of our actions
11:26
if our actions actually cause harm
11:29
to those that we purport to protect and serve.
11:32
Another world is absolutely, radically possible.
11:37
Thank you.
11:39
(Applause)