5 Ethical Principles for Digitizing Humanitarian Aid | Aarathi Krishnan | TED
35,320 views ・ 2022-07-12
Translator: Lilian Chiu
Reviewer: Shelley Tsang 曾雯海
00:04
Sociologist Zeynep Tufekci once said
00:08
that history is full of massive examples
00:12
of harm caused by people with great power
00:16
who felt that just because they believed
themselves to have good intentions,
00:21
they could not cause harm.
00:24
In 2017,
00:26
Rohingya refugees started
to flee Myanmar into Bangladesh
00:31
due to a crackdown
by the Myanmar military,
00:33
an act that the UN subsequently
said was carried out with genocidal intent.
00:39
As they started to arrive into camps,
00:42
they had to register
for a range of services.
00:45
One of these was to register
00:47
for a government-backed
digital biometric identification card.
00:51
They weren't actually given
the option to opt out.
00:56
In 2021, Human Rights Watch accused
international humanitarian agencies
01:02
of sharing improperly collected
information about Rohingya refugees
01:07
with the Myanmar government
without appropriate consent.
01:11
The information shared
didn't just contain biometrics.
01:15
It contained information
about family makeup, relatives overseas,
01:20
where they were originally from.
01:23
Amid fears of retaliation
by the Myanmar government,
01:27
some went into hiding.
01:29
Targeted identification
of persecuted peoples
01:33
has long been a tactic
of genocidal regimes.
01:37
But now that data is digitized,
meaning it is faster to access,
01:41
quicker to scale
and more readily accessible.
01:45
This was a failure
on a multitude of fronts:
01:47
institutional, governance, moral.
01:52
I have spent 15 years of my career
working in humanitarian aid.
01:56
From Rwanda to Afghanistan.
01:59
What is humanitarian aid, you might ask?
02:01
In its simplest terms,
it's the provision of emergency care
02:05
to those that need it the most
at desperate times.
02:08
Post-disaster, during a crisis.
Food, water, shelter.
02:14
I have worked within
very large humanitarian organizations,
02:18
whether that's leading
multicountry global programs
02:21
to designing drone innovations
for disaster management
02:25
across small island states.
02:29
I have sat with communities
in the most fragile of contexts,
02:35
where conversations about the future
are the first ones they've ever had.
02:40
And I have designed global strategies
to prepare humanitarian organizations
02:44
for these same futures.
02:47
And the one thing I can say
02:48
is that humanitarians,
we have embraced digitalization
02:52
at an incredible speed
over the last decade,
02:56
moving from tents and water cans,
02:59
which we still use, by the way,
03:01
to AI, big data, drones, biometrics.
03:06
These might seem
relevant, logical, needed,
03:10
even sexy to technology enthusiasts.
03:13
But what it actually is,
is the deployment of untested technologies
03:19
on vulnerable populations
without appropriate consent.
03:23
And this gives me pause.
03:26
I pause because the agonies
we are facing today
03:29
as a global humanity
03:31
didn't just happen overnight.
03:33
They happened as a result
of our shared history of colonialism
03:38
and humanitarian technology innovations
are inherently colonial,
03:43
often designed for
and in the good of groups of people
03:49
seen as outside of technology themselves,
03:52
and often not legitimately recognized
03:55
as being able to provide
for their own solutions.
03:58
And so, as a humanitarian myself,
I ask this question:
04:02
in our quest to do good in the world,
04:06
how can we ensure that we do not
lock people into future harm,
04:11
future indebtedness and future inequity
as a result of these actions?
04:17
It is why I now study the ethics
of humanitarian tech innovation.
04:21
And this isn't just
an intellectually curious pursuit.
04:26
It's a deeply personal one.
04:29
Driven by the belief that it is often
people that look like me,
04:33
that come from
the communities I come from,
04:35
historically excluded and marginalized,
04:39
that are often spoken on behalf of
04:43
and denied voice in terms of the choices
04:45
available to us for our future.
04:47
As I stand here on the shoulders
of all those that have come before me
04:52
and in obligation for all of those
that will come after me
04:56
to say to you that good intentions alone
do not prevent harm,
05:02
and good intentions alone can cause harm.
05:06
I'm often asked, what do I see
ahead of us in this next 21st century?
05:11
And if I had to sum it up:
05:14
deep uncertainty,
a dying planet, distrust, pain.
05:20
And in times of great volatility,
we as human beings, we yearn for a balm.
05:26
And digital futures
are exactly that, a balm.
05:30
We look at it in all of its possibility
05:32
as if it could soothe all that ails us,
like a logical inevitability.
05:39
In recent years,
reports have started to flag
05:43
the new types of risks that are emerging
about technology innovations.
05:48
One of these is around how data collected
on vulnerable individuals
05:54
can actually be used
against them as retaliation,
05:58
posing greater risk not just to them,
but to their families,
06:02
against their community.
06:05
We saw these risks
become a truth with the Rohingya.
06:09
And very, very recently, in August 2021,
as Afghanistan fell to the Taliban,
06:15
it also came to light
that biometric data collected on Afghans
06:20
by the US military
and the Afghan government
06:22
and used by a variety of actors
06:25
were now in the hands of the Taliban.
06:29
Journalists' houses were searched.
06:32
Afghans desperately raced against time
to erase their digital history online.
06:38
Technologies of empowerment then become
technologies of disempowerment.
06:45
It is because these technologies
06:46
are designed on a certain set
of societal assumptions,
06:51
embedded in markets and then filtered
through capitalist considerations.
06:56
But technologies created in one context
and then parachuted into another
07:02
will always fail
07:03
because it is based on assumptions
of how people lead their lives.
07:08
And whilst here, you and I
may be relatively comfortable
07:12
providing a fingertip scan
to perhaps go to the movies,
07:17
we cannot extrapolate that out
to the level of safety one would feel
07:22
while standing in line,
07:24
having to give up that little bit
of data about themselves
07:27
in order to access food rations.
07:31
Humanitarians assume
that technology will liberate humanity,
07:37
but without any due consideration
of issues of power, exploitation and harm
07:44
that can occur for this to happen.
07:46
Instead, we rush to solutionizing,
07:50
a form of magical thinking
07:51
that assumes that just
by deploying shiny solutions,
07:56
we can solve the problem in front of us
07:58
without any real analysis
of underlying realities.
08:03
These are tools at the end of the day,
08:06
and tools, like a chef's knife,
08:08
in the hands of some,
the creator of a beautiful meal,
08:13
and in the hands of others, devastation.
08:17
So how do we ensure that we do not design
08:20
the inequities of our past
into our digital futures?
08:26
And I want to be clear about one thing.
08:28
I'm not anti-tech. I am anti-dumb tech.
08:31
(Laughter)
08:33
(Applause)
08:38
The limited imaginings of the few
08:40
should not colonize
the radical re-imaginings of the many.
08:45
So how then do we ensure
that we design an ethical baseline,
08:50
so that the liberation that this promises
is not just for a privileged few,
08:56
but for all of us?
08:59
There are a few examples
that can point to a way forward.
09:03
I love the work of Indigenous AI
09:07
that instead of drawing
from Western values and philosophies,
09:10
it draws from Indigenous
protocols and values
09:12
to embed into AI code.
09:15
I also really love the work of Nia Tero,
09:18
an Indigenous co-led organization
that works with Indigenous communities
09:22
to map their own well-being
and territories
09:25
as opposed to other people
coming in to do it on their behalf.
09:29
I've learned a lot from the Satellite
Sentinel Project back in 2010,
09:34
which is a slightly different example.
09:36
The project started essentially
to map atrocities
09:42
through remote sensing
technologies, satellites,
09:45
in order to be able to predict
and potentially prevent them.
09:48
Now the project wound down
after a few years
09:52
for a variety of reasons,
09:54
one of which being that it couldn’t
actually generate action.
09:57
But the second, and probably
the most important,
10:01
was that the team realized they
were operating without an ethical net.
10:07
And without ethical guidelines in place,
10:10
it was a very wide
open line of questioning
10:14
about whether what they were doing
was helpful or harmful.
10:19
And so they decided to wind down
before creating harm.
10:24
In the absence of legally binding
ethical frameworks
10:29
to guide our work,
10:32
I have been working
on a range of ethical principles
10:35
to help inform
humanitarian tech innovation,
10:37
and I'd like to put forward
a few of these here for you today.
10:41
One: Ask.
10:43
Which groups of humans
will be harmed by this and when?
10:48
Assess: Who does this solution
actually benefit?
10:53
Interrogate: Was appropriate consent
obtained from the end users?
11:00
Consider: What must we
gracefully exit out of
11:05
to be fit for these futures?
11:07
And imagine: What future good
might we foreclose
11:12
if we implemented this action today?
11:16
We are accountable
for the futures that we create.
11:20
We cannot absolve ourselves
of the responsibilities
11:24
and accountabilities of our actions
11:26
if our actions actually cause harm
11:29
to those that we purport
to protect and serve.
11:32
Another world is absolutely,
radically possible.
11:37
Thank you.
11:39
(Applause)
179
699070
2294
(掌聲)
New videos
Original video on YouTube.com
關於本網站
本網站將向您介紹對學習英語有用的 YouTube 視頻。 您將看到來自世界各地的一流教師教授的英語課程。 雙擊每個視頻頁面上顯示的英文字幕,從那裡播放視頻。 字幕與視頻播放同步滾動。 如果您有任何意見或要求,請使用此聯繫表與我們聯繫。