5 Ethical Principles for Digitizing Humanitarian Aid | Aarathi Krishnan | TED

00:04
Sociologist Zeynep Tufekci once said that history is full of massive examples of harm caused by people with great power who felt that, just because they felt themselves to have good intentions, they could not cause harm.

00:24
In 2017, Rohingya refugees started to flee Myanmar into Bangladesh due to a crackdown by the Myanmar military, an act that the UN subsequently described as being of genocidal intent. As they started to arrive in camps, they had to register for a range of services. One of these was to register for a government-backed digital biometric identification card. They weren't actually given the option to opt out.

00:56
In 2021, Human Rights Watch accused international humanitarian agencies of sharing improperly collected information about Rohingya refugees with the Myanmar government without appropriate consent. The information shared didn't just contain biometrics. It contained information about family makeup, relatives overseas, where they were originally from. Sparking fears of retaliation by the Myanmar government, some went into hiding.

01:29
Targeted identification of persecuted peoples has long been a tactic of genocidal regimes. But now that data is digitized, meaning it is faster to access, quicker to scale and more readily accessible. This was a failure on a multitude of fronts: institutional, governance, moral.

01:52
I have spent 15 years of my career working in humanitarian aid, from Rwanda to Afghanistan. What is humanitarian aid, you might ask? In its simplest terms, it's the provision of emergency care to those that need it the most at desperate times. Post-disaster, during a crisis. Food, water, shelter.

02:14
I have worked within very large humanitarian organizations, whether that's leading multicountry global programs or designing drone innovations for disaster management across small island states. I have sat with communities in the most fragile of contexts, where conversations about the future are the first ones they've ever had. And I have designed global strategies to prepare humanitarian organizations for these same futures.

02:47
And the one thing I can say is that humanitarians, we have embraced digitalization at an incredible speed over the last decade, moving from tents and water cans, which we still use, by the way, to AI, big data, drones, biometrics. These might seem relevant, logical, needed, even sexy to technology enthusiasts. But what it actually is, is the deployment of untested technologies on vulnerable populations without appropriate consent. And this gives me pause.

03:26
I pause because the agonies we are facing today as a global humanity didn't just happen overnight. They happened as a result of our shared history of colonialism, and humanitarian technology innovations are inherently colonial, often designed for and in the good of groups of people seen as outside of technology themselves, and often not legitimately recognized as being able to provide for their own solutions.

03:58
And so, as a humanitarian myself, I ask this question: in our quest to do good in the world, how can we ensure that we do not lock people into future harm, future indebtedness and future inequity as a result of these actions?

04:17
It is why I now study the ethics of humanitarian tech innovation. And this isn't just an intellectually curious pursuit. It's a deeply personal one, driven by the belief that it is often people that look like me, that come from the communities I come from, historically excluded and marginalized, that are often spoken on behalf of and denied voice in terms of the choices available to us for our future. I stand here on the shoulders of all those that have come before me, and in obligation to all of those that will come after me, to say to you that good intentions alone do not prevent harm, and good intentions alone can cause harm.

05:06
I'm often asked, what do I see ahead of us in this next 21st century? And if I had to sum it up: deep uncertainty, a dying planet, distrust, pain. And in times of great volatility, we as human beings, we yearn for a balm. And digital futures are exactly that, a balm. We look at it in all of its possibility as if it could soothe all that ails us, like a logical inevitability.

05:39
In recent years, reports have started to flag the new types of risks that are emerging around technology innovations. One of these is how data collected on vulnerable individuals can actually be used against them as retaliation, posing greater risk not just against them, but against their families, against their community.

06:05
We saw these risks become a truth with the Rohingya. And very, very recently, in August 2021, as Afghanistan fell to the Taliban, it also came to light that biometric data collected on Afghans by the US military and the Afghan government, and used by a variety of actors, were now in the hands of the Taliban. Journalists' houses were searched. Afghans desperately raced against time to erase their digital history online.

06:38
Technologies of empowerment then become technologies of disempowerment. It is because these technologies are designed on a certain set of societal assumptions, embedded in markets and then filtered through capitalist considerations. But technologies created in one context and then parachuted into another will always fail, because they are based on assumptions about how people lead their lives. And whilst here, you and I may be relatively comfortable providing a fingertip scan to perhaps go to the movies, we cannot extrapolate that out to the level of safety one would feel while standing in line, having to give up that little bit of data about themselves in order to access food rations.

07:31
Humanitarians assume that technology will liberate humanity, but without any due consideration of the issues of power, exploitation and harm that can occur in the process. Instead, we rush to solutionizing, a form of magical thinking that assumes that just by deploying shiny solutions, we can solve the problem in front of us without any real analysis of the underlying realities.

08:03
These are tools at the end of the day, and tools, like a chef's knife, in the hands of some create a beautiful meal, and in the hands of others, devastation.

08:17
So how do we ensure that we do not design the inequities of our past into our digital futures? And I want to be clear about one thing. I'm not anti-tech. I am anti-dumb tech.

(Laughter)

(Applause)

08:38
The limited imaginings of the few should not colonize the radical re-imaginings of the many. So how then do we ensure that we design an ethical baseline, so that the liberation that this promises is not just for a privileged few, but for all of us?

08:59
There are a few examples that can point to a way forward. I love the work of Indigenous AI, which, instead of drawing from Western values and philosophies, draws from Indigenous protocols and values to embed into AI code. I also really love the work of Nia Tero, an Indigenous co-led organization that works with Indigenous communities to map their own well-being and territories, as opposed to other people coming in to do it on their behalf.

09:29
I've learned a lot from the Satellite Sentinel Project back in 2010, which is a slightly different example. The project started essentially to map atrocities through remote sensing technologies, satellites, in order to be able to predict and potentially prevent them. Now, the project wound down after a few years for a variety of reasons, one of which being that it couldn't actually generate action. But the second, and probably the most important, was that the team realized they were operating without an ethical net. And without ethical guidelines in place, it left wide open the question of whether what they were doing was helpful or harmful. And so they decided to wind down before creating harm.

10:24
In the absence of legally binding ethical frameworks to guide our work, I have been working on a range of ethical principles to help inform humanitarian tech innovation, and I'd like to put forward a few of these here for you today.

10:41
One: Ask. Which groups of humans will be harmed by this, and when? Assess: Who does this solution actually benefit? Interrogate: Was appropriate consent obtained from the end users? Consider: What must we gracefully exit out of to be fit for these futures? And imagine: What future good might we foreclose if we implemented this action today?

11:16
We are accountable for the futures that we create. We cannot absolve ourselves of the responsibilities and accountabilities of our actions if our actions actually cause harm to those that we purport to protect and serve. Another world is absolutely, radically possible.

11:37
Thank you.

(Applause)