5 Ethical Principles for Digitizing Humanitarian Aid | Aarathi Krishnan | TED

35,630 views

2022-07-12 ・ TED


00:04  Sociologist Zeynep Tufekci once said
00:08  that history is full of massive examples
00:12  of harm caused by people with great power
00:16  who felt that just because they felt themselves to have good intentions,
00:21  that they could not cause harm.
00:24  In 2017,
00:26  Rohingya refugees started to flee Myanmar into Bangladesh
00:31  due to a crackdown by the Myanmar military,
00:33  an act that the UN subsequently called of genocidal intent.
00:39  As they started to arrive into camps,
00:42  they had to register for a range of services.
00:45  One of this was to register
00:47  for a government-backed digital biometric identification card.
00:51  They weren't actually given the option to opt out.
00:56  In 2021, Human Rights Watch accused international humanitarian agencies
01:02  of sharing improperly collected information about Rohingya refugees
01:07  with the Myanmar government without appropriate consent.
01:11  The information shared didn't just contain biometrics.
01:15  It contained information about family makeup, relatives overseas,
01:20  where they were originally from.
01:23  Sparking fears of retaliation by the Myanmar government,
01:27  some went into hiding.
01:29  Targeted identification of persecuted peoples
01:33  has long been a tactic of genocidal regimes.
01:37  But now that data is digitized, meaning it is faster to access,
01:41  quicker to scale and more readily accessible.
01:45  This was a failure on a multitude of fronts:
01:47  institutional, governance, moral.
01:52  I have spent 15 years of my career working in humanitarian aid.
01:56  From Rwanda to Afghanistan.
01:59  What is humanitarian aid, you might ask?
02:01  In its simplest terms, it's the provision of emergency care
02:05  to those that need it the most at desperate times.
02:08  Post-disaster, during a crisis. Food, water, shelter.
02:14  I have worked within very large humanitarian organizations,
02:18  whether that's leading multicountry global programs
02:21  to designing drone innovations for disaster management
02:25  across small island states.
02:29  I have sat with communities in the most fragile of contexts,
02:35  where conversations about the future are the first ones they've ever had.
02:40  And I have designed global strategies to prepare humanitarian organizations
02:44  for these same futures.
02:47  And the one thing I can say
02:48  is that humanitarians, we have embraced digitalization
02:52  at an incredible speed over the last decade,
02:56  moving from tents and water cans,
02:59  which we still use, by the way,
03:01  to AI, big data, drones, biometrics.
03:06  These might seem relevant, logical, needed,
03:10  even sexy to technology enthusiasts.
03:13  But what it actually is, is the deployment of untested technologies
03:19  on vulnerable populations without appropriate consent.
03:23  And this gives me pause.
03:26  I pause because the agonies we are facing today
03:29  as a global humanity
03:31  didn't just happen overnight.
03:33  They happened as a result of our shared history of colonialism
03:38  and humanitarian technology innovations are inherently colonial,
03:43  often designed for and in the good of groups of people
03:49  seen as outside of technology themselves,
03:52  and often not legitimately recognized
03:55  as being able to provide for their own solutions.
03:58  And so, as a humanitarian myself, I ask this question:
04:02  in our quest to do good in the world,
04:06  how can we ensure that we do not lock people into future harm,
04:11  future indebtedness and future inequity as a result of these actions?
04:17  It is why I now study the ethics of humanitarian tech innovation.
04:21  And this isn't just an intellectually curious pursuit.
04:26  It's a deeply personal one.
04:29  Driven by the belief that it is often people that look like me,
04:33  that come from the communities I come from,
04:35  historically excluded and marginalized,
04:39  that are often spoken on behalf of
04:43  and denied voice in terms of the choices
04:45  available to us for our future.
04:47  As I stand here on the shoulders of all those that have come before me
04:52  and in obligation for all of those that will come after me
04:56  to say to you that good intentions alone do not prevent harm,
05:02  and good intentions alone can cause harm.
05:06  I'm often asked, what do I see ahead of us in this next 21st century?
05:11  And if I had to sum it up:
05:14  of deep uncertainty, a dying planet, distrust, pain.
05:20  And in times of great volatility, we as human beings, we yearn for a balm.
05:26  And digital futures are exactly that, a balm.
05:30  We look at it in all of its possibility
05:32  as if it could soothe all that ails us, like a logical inevitability.
05:39  In recent years, reports have started to flag
05:43  the new types of risks that are emerging about technology innovations.
05:48  One of this is around how data collected on vulnerable individuals
05:54  can actually be used against them as retaliation,
05:58  posing greater risk not just against them, but against their families,
06:02  against their community.
06:05  We saw these risks become a truth with the Rohingya.
06:09  And very, very recently, in August 2021, as Afghanistan fell to the Taliban,
06:15  it also came to light that biometric data collected on Afghans
06:20  by the US military and the Afghan government
06:22  and used by a variety of actors
06:25  were now in the hands of the Taliban.
06:29  Journalists' houses were searched.
06:32  Afghans desperately raced against time to erase their digital history online.
06:38  Technologies of empowerment then become technologies of disempowerment.
06:45  It is because these technologies
06:46  are designed on a certain set of societal assumptions,
06:51  embedded in market and then filtered through capitalist considerations.
06:56  But technologies created in one context and then parachuted into another
07:02  will always fail
07:03  because it is based on assumptions of how people lead their lives.
07:08  And whilst here, you and I may be relatively comfortable
07:12  providing a fingertip scan to perhaps go to the movies,
07:17  we cannot extrapolate that out to the level of safety one would feel
07:22  while standing in line,
07:24  having to give up that little bit of data about themselves
07:27  in order to access food rations.
07:31  Humanitarians assume that technology will liberate humanity,
07:37  but without any due consideration of issues of power, exploitation and harm
07:44  that can occur for this to happen.
07:46  Instead, we rush to solutionizing,
07:50  a form of magical thinking
07:51  that assumes that just by deploying shiny solutions,
07:56  we can solve the problem in front of us
07:58  without any real analysis of underlying realities.
08:03  These are tools at the end of the day,
08:06  and tools, like a chef's knife,
08:08  in the hands of some, the creator of a beautiful meal,
08:13  and in the hands of others, devastation.
08:17  So how do we ensure that we do not design
08:20  the inequities of our past into our digital futures?
08:26  And I want to be clear about one thing.
08:28  I'm not anti-tech. I am anti-dumb tech.
08:31  (Laughter)
08:33  (Applause)
08:38  The limited imaginings of the few
08:40  should not colonize the radical re-imaginings of the many.
08:45  So how then do we ensure that we design an ethical baseline,
08:50  so that the liberation that this promises is not just for a privileged few,
08:56  but for all of us?
08:59  There are a few examples that can point to a way forward.
09:03  I love the work of Indigenous AI
09:07  that instead of drawing from Western values and philosophies,
09:10  it draws from Indigenous protocols and values
09:12  to embed into AI code.
09:15  I also really love the work of Nia Tero,
09:18  an Indigenous co-led organization that works with Indigenous communities
09:22  to map their own well-being and territories
09:25  as opposed to other people coming in to do it on their behalf.
09:29  I've learned a lot from the Satellite Sentinel Project back in 2010,
09:34  which is a slightly different example.
09:36  The project started essentially to map atrocities
09:42  through remote sensing technologies, satellites,
09:45  in order to be able to predict and potentially prevent them.
09:48  Now the project wound down after a few years
09:52  for a variety of reasons,
09:54  one of which being that it couldn't actually generate action.
09:57  But the second, and probably the most important,
10:01  was that the team realized they were operating without an ethical net.
10:07  And without ethical guidelines in place,
10:10  it was a very wide open line of questioning
10:14  about whether what they were doing was helpful or harmful.
10:19  And so they decided to wind down before creating harm.
10:24  In the absence of legally binding ethical frameworks
10:29  to guide our work,
10:32  I have been working on a range of ethical principles
10:35  to help inform humanitarian tech innovation,
10:37  and I'd like to put forward a few of these here for you today.
10:41  One: Ask.
10:43  Which groups of humans will be harmed by this and when?
10:48  Assess: Who does this solution actually benefit?
10:53  Interrogate: Was appropriate consent obtained from the end users?
11:00  Consider: What must we gracefully exit out of
11:05  to be fit for these futures?
11:07  And imagine: What future good might we foreclose
11:12  if we implemented this action today?
11:16  We are accountable for the futures that we create.
11:20  We cannot absolve ourselves of the responsibilities
11:24  and accountabilities of our actions
11:26  if our actions actually cause harm
11:29  to those that we purport to protect and serve.
11:32  Another world is absolutely, radically possible.
11:37  Thank you.
11:39  (Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7