What you need to know about face surveillance | Kade Crockford

140,602 views ・ 2020-05-29

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Chanhong Park Β· Reviewer: Jihyeon J. Kim
00:12
How many of you have ever heard someone say privacy is dead? Raise your hand.

00:17
How many of you have heard someone say they don't care about their privacy because they don't have anything to hide? Go on.

00:25
(Laughter)

00:27
Now, how many of you use any kind of encryption software? Raise your hand. Or a password to protect an online account? Or curtains or blinds on your windows at home?

00:41
(Laughter)

00:43
OK, so that's everyone, I think.

00:45
(Laughter)

00:46
So why do you do these things? My guess is, it's because you care about your privacy.
00:52
The idea that privacy is dead is a myth. The idea that people don't care about their privacy because "they have nothing to hide" or they've done nothing wrong is also a myth.

01:04
I'm guessing that you would not want to publicly share on the internet, for the world to see, all of your medical records. Or your search histories from your phone or your computer. And I bet that if the government wanted to put a chip in your brain to transmit every one of your thoughts to a centralized government computer, you would balk at that.

01:26
(Laughter)

01:28
That's because you care about your privacy, like every human being.
01:33
So, our world has changed fast. And today, there is understandably a lot of confusion about what privacy is and why it matters.

01:44
Privacy is not secrecy. It's control. I share information with my doctor about my body and my health, expecting that she is not going to turn around and share that information with my parents, or my boss or my kids. That information is private, not secret. I'm in control over how that information is shared.

02:09
You've probably heard people say that there's a fundamental tension between privacy on the one hand and safety on the other. But the technologies that advance our privacy also advance our safety. Think about fences, door locks, curtains on our windows, passwords, encryption software. All of these technologies simultaneously protect our privacy and our safety.
02:36
Dragnet surveillance, on the other hand, protects neither.

02:41
In recent years, the federal government tasked a group of experts called The Privacy and Civil Liberties Oversight Board with examining post-9/11 government surveillance programs, dragnet surveillance programs. Those experts could not find a single example of that dragnet surveillance advancing any safety -- didn't identify or stop a single terrorist attack. You know what that information was useful for, though? Helping NSA employees spy on their romantic interests.

03:11
(Laughter)

03:12
(Audience: Wow.)
03:14
Another example is closer to home. So millions of people across the United States and the world are adopting so-called "smart home" devices, like internet-connected surveillance cameras. But we know that any technology connected to the internet can be hacked. And so if a hacker gets into your internet-connected surveillance camera at home, they can watch you and your family coming and going, finding just the right time to strike.

03:41
You know what can't be hacked remotely? Curtains.

03:45
(Laughter)

03:46
Fences. Door locks.

03:49
(Laughter)

03:50
Privacy is not the enemy of safety. It is its guarantor.
03:56
Nonetheless, we daily face a propaganda onslaught telling us that we have to give up some privacy in exchange for safety through surveillance programs. Face surveillance is the most dangerous of these technologies.

04:11
There are two primary ways today governments use technologies like this. One is face recognition. That's to identify someone in an image. The second is face surveillance, which can be used in concert with surveillance-camera networks and databases to create records of all people's public movements, habits and associations, effectively creating a digital panopticon.
04:38
This is a panopticon. It's a prison designed to allow a few guards in the center to monitor everything happening in the cells around the perimeter. The people in those prison cells can't see inside the guard tower, but the guards can see into every inch of those cells. The idea here is that if the people in those prison cells know they're being watched all the time, or could be, they'll behave accordingly.

05:09
Similarly, face surveillance enables a centralized authority -- in this case, the state -- to monitor the totality of human movement and association in public space. And here's what it looks like in real life. In this case, it's not a guard in a tower, but rather a police analyst in a spy center. The prison expands beyond its walls, encompassing everyone, everywhere, all the time.

05:38
In a free society, this should terrify us all.
05:43
For decades now, we've watched cop shows that push a narrative that says technologies like face surveillance ultimately serve the public good. But real life is not a cop drama. The bad guy didn't always do it, the cops definitely aren't always the good guys and the technology doesn't always work.

06:04
Take the case of Steve Talley, a financial analyst from Colorado. In 2015, Talley was arrested, and he was charged with bank robbery on the basis of an error in a facial recognition system. Talley fought that case and he eventually was cleared of those charges, but while he was being prosecuted by the state, he lost his house, his job and his kids.

06:27
Steve Talley's case is an example of what can happen when the technology fails.
06:33
But face surveillance is just as dangerous when it works as advertised.

06:38
Just consider how trivial it would be for a government agency to put a surveillance camera outside a building where people meet for Alcoholics Anonymous meetings. They could connect that camera to a face-surveillance algorithm and a database, press a button and sit back and collect a record of every person receiving treatment for alcoholism. It would be just as easy for a government agency to use this technology to automatically identify every person who attended the Women's March or a Black Lives Matter protest.
07:11
Even the technology industry is aware of the gravity of this problem. Microsoft's president Brad Smith has called on Congress to intervene. Google, for its part, has publicly declined to ship a face surveillance product, in part because of these grave human and civil rights concerns. And that's a good thing. Because ultimately, protecting our open society is much more important than corporate profit.

07:41
The ACLU's nationwide campaign to get the government to pump the brakes on the adoption of this dangerous technology has prompted reasonable questions from thoughtful people. What makes this technology in particular so dangerous? Why can't we just regulate it? In short, why the alarm?
08:01
Face surveillance is uniquely dangerous for two related reasons. One is the nature of the technology itself. And the second is that our system fundamentally lacks the oversight and accountability mechanisms that would be necessary to ensure it would not be abused in the government's hands.

08:22
First, face surveillance enables a totalizing form of surveillance never before possible. Every single person's every visit to a friend's house, a government office, a house of worship, a Planned Parenthood, a cannabis shop, a strip club; every single person's public movements, habits and associations documented and catalogued, not on one day, but on every day, merely with the push of a button.

08:54
This kind of totalizing mass surveillance fundamentally threatens what it means to live in a free society. Our freedom of speech, freedom of association, freedom of religion, freedom of the press, our privacy, our right to be left alone.
09:10
You may be thinking, "OK, come on, but there are tons of ways the government can spy on us." And yes, it's true, the government can track us through our cell phones, but if I want to go to get an abortion, or attend a political meeting, or even just call in sick and play hooky and go to the beach ...

09:28
(Laughter)

09:30
I can leave my phone at home. I cannot leave my face at home.

09:37
And that brings me to my second primary concern: How we might meaningfully regulate this technology.
09:44
Today, if the government wants to know where I was last week, they can't just hop into a time machine and go back in time and follow me. And they also, the local police right now, don't maintain any centralized system of tracking, where they're cataloging every person's public movements all the time, just in case that information some day becomes useful. Today, if the government wants to know where I was last week, or last month or last year, they have to go to a judge, get a warrant and then serve that warrant on my phone company, which by the way, has a financial interest in protecting my privacy.

10:21
With face surveillance, no such limitations exist. This is technology that is 100 percent controlled by the government itself.

10:31
So how would a warrant requirement work in this context? Is the government going to go to a judge and get a warrant, and then serve the warrant on themselves? That would be like me giving you my diary, and saying, "Here, you can hold on to this forever, but you can't read it until I say it's OK."

10:50
So what can we do?
10:53
The only answer to the threat posed by the government's use of face surveillance is to deny the government the capacity to violate the public's trust, by denying the government the ability to build these in-house face-surveillance networks.

11:09
And that's exactly what we're doing. The ACLU is part of a nationwide campaign to pump the brakes on the government's use of this dangerous technology. We've already been successful: from San Francisco to Somerville, Massachusetts, we have passed municipal bans on the government's use of this technology. And plenty of other communities here in Massachusetts and across the country are debating similar measures.
11:34
Some people have told me that this movement is bound to fail. That ultimately, merely because the technology exists, it will be deployed in every context by every government everywhere.

11:50
Privacy is dead, right? So the narrative goes.

11:55
Well, I refuse to accept that narrative. And you should, too.
12:00
We can't allow Jeff Bezos or the FBI to determine the boundaries of our freedoms in the 21st century. If we live in a democracy, we are in the driver's seat, shaping our collective future.

12:16
We are at a fork in the road right now. We can either continue with business as usual, allowing governments to adopt and deploy these technologies unchecked, in our communities, our streets and our schools, or we can take bold action now to press pause on the government's use of face surveillance, protect our privacy and to build a safer, freer future for all of us.

12:44
Thank you.

12:45
(Applause and cheers)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7