What you need to know about face surveillance | Kade Crockford

TED ・ 2020-05-29


00:12
How many of you have ever heard someone say privacy is dead? Raise your hand. How many of you have heard someone say they don't care about their privacy because they don't have anything to hide? Go on. (Laughter) Now, how many of you use any kind of encryption software? Raise your hand. Or a password to protect an online account? Or curtains or blinds on your windows at home? (Laughter) OK, so that's everyone, I think. (Laughter)

00:46
So why do you do these things? My guess is, it's because you care about your privacy. The idea that privacy is dead is a myth. The idea that people don't care about their privacy because "they have nothing to hide" or they've done nothing wrong is also a myth.

01:04
I'm guessing that you would not want to publicly share on the internet, for the world to see, all of your medical records. Or your search histories from your phone or your computer. And I bet that if the government wanted to put a chip in your brain to transmit every one of your thoughts to a centralized government computer, you would balk at that. (Laughter) That's because you care about your privacy, like every human being.

01:33
So, our world has changed fast. And today, there is understandably a lot of confusion about what privacy is and why it matters. Privacy is not secrecy. It's control. I share information with my doctor about my body and my health, expecting that she is not going to turn around and share that information with my parents, or my boss or my kids. That information is private, not secret. I'm in control over how that information is shared.

02:09
You've probably heard people say that there's a fundamental tension between privacy on the one hand and safety on the other. But the technologies that advance our privacy also advance our safety. Think about fences, door locks, curtains on our windows, passwords, encryption software. All of these technologies simultaneously protect our privacy and our safety. Dragnet surveillance, on the other hand, protects neither.

02:41
In recent years, the federal government tasked a group of experts called The Privacy and Civil Liberties Oversight Board with examining post-9/11 government surveillance programs, dragnet surveillance programs. Those experts could not find a single example of that dragnet surveillance advancing any safety -- didn't identify or stop a single terrorist attack. You know what that information was useful for, though? Helping NSA employees spy on their romantic interests. (Laughter) (Audience: Wow.)

03:14
Another example is closer to home. So millions of people across the United States and the world are adopting so-called "smart home" devices, like internet-connected surveillance cameras. But we know that any technology connected to the internet can be hacked. And so if a hacker gets into your internet-connected surveillance camera at home, they can watch you and your family coming and going, finding just the right time to strike.

03:41
You know what can't be hacked remotely? Curtains. (Laughter) Fences. Door locks. (Laughter)

03:50
Privacy is not the enemy of safety. It is its guarantor. Nonetheless, we daily face a propaganda onslaught telling us that we have to give up some privacy in exchange for safety through surveillance programs.

04:07
Face surveillance is the most dangerous of these technologies. There are two primary ways today governments use technologies like this. One is face recognition. That's to identify someone in an image. The second is face surveillance, which can be used in concert with surveillance-camera networks and databases to create records of all people's public movements, habits and associations, effectively creating a digital panopticon.

04:38
This is a panopticon. It's a prison designed to allow a few guards in the center to monitor everything happening in the cells around the perimeter. The people in those prison cells can't see inside the guard tower, but the guards can see into every inch of those cells. The idea here is that if the people in those prison cells know they're being watched all the time, or could be, they'll behave accordingly.

05:09
Similarly, face surveillance enables a centralized authority -- in this case, the state -- to monitor the totality of human movement and association in public space. And here's what it looks like in real life. In this case, it's not a guard in a tower, but rather a police analyst in a spy center. The prison expands beyond its walls, encompassing everyone, everywhere, all the time. In a free society, this should terrify us all.

05:43
For decades now, we've watched cop shows that push a narrative that says technologies like face surveillance ultimately serve the public good. But real life is not a cop drama. The bad guy didn't always do it, the cops definitely aren't always the good guys and the technology doesn't always work.

06:04
Take the case of Steve Talley, a financial analyst from Colorado. In 2015, Talley was arrested, and he was charged with bank robbery on the basis of an error in a facial recognition system. Talley fought that case and he eventually was cleared of those charges, but while he was being persecuted by the state, he lost his house, his job and his kids. Steve Talley's case is an example of what can happen when the technology fails.

06:33
But face surveillance is just as dangerous when it works as advertised. Just consider how trivial it would be for a government agency to put a surveillance camera outside a building where people meet for Alcoholics Anonymous meetings. They could connect that camera to a face-surveillance algorithm and a database, press a button and sit back and collect a record of every person receiving treatment for alcoholism. It would be just as easy for a government agency to use this technology to automatically identify every person who attended the Women's March or a Black Lives Matter protest.

07:11
Even the technology industry is aware of the gravity of this problem. Microsoft's president Brad Smith has called on Congress to intervene. Google, for its part, has publicly declined to ship a face surveillance product, in part because of these grave human and civil rights concerns. And that's a good thing. Because ultimately, protecting our open society is much more important than corporate profit.

07:41
The ACLU's nationwide campaign to get the government to pump the brakes on the adoption of this dangerous technology has prompted reasonable questions from thoughtful people. What makes this technology in particular so dangerous? Why can't we just regulate it? In short, why the alarm?

08:01
Face surveillance is uniquely dangerous for two related reasons. One is the nature of the technology itself. And the second is that our system fundamentally lacks the oversight and accountability mechanisms that would be necessary to ensure it would not be abused in the government's hands.

08:22
First, face surveillance enables a totalizing form of surveillance never before possible. Every single person's every visit to a friend's house, a government office, a house of worship, a Planned Parenthood, a cannabis shop, a strip club; every single person's public movements, habits and associations documented and catalogued, not on one day, but on every day, merely with the push of a button.

08:54
This kind of totalizing mass surveillance fundamentally threatens what it means to live in a free society. Our freedom of speech, freedom of association, freedom of religion, freedom of the press, our privacy, our right to be left alone.

09:10
You may be thinking, "OK, come on, but there are tons of ways the government can spy on us." And yes, it's true, the government can track us through our cell phones, but if I want to go to get an abortion, or attend a political meeting, or even just call in sick and play hooky and go to the beach ... (Laughter) I can leave my phone at home. I cannot leave my face at home.

09:37
And that brings me to my second primary concern: How we might meaningfully regulate this technology. Today, if the government wants to know where I was last week, they can't just hop into a time machine and go back in time and follow me. And they also, the local police right now, don't maintain any centralized system of tracking, where they're cataloging every person's public movements all the time, just in case that information some day becomes useful.

10:06
Today, if the government wants to know where I was last week, or last month or last year, they have to go to a judge, get a warrant and then serve that warrant on my phone company, which by the way, has a financial interest in protecting my privacy. With face surveillance, no such limitations exist. This is technology that is 100 percent controlled by the government itself.

10:31
So how would a warrant requirement work in this context? Is the government going to go to a judge and get a warrant, and then serve the warrant on themselves? That would be like me giving you my diary, and saying, "Here, you can hold on to this forever, but you can't read it until I say it's OK." So what can we do?

10:53
The only answer to the threat posed by the government's use of face surveillance is to deny the government the capacity to violate the public's trust, by denying the government the ability to build these in-house face-surveillance networks. And that's exactly what we're doing.

11:12
The ACLU is part of a nationwide campaign to pump the brakes on the government's use of this dangerous technology. We've already been successful, from San Francisco to Somerville, Massachusetts, we have passed municipal bans on the government's use of this technology. And plenty of other communities here in Massachusetts and across the country are debating similar measures.

11:34
Some people have told me that this movement is bound to fail. That ultimately, merely because the technology exists, it will be deployed in every context by every government everywhere. Privacy is dead, right? So the narrative goes. Well, I refuse to accept that narrative. And you should, too.

12:00
We can't allow Jeff Bezos or the FBI to determine the boundaries of our freedoms in the 21st century. If we live in a democracy, we are in the driver's seat, shaping our collective future.

12:16
We are at a fork in the road right now. We can either continue with business as usual, allowing governments to adopt and deploy these technologies unchecked, in our communities, our streets and our schools, or we can take bold action now to press pause on the government's use of face surveillance, protect our privacy and to build a safer, freer future for all of us.

12:44
Thank you. (Applause and cheers)