Daniel Suarez: The kill decision shouldn't belong to a robot

76,013 views ・ 2013-06-13

TED



00:00
Translator: Joseph Geni / Reviewer: Morton Bast
Chinese translation: NAN-KUN WU / Reviewer: Yi-Ting Chung

00:12
I write fiction sci-fi thrillers, so if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots, autonomous combat drones.

00:29
Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy.

00:48
Now, lethally autonomous killer robots would take many forms -- flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality.

00:59
These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it, the one on the left at a distance of over a kilometer. Now, in both cases, there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice.

01:27
And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield. That's because the way humans resolve conflict shapes our social landscape. And this has always been the case, throughout history.

01:54
For example, these were state-of-the-art weapons systems in 1400 A.D. Now they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top.

02:12
And what changed? Technological innovation. Gunpowder, cannon. And pretty soon, armor and castles were obsolete, and it mattered less who you brought to the battlefield versus how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form.

02:42
So again, the tools we use to resolve conflict shape our social landscape. Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy.

03:09
Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor.

03:21
Seventy nations are developing remotely-piloted combat drones of their own, and as you'll see, remotely-piloted combat drones are the precursors to autonomous robotic weapons. That's because once you've deployed remotely-piloted drones, there are three powerful factors pushing decision-making away from humans and onto the weapon platform itself.

03:44
The first of these is the deluge of video that drones produce. For example, in 2004, the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011, this had gone up to 300,000 hours, outstripping human ability to review it all, but even that number is about to go up drastically. The Pentagon's Gorgon Stare and Argus programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.

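To put that review burden in perspective, a back-of-the-envelope calculation in Python; the 300,000-hour figure is from the talk, while the analyst workload numbers are assumptions for illustration:

    # Rough arithmetic on the 2011 review burden; analyst workload is assumed.
    footage_hours = 300_000               # drone video gathered by the U.S. fleet in 2011
    analyst_hours = 8 * 250               # one analyst: 8 hours/day, ~250 workdays/year
    print(footage_hours / analyst_hours)  # -> 150.0 analysts, just to watch it all once
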
04:33
But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now we saw an example of this in 2011 when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack, but any remotely-piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making. They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.

05:21
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability.

05:33
Now we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown, and in that environment, it is very likely that a successful drone design will be knocked off in contract factories and proliferate in the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon. This raises the very real possibility of anonymous war.

06:10
This could tilt the geopolitical balance on its head, make it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but for criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society.

06:40
Now if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.

06:50
Now you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite.

07:08
I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data. Data powers high-tech societies. Cell phone geolocation, telecom metadata, social media, email, text, financial transaction data, transportation data -- it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.

07:47
What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind. Now it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context. Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people also can be automatically identified from their communication patterns.

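As an illustration of how such hubs can be picked out automatically, a minimal sketch in Python using the networkx graph library; the people and communication edges are invented for the example:

    import networkx as nx

    # Nodes are people; an edge means the two communicated (call, email, message).
    G = nx.Graph()
    G.add_edges_from([
        ("ana", "ben"), ("ana", "cai"), ("ana", "dee"),
        ("ben", "cai"), ("dee", "eli"), ("eli", "fay"),
    ])

    # Degree centrality: the share of the network each person talks to directly.
    centrality = nx.degree_centrality(G)

    # The highest-centrality individuals are the "hubs" -- organizers and leaders.
    hubs = sorted(centrality, key=centrality.get, reverse=True)[:2]
    print(hubs)  # -> ['ana', 'ben']
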
08:28
Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.

08:51
Now in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or trans-national criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about.

09:19
Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.

09:36
And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.

10:07
Now in November 2012 the U.S. Department of Defense issued a directive requiring that a human being be present in all lethal decisions. This effectively, if temporarily, banned autonomous weapons in the U.S. military, but that directive needs to be made permanent. And it could set the stage for global action.

10:31
Because we need an international legal framework for robotic weapons. And we need it now, before there's a devastating attack or a terrorist incident that causes nations of the world to rush to adopt these weapons before thinking through the consequences. Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.

10:58
Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics. If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons?

11:20
I think the secret will be transparency. No robot should have an expectation of privacy in a public place. (Applause)

11:36
Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different.

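As a sketch of what such a factory-signed I.D. could look like, a minimal example in Python using the pyca/cryptography package's Ed25519 signatures; the key handling and the I.D. format are hypothetical, not from the talk:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    factory_key = Ed25519PrivateKey.generate()    # signing key held by the manufacturer
    public_key = factory_key.public_key()         # published so anyone can verify

    drone_id = b"serial=DRN-000123;model=QUAD-X"  # hypothetical I.D. burned in at the factory
    signature = factory_key.sign(drone_id)        # shipped with the drone, tamper-evident

    # A sensor observing the drone's broadcast I.D. can check its provenance.
    try:
        public_key.verify(signature, drone_id)
        print("I.D. verified: issued by a known factory")
    except InvalidSignature:
        print("Rogue drone: I.D. does not verify")
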
11:49
And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically.

12:00
And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending killer drones of their own up to shoot them down, they should notify humans of their presence. And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility. But notice, this is more an immune system than a weapons system. It would allow us to avail ourselves of the use of autonomous vehicles and drones while still preserving our open, civil society.

12:32
We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war. Autocratic governments and criminal organizations undoubtedly will, but let's not join them.

12:48
Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government. Let's make sure, for democracies at least, killer robots remain fiction.

13:03
Thank you. (Applause)

13:09
Thank you. (Applause)