Daniel Suarez: The kill decision shouldn't belong to a robot

TED · 2013-06-13

Translator: Joseph Geni Reviewer: Morton Bast

00:12
I write fiction: sci-fi thrillers, so if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots: autonomous combat drones.

00:29
Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy.

00:48
Now, lethally autonomous killer robots would take many forms -- flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality.

00:59
These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it, the one on the left at a distance of over a kilometer. Now, in both cases, there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice.

01:27
And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield. That's because the way humans resolve conflict shapes our social landscape. And this has always been the case, throughout history.

01:54
For example, these were state-of-the-art weapons systems in 1400 A.D. Now, they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top.

02:12
And what changed? Technological innovation. Gunpowder, cannon. And pretty soon, armor and castles were obsolete, and it mattered less who you brought to the battlefield versus how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form.

02:42
So again, the tools we use to resolve conflict shape our social landscape. Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy.

03:09
Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor.

03:21
Seventy nations are developing remotely-piloted combat drones of their own, and as you'll see, remotely-piloted combat drones are the precursors to autonomous robotic weapons.

03:33
That's because once you've deployed remotely-piloted drones, there are three powerful factors pushing decision-making away from humans and onto the weapon platform itself.

03:44
The first of these is the deluge of video that drones produce. For example, in 2004, the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011, this had gone up to 300,000 hours, outstripping human ability to review it all. But even that number is about to go up drastically. The Pentagon's Gorgon Stare and Argus programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.

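To get a feel for that scale, here is a back-of-envelope sketch in Python. Only the 300,000-hour total and the 65-camera figure come from the talk; the 2,000-hour analyst work-year is my own illustrative assumption.

```python
# Back-of-envelope: staffing needed just to watch drone video in real
# time. The 300,000 hours and 65 cameras are figures from the talk;
# the 2,000-hour analyst work-year is an assumption for illustration.

video_hours_2011 = 300_000      # U.S. drone fleet output, 2011
analyst_hours_per_year = 2_000  # ~40 h/week x 50 weeks (assumed)
cameras_per_platform = 65       # Gorgon Stare / Argus class sensors

analysts_needed = video_hours_2011 / analyst_hours_per_year
print(f"Analysts for 1x real-time review: {analysts_needed:.0f}")  # 150
print(f"If output scales with 65 camera eyes: "
      f"{analysts_needed * cameras_per_platform:.0f}")             # 9750
```

Even under these rough assumptions, review quickly outgrows any plausible analyst workforce, which is exactly the pressure pushing triage onto software.
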
04:33
But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now, we saw an example of this in 2011, when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack. But any remotely-piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making.

05:08
They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.

05:21
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability.

05:33
Now, we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown, and in that environment, it is very likely that a successful drone design will be knocked off in contract factories and proliferate in the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon.

06:04
This raises the very real possibility of anonymous war. This could tilt the geopolitical balance on its head, make it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but for criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society.

06:40
Now, if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.

06:50
Now, you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite.

07:08
I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data. Data powers high-tech societies. Cell phone geolocation, telecom metadata, social media, email, text, financial transaction data, transportation data -- it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.

07:47
What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind. Now, it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context. Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people also can be automatically identified from their communication patterns.

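To make concrete how mechanical that identification is, here is a minimal sketch in Python using the networkx library. The names and edges are invented for illustration; real link analysis would build the graph from communication metadata such as calls, messages, and transactions.

```python
# Minimal link-analysis sketch: rank members of a social graph by
# degree centrality to surface its hubs (organizers, opinion-makers).
# All names and edges are invented for illustration.
import networkx as nx

edges = [
    ("ana", "ben"), ("ana", "caro"), ("ana", "dev"), ("ana", "eli"),
    ("ben", "caro"), ("dev", "eli"),
    ("fay", "ana"), ("fay", "gus"),
    ("gus", "hana"), ("gus", "ivan"),
]

G = nx.Graph(edges)

# Degree centrality: the fraction of the group each person is directly
# connected to. The highest-scoring nodes are the network's hubs.
centrality = nx.degree_centrality(G)
for person in sorted(centrality, key=centrality.get, reverse=True)[:3]:
    print(f"{person}: {centrality[person]:.2f}")
# -> ana: 0.62 / gus: 0.38 / ben: 0.25
```

Swapping degree centrality for betweenness or eigenvector centrality would change who stands out at the margins, but the point holds either way: the hubs fall out of the metadata automatically.
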
08:28
Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.

08:51
Now, in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or trans-national criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about.

09:19
Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.

09:36
And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now, we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.

10:07
Now, in November 2012, the U.S. Department of Defense issued a directive requiring that a human being be present in all lethal decisions. This effectively banned autonomous weapons in the U.S. military, but only temporarily: that directive needs to be made permanent. And it could set the stage for global action.

10:31
Because we need an international legal framework for robotic weapons. And we need it now, before there's a devastating attack or a terrorist incident that causes nations of the world to rush to adopt these weapons before thinking through the consequences.

10:48
Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.

10:58
Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics.

11:07
If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons? I think the secret will be transparency.

11:25
No robot should have an expectation of privacy in a public place.

(Applause)

11:36
Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different.

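As a sketch of what such a signed I.D. could look like, here is an Ed25519 example using Python's cryptography package. The I.D. format, key handling, and registry step are assumptions for illustration, not an existing standard.

```python
# Sketch: a factory-signed drone I.D. and its field verification.
# Assumes the manufacturer's public key is published out of band
# (e.g., in a public registry); the I.D. format here is invented.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# At the factory: sign the airframe's identity record once.
factory_key = Ed25519PrivateKey.generate()
drone_id = b"maker=EXAMPLE-CO;model=Q4;serial=000042"
signature = factory_key.sign(drone_id)  # burned in alongside the I.D.

# In the field: anyone holding the maker's public key can check it.
maker_public_key = factory_key.public_key()
try:
    maker_public_key.verify(signature, drone_id)
    print("I.D. verified: airframe traces back to EXAMPLE-CO")
except InvalidSignature:
    print("Bad signature: treat as a rogue drone")
```

Like a license-plate lookup, verification only ties the airframe to its manufacturer record; tracking its movements through public spaces would be a layer built on top of that identity.
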
11:49
And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically.

12:00
And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending killer drones of their own up to shoot them down, they should notify humans to their presence. And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility. But notice, this is more an immune system than a weapons system.

12:23
It would allow us to avail ourselves of the use of autonomous vehicles and drones while still preserving our open, civil society.

12:32
We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war. Autocratic governments and criminal organizations undoubtedly will, but let's not join them.

12:48
Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government. Let's make sure, for democracies at least, killer robots remain fiction.

13:03
Thank you.

(Applause)

Thank you.

(Applause)