How racial bias works -- and how to disrupt it | Jennifer L. Eberhardt

157,918 views ・ 2020-06-22

TED


Translator: Lilian Chiu · Reviewer: Pui-Ching Siu

00:12
Some years ago, I was on an airplane with my son who was just five years old at the time. My son was so excited about being on this airplane with Mommy. He's looking all around and he's checking things out and he's checking people out. And he sees this man, and he says, "Hey! That guy looks like Daddy!"

00:35
And I look at the man, and he didn't look anything at all like my husband, nothing at all. And so then I start looking around on the plane, and I notice this man was the only black guy on the plane.

00:52
And I thought, "Alright. I'm going to have to have a little talk with my son about how not all black people look alike."

01:01
My son, he lifts his head up, and he says to me, "I hope he doesn't rob the plane." And I said, "What? What did you say?" And he says, "Well, I hope that man doesn't rob the plane."

01:19
And I said, "Well, why would you say that? You know Daddy wouldn't rob a plane." And he says, "Yeah, yeah, yeah, well, I know." And I said, "Well, why would you say that?" And he looked at me with this really sad face, and he says, "I don't know why I said that. I don't know why I was thinking that."

01:45
We are living with such severe racial stratification that even a five-year-old can tell us what's supposed to happen next, even with no evildoer, even with no explicit hatred. This association between blackness and crime made its way into the mind of my five-year-old. It makes its way into all of our children, into all of us.

02:18
Our minds are shaped by the racial disparities we see out in the world and the narratives that help us to make sense of the disparities we see: "Those people are criminal." "Those people are violent." "Those people are to be feared."

02:39
When my research team brought people into our lab and exposed them to faces, we found that exposure to black faces led them to see blurry images of guns with greater clarity and speed. Bias can not only control what we see, but where we look. We found that prompting people to think of violent crime can lead them to direct their eyes onto a black face and away from a white face. Prompting police officers to think of capturing and shooting and arresting leads their eyes to settle on black faces, too.

03:19
Bias can infect every aspect of our criminal justice system. In a large data set of death-eligible defendants, we found that looking more black more than doubled their chances of receiving a death sentence -- at least when their victims were white. This effect is significant, even though we controlled for the severity of the crime and the defendant's attractiveness. And no matter what we controlled for, we found that black people were punished in proportion to the blackness of their physical features: the more black, the more death-worthy.

03:59
Bias can also influence how teachers discipline students. My colleagues and I have found that teachers express a desire to discipline a black middle school student more harshly than a white student for the same repeated infractions. In a recent study, we're finding that teachers treat black students as a group but white students as individuals. If, for example, one black student misbehaves and then a different black student misbehaves a few days later, the teacher responds to that second black student as if he had misbehaved twice. It's as though the sins of one child get piled onto the other.

04:43
We create categories to make sense of the world, to assert some control and coherence to the stimuli that we're constantly being bombarded with. Categorization and the bias that it seeds allow our brains to make judgments more quickly and efficiently, and we do this by instinctively relying on patterns that seem predictable. Yet, just as the categories we create allow us to make quick decisions, they also reinforce bias. So the very things that help us to see the world also can blind us to it. They render our choices effortless, friction-free. Yet they exact a heavy toll.

05:34
So what can we do? We are all vulnerable to bias, but we don't act on bias all the time. There are certain conditions that can bring bias alive and other conditions that can muffle it. Let me give you an example.

05:50
Many people are familiar with the tech company Nextdoor. So, their whole purpose is to create stronger, healthier, safer neighborhoods. And so they offer this online space where neighbors can gather and share information. Yet, Nextdoor soon found that they had a problem with racial profiling.

06:16
In the typical case, people would look outside their window and see a black man in their otherwise white neighborhood and make the snap judgment that he was up to no good, even when there was no evidence of criminal wrongdoing. In many ways, how we behave online is a reflection of how we behave in the world. But what we don't want to do is create an easy-to-use system that can amplify bias and deepen racial disparities, rather than dismantling them.

06:50
So the cofounder of Nextdoor reached out to me and to others to try to figure out what to do. And they realized that to curb racial profiling on the platform, they were going to have to add friction; that is, they were going to have to slow people down. So Nextdoor had a choice to make, and against every impulse, they decided to add friction.

07:12
And they did this by adding a simple checklist. There were three items on it. First, they asked users to pause and think, "What was this person doing that made him suspicious?" The category "black man" is not grounds for suspicion. Second, they asked users to describe the person's physical features, not simply their race and gender. Third, they realized that a lot of people didn't seem to know what racial profiling was, nor that they were engaging in it. So Nextdoor provided them with a definition and told them that it was strictly prohibited.

07:55
Most of you have seen those signs in airports and in metro stations: "If you see something, say something." Nextdoor tried modifying this: "If you see something suspicious, say something specific."

08:11
And using this strategy, by simply slowing people down, Nextdoor was able to curb racial profiling by 75 percent.

08:22
Now, people often will say to me, "You can't add friction in every situation, in every context, and especially for people who make split-second decisions all the time." But it turns out we can add friction to more situations than we think.

08:40
Working with the Oakland Police Department in California, I and a number of my colleagues were able to help the department to reduce the number of stops they made of people who were not committing any serious crimes. And we did this by pushing officers to ask themselves a question before each and every stop they made: "Is this stop intelligence-led, yes or no?" In other words, do I have prior information to tie this particular person to a specific crime?

09:16
By adding that question to the form officers complete during a stop, they slow down, they pause, they think, "Why am I considering pulling this person over?"

09:28
In 2017, before we added that intelligence-led question to the form, officers made about 32,000 stops across the city. In that next year, with the addition of this question, that fell to 19,000 stops. African-American stops alone fell by 43 percent. And stopping fewer black people did not make the city any more dangerous. In fact, the crime rate continued to fall, and the city became safer for everybody.

10:02
So one solution can come from reducing the number of unnecessary stops. Another can come from improving the quality of the stops officers do make. And technology can help us here.

10:17
We all know about George Floyd's death, because those who tried to come to his aid held cell phone cameras to record that horrific, fatal encounter with the police. But we have all sorts of technology that we're not putting to good use. Police departments across the country are now required to wear body-worn cameras, so we have recordings of not only the most extreme and horrific encounters but of everyday interactions. With an interdisciplinary team at Stanford, we've begun to use machine learning techniques to analyze large numbers of encounters. This is to better understand what happens in routine traffic stops.

11:04
What we found was that even when police officers are behaving professionally, they speak to black drivers less respectfully than white drivers. In fact, from the words officers use alone, we could predict whether they were talking to a black driver or a white driver. The problem is that the vast majority of the footage from these cameras is not used by police departments to understand what's going on on the street or to train officers. And that's a shame.

11:40
How does a routine stop turn into a deadly encounter? How did this happen in George Floyd's case? How did it happen in others?

11:51
When my eldest son was 16 years old, he discovered that when white people look at him, they feel fear. Elevators are the worst, he said. When those doors close, people are trapped in this tiny space with someone they have been taught to associate with danger. My son senses their discomfort, and he smiles to put them at ease, to calm their fears.

12:23
When he speaks, their bodies relax. They breathe easier. They take pleasure in his cadence, his diction, his word choice. He sounds like one of them. I used to think that my son was a natural extrovert like his father. But I realized at that moment, in that conversation, that his smile was not a sign that he wanted to connect with would-be strangers. It was a talisman he used to protect himself, a survival skill he had honed over thousands of elevator rides. He was learning to accommodate the tension that his skin color generated and that put his own life at risk.

13:14
We know that the brain is wired for bias, and one way to interrupt that bias is to pause and to reflect on the evidence of our assumptions. So we need to ask ourselves: What assumptions do we bring when we step onto an elevator? Or an airplane? How do we make ourselves aware of our own unconscious bias? Who do those assumptions keep safe? Who do they put at risk?

13:47
Until we ask these questions and insist that our schools and our courts and our police departments and every institution do the same, we will continue to allow bias to blind us. And if we do, none of us are truly safe.

14:14
Thank you.