How racial bias works -- and how to disrupt it | Jennifer L. Eberhardt

167,361 views ・ 2020-06-22

TED


00:12
Some years ago, I was on an airplane with my son who was just five years old at the time. My son was so excited about being on this airplane with Mommy. He's looking all around and he's checking things out and he's checking people out. And he sees this man, and he says, "Hey! That guy looks like Daddy!" And I look at the man, and he didn't look anything at all like my husband, nothing at all. And so then I start looking around on the plane, and I notice this man was the only black guy on the plane. And I thought, "Alright. I'm going to have to have a little talk with my son about how not all black people look alike."

01:01
My son, he lifts his head up, and he says to me, "I hope he doesn't rob the plane." And I said, "What? What did you say?" And he says, "Well, I hope that man doesn't rob the plane." And I said, "Well, why would you say that? You know Daddy wouldn't rob a plane." And he says, "Yeah, yeah, yeah, well, I know." And I said, "Well, why would you say that?" And he looked at me with this really sad face, and he says, "I don't know why I said that. I don't know why I was thinking that."

01:45
We are living with such severe racial stratification that even a five-year-old can tell us what's supposed to happen next, even with no evildoer, even with no explicit hatred. This association between blackness and crime made its way into the mind of my five-year-old. It makes its way into all of our children, into all of us. Our minds are shaped by the racial disparities we see out in the world and the narratives that help us to make sense of the disparities we see: "Those people are criminal." "Those people are violent." "Those people are to be feared."

02:39
When my research team brought people into our lab and exposed them to faces, we found that exposure to black faces led them to see blurry images of guns with greater clarity and speed. Bias can not only control what we see, but where we look. We found that prompting people to think of violent crime can lead them to direct their eyes onto a black face and away from a white face. Prompting police officers to think of capturing and shooting and arresting leads their eyes to settle on black faces, too.

03:19
Bias can infect every aspect of our criminal justice system. In a large data set of death-eligible defendants, we found that looking more black more than doubled their chances of receiving a death sentence -- at least when their victims were white. This effect is significant, even though we controlled for the severity of the crime and the defendant's attractiveness. And no matter what we controlled for, we found that black people were punished in proportion to the blackness of their physical features: the more black, the more death-worthy.

03:59
Bias can also influence how teachers discipline students. My colleagues and I have found that teachers express a desire to discipline a black middle school student more harshly than a white student for the same repeated infractions. In a recent study, we're finding that teachers treat black students as a group but white students as individuals. If, for example, one black student misbehaves and then a different black student misbehaves a few days later, the teacher responds to that second black student as if he had misbehaved twice. It's as though the sins of one child get piled onto the other.

04:43
We create categories to make sense of the world, to assert some control and coherence to the stimuli that we're constantly being bombarded with. Categorization and the bias that it seeds allow our brains to make judgments more quickly and efficiently, and we do this by instinctively relying on patterns that seem predictable. Yet, just as the categories we create allow us to make quick decisions, they also reinforce bias. So the very things that help us to see the world also can blind us to it. They render our choices effortless, friction-free. Yet they exact a heavy toll.

05:34
So what can we do? We are all vulnerable to bias, but we don't act on bias all the time. There are certain conditions that can bring bias alive and other conditions that can muffle it. Let me give you an example.

05:50
Many people are familiar with the tech company Nextdoor. So, their whole purpose is to create stronger, healthier, safer neighborhoods. And so they offer this online space where neighbors can gather and share information. Yet, Nextdoor soon found that they had a problem with racial profiling. In the typical case, people would look outside their window and see a black man in their otherwise white neighborhood and make the snap judgment that he was up to no good, even when there was no evidence of criminal wrongdoing.

06:32
In many ways, how we behave online is a reflection of how we behave in the world. But what we don't want to do is create an easy-to-use system that can amplify bias and deepen racial disparities, rather than dismantling them. So the cofounder of Nextdoor reached out to me and to others to try to figure out what to do.

06:56
And they realized that to curb racial profiling on the platform, they were going to have to add friction; that is, they were going to have to slow people down. So Nextdoor had a choice to make, and against every impulse, they decided to add friction. And they did this by adding a simple checklist. There were three items on it.

07:18
First, they asked users to pause and think, "What was this person doing that made him suspicious?" The category "black man" is not grounds for suspicion. Second, they asked users to describe the person's physical features, not simply their race and gender. Third, they realized that a lot of people didn't seem to know what racial profiling was, nor that they were engaging in it. So Nextdoor provided them with a definition and told them that it was strictly prohibited.

07:55
Most of you have seen those signs in airports and in metro stations, "If you see something, say something." Nextdoor tried modifying this: "If you see something suspicious, say something specific." And using this strategy, by simply slowing people down, Nextdoor was able to curb racial profiling by 75 percent.

08:22
Now, people often will say to me, "You can't add friction in every situation, in every context, and especially for people who make split-second decisions all the time." But it turns out we can add friction to more situations than we think.

08:40
Working with the Oakland Police Department in California, I and a number of my colleagues were able to help the department to reduce the number of stops they made of people who were not committing any serious crimes. And we did this by pushing officers to ask themselves a question before each and every stop they made: "Is this stop intelligence-led, yes or no?"

09:07
In other words, do I have prior information to tie this particular person to a specific crime? By adding that question to the form officers complete during a stop, they slow down, they pause, they think, "Why am I considering pulling this person over?"

09:28
In 2017, before we added that intelligence-led question to the form, officers made about 32,000 stops across the city. In that next year, with the addition of this question, that fell to 19,000 stops. African-American stops alone fell by 43 percent. And stopping fewer black people did not make the city any more dangerous. In fact, the crime rate continued to fall, and the city became safer for everybody.

10:02
So one solution can come from reducing the number of unnecessary stops. Another can come from improving the quality of the stops officers do make. And technology can help us here.

10:17
We all know about George Floyd's death, because those who tried to come to his aid held cell phone cameras to record that horrific, fatal encounter with the police. But we have all sorts of technology that we're not putting to good use. Police departments across the country are now required to wear body-worn cameras, so we have recordings of not only the most extreme and horrific encounters but of everyday interactions.

10:50
With an interdisciplinary team at Stanford, we've begun to use machine learning techniques to analyze large numbers of encounters. This is to better understand what happens in routine traffic stops. What we found was that even when police officers are behaving professionally, they speak to black drivers less respectfully than white drivers. In fact, from the words officers use alone, we could predict whether they were talking to a black driver or a white driver.

11:25
The problem is that the vast majority of the footage from these cameras is not used by police departments to understand what's going on on the street or to train officers. And that's a shame. How does a routine stop turn into a deadly encounter? How did this happen in George Floyd's case? How did it happen in others?

11:51
When my eldest son was 16 years old, he discovered that when white people look at him, they feel fear. Elevators are the worst, he said. When those doors close, people are trapped in this tiny space with someone they have been taught to associate with danger. My son senses their discomfort, and he smiles to put them at ease, to calm their fears. When he speaks, their bodies relax. They breathe easier. They take pleasure in his cadence, his diction, his word choice. He sounds like one of them.

12:36
I used to think that my son was a natural extrovert like his father. But I realized at that moment, in that conversation, that his smile was not a sign that he wanted to connect with would-be strangers. It was a talisman he used to protect himself, a survival skill he had honed over thousands of elevator rides. He was learning to accommodate the tension that his skin color generated and that put his own life at risk.

13:14
We know that the brain is wired for bias, and one way to interrupt that bias is to pause and to reflect on the evidence of our assumptions. So we need to ask ourselves: What assumptions do we bring when we step onto an elevator? Or an airplane? How do we make ourselves aware of our own unconscious bias? Who do those assumptions keep safe? Who do they put at risk?

13:47
Until we ask these questions and insist that our schools and our courts and our police departments and every institution do the same, we will continue to allow bias to blind us. And if we do, none of us are truly safe.

14:14
Thank you.