How racial bias works -- and how to disrupt it | Jennifer L. Eberhardt

170,019 views · 2020-06-22

TED


00:12 Some years ago,
00:14 I was on an airplane with my son who was just five years old at the time.
00:20 My son was so excited about being on this airplane with Mommy.
00:25 He's looking all around and he's checking things out
00:28 and he's checking people out.
00:30 And he sees this man, and he says,
00:31 "Hey! That guy looks like Daddy!"
00:35 And I look at the man,
00:37 and he didn't look anything at all like my husband,
00:41 nothing at all.
00:43 And so then I start looking around on the plane,
00:45 and I notice this man was the only black guy on the plane.
00:52 And I thought,
00:54 "Alright.
00:56 I'm going to have to have a little talk with my son
00:58 about how not all black people look alike."
01:01 My son, he lifts his head up, and he says to me,
01:08 "I hope he doesn't rob the plane."
01:11 And I said, "What? What did you say?"
01:13 And he says, "Well, I hope that man doesn't rob the plane."
01:19 And I said, "Well, why would you say that?
01:22 You know Daddy wouldn't rob a plane."
01:25 And he says, "Yeah, yeah, yeah, well, I know."
01:28 And I said, "Well, why would you say that?"
01:32 And he looked at me with this really sad face,
01:36 and he says,
01:38 "I don't know why I said that.
01:42 I don't know why I was thinking that."
01:45 We are living with such severe racial stratification
01:49 that even a five-year-old can tell us what's supposed to happen next,
01:55 even with no evildoer,
01:58 even with no explicit hatred.
02:02 This association between blackness and crime
02:06 made its way into the mind of my five-year-old.
02:11 It makes its way into all of our children,
02:16 into all of us.
02:18 Our minds are shaped by the racial disparities
02:22 we see out in the world
02:24 and the narratives that help us to make sense of the disparities we see:
02:31 "Those people are criminal."
02:34 "Those people are violent."
02:36 "Those people are to be feared."
02:39 When my research team brought people into our lab
02:43 and exposed them to faces,
02:45 we found that exposure to black faces led them to see blurry images of guns
02:52 with greater clarity and speed.
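A finding like "greater clarity and speed" is usually measured by how early a gradually de-blurred image gets identified. Here is a minimal analysis sketch, assuming a hypothetical trials file with a prime condition and a frames-needed column; it illustrates the comparison, not the study's actual data or code:

```python
# Sketch: do black-face primes lower the number of de-blurring frames
# needed to identify a gun? Hypothetical CSV, not the study's data.
import pandas as pd
from scipy import stats

trials = pd.read_csv("priming_trials.csv")  # columns: prime, frames_needed

black = trials.loc[trials["prime"] == "black_face", "frames_needed"]
white = trials.loc[trials["prime"] == "white_face", "frames_needed"]

# Fewer frames = the object was recognized earlier ("clarity and speed").
t, p = stats.ttest_ind(black, white, equal_var=False)
print(f"black-face primes: {black.mean():.1f} frames on average")
print(f"white-face primes: {white.mean():.1f} frames on average")
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```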
02:55 Bias can not only control what we see,
02:59 but where we look.
03:00 We found that prompting people to think of violent crime
03:04 can lead them to direct their eyes onto a black face
03:08 and away from a white face.
03:10 Prompting police officers to think of capturing and shooting
03:14 and arresting
03:15 leads their eyes to settle on black faces, too.
03:19 Bias can infect every aspect of our criminal justice system.
03:25 In a large data set of death-eligible defendants,
03:28 we found that looking more black more than doubled their chances
03:32 of receiving a death sentence --
03:35 at least when their victims were white.
03:37 This effect is significant,
03:39 even though we controlled for the severity of the crime
03:42 and the defendant's attractiveness.
03:45 And no matter what we controlled for,
03:47 we found that black people were punished
03:51 in proportion to the blackness of their physical features:
03:55 the more black,
03:57 the more death-worthy.
03:59 Bias can also influence how teachers discipline students.
04:03 My colleagues and I have found that teachers express a desire
04:08 to discipline a black middle school student more harshly
04:11 than a white student
04:12 for the same repeated infractions.
04:15 In a recent study,
04:16 we're finding that teachers treat black students as a group
04:21 but white students as individuals.
04:24 If, for example, one black student misbehaves
04:27 and then a different black student misbehaves a few days later,
04:32 the teacher responds to that second black student
04:35 as if he had misbehaved twice.
04:38 It's as though the sins of one child
04:41 get piled onto the other.
04:43 We create categories to make sense of the world,
04:47 to assert some control and coherence
04:51 to the stimuli that we're constantly being bombarded with.
04:55 Categorization and the bias that it seeds
04:59 allow our brains to make judgments more quickly and efficiently,
05:04 and we do this by instinctively relying on patterns
05:08 that seem predictable.
05:10 Yet, just as the categories we create allow us to make quick decisions,
05:16 they also reinforce bias.
05:18 So the very things that help us to see the world
05:23 also can blind us to it.
05:25 They render our choices effortless,
05:28 friction-free.
05:30 Yet they exact a heavy toll.
05:34 So what can we do?
05:36 We are all vulnerable to bias,
05:39 but we don't act on bias all the time.
05:41 There are certain conditions that can bring bias alive
05:45 and other conditions that can muffle it.
05:47 Let me give you an example.
05:50 Many people are familiar with the tech company Nextdoor.
05:56 So, their whole purpose is to create stronger, healthier, safer neighborhoods.
06:03 And so they offer this online space
06:06 where neighbors can gather and share information.
06:09 Yet, Nextdoor soon found that they had a problem
06:13 with racial profiling.
06:16 In the typical case,
06:18 people would look outside their window
06:20 and see a black man in their otherwise white neighborhood
06:24 and make the snap judgment that he was up to no good,
06:29 even when there was no evidence of criminal wrongdoing.
06:32 In many ways, how we behave online
06:35 is a reflection of how we behave in the world.
06:39 But what we don't want to do is create an easy-to-use system
06:43 that can amplify bias and deepen racial disparities,
06:48 rather than dismantling them.
06:50 So the cofounder of Nextdoor reached out to me and to others
06:54 to try to figure out what to do.
06:56 And they realized that to curb racial profiling on the platform,
07:00 they were going to have to add friction;
07:02 that is, they were going to have to slow people down.
07:05 So Nextdoor had a choice to make,
07:07 and against every impulse,
07:09 they decided to add friction.
07:12 And they did this by adding a simple checklist.
07:15 There were three items on it.
07:18 First, they asked users to pause
07:21 and think, "What was this person doing that made him suspicious?"
07:26 The category "black man" is not grounds for suspicion.
07:31 Second, they asked users to describe the person's physical features,
07:36 not simply their race and gender.
07:39 Third, they realized that a lot of people
07:43 didn't seem to know what racial profiling was,
07:46 nor that they were engaging in it.
07:48 So Nextdoor provided them with a definition
07:51 and told them that it was strictly prohibited.
07:55 Most of you have seen those signs in airports
07:57 and in metro stations, "If you see something, say something."
08:01 Nextdoor tried modifying this.
08:05 "If you see something suspicious,
08:08 say something specific."
08:11 And using this strategy, by simply slowing people down,
08:15 Nextdoor was able to curb racial profiling by 75 percent.
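The friction here is easy to picture as a submission gate: a suspicious-person post goes out only after all three checklist items are addressed. A minimal sketch with hypothetical field names and thresholds, not Nextdoor's actual code:

```python
# Sketch of the checklist-as-friction idea; hypothetical fields and rules.
from dataclasses import dataclass

@dataclass
class SuspiciousPersonPost:
    behavior: str              # item 1: what was the person doing?
    description: str           # item 2: features beyond race and gender
    policy_acknowledged: bool  # item 3: read the racial-profiling policy

def submission_problems(post: SuspiciousPersonPost) -> list[str]:
    problems = []
    if len(post.behavior.split()) < 5:
        problems.append("Describe what the person was doing that made them suspicious.")
    if len(post.description.split()) < 5:
        problems.append("Describe physical features, not simply race and gender.")
    if not post.policy_acknowledged:
        problems.append("Confirm you have read the policy; racial profiling is prohibited.")
    return problems  # an empty list means the post can be submitted
```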
08:22 Now, people often will say to me,
08:24 "You can't add friction in every situation, in every context,
08:29 and especially for people who make split-second decisions all the time."
08:34 But it turns out we can add friction
08:37 to more situations than we think.
08:40 Working with the Oakland Police Department
08:42 in California,
08:43 I and a number of my colleagues were able to help the department
08:47 to reduce the number of stops they made
08:50 of people who were not committing any serious crimes.
08:53 And we did this by pushing officers
08:56 to ask themselves a question before each and every stop they made:
09:01 "Is this stop intelligence-led,
09:04 yes or no?"
09:07 In other words,
09:09 do I have prior information to tie this particular person
09:14 to a specific crime?
09:16 By adding that question
09:18 to the form officers complete during a stop,
09:21 they slow down, they pause,
09:23 they think, "Why am I considering pulling this person over?"
09:28 In 2017, before we added that intelligence-led question to the form,
09:35 officers made about 32,000 stops across the city.
09:39 In that next year, with the addition of this question,
09:43 that fell to 19,000 stops.
09:46 African-American stops alone fell by 43 percent.
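As a quick check on those numbers: the overall drop, from about 32,000 to 19,000 stops, works out to roughly 41 percent, while the 43 percent figure applies to stops of African-Americans specifically.

```python
# Arithmetic on the stop counts quoted above.
before, after = 32_000, 19_000
print(f"overall reduction: {(before - after) / before:.0%}")  # -> 41%
```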
09:51 And stopping fewer black people did not make the city any more dangerous.
09:56 In fact, the crime rate continued to fall,
09:59 and the city became safer for everybody.
10:02 So one solution can come from reducing the number of unnecessary stops.
10:08 Another can come from improving the quality of the stops
10:12 officers do make.
10:14 And technology can help us here.
10:17 We all know about George Floyd's death,
10:20 because those who tried to come to his aid held cell phone cameras
10:25 to record that horrific, fatal encounter with the police.
10:30 But we have all sorts of technology that we're not putting to good use.
10:35 Police departments across the country
10:38 are now required to wear body-worn cameras
10:41 so we have recordings of not only the most extreme and horrific encounters
10:47 but of everyday interactions.
10:50 With an interdisciplinary team at Stanford,
10:53 we've begun to use machine learning techniques
10:56 to analyze large numbers of encounters.
10:59 This is to better understand what happens in routine traffic stops.
11:04 What we found was that
11:06 even when police officers are behaving professionally,
11:10 they speak to black drivers less respectfully than white drivers.
11:16 In fact, from the words officers use alone,
11:20 we could predict whether they were talking to a black driver or a white driver.
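Mechanically, predicting the driver's race "from the words officers use alone" is a text-classification task. A minimal sketch under assumptions, using a hypothetical utterance file and bag-of-words features; this is not the Stanford team's actual pipeline:

```python
# Sketch: can officer language alone predict the driver's race?
# Hypothetical CSV with columns text, driver_race; not the real pipeline.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("officer_utterances.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], (df["driver_race"] == "black").astype(int),
    test_size=0.2, random_state=0)

vec = TfidfVectorizer(ngram_range=(1, 2), min_df=5)
clf = LogisticRegression(max_iter=1000)
clf.fit(vec.fit_transform(X_train), y_train)

probs = clf.predict_proba(vec.transform(X_test))[:, 1]
print("AUC:", roc_auc_score(y_test, probs))  # above 0.5: words carry signal
```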
11:25 The problem is that the vast majority of the footage from these cameras
11:31 is not used by police departments
11:33 to understand what's going on on the street
11:35 or to train officers.
11:38 And that's a shame.
11:40 How does a routine stop turn into a deadly encounter?
11:45 How did this happen in George Floyd's case?
11:49 How did it happen in others?
11:51 When my eldest son was 16 years old,
11:55 he discovered that when white people look at him,
11:58 they feel fear.
12:01 Elevators are the worst, he said.
12:04 When those doors close,
12:06 people are trapped in this tiny space
12:09 with someone they have been taught to associate with danger.
12:14 My son senses their discomfort,
12:17 and he smiles to put them at ease,
12:21 to calm their fears.
12:23 When he speaks,
12:25 their bodies relax.
12:27 They breathe easier.
12:29 They take pleasure in his cadence,
12:31 his diction, his word choice.
12:34 He sounds like one of them.
12:36 I used to think that my son was a natural extrovert like his father.
12:41 But I realized at that moment, in that conversation,
12:46 that his smile was not a sign that he wanted to connect
12:51 with would-be strangers.
12:53 It was a talisman he used to protect himself,
12:57 a survival skill he had honed over thousands of elevator rides.
13:04 He was learning to accommodate the tension that his skin color generated
13:11 and that put his own life at risk.
13:14 We know that the brain is wired for bias,
13:18 and one way to interrupt that bias is to pause and to reflect
13:22 on the evidence of our assumptions.
13:25 So we need to ask ourselves:
13:27 What assumptions do we bring when we step onto an elevator?
13:33 Or an airplane?
13:35 How do we make ourselves aware of our own unconscious bias?
13:40 Who do those assumptions keep safe?
13:44 Who do they put at risk?
13:47 Until we ask these questions
13:50 and insist that our schools and our courts and our police departments
13:55 and every institution do the same,
13:59 we will continue to allow bias
14:03 to blind us.
14:05 And if we do,
14:08 none of us are truly safe.
14:14 Thank you.
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7