How to reduce bias in your workplace | The Way We Work, a TED series

183,210 views ใƒป 2021-10-25

TED


Translation: michal rosengarten · Editing: Ido Dekkers

00:00
- [Kim Scott] We all have our biases, the set of assumptions that we make, and the things we don't notice about people's race, gender, religion, sexual orientation, appearance, and other traits. They come from the part of our mind that jumps to conclusions, that we might not even be aware that we have.
00:18
- [Trier Bryant] I really can't tell you the number of times people assumed I was the receptionist when I was an executive at the company.
00:25
- That kind of bias gets in the way of good collaboration, performance, and decision-making.
00:30
- It creates an invisible tax of resentment and frustration. The more frustrated we are, the more silent we are likely to be. And the more silent we are, the less we may be able to do our best work. The good news, though, is bias is not inevitable.
[The Way We Work]
00:48
So here's how to disrupt bias in three steps. The first step is to create a shared vocabulary. Sometimes bias shows up in big, embarrassing gaffes, but more often it comes out in the little words and phrases we choose, which are packed with assumptions. In meetings especially, these often go unnoticed, or even worse, people notice but don't know what to say. That's why we recommend coming up with a shared word or phrase that everyone agrees to use to disrupt biased attitudes or behaviors. Examples teams are using are "bias alert," "stoplight," or even throwing up a peace sign.
01:22
- Leaders often ask us to give them the right words, but the best words are the ones your team will actually say, not the ones that leaders impose. So talk to your team. My very favorite is the one that you recommended, Trier: purple flag. When someone says or does something biased, we'll say "purple flag," and maybe we'll even wave a purple flag. It's not a red flag; it's a friendly purple flag.
01:48
- It helps us become more aware of our blind spots.
01:51
- Purple flag.
- Purple flag.

01:53
Thanks for pointing that out. I've been noticing lately I use a lot of sight metaphors that often portray disabilities, like being visually impaired, in negative ways, but I'm committed to doing better and working on it.
02:03
- I am too. Another great shared vocabulary trick is to ask members of your team to respond to bias with an "I" statement. An "I" statement invites the other person in to understand things from your perspective rather than calling them out. Like, "I don't think you're going to take me seriously when you're calling me honey." Or, "I don't think you meant that the way that it sounded." Usually when people's biases are pointed out to them clearly and compassionately, they apologize and correct things going forward. Usually, but not always.
02:34
- That brings us to the second step: create a shared norm for how to respond when your bias is pointed out.
02:40
- When my bias is flagged, I can only be glad that I'm learning something new if I can move past the shame. I hate the idea that I've harmed someone. And when I feel ashamed, I rarely respond well. So it's really helpful to have that shared norm so that I know what to say in those moments.
02:58
- We recommend you start with, "Thank you for pointing that out." It took courage for that person to disrupt the bias, so it's important to acknowledge that. Then there are two choices on what to say next: one, "I get it," or two, "I don't get it. Could you explain more after the meeting?"
03:14
- The other day, you and I were recording a podcast, and I said, "HR serves three masters," and you waved the purple flag. I knew what I had done wrong. Why was I using a slavery metaphor? We hit pause, I thanked you, and we rerecorded. It was no big deal. The thing I love about the purple flag is how efficient it is.
03:34
- Flagging the bias didn't prevent us from getting the work done. In fact, it helps us work together more honestly. It's even harder when I don't know what I did wrong. Once I asked someone out to lunch, and out came the purple flag. I had no idea why, so I was relieved to know what to say next: "Thank you for pointing it out, but I don't get it. Could we talk after the meeting?" Afterwards, the person reminded me that they were fasting for Ramadan. It instantly made sense to me, and I discovered something that I could be more aware of. But to get to awareness, I had to move through shame. It was hard to say "I don't get it." The shared norm helped me listen and learn rather than getting defensive. The fact that there was a norm at all reassured me that other people are making similar kinds of mistakes and that we're all learning together.
04:20
- Disrupting bias may start off feeling uncomfortable, but with time and consistency, we can build the stamina we need to push through it. When it becomes routine for us to notice our biases, all of a sudden they feel less threatening. It's hard to break bias habits, yet we can change the pattern with consistent effort.
04:39
- We've got to be patient with ourselves and with others.
- Patient and also persistent.
- Yeah.
- Which brings us to our last step. Once the team has come up with a shared vocabulary and agrees on the shared norm for how to respond, the team should commit to disrupting bias at least once in every meeting.
04:54
- If bias isn't flagged in a meeting, it doesn't mean there wasn't any bias. It just means either nobody noticed, or nobody knew what to say. When we are silent about bias, we reinforce it. And it can't be just the targets of bias who point it out. Observers and leaders have got to speak up. We all have a responsibility.
05:15
- By making a practice of disrupting bias quickly and kindly, we prevent it from metastasizing into something worse, like prejudice, bullying, discrimination, or harassment.
05:25
- Bias disruptors: a shared vocabulary, a shared norm, and a shared commitment ensure that we notice and learn from the mistakes that we are all making so that we can work better together.
05:36
- When we collaborate, we use our full capacity as humans to get more done collectively than we could ever dream of accomplishing as individuals. So let's stop letting bias get in the way.

05:48
(gentle piano music)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7