How to make applying for jobs less painful | The Way We Work, a TED series

153,986 views ・ 2019-02-09

TED



00:00
Applying for jobs online is one of the worst digital experiences of our time. And applying for jobs in person really isn't much better. [The Way We Work]

00:11
Hiring as we know it is broken on many fronts. It's a terrible experience for people. About 75 percent of people who applied to jobs using various methods in the past year said they never heard anything back from the employer. And at the company level it's not much better. 46 percent of people get fired or quit within the first year of starting their jobs. It's pretty mind-blowing. It's also bad for the economy. For the first time in history, we have more open jobs than we have unemployed people, and to me that screams that we have a problem. I believe that at the crux of all of this is a single piece of paper: the résumé.

00:43
A résumé definitely has some useful pieces in it: what roles people have had, computer skills, what languages they speak. But what it misses is what they have the potential to do that they might not have had the opportunity to do in the past. And with such a quickly changing economy, where jobs are coming online that might require skills that nobody has, if we only look at what someone has done in the past, we're not going to be able to match people to the jobs of the future.

01:06
So this is where I think technology can be really helpful. You've probably seen that algorithms have gotten pretty good at matching people to things, but what if we could use that same technology to actually help us find jobs that we're really well-suited for? But I know what you're thinking. Algorithms picking your next job sounds a little bit scary, but there is one thing that has been shown to be really predictive of someone's future success in a job, and that's what's called a multimeasure test. Multimeasure tests really aren't anything new, but they used to be really expensive and required a PhD sitting across from you, answering lots of questions and writing reports. Multimeasure tests are a way to understand someone's inherent traits -- your memory, your attentiveness.

01:46
What if we could take multimeasure tests and make them scalable and accessible, and provide data to employers about what traits really make someone a good fit for a job? This all sounds abstract. Let's try one of the games together. You're about to see a flashing circle, and your job is going to be to clap when the circle is red and do nothing when it's green.

02:07
[Ready?] [Begin!] [Green circle] [Green circle] [Red circle] [Green circle] [Red circle]

02:21
Maybe you're the type of person who claps the millisecond after a red circle appears. Or maybe you're the type of person who takes just a little bit longer to be 100 percent sure. Or maybe you clap on green even though you're not supposed to. The cool thing here is that this isn't like a standardized test where some people are employable and some people aren't. Instead, it's about understanding the fit between your characteristics and what would make you good at a certain job. We found that if you clap late on red and you never clap on the green, you might be high in attentiveness and high in restraint. People in that quadrant tend to be great students, great test-takers, great at project management or accounting. But if you clap immediately on red and sometimes clap on green, that might mean that you're more impulsive and creative, and we've found that top-performing salespeople often embody these traits.
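To make the game concrete, here is a minimal scoring sketch: it turns each trial's color and reaction time into a mean reaction time on red and a false-clap rate on green, then maps those numbers onto the trait quadrants described above. The trial format, the 400 ms threshold, and the labels are illustrative assumptions, not the actual scoring behind the talk.

```python
# Illustrative sketch only: one way to score the red/green clap game.
# Thresholds and trait labels are assumptions for demonstration.

from dataclasses import dataclass
from typing import Optional, List

@dataclass
class Trial:
    color: str                   # "red" or "green"
    reaction_ms: Optional[int]   # time until clap, or None if no clap

def score_game(trials: List[Trial]) -> dict:
    red = [t for t in trials if t.color == "red"]
    green = [t for t in trials if t.color == "green"]

    # Average speed of claps on red circles.
    red_hits = [t.reaction_ms for t in red if t.reaction_ms is not None]
    mean_rt = sum(red_hits) / len(red_hits) if red_hits else None

    # How often the player clapped on green, when they should have held back.
    false_alarms = sum(1 for t in green if t.reaction_ms is not None)
    false_alarm_rate = false_alarms / len(green) if green else 0.0

    # Hypothetical quadrant mapping mirroring the talk's description:
    # slow but accurate -> attentive and restrained; fast but loose -> impulsive and creative.
    if false_alarm_rate == 0 and mean_rt is not None and mean_rt > 400:
        profile = "high attentiveness, high restraint"
    elif false_alarm_rate > 0 and mean_rt is not None and mean_rt <= 400:
        profile = "more impulsive, more creative"
    else:
        profile = "mixed"

    return {"mean_rt_ms": mean_rt,
            "false_alarm_rate": false_alarm_rate,
            "profile": profile}

# Example: quick claps on both reds, plus one clap on a green.
print(score_game([Trial("green", None), Trial("red", 250),
                  Trial("green", 300), Trial("red", 230)]))
```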
03:06
The way we actually use this in hiring is we have top performers in a role go through neuroscience exercises like this one. Then we develop an algorithm that understands what makes those top performers unique. And then, when people apply to the job, we're able to surface the candidates who might be best suited for that job.
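As a rough illustration of that pipeline, the sketch below (with invented trait names and scores) summarizes the top performers' game results as a single trait profile and ranks applicants by similarity to it; a production system would be considerably more sophisticated than this.

```python
# Minimal sketch of the pipeline described above, with invented details:
# represent each person as a vector of trait scores from the exercises,
# summarize the role's top performers, and rank applicants by similarity.

import math
from typing import Dict, List

Traits = Dict[str, float]   # e.g. {"attentiveness": 0.8, "restraint": 0.6, ...}

def centroid(profiles: List[Traits]) -> Traits:
    # Average trait profile of the role's top performers.
    keys = profiles[0].keys()
    return {k: sum(p[k] for p in profiles) / len(profiles) for k in keys}

def similarity(a: Traits, b: Traits) -> float:
    # Cosine similarity over the shared trait dimensions.
    keys = a.keys()
    dot = sum(a[k] * b[k] for k in keys)
    na = math.sqrt(sum(a[k] ** 2 for k in keys))
    nb = math.sqrt(sum(b[k] ** 2 for k in keys))
    return dot / (na * nb)

def rank_candidates(top_performers: List[Traits],
                    applicants: Dict[str, Traits]) -> List[str]:
    target = centroid(top_performers)
    return sorted(applicants,
                  key=lambda name: similarity(applicants[name], target),
                  reverse=True)

# Example with made-up scores for a hypothetical sales role.
top = [{"attentiveness": 0.4, "restraint": 0.3, "impulsivity": 0.9},
       {"attentiveness": 0.5, "restraint": 0.4, "impulsivity": 0.8}]
pool = {"ana": {"attentiveness": 0.9, "restraint": 0.9, "impulsivity": 0.2},
        "ben": {"attentiveness": 0.5, "restraint": 0.3, "impulsivity": 0.85}}
print(rank_candidates(top, pool))   # ben surfaces first for this profile
```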
03:23
So you might be thinking there's a danger in this. The work world today is not the most diverse, and if we're building algorithms based on current top performers, how do we make sure that we're not just perpetuating the biases that already exist? For example, if we were building an algorithm based on top-performing CEOs and used the S&P 500 as a training set, you would actually find that you're more likely to hire a white man named John than any woman. And that's the reality of who's in those roles right now.

03:51
But technology actually poses a really interesting opportunity. We can create algorithms that are more equitable and more fair than human beings have ever been. Every algorithm that we put into production has been pretested to ensure that it doesn't favor any gender or ethnicity. And if there's any population that's being overfavored, we can actually alter the algorithm until that's no longer true.
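One common form such a pretest can take is an audit of selection rates by group, for example against the four-fifths rule used as a heuristic in US hiring guidelines. The sketch below illustrates that style of check with made-up data; it is not necessarily the exact procedure the speaker's team applies.

```python
# Hedged sketch of a pre-deployment fairness check: compare how often each
# demographic group is recommended and flag large gaps. The 80% threshold is
# the standard four-fifths-rule heuristic, used here purely for illustration.

from collections import defaultdict
from typing import Iterable, Tuple, Dict

def selection_rates(outcomes: Iterable[Tuple[str, bool]]) -> Dict[str, float]:
    counts = defaultdict(lambda: [0, 0])        # group -> [selected, total]
    for group, selected in outcomes:
        counts[group][1] += 1
        if selected:
            counts[group][0] += 1
    return {g: sel / total for g, (sel, total) in counts.items()}

def passes_four_fifths(rates: Dict[str, float]) -> bool:
    # Every group's selection rate should be at least 80% of the highest rate.
    best = max(rates.values())
    return all(rate >= 0.8 * best for rate in rates.values())

# Example: recommendations produced by a candidate-scoring model on test data.
results = [("group_a", True)] * 40 + [("group_a", False)] * 60 \
        + [("group_b", True)] * 25 + [("group_b", False)] * 75
rates = selection_rates(results)
print(rates)                      # {'group_a': 0.4, 'group_b': 0.25}
print(passes_four_fifths(rates))  # False -> adjust the model and re-test
```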
04:12
When we focus on the inherent characteristics that can make somebody a good fit for a job, we can transcend racism, classism, sexism, ageism -- even good schoolism. Our best technology and algorithms shouldn't just be used for helping us find our next movie binge or new favorite Justin Bieber song. Imagine if we could harness the power of technology to get real guidance on what we should be doing based on who we are at a deeper level.