How to use data to make a hit TV show | Sebastian Wernicke

133,338 views ・ 2016-01-27

TED



Translator: Chia Shimin Reviewer: Yolanda Zhang
Roy Price is a man that most of you have probably never heard about, even though he may have been responsible for 22 somewhat mediocre minutes of your life on April 19, 2013. He may have also been responsible for 22 very entertaining minutes, but not very many of you. And all of that goes back to a decision that Roy had to make about three years ago.
So you see, Roy Price is a senior executive with Amazon Studios. That's the TV production company of Amazon. He's 47 years old, slim, spiky hair, describes himself on Twitter as "movies, TV, technology, tacos." And Roy Price has a very responsible job, because it's his responsibility to pick the shows, the original content that Amazon is going to make. And of course that's a highly competitive space. I mean, there are so many TV shows already out there, that Roy can't just choose any show. He has to find shows that are really, really great. So in other words, he has to find shows that are on the very right end of this curve here.
So this curve here is the rating distribution of about 2,500 TV shows on the website IMDB, and the rating goes from one to 10, and the height here shows you how many shows get that rating. So if your show gets a rating of nine points or higher, that's a winner. Then you have a top two percent show. That's shows like "Breaking Bad," "Game of Thrones," "The Wire," so all of these shows that are addictive, whereafter you've watched a season, your brain is basically like, "Where can I get more of these episodes?" That kind of show. On the left side, just for clarity, here on that end, you have a show called "Toddlers and Tiaras" --

(Laughter)

-- which should tell you enough about what's going on on that end of the curve.
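The "top two percent" cutoff can be illustrated with a few lines of code. This is a hypothetical sketch using synthetic ratings (drawn from a bell curve centered near the 7.4 average mentioned later in the talk), not the actual IMDB data; it only shows how a 9.0 threshold carves out a small tail of a rating distribution.

```python
# Sketch: where does a 9.0 cutoff sit in a rating distribution?
# The ratings below are synthetic stand-ins for the ~2,500 IMDB
# scores mentioned in the talk, not real data.
import random

random.seed(42)
ratings = [min(10.0, max(1.0, random.gauss(7.4, 0.8))) for _ in range(2500)]

winners = [r for r in ratings if r >= 9.0]
share = len(winners) / len(ratings)
print(f"Shows rated 9.0 or higher: {len(winners)} ({share:.1%})")
```

With these made-up numbers, roughly two percent of shows clear the 9.0 bar, which is in line with the "top two percent" framing in the talk.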
Now, Roy Price is not worried about getting on the left end of the curve, because I think you would have to have some serious brainpower to undercut "Toddlers and Tiaras." So what he's worried about is this middle bulge here, the bulge of average TV, you know, those shows that aren't really good or really bad, they don't really get you excited. So he needs to make sure that he's really on the right end of this.
So the pressure is on, and of course it's also the first time that Amazon is even doing something like this, so Roy Price does not want to take any chances. He wants to engineer success. He needs a guaranteed success, and so what he does is, he holds a competition. So he takes a bunch of ideas for TV shows, and from those ideas, through an evaluation, they select eight candidates for TV shows, and then he just makes the first episode of each one of these shows and puts them online for free for everyone to watch. And so when Amazon is giving out free stuff, you're going to take it, right? So millions of viewers are watching those episodes.
What they don't realize is that, while they're watching their shows, actually, they are being watched. They are being watched by Roy Price and his team, who record everything. They record when somebody presses play, when somebody presses pause, what parts they skip, what parts they watch again. So they collect millions of data points, because they want to have those data points to then decide which show they should make.
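The kind of bookkeeping described above can be pictured as a raw event log rolled up per show. Everything in this sketch is an assumption for illustration -- the show names, event labels, and scoring rule are invented, not Amazon's actual schema or pipeline -- it only shows the general shape of turning play/pause/skip events into per-show engagement counts.

```python
# Hypothetical sketch: roll up viewer events into a per-show score.
# Show names, event labels, and the scoring rule are all invented
# for illustration.
from collections import Counter

events = [
    ("alpha_house", "play"), ("alpha_house", "pause"),
    ("alpha_house", "skip"), ("betas", "play"),
    ("betas", "rewatch"), ("betas", "play"),
]

# Count each (show, event) pair, then score: rewatches suggest
# engagement, skips suggest the opposite.
counts = Counter(events)
shows = {show for show, _ in events}
engagement = {
    show: counts[(show, "play")]
          + 2 * counts[(show, "rewatch")]
          - counts[(show, "skip")]
    for show in shows
}
print(engagement)  # e.g. {'betas': 4, 'alpha_house': 0}
```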
And sure enough, so they collect all the data, they do all the data crunching, and an answer emerges, and the answer is, "Amazon should do a sitcom about four Republican US Senators." They did that show. So does anyone know the name of the show?

(Audience: "Alpha House.")

Yes, "Alpha House," but it seems like not too many of you here remember that show, actually, because it didn't turn out that great. It's actually just an average show, actually -- literally, in fact, because the average of this curve here is at 7.4, and "Alpha House" lands at 7.5, so a slightly above average show, but certainly not what Roy Price and his team were aiming for.
Meanwhile, however, at about the same time, at another company, another executive did manage to land a top show using data analysis, and his name is Ted, Ted Sarandos, who is the Chief Content Officer of Netflix, and just like Roy, he's on a constant mission to find that great TV show, and he uses data as well to do that, except he does it a little bit differently. So instead of holding a competition, what he did -- and his team of course -- was they looked at all the data they already had about Netflix viewers, you know, the ratings they give their shows, the viewing histories, what shows people like, and so on. And then they use that data to discover all of these little bits and pieces about the audience: what kinds of shows they like, what kind of producers, what kind of actors. And once they had all of these pieces together, they took a leap of faith, and they decided to license not a sitcom about four Senators but a drama series about a single Senator. You guys know the show?

(Laughter)

Yes, "House of Cards," and Netflix of course, nailed it with that show, at least for the first two seasons.

(Laughter) (Applause)

"House of Cards" gets a 9.1 rating on this curve, so it's exactly where they wanted it to be.
Now, the question of course is, what happened here? So you have two very competitive, data-savvy companies. They connect all of these millions of data points, and then it works beautifully for one of them, and it doesn't work for the other one. So why? Because logic kind of tells you that this should be working all the time. I mean, if you're collecting millions of data points on a decision you're going to make, then you should be able to make a pretty good decision. You have 200 years of statistics to rely on. You're amplifying it with very powerful computers. The least you could expect is good TV, right?
And if data analysis does not work that way, then it actually gets a little scary, because we live in a time where we're turning to data more and more to make very serious decisions that go far beyond TV. Does anyone here know the company Multi-Health Systems? No one. OK, that's good actually. OK, so Multi-Health Systems is a software company, and I hope that nobody here in this room ever comes into contact with that software, because if you do, it means you're in prison.

(Laughter)

If someone here in the US is in prison, and they apply for parole, then it's very likely that data analysis software from that company will be used in determining whether to grant that parole. So it's the same principle as Amazon and Netflix, but now instead of deciding whether a TV show is going to be good or bad, you're deciding whether a person is going to be good or bad. And mediocre TV, 22 minutes, that can be pretty bad, but more years in prison, I guess, even worse.
And unfortunately, there is actually some evidence that this data analysis, despite having lots of data, does not always produce optimum results. And that's not because a company like Multi-Health Systems doesn't know what to do with data. Even the most data-savvy companies get it wrong. Yes, even Google gets it wrong sometimes. In 2009, Google announced that they were able, with data analysis, to predict outbreaks of influenza, the nasty kind of flu, by doing data analysis on their Google searches. And it worked beautifully, and it made a big splash in the news, including the pinnacle of scientific success: a publication in the journal "Nature." It worked beautifully for year after year after year, until one year it failed. And nobody could even tell exactly why. It just didn't work that year, and of course that again made big news, including now a retraction of a publication from the journal "Nature." So even the most data-savvy companies, Amazon and Google, they sometimes get it wrong. And despite all those failures, data is moving rapidly into real-life decision-making -- into the workplace, law enforcement, medicine. So we had better make sure that data is helping.
Now, personally I've seen a lot of this struggle with data myself, because I work in computational genetics, which is also a field where lots of very smart people are using unimaginable amounts of data to make pretty serious decisions like deciding on a cancer therapy or developing a drug. And over the years, I've noticed a sort of pattern or kind of rule, if you will, about the difference between successful decision-making with data and unsuccessful decision-making, and I find this a pattern worth sharing, and it goes something like this. So whenever you're solving a complex problem, you're doing essentially two things. The first one is, you take that problem apart into its bits and pieces so that you can deeply analyze those bits and pieces, and then of course you do the second part. You put all of these bits and pieces back together again to come to your conclusion. And sometimes you have to do it over again, but it's always those two things: taking apart and putting back together again.
And now the crucial thing is that data and data analysis is only good for the first part. Data and data analysis, no matter how powerful, can only help you taking a problem apart and understanding its pieces. It's not suited to put those pieces back together again and then to come to a conclusion. There's another tool that can do that, and we all have it, and that tool is the brain. If there's one thing a brain is good at, it's taking bits and pieces back together again, even when you have incomplete information, and coming to a good conclusion, especially if it's the brain of an expert. And that's why I believe that Netflix was so successful, because they used data and brains where they belong in the process. They use data to first understand lots of pieces about their audience that they otherwise wouldn't have been able to understand at that depth, but then the decision to take all these bits and pieces and put them back together again and make a show like "House of Cards," that was nowhere in the data.
Ted Sarandos and his team made that decision to license that show, which also meant, by the way, that they were taking a pretty big personal risk with that decision. And Amazon, on the other hand, they did it the wrong way around. They used data all the way to drive their decision-making, first when they held their competition of TV ideas, then when they selected "Alpha House" to make as a show. Which of course was a very safe decision for them, because they could always point at the data, saying, "This is what the data tells us." But it didn't lead to the exceptional results that they were hoping for.
So data is of course a massively useful tool to make better decisions, but I believe that things go wrong when data is starting to drive those decisions. No matter how powerful, data is just a tool, and to keep that in mind, I find this device here quite useful. Many of you will ...

(Laughter)

Before there was data, this was the decision-making device to use.

(Laughter)

Many of you will know this. This toy here is called the Magic 8 Ball, and it's really amazing, because if you have a decision to make, a yes or no question, all you have to do is you shake the ball, and then you get an answer -- "Most Likely" -- right here in this window in real time. I'll have it out later for tech demos.

(Laughter)

Now, the thing is, of course -- so I've made some decisions in my life where, in hindsight, I should have just listened to the ball. But, you know, of course, if you have the data available, you want to replace this with something much more sophisticated, like data analysis to come to a better decision. But that does not change the basic setup. So the ball may get smarter and smarter and smarter, but I believe it's still on us to make the decisions if we want to achieve something extraordinary, on the right end of the curve. And I find that a very encouraging message, in fact, that even in the face of huge amounts of data, it still pays off to make decisions, to be an expert in what you're doing and take risks. Because in the end, it's not data, it's risks that will land you on the right end of the curve.

Thank you.

(Applause)