Translator: 易帆 余
Reviewer: Ernie Hsieh
00:12
Roy Price is a man that most of you have probably never heard about,
00:17
even though he may have been responsible
00:19
for 22 somewhat mediocre minutes of your life on April 19, 2013.
00:26
He may have also been responsible for 22 very entertaining minutes,
00:29
but not very many of you.
00:32
And all of that goes back to a decision
00:33
that Roy had to make about three years ago.
00:35
So you see, Roy Price is a senior executive with Amazon Studios.
00:40
That's the TV production company of Amazon.
00:43
He's 47 years old, slim, spiky hair,
00:47
describes himself on Twitter as "movies, TV, technology, tacos."
00:52
And Roy Price has a very responsible job, because it's his responsibility
00:57
to pick the shows, the original content that Amazon is going to make.
01:01
And of course that's a highly competitive space.
01:03
I mean, there are so many TV shows already out there,
01:06
that Roy can't just choose any show.
01:08
He has to find shows that are really, really great.
01:12
So in other words, he has to find shows
01:15
that are on the very right end of this curve here.
01:17
So this curve here is the rating distribution
01:20
of about 2,500 TV shows on the website IMDB,
01:25
and the rating goes from one to 10,
01:27
and the height here shows you how many shows get that rating.
01:30
So if your show gets a rating of nine points or higher, that's a winner.
01:35
Then you have a top two percent show.
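To make that arithmetic concrete: out of roughly 2,500 shows, the top two percent is only about the 50 highest-rated titles. A minimal sketch of how such a distribution and cutoff could be computed, using synthetic ratings rather than the actual IMDB data:

```python
from collections import Counter
import random

# Synthetic stand-in for the ~2,500 IMDB ratings (not the real data).
random.seed(0)
ratings = [min(10.0, max(1.0, random.gauss(7.4, 1.0))) for _ in range(2500)]

# The curve from the talk: how many shows land at each (rounded) rating.
distribution = Counter(round(r) for r in ratings)
for score in range(1, 11):
    print(f"{score:2d} {'#' * (distribution[score] // 20)}")

# The "top two percent" threshold: the rating a show must beat.
cutoff = sorted(ratings)[int(len(ratings) * 0.98)]
print(f"top-2% cutoff: {cutoff:.1f}")  # lands near 9 under these assumptions
```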
01:37
That's shows like "Breaking Bad," "Game of Thrones," "The Wire,"
01:41
so all of these shows that are addictive,
01:43
where, after you've watched a season, your brain is basically like,
01:46
"Where can I get more of these episodes?"
01:49
That kind of show.
01:50
On the left side, just for clarity, here on that end,
01:53
you have a show called "Toddlers and Tiaras" --
01:56
(Laughter)
01:59
-- which should tell you enough
02:00
about what's going on on that end of the curve.
02:03
Now, Roy Price is not worried about getting on the left end of the curve,
02:07
because I think you would have to have some serious brainpower
02:10
to undercut "Toddlers and Tiaras."
02:11
So what he's worried about is this middle bulge here,
02:15
the bulge of average TV,
02:17
you know, those shows that aren't really good or really bad,
02:20
they don't really get you excited.
02:22
So he needs to make sure that he's really on the right end of this.
02:27
So the pressure is on,
02:28
and of course it's also the first time
02:31
that Amazon is even doing something like this,
02:33
so Roy Price does not want to take any chances.
02:36
He wants to engineer success.
02:39
He needs a guaranteed success,
02:40
and so what he does is, he holds a competition.
02:43
So he takes a bunch of ideas for TV shows,
02:46
and from those ideas, through an evaluation,
02:48
they select eight candidates for TV shows,
02:53
and then he just makes the first episode of each one of these shows
02:56
and puts them online for free for everyone to watch.
02:59
And so when Amazon is giving out free stuff,
03:01
you're going to take it, right?
03:03
So millions of viewers are watching those episodes.
03:08
What they don't realize is that, while they're watching their shows,
03:11
actually, they are being watched.
03:14
They are being watched by Roy Price and his team,
03:16
who record everything.
03:17
They record when somebody presses play, when somebody presses pause,
03:21
what parts they skip, what parts they watch again.
03:23
So they collect millions of data points,
03:26
because they want to have those data points
03:28
to then decide which show they should make.
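To picture what "recording everything" means in practice, here is a minimal sketch of that kind of event counting. The event records and field names are hypothetical, not Amazon's actual pipeline:

```python
from collections import defaultdict

# Hypothetical playback events: (viewer_id, pilot, action, position_in_seconds).
events = [
    ("viewer_1", "pilot_a", "play", 0),
    ("viewer_1", "pilot_a", "skip", 300),
    ("viewer_2", "pilot_a", "rewatch", 540),
    ("viewer_2", "pilot_b", "pause", 120),
    ("viewer_3", "pilot_b", "play", 0),
]

# Roll millions of such data points up into per-pilot signals.
signals = defaultdict(lambda: defaultdict(int))
for viewer, pilot, action, position in events:
    signals[pilot][action] += 1

for pilot, counts in sorted(signals.items()):
    # Many rewatches and few skips would read as engagement.
    print(pilot, dict(counts))
```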
03:30
And sure enough, so they collect all the data,
03:33
they do all the data crunching, and an answer emerges,
03:35
and the answer is,
03:36
"Amazon should do a sitcom about four Republican US Senators."
03:42
They did that show.
03:43
So does anyone know the name of the show?
03:46
(Audience: "Alpha House.")
03:48
Yes, "Alpha House,"
03:49
but it seems like not too many of you here remember that show, actually,
03:53
because it didn't turn out that great.
03:55
It's actually just an average show,
03:57
actually -- literally, in fact, because the average of this curve here is at 7.4,
04:02
and "Alpha House" lands at 7.5,
04:04
so a slightly above average show,
04:06
but certainly not what Roy Price and his team were aiming for.
04:10
Meanwhile, however, at about the same time,
04:13
at another company,
04:14
another executive did manage to land a top show using data analysis,
04:19
and his name is Ted,
04:20
Ted Sarandos, who is the Chief Content Officer of Netflix,
04:24
and just like Roy, he's on a constant mission
04:26
to find that great TV show,
04:27
and he uses data as well to do that,
04:29
except he does it a little bit differently.
04:31
So instead of holding a competition, what he did -- and his team of course --
04:35
was they looked at all the data they already had about Netflix viewers,
04:39
you know, the ratings they give their shows,
04:41
the viewing histories, what shows people like, and so on.
04:44
And then they use that data to discover
04:45
all of these little bits and pieces about the audience:
04:48
what kinds of shows they like,
04:50
what kind of producers, what kind of actors.
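As a sketch of how such bits and pieces can be pulled out of viewing histories, here is one naive slicing; the records, genres, and names are invented for illustration and are not Netflix's actual model:

```python
from collections import Counter

# Invented viewing records: (rating_given, genre, lead_actor).
history = [
    (9, "political drama", "actor_a"),
    (8, "political drama", "actor_b"),
    (6, "sitcom", "actor_c"),
    (9, "thriller", "actor_a"),
    (7, "sitcom", "actor_b"),
]

# One "piece" per attribute: what do the highly rated views share?
liked = [(genre, actor) for rating, genre, actor in history if rating >= 8]
favorite_genres = Counter(genre for genre, _ in liked)
favorite_actors = Counter(actor for _, actor in liked)
print(favorite_genres.most_common(2))
print(favorite_actors.most_common(2))
# Turning these pieces into one show to license is the leap of faith.
```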
04:52
And once they had all of these pieces together,
04:54
they took a leap of faith,
04:56
and they decided to license
04:58
not a sitcom about four Senators
05:01
but a drama series about a single Senator.
05:04
You guys know the show?
05:06
(Laughter)
05:07
Yes, "House of Cards," and Netflix, of course, nailed it with that show,
05:11
at least for the first two seasons.
05:13
(Laughter) (Applause)
05:17
"House of Cards" gets a 9.1 rating on this curve,
05:20
so it's exactly where they wanted it to be.
05:24
Now, the question of course is, what happened here?
05:26
So you have two very competitive, data-savvy companies.
05:29
They connect all of these millions of data points,
05:32
and then it works beautifully for one of them,
05:34
and it doesn't work for the other one.
05:36
So why?
05:37
Because logic kind of tells you that this should be working all the time.
05:41
I mean, if you're collecting millions of data points
05:43
on a decision you're going to make,
05:45
then you should be able to make a pretty good decision.
05:47
You have 200 years of statistics to rely on.
05:50
You're amplifying it with very powerful computers.
05:53
The least you could expect is good TV, right?
05:57
And if data analysis does not work that way,
06:01
then it actually gets a little scary,
06:03
because we live in a time where we're turning to data more and more
06:07
to make very serious decisions that go far beyond TV.
06:12
Does anyone here know the company Multi-Health Systems?
06:17
No one. OK, that's good actually.
06:18
OK, so Multi-Health Systems is a software company,
06:22
and I hope that nobody here in this room
06:24
ever comes into contact with that software,
06:28
because if you do, it means you're in prison.
06:30
(Laughter)
06:31
If someone here in the US is in prison, and they apply for parole,
06:34
then it's very likely that data analysis software from that company
06:39
will be used in determining whether to grant that parole.
06:42
So it's the same principle as Amazon and Netflix,
06:45
but now instead of deciding whether a TV show is going to be good or bad,
06:50
you're deciding whether a person is going to be good or bad.
06:53
And mediocre TV, 22 minutes, that can be pretty bad,
06:58
but more years in prison, I guess, even worse.
07:02
And unfortunately, there is actually some evidence that this data analysis,
07:06
despite having lots of data, does not always produce optimum results.
07:10
And that's not because a company like Multi-Health Systems
07:13
doesn't know what to do with data.
07:15
Even the most data-savvy companies get it wrong.
07:17
Yes, even Google gets it wrong sometimes.
07:20
In 2009, Google announced that they were able, with data analysis,
07:25
to predict outbreaks of influenza, the nasty kind of flu,
07:29
by doing data analysis on their Google searches.
07:33
And it worked beautifully, and it made a big splash in the news,
07:37
including the pinnacle of scientific success:
07:39
a publication in the journal "Nature."
07:41
It worked beautifully for year after year after year,
07:45
until one year it failed.
07:47
And nobody could even tell exactly why.
07:49
It just didn't work that year,
07:51
and of course that again made big news,
07:52
including now a retraction
07:54
of a publication from the journal "Nature."
07:58
So even the most data-savvy companies, Amazon and Google,
08:01
they sometimes get it wrong.
08:04
And despite all those failures,
08:06
data is moving rapidly into real-life decision-making --
08:10
into the workplace,
08:12
law enforcement,
08:14
medicine.
08:16
So we'd better make sure that data is helping.
08:19
Now, personally I've seen a lot of this struggle with data myself,
08:22
because I work in computational genetics,
08:24
which is also a field where lots of very smart people
08:27
are using unimaginable amounts of data to make pretty serious decisions
08:31
like deciding on a cancer therapy or developing a drug.
08:35
And over the years, I've noticed a sort of pattern
08:37
or kind of rule, if you will, about the difference
08:40
between successful decision-making with data
08:43
and unsuccessful decision-making,
08:44
and I find this a pattern worth sharing, and it goes something like this.
08:50
So whenever you're solving a complex problem,
08:52
you're doing essentially two things.
08:54
The first one is, you take that problem apart into its bits and pieces
08:57
so that you can deeply analyze those bits and pieces,
09:00
and then of course you do the second part.
09:02
You put all of these bits and pieces back together again
09:05
to come to your conclusion.
09:06
And sometimes you have to do it over again,
09:08
but it's always those two things:
09:10
taking apart and putting back together again.
09:14
And now the crucial thing is
09:15
that data and data analysis
09:18
is only good for the first part.
09:21
Data and data analysis, no matter how powerful,
09:23
can only help you take a problem apart and understand its pieces.
09:28
It's not suited to put those pieces back together again
09:31
and then to come to a conclusion.
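Sketched as code, that division of labor might look like the following; the functions and numbers are placeholders for the idea, not a real system:

```python
def take_apart(problem):
    """Step 1, where data analysis shines: split the problem into
    pieces and summarize each one (here, trivially, by averaging)."""
    return {name: sum(scores) / len(scores) for name, scores in problem.items()}

def put_back_together(pieces):
    """Step 2, which the talk argues belongs to a human expert: weigh
    incomplete pieces and commit to a conclusion. Any rule written
    here is itself a judgment call someone had to make."""
    return max(pieces, key=pieces.get)  # one possible rule, not "the" answer

# A made-up decomposed question: which pilot to green-light?
pilot_signals = {"pilot_a": [7, 8, 6], "pilot_b": [9, 5, 9]}
pieces = take_apart(pilot_signals)
print(pieces, "->", put_back_together(pieces))
```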
09:33
There's another tool that can do that, and we all have it,
09:36
and that tool is the brain.
09:37
If there's one thing a brain is good at,
09:39
it's taking bits and pieces back together again,
09:41
even when you have incomplete information,
09:43
and coming to a good conclusion,
09:45
especially if it's the brain of an expert.
09:48
And that's why I believe that Netflix was so successful,
09:51
because they used data and brains where they belong in the process.
09:54
They use data to first understand lots of pieces about their audience
09:58
that they otherwise wouldn't have been able to understand at that depth,
10:01
but then the decision to take all these bits and pieces
10:04
and put them back together again and make a show like "House of Cards,"
10:07
that was nowhere in the data.
10:09
Ted Sarandos and his team made that decision to license that show,
10:13
which also meant, by the way, that they were taking
10:15
a pretty big personal risk with that decision.
10:18
And Amazon, on the other hand, they did it the wrong way around.
10:21
They used data all the way to drive their decision-making,
10:24
first when they held their competition of TV ideas,
10:26
then when they selected "Alpha House" to make as a show.
10:30
Which of course was a very safe decision for them,
10:32
because they could always point at the data, saying,
10:35
"This is what the data tells us."
10:37
But it didn't lead to the exceptional results that they were hoping for.
10:42
So data is of course a massively useful tool to make better decisions,
10:47
but I believe that things go wrong
10:49
when data is starting to drive those decisions.
10:52
No matter how powerful, data is just a tool,
10:55
and to keep that in mind, I find this device here quite useful.
10:59
Many of you will ...
11:00
(Laughter)
11:01
Before there was data,
11:03
this was the decision-making device to use.
11:05
(Laughter)
11:07
Many of you will know this.
11:08
This toy here is called the Magic 8 Ball,
11:10
and it's really amazing,
11:11
because if you have a decision to make, a yes or no question,
11:14
all you have to do is you shake the ball, and then you get an answer --
11:18
"Most Likely" -- right here in this window in real time.
11:21
I'll have it out later for tech demos.
11:23
(Laughter)
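For what it's worth, the toy's entire decision procedure fits in a few lines; a playful sketch, with a handful of the stock answers:

```python
import random

# A few of the Magic 8 Ball's canned answers.
ANSWERS = ["It is certain", "Most Likely", "Ask again later",
           "Don't count on it", "My sources say no"]

def shake(question: str) -> str:
    # As in the talk: a yes-or-no question goes in, one canned
    # answer comes out, and no data is involved at all.
    return random.choice(ANSWERS)

print(shake("Will this show land on the right end of the curve?"))
```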
11:24
Now, the thing is, of course -- so I've made some decisions in my life
11:28
where, in hindsight, I should have just listened to the ball.
11:31
But, you know, of course, if you have the data available,
11:34
you want to replace this with something much more sophisticated,
11:37
like data analysis to come to a better decision.
11:41
But that does not change the basic setup.
11:43
So the ball may get smarter and smarter and smarter,
11:47
but I believe it's still on us to make the decisions
11:49
if we want to achieve something extraordinary,
11:52
on the right end of the curve.
11:54
And I find that a very encouraging message, in fact,
11:59
that even in the face of huge amounts of data,
12:03
it still pays off to make decisions,
12:07
to be an expert in what you're doing
12:10
and take risks.
12:12
Because in the end, it's not data,
12:15
it's risks that will land you on the right end of the curve.
12:19
Thank you.
12:21
(Applause)