Translator: Shizumi Ch
Reviewer: Wilde Luo
00:12
In ancient Greece, when anyone from slaves to soldiers, poets and politicians needed to make a big decision on life's most important questions, like, "Should I get married?" or "Should we embark on this voyage?" or "Should our army advance into this territory?" they all consulted the oracle.
00:32
So this is how it worked: you would bring her a question and you would get on your knees, and then she would go into this trance. It would take a couple of days, and then eventually she would come out of it, giving you her predictions as your answer.
00:46
From the oracle bones of ancient China to ancient Greece to Mayan calendars, people have craved for prophecy in order to find out what's going to happen next.

00:58
And that's because we all want to make the right decision. We don't want to miss something. The future is scary, so it's much nicer knowing that we can make a decision with some assurance of the outcome.
01:10
Well, we have a new oracle, and its name is big data, or we call it "Watson" or "deep learning" or "neural net."

01:19
And these are the kinds of questions we ask of our oracle now, like, "What's the most efficient way to ship these phones from China to Sweden?" Or, "What are the odds of my child being born with a genetic disorder?" Or, "What's the sales volume we can predict for this product?"
01:39
I have a dog. Her name is Elle, and she hates the rain. And I have tried everything to untrain her. But because I have failed at this, I also have to consult an oracle, called Dark Sky, every time before we go on a walk, for very accurate weather predictions in the next 10 minutes.

02:01
She's so sweet.

02:03
So because of all of this, our oracle is a $122 billion industry.
02:09
Now, despite the size of this industry, the returns are surprisingly low. Investing in big data is easy, but using it is hard. Over 73 percent of big data projects aren't even profitable, and I have executives coming up to me saying, "We're experiencing the same thing. We invested in some big data system, and our employees aren't making better decisions. And they're certainly not coming up with more breakthrough ideas."
02:38
So this is all really interesting to me, because I'm a technology ethnographer. I study and I advise companies on the patterns of how people use technology, and one of my interest areas is data. So why is having more data not helping us make better decisions, especially for companies who have all these resources to invest in these big data systems? Why isn't it getting any easier for them?
03:05
So, I've witnessed the struggle firsthand.

03:09
In 2009, I started a research position with Nokia. And at the time, Nokia was one of the largest cell phone companies in the world, dominating emerging markets like China, Mexico and India -- all places where I had done a lot of research on how low-income people use technology. And I spent a lot of extra time in China getting to know the informal economy. So I did things like working as a street vendor selling dumplings to construction workers. Or I did fieldwork, spending nights and days in internet cafés, hanging out with Chinese youth, so I could understand how they were using games and mobile phones and using it between moving from the rural areas to the cities.
03:50
Through all of this qualitative evidence that I was gathering, I was starting to see so clearly that a big change was about to happen among low-income Chinese people.

04:02
Even though they were surrounded by advertisements for luxury products like fancy toilets -- who wouldn't want one? -- and apartments and cars, through my conversations with them, I found out that the ads that actually enticed them the most were the ones for iPhones, promising them this entry into this high-tech life. And even when I was living with them in urban slums like this one, I saw people investing over half of their monthly income into buying a phone, and increasingly, they were "shanzhai," which are affordable knock-offs of iPhones and other brands. They're very usable. Does the job.
04:44
And after years of living with migrants and working with them and just really doing everything that they were doing, I started piecing all these data points together -- from the things that seem random, like me selling dumplings, to the things that were more obvious, like tracking how much they were spending on their cell phone bills. And I was able to create this much more holistic picture of what was happening. And that's when I started to realize that even the poorest in China would want a smartphone, and that they would do almost anything to get their hands on one.
05:20
You have to keep in mind, iPhones had just come out, it was 2009, so this was, like, eight years ago, and Androids had just started looking like iPhones. And a lot of very smart and realistic people said, "Those smartphones -- that's just a fad. Who wants to carry around these heavy things where batteries drain quickly and they break every time you drop them?"
05:44
But I had a lot of data, and I was very confident about my insights, so I was very excited to share them with Nokia. But Nokia was not convinced, because it wasn't big data. They said, "We have millions of data points, and we don't see any indicators of anyone wanting to buy a smartphone, and your data set of 100, as diverse as it is, is too weak for us to even take seriously."
06:12
And I said, "Nokia, you're right. Of course you wouldn't see this, because you're sending out surveys assuming that people don't know what a smartphone is, so of course you're not going to get any data back about people wanting to buy a smartphone in two years. Your surveys, your methods have been designed to optimize an existing business model, and I'm looking at these emergent human dynamics that haven't happened yet. We're looking outside of market dynamics so that we can get ahead of it."
06:39
Well, you know what happened to Nokia? Their business fell off a cliff. This -- this is the cost of missing something. It was unfathomable.

06:51
But Nokia's not alone. I see organizations throwing out data all the time because it didn't come from a quant model or it doesn't fit in one.

07:02
But it's not big data's fault. It's the way we use big data; it's our responsibility.
07:09
Big data's reputation for success comes from quantifying very specific environments, like electricity power grids or delivery logistics or genetic code, when we're quantifying in systems that are more or less contained. But not all systems are as neatly contained. When you're quantifying and systems are more dynamic, especially systems that involve human beings, forces are complex and unpredictable, and these are things that we don't know how to model so well.

07:41
Once you predict something about human behavior, new factors emerge, because conditions are constantly changing. That's why it's a never-ending cycle. You think you know something, and then something unknown enters the picture. And that's why just relying on big data alone increases the chance that we'll miss something, while giving us this illusion that we already know everything.
08:04
And what makes it really hard to see this paradox and even wrap our brains around it is that we have this thing that I call the quantification bias, which is the unconscious belief of valuing the measurable over the immeasurable.

08:21
And we often experience this at our work. Maybe we work alongside colleagues who are like this, or even our whole entire company may be like this, where people become so fixated on that number, that they can't see anything outside of it, even when you present them evidence right in front of their face.
08:38
And this is a very appealing message, because there's nothing wrong with quantifying; it's actually very satisfying. I get a great sense of comfort from looking at an Excel spreadsheet, even very simple ones.

08:51
(Laughter)

08:53
It's just kind of like, "Yes! The formula worked. It's all OK. Everything is under control."
08:58
But the problem is that quantifying is addictive. And when we forget that and when we don't have something to kind of keep that in check, it's very easy to just throw out data because it can't be expressed as a numerical value. It's very easy just to slip into silver-bullet thinking, as if some simple solution existed.

09:19
Because this is a great moment of danger for any organization, because oftentimes, the future we need to predict -- it isn't in that haystack, but it's that tornado that's bearing down on us outside of the barn.

09:34
There is no greater risk than being blind to the unknown. It can cause you to make the wrong decisions. It can cause you to miss something big.

09:43
But we don't have to go down this path.
09:47
It turns out that the oracle of ancient Greece holds the secret key that shows us the path forward.

09:55
Now, recent geological research has shown that the Temple of Apollo, where the most famous oracle sat, was actually built over two earthquake faults. And these faults would release these petrochemical fumes from underneath the Earth's crust, and the oracle literally sat right above these faults, inhaling enormous amounts of ethylene gas from these fissures.

10:16
(Laughter)

10:17
It's true.

10:19
(Laughter)

10:20
It's all true, and that's what made her babble and hallucinate and go into this trance-like state. She was high as a kite!

10:27
(Laughter)
10:31
So how did anyone -- How did anyone get any useful advice out of her in this state?

10:39
Well, you see those people surrounding the oracle? You see those people holding her up, because she's, like, a little woozy? And you see that guy on your left-hand side holding the orange notebook? Well, those were the temple guides, and they worked hand in hand with the oracle.
10:55
When inquisitors would come and get on their knees, that's when the temple guides would get to work, because after they asked her questions, they would observe their emotional state, and then they would ask them follow-up questions, like, "Why do you want to know this prophecy? Who are you? What are you going to do with this information?" And then the temple guides would take this more ethnographic, this more qualitative information, and interpret the oracle's babblings.

11:21
So the oracle didn't stand alone, and neither should our big data systems.
11:26
Now to be clear, I'm not saying that big data systems are huffing ethylene gas, or that they're even giving invalid predictions. The total opposite. But what I am saying is that in the same way that the oracle needed her temple guides, our big data systems need them, too. They need people like ethnographers and user researchers who can gather what I call thick data.

11:50
This is precious data from humans, like stories, emotions and interactions that cannot be quantified. It's the kind of data that I collected for Nokia that comes in the form of a very small sample size, but delivers incredible depth of meaning. And what makes it so thick and meaty is the experience of understanding the human narrative. And that's what helps to see what's missing in our models.
12:18
Thick data grounds our business questions in human questions, and that's why integrating big and thick data forms a more complete picture. Big data is able to offer insights at scale and leverage the best of machine intelligence, whereas thick data can help us rescue the context loss that comes from making big data usable, and leverage the best of human intelligence.
12:42
And when you actually integrate the two, that's when things get really fun, because then you're no longer just working with data you've already collected. You get to also work with data that hasn't been collected. You get to ask questions about why: Why is this happening?

12:55
Now, when Netflix did this, they unlocked a whole new way to transform their business.
13:01
Netflix is known for their really great recommendation algorithm, and they had this $1 million prize for anyone who could improve it. And there were winners. But Netflix discovered the improvements were only incremental.

13:17
So to really find out what was going on, they hired an ethnographer, Grant McCracken, to gather thick data insights. And what he discovered was something that they hadn't seen initially in the quantitative data. He discovered that people loved to binge-watch. In fact, people didn't even feel guilty about it. They enjoyed it.

13:37
(Laughter)

13:38
So Netflix was like, "Oh. This is a new insight." So they went to their data science team, and they were able to scale this big data insight in with their quantitative data. And once they verified it and validated it, Netflix decided to do something very simple but impactful.
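The workflow Netflix followed -- restate a thick-data observation ("people love to binge-watch") as a measurable hypothesis, then check it at scale against quantitative logs -- can be sketched in miniature. This is an illustrative sketch only: the viewing log, the three-episode threshold, and the function names are all invented here, not Netflix's actual data or pipeline.

```python
# Sketch: validating a thick-data insight against quantitative logs.
# The qualitative claim "people binge-watch" becomes the measurable
# proxy "sessions with >= 3 episodes of the same show".
from collections import defaultdict

# Synthetic viewing log: (user_id, show, episodes watched in one session).
viewing_log = [
    ("u1", "show_a", 5), ("u1", "show_b", 1),
    ("u2", "show_a", 4), ("u2", "show_c", 6),
    ("u3", "show_b", 1), ("u3", "show_c", 1),
    ("u4", "show_a", 7), ("u4", "show_b", 3),
]

def binge_rate(log, min_episodes=3):
    """Share of sessions that qualify as a binge under the chosen proxy."""
    binges = sum(1 for _, _, eps in log if eps >= min_episodes)
    return binges / len(log)

def binge_rate_by_show(log, min_episodes=3):
    """Per-show binge rates, to see which shows drive the behavior."""
    totals, binges = defaultdict(int), defaultdict(int)
    for _, show, eps in log:
        totals[show] += 1
        if eps >= min_episodes:
            binges[show] += 1
    return {show: binges[show] / totals[show] for show in totals}

rate = by_session = binge_rate(viewing_log)
print("overall binge rate:", rate)
print("by show:", binge_rate_by_show(viewing_log))
```

The point of the sketch is the division of labor: the small qualitative sample supplies the hypothesis and its meaning, and the large log only confirms (or refutes) it at scale -- neither side alone would have produced the insight.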
13:56
They said, instead of offering the same show from different genres or more of the different shows from similar users, we'll just offer more of the same show. We'll make it easier for you to binge-watch. And they didn't stop there. They did all these things to redesign their entire viewer experience, to really encourage binge-watching.

14:20
It's why people and friends disappear for whole weekends at a time, catching up on shows like "Master of None."

14:25
By integrating big data and thick data, they not only improved their business, but they transformed how we consume media. And now their stocks are projected to double in the next few years.
14:38
But this isn't just about watching more videos or selling more smartphones. For some, integrating thick data insights into the algorithm could mean life or death, especially for the marginalized.

14:53
All around the country, police departments are using big data for predictive policing, to set bond amounts and sentencing recommendations in ways that reinforce existing biases. NSA's Skynet machine learning algorithm has possibly aided in the deaths of thousands of civilians in Pakistan from misreading cellular device metadata.
15:18
As all of our lives become more automated, from automobiles to health insurance or to employment, it is likely that all of us will be impacted by the quantification bias.

15:32
Now, the good news is that we've come a long way from huffing ethylene gas to make predictions. We have better tools, so let's just use them better. Let's integrate the big data with the thick data. Let's bring our temple guides with the oracles, and whether this work happens in companies or nonprofits or government or even in the software, all of it matters, because that means we're collectively committed to making better data, better algorithms, better outputs and better decisions. This is how we'll avoid missing that something.

16:07
(Applause)