The human insights missing from big data | Tricia Wang

248,929 views · 2017-08-02

TED



ืžืชืจื’ื: Zeeva Livshitz ืžื‘ืงืจ: Ido Dekkers
00:12
In ancient Greece,
00:15
when anyone from slaves to soldiers, poets and politicians,
00:19
needed to make a big decision on life's most important questions,
00:23
like, "Should I get married?"
00:24
or "Should we embark on this voyage?"
00:26
or "Should our army advance into this territory?"
00:29
they all consulted the oracle.
00:32
So this is how it worked:
00:34
you would bring her a question and you would get on your knees,
00:37
and then she would go into this trance.
00:39
It would take a couple of days,
00:40
and then eventually she would come out of it,
00:43
giving you her predictions as your answer.
00:46
From the oracle bones of ancient China
00:49
to ancient Greece to Mayan calendars,
00:51
people have craved for prophecy
00:54
in order to find out what's going to happen next.
00:58
And that's because we all want to make the right decision.
01:01
We don't want to miss something.
01:03
The future is scary,
01:05
so it's much nicer knowing that we can make a decision
01:08
with some assurance of the outcome.
01:10
Well, we have a new oracle,
01:12
and its name is big data,
01:14
or we call it "Watson" or "deep learning" or "neural net."
01:19
And these are the kinds of questions we ask of our oracle now,
01:23
like, "What's the most efficient way to ship these phones
01:27
from China to Sweden?"
01:28
Or, "What are the odds
01:30
of my child being born with a genetic disorder?"
01:34
Or, "What is the sales volume we can predict for this product?"
01:39
I have a dog. Her name is Elle, and she hates the rain.
01:43
And I have tried everything to untrain her.
01:47
But because I have failed at this,
01:50
I also have to consult an oracle, called Dark Sky,
01:53
every time before we go on a walk,
01:55
for very accurate weather predictions in the next 10 minutes.
02:01
She's so sweet.
02:03
So because of all of this, our oracle is a $122 billion industry.
02:09
Now, despite the size of this industry,
02:13
the returns are surprisingly low.
02:16
Investing in big data is easy,
02:18
but using it is hard.
02:21
Over 73 percent of big data projects aren't even profitable,
02:25
and I have executives coming up to me saying,
02:28
"We're experiencing the same thing.
02:30
We invested in some big data system,
02:31
and our employees aren't making better decisions.
02:34
And they're certainly not coming up with more breakthrough ideas."
02:38
So this is all really interesting to me,
02:41
because I'm a technology ethnographer.
02:44
I study and I advise companies
02:47
on the patterns of how people use technology,
02:49
and one of my interest areas is data.
02:52
So why is having more data not helping us make better decisions,
02:57
especially for companies who have all these resources
03:00
to invest in these big data systems?
03:02
Why isn't it getting any easier for them?
03:05
So, I've witnessed the struggle firsthand.
03:09
In 2009, I started a research position with Nokia.
03:13
And at the time,
03:14
Nokia was one of the largest cell phone companies in the world,
03:17
dominating emerging markets like China, Mexico and India --
03:20
all places where I had done a lot of research
03:23
on how low-income people use technology.
03:25
And I spent a lot of extra time in China
03:28
getting to know the informal economy.
03:30
So I did things like working as a street vendor
03:33
selling dumplings to construction workers.
03:35
Or I did fieldwork,
03:37
spending nights and days in internet cafés,
03:40
hanging out with Chinese youth, so I could understand
03:42
how they were using games and mobile phones
03:45
and using it between moving from the rural areas to the cities.
03:50
Through all of this qualitative evidence that I was gathering,
03:54
I was starting to see so clearly
03:56
that a big change was about to happen among low-income Chinese people.
04:02
Even though they were surrounded by advertisements for luxury products
04:07
like fancy toilets -- who wouldn't want one? --
04:10
and apartments and cars,
04:13
through my conversations with them,
04:15
I found out that the ads that actually enticed them the most
04:19
were the ones for iPhones,
04:21
promising them this entry into this high-tech life.
04:25
And even when I was living with them in urban slums like this one,
04:28
I saw people investing over half of their monthly income
04:31
into buying a phone,
04:33
and increasingly, they were "shanzhai,"
04:35
which are affordable knock-offs of iPhones and other brands.
04:40
They're very usable.
04:42
Does the job.
04:44
And after years of living with migrants and working with them
04:50
and just really doing everything that they were doing,
04:53
I started piecing all these data points together --
04:57
from the things that seem random, like me selling dumplings,
05:00
to the things that were more obvious,
05:02
like tracking how much they were spending on their cell phone bills.
05:05
And I was able to create this much more holistic picture
05:08
of what was happening.
05:09
And that's when I started to realize
05:11
that even the poorest in China would want a smartphone,
05:14
and that they would do almost anything to get their hands on one.
05:20
You have to keep in mind,
05:23
iPhones had just come out, it was 2009,
05:26
so this was, like, eight years ago,
05:28
and Androids had just started looking like iPhones.
05:30
And a lot of very smart and realistic people said,
05:33
"Those smartphones -- that's just a fad.
05:36
Who wants to carry around these heavy things
05:39
where batteries drain quickly and they break every time you drop them?"
05:44
But I had a lot of data,
05:45
and I was very confident about my insights,
05:48
so I was very excited to share them with Nokia.
05:53
But Nokia was not convinced,
05:55
because it wasn't big data.
05:58
They said, "We have millions of data points,
06:01
and we don't see any indicators of anyone wanting to buy a smartphone,
06:05
and your data set of 100, as diverse as it is, is too weak
06:09
for us to even take seriously."
06:12
And I said, "Nokia, you're right.
06:14
Of course you wouldn't see this,
06:15
because you're sending out surveys assuming that people don't know
06:19
what a smartphone is,
06:20
so of course you're not going to get any data back
06:22
about people wanting to buy a smartphone in two years.
06:25
Your surveys, your methods have been designed
06:27
to optimize an existing business model,
06:29
and I'm looking at these emergent human dynamics
06:32
that haven't happened yet.
06:33
We're looking outside of market dynamics
06:36
so that we can get ahead of it."
06:39
Well, you know what happened to Nokia?
06:41
Their business fell off a cliff.
06:44
This -- this is the cost of missing something.
06:48
It was unfathomable.
06:51
But Nokia's not alone.
06:54
I see organizations throwing out data all the time
06:56
because it didn't come from a quant model
06:59
or it doesn't fit in one.
07:02
But it's not big data's fault.
07:04
It's the way we use big data; it's our responsibility.
07:09
Big data's reputation for success
07:11
comes from quantifying very specific environments,
07:15
like electricity power grids or delivery logistics or genetic code,
07:20
when we're quantifying in systems that are more or less contained.
07:24
But not all systems are as neatly contained.
07:27
When you're quantifying and systems are more dynamic,
07:30
especially systems that involve human beings,
07:34
forces are complex and unpredictable,
07:37
and these are things that we don't know how to model so well.
07:41
Once you predict something about human behavior,
07:43
new factors emerge,
07:45
because conditions are constantly changing.
07:48
That's why it's a never-ending cycle.
07:49
You think you know something,
07:51
and then something unknown enters the picture.
07:53
And that's why just relying on big data alone
07:57
increases the chance that we'll miss something,
07:59
while giving us this illusion that we already know everything.
08:04
And what makes it really hard to see this paradox
08:08
and even wrap our brains around it
08:10
is that we have this thing that I call the quantification bias,
08:14
which is the unconscious belief of valuing the measurable
08:18
over the immeasurable.
08:21
And we often experience this at our work.
08:24
Maybe we work alongside colleagues who are like this,
08:27
or even our whole entire company may be like this,
08:29
where people become so fixated on that number,
08:32
that they can't see anything outside of it,
08:34
even when you present them evidence right in front of their face.
08:38
And this is a very appealing message,
08:42
because there's nothing wrong with quantifying;
08:44
it's actually very satisfying.
08:46
I get a great sense of comfort from looking at an Excel spreadsheet,
08:50
even very simple ones.
08:51
(Laughter)
08:53
It's just kind of like,
08:54
"Yes! The formula worked. It's all OK. Everything is under control."
08:58
But the problem is
09:01
that quantifying is addictive.
09:03
And when we forget that
09:05
and when we don't have something to kind of keep that in check,
09:08
it's very easy to just throw out data
09:10
because it can't be expressed as a numerical value.
09:13
It's very easy just to slip into silver-bullet thinking,
09:16
as if some simple solution existed.
09:19
Because this is a great moment of danger for any organization,
09:23
because oftentimes, the future we need to predict --
09:26
it isn't in that haystack,
09:28
but it's that tornado that's bearing down on us
09:30
outside of the barn.
09:34
There is no greater risk
09:37
than being blind to the unknown.
09:38
It can cause you to make the wrong decisions.
09:40
It can cause you to miss something big.
09:43
But we don't have to go down this path.
09:47
It turns out that the oracle of ancient Greece
09:50
holds the secret key that shows us the path forward.
09:55
Now, recent geological research has shown
09:58
that the Temple of Apollo, where the most famous oracle sat,
10:01
was actually built over two earthquake faults.
10:04
And these faults would release these petrochemical fumes
10:07
from underneath the Earth's crust,
10:09
and the oracle literally sat right above these faults,
10:13
inhaling enormous amounts of ethylene gas from these fissures.
10:16
(Laughter)
10:17
It's true.
10:19
(Laughter)
10:20
It's all true, and that's what made her babble and hallucinate
10:23
and go into this trance-like state.
10:25
She was high as a kite!
10:27
(Laughter)
10:31
So how did anyone --
10:34
How did anyone get any useful advice out of her
10:37
in this state?
10:39
Well, you see those people surrounding the oracle?
10:41
You see those people holding her up,
10:43
because she's, like, a little woozy?
10:45
And you see that guy on your left-hand side
10:47
holding the orange notebook?
10:49
Well, those were the temple guides,
10:51
and they worked hand in hand with the oracle.
10:55
When inquisitors would come and get on their knees,
10:58
that's when the temple guides would get to work,
11:00
because after they asked her questions,
11:02
they would observe their emotional state,
11:04
and then they would ask them follow-up questions,
11:07
like, "Why do you want to know this prophecy? Who are you?
11:09
What are you going to do with this information?"
11:12
And then the temple guides would take this more ethnographic,
11:15
this more qualitative information,
11:17
and interpret the oracle's babblings.
11:21
So the oracle didn't stand alone,
11:23
and neither should our big data systems.
11:26
Now to be clear,
11:27
I'm not saying that big data systems are huffing ethylene gas,
11:31
or that they're even giving invalid predictions.
11:33
The total opposite.
11:34
But what I am saying
11:36
is that in the same way that the oracle needed her temple guides,
11:40
our big data systems need them, too.
11:42
They need people like ethnographers and user researchers
11:47
who can gather what I call thick data.
11:50
This is precious data from humans,
11:53
like stories, emotions and interactions that cannot be quantified.
11:57
It's the kind of data that I collected for Nokia
11:59
that comes in the form of a very small sample size,
12:02
but delivers incredible depth of meaning.
12:05
And what makes it so thick and meaty
12:10
is the experience of understanding the human narrative.
12:14
And that's what helps to see what's missing in our models.
12:18
Thick data grounds our business questions in human questions,
12:22
and that's why integrating big and thick data
12:26
forms a more complete picture.
12:28
Big data is able to offer insights at scale
12:31
and leverage the best of machine intelligence,
12:34
whereas thick data can help us rescue the context loss
12:37
that comes from making big data usable,
12:39
and leverage the best of human intelligence.
12:42
And when you actually integrate the two, that's when things get really fun,
12:45
because then you're no longer just working with data
12:48
you've already collected.
12:49
You get to also work with data that hasn't been collected.
12:52
You get to ask questions about why:
12:53
Why is this happening?
12:55
Now, when Netflix did this,
12:57
they unlocked a whole new way to transform their business.
13:01
Netflix is known for their really great recommendation algorithm,
13:05
and they had this $1 million prize for anyone who could improve it.
13:10
And there were winners.
13:12
But Netflix discovered the improvements were only incremental.
13:17
So to really find out what was going on,
13:19
they hired an ethnographer, Grant McCracken,
13:22
to gather thick data insights.
13:24
And what he discovered was something that they hadn't seen initially
13:28
in the quantitative data.
13:30
He discovered that people loved to binge-watch.
13:33
In fact, people didn't even feel guilty about it.
13:36
They enjoyed it.
13:37
(Laughter)
13:38
So Netflix was like, "Oh. This is a new insight."
13:40
So they went to their data science team,
13:42
and they were able to scale this big data insight
13:45
in with their quantitative data.
13:47
And once they verified it and validated it,
13:50
Netflix decided to do something very simple but impactful.
13:56
They said, instead of offering the same show from different genres
14:03
or more of the different shows from similar users,
14:07
we'll just offer more of the same show.
14:09
We'll make it easier for you to binge-watch.
14:11
And they didn't stop there.
14:13
They did all these things
14:14
to redesign their entire viewer experience,
14:17
to really encourage binge-watching.
14:20
It's why people and friends disappear for whole weekends at a time,
14:23
catching up on shows like "Master of None."
14:25
By integrating big data and thick data, they not only improved their business,
14:29
but they transformed how we consume media.
14:32
And now their stocks are projected to double in the next few years.
14:38
But this isn't just about watching more videos
14:41
or selling more smartphones.
14:43
For some, integrating thick data insights into the algorithm
14:48
could mean life or death,
14:50
especially for the marginalized.
14:53
All around the country, police departments are using big data
14:57
for predictive policing,
14:59
to set bond amounts and sentencing recommendations
15:02
in ways that reinforce existing biases.
15:06
NSA's Skynet machine learning algorithm
15:08
has possibly aided in the deaths of thousands of civilians in Pakistan
15:14
from misreading cellular device metadata.
15:18
As all of our lives become more automated,
15:22
from automobiles to health insurance or to employment,
15:25
it is likely that all of us
15:27
will be impacted by the quantification bias.
15:32
Now, the good news is that we've come a long way
15:35
from huffing ethylene gas to make predictions.
15:37
We have better tools, so let's just use them better.
15:41
Let's integrate the big data with the thick data.
15:43
Let's bring our temple guides with the oracles,
15:45
and whether this work happens in companies or nonprofits
15:49
or government or even in the software,
15:51
all of it matters,
15:53
because that means we're collectively committed
15:56
to making better data,
15:58
better algorithms, better outputs
16:00
and better decisions.
16:02
This is how we'll avoid missing that something.
16:07
(Applause)
ืขืœ ืืชืจ ื–ื”

ืืชืจ ื–ื” ื™ืฆื™ื’ ื‘ืคื ื™ื›ื ืกืจื˜ื•ื ื™ YouTube ื”ืžื•ืขื™ืœื™ื ืœืœื™ืžื•ื“ ืื ื’ืœื™ืช. ืชื•ื›ืœื• ืœืจืื•ืช ืฉื™ืขื•ืจื™ ืื ื’ืœื™ืช ื”ืžื•ืขื‘ืจื™ื ืขืœ ื™ื“ื™ ืžื•ืจื™ื ืžื”ืฉื•ืจื” ื”ืจืืฉื•ื ื” ืžืจื—ื‘ื™ ื”ืขื•ืœื. ืœื—ืฅ ืคืขืžื™ื™ื ืขืœ ื”ื›ืชื•ื‘ื™ื•ืช ื‘ืื ื’ืœื™ืช ื”ืžื•ืฆื’ื•ืช ื‘ื›ืœ ื“ืฃ ื•ื™ื“ืื• ื›ื“ื™ ืœื”ืคืขื™ืœ ืืช ื”ืกืจื˜ื•ืŸ ืžืฉื. ื”ื›ืชื•ื‘ื™ื•ืช ื’ื•ืœืœื•ืช ื‘ืกื ื›ืจื•ืŸ ืขื ื”ืคืขืœืช ื”ื•ื•ื™ื“ืื•. ืื ื™ืฉ ืœืš ื”ืขืจื•ืช ืื• ื‘ืงืฉื•ืช, ืื ื ืฆื•ืจ ืื™ืชื ื• ืงืฉืจ ื‘ืืžืฆืขื•ืช ื˜ื•ืคืก ื™ืฆื™ืจืช ืงืฉืจ ื–ื”.

https://forms.gle/WvT1wiN1qDtmnspy7