Kenneth Cukier: Big data is better data

530,099 views ・ 2014-09-23

TED


00:12
America's favorite pie is? Audience: Apple. Kenneth Cukier: Apple. Of course it is. How do we know it? Because of data. You look at supermarket sales. You look at supermarket sales of 30-centimeter pies that are frozen, and apple wins, no contest. The majority of the sales are apple. But then supermarkets started selling smaller, 11-centimeter pies, and suddenly, apple fell to fourth or fifth place. Why? What happened?
00:50
Okay, think about it. When you buy a 30-centimeter pie, the whole family has to agree, and apple is everyone's second favorite. (Laughter) But when you buy an individual 11-centimeter pie, you can buy the one that you want. You can get your first choice. You have more data. You can see something that you couldn't see when you only had smaller amounts of it.
01:25
Now, the point here is that more data doesn't just let us see more, more of the same thing we were looking at. More data allows us to see new. It allows us to see better. It allows us to see different. In this case, it allows us to see what America's favorite pie is: not apple.
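To make the aggregation concrete, here is a toy sketch in Python with invented household preferences (nothing here comes from the actual sales data): one large pie per household records the compromise flavor, while one small pie per person records first choices, so the two rankings come out differently.

```python
from collections import Counter

# Hypothetical (first choice, second choice) for each person in three households.
households = [
    {"alice": ("cherry", "apple"), "bob": ("pecan", "apple")},
    {"carol": ("pumpkin", "apple"), "dan": ("cherry", "apple")},
    {"erin": ("apple", "cherry"), "frank": ("pecan", "apple")},
]

# 30-cm pies: one pie per household, so the purchase records the one flavor
# the whole family can agree on (here, apple in every household).
large_pie_sales = Counter()
for household in households:
    shared = set.intersection(*(set(prefs) for prefs in household.values()))
    large_pie_sales[sorted(shared)[0]] += 1

# 11-cm pies: one pie per person, so each purchase records a first choice.
small_pie_sales = Counter(prefs[0] for h in households for prefs in h.values())

print("Large-pie ranking:", large_pie_sales.most_common())
print("Small-pie ranking:", small_pie_sales.most_common())
```

In this toy data the large-pie ranking puts apple on top, while the small-pie ranking pushes it down the list, which is the effect described above.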
01:50
Now, you probably all have heard the term big data. In fact, you're probably sick of hearing the term big data. It is true that there is a lot of hype around the term, and that is very unfortunate, because big data is an extremely important tool by which society is going to advance.
02:10
In the past, we used to look at small data and think about what it would mean to try to understand the world, and now we have a lot more of it, more than we ever could before. What we find is that when we have a large body of data, we can fundamentally do things that we couldn't do when we only had smaller amounts.
02:29
Big data is important, and big data is new, and when you think about it, the only way this planet is going to deal with its global challenges — to feed people, supply them with medical care, supply them with energy, electricity, and to make sure they're not burnt to a crisp because of global warming — is through the effective use of data.
02:51
So what is new about big data? What is the big deal? Well, to answer that question, let's think about what information looked like, physically looked like, in the past. In 1908, on the island of Crete, archaeologists discovered a clay disc. They dated it from 2000 B.C., so it's 4,000 years old. Now, there are inscriptions on this disc, but we actually don't know what they mean. It's a complete mystery, but the point is that this is what information used to look like 4,000 years ago. This is how society stored and transmitted information.
03:31
Now, society hasn't advanced all that much. We still store information on discs, but now we can store a lot more information, more than ever before. Searching it is easier. Copying it is easier. Sharing it is easier. Processing it is easier. And what we can do is we can reuse this information for uses that we never even imagined when we first collected the data.
03:57
In this respect, the data has gone from a stock to a flow, from something that is stationary and static to something that is fluid and dynamic. There is, if you will, a liquidity to information.
04:14
The disc that was discovered off of Crete that's 4,000 years old is heavy, it doesn't store a lot of information, and that information is unchangeable. By contrast, all of the files that Edward Snowden took from the National Security Agency in the United States fit on a memory stick the size of a fingernail, and can be shared at the speed of light. More data. More.
04:51
Now, one reason why we have so much data in the world today is we are collecting things that we've always collected information on, but another reason why is we're taking things that have always been informational but have never been rendered into a data format, and we are putting it into data. Think, for example, of the question of location.
05:11
Take, for example, Martin Luther. If we wanted to know in the 1500s where Martin Luther was, we would have to follow him at all times, maybe with a feathery quill and an inkwell, and record it, but now think about what it looks like today. You know that somewhere, probably in a telecommunications carrier's database, there is a spreadsheet or at least a database entry that records your information of where you've been at all times. If you have a cell phone, and that cell phone has GPS, but even if it doesn't have GPS, it can record your information. In this respect, location has been datafied.
05:48
Now think, for example, of the issue of posture, the way that you are all sitting right now, the way that you sit, the way that you sit, the way that you sit. It's all different, and it's a function of your leg length and your back and the contours of your back, and if I were to put sensors, maybe 100 sensors into all of your chairs right now, I could create an index that's fairly unique to you, sort of like a fingerprint, but it's not your finger.
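A minimal sketch of how such a "posture fingerprint" could work, assuming a chair fitted with pressure sensors: normalise the readings so the signature reflects the pattern of pressure rather than overall weight, then compare against enrolled profiles. The sensor count, names and threshold below are invented for illustration; this is not the Tokyo researchers' actual system.

```python
import math

def posture_signature(readings):
    """Normalise raw pressure readings so the signature depends on the
    pattern of pressure, not on the sitter's overall weight."""
    total = sum(readings) or 1.0
    return [r / total for r in readings]

def distance(a, b):
    """Euclidean distance between two normalised signatures."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(reading, enrolled, threshold=0.05):
    """Return the closest enrolled person, or None if nobody is close enough."""
    sig = posture_signature(reading)
    name, best = min(
        ((n, distance(sig, s)) for n, s in enrolled.items()), key=lambda t: t[1]
    )
    return name if best < threshold else None

# Hypothetical 8-sensor profiles (a real chair might use ~100 sensors).
enrolled = {
    "owner": posture_signature([5, 9, 7, 3, 8, 6, 2, 4]),
    "partner": posture_signature([3, 4, 8, 9, 2, 5, 7, 6]),
}

print(identify([10, 18, 14, 6, 16, 12, 4, 8], enrolled))  # -> "owner"
print(identify([9, 1, 1, 9, 1, 9, 1, 9], enrolled))       # -> None (unknown sitter)
```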
06:15
So what could we do with this? Researchers in Tokyo are using it as a potential anti-theft device in cars. The idea is that the carjacker sits behind the wheel, tries to speed off, but the car recognizes that a non-approved driver is behind the wheel, and maybe the engine just stops, unless you type in a password into the dashboard to say, "Hey, I have authorization to drive." Great.
06:42
What if every single car in Europe had this technology in it? What could we do then? Maybe, if we aggregated the data, maybe we could identify telltale signs that best predict that a car accident is going to take place in the next five seconds. And then what we will have datafied is driver fatigue, and the service would be when the car senses that the person slumps into that position, automatically knows, hey, set an internal alarm that would vibrate the steering wheel, honk inside to say, "Hey, wake up, pay more attention to the road." These are the sorts of things we can do when we datafy more aspects of our lives.
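The fatigue idea could build on the same kind of seat signature. As a rough sketch with an invented "slump" threshold (not any production system), the car tracks how far the current posture has drifted from the driver's alert baseline and raises an alarm when the drift is large.

```python
def drift(baseline, current):
    """Mean absolute change per sensor between two normalised signatures."""
    return sum(abs(b - c) for b, c in zip(baseline, current)) / len(baseline)

def check_fatigue(baseline, current, slump_threshold=0.04):
    """Flag a slump when the posture has drifted well away from the alert baseline."""
    if drift(baseline, current) > slump_threshold:
        return "ALARM: vibrate steering wheel, sound interior chime"
    return "ok"

alert_posture = [0.15, 0.20, 0.15, 0.10, 0.20, 0.20]  # enrolled while alert
slumped       = [0.05, 0.10, 0.25, 0.20, 0.25, 0.15]  # weight shifted forward

print(check_fatigue(alert_posture, alert_posture))  # -> ok
print(check_fatigue(alert_posture, slumped))        # -> ALARM...
```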
07:29
So what is the value of big data? Well, think about it. You have more information. You can do things that you couldn't do before. One of the most impressive areas where this concept is taking place is in the area of machine learning.
07:47
Machine learning is a branch of artificial intelligence, which itself is a branch of computer science. The general idea is that instead of instructing a computer what to do, we are going to simply throw data at the problem and tell the computer to figure it out for itself. And it will help you understand it by seeing its origins.
08:08
In the 1950s, a computer scientist at IBM named Arthur Samuel liked to play checkers, so he wrote a computer program so he could play against the computer. He played. He won. He played. He won. He played. He won, because the computer only knew what a legal move was. Arthur Samuel knew something else. Arthur Samuel knew strategy.
08:37
So he wrote a small sub-program alongside it operating in the background, and all it did was score the probability that a given board configuration would likely lead to a winning board versus a losing board after every move. He plays the computer. He wins. He plays the computer. He wins. He plays the computer. He wins.
09:01
And then Arthur Samuel leaves the computer to play itself. It plays itself. It collects more data. It collects more data. It increases the accuracy of its prediction. And then Arthur Samuel goes back to the computer and he plays it, and he loses, and he plays it, and he loses, and he plays it, and he loses, and Arthur Samuel has created a machine that surpasses his ability in a task that he taught it.
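A schematic sketch of the two ingredients in this story, not Samuel's actual program: a scoring function that rates how promising a board position looks, and a self-play-style update that nudges those scores toward the outcomes of games. The board is abstracted to a few invented features (piece, king, and mobility advantage) so the example stays short.

```python
import random

# Hypothetical features describing a position from the learner's point of view.
weights = {"pieces": 0.0, "kings": 0.0, "mobility": 0.0}

def score(position):
    """Higher means the position looks more likely to lead to a winning board."""
    return sum(weights[name] * value for name, value in position.items())

def update(position, outcome, learning_rate=0.01):
    """Nudge the score for this position toward the game's outcome (+1 win, -1 loss)."""
    error = outcome - score(position)
    for name, value in position.items():
        weights[name] += learning_rate * error * value

def random_game():
    """Stand-in for self-play: positions where being ahead tends to mean winning."""
    pos = {name: random.uniform(-2, 2) for name in weights}
    outcome = 1.0 if sum(pos.values()) + random.gauss(0, 0.5) > 0 else -1.0
    return pos, outcome

for _ in range(5000):                 # the program "plays itself" and collects data
    position, outcome = random_game()
    update(position, outcome)

print(weights)                                             # all weights drift positive
print(score({"pieces": 2, "kings": 1, "mobility": 1}))     # ahead  -> positive score
print(score({"pieces": -2, "kings": -1, "mobility": 0}))   # behind -> negative score
```

After the self-play loop, positions where the learner is ahead score positive and positions where it is behind score negative, which is all such a scoring sub-program needs in order to prefer better moves.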
09:30
And this idea of machine learning is going everywhere. How do you think we have self-driving cars? Are we any better off as a society enshrining all the rules of the road into software? No. Memory is cheaper. No. Algorithms are faster. No. Processors are better. No. All of those things matter, but that's not why.
09:55
It's because we changed the nature of the problem. We changed the nature of the problem from one in which we tried to overtly and explicitly explain to the computer how to drive to one in which we say, "Here's a lot of data around the vehicle. You figure it out. You figure it out that that is a traffic light, that that traffic light is red and not green, that that means that you need to stop and not go forward."
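As a toy illustration of "here's a lot of data, you figure it out" (using scikit-learn and synthetic colour readings; this is not any real self-driving stack): rather than hand-coding the rule "red means stop," we give a classifier labelled examples and let it infer the rule from the data.

```python
import random
from sklearn.tree import DecisionTreeClassifier

def synthetic_light(label):
    """Make a noisy (r, g, b) reading for a 'stop' (red) or 'go' (green) light."""
    if label == "stop":
        return [random.randint(180, 255), random.randint(0, 80), random.randint(0, 80)]
    return [random.randint(0, 80), random.randint(180, 255), random.randint(0, 80)]

labels = ["stop" if random.random() < 0.5 else "go" for _ in range(500)]
features = [synthetic_light(y) for y in labels]

# No rule about red or green is written anywhere: the model infers it.
model = DecisionTreeClassifier().fit(features, labels)

print(model.predict([[230, 40, 30]]))   # reddish reading  -> ['stop']
print(model.predict([[30, 220, 40]]))   # greenish reading -> ['go']
```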
10:18
Machine learning is at the basis of many of the things that we do online: search engines, Amazon's personalization algorithm, computer translation, voice recognition systems.
10:34
Researchers recently have looked at the question of biopsies, cancerous biopsies, and they've asked the computer, by looking at the data and survival rates, to determine whether cells are actually cancerous or not, and sure enough, when you throw the data at it through a machine-learning algorithm, the machine was able to identify the 12 telltale signs that best predict that this biopsy of the breast cancer cells is indeed cancerous.
11:09
The problem: the medical literature only knew nine of them. Three of the traits were ones that people didn't need to look for, but that the machine spotted.
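A hedged sketch of that workflow on synthetic data (the feature names and the planted signal below are invented, not the study's): fit a classifier to labelled cell measurements, then inspect which features it relied on — the model can surface predictive traits nobody told it to look for.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000
feature_names = ["cell_radius", "texture", "perimeter", "symmetry", "smoothness"]

X = rng.normal(size=(n, len(feature_names)))
# In this synthetic world only radius, perimeter and symmetry carry signal.
risk = 1.2 * X[:, 0] + 0.9 * X[:, 2] + 0.7 * X[:, 3] + rng.normal(0, 0.5, n)
y = (risk > 0).astype(int)          # 1 = cancerous, 0 = benign

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank the measurements by how much the model leaned on them.
for name, importance in sorted(
    zip(feature_names, model.feature_importances_), key=lambda t: -t[1]
):
    print(f"{name:12s} {importance:.2f}")
```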
11:24
Now, there are dark sides to big data as well. It will improve our lives, but there are problems that we need to be conscious of, and the first one is the idea that we may be punished for predictions, that the police may use big data for their purposes, a little bit like "Minority Report."
11:47
Now, there's a term for this, called predictive policing, or algorithmic criminology, and the idea is that if we take a lot of data, for example where past crimes have been, we know where to send the patrols. That makes sense, but the problem, of course, is that it's not simply going to stop on location data, it's going to go down to the level of the individual.
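At the location level the aggregation itself is trivial, as this minimal sketch with invented coordinates shows: bucket past incidents into grid cells and rank the cells — which is also why it is so tempting to keep going and apply the same scoring to individuals.

```python
from collections import Counter

# Hypothetical past incidents as (x, y) coordinates in a city.
incidents = [(3.2, 7.1), (3.4, 7.3), (3.1, 6.9), (8.0, 1.2), (8.1, 1.4), (5.5, 5.0)]

def grid_cell(point, cell_size=1.0):
    """Map a coordinate to the grid cell it falls in."""
    x, y = point
    return (int(x // cell_size), int(y // cell_size))

counts = Counter(grid_cell(p) for p in incidents)

# Send patrols to the cells with the most past incidents.
for cell, n in counts.most_common(3):
    print(f"cell {cell}: {n} past incidents")
```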
12:08
Why don't we use data about the person's high school transcript? Maybe we should use whether they're unemployed or not, their credit score, their web-surfing behavior, whether they're up late at night. Their Fitbit, when it's able to identify biochemistries, will show that they have aggressive thoughts. We may have algorithms that are likely to predict what we are about to do, and we may be held accountable before we've actually acted.
12:34
Privacy was the central challenge in a small data era. In the big data age, the challenge will be safeguarding free will, moral choice, human volition, human agency.
12:54
There is another problem: big data is going to steal our jobs. Big data and algorithms are going to challenge white collar, professional knowledge work in the 21st century in the same way that factory automation and the assembly line challenged blue collar labor in the 20th century.
13:16
Think about a lab technician who is looking through a microscope at a cancer biopsy and determining whether it's cancerous or not. The person went to university. The person buys property. He or she votes. He or she is a stakeholder in society. And that person, as well as an entire fleet of professionals like that person, is going to find that their jobs are radically changed or actually completely eliminated.
13:43
Now, we like to think that technology creates jobs over a period of time after a short, temporary period of dislocation, and that is true for the frame of reference with which we all live, the Industrial Revolution, because that's precisely what happened. But we forget something in that analysis: there are some categories of jobs that simply get eliminated and never come back. The Industrial Revolution wasn't very good if you were a horse.
14:11
So we're going to need to be careful and take big data and adjust it for our needs, our very human needs. We have to be the master of this technology, not its servant.
14:23
We are just at the outset of the big data era, and honestly, we are not very good at handling all the data that we can now collect. It's not just a problem for the National Security Agency. Businesses collect lots of data, and they misuse it too, and we need to get better at this, and this will take time. It's a little bit like the challenge that was faced by primitive man and fire. This is a tool, but this is a tool that, unless we're careful, will burn us.
14:56
Big data is going to transform how we live, how we work and how we think. It is going to help us manage our careers and lead lives of satisfaction and hope and happiness and health, but in the past, we've often looked at information technology and our eyes have only seen the T, the technology, the hardware, because that's what was physical. We now need to recast our gaze at the I, the information, which is less apparent, but in some ways a lot more important.
15:29
Humanity can finally learn from the information that it can collect, as part of our timeless quest to understand the world and our place in it, and that's why big data is a big deal.

15:46
(Applause)