Susan Etlinger: What do we do with all this big data?

153,066 views ・ 2014-10-20

TED



Translator: Predrag Mijatovic Reviewer: Mile Živković
00:13
Technology has brought us so much: the moon landing, the Internet, the ability to sequence the human genome. But it also taps into a lot of our deepest fears, and about 30 years ago, the culture critic Neil Postman wrote a book called "Amusing Ourselves to Death," which lays this out really brilliantly. And here's what he said, comparing the dystopian visions of George Orwell and Aldous Huxley. He said, Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture. Orwell feared the truth would be concealed from us, and Huxley feared we would be drowned in a sea of irrelevance.
00:59
In a nutshell, it's a choice between Big Brother watching you and you watching Big Brother. (Laughter)
01:08
But it doesn't have to be this way. We are not passive consumers of data and technology. We shape the role it plays in our lives and the way we make meaning from it, but to do that, we have to pay as much attention to how we think as to how we code. We have to ask questions, and hard questions, to move past counting things to understanding them.
01:33
We're constantly bombarded with stories about how much data there is in the world, but when it comes to big data and the challenges of interpreting it, size isn't everything. There's also the speed at which it moves, and the many varieties of data types, and here are just a few examples: images, text, video, audio.
02:01
And what unites these disparate types of data is that they're created by people and they require context.
02:09
Now, there's a group of data scientists out of the University of Illinois-Chicago, and they're called the Health Media Collaboratory, and they've been working with the Centers for Disease Control to better understand how people talk about quitting smoking, how they talk about electronic cigarettes, and what they can do collectively to help them quit.
02:30
The interesting thing is, if you want to understand how people talk about smoking, first you have to understand what they mean when they say "smoking." And on Twitter, there are four main categories: number one, smoking cigarettes; number two, smoking marijuana; number three, smoking ribs; and number four, smoking hot women. (Laughter)
02:58
So then you have to think about, well, how do people talk about electronic cigarettes? And there are so many different ways that people do this, and you can see from the slide it's a complex kind of a query. And what it reminds us is that language is created by people, and people are messy and we're complex and we use metaphors and slang and jargon and we do this 24/7 in many, many languages, and then as soon as we figure it out, we change it up.
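Why is matching e-cigarette talk "a complex kind of a query"? As a minimal, hypothetical sketch (the Collaboratory's actual query is not reproduced here, and real queries must cover far more slang, misspellings, and languages), even a toy pattern already has to anticipate several spellings and verb forms:

```python
import re

# Hypothetical, simplified pattern -- NOT the Health Media Collaboratory's
# actual query. It tries to catch a few common ways people write about
# e-cigarettes: "e-cig", "ecig", "e-cigarettes", "vape"/"vaping", "e-juice".
ECIG = re.compile(
    r"\b(e[- ]?cig(arette)?s?|vap(e|es|ing|er)|e[- ]?juice)\b",
    re.IGNORECASE,
)

tweets = [
    "Trying to quit with an e-cig",
    "vaping > smoking, maybe?",
    "just bought my first ECIG",
    "smoking ribs all weekend",  # "smoking", but not about e-cigarettes
]
matches = [t for t in tweets if ECIG.search(t)]
print(matches)  # the first three tweets match; the ribs tweet does not
```

Note how the ribs tweet slips past this filter only because it avoids e-cigarette vocabulary entirely; distinguishing the four senses of plain "smoking" is a harder problem still.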
03:27
So did these ads that the CDC put on, these television ads that featured a woman with a hole in her throat and that were very graphic and very disturbing, did they actually have an impact on whether people quit? And the Health Media Collaboratory respected the limits of their data, but they were able to conclude that those advertisements (and you may have seen them) had the effect of jolting people into a thought process that may have an impact on future behavior.
03:59
And what I admire and appreciate about this project, aside from the fact that it's based on real human need, is that it's a fantastic example of courage in the face of a sea of irrelevance.
04:16
And so it's not just big data that causes challenges of interpretation, because, let's face it, we human beings have a very rich history of taking any amount of data, no matter how small, and screwing it up.
04:29
So many years ago, you may remember that former President Ronald Reagan was much criticized for making a statement that facts are stupid things. And it was a slip of the tongue, let's be fair. He actually meant to quote John Adams' defense of British soldiers in the Boston Massacre trials, that facts are stubborn things. But I actually think there's a bit of accidental wisdom in what he said, because facts are stubborn things, but sometimes they're stupid, too.
05:03
I want to tell you a personal story about why this matters a lot to me. I need to take a breath. My son Isaac, when he was two, was diagnosed with autism, and he was this happy, hilarious, loving, affectionate little guy, but the metrics on his developmental evaluations, which looked at things like the number of words (at that point, none), communicative gestures and minimal eye contact, put his developmental level at that of a nine-month-old baby. And the diagnosis was factually correct, but it didn't tell the whole story.
05:45
And about a year and a half later, when he was almost four, I found him in front of the computer one day running a Google image search on women, spelled "w-i-m-e-n." And I did what any obsessed parent would do, which is immediately started hitting the "back" button to see what else he'd been searching for. And they were, in order: men, school, bus and computer. And I was stunned, because we didn't know that he could spell, much less read, and so I asked him, "Isaac, how did you do this?" And he looked at me very seriously and said, "Typed in the box."
06:31
He was teaching himself to communicate, but we were looking in the wrong place, and this is what happens when assessments and analytics overvalue one metric (in this case, verbal communication) and undervalue others, such as creative problem-solving. Communication was hard for Isaac, and so he found a workaround to find out what he needed to know. And when you think about it, it makes a lot of sense, because forming a question is a really complex process, but he could get himself a lot of the way there by putting a word in a search box.
07:11
And so this little moment had a really profound impact on me and our family because it helped us change our frame of reference for what was going on with him, and worry a little bit less and appreciate his resourcefulness more.
07:29
Facts are stupid things. And they're vulnerable to misuse, willful or otherwise.
07:36
I have a friend, Emily Willingham, who's a scientist, and she wrote a piece for Forbes not long ago entitled "The 10 Weirdest Things Ever Linked to Autism." It's quite a list. The Internet, blamed for everything, right? And of course mothers, because. And actually, wait, there's more, there's a whole bunch in the "mother" category here. And you can see it's a pretty rich and interesting list.
08:05
I'm a big fan of being pregnant near freeways, personally. The final one is interesting, because the term "refrigerator mother" was actually the original hypothesis for the cause of autism, and that meant somebody who was cold and unloving.
08:23
And at this point, you might be thinking, "Okay, Susan, we get it, you can take data, you can make it mean anything." And this is true, it's absolutely true, but the challenge is that we have this opportunity to try to make meaning out of it ourselves, because frankly, data doesn't create meaning. We do.
08:48
So as businesspeople, as consumers, as patients, as citizens, we have a responsibility, I think, to spend more time focusing on our critical thinking skills. Why?
09:02
Because at this point in our history, as we've heard many times over, we can process exabytes of data at lightning speed, and we have the potential to make bad decisions far more quickly, efficiently, and with far greater impact than we did in the past. Great, right?
09:23
And so what we need to do instead is spend a little bit more time on things like the humanities and sociology, and the social sciences, rhetoric, philosophy, ethics, because they give us context that is so important for big data, and because they help us become better critical thinkers. Because after all, if I can spot a problem in an argument, it doesn't much matter whether it's expressed in words or in numbers.
09:54
And this means teaching ourselves to find those confirmation biases and false correlations and being able to spot a naked emotional appeal from 30 yards, because something that happens after something doesn't mean it happened because of it, necessarily, and if you'll let me geek out on you for a second, the Romans called this "post hoc ergo propter hoc": after which, therefore because of which.
10:22
And it means questioning disciplines like demographics. Why? Because they're based on assumptions about who we all are based on our gender and our age and where we live, as opposed to data on what we actually think and do.
10:36
And since we have this data, we need to treat it with appropriate privacy controls and consumer opt-in, and beyond that, we need to be clear about our hypotheses, the methodologies that we use, and our confidence in the result.
10:55
As my high school algebra teacher used to say, show your math, because if I don't know what steps you took, I don't know what steps you didn't take, and if I don't know what questions you asked, I don't know what questions you didn't ask.
11:10
And it means asking ourselves, really, the hardest question of all: Did the data really show us this, or does the result make us feel more successful and more comfortable?
11:23
So the Health Media Collaboratory, at the end of their project, were able to find that 87 percent of tweets about those very graphic and disturbing anti-smoking ads expressed fear, but did they conclude that the ads actually made people stop smoking? No. It's science, not magic.
11:44
So if we are to unlock the power of data, we don't have to go blindly into Orwell's vision of a totalitarian future, or Huxley's vision of a trivial one, or some horrible cocktail of both.
12:03
What we have to do is treat critical thinking with respect and be inspired by examples like the Health Media Collaboratory, and as they say in the superhero movies, let's use our powers for good.
12:17
Thank you. (Applause)