Jennifer Golbeck: The curly fry conundrum: Why social media "likes" say more than you might think

376,856 views

2014-04-03 ・ TED


00:12
If you remember that first decade of the web, it was really a static place. You could go online, you could look at pages, and they were put up either by organizations who had teams to do it or by individuals who were really tech-savvy for the time. And with the rise of social media and social networks in the early 2000s, the web was completely changed to a place where now the vast majority of content we interact with is put up by average users, either in YouTube videos or blog posts or product reviews or social media postings.
00:46
And it's also become a much more interactive place, where people are interacting with others, they're commenting, they're sharing, they're not just reading. So Facebook is not the only place you can do this, but it's the biggest, and it serves to illustrate the numbers. Facebook has 1.2 billion users per month. So half the Earth's Internet population is using Facebook. They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history.
01:26
And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about. As scientists, we use that to help the way people interact online, but there are less altruistic applications, and there's a problem in that users don't really understand these techniques and how they work, and even if they did, they don't have a lot of control over it. So what I want to talk to you about today is some of these things that we're able to do, and then give us some ideas of how we might go forward to move some control back into the hands of users.
02:02
So this is Target, the company. I didn't just put that logo on this poor, pregnant woman's belly. You may have seen this anecdote that was printed in Forbes magazine where Target sent a flyer to this 15-year-old girl with advertisements and coupons for baby bottles and diapers and cribs two weeks before she told her parents that she was pregnant. Yeah, the dad was really upset. He said, "How did Target figure out that this high school girl was pregnant before she told her parents?"
02:29
It turns out that they have the purchase history for hundreds of thousands of customers and they compute what they call a pregnancy score, which is not just whether or not a woman's pregnant, but what her due date is. And they compute that not by looking at the obvious things, like, she's buying a crib or baby clothes, but things like, she bought more vitamins than she normally had, or she bought a handbag that's big enough to hold diapers. And by themselves, those purchases don't seem like they might reveal a lot, but it's a pattern of behavior that, when you take it in the context of thousands of other people, starts to actually reveal some insights.
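
Target's actual scoring model has never been published, but the idea described here — many individually weak purchase signals combined into one probability — can be illustrated with a small, hypothetical sketch. The feature names, the tiny training set, and the resulting weights below are all invented for illustration only.

```python
# Hypothetical sketch: combining weak purchase signals into a single
# "pregnancy score" with logistic regression. A real retailer would fit
# something like this on purchase histories for hundreds of thousands of
# customers; here the data is made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one customer:
# [vitamin purchases relative to her own baseline, bought an extra-large
#  handbag (0/1), count of other subtle signals (hypothetical)]
X_train = np.array([
    [3.0, 1, 2],
    [0.1, 0, 0],
    [2.5, 0, 1],
    [0.0, 1, 0],
    [1.8, 1, 2],
    [0.2, 0, 1],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = later known to be pregnant

model = LogisticRegression().fit(X_train, y_train)

# A new shopper: more vitamins than usual and a diaper-sized handbag.
new_shopper = np.array([[2.2, 1, 0]])
score = model.predict_proba(new_shopper)[0, 1]
print(f"pregnancy score: {score:.2f}")  # no single purchase decides this on its own
```
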
03:06
So that's the kind of thing that we do when we're predicting stuff about you on social media. We're looking for little patterns of behavior that, when you detect them among millions of people, lets us find out all kinds of things.
03:19
So in my lab and with colleagues, we've developed mechanisms where we can quite accurately predict things like your political preference, your personality score, gender, sexual orientation, religion, age, intelligence, along with things like how much you trust the people you know and how strong those relationships are. We can do all of this really well. And again, it doesn't come from what you might think of as obvious information.
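
A pipeline in this spirit — represent each user by the pages they have liked, compress that very wide matrix, and fit a simple classifier on top — can be sketched in a few lines. Everything below (the randomly generated data, the dimensionality, the model choice) is an assumption made for illustration; it is not the lab's or any published paper's actual code.

```python
# Illustrative sketch only: predicting a binary attribute from Facebook-style
# likes. Users are rows, pages are columns, X[u, p] = 1 if user u liked page p.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_users, n_pages = 2000, 500
X = (rng.random((n_users, n_pages)) < 0.02).astype(float)  # sparse like matrix

# Pretend the hidden attribute happens to correlate with a handful of pages.
signal_pages = rng.choice(n_pages, size=10, replace=False)
y = (X[:, signal_pages].sum(axis=1) + rng.normal(0, 0.5, n_users) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(
    TruncatedSVD(n_components=50, random_state=0),  # compress the like matrix
    LogisticRegression(max_iter=1000),
)
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```
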
03:44
So my favorite example is from this study that was published this year in the Proceedings of the National Academies. If you Google this, you'll find it. It's four pages, easy to read. And they looked at just people's Facebook likes, so just the things you like on Facebook, and used that to predict all these attributes, along with some other ones. And in their paper they listed the five likes that were most indicative of high intelligence. And among those was liking a page for curly fries. (Laughter) Curly fries are delicious, but liking them does not necessarily mean that you're smarter than the average person. So how is it that one of the strongest indicators of your intelligence is liking this page when the content is totally irrelevant to the attribute that's being predicted?
04:28
And it turns out that we have to look at a whole bunch of underlying theories to see why we're able to do this. One of them is a sociological theory called homophily, which basically says people are friends with people like them. So if you're smart, you tend to be friends with smart people, and if you're young, you tend to be friends with young people, and this is well established for hundreds of years. We also know a lot about how information spreads through networks. It turns out things like viral videos or Facebook likes or other information spreads in exactly the same way that diseases spread through social networks. So this is something we've studied for a long time. We have good models of it.  And so you can put those things together and start seeing why things like this happen.
05:09
So if I were to give you a hypothesis, it would be that a smart guy started this page, or maybe one of the first people who liked it would have scored high on that test. And they liked it, and their friends saw it, and by homophily, we know that he probably had smart friends, and so it spread to them, and some of them liked it, and they had smart friends, and so it spread to them, and so it propagated through the network to a host of smart people, so that by the end, the action of liking the curly fries page is indicative of high intelligence, not because of the content, but because the actual action of liking reflects back the common attributes of other people who have done it.
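
That hypothesis is easy to check in a toy simulation: build a network where friendships are homophilous on some trait, seed a "like" at a well-above-average person, let it spread a few hops along friendships, and compare the likers' average trait to the population average. Every number below (graph size, trait distribution, spread probability) is invented purely to illustrate the mechanism.

```python
# Toy simulation of the hypothesis: in a homophilous network, a page liked
# first by a high-scoring person spreads mostly to other high-scoring people,
# so liking it ends up correlated with the trait even though the page itself
# is irrelevant.
import random

random.seed(1)
N = 3000
trait = [random.gauss(100, 15) for _ in range(N)]        # an IQ-like score

# Homophily: random pairs become friends only if their scores are close.
friends = [set() for _ in range(N)]
edges = 0
while edges < N * 5:                                     # ~10 friendships per person
    i, j = random.randrange(N), random.randrange(N)
    if i != j and abs(trait[i] - trait[j]) < 12 and j not in friends[i]:
        friends[i].add(j)
        friends[j].add(i)
        edges += 1

# Seed the like at someone well above average and spread it a few hops.
seed = min(range(N), key=lambda i: abs(trait[i] - 125))
liked, frontier = {seed}, {seed}
for _ in range(3):                                       # shallow cascade
    nxt = set()
    for person in frontier:
        for friend in friends[person]:
            if friend not in liked and random.random() < 0.5:
                liked.add(friend)
                nxt.add(friend)
    frontier = nxt

likers = [trait[i] for i in liked]
print(f"population mean trait: {sum(trait) / N:.1f}")
print(f"likers' mean trait:    {sum(likers) / len(likers):.1f} ({len(likers)} likers)")
```
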
05:48
So this is pretty complicated stuff, right? It's a hard thing to sit down and explain to an average user, and even if you do, what can the average user do about it? How do you know that you've liked something that indicates a trait for you that's totally irrelevant to the content of what you've liked? There's a lot of power that users don't have to control how this data is used. And I see that as a real problem going forward.
06:13
So I think there's a couple paths that we want to look at if we want to give users some control over how this data is used, because it's not always going to be used for their benefit. An example I often give is that, if I ever get bored being a professor, I'm going to go start a company that predicts all of these attributes and things like how well you work in teams and if you're a drug user, if you're an alcoholic. We know how to predict all that. And I'm going to sell reports to H.R. companies and big businesses that want to hire you. We totally can do that now. I could start that business tomorrow, and you would have absolutely no control over me using your data like that. That seems to me to be a problem.
06:50
So one of the paths we can go down is the policy and law path. And in some respects, I think that that would be most effective, but the problem is we'd actually have to do it. Observing our political process in action makes me think it's highly unlikely that we're going to get a bunch of representatives to sit down, learn about this, and then enact sweeping changes to intellectual property law in the U.S. so users control their data. We could go the policy route, where social media companies say, you know what? You own your data. You have total control over how it's used. The problem is that the revenue models for most social media companies rely on sharing or exploiting users' data in some way. It's sometimes said of Facebook that the users aren't the customer, they're the product. And so how do you get a company to cede control of their main asset back to the users? It's possible, but I don't think it's something that we're going to see change quickly.
07:45
So I think the other path that we can go down that's going to be more effective is one of more science. It's doing science that allowed us to develop all these mechanisms for computing this personal data in the first place. And it's actually very similar research that we'd have to do if we want to develop mechanisms that can say to a user, "Here's the risk of that action you just took." By liking that Facebook page, or by sharing this piece of personal information, you've now improved my ability to predict whether or not you're using drugs or whether or not you get along well in the workplace. And that, I think, can affect whether or not people want to share something, keep it private, or just keep it offline altogether.
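
One way to picture such a warning mechanism (purely a sketch, with a made-up model and made-up pages): before the user confirms a like, run the same kind of predictor with and without that like and surface how much the predicted probability of a sensitive trait would move.

```python
# Hypothetical sketch of a "here's the risk of that action" warning: compare a
# model's prediction for a sensitive trait with and without the like the user
# is about to add. The model, the pages, and the training data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["curly fries", "late-night quiz shows", "energy drinks", "running club"]

# Pretend this was already trained on historical (likes, trait) data.
rng = np.random.default_rng(7)
X_hist = (rng.random((500, len(pages))) < 0.3).astype(float)
y_hist = (X_hist[:, 2] + rng.normal(0, 0.4, 500) > 0.6).astype(int)  # fake labels
model = LogisticRegression().fit(X_hist, y_hist)

def risk_of_like(current_likes, new_page_index):
    """Return the predicted probability before and after the new like."""
    before = model.predict_proba([current_likes])[0, 1]
    after_likes = list(current_likes)
    after_likes[new_page_index] = 1.0
    after = model.predict_proba([after_likes])[0, 1]
    return before, after

user = [1.0, 0.0, 0.0, 1.0]            # pages the user already likes
before, after = risk_of_like(user, 2)  # about to like "energy drinks"
print(f"predicted probability of the sensitive trait: {before:.2f} -> {after:.2f}")
```
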
08:24
We can also look at things like allowing people to encrypt data that they upload, so it's kind of invisible and worthless to sites like Facebook or third party services that access it, but that select users who the person who posted it want to see it have access to see it.
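
Real designs for this (per-recipient public-key encryption, attribute-based encryption) are considerably more involved; as a minimal sketch of the idea only, here is symmetric encryption with the `cryptography` package, where the platform stores nothing but ciphertext and the key is handed out-of-band to the chosen friends.

```python
# Minimal sketch: the platform stores only ciphertext, and the key is shared
# directly with the friends the poster chooses. A real system would use
# per-recipient public-key or attribute-based encryption rather than one
# shared symmetric key. Requires the 'cryptography' package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # kept by the poster, never uploaded
cipher = Fernet(key)

post = b"Vacation photos and a very personal status update"
ciphertext = cipher.encrypt(post)      # this is all the platform ever sees

# The platform or a third-party app holding only the ciphertext learns nothing
# useful; a selected friend who received the key out-of-band can read the post.
friend_view = Fernet(key).decrypt(ciphertext)
assert friend_view == post
print(ciphertext[:40], b"...")
```
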
08:40
This is all super exciting research from an intellectual perspective, and so scientists are going to be willing to do it. So that gives us an advantage over the law side.
08:49
One of the problems that people bring up when I talk about this is, they say, you know, if people start keeping all this data private, all those methods that you've been developing to predict their traits are going to fail. And I say, absolutely, and for me, that's success, because as a scientist, my goal is not to infer information about users, it's to improve the way people interact online. And sometimes that involves inferring things about them, but if users don't want me to use that data, I think they should have the right to do that.
09:20
I want users to be informed and consenting users of the tools that we develop. And so I think encouraging this kind of science and supporting researchers who want to cede some of that control back to users and away from the social media companies means that going forward, as these tools evolve and advance, we're going to have an educated and empowered user base, and I think all of us can agree that that's a pretty ideal way to go forward.
09:45
Thank you.

(Applause)