Beware online "filter bubbles" | Eli Pariser

1,588,934 views ・ 2011-05-02

TED


00:15
Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.

00:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.

01:10
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
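The mechanism Pariser describes here -- rank by observed clicks, silently drop what scores low -- can be caricatured in a few lines. Below is a minimal sketch in Python; the data shapes, names, and scoring rule are all hypothetical illustrations, not Facebook's actual (proprietary) ranking system.

```python
# Hypothetical sketch of an engagement-keyed feed filter; everything
# here is invented to illustrate the dynamic Pariser describes.
from collections import defaultdict

def rank_feed(posts, click_history):
    """Order posts by how often the viewer has clicked each author's links."""
    clicks = defaultdict(int)
    for author in click_history:
        clicks[author] += 1
    # Authors you never click sink to the bottom -- and in a real feed,
    # off the page entirely -- without you ever being consulted.
    return sorted(posts, key=lambda p: clicks[p["author"]], reverse=True)

feed = [
    {"author": "conservative_friend", "link": "article-b"},
    {"author": "liberal_friend", "link": "article-a"},
]
history = ["liberal_friend", "liberal_friend", "liberal_friend"]
print(rank_feed(feed, history))  # the liberal friend's post now ranks first
```

Note that nothing in the sketch ever asks the user whether the ordering is wanted; that silence is exactly the "without consulting me" he objects to.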
01:54
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
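The 57 signals themselves were never published, so any concrete example has to be invented. The sketch below assumes just two signals (location and device) and made-up weights; it shows only the general shape of signal-based re-ranking, not Google's algorithm.

```python
# Hedged illustration of signal-based personalization. The two signals
# and all weights are assumptions for the example, not real values.
def personalization_score(result, signals):
    """Adjust a result's base relevance by how well it matches the user."""
    score = result["base_relevance"]
    if result.get("region") == signals.get("location"):
        score += 0.3   # favor pages from the user's own region
    if result.get("mobile_friendly") and signals.get("device") == "mobile":
        score += 0.2   # favor mobile-friendly pages on phones
    return score

def tailor(results, signals):
    return sorted(results, key=lambda r: personalization_score(r, signals),
                  reverse=True)

# Two users issue the same query at the same moment, get different pages.
results = [
    {"url": "egypt-travel.example", "base_relevance": 0.8, "region": "US"},
    {"url": "egypt-protests.example", "base_relevance": 0.7, "region": "EG"},
]
print(tailor(results, {"location": "US", "device": "desktop"}))  # travel first
print(tailor(results, {"location": "EG", "device": "desktop"}))  # protests first
```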
02:35
And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.

03:21
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."

04:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.

04:38
So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time.

(Laughter)

05:27
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
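That imbalance is a feedback loop, and it can be reproduced in a few lines. The following toy simulation is deliberately crude -- the categories, the reinforcement step, and the click probabilities are all made up -- but it shows how a filter that sees only clicks drifts toward junk food.

```python
# Toy simulation of the click-feedback loop, not any real recommender.
import random

random.seed(0)
weights = {"dessert": 1.0, "vegetables": 1.0}   # start perfectly balanced

def recommend():
    total = sum(weights.values())
    return random.choices(list(weights), [w / total for w in weights.values()])[0]

for _ in range(1000):
    item = recommend()
    # The impulsive present self clicks "dessert" far more often; the
    # filter sees only the click, never the aspiration behind the queue.
    if random.random() < (0.8 if item == "dessert" else 0.4):
        weights[item] += 0.1   # reinforce whatever earned the click

print(weights)   # dessert's weight ends up far larger than vegetables'
```

The rich-get-richer dynamic is the point: each extra click makes dessert more likely to be shown, which produces more clicks, which narrows the diet further.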
05:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now.

06:26
What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.
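What would "not just keyed to relevance" look like in code? One hedged possibility is a re-ranker that blends relevance with an exposure bonus for viewpoints the user rarely sees; the field names and the 0.7/0.3 blend below are illustrative assumptions, not any platform's real formula.

```python
# Hypothetical diversity-aware re-ranker: score = blend of relevance
# and a novelty bonus for unfamiliar viewpoints. All values invented.
def curate(items, seen_viewpoints, diversity_weight=0.3):
    def score(item):
        novelty = 0.0 if item["viewpoint"] in seen_viewpoints else 1.0
        return (1 - diversity_weight) * item["relevance"] + diversity_weight * novelty
    return sorted(items, key=score, reverse=True)

items = [
    {"title": "agreeable take", "viewpoint": "mine", "relevance": 0.9},
    {"title": "challenging take", "viewpoint": "other", "relevance": 0.6},
]
# With diversity_weight=0 the challenging item can never outrank the
# agreeable one here; at 0.3 it surfaces first (0.72 vs. 0.63) even
# though it scores lower on pure relevance.
print(curate(items, seen_viewpoints={"mine"}))
```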
07:03
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web.

07:44
And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing. I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't.
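The transparency-and-control ask is concrete enough to sketch. In the hypothetical filter below, the rules are plain data anyone can inspect and the user can pin a source so it always gets through; nothing about it reflects any real platform's controls.

```python
# Hypothetical transparent, user-controllable filter. Purely illustrative.
rules = [
    {"name": "engagement",
     "keep_if": lambda item, user: item["source"] in user["often_clicked"]},
]

def filter_feed(items, user):
    kept = []
    for item in items:
        pinned = item["source"] in user["always_show"]          # user override
        passes = any(r["keep_if"](item, user) for r in rules)   # inspectable rules
        if pinned or passes:
            kept.append(item)
    return kept

def explain(user):
    """Show the user exactly which rules determine what gets through."""
    return [r["name"] for r in rules] + [f"pinned: {s}" for s in user["always_show"]]

user = {"often_clicked": {"liberal_blog"}, "always_show": {"conservative_blog"}}
items = [{"source": "liberal_blog"}, {"source": "conservative_blog"}]
print(filter_feed(items, user))   # both get through: one by rule, one by pin
print(explain(user))
```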
08:24
Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

08:45
Thank you.

(Applause)