Beware online "filter bubbles" | Eli Pariser

1,585,118 views ・ 2011-05-02

TED



Translator: Alo-Jarmo Küppas Reviewer: Piret Hion
00:15
A journalist was asking Mark Zuckerberg a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.
00:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society.
00:58
But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.
01:10
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed.
01:33
And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
01:54
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
02:35
And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable.
03:09
Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.
03:21
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."
04:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.
04:38
So one of the problems with the filter bubble was discovered by some researchers at Netflix. They were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time.
05:02
What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know, we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time.
05:24
(Laughter)
05:27
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert.
05:38
And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
05:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now.
06:26
What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.
07:03
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web.
07:44
And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing. I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control, so that we can decide what gets through and what doesn't.
08:24
Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.
08:45
Thank you.
08:47
(Applause)