Beware online "filter bubbles" | Eli Pariser

1,551,917 views ・ 2011-05-02

TED


Translator: Angelia King · Reviewer: Fengqiao Liu

00:15
A journalist was asking Mark Zuckerberg a question about the news feed: "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa."

00:34
And I want to talk about what a Web based on that idea of relevance might look like.

00:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society.

00:58
But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.

01:10
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
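
To make the mechanism concrete, here is a minimal Python sketch of this kind of click-based feed filtering. It is not Facebook's actual algorithm; the click history, authors, scoring, and threshold are all hypothetical, chosen only to show how silently dropping low-click authors produces the disappearance described above.

```python
# A toy sketch of click-based feed filtering, NOT Facebook's real system.
# All names, data, and the threshold below are hypothetical.

from collections import defaultdict

# Hypothetical click history: which friends' links the user has clicked.
clicks = ["liberal_friend_a", "liberal_friend_a", "liberal_friend_b",
          "conservative_friend_c"]

# Hypothetical stream of incoming posts, one author each.
incoming_posts = [
    {"author": "liberal_friend_a", "text": "an article"},
    {"author": "conservative_friend_c", "text": "another article"},
    {"author": "conservative_friend_d", "text": "a third article"},
]

def build_feed(posts, clicks, min_score=2):
    """Silently drop posts from authors the user rarely clicks on."""
    score = defaultdict(int)
    for author in clicks:
        score[author] += 1
    # The user is never consulted: low-scoring authors are edited out.
    return [p for p in posts if score[p["author"]] >= min_score]

print(build_feed(incoming_posts, clicks))
# Only liberal_friend_a's post survives; the conservatives "disappear".
```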

01:54
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too.

02:03
If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
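
As a rough illustration of what signal-based tailoring means, here is a toy re-ranking sketch with a made-up scoring rule. Google's actual 57 signals and ranking model are not public; the signals, topics, and results below are invented for illustration only.

```python
# A toy sketch of signal-based personalization, assuming a made-up
# scoring rule. Nothing here reflects Google's real signals or ranking.

base_results = [
    {"url": "encyclopedia.example/egypt", "topics": {"history", "travel"}},
    {"url": "news.example/egypt-protests", "topics": {"news", "politics"}},
]

# Signals observable even for a logged-out user (device, browser,
# location, inferred interests, ...) -- all values hypothetical.
signals = {"device": "laptop", "browser": "firefox", "location": "NYC",
           "recent_interests": {"news", "politics"}}

def personalize(results, signals):
    """Re-rank the same query results differently for each user."""
    def score(result):
        # Toy rule: boost results overlapping the user's inferred interests.
        return len(result["topics"] & signals["recent_interests"])
    return sorted(results, key=score, reverse=True)

for r in personalize(base_results, signals):
    print(r["url"])
# Two users with different signals see the same query ranked differently:
# there is no single "standard" result page.
```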

02:35
And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's.

02:42
But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.

03:21
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways.

03:42
And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."

04:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.

04:38
So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time.

(Laughter)

05:27
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
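
The imbalance compounds, because each click feeds back into what gets shown next. Here is a minimal simulation of that feedback loop, with hypothetical categories and click probabilities; it is not any real recommender, just the reinforcement dynamic described above.

```python
# A minimal simulation of a click-feedback loop: a filter that ranks by
# past clicks alone drifts toward "dessert" and starves "vegetables".
# Categories and probabilities are hypothetical.

import random

random.seed(1)
categories = ["hard_news", "entertainment"]
click_prob = {"hard_news": 0.2, "entertainment": 0.8}  # impulsive present self
weight = {c: 1.0 for c in categories}

for _ in range(5000):
    # Show a story, favoring whatever has been clicked on before.
    shown = random.choices(categories,
                           weights=[weight[c] for c in categories])[0]
    if random.random() < click_prob[shown]:
        weight[shown] += 1.0  # each click makes more of the same appear

print(weight)
# With these toy numbers, entertainment typically ends up with far more
# weight than hard_news: an all-junk-food information diet.
```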

05:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now.

06:26
What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.

07:03
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web.

07:44
And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing. I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't.
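
As a sketch of what transparency and control could look like in code: a filter whose rules are plain, inspectable data and whose decisions the user can override. The rule format and interface here are hypothetical, not any platform's actual API.

```python
# A hypothetical sketch of a transparent, user-controllable filter.
# The rule format and methods are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class TransparentFilter:
    # Rules are plain data the user can inspect: topic -> "keep" or "drop".
    rules: dict = field(default_factory=lambda: {"politics": "keep",
                                                 "celebrity": "drop"})

    def explain(self):
        """Show the user exactly which rules decide what gets through."""
        return dict(self.rules)

    def set_rule(self, topic, action):
        """Let the user decide what gets through and what doesn't."""
        self.rules[topic] = action

    def apply(self, stories):
        """Unknown topics default to 'keep' rather than silent removal."""
        return [s for s in stories
                if self.rules.get(s["topic"], "keep") == "keep"]

f = TransparentFilter()
print(f.explain())               # the rules are visible, not hidden
f.set_rule("celebrity", "keep")  # and the user can override them
print(f.apply([{"topic": "celebrity", "title": "..."}]))
```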

08:24
Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

08:45
Thank you.

(Applause)