Translator: Ana Choi
Reviewer: Felix Leung
00:15
Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.
00:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.
01:10
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
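What's described here is, at bottom, a click-through filter. Below is a minimal sketch of that kind of logic in Python -- purely illustrative, not Facebook's actual algorithm; the `Post` shape, the scoring rule, and the hard cutoff are all assumptions:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    url: str

def rank_feed(posts: list[Post], clicked_authors: list[str]) -> list[Post]:
    """Order a feed by how often the reader has clicked each author's
    links before, then drop authors with no clicks at all."""
    clicks = Counter(clicked_authors)                    # author -> past clicks
    ranked = sorted(posts, key=lambda p: clicks[p.author], reverse=True)
    return [p for p in ranked if clicks[p.author] > 0]   # unclicked authors vanish

# The reader clicks liberal friends' links more than conservative ones...
history = ["alice", "alice", "bob", "alice"]
feed = [Post("alice", "liberal-blog/post"), Post("carol", "conservative-blog/post")]
print(rank_feed(feed, history))  # carol's post is edited out, without asking
```

The point of the sketch is the last line of the function: a ranking rule plus a cutoff silently removes whole voices from the feed, and the reader is never consulted.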
01:54
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
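The mechanics are easy to state: the results are a function of the query plus a bundle of side signals, not of the query alone. A toy sketch under that assumption (the signal names and the hashing are invented for illustration; the actual 57 signals are not public):

```python
import hashlib

def personalized_results(query: str, signals: dict[str, str]) -> list[str]:
    """Toy model: the result ordering depends on the query PLUS side
    signals (device, browser, location, ...), so two people can run
    the same query at the same moment and see different pages."""
    seed = "|".join([query] + [f"{k}={v}" for k, v in sorted(signals.items())])
    pool = [f"result-{i}" for i in range(10)]            # stand-in result pool
    return sorted(pool, key=lambda r: hashlib.sha256((seed + r).encode()).hexdigest())

# Same query, same moment, different signals -> different orderings.
print(personalized_results("egypt", {"device": "mac", "browser": "safari", "loc": "ME"}))
print(personalized_results("egypt", {"device": "pc", "browser": "chrome", "loc": "NY"}))
```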
02:35
And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.
03:21
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."
04:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.
04:38
So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time. (Laughter)
05:27
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
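That failure mode is easy to reproduce: a ranker that optimizes past clicks alone converges on dessert. A hedged sketch, with one crude counterweight -- a reserved quota for never-clicked items -- offered as an illustration, not as any company's actual fix:

```python
import random

def click_greedy(items: list[str], clicks: dict[str, int], k: int = 3) -> list[str]:
    """Pure 'relevance': rank by past clicks alone."""
    return sorted(items, key=lambda i: clicks.get(i, 0), reverse=True)[:k]

def balanced(items: list[str], clicks: dict[str, int], k: int = 3,
             veg_slots: int = 1) -> list[str]:
    """Same ranker, but reserve a slot or two for never-clicked items,
    so the 'vegetables' aren't starved out of the feed entirely."""
    dessert = click_greedy(items, clicks, k - veg_slots)
    vegetables = [i for i in items if clicks.get(i, 0) == 0]
    return dessert + random.sample(vegetables, min(veg_slots, len(vegetables)))

items = ["bieber", "cats", "gossip", "afghanistan", "elections"]
clicks = {"bieber": 9, "cats": 7, "gossip": 5}   # dessert gets all the clicks
print(click_greedy(items, clicks))  # ['bieber', 'cats', 'gossip'] -- junk food only
print(balanced(items, clicks))      # two desserts plus one vegetable
```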
05:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome.
06:22
But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.
07:03
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.
07:51
I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't.
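In code terms, the ask is modest: make the rules inspectable and the knobs user-settable. A purely hypothetical sketch of what such a filter interface could look like -- the field names are invented:

```python
from dataclasses import dataclass, field

@dataclass
class FilterPolicy:
    """A filter whose rules are visible and whose knobs belong to the reader."""
    demote_unclicked_authors: bool = True
    min_opposing_viewpoints: int = 2      # never zero out the other side
    muted_topics: set[str] = field(default_factory=set)

    def explain(self) -> str:
        """Transparency: state, in plain terms, what gets through and why."""
        return (f"demote_unclicked_authors={self.demote_unclicked_authors}, "
                f"min_opposing_viewpoints={self.min_opposing_viewpoints}, "
                f"muted_topics={sorted(self.muted_topics)}")

# Control: the reader, not the platform, flips the switches.
policy = FilterPolicy()
policy.demote_unclicked_authors = False   # "show me what I don't click on, too"
print(policy.explain())
```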
08:24
Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

08:45
Thank you.

08:47
(Applause)