The moral bias behind your search results | Andreas Ekström

145,437 views ・ 2015-12-07

TED



00:13
So whenever I visit a school and talk to students,
00:15
I always ask them the same thing:
00:18
Why do you Google?
00:20
Why is Google the search engine of choice for you?
00:24
Strangely enough, I always get the same three answers.
00:27
One, "Because it works,"
00:29
which is a great answer; that's why I Google, too.
00:32
Two, somebody will say,
00:34
"I really don't know of any alternatives."
00:37
It's not an equally great answer and my reply to that is usually,
00:40
"Try to Google the word 'search engine,'
00:42
you may find a couple of interesting alternatives."
00:45
And last but not least, thirdly,
00:47
inevitably, one student will raise her or his hand and say,
00:50
"With Google, I'm certain to always get the best, unbiased search result."
00:57
Certain to always get the best, unbiased search result.
01:05
Now, as a man of the humanities,
01:07
albeit a digital humanities man,
01:09
that just makes my skin curl,
01:11
even if I, too, realize that that trust, that idea of the unbiased search result
01:16
is a cornerstone in our collective love for and appreciation of Google.
01:20
I will show you why that, philosophically, is almost an impossibility.
01:24
But let me first elaborate, just a little bit, on a basic principle
01:28
behind each search query that we sometimes seem to forget.
01:31
So whenever you set out to Google something,
01:33
start by asking yourself this: "Am I looking for an isolated fact?"
01:38
What is the capital of France?
01:41
What are the building blocks of a water molecule?
01:43
Great -- Google away.
01:46
There's not a group of scientists who are this close to proving
01:49
that it's actually London and H3O.
01:51
You don't see a big conspiracy among those things.
01:53
We agree, on a global scale,
01:55
what the answers are to these isolated facts.
01:58
But if you complicate your question just a little bit and ask something like,
02:03
"Why is there an Israeli-Palestine conflict?"
02:06
You're not exactly looking for a singular fact anymore,
02:09
you're looking for knowledge,
02:11
which is something way more complicated and delicate.
02:14
And to get to knowledge,
02:16
you have to bring 10 or 20 or 100 facts to the table
02:19
and acknowledge them and say, "Yes, these are all true."
02:22
But because of who I am,
02:23
young or old, black or white, gay or straight,
02:26
I will value them differently.
02:27
And I will say, "Yes, this is true,
02:29
but this is more important to me than that."
02:31
And this is where it becomes interesting,
02:33
because this is where we become human.
02:35
This is when we start to argue, to form society.
02:38
And to really get somewhere, we need to filter all our facts here,
02:42
through friends and neighbors and parents and children
02:44
and coworkers and newspapers and magazines,
02:46
to finally be grounded in real knowledge,
02:50
which is something that a search engine is a poor help to achieve.
02:55
So, I promised you an example just to show you why it's so hard
03:01
to get to the point of true, clean, objective knowledge --
03:05
as food for thought.
03:06
I will conduct a couple of simple queries, search queries.
03:10
We'll start with "Michelle Obama,"
03:14
the First Lady of the United States.
03:16
And we'll click for pictures.
03:19
It works really well, as you can see.
03:21
It's a perfect search result, more or less.
03:24
It's just her in the picture, not even the President.
03:27
How does this work?
03:29
Quite simple.
03:31
Google uses a lot of smartness to achieve this, but quite simply,
03:34
they look at two things more than anything.
03:36
First, what does it say in the caption under the picture on each website?
03:41
Does it say "Michelle Obama" under the picture?
03:43
Pretty good indication it's actually her on there.
03:46
Second, Google looks at the picture file,
03:48
the name of the file as such uploaded to the website.
03:51
Again, is it called "MichelleObama.jpeg"?
03:54
Pretty good indication it's not Clint Eastwood in the picture.
03:57
So, you've got those two and you get a search result like this -- almost.
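
To make those two signals concrete, here is a minimal sketch in Python of a ranker that scores an image only by its caption and its filename. This is purely illustrative: the function, the scoring, and the sample data are invented for this page, and Google's real system uses far more signals than these two.

# Toy two-signal image ranker: caption text plus uploaded filename.
# Illustrative sketch only -- not Google's actual algorithm.

def two_signal_score(query: str, caption: str, filename: str) -> int:
    """Score 0-2: one point per signal that mentions the query."""
    q = query.lower()
    score = 0
    if q in caption.lower():
        score += 1  # signal 1: the caption under the picture
    if q.replace(" ", "") in filename.lower():
        score += 1  # signal 2: the name of the file as uploaded
    return score

images = [
    {"caption": "Michelle Obama at the White House", "filename": "MichelleObama.jpeg"},
    {"caption": "Press photo", "filename": "IMG_4031.jpeg"},
    {"caption": "Clint Eastwood on set", "filename": "eastwood.jpeg"},
]

# Rank the candidates for the query, best match first.
query = "Michelle Obama"
for img in sorted(images,
                  key=lambda i: two_signal_score(query, i["caption"], i["filename"]),
                  reverse=True):
    print(two_signal_score(query, img["caption"], img["filename"]), img["filename"])

Under this toy scoring, only the correctly captioned, correctly named file reaches the top -- which is also exactly why, as the talk goes on to show, these two signals are so easy to game.
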
04:02
Now, in 2009, Michelle Obama was the victim of a racist campaign,
04:08
where people set out to insult her through her search results.
04:13
There was a picture distributed widely over the Internet
04:16
where her face was distorted to look like a monkey.
04:18
And that picture was published all over.
04:22
And people published it very, very purposefully,
04:25
to get it up there in the search results.
04:27
They made sure to write "Michelle Obama" in the caption
04:30
and they made sure to upload the picture as "MichelleObama.jpeg," or the like.
04:34
You get why -- to manipulate the search result.
04:37
And it worked, too.
04:38
So when you picture-Googled for "Michelle Obama" in 2009,
04:41
that distorted monkey picture showed up among the first results.
04:44
Now, the results are self-cleansing,
04:48
and that's sort of the beauty of it,
04:50
because Google measures relevance every hour, every day.
04:53
However, Google didn't settle for that this time,
04:56
they just thought, "That's racist and it's a bad search result
04:59
and we're going to go back and clean that up manually.
05:02
We are going to write some code and fix it,"
05:05
which they did.
05:07
And I don't think anyone in this room thinks that was a bad idea.
05:11
Me neither.
05:14
But then, a couple of years go by,
05:17
and the world's most-Googled Anders,
05:20
Anders Behring Breivik,
05:23
did what he did.
05:24
This is July 22 in 2011,
05:26
and a terrible day in Norwegian history.
05:29
This man, a terrorist, blew up a couple of government buildings
05:33
walking distance from where we are right now in Oslo, Norway
05:36
and then he traveled to the island of Utøya
05:38
and shot and killed a group of kids.
05:41
Almost 80 people died that day.
05:44
And a lot of people would describe this act of terror as two steps,
05:48
that he did two things: he blew up the buildings and he shot those kids.
05:52
It's not true.
05:54
It was three steps.
05:56
He blew up those buildings, he shot those kids,
05:58
and he sat down and waited for the world to Google him.
06:03
And he prepared all three steps equally well.
06:06
And if there was somebody who immediately understood this,
06:09
it was a Swedish web developer,
06:10
a search engine optimization expert in Stockholm, named Nikke Lindqvist.
06:14
He's also a very political guy
06:16
and he was right out there in social media, on his blog and Facebook.
06:19
And he told everybody,
06:20
"If there's something that this guy wants right now,
06:23
it's to control the image of himself.
06:26
Let's see if we can distort that.
06:29
Let's see if we, in the civilized world, can protest against what he did
06:33
through insulting him in his search results."
06:36
And how?
06:38
He told all of his readers the following,
06:40
"Go out there on the Internet,
06:42
find pictures of dog poop on sidewalks --
06:46
find pictures of dog poop on sidewalks --
06:48
publish them in your feeds, on your websites, on your blogs.
06:52
Make sure to write the terrorist's name in the caption,
06:55
make sure to name the picture file "Breivik.jpeg."
06:59
Let's teach Google that that's the face of the terrorist."
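
As an aside, the same toy scorer from the earlier sketch shows why these instructions work: neither the caption signal nor the filename signal says anything about what the picture actually shows, so a flood of planted images is indistinguishable from genuine ones. The data below is hypothetical, for illustration only.

# Same toy scorer as in the earlier sketch; hypothetical data.

def two_signal_score(query: str, caption: str, filename: str) -> int:
    q = query.lower()
    return int(q in caption.lower()) + int(q.replace(" ", "") in filename.lower())

genuine = [{"caption": "Anders Behring Breivik in court", "filename": "breivik-court.jpeg"}]
planted = [{"caption": "Breivik", "filename": "Breivik.jpeg"}] * 100  # campaign uploads

# Every page that names him scores the same maximum of 2, so the planted
# pictures dominate the top of the ranking by sheer volume.
results = sorted(planted + genuine,
                 key=lambda i: two_signal_score("Breivik", i["caption"], i["filename"]),
                 reverse=True)
print([img["filename"] for img in results[:3]])  # ['Breivik.jpeg', 'Breivik.jpeg', 'Breivik.jpeg']
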
07:05
And it worked.
07:07
Two years after that campaign against Michelle Obama,
07:10
this manipulation campaign against Anders Behring Breivik worked.
07:14
If you picture-Googled for him weeks after the July 22 events from Sweden,
07:18
you'd see that picture of dog poop high up in the search results,
07:22
as a little protest.
07:25
Strangely enough, Google didn't intervene this time.
07:30
They did not step in and manually clean those search results up.
07:35
So the million-dollar question,
07:37
is there anything different between these two happenings here?
07:41
Is there anything different between what happened to Michelle Obama
07:44
and what happened to Anders Behring Breivik?
07:46
Of course not.
07:48
It's the exact same thing,
07:50
yet Google intervened in one case and not in the other.
07:53
Why?
07:55
Because Michelle Obama is an honorable person, that's why,
07:58
and Anders Behring Breivik is a despicable person.
08:02
See what happens there?
08:03
An evaluation of a person takes place
08:06
and there's only one power-player in the world
08:10
with the authority to say who's who.
08:13
"We like you, we dislike you.
08:15
We believe in you, we don't believe in you.
08:17
You're right, you're wrong. You're true, you're false.
08:20
You're Obama, and you're Breivik."
08:22
That's power if I ever saw it.
08:27
So I'm asking you to remember that behind every algorithm
08:30
is always a person,
08:32
a person with a set of personal beliefs
08:35
that no code can ever completely eradicate.
08:37
And my message goes out not only to Google,
08:40
but to all believers in the faith of code around the world.
08:43
You need to identify your own personal bias.
08:46
You need to understand that you are human
08:48
and take responsibility accordingly.
08:51
And I say this because I believe we've reached a point in time
08:54
when it's absolutely imperative
08:56
that we tie those bonds together again, tighter:
08:59
the humanities and the technology.
09:02
Tighter than ever.
09:04
And, if nothing else, to remind us that that wonderfully seductive idea
09:07
of the unbiased, clean search result
09:10
is, and is likely to remain, a myth.
09:13
Thank you for your time.
09:15
(Applause)