The moral bias behind your search results | Andreas Ekström

142,235 views ・ 2015-12-07

TED



Translator: Jingyao Xue  Reviewer: Zhiting Chen
00:13
So whenever I visit a school and talk to students, I always ask them the same thing: Why do you Google? Why is Google the search engine of choice for you? Strangely enough, I always get the same three answers. One, "Because it works," which is a great answer; that's why I Google, too. Two, somebody will say, "I really don't know of any alternatives." It's not an equally great answer and my reply to that is usually, "Try to Google the word 'search engine,' you may find a couple of interesting alternatives." And last but not least, thirdly, inevitably, one student will raise her or his hand and say, "With Google, I'm certain to always get the best, unbiased search result."

00:57
Certain to always get the best, unbiased search result.

01:05
Now, as a man of the humanities, albeit a digital humanities man, that just makes my skin curl, even if I, too, realize that that trust, that idea of the unbiased search result is a cornerstone in our collective love for and appreciation of Google. I will show you why that, philosophically, is almost an impossibility. But let me first elaborate, just a little bit, on a basic principle behind each search query that we sometimes seem to forget.

01:31
So whenever you set out to Google something, start by asking yourself this: "Am I looking for an isolated fact?" What is the capital of France? What are the building blocks of a water molecule? Great -- Google away. There's not a group of scientists who are this close to proving that it's actually London and H30. You don't see a big conspiracy among those things. We agree, on a global scale, what the answers are to these isolated facts.

01:58
But if you complicate your question just a little bit and ask something like, "Why is there an Israeli-Palestine conflict?" You're not exactly looking for a singular fact anymore, you're looking for knowledge, which is something way more complicated and delicate. And to get to knowledge, you have to bring 10 or 20 or 100 facts to the table and acknowledge them and say, "Yes, these are all true." But because of who I am, young or old, black or white, gay or straight, I will value them differently. And I will say, "Yes, this is true, but this is more important to me than that." And this is where it becomes interesting, because this is where we become human. This is when we start to argue, to form society. And to really get somewhere, we need to filter all our facts here, through friends and neighbors and parents and children and coworkers and newspapers and magazines, to finally be grounded in real knowledge, which is something that a search engine is a poor help to achieve.

02:55
So, I promised you an example just to show you why it's so hard to get to the point of true, clean, objective knowledge -- as food for thought. I will conduct a couple of simple queries, search queries. We'll start with "Michelle Obama," the First Lady of the United States. And we'll click for pictures.

03:19
It works really well, as you can see. It's a perfect search result, more or less. It's just her in the picture, not even the President.

03:27
How does this work? Quite simple. Google uses a lot of smartness to achieve this, but quite simply, they look at two things more than anything. First, what does it say in the caption under the picture on each website? Does it say "Michelle Obama" under the picture? Pretty good indication it's actually her on there. Second, Google looks at the picture file, the name of the file as such uploaded to the website. Again, is it called "MichelleObama.jpeg"? Pretty good indication it's not Clint Eastwood in the picture. So, you've got those two and you get a search result like this -- almost.
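The talk names two signals behind an image result: the caption text around the picture on the hosting page, and the picture's file name. Below is a minimal sketch of that idea in Python, assuming a toy scoring scheme rather than Google's actual ranking; the function names, weights, and sample data are all illustrative.

```python
# Toy sketch, not Google's actual algorithm: rank image candidates for a query
# using only the two signals mentioned in the talk -- the caption on the
# hosting page and the uploaded file name. Everything here is illustrative.

def score_image(query: str, caption: str, filename: str) -> int:
    """Crude relevance score: count query terms found in caption and filename."""
    score = 0
    for term in query.lower().split():
        if term in caption.lower():
            score += 2   # caption match, e.g. "Michelle Obama" under the picture
        if term in filename.lower():
            score += 1   # filename match, e.g. "MichelleObama.jpeg"
    return score


candidates = [
    {"caption": "Michelle Obama at the White House", "filename": "MichelleObama.jpeg"},
    {"caption": "Clint Eastwood on set", "filename": "eastwood.jpg"},
]

query = "Michelle Obama"
for image in sorted(candidates,
                    key=lambda c: score_image(query, c["caption"], c["filename"]),
                    reverse=True):
    print(score_image(query, image["caption"], image["filename"]), image["filename"])
```

Because page owners control both of these signals themselves, they are exactly the inputs the manipulation campaigns described next set out to exploit.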
04:02
Now, in 2009, Michelle Obama was the victim of a racist campaign, where people set out to insult her through her search results. There was a picture distributed widely over the Internet where her face was distorted to look like a monkey. And that picture was published all over. And people published it very, very purposefully, to get it up there in the search results. They made sure to write "Michelle Obama" in the caption and they made sure to upload the picture as "MichelleObama.jpeg," or the like. You get why -- to manipulate the search result. And it worked, too. So when you picture-Googled for "Michelle Obama" in 2009, that distorted monkey picture showed up among the first results.

04:44
Now, the results are self-cleansing, and that's sort of the beauty of it, because Google measures relevance every hour, every day. However, Google didn't settle for that this time, they just thought, "That's racist and it's a bad search result and we're going to go back and clean that up manually. We are going to write some code and fix it," which they did. And I don't think anyone in this room thinks that was a bad idea. Me neither.

05:14
But then, a couple of years go by, and the world's most-Googled Anders, Anders Behring Breivik, did what he did. This is July 22 in 2011, and a terrible day in Norwegian history. This man, a terrorist, blew up a couple of government buildings walking distance from where we are right now in Oslo, Norway and then he traveled to the island of Utøya and shot and killed a group of kids. Almost 80 people died that day.

05:44
And a lot of people would describe this act of terror as two steps, that he did two things: he blew up the buildings and he shot those kids. It's not true. It was three steps. He blew up those buildings, he shot those kids, and he sat down and waited for the world to Google him. And he prepared all three steps equally well.

06:06
And if there was somebody who immediately understood this, it was a Swedish web developer, a search engine optimization expert in Stockholm, named Nikke Lindqvist. He's also a very political guy and he was right out there in social media, on his blog and Facebook. And he told everybody, "If there's something that this guy wants right now, it's to control the image of himself. Let's see if we can distort that. Let's see if we, in the civilized world, can protest against what he did through insulting him in his search results."

06:36
And how? He told all of his readers the following, "Go out there on the Internet, find pictures of dog poop on sidewalks -- find pictures of dog poop on sidewalks -- publish them in your feeds, on your websites, on your blogs. Make sure to write the terrorist's name in the caption, make sure to name the picture file "Breivik.jpeg." Let's teach Google that that's the face of the terrorist."

07:05
And it worked. Two years after that campaign against Michelle Obama, this manipulation campaign against Anders Behring Breivik worked. If you picture-Googled for him weeks after the July 22 events from Sweden, you'd see that picture of dog poop high up in the search results, as a little protest.

07:25
Strangely enough, Google didn't intervene this time. They did not step in and manually clean those search results up.

07:35
So the million-dollar question, is there anything different between these two happenings here? Is there anything different between what happened to Michelle Obama and what happened to Anders Behring Breivik? Of course not. It's the exact same thing, yet Google intervened in one case and not in the other. Why?

07:55
Because Michelle Obama is an honorable person, that's why, and Anders Behring Breivik is a despicable person. See what happens there? An evaluation of a person takes place and there's only one power-player in the world with the authority to say who's who. "We like you, we dislike you. We believe in you, we don't believe in you. You're right, you're wrong. You're true, you're false. You're Obama, and you're Breivik." That's power if I ever saw it.

08:27
So I'm asking you to remember that behind every algorithm is always a person, a person with a set of personal beliefs that no code can ever completely eradicate. And my message goes out not only to Google, but to all believers in the faith of code around the world. You need to identify your own personal bias. You need to understand that you are human and take responsibility accordingly.

08:51
And I say this because I believe we've reached a point in time when it's absolutely imperative that we tie those bonds together again, tighter: the humanities and the technology. Tighter than ever. And, if nothing else, to remind us that that wonderfully seductive idea of the unbiased, clean search result is, and is likely to remain, a myth.

09:13
Thank you for your time.

09:15
(Applause)