Del Harvey: The strangeness of scale at Twitter

106,770 views ・ 2014-03-27

TED


00:12
My job at Twitter is to ensure user trust, protect user rights and keep users safe, both from each other and, at times, from themselves.

00:24
Let's talk about what scale looks like at Twitter. Back in January 2009, we saw more than two million new tweets each day on the platform. January 2014, more than 500 million. We were seeing two million tweets in less than six minutes. That's a 24,900-percent increase.
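A quick back-of-the-envelope check of those figures, assuming a uniform tweet rate (the variable names below are just for illustration):

```python
# Rough check of the scale figures quoted above, assuming a uniform tweet rate.
daily_2009 = 2_000_000      # tweets per day, January 2009
daily_2014 = 500_000_000    # tweets per day, January 2014

percent_increase = (daily_2014 - daily_2009) / daily_2009 * 100
print(percent_increase)     # 24900.0 -> the "24,900-percent increase"

minutes_for_two_million = daily_2009 / daily_2014 * 24 * 60
print(round(minutes_for_two_million, 2))  # 5.76 -> two million tweets in under six minutes
```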
00:54
Now, the vast majority of activity on Twitter puts no one in harm's way. There's no risk involved. My job is to root out and prevent activity that might. Sounds straightforward, right? You might even think it'd be easy, given that I just said the vast majority of activity on Twitter puts no one in harm's way. Why spend so much time searching for potential calamities in innocuous activities? Given the scale that Twitter is at, a one-in-a-million chance happens 500 times a day. It's the same for other companies dealing at this sort of scale. For us, edge cases, those rare situations that are unlikely to occur, are more like norms.

01:44
Say 99.999 percent of tweets pose no risk to anyone. There's no threat involved. Maybe people are documenting travel landmarks like Australia's Heart Reef, or tweeting about a concert they're attending, or sharing pictures of cute baby animals. After you take out that 99.999 percent, that tiny percentage of tweets remaining works out to roughly 150,000 per month. The sheer scale of what we're dealing with makes for a challenge.
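The one-in-a-million and 99.999-percent figures check out the same way, assuming roughly 500 million tweets per day and a 30-day month:

```python
# One-in-a-million events at ~500 million tweets/day, and the residue of 99.999%.
daily_tweets = 500_000_000

one_in_a_million_per_day = daily_tweets * (1 / 1_000_000)
print(one_in_a_million_per_day)              # 500.0 -> "happens 500 times a day"

risky_fraction = 1 - 0.99999                 # the 0.001% that is not harmless
per_month = daily_tweets * risky_fraction * 30
print(round(per_month))                      # ~150,000 -> "roughly 150,000 per month"
```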
02:22
You know what else makes my role particularly challenging? People do weird things. (Laughter) And I have to figure out what they're doing, why, and whether or not there's risk involved, often without much in terms of context or background. I'm going to show you some examples that I've run into during my time at Twitter -- these are all real examples -- of situations that at first seemed cut and dried, but the truth of the matter was something altogether different. The details have been changed to protect the innocent and sometimes the guilty.

03:00
We'll start off easy. ["Yo bitch"] If you saw a Tweet that only said this, you might think to yourself, "That looks like abuse." After all, why would you want to receive the message, "Yo, bitch." Now, I try to stay relatively hip to the latest trends and memes, so I knew that "yo, bitch" was also often a common greeting between friends, as well as being a popular "Breaking Bad" reference. I will admit that I did not expect to encounter a fourth use case. It turns out it is also used on Twitter when people are role-playing as dogs. (Laughter) And in fact, in that case, it's not only not abusive, it's technically just an accurate greeting. (Laughter)

04:00
So okay, determining whether or not something is abusive without context, definitely hard. Let's look at spam. Here's an example of an account engaged in classic spammer behavior, sending the exact same message to thousands of people. While this is a mockup I put together using my account, we see accounts doing this all the time. Seems pretty straightforward. We should just automatically suspend accounts engaging in this kind of behavior. Turns out there's some exceptions to that rule. Turns out that that message could also be a notification you signed up for that the International Space Station is passing overhead because you wanted to go outside and see if you could see it. You're not going to get that chance if we mistakenly suspend the account thinking it's spam.
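To make the shape of that problem concrete, here is a purely illustrative toy heuristic, not Twitter's actual system; the function name, threshold, and opt-in list are all invented. It flags senders who push identical text to many recipients, and it shows the kind of exception the talk describes:

```python
# Toy duplicate-message heuristic, for illustration only (not Twitter's system).
from collections import Counter

def flag_repeated_senders(messages, threshold=1000, opted_in=frozenset()):
    """messages: iterable of (sender, recipient, text) tuples.

    Returns the senders that repeated the same text at least `threshold` times,
    unless they are on an explicit opt-in list (e.g., a notification service
    users signed up for, like an ISS-overhead alert).
    """
    counts = Counter((sender, text) for sender, _recipient, text in messages)
    return {
        sender
        for (sender, text), n in counts.items()
        if n >= threshold and sender not in opted_in
    }
```

A rule like this would catch the mockup spammer, but only spares the ISS-notification case because it is carved out explicitly; as the talk goes on to say, real decisions weigh many more signals than one count.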
04:48
Okay. Let's make the stakes higher. Back to my account, again exhibiting classic behavior. This time it's sending the same message and link. This is often indicative of something called phishing, somebody trying to steal another person's account information by directing them to another website. That's pretty clearly not a good thing. We want to, and do, suspend accounts engaging in that kind of behavior. So why are the stakes higher for this? Well, this could also be a bystander at a rally who managed to record a video of a police officer beating a non-violent protester who's trying to let the world know what's happening. We don't want to gamble on potentially silencing that crucial speech by classifying it as spam and suspending it. That means we evaluate hundreds of parameters when looking at account behaviors, and even then, we can still get it wrong and have to reevaluate.

05:46
Now, given the sorts of challenges I'm up against, it's crucial that I not only predict but also design protections for the unexpected. And that's not just an issue for me, or for Twitter, it's an issue for you. It's an issue for anybody who's building or creating something that you think is going to be amazing and will let people do awesome things. So what do I do? I pause and I think, how could all of this go horribly wrong? I visualize catastrophe. And that's hard. There's a sort of inherent cognitive dissonance in doing that, like when you're writing your wedding vows at the same time as your prenuptial agreement. (Laughter) But you still have to do it, particularly if you're marrying 500 million tweets per day.

06:43
What do I mean by "visualize catastrophe?" I try to think of how something as benign and innocuous as a picture of a cat could lead to death, and what to do to prevent that. Which happens to be my next example. This is my cat, Eli. We wanted to give users the ability to add photos to their tweets. A picture is worth a thousand words. You only get 140 characters. You add a photo to your tweet, look at how much more content you've got now. There's all sorts of great things you can do by adding a photo to a tweet. My job isn't to think of those. It's to think of what could go wrong.

07:22
How could this picture lead to my death? Well, here's one possibility. There's more in that picture than just a cat. There's geodata. When you take a picture with your smartphone or digital camera, there's a lot of additional information saved along in that image. In fact, this image also contains the equivalent of this, more specifically, this. Sure, it's not likely that someone's going to try to track me down and do me harm based upon image data associated with a picture I took of my cat, but I start by assuming the worst will happen. That's why, when we launched photos on Twitter, we made the decision to strip that geodata out. (Applause)
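The geodata in question lives in the image's EXIF metadata (its GPS block). A minimal sketch of the general technique with the Pillow library; this is an assumption about how one might strip it, not Twitter's actual pipeline:

```python
# Minimal sketch: re-save an image from its pixel data only, so EXIF metadata
# (including the GPS block carrying latitude/longitude) is not copied over.
# Assumes the Pillow library (pip install Pillow); not Twitter's actual pipeline.
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    with Image.open(src_path) as img:
        rgb = img.convert("RGB")             # normalize mode; drops alpha/palette
        clean = Image.new("RGB", rgb.size)
        clean.putdata(list(rgb.getdata()))   # copy pixel data, not metadata
        clean.save(dst_path)                 # saved without an exif= argument
```

Re-encoding from pixel data is the bluntest way to drop metadata; a gentler variant would copy the EXIF block minus its GPS entries, at the cost of more format-specific handling.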
08:13
If I start by assuming the worst and work backwards, I can make sure that the protections we build work for both expected and unexpected use cases. Given that I spend my days and nights imagining the worst that could happen, it wouldn't be surprising if my worldview was gloomy. (Laughter) It's not. The vast majority of interactions I see -- and I see a lot, believe me -- are positive, people reaching out to help or to connect or share information with each other. It's just that for those of us dealing with scale, for those of us tasked with keeping people safe, we have to assume the worst will happen, because for us, a one-in-a-million chance is pretty good odds.

09:05
Thank you. (Applause)