Del Harvey: The strangeness of scale at Twitter

105,103 views ・ 2014-03-27

TED



Translator: Yamei Huang  Reviewer: Adrienne Lin
00:12
My job at Twitter is to ensure user trust, protect user rights and keep users safe, both from each other and, at times, from themselves.
00:24
Let's talk about what scale looks like at Twitter. Back in January 2009, we saw more than two million new tweets each day on the platform. January 2014, more than 500 million. We were seeing two million tweets in less than six minutes. That's a 24,900-percent increase.
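A quick arithmetic check shows the figures quoted above are consistent with one another (taking a day as 1,440 minutes):

\[
\frac{500{,}000{,}000 - 2{,}000{,}000}{2{,}000{,}000} \times 100\% = 24{,}900\%,
\qquad
\frac{2{,}000{,}000}{500{,}000{,}000 / 1{,}440\ \text{min}} \approx 5.8\ \text{min}
\]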
00:54
Now, the vast majority of activity on Twitter puts no one in harm's way. There's no risk involved. My job is to root out and prevent activity that might. Sounds straightforward, right? You might even think it'd be easy, given that I just said the vast majority of activity on Twitter puts no one in harm's way. Why spend so much time searching for potential calamities in innocuous activities? Given the scale that Twitter is at, a one-in-a-million chance happens 500 times a day.
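That "500 times a day" follows directly from the daily volume quoted above:

\[
500{,}000{,}000\ \text{tweets/day} \times \frac{1}{1{,}000{,}000} = 500\ \text{tweets/day}
\]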
01:33
It's the same for other companies dealing at this sort of scale. For us, edge cases, those rare situations that are unlikely to occur, are more like norms. Say 99.999 percent of tweets pose no risk to anyone. There's no threat involved. Maybe people are documenting travel landmarks like Australia's Heart Reef, or tweeting about a concert they're attending, or sharing pictures of cute baby animals. After you take out that 99.999 percent, that tiny percentage of tweets remaining works out to roughly 150,000 per month.
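The same daily volume gives that monthly figure, assuming a 30-day month:

\[
500{,}000{,}000 \times (1 - 0.99999) = 5{,}000\ \text{tweets/day} \approx 150{,}000\ \text{tweets/month}
\]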
02:17
The sheer scale of what we're dealing with makes for a challenge. You know what else makes my role particularly challenging? People do weird things. (Laughter)
02:33
And I have to figure out what they're doing, why, and whether or not there's risk involved, often without much in terms of context or background. I'm going to show you some examples that I've run into during my time at Twitter -- these are all real examples -- of situations that at first seemed cut and dried, but the truth of the matter was something altogether different. The details have been changed to protect the innocent and sometimes the guilty.
03:00
We'll start off easy. ["Yo bitch"] If you saw a Tweet that only said this, you might think to yourself, "That looks like abuse." After all, why would you want to receive the message, "Yo, bitch." Now, I try to stay relatively hip to the latest trends and memes, so I knew that "yo, bitch" was also often a common greeting between friends, as well as being a popular "Breaking Bad" reference. I will admit that I did not expect to encounter a fourth use case. It turns out it is also used on Twitter when people are role-playing as dogs. (Laughter) And in fact, in that case, it's not only not abusive, it's technically just an accurate greeting. (Laughter)
04:00
So okay, determining whether or not something is abusive without context, definitely hard. Let's look at spam. Here's an example of an account engaged in classic spammer behavior, sending the exact same message to thousands of people. While this is a mockup I put together using my account, we see accounts doing this all the time. Seems pretty straightforward. We should just automatically suspend accounts engaging in this kind of behavior. Turns out there's some exceptions to that rule. Turns out that that message could also be a notification you signed up for that the International Space Station is passing overhead because you wanted to go outside and see if you could see it. You're not going to get that chance if we mistakenly suspend the account thinking it's spam.
04:48
Okay. Let's make the stakes higher. Back to my account, again exhibiting classic behavior. This time it's sending the same message and link. This is often indicative of something called phishing, somebody trying to steal another person's account information by directing them to another website. That's pretty clearly not a good thing. We want to, and do, suspend accounts engaging in that kind of behavior.
05:16
So why are the stakes higher for this? Well, this could also be a bystander at a rally who managed to record a video of a police officer beating a non-violent protester who's trying to let the world know what's happening. We don't want to gamble on potentially silencing that crucial speech by classifying it as spam and suspending it. That means we evaluate hundreds of parameters when looking at account behaviors, and even then, we can still get it wrong and have to reevaluate.
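Twitter's actual parameters aren't public, but the idea of weighing many behavioral signals together, rather than suspending on any single rule, can be sketched roughly as follows. Everything here -- the signal names, weights, and threshold -- is invented purely for illustration:

# Hypothetical sketch: score an account across several signals instead of
# auto-suspending on any one behavior. Names and weights are invented.
SIGNAL_WEIGHTS = {
    "identical_messages": 0.4,        # same text sent to many users
    "contains_link": 0.2,             # messages carry a URL
    "account_created_this_week": 0.2, # very new account
    "recipients_opted_in": -0.5,      # recipients chose to follow the account
    "known_notification_service": -0.6,
}

def spam_score(signals: dict[str, bool]) -> float:
    """Sum the weights of the signals that fired for this account."""
    return sum(weight for name, weight in SIGNAL_WEIGHTS.items() if signals.get(name))

def flag_for_review(signals: dict[str, bool], threshold: float = 0.5) -> bool:
    # Route high-scoring accounts to review rather than suspending outright.
    return spam_score(signals) >= threshold

# An ISS-pass notification bot: highly repetitive, but its recipients asked for it.
iss_bot = {"identical_messages": True, "contains_link": True,
           "recipients_opted_in": True, "known_notification_service": True}
print(flag_for_review(iss_bot))  # False: repetition alone doesn't condemn the account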
05:46
Now, given the sorts of challenges I'm up against, it's crucial that I not only predict but also design protections for the unexpected. And that's not just an issue for me, or for Twitter, it's an issue for you. It's an issue for anybody who's building or creating something that you think is going to be amazing and will let people do awesome things.
06:08
So what do I do? I pause and I think, how could all of this go horribly wrong? I visualize catastrophe. And that's hard. There's a sort of inherent cognitive dissonance in doing that, like when you're writing your wedding vows at the same time as your prenuptial agreement. (Laughter) But you still have to do it, particularly if you're marrying 500 million tweets per day.
06:43
What do I mean by "visualize catastrophe?" I try to think of how something as benign and innocuous as a picture of a cat could lead to death, and what to do to prevent that. Which happens to be my next example. This is my cat, Eli. We wanted to give users the ability to add photos to their tweets. A picture is worth a thousand words. You only get 140 characters. You add a photo to your tweet, look at how much more content you've got now. There's all sorts of great things you can do by adding a photo to a tweet. My job isn't to think of those. It's to think of what could go wrong.
07:22
How could this picture lead to my death? Well, here's one possibility. There's more in that picture than just a cat. There's geodata. When you take a picture with your smartphone or digital camera, there's a lot of additional information saved along in that image. In fact, this image also contains the equivalent of this, more specifically, this. Sure, it's not likely that someone's going to try to track me down and do me harm based upon image data associated with a picture I took of my cat, but I start by assuming the worst will happen. That's why, when we launched photos on Twitter, we made the decision to strip that geodata out. (Applause)
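Twitter's own pipeline isn't public, but the general technique -- dropping the GPS block from a photo's EXIF metadata before publishing it -- looks roughly like this minimal sketch using the Pillow library (the file names are placeholders):

from PIL import Image  # pip install Pillow

GPS_IFD_TAG = 0x8825  # standard EXIF tag ID for the GPSInfo block

def strip_geodata(src_path: str, dst_path: str) -> None:
    """Re-save an image with the GPS portion of its EXIF metadata removed."""
    img = Image.open(src_path)
    exif = img.getexif()         # empty mapping if the image carries no EXIF
    exif.pop(GPS_IFD_TAG, None)  # drop latitude, longitude, altitude, timestamp
    img.save(dst_path, exif=exif.tobytes())

strip_geodata("cat.jpg", "cat_without_gps.jpg")  # placeholder file names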
08:13
If I start by assuming the worst and work backwards, I can make sure that the protections we build work for both expected and unexpected use cases. Given that I spend my days and nights imagining the worst that could happen, it wouldn't be surprising if my worldview was gloomy. (Laughter) It's not. The vast majority of interactions I see -- and I see a lot, believe me -- are positive, people reaching out to help or to connect or share information with each other. It's just that for those of us dealing with scale, for those of us tasked with keeping people safe, we have to assume the worst will happen, because for us, a one-in-a-million chance is pretty good odds.
09:05
Thank you. (Applause)