Translator: Val Zhang
Reviewer: Xiaowei Dong
00:13
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?
00:23
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation. So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.
01:20
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women.

01:35
And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that a subset of active black female Twitter users were receiving, on average, one in 10 of their tweets were some form of harassment. And so when you think about health for the community on Twitter, I'm interested to hear, "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?
02:05
JD: Yeah. So it's a pretty terrible situation when you're coming to a service that, ideally, you want to learn something about the world, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment. So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse.

02:51
So about midway last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.
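Dorsey's description above amounts to a proactive-detection pipeline with a human in the loop: a model scores tweets for likely abuse, and flagged items go to a review queue rather than being removed automatically. The sketch below only illustrates that flow; the class names, keyword markers, and threshold are hypothetical, not Twitter's actual system.

```python
# Minimal sketch of proactive abuse flagging with a human in the loop.
# Everything here (markers, threshold, queue) is a hypothetical stand-in
# for the flow Dorsey describes: a classifier surfaces likely abuse,
# but nothing is taken down without a human reviewer.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Tweet:
    tweet_id: str
    text: str

@dataclass
class ReviewQueue:
    items: List[Tuple[float, Tweet]] = field(default_factory=list)

    def enqueue(self, score: float, tweet: Tweet) -> None:
        self.items.append((score, tweet))
        self.items.sort(key=lambda pair: pair[0], reverse=True)  # worst first

def abuse_score(tweet: Tweet) -> float:
    """Stand-in for a learned abuse classifier (e.g. a fine-tuned text model)."""
    toxic_markers = ("kill yourself", "you people", "go back to")
    return 1.0 if any(m in tweet.text.lower() for m in toxic_markers) else 0.1

def triage(tweets: List[Tweet], queue: ReviewQueue, threshold: float = 0.8) -> None:
    # Proactive step: flag likely abuse without waiting for a victim report...
    for tweet in tweets:
        score = abuse_score(tweet)
        if score >= threshold:
            # ...but never act automatically; a human reviewer makes the call.
            queue.enqueue(score, tweet)

if __name__ == "__main__":
    q = ReviewQueue()
    triage([Tweet("1", "lovely sunset today"),
            Tweet("2", "go back to where you came from")], q)
    for score, t in q.items:
        print(f"needs human review: {t.tweet_id} (score={score})")
```

In a real system the scoring function would be a trained deep-learning classifier; the structural point of the sketch is that the model only prioritizes review, and no content or account comes down until a person looks at it.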
03:46
The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls that actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it and also give our customers a much better and easier approach to handle some of the things that they're seeing. So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.
05:05
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?
05:15
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift to bias the entire network away from just an account bias towards a topics and interest bias.
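As a rough illustration of the interest-based shift he describes, the hypothetical structures below show how following a single topic could fan out into the associated accounts, hashtags, and moments, instead of requiring the user to find and follow each account individually. The field and function names are invented for the example.

```python
# A small sketch of "follow an interest, not just an account."
# The data model is hypothetical; it only illustrates how one topic follow
# could pull in the accounts, hashtags, and moments Dorsey mentions.

from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class Topic:
    name: str
    accounts: Set[str] = field(default_factory=set)
    hashtags: Set[str] = field(default_factory=set)
    moments: List[str] = field(default_factory=list)

@dataclass
class User:
    handle: str
    followed_accounts: Set[str] = field(default_factory=set)
    followed_topics: Set[str] = field(default_factory=set)

def timeline_sources(user: User, topics: Dict[str, Topic]) -> Dict[str, Set[str]]:
    """Collect everything a topic-biased timeline could draw from."""
    accounts = set(user.followed_accounts)
    hashtags: Set[str] = set()
    for name in user.followed_topics:
        topic = topics.get(name)
        if topic:
            accounts |= topic.accounts   # one follow pulls in many accounts
            hashtags |= topic.hashtags
    return {"accounts": accounts, "hashtags": hashtags}

if __name__ == "__main__":
    nba = Topic("nba", accounts={"@espn", "@nba"}, hashtags={"#NBAFinals"})
    fan = User("@reader", followed_topics={"nba"})
    print(timeline_sources(fan, {"nba": nba}))
```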
06:15
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious, like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?
07:12
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not. If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now.

08:08
So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase?" And I don't believe that's the case right now.
08:28
(Applause)
08:30
WPR: I think we should look at some of the tweets that are coming in from the audience as well.
08:35
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.
08:53
WPR: I think one I saw that passed already quickly down here, "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections.

09:31
And in this example we have from Zignal which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is. And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some case influenced by this. And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?
10:23
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system.

11:05
And the first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil or the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation? And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier.

12:00
So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic, with some pretty high degree. We're working to measure the rest, and the next step is, as we build up solutions, to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.
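The four starter indicators can be pictured as simple aggregate measurements over the posts in a conversation. The formulas below are illustrative assumptions only; they are not the definitions Cortico and Twitter actually use, and the toxicity score is treated as an input from some separate model.

```python
# Illustrative calculations for the four "conversational health" indicators
# Dorsey lists: shared attention, shared reality, receptivity, and variety
# of perspective. The formulas are assumptions made for this example.

from collections import Counter
from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    topic: str          # what the post is about
    fact_source: str    # which set of claims it relies on
    toxicity: float     # 0.0 (civil) .. 1.0 (toxic), from some toxicity model
    viewpoint: str      # rough perspective cluster

def shared_attention(posts: List[Post]) -> float:
    """Share of the conversation on its single most common topic."""
    counts = Counter(p.topic for p in posts)
    return counts.most_common(1)[0][1] / len(posts)

def shared_reality(posts: List[Post]) -> float:
    """Share of posts drawing on the most common fact set (true or not)."""
    counts = Counter(p.fact_source for p in posts)
    return counts.most_common(1)[0][1] / len(posts)

def receptivity(posts: List[Post]) -> float:
    """Inverse of average toxicity: higher means more civil."""
    return 1.0 - sum(p.toxicity for p in posts) / len(posts)

def variety_of_perspective(posts: List[Post]) -> float:
    """Fraction of distinct viewpoints present (1.0 = every post differs)."""
    return len({p.viewpoint for p in posts}) / len(posts)

if __name__ == "__main__":
    sample = [
        Post("election", "report_a", 0.2, "left"),
        Post("election", "report_a", 0.7, "right"),
        Post("weather", "report_b", 0.1, "center"),
    ]
    for fn in (shared_attention, shared_reality, receptivity, variety_of_perspective):
        print(fn.__name__, round(fn(sample), 2))
```

The trade-off Dorsey mentions is visible even in this toy framing: pushing variety of perspective up tends to spread posts across more fact sources, which can pull shared reality down.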
12:51
CA: Just picking up on some of the questions flooding in here.
12:56
JD: Constant questioning.
12:58
CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?
13:08
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately.

13:47
We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?
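The two checks Dorsey lists can be read as a small decision rule: act if the account is associated with a designated violent extremist group, or if its imagery or conduct associates it with one. The sketch below is a toy rendering of that rule with hypothetical field names; it is not Twitter's enforcement logic.

```python
# Toy rendering of the two conduct-based checks Dorsey describes.
# Field names, the group list, and the boolean outcome are hypothetical.

from dataclasses import dataclass, field
from typing import Set

DESIGNATED_GROUPS = {"kkk", "american nazi party"}  # the policy examples cited

@dataclass
class Account:
    handle: str
    affiliations: Set[str] = field(default_factory=set)
    uses_hateful_imagery: bool = False
    harassment_reports_upheld: int = 0

def should_action(account: Account) -> bool:
    # Check 1: direct association with a designated violent extremist group.
    associated = bool({a.lower() for a in account.affiliations} & DESIGNATED_GROUPS)
    # Check 2: imagery or conduct that associates the account with such a group.
    conduct = account.uses_hateful_imagery or account.harassment_reports_upheld > 0
    return associated or conduct

if __name__ == "__main__":
    print(should_action(Account("@example", affiliations={"KKK"})))      # True
    print(should_action(Account("@other")))                              # False
```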
14:22
CA: How many people do you have working on content moderation to look at this?
14:26
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there are no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.
15:05
WPR: But there's not an amount of people that are scalable, but how many people do you currently have monitoring these accounts, and how do you figure out what's enough?
15:14
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources.

15:42
So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and to actually act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable so that people can actually understand themselves when something is against our terms and when something is not.

16:48
And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means push more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.
17:27
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?
17:55
JD: Well, first and foremost, my personal goal with the service is that I believe fundamentally that public conversation is critical. There are existential problems facing the world that are facing the entire world, not any one particular nation-state, that global public conversation benefits. And that is one of the unique dynamics of Twitter, that it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement in the work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part.

18:49
The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --
19:29
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?
19:56
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --
20:17
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.
20:49
(Applause)
20:54
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.
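One way to read "you can't just optimize around one metric" is as a composite objective that weighs conversation health alongside daily active usage, so outrage-driven engagement alone cannot move the number up. The weights and scaling below are invented for illustration and are not Twitter's actual metrics.

```python
# A hedged sketch of balancing engagement against conversation health rather
# than optimizing daily active usage alone. Weights and 0..1 scaling are
# assumptions made for this example.

from dataclasses import dataclass

@dataclass
class Snapshot:
    daily_active_usage: float     # normalized 0..1
    shared_attention: float       # the four indicators, each 0..1
    shared_reality: float
    receptivity: float
    variety_of_perspective: float

def health(s: Snapshot) -> float:
    """Average of the four conversational-health indicators."""
    return (s.shared_attention + s.shared_reality +
            s.receptivity + s.variety_of_perspective) / 4.0

def balanced_objective(s: Snapshot, health_weight: float = 0.7) -> float:
    # Weight health above raw usage so that outrage-driven engagement
    # cannot lift the objective while the conversation degrades.
    return health_weight * health(s) + (1.0 - health_weight) * s.daily_active_usage

if __name__ == "__main__":
    angry = Snapshot(0.9, 0.6, 0.3, 0.2, 0.4)   # high usage, unhealthy
    calm = Snapshot(0.6, 0.7, 0.7, 0.8, 0.6)    # lower usage, healthier
    print(round(balanced_objective(angry), 2), round(balanced_objective(calm), 2))
```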
21:43
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation, that we're on this great voyage with you on this ship called the "Twittanic" --

22:03
(Laughter)

22:04
and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

22:42
(Laughter)

22:44
(Applause)

22:46
I mean --

22:47
(Applause)

22:49
It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?
23:24
JD: Yes, and we have been moving substantially. I mean, there's been a few dynamics in Twitter's history. One, when I came back to the company, we were in a pretty dire state in terms of our future, and not just from how people were using the platform, but from a corporate narrative as well. So we had to fix a bunch of the foundation, turn the company around, go through two crazy layoffs, because we just got too big for what we were doing, and we focused all of our energy on this concept of serving the public conversation. And that took some work.

24:07
And as we dived into that, we realized some of the issues with the fundamentals. We could do a bunch of superficial things to address what you're talking about, but we need the changes to last, and that means going really, really deep and paying attention to what we started 13 years ago and really questioning how the system works and how the framework works and what is needed for the world today, given how quickly everything is moving and how people are using it. So we are working as quickly as we can, but quickness will not get the job done. It's focus, it's prioritization, it's understanding the fundamentals of the network and building a framework that scales and that is resilient to change, and being open about where we are and being transparent about where we are so that we can continue to earn trust. So I'm proud of all the frameworks that we've put in place. I'm proud of our direction. We obviously can move faster, but that required just stopping a bunch of stupid stuff we were doing in the past.
25:21
CA: All right. Well, I suspect there are many people here who, if given the chance, would love to help you on this change-making agenda you're on, and I don't know if Whitney -- Jack, thank you for coming here and speaking so openly. It took courage. I really appreciate what you said, and good luck with your mission.
25:39
JD: Thank you so much. Thanks for having me.
25:41
(Applause)

25:45
Thank you.