00:13
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?
00:23
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation. So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.
01:20
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women. And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that a subset of active black female Twitter users were receiving, on average, one in 10 of their tweets were some form of harassment. And so when you think about health for the community on Twitter, I'm interested to hear, "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?
02:05
JD: Yeah. So it's a pretty terrible situation when you're coming to a service that, ideally, you want to learn something about the world, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment. So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse.

So about midway last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.
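To make the workflow described here concrete, below is a minimal Python sketch of what a "proactive detection plus human review" pipeline could look like: a learned classifier scores every tweet, high-scoring tweets are queued for moderators, and nothing is taken down automatically. The class names, the threshold, and the stand-in scoring function are hypothetical illustrations, not Twitter's actual system.

```python
# Hypothetical sketch of proactive abuse detection feeding a human review queue.
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class Tweet:
    tweet_id: str
    text: str

@dataclass
class ReviewQueue:
    """Tweets flagged by the model, ordered by score, awaiting human moderators."""
    items: List[Tuple[float, Tweet]] = field(default_factory=list)

    def add(self, tweet: Tweet, score: float) -> None:
        self.items.append((score, tweet))
        self.items.sort(key=lambda pair: pair[0], reverse=True)  # likeliest abuse first

def triage(tweets: List[Tweet],
           abuse_score: Callable[[str], float],
           queue: ReviewQueue,
           threshold: float = 0.8) -> None:
    """Score every tweet proactively and enqueue likely abuse for human judgment.

    Nothing is removed here: a person reviews everything that gets flagged,
    matching the "no takedown without human review" constraint described above,
    and the victim never has to file a report themselves.
    """
    for tweet in tweets:
        score = abuse_score(tweet.text)   # in practice, a trained classifier
        if score >= threshold:
            queue.add(tweet, score)

# Toy usage with a stand-in "model".
if __name__ == "__main__":
    fake_model = lambda text: 0.9 if "abuse" in text.lower() else 0.1
    q = ReviewQueue()
    triage([Tweet("1", "have a nice day"), Tweet("2", "targeted abuse here")], fake_model, q)
    for score, tweet in q.items:
        print(f"needs human review: {tweet.tweet_id} (score={score:.2f})")
```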
03:46
The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls that actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it and also give our customers a much better and easier approach to handle some of the things that they're seeing.

So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.
05:05
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?
05:15
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift to bias the entire network away from just an account bias towards a topics and interest bias.
06:15
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious, like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?
07:12
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not. If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now. So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase?" And I don't believe that's the case right now.

(Applause)
08:30
WPR: I think we should look at some of the tweets that are coming in from the audience as well.
08:35
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.
08:53
WPR: I think one I saw that passed already quickly down here, "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections. And in this example we have from Zignal which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is.
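As a rough illustration of the visual encoding being described (each dot is an account, white for human-like activity, shading toward pink as the activity looks more automated), the following matplotlib sketch reproduces the idea on made-up data; the positions and automation scores are invented for the example and have nothing to do with the Zignal Labs dataset.

```python
# Illustrative only: random account positions and made-up automation scores,
# drawn so that a score of 0 renders white (human-like) and 1 renders pink (automated).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_accounts = 500
positions = rng.normal(size=(n_accounts, 2))   # stand-in for a network layout
automation = rng.beta(2, 5, size=n_accounts)   # 0 = human-like, 1 = bot-like

# Interpolate each dot's color between white and pink by its automation score.
white = np.array([1.0, 1.0, 1.0])
pink = np.array([1.0, 0.3, 0.6])
colors = (1 - automation)[:, None] * white + automation[:, None] * pink

plt.scatter(positions[:, 0], positions[:, 1], c=colors, edgecolors="gray", s=25)
plt.title("Accounts colored by automation score (illustrative data)")
plt.axis("off")
plt.show()
```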
09:47
And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some case influenced by this. And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?
10:23
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system.

And the first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil or the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation? And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier.

So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic, with some pretty high degree. We're working to measure the rest, and the next step is, as we build up solutions, to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.
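To make the four starter indicators concrete, here is a minimal sketch of how toy versions of them could be computed over a set of conversation messages. The message representation (a topic, the facts it relies on, a toxicity score, a stance label) and every formula below are simplifications invented for illustration; they are not Cortico's or Twitter's actual definitions.

```python
# Toy versions of the four indicators: shared attention, shared reality,
# receptivity, and variety of perspective. Fields and formulas are invented
# for illustration, not the real Cortico/Twitter metrics.
from collections import Counter
from dataclasses import dataclass
from typing import FrozenSet, List

@dataclass
class Message:
    topic: str                 # what the message is about
    facts: FrozenSet[str]      # factual claims it relies on
    toxicity: float            # 0 = civil, 1 = toxic (e.g. from a classifier)
    stance: str                # a coarse opinion label

def shared_attention(msgs: List[Message]) -> float:
    """Fraction of messages on the single most-discussed topic."""
    counts = Counter(m.topic for m in msgs)
    return counts.most_common(1)[0][1] / len(msgs)

def shared_reality(msgs: List[Message]) -> float:
    """Fraction of messages citing at least one fact that everyone else also cites."""
    common = frozenset.intersection(*(m.facts for m in msgs))
    return sum(1 for m in msgs if m.facts & common) / len(msgs)

def receptivity(msgs: List[Message]) -> float:
    """One minus average toxicity: higher means the exchange stays civil."""
    return 1.0 - sum(m.toxicity for m in msgs) / len(msgs)

def variety_of_perspective(msgs: List[Message]) -> float:
    """Distinct stances relative to message count (an echo chamber scores low)."""
    return len({m.stance for m in msgs}) / len(msgs)

msgs = [
    Message("climate", frozenset({"CO2 is rising"}), 0.1, "pro-policy"),
    Message("climate", frozenset({"CO2 is rising", "costs money"}), 0.2, "skeptical"),
    Message("climate", frozenset({"CO2 is rising"}), 0.6, "pro-policy"),
]
for fn in (shared_attention, shared_reality, receptivity, variety_of_perspective):
    print(fn.__name__, round(fn(msgs), 2))
```

The balance problem mentioned above shows up even in a toy like this: pushing variety_of_perspective up (more distinct stances) tends to pull shared_reality down (fewer facts held in common), so none of the four can be optimized in isolation.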
12:51
CA: Just picking up on some of the questions flooding in here.

12:56
JD: Constant questioning.

12:58
CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?
13:08
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately. We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?
14:22
CA: How many people do you have working on content moderation to look at this?
14:26
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there are no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.
15:05
WPR: But there's not an amount of people that are scalable, but how many people do you currently have monitoring these accounts, and how do you figure out what's enough?
15:14
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources.

So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and to actually act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable so that people can actually understand themselves when something is against our terms and when something is not.

And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means push more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.
17:27
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?
17:55
JD: Well, first and foremost, my personal goal with the service is that I believe fundamentally that public conversation is critical. There are existential problems facing the world that are facing the entire world, not any one particular nation-state, that global public conversation benefits. And that is one of the unique dynamics of Twitter, that it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement in the work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part.

The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --
19:29
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?
19:56
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --
20:17
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.

(Applause)
20:54
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.
21:43
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation, that we're on this great voyage with you on this ship called the "Twittanic" --

(Laughter)

and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

(Laughter)

(Applause)

I mean --

(Applause)

It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?
23:24
JD: Yes, and we have been moving substantially. I mean, there's been a few dynamics in Twitter's history. One, when I came back to the company, we were in a pretty dire state in terms of our future, and not just from how people were using the platform, but from a corporate narrative as well. So we had to fix a bunch of the foundation, turn the company around, go through two crazy layoffs, because we just got too big for what we were doing, and we focused all of our energy on this concept of serving the public conversation. And that took some work.

And as we dived into that, we realized some of the issues with the fundamentals. We could do a bunch of superficial things to address what you're talking about, but we need the changes to last, and that means going really, really deep and paying attention to what we started 13 years ago and really questioning how the system works and how the framework works and what is needed for the world today, given how quickly everything is moving and how people are using it. So we are working as quickly as we can, but quickness will not get the job done. It's focus, it's prioritization, it's understanding the fundamentals of the network and building a framework that scales and that is resilient to change, and being open about where we are and being transparent about where we are so that we can continue to earn trust. So I'm proud of all the frameworks that we've put in place. I'm proud of our direction. We obviously can move faster, but that required just stopping a bunch of stupid stuff we were doing in the past.
25:21
CA: All right. Well, I suspect there are many people here who, if given the chance, would love to help you on this change-making agenda you're on, and I don't know if Whitney -- Jack, thank you for coming here and speaking so openly. It took courage. I really appreciate what you said, and good luck with your mission.

25:39
JD: Thank you so much. Thanks for having me.

25:41
(Applause)

25:45
Thank you.