TikTok CEO Shou Chew on Its Future — and What Makes Its Algorithm Different | Live at TED2023

2,036,568 views

2023-04-21 ・ TED



Chris Anderson: It's very nice to have you here. Let's see. First of all, congratulations. You really pulled off something remarkable in that grilling. You achieved something that very few people do, which was, you pulled off a kind of bipartisan consensus in US politics. It was great.

(Laughter)

The bad news was that that consensus largely seemed to be: "We must ban TikTok." So we're going to come to that in a bit. And I'm curious, but before we go there, we need to know about you. You seem to me like a remarkable person. I want to know a bit of your story and how you came to TikTok in the first place.

Shou Chew: Thank you, Chris. Before we do that, can I just check -- I need to know my audience -- how many of you here use TikTok? Oh, thank you. For those who don't, the Wi-Fi is free.

(Laughter)
CA: There's another question, which is, how many of you here have had your lives touched through TikTok -- through your kids and other people in your lives?

SC: Oh, that's great to see.

CA: It's basically, if you're alive, you have had some kind of contact with TikTok at this point. So tell us about you.
SC: So my name is Shou, and I'm from Singapore. Roughly 10 years ago, I met with two engineers who were building a product. And the idea behind this was to build a product that recommended content to people not based on who they knew -- which was, if you think about it, 10 years ago, the social graph was all the rage. And the idea was, you know, your content and the feed that you saw should be based on people that you knew. But 10 years ago, these two engineers thought about something different, which is, instead of showing you people you knew, why don't we show you content that you liked? And that's sort of the genesis and the birth of the early iterations of TikTok. And about five years ago, with the advent of 4G, short video and mobile phone penetration, TikTok was born. And a couple of years ago, I had the opportunity to run this company, and it still excites me every single day.
CA: So I want to dig in a little more into this: what was it that made this take-off so explosive? Because the language I hear from people who spent time on it -- it's, I mean, a different level of addiction to other media out there. And I don't necessarily mean this in a good way; we'll be coming on to it. There's good and bad things about this type of addiction. But it's the feeling that within a couple of days of experience of TikTok, it knows you, and it surprises you with things that you didn't know you were going to be interested in -- but you are. How? Is it really just, instead of the social graph -- what are these algorithms doing?
SC: I think to describe this, to begin to answer your question, we have to talk about the mission of the company. Now, the mission is to inspire creativity and to bring joy. And I think missions for companies like ours are really important, because you have product managers working on the product every single day, and they need to have a North Star, you know, something to sort of work towards together. Now, based on this mission, our vision is to provide three things to our users. We want to provide a window to discover -- and I'll talk about discovery, you talked about this, in a second. We want to give them a canvas to create, which is going to be really exciting with new technologies in AI that are going to help people create new things. And the final thing is bridges for people to connect. So that's sort of the vision of what we're trying to build. Now, what really makes TikTok very unique and very different is the whole discovery engine behind it.
So there are earlier apps that I have a lot of respect for, but they were built for a different purpose. For example, in the era of search, you know, there was an app that was built for people who wanted to search things, so that they could be found more easily. And then, in the era of social graphs, it was about connecting people and their followers. Now, what we have done is that, based on our machine-learning algorithms, we're showing people what they like. And what this means is that we have given the everyday person a platform to be discovered. If you have talent, it is very, very easy to get discovered on TikTok. And I'll just give you one example of this. The biggest creator on TikTok is a guy called Khaby. Khaby is from Senegal; he lives in Italy; he was a factory worker. For the longest time, he didn't even speak in any of his videos. But he had talent. He was funny, he had a good expression, he had creativity, so he kept posting. And today he has 160 million followers on our platform. So every single day we hear stories like that -- businesses, people with talent. And I think it's very freeing to have a platform where, as long as you have talent, you're going to be heard and you have the chance to succeed. And that's what we're providing to our users.
CA: So this is the amazing thing to me. Like, most of us have grown up with, say, network television, where, for decades, you've had thousands of brilliant, creative people toiling in the trenches, trying to imagine stuff that will be amazing for an audience. And none of them ever remotely came up with anything that looked like many of your creators. So these algorithms, just by observing people's behavior and what they seem to like, have discovered things that thousands of brilliant humans never discovered. Tell me some of the things that it is looking at. So, obvious things, like if someone presses like or stays on a video for a long time, that gives you a clue: "more like that." But is it subject matter? What is the array of things that you have noticed you can actually track that provide useful clues?
SC: I'm going to simplify this a lot, but the machine learning, the recommendation algorithm, is really just math. So, for example, if you liked videos one, two, three and four, and I liked videos one, two, three and five, maybe he liked videos one, two, three and six. Now, what's going to happen is, because we like one, two, three at the same time, he's going to be shown four, five, six -- and so are we. And you can think about this repeated at scale, in real time, across more than a billion people. That's basically what it is: it's math. And of course, you know, AI and machine learning have allowed this to be done at a very, very big scale. And what we have seen, the result of this, is that it learns the interest signals that people exhibit very quickly and shows you content that's really relevant for you in a very quick way.
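Chew's "videos one, two, three" illustration can be sketched as a toy user-based collaborative filter: users whose liked videos overlap feed each other's remaining likes. This is a deliberate simplification of what he describes, not TikTok's actual system; the function and threshold below are hypothetical.

```python
# Toy user-based collaborative filtering, in the spirit of the
# "videos one, two, three" example. Purely illustrative.

def recommend(likes: dict[str, set[int]], user: str, min_overlap: int = 2) -> set[int]:
    """Recommend videos liked by users whose likes overlap with `user`."""
    seen = likes[user]
    recs: set[int] = set()
    for other, their_likes in likes.items():
        if other == user:
            continue
        if len(seen & their_likes) >= min_overlap:
            recs |= their_likes - seen  # their likes we haven't seen yet
    return recs

likes = {
    "you": {1, 2, 3, 4},
    "me":  {1, 2, 3, 5},
    "him": {1, 2, 3, 6},
}

print(recommend(likes, "you"))  # {5, 6}: surfaced via the shared likes 1, 2, 3
print(recommend(likes, "him"))  # {4, 5}
```

Real systems replace the hard overlap threshold with learned similarity scores, but the shape of the computation -- overlap in, unseen items out -- is the same.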
CA: So it's a form of collaborative filtering, from what you're saying. The theory behind it is that these humans are weird people; we don't really know what they're interested in. But if we see that one human's interests overlap with someone else's, chances are, you know, you could make use of the other pieces in that overlapping human's repertoire to feed them, and they'll be surprised. But the reason they like it is because their pal also liked it.

SC: It's pattern recognition based on your interest signals. And I think the other thing here is that we don't actually ask you 20 questions about whether you like a piece of content, you know, what your interests are -- we don't do that. We built that experience organically into the app experience. So you are voting with your thumbs: by watching a video, by swiping it, by liking it, by sharing it, you are basically exhibiting interest signals. And what it does mathematically is take those signals, put them in a formula and then match them through pattern recognition. That's basically the idea behind it.
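The "voting with your thumbs" idea -- watches, swipes, likes and shares as implicit interest signals -- can be sketched as a weighted per-topic score. The signal names and weights here are hypothetical, chosen only to illustrate the mechanism; production systems learn such weights rather than hand-pick them.

```python
# Sketch: turning implicit engagement events into per-topic interest
# scores. Signal weights are hypothetical illustrations.
from collections import defaultdict

WEIGHTS = {"watched_full": 1.0, "liked": 2.0, "shared": 3.0, "swiped_away": -1.0}

def interest_profile(events: list[tuple[str, str]]) -> dict[str, float]:
    """events: (topic, signal) pairs -> accumulated score per topic."""
    scores: defaultdict[str, float] = defaultdict(float)
    for topic, signal in events:
        scores[topic] += WEIGHTS[signal]
    return dict(scores)

events = [
    ("cooking", "watched_full"), ("cooking", "liked"),
    ("cooking", "shared"), ("sports", "swiped_away"),
]
profile = interest_profile(events)
print(max(profile, key=profile.get))  # "cooking" dominates this profile
```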
CA: I mean, lots of start-ups have tried to use these types of techniques. I'm wondering what else played a role early on. I mean, how big a deal was it that from the get-go you were optimizing for smartphones, so that videos were shot in portrait format and they were short? Was that an early distinguishing thing that mattered?
SC: I think we were the first to really try this at scale. You know, the recommendation algorithm is a very important reason why the platform is so popular among so many people. But beyond that, you know, you mentioned the format itself. So we talked about the vision of the company, which is to have a window to discover. And if you just open the app for the first time, you'll see that it takes up your whole screen. So that's the window that we want. You can imagine a lot of people using that window to discover new things in their lives. Then, you know, through this recommendation algorithm, we have found that it connects people together. People find communities, and I've heard so many stories of people who have found their communities because of the content that they're posting. Now, I'll give you an example. I was in DC recently, and I met with a bunch of creators.

CA: I heard.

(Laughter)

SC: One of them was sitting next to me at a dinner; his name is Samuel. He runs a restaurant in Phoenix, Arizona -- a taco restaurant. He told me he had never done this before; it was his first venture. He started posting all this content on TikTok, and I saw his content -- I was hungry after looking at it; it's great content. And he's generated so much interest in his business that last year he made something like a million dollars in revenue just via TikTok. One restaurant. And again and again, I hear these stories. You know, by connecting people together, by giving people the window to discover, we have given many small businesses and many people -- your common person -- a voice that they would never otherwise have. And I think that's the power of the platform.
CA: So you definitely identified early just how we're social creatures; we need affirmation. I've heard a story -- and you can tell me whether it's true or not -- that one of the keys to your early liftoff was that you wanted to persuade creators who were trying out TikTok that this was a platform where they would get a response. Early on, when you're trying to grow something, the numbers aren't there for a response. So you had the brilliant idea of goosing those numbers a bit -- basically finding ways to give people, you know, a bigger sense of, like, more likes, more engagement than was actually the case, by using AI agents somehow in the process. Is that a brilliant idea, or is that just a myth?
SC: I would describe it in a different way. So there are other platforms that existed before TikTok. And if you think about those platforms, you sort of have to be famous already in order to get followers, because the way they're built is that people come and follow people. And if you aren't already famous, the chances that you get discovered are very, very low. Now, what we have done -- again, because of the difference in the way we're recommending content -- is that we have given anyone, any single person with enough talent, a stage to be able to be discovered. And I think that actually is probably the single most important thing contributing to the growth of the platform. And again and again, you will hear stories from people who use the platform, who post regularly on it, that if they have something they want to say, the platform gives them the chance and the stage to connect with their audience in a way that I think no other product in the past has ever offered them.
CA: So I'm just trying to play back what you said there. You said you would describe what I said in a different way. Is it then the case that, like, to give someone a decent chance -- someone who's brilliant but doesn't come with any followers initially -- you've got some technique to identify talent, and you will almost encourage them, you will give them some kind of, you know, artificially increased number of followers or likes or whatever, so that others are encouraged to go, "Wow, there's something there"? Like, it's this idea of critical mass that kind of every entrepreneur, every party planner knows about: "No, no, this is the hot place in town, everyone come." Is that how you actually gain critical mass?
SC: We want to make sure that every person who posts a video is given an equal chance to be able to have some audience to begin with. But this idea that you are maybe alluding to -- that we can get people to like something -- it doesn't really work like that.

CA: Could you get AI agents to like something? Could you seed the network with extra AI agents that could, kind of, you know, give someone early encouragement?

SC: Ultimately, what the machine does is recognize people's interests. So if you post something that's not interesting to a lot of people, even if you gave it a lot of exposure, you're not going to get the virality that you want. So it's a lot of ... There is no push here. It's not like you can go and push something -- "because I like Chris, I'm going to push your content" -- it doesn't work like that. You've got to have a message that resonates with people, and if it does, then it will automatically just have the virality itself. That's the beauty of user-generated content. It's not something that can be engineered or over-thought. It really is something that has to resonate with the audience. And if it does, then it goes viral.
CA: Speaking privately with an investor who knows your company quite well, I was told that the level of sophistication of the algorithms you have going is just an order of magnitude beyond what competitors like, you know, Facebook or YouTube have going. Is that just hype, or do you really believe -- like, how complex are these algorithms?
SC: Well, I think in terms of complexity, there are many companies that have a lot of resources and a lot of talent. They will figure out even the most complex algorithms. I think what is very different is the mission of your company, how you started the company. Like I said, you know, we started with the idea that this was the main use case. The most important use case is: you come and you get to see recommended content. Now, some other apps out there are very significant and have a lot of users, but they were built for a different original purpose. And if you are built for something different, then your users are used to that, because the community comes in and they expect that sort of experience. So I think the pivot away from that is not really just a matter of engineering and algorithms; it's a matter of what your company was built for to begin with. Which is why I started this by saying you need to have a vision, you need to have a mission, and that's the North Star. You can't just shift it halfway.
CA: Right. And is it fair to say that because your start point has been interest algorithms rather than social graph algorithms, you've been able to avoid some of the worst of the filter bubbles that have happened in other social media, where you have tribes kind of declaring war on each other, effectively, and so much of the noise and energy is around that? Do you believe that you've largely avoided that on TikTok?
SC: The diversity of content that our users see is very key. You know, in order for the discovery -- the mission is to discover -- sorry, the vision is to discover. So in order to facilitate that, it is very important to us that what the users see is a diversity of content. Now, generally speaking, you know, there are certain issues that you mentioned that the industry faces. There are some bad actors who come on the internet and post bad content. Now, our approach is that we have very clear community guidelines. We're very transparent about what is allowed and what is not allowed on our platform. No executives make any ad hoc decisions. And based on that, we have built a team that is tens of thousands of people, plus machines, in order to identify content that is bad and actively and proactively remove it from the platform.

CA: Talk about what some of those key guidelines are.
SC: We have it published on our website. In March, we just iterated a new version to make it more readable. So there are many things: for example, no pornography; clearly, no child sexual abuse material and other bad things; no violence, for example. We also make it clear that it's a differentiated experience if you're below 18 years old. So if you're below 18 years old, for example, your entire app experience is actually more restricted. We don't allow, as an example, users below 16, by default, to go viral. We don't allow that. If you're below 16, we don't allow you to use the instant messaging feature in the app. If you're below 18, we don't allow you to use the livestreaming features. And of course, we give parents a whole set of tools to control their teenagers' experience as well.
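The age tiers Chew lists -- no viral distribution by default under 16, no instant messaging under 16, no livestreaming under 18 -- amount to a feature gate keyed on the user's age. The sketch below is a schematic reading of that policy as stated in the talk, not TikTok's implementation; the function and feature names are hypothetical.

```python
# Schematic feature gate for the age tiers described above.
# Thresholds follow the talk; the code itself is illustrative.

def allowed_features(age: int) -> dict[str, bool]:
    return {
        "eligible_for_viral_distribution": age >= 16,  # off by default under 16
        "instant_messaging": age >= 16,
        "livestreaming": age >= 18,
        "unrestricted_experience": age >= 18,  # under 18 is more restricted overall
    }

print(allowed_features(15))  # everything gated
print(allowed_features(17))  # messaging on, livestreaming still off
```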
CA: How do you know the age of your users?

SC: In our industry, we rely mainly on something called age gating, which is when you sign up for the app for the first time and we ask you for your age. Now, beyond that, we have also built tools to go through your public profile: for example, when you post a video, we try to match the age that you stated with the video that you just posted. Now, there are questions of: can we do more? And that question, for every company in our industry, by the way, always has to be balanced with privacy. Now, if, for example, we scanned the faces of every single user, then we would significantly increase the ability to tell their age. But we would also significantly increase the amount of data that we collect on you. Now, we don't want to collect that data. We don't want to scan your face to collect it. So that balance has to be maintained, and it's a challenge that we are working through, together with the industry and together with the regulators as well.
17:29
CA: So look, one thing that is unquestionable
380
1049920
2127
所以说,有一件事是不容置疑的,
17:32
is that you have created a platform for literally millions of people
381
1052047
3837
就是你们已经为几百万人创造了一个平台,
17:35
who never thought they were going to be a content creator.
382
1055884
2878
而他们从来没有想过 自己能成为内容创作者。
17:38
You've given them an audience.
383
1058804
1626
是你们为他们找来观众。
17:40
I'd actually like to hear from you one other favorite example
384
1060472
2878
我其实想听听你还有什么最喜欢的、
17:43
of someone who TikTok has given an audience to
385
1063391
2712
关于 TikTok 给一个完全的素人
17:46
that never had that before.
386
1066144
1460
找到观众的例子。
17:47
SC: So when again,
387
1067604
2252
周:之前提过,
17:49
when I travel around the world,
388
1069898
2002
当我去到世界各地的时候,
17:51
I meet with a whole bunch of creators on our platform.
389
1071942
2961
我见了很多我们平台上的创作者。
17:55
I was in South Korea just yesterday, and before that I met with -- yes,
390
1075320
5798
我昨天还在韩国, 在那之前,我见——
18:01
before that I met with a bunch of --
391
1081118
1751
对,我见了一群意想不到的人,
18:02
People don't expect, for example, teachers.
392
1082869
2044
比如说老师。
18:04
There is an English teacher from Arkansas.
393
1084955
3170
有一位来自阿肯色州的英语老师。
18:08
Her name is Claudine, and I met her in person.
394
1088125
3086
她叫克劳丁(Claudine),我亲自见到了她。
18:11
She uses our platform to reach out to students.
395
1091253
2669
她用我们的平台来接触学生。
18:14
There is another teacher called Chemical Kim.
396
1094381
3128
还有另一位老师叫“化学金”(Chemical Kim)。
18:17
And Chemical Kim teaches chemistry.
397
1097551
2752
“化学金”是教化学的。
18:20
What she does is she uses our platform
398
1100595
1877
她用我们的平台
18:22
to reach out to a much broader student base
399
1102514
2502
接触到了比在教室里接触到的
18:25
than she has in her classroom.
400
1105058
1543
广泛得多的学生群体。
18:26
And they're both very, very popular.
401
1106643
2085
她们都非常非常受欢迎。
18:28
You know, in fact,
402
1108770
1168
实际上,
18:29
what we have realized is that STEM content
403
1109938
4588
我们发现在我们的平台上, 理工科(STEM)内容
18:34
has over 116 billion views on our platform globally.
404
1114568
4546
在全球有超过 1160 亿的播放量。
18:39
And it's so significant --
405
1119114
1251
这个数字太大了——
18:40
CA: In a year?
406
1120407
1168
克里斯:一年内吗?
18:41
SC: Cumulatively.
407
1121575
1418
周:累计。
18:43
CA: [116] billion.
408
1123034
1210
克:1160 亿。
18:44
SC: It's so significant, that in the US we have started testing,
409
1124286
3461
周:这个数量太大了, 以至于我们开始在美国测试,
18:47
creating a feed just for STEM content.
410
1127789
3087
要建立一个专门做理工科内容的订阅源,
18:50
Just for STEM content.
411
1130876
1167
只做理工科。
18:52
I’ve been using it for a while, and I learned something new.
412
1132085
3128
我已经用了一段时间,学了一点新知识。
18:55
You want to know what it is?
413
1135255
1627
你知道是什么吗?
18:56
Apparently if you flip an egg on your tray,
414
1136923
3629
据说如果你给盒子里的鸡蛋翻面,
19:00
the egg will last longer.
415
1140552
2169
它就能保存得更久。
19:02
It's science,
416
1142721
1168
这是科学,
19:03
there’s a whole video on this, I learned this on TikTok.
417
1143930
2628
关于这个有一个完整的视频。 我从 TikTok 上学到的。
19:06
You can search for this.
418
1146558
1251
你们可以搜一搜。
19:07
CA: You want to know something else about an egg?
419
1147809
2294
克:你想知道关于鸡蛋 还有什么知识吗?
19:10
If you put it in just one hand and squeeze it as hard as you can,
420
1150145
3086
如果你用一只手握住它,无论多用力,
19:13
it will never break.
421
1153231
1168
都捏不碎它。
19:14
SC: Yes, I think I read about that, too.
422
1154441
1918
周:对,我记得我也读到过。
19:16
CA: It's not true.
423
1156359
1168
克:这是假的。
19:17
(Laughter)
424
1157569
1126
周:我可不确定,我们可以找出来看看。 (笑声)
19:18
SC: We can search for it.
425
1158737
1376
我们可以查查看。
19:21
CA: But look, here's the flip side to all this amazingness.
426
1161072
3504
克里斯:是的,但是你看, 这就是这个精彩世界的另一面。
19:24
And honestly, this is the key thing,
427
1164618
2210
而且说实话,这就是关键,
19:26
that I want to have an honest, heart-to-heart conversation with you
428
1166870
4504
我想和你进行一场诚恳的、交心的对话。
19:31
because it's such an important issue,
429
1171416
1794
因为这个问题太重要了,
19:33
this question of human addiction.
430
1173210
2460
这个人类成瘾的问题。
19:35
You know, we are ...
431
1175712
2377
你知道,我们是有前额皮质的动物。
19:38
animals with a prefrontal cortex.
432
1178131
3629
19:42
That's how I think of us.
433
1182594
1209
这是我对我们自身的看法。
19:43
We have these addictive instincts that go back millions of years,
434
1183803
4213
我们几百万年来一直保持着 这种容易成瘾的天性,
19:48
and we often are in the mode of trying to modulate our own behavior.
435
1188058
6673
我们经常处于一种 试图调节自身行为的模式中。
19:54
It turns out that the internet is incredibly good
436
1194731
4421
事实证明互联网极大地 激发了我们的动物细胞,
19:59
at activating our animal cells
437
1199194
2461
20:01
and getting them so damn excited.
438
1201696
2795
让它们兴奋得不行。
20:04
And your company, the company you've built,
439
1204532
2712
而我认为你们公司—— 你们构建的这个公司——
20:07
is better at it than any other company on the planet, I think.
440
1207244
5338
比世界上其他任何公司都更擅长这件事。
20:12
So what are the risks of this?
441
1212999
3003
那么这件事的风险是什么呢?
20:16
I mean, how ...
442
1216002
1752
我是说,怎么……
20:17
From a company point of view, for example,
443
1217796
3003
从公司的角度来看,比如说,
20:20
it's in your interest to have people on there as long as possible.
444
1220840
3212
让人尽可能长时间地停留在软件里 是你们的利益所在。
20:24
So some would say, as a first pass,
445
1224094
1960
所以有人会说,作为第一步,
20:26
you want people to be addicted as long as possible.
446
1226096
2794
你是想让他们沉迷越久越好,
20:28
That's how advertising money will flow and so forth,
447
1228932
3962
这样就能推动广告资金的流动,
20:32
and that's how your creators will be delighted.
448
1232936
2461
就能取悦你们的创作者们。
20:36
What is too much?
449
1236481
1376
什么是过量?
20:37
SC: I don't actually agree with that.
450
1237857
1794
周:我不同意这个观点。
20:40
You know, as a company,
451
1240402
1543
你知道,作为一个公司,
20:41
our goal is not to optimize and maximize time spent.
452
1241987
3211
我们的目标不是最大化用户花费的时间。
20:45
It is not.
453
1245240
1418
不是的。
20:46
In fact, in order to address people spending too much time on our platform,
454
1246700
4004
实际上,为了解决用户 使用平台时间过长的问题,
20:50
we have done a number of things.
455
1250704
1585
我们已经做了大量工作。
20:52
I was just speaking with some of your colleagues backstage.
456
1252289
2836
我刚刚还在后台跟你的同事聊天。
20:55
One of them told me she has encountered this as well.
457
1255166
3379
其中有一个人告诉我她也遇到过。
20:58
If you spend too much time on our platform,
458
1258586
2169
如果你在平台上花了太多时间,
21:00
we will proactively send you videos to tell you to get off the platform.
459
1260755
4004
我们会主动给你发视频, 督促你下线。
21:05
We will.
460
1265093
1251
我们会的。
21:06
And depending on the time of the day,
461
1266386
1793
而且这跟使用的时间点也有关,
21:08
if it's late at night, it will come sooner.
462
1268179
2002
如果是深夜,它会出现得更快。
21:10
We have also built in tools to limit,
463
1270223
1919
我们也在其中嵌入了限制工具,
21:12
if you're below 18 years old, by default,
464
1272142
3670
如果你不到 18 岁,
21:15
we set a 60-minute default time limit.
465
1275812
2377
我们会默认设置 60 分钟的时长限制。
21:18
CA: How many?
466
1278189
1168
克:多长时间?
21:19
SC: Sixty minutes.
467
1279399
1168
周:60 分钟。
21:20
And we've given parents tools and yourself tools,
468
1280608
2294
而且我们已经为父母们和你自己设计了工具,
21:22
if you go to settings, you can set your own time limit.
469
1282944
2628
你在设置里面可以设定自己的时长限制。
21:25
We've given parents tools so that you can pair,
470
1285572
2210
我们已经给父母提供了工具, 这样你们就可以配对——
21:27
for the parents who don't know this, go to settings, family pairing,
471
1287824
3212
那些不知道的父母,你们可以去设置里面, 找到家庭配对,
21:31
you can pair your phone with your teenager's phone
472
1291036
2335
你们可以将自己的手机和孩子的手机配对,
21:33
and set the time limit.
473
1293413
1168
然后设定时长限制。
21:34
And we really encourage parents to have these conversations with their teenagers
474
1294581
3795
而且我们真的鼓励父母们 去和孩子们好好谈谈这些事情,
21:38
on what is the right amount of screen time.
475
1298418
2044
关于合理的屏幕使用时长是多少。
21:40
I think there’s a healthy relationship that you should have with your screen,
476
1300503
3629
我认为你应该和你的屏幕 建立一个健康的关系,
21:44
and as a business, we believe that that balance needs to be met.
477
1304132
3003
作为企业,我们相信这种平衡是必要的。
21:47
So it's not true that we just want to maximize time spent.
478
1307135
3754
所以我们并不希望最大化使用时长。
21:50
CA: If you were advising parents here
479
1310889
2043
克:要是让你给在场的父母一个建议,
21:52
what time they should actually recommend to their teenagers,
480
1312974
3003
他们应该提议孩子们使用多久呢?
21:56
what do you think is the right setting?
481
1316019
1877
你觉得合适的设置是多少呢?
21:57
SC: Well, 60 minutes,
482
1317896
1168
周:嗯,60 分钟。
21:59
we did not come up with it ourselves.
483
1319105
1794
这个不是我们自己想出来的。
22:00
So I went to the Digital Wellness Lab at the Boston Children's Hospital,
484
1320899
3420
我去了波士顿儿童医院的数字健康实验室,
22:04
and we had this conversation with them.
485
1324319
1877
我们进行了一场交谈。
22:06
And 60 minutes was the recommendation that they gave to us,
486
1326237
2795
他们给我们的建议是 60 分钟,
22:09
which is why we built this into the app.
487
1329032
1918
所以我们把这个设置进了软件。
22:10
So 60 minutes, take it for what it is,
488
1330950
1877
所以 60 分钟,就是这样,
22:12
it’s something that we’ve had some discussions about with experts.
489
1332827
3295
这是我们和专家讨论的结果。
22:16
But I think for all parents here,
490
1336122
1585
但是我觉得对于父母们来说,
22:17
it is very important to have these conversations with your teenage children
491
1337749
4212
和孩子们进行沟通是非常重要的,
22:21
and help them develop a healthy relationship with screens.
492
1341961
4588
然后帮助他们培养和屏幕之间的健康关系。
22:26
I think we live in an age where it's completely inevitable
493
1346549
3087
我认为在我们生活的这个时代, 这是无法避免的,
22:29
that we're going to interact with screens and digital content,
494
1349636
4629
我们一定会跟屏幕和电子内容发生互动。
22:34
but I think we should develop healthy habits early on in life,
495
1354307
3504
但是我认为,我们应该从小就 培养健康的习惯,
22:37
and that's something I would encourage.
496
1357811
1918
这个是我会鼓励的。
22:39
CA: Curious to ask the audience,
497
1359729
2795
克:我很好奇,想问问观众,
22:42
which of you who have ever had that video on TikTok appear
498
1362565
3629
你们中有谁曾经碰到过 TikTok 上的那个视频,
22:46
saying, “Come off.”
499
1366236
1501
写着“下线”。
22:48
OK, I mean ...
500
1368655
1626
好的,我是说……
22:50
So maybe a third of the audience seem to be active TikTok users,
501
1370949
3128
那这样看起来,大概三分之一的观众 是 TikTok 的活跃用户,
22:54
and about 20 people maybe put their hands up there.
502
1374119
4254
大概 20 个人举了手。
22:58
Are you sure that --
503
1378957
1918
你确定——
23:00
like, it feels to me like this is a great thing to have,
504
1380875
4713
比如,对我来说它很棒,
23:05
but are you ...
505
1385630
1543
但是你会不会……
23:07
isn't there always going to be a temptation
506
1387215
2002
不是总会受到一些诱惑吗?
23:09
in any given quarter or whatever,
507
1389217
2252
在任何一个季度或别的什么时候,
23:11
to just push it a bit at the boundary
508
1391469
2169
可能只是突破那么一点界限,
23:13
and just dial back a bit on that
509
1393680
2210
然后只是稍微松松绑,
23:15
so that you can hit revenue goals, etc?
510
1395932
3170
你就能达到收益目标等等。
23:19
Are you saying that this is used scrupulously?
511
1399727
2878
你是说这个是严格执行的吗?
23:22
SC: I think, you know, in terms ...
512
1402647
2920
周:我认为,你知道,从——
23:25
Even if you think about it from a commercial point of view,
513
1405608
2795
即使从商业角度来思考这个问题,
23:28
it is always best when your customers have a very healthy relationship
514
1408403
3295
最理想的情况仍然是你的顾客与你的产品
23:31
with your product.
515
1411698
1168
有一个非常健康的关系。
23:32
It's always best when it's healthy.
516
1412907
1710
健康永远是最好的状态。
23:34
So if you think about very short-term retention, maybe,
517
1414617
2962
所以如果只是考虑非常短期的客户留存, 那也许可以这样。
23:37
but that's not the way we think about it.
518
1417579
2335
但我们不是这样想的。
23:40
If you think about it from a longer-term perspective,
519
1420290
2502
如果你从长远角度来考虑的话,
23:42
what you really want to have is a healthy relationship, you know.
520
1422792
3087
你真正想要的是一个健康的关系。
23:45
You don’t want people to develop very unhealthy habits,
521
1425879
2752
你不希望人们养成非常不健康的习惯,
23:48
and then at some point they're going to drop it.
522
1428631
2461
那总有一天他们就不用这个产品了。
23:51
So I think everything in moderation.
523
1431134
2419
所以我认为一切都要适可而止。
23:53
CA: There's a claim out there that in China,
524
1433595
3003
克:有意见说,在中国,
23:56
there are much more rigorous standards imposed on the amount of time
525
1436598
4462
使用时长有更严格的标准,
24:01
that children, especially, can spend on the TikTok equivalent of that.
526
1441060
5297
特别是儿童在 TikTok 的同类应用上能花费的时间。
24:06
SC: That is unfortunately a misconception.
527
1446357
4713
周:很不幸这是个误解。
24:11
So that experience that is being mentioned for Douyin,
528
1451070
3129
他们提到的那个模式是在抖音上——
24:14
which is a different app,
529
1454199
1251
它是一个不同的应用软件——
24:15
is for an under 14-year-old experience.
530
1455450
2627
那是为 14 岁以下的用户设计的。
24:18
Now, if you compare that in the United States,
531
1458495
2293
如果和美国类比,
24:20
we have an under-13 experience in the US.
532
1460788
2503
我们有 13 岁以下模式。
24:23
It's only available in the US, it's not available here in Canada,
533
1463333
3086
它只适用于美国,在加拿大这里不适用。
24:26
in Canada, we just don't allow it.
534
1466419
1668
在加拿大,我们直接禁止使用了。
24:28
If you look at the under-13 experience in the US,
535
1468129
2378
如果你去看美国的 13 岁以下模式,
24:30
it's much more restricted than the under-14 experience in China.
536
1470548
3754
它要比中国 14 岁以下模式的限制多得多。
24:34
It's so restrictive,
537
1474344
1334
它的限制非常严格,
24:35
that every single piece of content is vetted
538
1475720
3003
以至于每一项内容
24:38
by our third-party child safety expert.
539
1478723
4296
都要经过第三方儿童安全专家的审核。
24:43
And we don't allow any under-13s in the US to publish,
540
1483436
4463
而且在美国,我们不允许 任何 13 岁以下的用户发布内容,
24:47
we don’t allow them to post,
541
1487899
1418
我们不允许他们上传,
24:49
and we don't allow them to use a lot of features.
542
1489317
2336
也禁用了很多功能。
24:51
So I think that that report, I've seen that report too,
543
1491653
2627
所以我认为那篇报道—— 我也看了那篇报道——
24:54
it's not doing a fair comparison.
544
1494280
1710
并不是在做一个公平的比较。
24:56
CA: What do you make of this issue?
545
1496324
1710
克:你对这个问题怎么看?
24:58
You know, you've got these millions of content creators
546
1498034
2878
你知道,你们已经有了几百万内容创作者,
25:00
and all of them, in a sense, are in a race for attention,
547
1500954
4045
而从某种意义上来说, 他们所有人都在争夺关注,
25:05
and that race can pull them in certain directions.
548
1505041
2920
这种竞争会把他们推向特定的方向。
25:07
So, for example, teenage girls on TikTok,
549
1507961
4296
比如说,TikTok 上的青少年女孩,
25:12
sometimes people worry that, to win attention,
550
1512257
3628
有时候人们会担心,为了赢得关注,
25:15
they've discovered that by being more sexual
551
1515885
2086
她们发现通过发布性相关内容,
25:18
that they can gain extra viewers.
552
1518012
1669
能获得更多播放量。
25:20
Is this a concern?
553
1520098
1376
这会是一个问题吗?
25:21
Is there anything you can do about this?
554
1521474
2252
你们有什么对策吗?
25:23
SC: We address this in our community guidelines as well.
555
1523768
3295
周:这个问题也在我们的社区准则里解答了。
25:28
You know, if you look at sort of the sexualized content on our guidelines,
556
1528147
4130
你知道,如果你看一下我们的准则里 关于性相关内容的描述,
25:32
if you’re below a certain age,
557
1532318
1543
如果你不到某个年龄,
25:33
you know, for certain themes that are mature,
558
1533903
3253
我们可以把某些过于成熟的内容
25:37
we actually remove that from your experience.
559
1537198
2127
从你的模式中剔除。
25:39
Again, I come back to this,
560
1539325
1543
再说回这个,
25:40
you know, we want to have a safe platform.
561
1540910
2127
你知道,我们想搭建一个安全的平台。
25:43
In fact, at my congressional hearing,
562
1543037
1794
实际上,在国会听证会上,
25:44
I made four commitments to our users and to the politicians in the US.
563
1544872
3754
我对我们的用户和美国的政治家们 做出了四个承诺。
25:48
And the first one is that we take safety, especially for teenagers,
564
1548626
4338
第一个就是,我们会极其重视安全——
25:53
extremely seriously,
565
1553006
1167
尤其是青少年的安全,
25:54
and we will continue to prioritize that.
566
1554215
2419
我们会一直把它放在首位。
25:56
You know, I believe that we need to give our teenage users,
567
1556634
3796
我相信我们需要提供给我们的青少年用户、
26:00
and our users in general,
568
1560471
1460
以及整个用户群体
26:01
a very safe experience,
569
1561973
1168
一个非常安全的体验。
26:03
because if we don't do that,
570
1563182
1502
因为如果我们不这样做的话,
26:04
then we cannot fulfill --
571
1564684
1293
就不能履行——
26:05
the mission is to inspire creativity and to bring joy.
572
1565977
2919
公司的使命是“激发创造,带来愉悦”。
26:08
If they don't feel safe, I cannot fulfill my mission.
573
1568938
2836
如果他们觉得不安全, 那我就没法完成我的使命。
26:11
So it's all very organic to me as a business
574
1571816
2753
所以对我来说,作为一个企业,
26:14
to make sure I do that.
575
1574569
1293
做到这一点是非常自然的。
26:15
CA: But in the strange interacting world of human psychology and so forth,
576
1575862
3503
克:但是在人类心理造就的光怪陆离、 彼此牵扯的世界中,
26:19
weird memes can take off.
577
1579365
1418
怪诞的梗图可以迅速火爆。
26:20
I mean, you had this outbreak a couple years back
578
1580825
2461
我是说几年前你们就曾爆发过这种情况,
26:23
with these devious licks where kids were competing with each other
579
1583328
3420
就是“devious licks”(狡猾盗窃)挑战,孩子们互相竞争,
26:26
to do vandalism in schools and, you know,
580
1586789
2253
在学校里搞破坏,
26:29
get lots of followers from it.
581
1589083
1585
以此获得很多粉丝。
26:30
How on Earth do you battle something like that?
582
1590710
2461
你到底要怎么去抵抗这样的事情呢?
26:33
SC: So dangerous challenges are not allowed on our platform.
583
1593963
2878
周:危险挑战在我们的平台上是被禁止的。
26:36
If you look at our guidelines, it's violative.
584
1596883
2961
如果你去看看我们的准则, 就会发现这是违例的。
26:39
We proactively invest resources to identify them
585
1599886
3336
我们主动地花费资源去识别它们,
26:43
and remove them from our platform.
586
1603222
1794
然后把它们从平台上移除。
26:45
In fact, if you search for dangerous challenges on our platform today,
587
1605058
3336
实际上,如果你现在从平台上搜索危险挑战,
26:48
we will redirect you to a safety resource page.
588
1608394
2878
我们会把你导向一个安全资源页面。
26:51
And we actually worked with some creators as well to come up with campaigns.
589
1611314
3587
而且我们其实也和一些创作者 合作发起了活动。
26:54
This is another campaign.
590
1614901
1501
这是另外一个活动,
26:56
It's the "Stop, Think, Decide Before You Act" campaign
591
1616444
2711
它是“停一停,想一想,三思而后行”活动。
26:59
where we work with the creators to produce videos,
592
1619155
2419
我们和创作者一起制作视频,
27:01
to explain to people that some things are dangerous,
593
1621574
2920
告诉大家那些事情是危险的,
27:04
please don't do it.
594
1624535
1210
请不要做。
27:05
And we post these videos actively on our platform as well.
595
1625787
3128
然后我们也积极地 把这些视频发布在平台上,
27:09
CA: That's cool.
596
1629791
1209
克:那很酷。
27:11
And you've got lots of employees.
597
1631042
1877
而且你们还有很多员工,
27:12
I mean, how many employees do you have
598
1632960
1836
我想问你们有多少员工?
27:14
who are specifically looking at these content moderation things,
599
1634796
3753
专门负责这些内容审核工作的——
27:18
or is that the wrong question?
600
1638549
1502
我不知道我问得对不对。
27:20
Are they mostly identified by AI initially
601
1640093
2377
还是说它们大多数是先由 AI 识别,
27:22
and then you have a group who are overseeing
602
1642512
2961
然后你们有一个团队在监督,
27:25
and making the final decision?
603
1645515
1585
并做出最终的决定?
27:27
SC: The group is based in Ireland and it's a lot of people,
604
1647100
4045
周:团队在爱尔兰,有很多人,
27:31
it's tens of thousands of people.
605
1651187
1585
大概有几万人。
27:32
CA: Tens of thousands?
606
1652772
1168
克:几万人?
27:33
SC: It's one of the most important cost items on my PnL,
607
1653981
4004
周:这是我的损益表上 最重要的成本项目之一,
27:38
and I think it's completely worth it.
608
1658027
1794
而且我觉得它完全值得。
27:39
Now, most of the moderation has to be done by machines.
609
1659821
2836
大多数内容审核都必须由机器来做。
27:42
The machines are good, they're quite good,
610
1662657
2627
机器很好,它们相当好。
27:45
but they're not as good as, you know,
611
1665326
1794
但是你知道,它们还是比不上……
27:47
they're not perfect at this point.
612
1667120
1668
它们在这一点上还不够完美。
27:48
So you have to complement them with a lot of human beings today.
613
1668788
3211
所以现在你不得不 用很多人力去完成这件事。
27:51
And I think, by the way, a lot of the progress in AI in general
614
1671999
4380
另外我认为,整个 AI 技术的巨大进步,
27:56
is making that kind of content moderation capabilities a lot better.
615
1676421
4004
正在大幅提升这类内容审核能力。
28:00
So we're going to get more precise.
616
1680466
1877
所以我们得到的精准性会更高,
28:02
You know, we’re going to get more specific.
617
1682385
2002
会更具体。
28:04
And it’s going to be able to handle larger scale.
618
1684429
4296
而且它会拥有大规模处理的能力,
28:08
And that's something I think that I'm personally looking forward to.
619
1688766
4088
这是我个人非常期待的。
28:13
CA: What about this perceived huge downside
620
1693813
4296
克:还有一个普遍认识到的大问题——
28:18
of use of, certainly Instagram, I think TikTok as well.
621
1698151
3670
当然主要指 Instagram, 但是我认为 TikTok 也是这样。
28:21
What people worry that you are amplifying insecurities,
622
1701863
3712
人们担心你们在加剧不安,
28:25
especially of teenagers
623
1705575
1626
特别是对青少年来说,
28:27
and perhaps especially of teenage girls.
624
1707243
1919
可能尤其是青春期少女。
28:29
They see these amazing people on there doing amazing things,
625
1709162
3920
他们看了平台上这些出色的人与事物,
28:33
they feel inadequate,
626
1713124
1251
会感到自卑,
28:34
there's all these reported cases of depression, insecurity,
627
1714417
3545
有很多关于抑郁、不安、
28:38
suicide and so forth.
628
1718004
1752
甚至自杀等等的案例报道,
28:39
SC: I take this extremely seriously.
629
1719797
1835
周:我非常重视这个问题。
28:42
So in our guidelines,
630
1722967
2377
在我们的准则中,
28:45
for certain themes that we think are mature and not suitable for teenagers,
631
1725386
5089
对于青少年来说过于成熟和不合适的内容,
28:50
we actually proactively remove it from their experience.
632
1730516
3254
其实我们都主动从用户体验中移除了。
28:54
At the same time, if you search certain terms,
633
1734395
2544
同时,如果你搜索特定的字词,
28:56
we will make sure that you get redirected to a resource safety page.
634
1736981
4087
我们会确保把你导向一个资源安全页面。
29:01
Now we are always working with experts to understand some of these new trends
635
1741068
4338
我们一直在与专家合作研究
29:05
that could emerge
636
1745406
1168
这些可能出现的新趋势,
29:06
and proactively try to manage them, if that makes sense.
637
1746616
3795
然后主动地去应对它们,不知道这样说是否清楚。
29:10
Now, this is a problem that predates us,
638
1750411
2961
这是一个比我们公司更古老的问题,
29:13
that predates TikTok.
639
1753372
1168
它先于 TikTok 就存在了,
29:14
It actually predates the internet.
640
1754540
1710
它其实在互联网出现之前就存在了。
29:16
But it's our responsibility to make sure
641
1756250
2378
但是我们有责任确保
29:18
that we invest enough to understand and to address the concerns,
642
1758628
3253
我们付出了足够的努力 去理解和解决这些问题,
29:21
to keep the experience as safe as possible
643
1761923
2043
最大程度地保证他们用户体验的安全性,
29:23
for as many people as possible.
644
1763966
1502
照顾到尽可能多的人。
29:26
CA: Now, in Congress,
645
1766302
1210
克:那么,在国会上,
29:27
the main concern seemed to be not so much what we've talked about,
646
1767512
3461
最主要的担忧好像不太是 我们刚刚讲的那些,
29:30
but data, the data of users,
647
1770973
2461
而是数据——用户数据,
29:33
the fact that you're owned by ByteDance, Chinese company,
648
1773476
3253
是你们属于字节跳动——一个中国公司,
29:36
and the concern that at any moment
649
1776771
2753
他们担心中国政府可能随时会 要求拿到那些数据。
29:39
Chinese government might require or ask for data.
650
1779524
3670
29:43
And in fact, there have been instances
651
1783194
1835
而且事实上,已经有一些这样的例子了,
29:45
where, I think you've confirmed,
652
1785029
1543
我想你们已经确认,
29:46
that some data of journalists on the platform
653
1786572
3963
平台上一些记者的用户数据,
29:50
was made available to ByteDance's engineers
654
1790576
3003
被字节跳动的工程师获取,
29:53
and from there, who knows what.
655
1793579
2211
然后从那里——谁知道会怎么样呢?
29:56
Now, your response to this was to have this Project Texas,
656
1796165
4254
你对这件事的回应是 “德克萨斯计划”(Project Texas)——
30:00
where you're moving data to be controlled by Oracle here in the US.
657
1800461
4755
把数据转移到美国这边, 由甲骨文公司(Oracle)控制,
30:05
Can you talk about that project and why, if you believe it so,
658
1805258
5130
你能讲讲这个计划吗? 还有为什么——如果你认为是这样的话——
30:10
why we should not worry so much about this issue?
659
1810429
3003
为什么我们不用担心这个问题?
30:13
SC: I will say a couple of things about this, if you don't mind.
660
1813432
3045
周:如果你不介意的话, 关于这个我有好几点要说。
30:16
The first thing I would say is that the internet is built
661
1816519
2711
首先,我要说 互联网是基于全球互通性搭建的。
30:19
on global interoperability,
662
1819230
1501
30:20
and we are not the only company that relies on the global talent pool
663
1820773
4171
我们不是唯一一家依赖全球人才的公司,
30:24
to make our products as good as possible.
664
1824944
1960
我们要让产品尽可能好。
30:26
Technology is a very collaborative effort.
665
1826946
2002
科技是非常需要协作努力的,
30:28
I think many people here would say the same thing.
666
1828990
3461
我觉得很多人都会这么说。
30:32
So we are not the first company to have engineers in all countries,
667
1832493
3170
所以我们不是第一家 从世界各地——包括中国——
30:35
including in China.
668
1835663
1168
雇佣工程师的公司,
30:36
We're not the first one.
669
1836873
1167
我们不是第一家。
30:38
Now, I understand some of these concerns.
670
1838082
2377
那我理解其中一些担忧。
30:40
You know, the data access by employees is not data accessed by government.
671
1840459
4463
你要知道,员工能访问数据, 并不等于政府能访问数据。
30:44
This is very different, and there’s a clear difference in this.
672
1844922
2962
这是非常不同的, 而且这二者之间有明确的区别。
30:47
But we hear the concerns that are raised in the United States.
673
1847925
2962
但是我们听到了美国提出的担忧,
30:50
We did not try to avoid discussing.
674
1850928
3420
我们没有要回避讨论。
30:54
We did not try to argue our way out of it.
675
1854390
2878
我们没有试图通过辩驳来解决问题,
30:57
What we did was we built an unprecedented project
676
1857310
2836
我们的方式是构建一个史无前例的计划,
31:00
where we localize American data to be stored on American soil
677
1860187
4171
将美国的数据本地化,储存在美国本土,
31:04
by an American company overseen by American personnel.
678
1864400
4004
由美国公司负责,受美国人的监督。
31:08
So this kind of protection for American data
679
1868404
3212
这种对美国数据的保护,
31:11
is beyond what any other company in our industry has ever done.
680
1871657
4213
是我们行业中任何其他公司都没有做到的。
31:16
Well, money is not the only issue here,
681
1876454
2544
钱不是这其中唯一的问题,
31:19
but it's very expensive to build something like that.
682
1879040
2502
搭建这种东西非常昂贵。
31:21
And more importantly, you know,
683
1881584
1626
更重要的是,
31:23
we are basically localizing data in a way that no other company has done.
684
1883252
5673
我们是在用没有其他公司用过的办法, 将数据本地化。
31:28
So we need to be very careful that whilst we are pursuing
685
1888966
3963
所以当我们在美国追求所谓的数字主权时, 我们需要非常小心。
31:32
what we call digital sovereignty in the US
686
1892929
2585
31:35
and we are also doing a version of this in Europe,
687
1895556
2503
我们在欧洲也在做类似的事情,同时注意不能让互联网巴尔干化。
31:38
that we don't balkanize the internet.
688
1898100
1794
31:39
Now we are the first to do it.
689
1899936
1459
我们是第一个做这件事的公司。
31:41
And I expect that, you know,
690
1901437
1460
我希望,你知道,
31:42
other companies are probably looking at this
691
1902897
2085
其他公司或许会看到,
31:45
and trying to figure out how you balance protecting data,
692
1905024
4630
然后试着研究出 在数据保护中找到平衡的方法,
31:49
you know, to make sure that everybody feels secure about it
693
1909654
2836
以此确保每个人的安全感,
31:52
while at the same time allowing for interoperability
694
1912490
2586
同时又能继续保持互通性,
31:55
to continue to happen,
695
1915117
1126
31:56
because that's what makes technology and the internet so great.
696
1916285
3003
因为是它让科技和互联网如此伟大。
31:59
So that's something that we are doing.
697
1919330
1835
这个是我们正在做的事情。
32:01
CA: How far are you along that journey with Project Texas?
698
1921207
2794
克:你们的“德克萨斯计划”目前走了多远?
32:04
SC: We are very, very far along today.
699
1924043
1835
周:我们现在已经走了很远很远了。
32:05
CA: When will there be a clear you know,
700
1925920
3920
克:什么时候会有一个明确的——
32:09
here it is, it’s done, it’s firewalled, this data is protected?
701
1929882
3087
类似于——行了,完成了, 防火墙建起来了,数据保护好了。
32:12
SC: Today, by default, all new US data
702
1932969
3086
周:默认情况下,现在所有新的美国数据
32:16
is already stored in the Oracle cloud infrastructure.
703
1936055
2502
已经储存在甲骨文公司的云端基础架构里了。
32:18
So it's in this protected US environment that we talked about in the United States.
704
1938557
4713
所以它就存放在我们刚才谈到的、受保护的美国环境里。
32:23
We still have some legacy data to delete in our own servers in Virginia
705
1943270
3963
还有一些储存在我们弗吉尼亚 和新加坡服务器中的旧数据需要删除。
32:27
and in Singapore.
706
1947274
1168
32:28
Our data has never been stored in China, by the way.
707
1948484
2461
顺便说一下,我们的数据从未在中国储存过。
32:31
That deletion is a very big engineering effort.
708
1951487
2211
这种删除是一项非常大的工程。
32:33
So as we said, as I said at the hearing,
709
1953739
2378
所以像我们说过的那样—— 像我在听证会上说的那样——
32:36
it's going to take us a while to delete them,
710
1956117
2502
这是需要一段时间的。
32:38
but I expect it to be done this year.
711
1958619
2294
但是我预计今年可以完成。
32:43
CA: How much power do you have
712
1963791
3045
克:你有多大的权力
32:46
over your own ability to control certain things?
713
1966877
2837
去控制某些事情?
32:49
So, for example, suppose that, for whatever reason,
714
1969714
2961
比如说,假设——不管什么原因——
32:52
the Chinese government was to look at an upcoming US election and say,
715
1972717
4421
中国政府看着即将到来的美国大选,
32:57
"You know what, we would like this party to win," let's say,
716
1977138
4754
说“你知道吗,我们希望这个党赢,” ——比如说——
33:01
or "We would like civil war to break out" or whatever.
717
1981934
2711
或者“我们希望内战爆发”或者别的什么。
33:05
How ...
718
1985104
1376
要怎么……
33:06
"And we could do this
719
1986522
1418
“我们可以通过
33:07
by amplifying the content of certain troublemaking, disturbing people,
720
1987982
4212
放大某些惹是生非、扰乱人心者的内容,
33:12
causing uncertainty, spreading misinformation," etc.
721
1992236
2795
制造不安局面,传播错误信息,”等等。
33:16
If you were required via ByteDance to do this,
722
1996032
4963
如果你被字节跳动要求做这个,
33:21
like, first of all, is there a pathway where theoretically that is possible?
723
2001037
4421
比如,首先, 这个在理论上是有途径发生的吗?
33:26
What's your personal line in the sand on this?
724
2006542
3462
你的个人底线是什么?
33:30
SC: So during the congressional hearing,
725
2010046
2544
周:在国会听证会上,
33:32
I made four commitments,
726
2012631
1585
我做出了四项承诺。
33:34
we talked about the first one, which is safety.
727
2014216
2211
我们刚才讲了第一个——安全。
33:36
The third one is to keep TikTok a place of freedom of expression.
728
2016469
3420
那第三个承诺就是保证 TikTok 是一个言论自由的平台。
33:39
By the way, if you go on TikTok today,
729
2019930
1836
顺便如果你现在去 TikTok,
33:41
you can search for anything you want,
730
2021766
1793
你可以搜索任何东西,
33:43
as long as it doesn't violate our community guidelines.
731
2023559
2753
只要它不违反我们的社区准则。
33:46
And to keep it free from any government manipulation.
732
2026312
3003
其次为了让它不受到任何政府操纵,
33:49
And the fourth one is transparency and third-party monitoring. So the way we are trying to address this concern is an unprecedented amount of transparency. What do I mean by this? We're actually allowing third-party reviewers to come in and review our source code. I don't know any other company that does this, by the way. Because everything, as you know, is driven by code. So to allow someone else to review the source code is to give this a significant amount of transparency, to ensure that the scenarios that you described, that are highly hypothetical, cannot happen on our platform.

34:23
Now, at the same time, we are releasing more research tools for researchers so that they can study the output. So the source code is the input. We are also allowing researchers to study the output, which is the content on our platform. I think the easiest way to sort of fend this off is transparency. You know, we give people access to monitor us, and we just make it very, very transparent. And that's our approach to the problem.
34:47
CA: So you will say directly to this group that the scenario I talked about, of theoretical Chinese government interference in an American election, you can say that will not happen?
34:59
SC: I can say that we are building all the tools to prevent any of these actions from happening. And I'm very confident that with an unprecedented amount of transparency that we're giving on the platform, we can reduce this risk to as low as zero as possible.
35:17
CA: To as low as zero as possible.

SC: To as close to zero as possible.

CA: As close to zero as possible. That's fairly reassuring. Fairly.

(Laughter)
35:33
I mean, how would the world know? If you discovered this or you thought you had to do it, is this a line in the sand for you? Like, are you in a situation you would not let the company that you know now and that you are running do this?
35:46
SC: Absolutely. That's the reason why we're letting third parties monitor, because if they find out, you know, they will disclose this. We also have transparency reports, by the way, where we talk about a whole bunch of things: the content that we remove, you know, that violates our guidelines, government requests. You know, it's all published online. All you have to do is search for it.
36:05
CA: So you're super compelling and likable as a CEO, I have to say. And I would like to, as we wrap this up, I'd like to give you a chance just to paint, like, what's the vision? As you look at what TikTok could be, let's move the clock out, say, five years from now. How should we think about your contribution to our collective future?
36:26
SC: I think it's still down to the vision that we have. So in terms of the window of discovery, I think there's a huge benefit to the world when people can discover new things. You know, people think that TikTok is all about dancing and singing, and there's nothing wrong with that, because it's super fun. There's still a lot of that, but we're seeing science content, STEM content -- have you heard about BookTok? It's a viral trend that talks about books and encourages people to read. That BookTok has 120 billion views globally. 120 billion.
36:58
CA: Billion, with a B.
36:59
SC: People are learning how to cook, people are learning about science, people are learning how to golf -- well, people are watching videos on golfing, I guess.

(Laughter)

I haven't gotten better by looking at the videos. I think there's a huge, huge opportunity here on discovery and giving the everyday person a voice. If you talk to our creators, you know, a lot of people will tell you this again and again, that before TikTok, they would never have been discovered. And we have given them the platform to do that. And it's important to maintain that.
37:30
Then we talk about creation. You know, there's all this new technology coming in with AI-generated content that will help people create even more creative content. I think there's going to be a collaboration between, and I think there's a speaker who is going to talk about this, between people and AI, where they can unleash their creativity in a different way. You know, like for example, I'm terrible at drawing personally, but if I had some AI to help me, then maybe I can express myself even better.
37:57
Then we talk about bridges to connect, and connecting people and the communities together. This could be products, this could be commerce; five million businesses in the US benefit from TikTok today. I think we can get that number to a much higher number. And of course, if you look around the world, including in Canada, that number is going to be massive. So I think these are the biggest opportunities that we have, and it's really very exciting.
38:21
CA: So courtesy of your experience in Congress, you actually became a bit of a TikTok star yourself, I think. Some of your videos have gone viral. You've got your phone with you. Do you want to make a little, little TikTok video right now? Let's do this.
38:36
SC: If you don't mind ...

CA: What do you think, should we do this?

SC: We're just going to do a selfie together, how's that? So why don't we just say "Hi."

Hi!

Audience: Hi!
38:46
CA: Hello from TED.

SC: All right, thank you, I hope it goes viral.

(Laughter)

CA: If that one goes viral, I think I've given up on your algorithm, actually.

(Laughter)
39:00
Shou Chew, you're one of the most influential and powerful people in the world, whether you know it or not. And I really appreciate you coming and sharing your vision. I really, really hope the upside of what you're talking about comes about. Thank you so much for coming today.

SC: Thank you, Chris.

CA: It's really interesting.

(Applause)