Inside the bizarre world of internet trolls and propagandists | Andrew Marantz

199,790 views ・ 2019-10-01

TED



Translator: psjmz mz　Reviewer: Lipeng Chen
00:12
I spent the past three years
00:14
talking to some of the worst people on the internet.
00:18
Now, if you've been online recently,
00:20
you may have noticed that there's a lot of toxic garbage out there:
00:24
racist memes, misogynist propaganda, viral misinformation.
00:29
So I wanted to know who was making this stuff.
00:31
I wanted to understand how they were spreading it.
00:34
Ultimately, I wanted to know
00:35
what kind of impact it might be having on our society.
00:38
So in 2016, I started tracing some of these memes back to their source,
00:42
back to the people who were making them or who were making them go viral.
00:45
I'd approach those people and say,
00:47
"Hey, I'm a journalist. Can I come watch you do what you do?"
00:50
Now, often the response would be,
00:51
"Why in hell would I want to talk to
00:53
some low-t soy-boy Brooklyn globalist Jew cuck
00:55
who's in cahoots with the Democrat Party?"
00:57
(Laughter)
00:59
To which my response would be, "Look, man, that's only 57 percent true."
01:03
(Laughter)
01:04
But often I got the opposite response.
01:06
"Yeah, sure, come on by."
01:08
So that's how I ended up in the living room
01:10
of a social media propagandist in Southern California.
01:14
He was a married white guy in his late 30s.
01:16
He had a table in front of him with a mug of coffee,
01:19
a laptop for tweeting,
01:21
a phone for texting
01:23
and an iPad for livestreaming to Periscope and YouTube.
01:27
That was it.
01:28
And yet, with those tools,
01:30
he was able to propel his fringe, noxious talking points
01:34
into the heart of the American conversation.
01:37
For example, one of the days I was there,
01:39
a bomb had just exploded in New York,
01:42
and the guy accused of planting the bomb had a Muslim-sounding name.
01:45
Now, to the propagandist in California, this seemed like an opportunity,
01:50
because one of the things he wanted
01:51
was for the US to cut off almost all immigration,
01:54
especially from Muslim-majority countries.
01:57
So he started livestreaming,
01:59
getting his followers worked up into a frenzy
02:01
about how the open borders agenda was going to kill us all
02:04
and asking them to tweet about this,
02:06
and use specific hashtags,
02:07
trying to get those hashtags trending.
02:09
And tweet they did --
02:10
hundreds and hundreds of tweets,
02:12
a lot of them featuring images like this one.
02:15
So that's George Soros.
02:17
He's a Hungarian billionaire and philanthropist,
02:19
and in the minds of some conspiracists online,
02:22
George Soros is like a globalist bogeyman,
02:24
one of a few elites who is secretly manipulating all of global affairs.
02:29
Now, just to pause here: if this idea sounds familiar to you,
02:32
that there are a few elites who control the world
02:35
and a lot of them happen to be rich Jews,
02:37
that's because it is one of the most anti-Semitic tropes in existence.
02:42
I should also mention that the guy in New York who planted that bomb,
02:45
he was an American citizen.
02:47
So whatever else was going on there,
02:50
immigration was not the main issue.
02:53
And the propagandist in California, he understood all this.
02:56
He was a well-read guy. He was actually a lawyer.
02:58
He knew the underlying facts,
03:00
but he also knew that facts do not drive conversation online.
03:03
What drives conversation online
03:05
is emotion.
03:07
See, the original premise of social media
03:09
was that it was going to bring us all together,
03:11
make the world more open and tolerant and fair ...
03:14
And it did some of that.
03:16
But the social media algorithms have never been built
03:19
to distinguish between what's true or false,
03:21
what's good or bad for society, what's prosocial and what's antisocial.
03:25
That's just not what those algorithms do.
03:28
A lot of what they do is measure engagement:
03:30
clicks, comments, shares, retweets, that kind of thing.
03:33
And if you want your content to get engagement,
03:36
it has to spark emotion,
03:38
specifically, what behavioral scientists call "high-arousal emotion."
03:42
Now, "high arousal" doesn't only mean sexual arousal,
03:44
although it's the internet, obviously that works.
03:47
It means anything, positive or negative, that gets people's hearts pumping.
03:51
So I would sit with these propagandists,
03:53
not just the guy in California, but dozens of them,
03:56
and I would watch as they did this again and again successfully,
03:59
not because they were Russian hackers, not because they were tech prodigies,
04:03
not because they had unique political insights --
04:05
just because they understood how social media worked,
04:08
and they were willing to exploit it to their advantage.
04:10
Now, at first I was able to tell myself this was a fringe phenomenon,
04:14
something that was relegated to the internet.
04:16
But there's really no separation anymore between the internet and everything else.
04:20
This is an ad that ran on multiple TV stations
04:22
during the 2018 congressional elections,
04:25
alleging with very little evidence that one of the candidates
04:28
was in the pocket of international manipulator George Soros,
04:31
who is awkwardly photoshopped here next to stacks of cash.
04:35
This is a tweet from the President of the United States,
04:37
alleging, again with no evidence,
04:39
that American politics is being manipulated by George Soros.
04:43
This stuff that once seemed so shocking and marginal and, frankly, just ignorable,
04:47
it's now so normalized that we hardly even notice it.
04:50
So I spent about three years in this world.
04:52
I talked to a lot of people.
04:53
Some of them seemed to have no core beliefs at all.
04:56
They just seemed to be betting, perfectly rationally,
04:58
that if they wanted to make some money online
05:01
or get some attention online,
05:02
they should just be as outrageous as possible.
05:04
But I talked to other people who were true ideologues.
05:08
And to be clear, their ideology was not traditional conservatism.
05:12
These were people who wanted to revoke female suffrage.
05:15
These were people who wanted to go back to racial segregation.
05:18
Some of them wanted to do away with democracy altogether.
05:21
Now, obviously these people were not born believing these things.
05:24
They didn't pick them up in elementary school.
05:26
A lot of them, before they went down some internet rabbit hole,
05:29
they had been libertarian or they had been socialist
05:32
or they had been something else entirely.
05:34
So what was going on?
05:36
Well, I can't generalize about every case,
05:39
but a lot of the people I spoke to,
05:40
they seem to have a combination of a high IQ and a low EQ.
05:44
They seem to take comfort in anonymous, online spaces
05:48
rather than connecting in the real world.
05:50
So often they would retreat to these message boards
05:53
or these subreddits,
05:54
where their worst impulses would be magnified.
05:56
They might start out saying something just as a sick joke,
06:00
and then they would get so much positive reinforcement for that joke,
06:03
so many meaningless "internet points," as they called it,
06:06
that they might start believing their own joke.
06:10
I talked a lot with one young woman who grew up in New Jersey,
06:13
and then after high school, she moved to a new place
06:16
and suddenly she just felt alienated and cut off
06:18
and started retreating into her phone.
06:20
She found some of these spaces on the internet
06:23
where people would post the most shocking, heinous things.
06:25
And she found this stuff really off-putting
06:28
but also kind of engrossing,
06:30
kind of like she couldn't look away from it.
06:33
She started interacting with people in these online spaces,
06:36
and they made her feel smart, they made her feel validated.
06:38
She started feeling a sense of community,
06:40
started wondering if maybe some of these shocking memes
06:43
might actually contain a kernel of truth.
06:46
A few months later, she was in a car with some of her new internet friends
06:49
headed to Charlottesville, Virginia,
06:51
to march with torches in the name of the white race.
06:55
She'd gone, in a few months, from Obama supporter
06:57
to fully radicalized white supremacist.
07:01
Now, in her particular case,
07:03
she actually was able to find her way out of the cult of white supremacy.
07:08
But a lot of the people I spoke to were not.
07:10
And just to be clear:
07:12
I was never so convinced that I had to find common ground
07:15
with every single person I spoke to
07:17
that I was willing to say,
07:18
"You know what, man, you're a fascist propagandist, I'm not,
07:21
whatever, let's just hug it out, all our differences will melt away."
07:25
No, absolutely not.
07:28
But I did become convinced that we cannot just look away from this stuff.
07:31
We have to try to understand it, because only by understanding it
07:34
can we even start to inoculate ourselves against it.
07:39
In my three years in this world, I got a few nasty phone calls,
07:42
even some threats,
07:44
but it wasn't a fraction of what female journalists get on this beat.
07:48
And yeah, I am Jewish,
07:50
although, weirdly, a lot of the Nazis couldn't tell I was Jewish,
07:53
which I frankly just found kind of disappointing.
07:56
(Laughter)
07:58
Seriously, like, your whole job is being a professional anti-Semite.
08:03
Nothing about me is tipping you off at all?
08:05
Nothing?
08:06
(Laughter)
08:09
This is not a secret.
08:11
My name is Andrew Marantz, I write for "The New Yorker,"
08:13
my personality type is like if a Seinfeld episode
08:16
was taped at the Park Slope Food Coop.
08:18
Nothing?
08:19
(Laughter)
08:24
Anyway, look -- ultimately, it would be nice
08:27
if there were, like, a simple formula:
08:29
smartphone plus alienated kid equals 12 percent chance of Nazi.
08:33
It's obviously not that simple.
08:36
And in my writing,
08:37
I'm much more comfortable being descriptive, not prescriptive.
08:41
But this is TED,
08:43
so let's get practical.
08:45
I want to share a few suggestions
08:47
of things that citizens of the internet like you and I
08:50
might be able to do to make things a little bit less toxic.
08:54
So the first one is to be a smart skeptic.
08:57
So, I think there are two kinds of skepticism.
09:00
And I don't want to drown you in technical epistemological information here,
09:04
but I call them smart and dumb skepticism.
09:08
So, smart skepticism:
09:10
thinking for yourself,
09:11
questioning every claim,
09:13
demanding evidence --
09:14
great, that's real skepticism.
09:17
Dumb skepticism: it sounds like skepticism,
09:20
but it's actually closer to knee-jerk contrarianism.
09:23
Everyone says the earth is round,
09:25
you say it's flat.
09:26
Everyone says racism is bad,
09:28
you say, "I dunno, I'm skeptical about that."
09:31
I cannot tell you how many young white men I have spoken to in the last few years
09:35
who have said,
09:37
"You know, the media, my teachers, they're all trying to indoctrinate me
09:40
into believing in male privilege and white privilege,
09:42
but I don't know about that, man, I don't think so."
09:45
Guys -- contrarian white teens of the world --
09:48
look:
09:50
if you are being a round earth skeptic and a male privilege skeptic
09:54
and a racism is bad skeptic,
09:56
you're not being a skeptic, you're being a jerk.
09:59
(Applause)
10:04
It's great to be independent-minded, we all should be independent-minded,
10:07
but just be smart about it.
10:09
So this next one is about free speech.
10:11
You will hear smart, accomplished people who will say, "Well, I'm pro-free speech,"
10:15
and they say it in this way that it's like they're settling a debate,
10:19
when actually, that is the very beginning of any meaningful conversation.
10:23
All the interesting stuff happens after that point.
10:26
OK, you're pro-free speech. What does that mean?
10:28
Does it mean that David Duke and Richard Spencer
10:30
need to have active Twitter accounts?
10:32
Does it mean that anyone can harass anyone else online
10:35
for any reason?
10:37
You know, I looked through the entire list of TED speakers this year.
10:40
I didn't find a single round earth skeptic.
10:42
Is that a violation of free speech norms?
10:45
Look, we're all pro-free speech, it's wonderful to be pro-free speech,
10:48
but if that's all you know how to say again and again,
10:51
you're standing in the way of a more productive conversation.
10:56
Making decency cool again, so ...
10:59
Great!
11:00
(Applause)
11:02
Yeah. I don't even need to explain it.
11:04
So in my research, I would go to Reddit or YouTube or Facebook,
11:08
and I would search for "sharia law"
11:10
or I would search for "the Holocaust,"
11:12
and you might be able to guess what the algorithms showed me, right?
11:16
"Is sharia law sweeping across the United States?"
11:19
"Did the Holocaust really happen?"
11:22
Dumb skepticism.
11:24
So we've ended up in this bizarre dynamic online,
11:27
where some people see bigoted propaganda
11:29
as being edgy or being dangerous and cool,
11:32
and people see basic truth and human decency as pearl-clutching
11:35
or virtue-signaling or just boring.
11:38
And the social media algorithms, whether intentionally or not,
11:41
they have incentivized this,
11:43
because bigoted propaganda is great for engagement.
11:46
Everyone clicks on it, everyone comments on it,
11:48
whether they love it or they hate it.
11:51
So the number one thing that has to happen here
11:53
is social networks need to fix their platforms.
11:57
(Applause)
12:01
So if you're listening to my voice and you work at a social media company
12:05
or you invest in one or, I don't know, own one,
12:08
this tip is for you.
12:10
If you have been optimizing for maximum emotional engagement
12:14
and maximum emotional engagement turns out to be actively harming the world,
12:18
it's time to optimize for something else.
12:20
(Applause)
12:26
But in addition to putting pressure on them to do that
12:30
and waiting for them and hoping that they'll do that,
12:33
there's some stuff that the rest of us can do, too.
12:35
So, we can create some better pathways or suggest some better pathways
12:40
for angsty teens to go down.
12:42
If you see something that you think is really creative and thoughtful
12:45
and you want to share that thing, you can share that thing,
12:48
even if it's not flooding you with high arousal emotion.
12:51
Now that is a very small step, I realize,
12:53
but in the aggregate, this stuff does matter,
12:55
because these algorithms, as powerful as they are,
12:57
they are taking their behavioral cues from us.
13:01
So let me leave you with this.
13:03
You know, a few years ago it was really fashionable
13:06
to say that the internet was a revolutionary tool
13:08
that was going to bring us all together.
13:11
It's now more fashionable to say
13:12
that the internet is a huge, irredeemable dumpster fire.
13:16
Neither caricature is really true.
13:18
We know the internet is just too vast and complex
13:21
to be all good or all bad.
13:22
And the danger with these ways of thinking,
13:24
whether it's the utopian view that the internet will inevitably save us
13:28
or the dystopian view that it will inevitably destroy us,
13:31
either way, we're letting ourselves off the hook.
13:35
There is nothing inevitable about our future.
13:38
The internet is made of people.
13:40
People make decisions at social media companies.
13:43
People make hashtags trend or not trend.
13:46
People make societies progress or regress.
13:51
When we internalize that fact,
13:52
we can stop waiting for some inevitable future to arrive
13:55
and actually get to work now.
13:58
You know, we've all been taught that the arc of the moral universe is long
14:02
but that it bends toward justice.
14:06
Maybe.
14:08
Maybe it will.
14:10
But that has always been an aspiration.
14:13
It is not a guarantee.
14:15
The arc doesn't bend itself.
14:18
It's not bent inevitably by some mysterious force.
14:21
The real truth,
14:23
which is scarier and also more liberating,
14:26
is that we bend it.
14:28
Thank you.
14:30
(Applause)