Inside the bizarre world of internet trolls and propagandists | Andrew Marantz

TED · 2019-10-01

00:12
I spent the past three years talking to some of the worst people on the internet.

00:18
Now, if you've been online recently, you may have noticed that there's a lot of toxic garbage out there: racist memes, misogynist propaganda, viral misinformation. So I wanted to know who was making this stuff. I wanted to understand how they were spreading it. Ultimately, I wanted to know what kind of impact it might be having on our society.

00:38
So in 2016, I started tracing some of these memes back to their source, back to the people who were making them or who were making them go viral. I'd approach those people and say, "Hey, I'm a journalist. Can I come watch you do what you do?" Now, often the response would be, "Why in hell would I want to talk to some low-t soy-boy Brooklyn globalist Jew cuck who's in cahoots with the Democrat Party?"

(Laughter)

00:59
To which my response would be, "Look, man, that's only 57 percent true."

(Laughter)

01:04
But often I got the opposite response. "Yeah, sure, come on by."

01:08
So that's how I ended up in the living room of a social media propagandist in Southern California. He was a married white guy in his late 30s. He had a table in front of him with a mug of coffee, a laptop for tweeting, a phone for texting and an iPad for livestreaming to Periscope and YouTube. That was it. And yet, with those tools, he was able to propel his fringe, noxious talking points into the heart of the American conversation.

01:37
For example, one of the days I was there, a bomb had just exploded in New York, and the guy accused of planting the bomb had a Muslim-sounding name. Now, to the propagandist in California, this seemed like an opportunity, because one of the things he wanted was for the US to cut off almost all immigration, especially from Muslim-majority countries. So he started livestreaming, getting his followers worked up into a frenzy about how the open borders agenda was going to kill us all and asking them to tweet about this, and use specific hashtags, trying to get those hashtags trending. And tweet they did -- hundreds and hundreds of tweets, a lot of them featuring images like this one.

02:15
So that's George Soros. He's a Hungarian billionaire and philanthropist, and in the minds of some conspiracists online, George Soros is like a globalist bogeyman, one of a few elites who is secretly manipulating all of global affairs. Now, just to pause here: if this idea sounds familiar to you, that there are a few elites who control the world and a lot of them happen to be rich Jews, that's because it is one of the most anti-Semitic tropes in existence.

02:42
I should also mention that the guy in New York who planted that bomb, he was an American citizen. So whatever else was going on there, immigration was not the main issue.

02:53
And the propagandist in California, he understood all this. He was a well-read guy. He was actually a lawyer. He knew the underlying facts, but he also knew that facts do not drive conversation online. What drives conversation online is emotion.

03:07
See, the original premise of social media was that it was going to bring us all together, make the world more open and tolerant and fair ... And it did some of that. But the social media algorithms have never been built to distinguish between what's true or false, what's good or bad for society, what's prosocial and what's antisocial. That's just not what those algorithms do. A lot of what they do is measure engagement: clicks, comments, shares, retweets, that kind of thing. And if you want your content to get engagement, it has to spark emotion, specifically, what behavioral scientists call "high-arousal emotion." Now, "high arousal" doesn't only mean sexual arousal, although it's the internet, obviously that works. It means anything, positive or negative, that gets people's hearts pumping.

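[Editor's illustration] The ranking logic described here roughly reduces to "score posts by raw interaction counts and sort." Below is a minimal sketch in Python; the weights and field names are purely hypothetical illustrations, not any platform's actual formula:

    # Hypothetical engagement-ranking sketch.
    # Weights and field names are illustrative assumptions only.
    def engagement_score(post):
        return (1.0 * post["clicks"]
                + 2.0 * post["comments"]    # comments signal stronger engagement
                + 2.5 * post["shares"]      # shares spread content furthest
                + 1.5 * post["retweets"])

    def rank_feed(posts):
        # Sorts purely by engagement: nothing here checks truth, harm,
        # or prosocial value -- exactly the gap described in the talk.
        return sorted(posts, key=engagement_score, reverse=True)

    feed = [
        {"id": "calm_fact",    "clicks": 120, "comments": 4,  "shares": 2,  "retweets": 3},
        {"id": "outrage_meme", "clicks": 90,  "comments": 60, "shares": 40, "retweets": 55},
    ]
    for post in rank_feed(feed):
        print(post["id"], engagement_score(post))
    # The high-arousal post ranks first, regardless of its truth value.
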
03:51
So I would sit with these propagandists, not just the guy in California, but dozens of them, and I would watch as they did this again and again successfully, not because they were Russian hackers, not because they were tech prodigies, not because they had unique political insights -- just because they understood how social media worked, and they were willing to exploit it to their advantage.

04:10
Now, at first I was able to tell myself this was a fringe phenomenon, something that was relegated to the internet. But there's really no separation anymore between the internet and everything else. This is an ad that ran on multiple TV stations during the 2018 congressional elections, alleging with very little evidence that one of the candidates was in the pocket of international manipulator George Soros, who is awkwardly photoshopped here next to stacks of cash. This is a tweet from the President of the United States, alleging, again with no evidence, that American politics is being manipulated by George Soros. This stuff that once seemed so shocking and marginal and, frankly, just ignorable, it's now so normalized that we hardly even notice it.

04:50
So I spent about three years in this world. I talked to a lot of people. Some of them seemed to have no core beliefs at all. They just seemed to be betting, perfectly rationally, that if they wanted to make some money online or get some attention online, they should just be as outrageous as possible. But I talked to other people who were true ideologues. And to be clear, their ideology was not traditional conservatism. These were people who wanted to revoke female suffrage. These were people who wanted to go back to racial segregation. Some of them wanted to do away with democracy altogether. Now, obviously these people were not born believing these things. They didn't pick them up in elementary school. A lot of them, before they went down some internet rabbit hole, they had been libertarian or they had been socialist or they had been something else entirely. So what was going on?

05:36
Well, I can't generalize about every case, but a lot of the people I spoke to, they seem to have a combination of a high IQ and a low EQ. They seem to take comfort in anonymous, online spaces rather than connecting in the real world. So often they would retreat to these message boards or these subreddits, where their worst impulses would be magnified. They might start out saying something just as a sick joke, and then they would get so much positive reinforcement for that joke, so many meaningless "internet points," as they called it, that they might start believing their own joke.

06:10
I talked a lot with one young woman who grew up in New Jersey, and then after high school, she moved to a new place and suddenly she just felt alienated and cut off and started retreating into her phone. She found some of these spaces on the internet where people would post the most shocking, heinous things. And she found this stuff really off-putting but also kind of engrossing, kind of like she couldn't look away from it. She started interacting with people in these online spaces, and they made her feel smart, they made her feel validated. She started feeling a sense of community, started wondering if maybe some of these shocking memes might actually contain a kernel of truth. A few months later, she was in a car with some of her new internet friends headed to Charlottesville, Virginia, to march with torches in the name of the white race. She'd gone, in a few months, from Obama supporter to fully radicalized white supremacist.

07:01
Now, in her particular case, she actually was able to find her way out of the cult of white supremacy. But a lot of the people I spoke to were not. And just to be clear: I was never so convinced that I had to find common ground with every single person I spoke to that I was willing to say, "You know what, man, you're a fascist propagandist, I'm not, whatever, let's just hug it out, all our differences will melt away." No, absolutely not.

07:28
But I did become convinced that we cannot just look away from this stuff. We have to try to understand it, because only by understanding it can we even start to inoculate ourselves against it.

07:39
In my three years in this world, I got a few nasty phone calls, even some threats, but it wasn't a fraction of what female journalists get on this beat. And yeah, I am Jewish, although, weirdly, a lot of the Nazis couldn't tell I was Jewish, which I frankly just found kind of disappointing.

(Laughter)

07:58
Seriously, like, your whole job is being a professional anti-Semite. Nothing about me is tipping you off at all? Nothing?

(Laughter)

08:09
This is not a secret. My name is Andrew Marantz, I write for "The New Yorker," my personality type is like if a Seinfeld episode was taped at the Park Slope Food Coop. Nothing?

(Laughter)

08:24
Anyway, look -- ultimately, it would be nice if there were, like, a simple formula: smartphone plus alienated kid equals 12 percent chance of Nazi. It's obviously not that simple. And in my writing, I'm much more comfortable being descriptive, not prescriptive. But this is TED, so let's get practical. I want to share a few suggestions of things that citizens of the internet like you and I might be able to do to make things a little bit less toxic.

08:54
So the first one is to be a smart skeptic. So, I think there are two kinds of skepticism. And I don't want to drown you in technical epistemological information here, but I call them smart and dumb skepticism. So, smart skepticism: thinking for yourself, questioning every claim, demanding evidence -- great, that's real skepticism. Dumb skepticism: it sounds like skepticism, but it's actually closer to knee-jerk contrarianism. Everyone says the earth is round, you say it's flat. Everyone says racism is bad, you say, "I dunno, I'm skeptical about that."

09:31
I cannot tell you how many young white men I have spoken to in the last few years who have said, "You know, the media, my teachers, they're all trying to indoctrinate me into believing in male privilege and white privilege, but I don't know about that, man, I don't think so." Guys -- contrarian white teens of the world -- look: if you are being a round earth skeptic and a male privilege skeptic and a racism is bad skeptic, you're not being a skeptic, you're being a jerk.

(Applause)

10:04
It's great to be independent-minded, we all should be independent-minded, but just be smart about it.

10:09
So this next one is about free speech. You will hear smart, accomplished people who will say, "Well, I'm pro-free speech," and they say it in this way that it's like they're settling a debate, when actually, that is the very beginning of any meaningful conversation. All the interesting stuff happens after that point. OK, you're pro-free speech. What does that mean? Does it mean that David Duke and Richard Spencer need to have active Twitter accounts? Does it mean that anyone can harass anyone else online for any reason? You know, I looked through the entire list of TED speakers this year. I didn't find a single round earth skeptic. Is that a violation of free speech norms? Look, we're all pro-free speech, it's wonderful to be pro-free speech, but if that's all you know how to say again and again, you're standing in the way of a more productive conversation.

10:56
Making decency cool again, so ... Great!

(Applause)

11:02
Yeah. I don't even need to explain it.

11:04
So in my research, I would go to Reddit or YouTube or Facebook, and I would search for "sharia law" or I would search for "the Holocaust," and you might be able to guess what the algorithms showed me, right? "Is sharia law sweeping across the United States?" "Did the Holocaust really happen?" Dumb skepticism.

11:24
So we've ended up in this bizarre dynamic online, where some people see bigoted propaganda as being edgy or being dangerous and cool, and people see basic truth and human decency as pearl-clutching or virtue-signaling or just boring. And the social media algorithms, whether intentionally or not, they have incentivized this, because bigoted propaganda is great for engagement. Everyone clicks on it, everyone comments on it, whether they love it or they hate it.

11:51
So the number one thing that has to happen here is social networks need to fix their platforms.

(Applause)

12:01
So if you're listening to my voice and you work at a social media company or you invest in one or, I don't know, own one, this tip is for you. If you have been optimizing for maximum emotional engagement and maximum emotional engagement turns out to be actively harming the world, it's time to optimize for something else.

(Applause)

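[Editor's illustration] "Optimize for something else" can be read literally: the objective in a ranking function is a design choice. Continuing the earlier hypothetical sketch, a platform could blend engagement with an independent quality signal; the "quality" field and all weights here are assumptions for illustration, not any real platform's API:

    # Hypothetical re-ranking: blend engagement with a quality signal.
    def engagement_score(post):
        # Same illustrative engagement formula as in the earlier sketch.
        return (1.0 * post["clicks"] + 2.0 * post["comments"]
                + 2.5 * post["shares"] + 1.5 * post["retweets"])

    def blended_score(post, engagement_weight=0.3, quality_weight=0.7):
        # "quality" is an assumed external score (e.g. a fact-check or
        # prosociality rating), not a real platform field.
        return (engagement_weight * engagement_score(post)
                + quality_weight * post["quality"])

The weights decide what the feed rewards; in a pure engagement ranker they are effectively 1.0 and 0.0.
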
12:26
But in addition to putting pressure on them to do that and waiting for them and hoping that they'll do that, there's some stuff that the rest of us can do, too. So, we can create some better pathways or suggest some better pathways for angsty teens to go down. If you see something that you think is really creative and thoughtful and you want to share that thing, you can share that thing, even if it's not flooding you with high arousal emotion. Now that is a very small step, I realize, but in the aggregate, this stuff does matter, because these algorithms, as powerful as they are, they are taking their behavioral cues from us.

13:01
So let me leave you with this. You know, a few years ago it was really fashionable to say that the internet was a revolutionary tool that was going to bring us all together. It's now more fashionable to say that the internet is a huge, irredeemable dumpster fire. Neither caricature is really true. We know the internet is just too vast and complex to be all good or all bad. And the danger with these ways of thinking, whether it's the utopian view that the internet will inevitably save us or the dystopian view that it will inevitably destroy us, either way, we're letting ourselves off the hook.

13:35
There is nothing inevitable about our future. The internet is made of people. People make decisions at social media companies. People make hashtags trend or not trend. People make societies progress or regress. When we internalize that fact, we can stop waiting for some inevitable future to arrive and actually get to work now.

13:58
You know, we've all been taught that the arc of the moral universe is long but that it bends toward justice. Maybe. Maybe it will. But that has always been an aspiration. It is not a guarantee. The arc doesn't bend itself. It's not bent inevitably by some mysterious force. The real truth, which is scarier and also more liberating, is that we bend it.

14:28
Thank you.

(Applause)