How do daily habits lead to political violence? | Christiane-Marie Abu Sarah

90,771 views ・ 2020-09-18

TED



00:00
Transcriber: Leslie Gauthier Reviewer: Camille Martínez

00:12
So I'm starting us out today with a historical mystery.

00:17
In 1957, there were two young women, both in their 20s, both living in the same city, both members of the same political group. That year, both decided to commit violent attacks. One girl took a gun and approached a soldier at a checkpoint. The other girl took a bomb and went to a crowded café. But here's the thing: one of those girls followed through with the attack, but the other turned back. So what made the difference?

00:56
I'm a behavioral historian, and I study aggression, moral cognition and decision-making in social movements. That's a mouthful. (Laughs) So, the translation of that is: I study the moment an individual decides to pull the trigger, the day-to-day decisions that led up to that moment and the stories that they tell themselves about why that behavior is justified.

01:25
Now, this topic -- it's not just scholarly for me. It's actually a bit personal. I grew up in Kootenai County, Idaho, and this is very important. This is not the part of Idaho with potatoes. We have no potatoes. And if you ask me about potatoes, I will find you. (Laughter) This part of Idaho is known for mountain lakes, horseback riding, skiing. Unfortunately, starting in the 1980s, it also became known as the worldwide headquarters for the Aryan Nations. Every year, members of the local neo-Nazi compound would turn out and march through our town, and every year, members of our town would turn out and protest them.

02:16
Now, in 2001, I graduated from high school, and I went to college in New York City. I arrived in August 2001. As many of you probably are aware, three weeks later, the Twin Towers went down. Now, I was shocked. I was incredibly angry. I wanted to do something, but the only thing that I could think of doing at that time was to study Arabic. I will admit, I was that girl in class that wanted to know why "they" hate "us." I started studying Arabic for very wrong reasons. But something unexpected happened. I got a scholarship to go study in Israel. So the Idaho girl went to the Middle East. And while I was there, I met Palestinian Muslims, Palestinian Christians, Israeli settlers, Israeli peace activists. And what I learned is that every act has an ecology. It has a context.

03:32
Now, since then, I have gone around the world, I have studied violent movements, I have worked with NGOs and ex-combatants in Iraq, Syria, Vietnam, the Balkans, Cuba. I earned my PhD in History, and now what I do is I go to different archives and I dig through documents, looking for police confessions, court cases, diaries and manifestos of individuals involved in violent attacks.

04:09
Now, you gather all these documents -- what do they tell you? Our brains love causal mysteries, it turns out. So any time we see an attack on the news, we tend to ask one question: Why? Why did that happen? Well, I can tell you I've read thousands of manifestos, and what you find out is that they are actually imitative. They imitate the political movement that they're drawing from. So they actually don't tell us a lot about decision-making in that particular case. So we have to teach ourselves to ask a totally different question. Instead of "Why?" we have to ask "How?" How did individuals produce these attacks, and how did their decision-making ecology contribute to violent behavior?

05:00
There's a couple things I've learned from asking this kind of question. The most important thing is that political violence is not culturally endemic. We create it. And whether we realize it or not, our day-to-day habits contribute to the creation of violence in our environment. So here's a couple of habits that I've learned contribute to violence.

05:28
One of the first things that attackers did when preparing themselves for a violent event was they enclosed themselves in an information bubble. We've heard of fake news, yeah? Well, this shocked me: every group that I studied had some kind of a fake news slogan. French communists called it the "putrid press." French ultranationalists called it the "sellout press" and the "treasonous press." Islamists in Egypt called it the "depraved news." And Egyptian communists called it ... "fake news." So why do groups spend all this time trying to make these information bubbles? The answer is actually really simple. We make decisions based on the information we trust, yeah? So if we trust bad information, we're going to make bad decisions.

06:24
Another interesting habit that individuals used when they wanted to produce a violent attack was that they looked at their victim not as an individual but just as a member of an opposing team. Now this gets really weird. There's some fun brain science behind why that kind of thinking is effective. Say I divide all of you guys into two teams: blue team, red team. And then I ask you to compete in a game against each other. Well, the funny thing is, within milliseconds, you will actually start experiencing pleasure -- pleasure -- when something bad happens to members of the other team. The funny thing about that is if I ask one of you blue team members to go and join the red team, your brain recalibrates, and within milliseconds, you will now start experiencing pleasure when bad things happen to members of your old team. This is a really good example of why us-them thinking is so dangerous in our political environment.

07:34
Another habit that attackers used to kind of rev themselves up for an attack was they focused on differences. In other words, they looked at their victims, and they thought, "I share nothing in common with that person. They are totally different than me." Again, this might sound like a really simple concept, but there's some fascinating science behind why this works. Say I show you guys videos of different-colored hands and sharp pins being driven into these different-colored hands, OK? If you're white, the chances are you will experience the most sympathetic activation, or the most pain, when you see a pin going into the white hand. If you are Latin American, Arab, Black, you will probably experience the most sympathetic activation watching a pin going into the hand that looks most like yours.

08:38
The good news is, that's not biologically fixed. That is learned behavior. Which means the more we spend time with other ethnic communities and the more we see them as similar to us and part of our team, the more we feel their pain.

08:58
The last habit that I'm going to talk about is when attackers prepared themselves to go out and do one of these events, they focused on certain emotional cues. For months, they geared themselves up by focusing on anger cues, for instance. I bring this up because it's really popular right now. If you read blogs or the news, you see talk of two concepts from laboratory science: amygdala hijacking and emotional hijacking. Now, amygdala hijacking: it's the concept that I show you a cue -- say, a gun -- and your brain reacts with an automatic threat response to that cue. Emotional hijacking -- it's a very similar concept. It's the idea that I show you an anger cue, for instance, and your brain will react with an automatic anger response to that cue. I think women usually get this more than men. (Laughs)

09:58
(Laughter)

09:59
That kind of a hijacking narrative grabs our attention. Just the word "hijacking" grabs our attention. The thing is, most of the time, that's not really how cues work in real life. If you study history, what you find is that we are bombarded with hundreds of thousands of cues every day. And so what we do is we learn to filter. We ignore some cues, we pay attention to other cues. For political violence, this becomes really important, because what it meant is that attackers usually didn't just see an anger cue and suddenly snap. Instead, politicians, social activists spent weeks, months, years flooding the environment with anger cues, for instance, and attackers, they paid attention to those cues, they trusted those cues, they focused on them, they even memorized those cues.

11:03
All of this just really goes to show how important it is to study history. It's one thing to see how cues operate in a laboratory setting. And those laboratory experiments are incredibly important. They give us a lot of new data about how our bodies work. But it's also very important to see how those cues operate in real life.

11:30
So what does all this tell us about political violence? Political violence is not culturally endemic. It is not an automatic, predetermined response to environmental stimuli. We produce it. Our everyday habits produce it.

11:50
Let's go back, actually, to those two women that I mentioned at the start. The first woman had been paying attention to those outrage campaigns, so she took a gun and approached a soldier at a checkpoint. But in that moment, something really interesting happened. She looked at that soldier, and she thought to herself, "He's the same age as me. He looks like me." And she put down the gun, and she walked away. Just from that little bit of similarity.

12:32
The second girl had a totally different outcome. She also listened to the outrage campaigns, but she surrounded herself with individuals who were supportive of violence, with peers who supported her violence. She enclosed herself in an information bubble. She focused on certain emotional cues for months. She taught herself to bypass certain cultural inhibitions against violence. She practiced her plan, she taught herself new habits, and when the time came, she took her bomb to the café, and she followed through with that attack.

13:15
This was not impulse. This was learning.

13:22
Polarization in our society is not impulse, it's learning. Every day we are teaching ourselves: the news we click on, the emotions that we focus on, the thoughts that we entertain about the red team or the blue team. All of this contributes to learning, whether we realize it or not.

13:44
The good news is that while the individuals I study already made their decisions, we can still change our trajectory. We might never make the decisions that they made, but we can stop contributing to violent ecologies. We can get out of whatever news bubble we're in, we can be more mindful about the emotional cues that we focus on, the outrage bait that we click on. But most importantly, we can stop seeing each other as just members of the red team or the blue team. Because whether we are Christian, Muslim, Jewish, atheist, Democrat or Republican, we're human. We're human beings. And we often share really similar habits. We have differences. Those differences are beautiful, and those differences are very important. But our future depends on us being able to find common ground with the other side. And that's why it is so, so important for us to retrain our brains and stop contributing to violent ecologies.

15:08
Thank you.

15:09
(Applause)